Abstract: In production environments with standard processes and similar, limited product lines, where direct costs account for most of total cost, the traditional cost systems described in the literature can allocate costs fairly accurately. However, as the share of direct costs falls and the overhead burden grows in increasingly sophisticated production facilities, researchers have been led to look for alternative techniques to traditional cost systems. A variety of cost management approaches have been introduced, for example total quality management (TQM), just-in-time (JIT), benchmarking, kaizen costing, target costing, life cycle costing (LCC), activity-based costing (ABC) and value engineering. Management and cost accounting practices have changed over the past decade and will continue to change. Modern cost systems can provide relevant and accurate cost information. These methods support decisions about customers, products and process improvement. The aim of this study is to describe and explain the adoption and application of costing systems in SMEs. To this end, the paper reports on a survey of small and medium-sized enterprises (SMEs) in Ankara conducted during 2014. The survey results were evaluated using the SPSS 18 statistical package.
Abstract: To date, one of the few comprehensive indicators for
the measurement of food security is the Global Food Security Index
(GFSI). This index is a dynamic quantitative and qualitative
benchmarking model, constructed from 28 unique indicators, that
measures drivers of food security across both developing and
developed countries. Whereas the GFSI has been calculated across a
set of 109 countries, in this paper we aim to present and compare, for
the Middle East and North Africa (MENA), 1) the Food Security
Index scores achieved and 2) the data available on affordability,
availability, and quality of food. The data for this work was taken
from the latest available report published by the creators of the GFSI,
which in turn used information from national and international
statistical sources. MENA countries rank from place 17/109 (Israel, although with recent political turmoil this is likely to have changed) to place 91/109 (Yemen), with household expenditure spent on food ranging from 15.5% (Israel) to 60% (Egypt). Lower spending on food
as a share of household consumption in most countries and better
food safety net programs in the MENA have contributed to a notable
increase in food affordability. The region has also, however,
experienced a decline in food availability, owing to more limited
food supplies and higher volatility of agricultural production. In
terms of food quality and safety the MENA has the top ranking
country (Israel). The most frequent challenges faced by the countries of the MENA include insufficient public expenditure on agricultural research and development, as well as the volatility of agricultural production. Food
security is a complex phenomenon that interacts with many other
indicators of a country’s wellbeing; in the MENA it is slowly but
markedly improving.
Abstract: Applied industrial engineering is concerned with imparting employable skills that improve the productivity of current products and services. The purpose of this case study is to
present the results of an initial research study conducted to identify
the desired professional characteristics of an industrial engineer with
an undergraduate degree and the emerging topic areas that should be
incorporated into the curriculum to prepare industrial engineering
(IE) graduates for the future workforce. Conclusions and recommendations for an applied industrial engineering syllabus have been gathered and reported below. A two-pronged approach was
taken which included a method of benchmarking by comparing the
applied industrial engineering curricula of various universities and an
industry survey to identify job market requirements. This
methodology produced an analysis of the changing nature of industrial engineering education, from theoretical learning to practice-oriented education. A
curriculum study for engineering is a relatively unexplored area of
research in the Middle East, much less for applied industrial
engineering. This work is an effort to bridge the gap between
theoretical study in the classroom and the real world work
applications in the industrial and service sectors.
Abstract: With the objective of characterizing the profile and performance of energy use by slaughterhouses, surveys and audits were performed in two different facilities located in the northeastern region of Portugal. Energy consumption from multiple energy sources was assessed monthly, along with production and costs, for the same reference year. The gathered data were analyzed to identify and quantify the main consuming processes and to estimate energy efficiency indicators for benchmarking purposes. Main results show differences between the two slaughterhouses concerning energy sources, consumption by source and sector, and global energy efficiency. Electricity is the most used source in both slaughterhouses, with a contribution of around 50%, being essentially used for meat processing and refrigeration. Natural gas, in slaughterhouse A, and pellets, in slaughterhouse B, used for heating water, rank second, with a mean contribution of about 45%. On average, a specific energy consumption (SEC) of 62 kgoe/t was found, although with differences between slaughterhouses. A pronounced negative correlation between SEC and carcass production was found, especially in slaughterhouse A. The estimated Specific Energy Cost and Greenhouse Gas Intensity (GHGI) show mean values of about 50 €/t and 1.8 tCO2e/toe, respectively. Overall, the results show that there is a significant margin for improving energy efficiency, and therefore for lowering costs, in this type of non-energy-intensive industry.
Abstract: This paper describes a proposal for the cost calculation of warehouse processes and its use in setting standards for performance evaluation. One of the most common ways of monitoring process performance is benchmarking. The typical outcome is whether the monitored object is better or worse than an average or standard. Traditional approaches, however, cannot identify specific opportunities to improve performance or eliminate inefficiencies in processes. Higher process efficiency can be achieved, for example, by reducing costs while the same output is generated. However, costs can be reduced only if we know their structure and are able to calculate them accurately. In the area of warehouse processes this is rather difficult, because in most cases only aggregated values with low explanatory power are available. The aim of this paper is to create a suitable method for calculating storage costs. A practical example of the process cost calculation is shown at the end.
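The abstract argues that costs must be known at the process level rather than as aggregates. A minimal hypothetical example of such a calculation (the cost figures, process names and activity drivers below are assumptions, not the paper's data) divides each warehouse process's traceable annual cost by its activity driver to obtain a unit rate, which can then be used to cost an individual order:

```python
# hypothetical warehouse processes with assumed annual costs and
# activity drivers -- illustrative numbers only
processes = {
    "receiving": {"cost": 120_000.0, "driver_units": 40_000},    # pallets received
    "storage":   {"cost": 300_000.0, "driver_units": 1_500_000}, # pallet-days
    "picking":   {"cost": 450_000.0, "driver_units": 900_000},   # order lines
}

# unit rate = traceable process cost / activity volume
unit_rates = {name: p["cost"] / p["driver_units"] for name, p in processes.items()}

# cost of serving one hypothetical order:
# 2 pallets received, 60 pallet-days of storage, 15 order lines picked
order_cost = (2 * unit_rates["receiving"]
              + 60 * unit_rates["storage"]
              + 15 * unit_rates["picking"])
```

A benchmark standard can then be expressed per driver unit (e.g. cost per pallet-day), which is exactly the kind of disaggregated figure the abstract says aggregated accounting values cannot provide.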
Abstract: The importance of supply chain and logistics
management has been widely recognised. Effective management of
the supply chain can reduce costs and lead times and improve
responsiveness to changing customer demands. This paper proposes a multi-matrix real-coded Genetic Algorithm (MRGA) based optimisation tool that minimises the total costs associated with supply chain logistics. Owing to the finite capacity constraints of all parties within the chain, a Genetic Algorithm (GA) often produces infeasible chromosomes during the initialisation and evolution processes. In the proposed algorithm, a chromosome initialisation procedure and crossover and mutation operations that always guarantee feasible solutions are embedded. The proposed algorithm was tested using three sizes of benchmarking data sets of logistics chain networks, which are typical
of those faced by most global manufacturing companies. A half
fractional factorial design was carried out to investigate the influence
of alternative crossover and mutation operators by varying GA
parameters. The analysis of experimental results suggested that the
quality of solutions obtained is sensitive to the ways in which the
genetic parameters and operators are set.
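The feasibility-preserving initialisation described above can be sketched as follows. This is a hypothetical illustration, not the authors' MRGA encoding: it allocates a demand quantity across capacity-constrained suppliers, visiting them in random order, so that every chromosome produced satisfies the capacity constraints by construction.

```python
import random

def feasible_allocation(demand, capacities, rng=random):
    """Randomly allocate `demand` across suppliers without exceeding any
    supplier's capacity; requires sum(capacities) >= demand."""
    if sum(capacities) < demand:
        raise ValueError("total capacity cannot cover the demand")
    alloc = [0] * len(capacities)
    order = list(range(len(capacities)))
    rng.shuffle(order)          # randomness comes from the visiting order
    remaining = demand
    for i in order:
        take = min(remaining, capacities[i])  # never exceed capacity i
        alloc[i] = take
        remaining -= take
        if remaining == 0:
            break
    return alloc
```

Because every chromosome produced this way is feasible, crossover and mutation can be defined as re-allocations of the same form, so no repair step or penalty function is needed during evolution.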
Abstract: The volume of XML data exchange is explosively
increasing, and the need for efficient mechanisms of XML data
management is vital. Many XML storage models have been proposed
for storing XML DTD-independent documents in relational database
systems. Benchmarking is the best way to highlight pros and cons of
different approaches. In this study, we use a common benchmarking
scheme, known as XMark, to compare the most cited and newly proposed DTD-independent methods in terms of logical reads, physical I/O, CPU time and duration. We show the effect of the Label Path, of extracting values and storing them in a separate table, and of the type of join needed for each method's query answering.
Abstract: Benchmarking cleaner production performance is an
effective way of pollution control and emission reduction in the coal-fired power industry. A benchmarking method using two-stage
super-efficiency data envelopment analysis for coal-fired power plants
is proposed – firstly, to improve the cleaner production performance of
DEA-inefficient or weakly DEA-efficient plants, then to select the
benchmark from performance-improved power plants. An empirical
study is carried out with the survey data of 24 coal-fired power plants.
The result shows that in the first stage the performance of 16 plants is
DEA-efficient and that of 8 plants is relatively inefficient. The target
values for improving DEA-inefficient plants are acquired by
projection analysis. The efficient performance of 24 power plants and
the benchmark plant is achieved in the second stage. The two-stage benchmarking method is practical for selecting the optimal benchmark in the cleaner production of the coal-fired power industry and will continuously improve plants' cleaner production performance.
Abstract: The Partitioned Global Address Space (PGAS) programming
paradigm offers ease-of-use in expressing parallelism
through a global shared address space while emphasizing performance
by providing locality awareness through the partitioning of
this address space. Therefore, the interest in PGAS programming
languages is growing and many new languages have emerged and
are becoming ubiquitously available on nearly all modern parallel
architectures. Recently, new parallel machines with multiple cores have been designed to target high-performance applications. Most of the effort has gone into benchmarking, but there are few examples of real high-performance applications running on multicore machines.
In this paper, we present and evaluate a parallelization technique
for implementing a local DNA sequence alignment algorithm using
a PGAS-based language, UPC (Unified Parallel C), on a chip
multithreading architecture, the UltraSPARC T1.
Abstract: Ensemble learning algorithms such as AdaBoost and
Bagging have been in active research and shown improvements in
classification results for several benchmarking data sets with mainly
decision trees as their base classifiers. In this paper we experiment with applying these meta-learning techniques to classifiers such as random forests, neural networks and support vector machines. The data sets
are from MAGIC, a Cherenkov telescope experiment. The task is to classify gamma signals against an overwhelming background of hadron and muon signals, representing a rare-class classification problem. We compare the individual classifiers with their ensemble counterparts and discuss the results. WEKA, a widely used machine learning toolkit, was used to conduct the experiments.
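The paper's experiments use WEKA; purely as an illustration of the boosting idea behind AdaBoost, the following self-contained numpy sketch boosts one-dimensional decision stumps on a toy two-class data set (the data set and all names are assumptions for illustration, not the MAGIC data or WEKA's implementation):

```python
import numpy as np

def stump_predict(X, feat, thresh, pol):
    # a decision stump: +1 on one side of an axis-aligned threshold
    return np.where(pol * X[:, feat] < pol * thresh, 1, -1)

def best_stump(X, y, w):
    # exhaustive search for the stump with minimum weighted error
    best = (np.inf, 0, 0.0, 1)
    for feat in range(X.shape[1]):
        for thresh in list(np.unique(X[:, feat])) + [np.inf]:
            for pol in (1, -1):
                err = w[stump_predict(X, feat, thresh, pol) != y].sum()
                if err < best[0]:
                    best = (err, feat, thresh, pol)
    return best

def adaboost_fit(X, y, rounds=40):
    w = np.full(len(y), 1.0 / len(y))      # uniform example weights
    ensemble = []
    for _ in range(rounds):
        err, feat, thresh, pol = best_stump(X, y, w)
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        pred = stump_predict(X, feat, thresh, pol)
        w *= np.exp(-alpha * y * pred)     # up-weight the mistakes
        w /= w.sum()
        ensemble.append((alpha, feat, thresh, pol))
    return ensemble

def adaboost_predict(ensemble, X):
    score = sum(a * stump_predict(X, f, t, p) for a, f, t, p in ensemble)
    return np.where(score >= 0, 1, -1)

# toy 1-D two-class problem: the positive class occupies two intervals,
# so no single stump can classify all six points, but the ensemble can
X = np.array([[0.25], [0.75], [1.25], [1.75], [2.25], [2.75]])
y = np.array([1, 1, -1, -1, 1, 1])
ensemble = adaboost_fit(X, y)
```

The same re-weighting loop underlies the ensembles discussed in the abstract; swapping `best_stump` for any stronger base learner (random forests, neural networks, SVMs) gives the experiments the paper describes.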
Abstract: This paper presents a new color face image database
for benchmarking of automatic face detection algorithms and human
skin segmentation techniques. It is named the VT-AAST image
database, and is divided into four parts. Part one is a set of 286 color
photographs that include a total of 1027 faces in the original format
given by our digital cameras, offering a wide range of variation in orientation, pose, environment, illumination, facial expression and
race. Part two contains the same set in a different file format. The
third part is a set of corresponding image files that contain the human skin regions, in color, resulting from a manual segmentation procedure. The fourth part of the database has the same regions
converted into grayscale. The database is available on-line for
noncommercial use. In this paper, the database development, organization and format, as well as the information needed for benchmarking of algorithms, are described in detail.
Abstract: Multimedia security is a highly significant area of concern. A number of papers on robust digital watermarking have been presented, but no standards have been defined so far; thus multimedia security remains an open problem. The aim of
this paper is to design a robust image-watermarking scheme, which
can withstand a different set of attacks. The proposed scheme
provides a robust solution integrating image moment normalization,
content dependent watermark and discrete wavelet transformation.
Moment normalization is useful to recover the watermark even in
case of geometrical attacks. Content dependent watermarks are a
powerful means of authentication as the data is watermarked with its
own features. Discrete wavelet transforms have been used because they describe image features well. The proposed scheme
finds its place in validating identification cards and financial
instruments.
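As a minimal illustration of the DWT-domain embedding step alone (the choice of wavelet, the strength parameter and the non-blind extraction are assumptions; the paper's full scheme additionally uses moment normalization and content-dependent watermarks), a one-level Haar transform can carry an additive watermark in the approximation band:

```python
import numpy as np

def haar2(img):
    """One-level 2D Haar transform: approximation and three detail bands."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    return ((a + b + c + d) / 2, (a - b + c - d) / 2,
            (a + b - c - d) / 2, (a - b - c + d) / 2)

def ihaar2(cA, cH, cV, cD):
    """Exact inverse of haar2."""
    h, w = cA.shape
    img = np.empty((2 * h, 2 * w))
    img[0::2, 0::2] = (cA + cH + cV + cD) / 2
    img[0::2, 1::2] = (cA - cH + cV - cD) / 2
    img[1::2, 0::2] = (cA + cH - cV - cD) / 2
    img[1::2, 1::2] = (cA - cH - cV + cD) / 2
    return img

def embed(img, watermark, alpha=0.05):
    # additively embed the watermark in the approximation coefficients
    cA, cH, cV, cD = haar2(img)
    return ihaar2(cA + alpha * watermark, cH, cV, cD)

def extract(marked, original, alpha=0.05):
    # non-blind extraction: compare approximation bands of both images
    return (haar2(marked)[0] - haar2(original)[0]) / alpha
```

Embedding in the low-frequency approximation band is what gives DWT schemes their robustness to mild filtering and compression; the moment-normalization stage described in the abstract would be applied before this step to undo geometrical attacks.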
Abstract: Intellectual capital measurement is a central aspect of knowledge management. The measurement and evaluation of intangible assets play a key role in allowing an effective management of these assets as sources of competitiveness. For these reasons, managers and practitioners need conceptual and analytical tools that take into account the unique characteristics and economic significance of Intellectual Capital. Following this lead, we propose an efficiency and productivity analysis of Intellectual Capital as a determinant factor of a company's competitive advantage. The analysis is carried out by means of Data Envelopment Analysis (DEA) and the Malmquist Productivity Index (MPI). These techniques identify Best Practice companies that have achieved competitive advantage by implementing successful strategies of Intellectual Capital management, and offer inefficient companies development paths by means of benchmarking. The proposed methodology is applied to the Biotechnology industry over the period 2007-2010.
Abstract: In this paper, a benchmarking framework is presented for the performance assessment of irrigation systems. Firstly, a data
envelopment analysis (DEA) is applied to measure the technical
efficiency of irrigation systems. This method, based on linear
programming, aims to determine a consistent efficiency ranking of
irrigation systems in which known inputs, such as water volume
supplied and total irrigated area, and a given output corresponding to
the total value of irrigation production are taken into account
simultaneously. Secondly, in order to examine irrigation efficiency in more detail, a cross-system comparison is carried out using a set of performance indicators selected by IWMI. The above methodologies were applied to the Thessaloniki plain, located in Northern Greece, and the results of the application are presented and
discussed. The conjunctive use of DEA and performance indicators
seems to be a very useful tool for efficiency assessment and
identification of best practices in irrigation systems management.
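To make the DEA idea concrete in its simplest special case (a deliberate simplification of the paper's multi-input model, with purely illustrative numbers): under constant returns to scale with a single input and a single output, the CCR technical efficiency of each unit reduces to its output/input ratio normalised by the best observed ratio, with no linear program needed.

```python
def ccr_efficiency(inputs, outputs):
    """CCR technical efficiency under constant returns to scale for the
    one-input, one-output special case: each unit's output/input ratio
    divided by the best observed ratio; the frontier unit scores 1.0."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# hypothetical irrigation systems: water supplied (hm^3) vs. value of
# irrigation production (million EUR) -- illustrative numbers only
water = [120.0, 95.0, 60.0]
value = [24.0, 19.0, 15.0]
scores = ccr_efficiency(water, value)
```

With several inputs, such as water volume supplied and total irrigated area taken together, the score instead comes from a linear program solved per system; the ratio form above is the one-dimensional degenerate case of that program.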
Abstract: The UK Government has emphasized the role of Local Authorities as key players in its flagship residential energy efficiency strategies, identifying and targeting areas for energy efficiency improvements. Residential energy consumption in England is characterized by significant geographical variation in energy demand, which makes centralized targeting of areas for energy efficiency intervention difficult. This paper draws on research which aims to understand how demographic, social, economic, urban-form and climatic factors influence the geographical variation in English residential gas consumption. The paper reports the findings of a multiple regression model that shows how 64% of the geographical variation in residential gas consumption is accounted for by variations in these factors. Results from this study, after further refinement and validation, can be used by Local Authorities to identify areas within their boundaries that have higher than expected gas consumption; these areas may be prime targets for energy efficiency initiatives.
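As a sketch of how such an explained-variance figure is obtained, the following fits a multiple regression by ordinary least squares and computes R². The data are entirely synthetic and the predictor names are made-up stand-ins for the study's demographic, economic, urban-form and climatic factors; the resulting R² is not the study's 64%.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
# synthetic area-level predictors (illustrative stand-ins only)
income = rng.normal(30, 8, n)         # mean household income
dwelling_age = rng.normal(50, 20, n)  # mean dwelling age, years
degree_days = rng.normal(2200, 300, n)
gas = (20 * income + 5 * dwelling_age + 0.3 * degree_days
       + rng.normal(0, 150, n))       # synthetic gas consumption

# ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), income, dwelling_age, degree_days])
beta, *_ = np.linalg.lstsq(X, gas, rcond=None)
resid = gas - X @ beta
r2 = 1 - resid.var() / gas.var()      # share of variation explained
```

The reported 64% corresponds to `r2` in this setup: the fraction of the variance in area-level gas consumption that the fitted factors account for.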
Abstract: The majority of Business Software Systems (BSS) Development and Enhancement Projects (D&EP) fail to meet the criteria of their effectiveness, which leads to considerable financial losses. One of the fundamental reasons for such projects' exceptionally low success rate is improperly derived estimates of their costs and time. In the case of BSS D&EP these attributes are determined by the work effort, yet reliable and objective effort estimation still appears to be a great challenge to software engineering. This paper is therefore aimed at presenting the most important synthetic conclusions coming from the author's own studies concerning the main factors of effective BSS D&EP work effort estimation. Thanks to rational investment decisions made on the basis of reliable and objective criteria, it is possible to reduce the losses caused not only by abandoned projects but also by large time and cost overruns in the execution of BSS D&EP.
Abstract: The purpose of this study is to identify the critical success factors (CSFs) for the effective implementation of Six Sigma in non-formal service sectors.
Based on a survey of the literature, the CSFs for Six Sigma have been identified and assessed for their importance in the non-formal service sector using the Delphi technique. The selected CSFs were put forth to a panel of experts to cluster them and prepare a cognitive map establishing their relationships.
All the critical success factors examined and obtained from the review of the literature have been assessed for their importance with respect to their contribution to Six Sigma effectiveness in the non-formal service sector.
The study is limited to the non-formal service sector involved in the organization of a religious festival only. However, a similar exercise can be conducted for a broader sample of other non-formal service sectors, such as temple/ashram management, religious tours management, etc.
The research suggests an approach to identify the CSFs of Six Sigma for the non-formal service sector. Not all the CSFs of the formal service sector are applicable to non-formal services; hence the opinion of experts was sought to add or delete CSFs. In the first round of Delphi, the panel of experts suggested two new CSFs, “competitive benchmarking (F19)” and “residents' involvement (F28)”, which were added for assessment in the next round of Delphi. One of the CSFs, “full-time Six Sigma personnel (F15)”, has been omitted from the proposed clusters of CSFs for non-formal organizations, as it is practically impossible to deploy full-time trained Six Sigma recruits.
Abstract: Leo Breiman's Random Forests (RF) is a recent development in tree-based classifiers and has quickly proven to be one of the most important algorithms in the machine learning literature. It
has shown robust and improved results of classifications on standard
data sets. Ensemble learning algorithms such as AdaBoost and
Bagging have been in active research and shown improvements in
classification results for several benchmarking data sets with mainly
decision trees as their base classifiers. In this paper we experiment with applying these meta-learning techniques to random forests. We study the behaviour of ensembles of random forests on the standard data sets available in the UCI repository. We compare the
original random forest algorithm with their ensemble counterparts
and discuss the results.
Abstract: European Union candidate status provides a strong motivation for decision-makers in the candidate countries in shaping regional development policy, where a transfer of power from the center to the periphery is envisioned. The process of Europeanization anticipates that the candidate countries will configure their regional institutional templates in line with the requirements of European Union policies, and introduces new instruments of the incentive
framework of enlargement to be employed in regional
development schemes. It is observed that the contribution of
the local actors to the decision making in the design of the
allocation architectures enhances the efficiency of the funds
and increases the positive effects of the projects funded under
the regional development objectives. This study aims at
exploring the performances of the three regional development
grant schemes in Turkey, established and allocated under the
pre-accession process with a special emphasis given to the
roles of the national and local actors in decision-making for
regional development. Efficiency analyses have been
conducted using the DEA methodology which has proved to
be a superior method in comparative efficiency and
benchmarking measurements. The findings of this study, in parallel with similar international studies, indicate that the participation of local actors in decision-making on funding contributes both to the quality and to the efficiency of the projects funded under the EU schemes.