Abstract: Manufacturing, production and service industries within Libya have struggled with many problems during the past two decades. These problems have had a negative impact on the productivity and utilization of many industries around the country. This paper studies the implementation levels of the manufacturing control system known as Manufacturing Resource Planning (MRPII) as adopted within some Libyan industries. A survey methodology was applied in this research. Based on the survey analysis, the results indicate that these industries follow only a modest strategy towards most of the areas that are considered crucial for implementing these systems successfully. The findings also show variation in the implementation levels with respect to the key elements related to MRPII, with the highest levels found in the emphasis on financial data accuracy. The paper also identifies limitations within the investigated manufacturing and managerial areas and points to where senior managers should take immediate action in order to achieve effective implementation of MRPII within their business areas.
Abstract: A novel calibration approach that aims to reduce ASM2d parameter subsets and decrease model complexity is presented. This approach does not require high computational demand, and it reduces the number of modeling parameters required to achieve ASM calibration by employing a sensitivity and iteration methodology. Parameter sensitivity is a crucial factor, and the iteration methodology enables refinement of the simulated parameter values. During the iteration process, parameter values are determined in descending order of their sensitivities, and the number of iterations required is equal to the number of model parameters in the parameter significance ranking. This approach was applied successfully to the ASM2d model to evaluate enhanced biological phosphorus removal (EBPR). The simulation results provide the calibrated parameters, which include YPAO, YPO4, YPHA, qPHA, qPP, μPAO, bPAO, bPP, bPHA, KPS, YA, μAUT, bAUT, KO2,AUT, and KNH4,AUT. These parameters corresponded well to the available experimental data.
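The sensitivity-and-iteration procedure described above can be sketched in a few lines. This is a minimal illustration, not the authors' code: the model function, parameter names ("mu", "K"), candidate scaling factors, and data points are all invented placeholders standing in for an ASM2d simulation.

```python
# Sketch of sensitivity-ranked iterative calibration (illustrative only):
# rank parameters by a one-at-a-time sensitivity measure, then refine them
# one per iteration, in descending order of sensitivity.

def model(params, t):
    # Hypothetical stand-in for an ASM2d simulation output at time t.
    return params["mu"] * t / (params["K"] + t)

def sse(params, data):
    # Sum of squared errors against the measured data.
    return sum((model(params, t) - y) ** 2 for t, y in data)

def sensitivity(params, data, name, delta=0.01):
    # Relative change in the objective for a small relative parameter change.
    base = sse(params, data)
    perturbed = dict(params, **{name: params[name] * (1 + delta)})
    return abs(sse(perturbed, data) - base) / (base + 1e-12)

def calibrate(params, data, factors=(0.5, 0.75, 1.0, 1.25, 1.5)):
    # One iteration per parameter, in descending order of sensitivity;
    # the factor 1.0 is included, so the fit can never get worse.
    ranking = sorted(params, key=lambda n: sensitivity(params, data, n),
                     reverse=True)
    for name in ranking:
        best = min(factors, key=lambda f:
                   sse(dict(params, **{name: params[name] * f}), data))
        params[name] *= best
    return params, ranking

data = [(1.0, 0.45), (2.0, 0.70), (4.0, 0.95), (8.0, 1.10)]
fitted, ranking = calibrate({"mu": 1.0, "K": 1.0}, data)
```

The key property is that each iteration refines exactly one parameter, chosen by its position in the sensitivity ranking, mirroring the abstract's "one iteration per ranked parameter" scheme.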
Abstract: Historic preservation areas are extremely vulnerable to disasters because they are home to many vulnerable people and contain many closely spaced wooden houses. However, the narrow streets in these areas have historic meaning, which means that they cannot be widened and can become blocked easily during large disasters. Here, we describe our efforts to establish a methodology for planning evacuation routes in such historic preservation areas. In particular, this study aims to clarify the effectiveness of measures intended to secure two-way evacuation routes for vulnerable people during large disasters in a historic area preserved under the Cultural Properties Protection Law, Japan.
Abstract: This article proposes a new methodology to be used by SMEs (Small and Medium Enterprises) to characterize their performance in quality, highlighting weaknesses and areas for improvement. The methodology aims to identify the principal causes of quality problems and to help prioritize improvement initiatives. It is a self-assessment methodology intended to be easy to implement by companies with a low maturity level in quality. The methodology is organized in six different steps, which include gathering information about predetermined processes and subprocesses of quality management, defined based on the well-known Juran's trilogy for quality management (quality planning, quality control and quality improvement), and about predetermined result categories, defined based on the quality concept. A set of tools for data collection and analysis, such as interviews, flowcharts, process analysis diagrams and Failure Mode and Effects Analysis (FMEA), is used. The article also presents the conclusions obtained from the application of the methodology in two case studies.
Abstract: Information Technology (IT) projects are always accompanied by various risks, and because of the high rate of failure in such projects, managing risks in order to neutralize, or at least decrease, their effects on the success of the project is essential. In this paper, the fuzzy analytic hierarchy process (FAHP) is exploited as a risk evaluation methodology to prioritize and organize the risk factors faced in IT projects. A real case is studied: a project for the design and implementation of an integrated information system in a vehicle-producing company in Iran. The related risk factors are identified, and then expert qualitative judgments about these factors are acquired. These judgments are translated into fuzzy numbers and used as input to FAHP, which then ranks and prioritizes the risk factors in order to make project managers aware of the more important risks and enable them to adopt suitable measures to deal with these highly damaging risks.
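The judgment-to-fuzzy-number step described above can be illustrated with a simplified sketch. This is not the paper's full FAHP procedure: the linguistic scale, the triangular fuzzy numbers, the risk factor names, and the expert judgments are all invented, and centroid defuzzification stands in for a complete fuzzy extent analysis.

```python
# Simplified fuzzy risk ranking (illustrative only): expert linguistic
# judgments are mapped to triangular fuzzy numbers (TFNs), averaged across
# experts, defuzzified by the centroid, and the risks ranked by score.

TFN = {  # linguistic term -> triangular fuzzy number (l, m, u); invented scale
    "low":    (1.0, 2.0, 3.0),
    "medium": (3.0, 5.0, 7.0),
    "high":   (7.0, 9.0, 10.0),
}

def average_tfn(judgments):
    # Component-wise mean of the experts' triangular fuzzy numbers.
    ls, ms, us = zip(*(TFN[j] for j in judgments))
    n = len(judgments)
    return (sum(ls) / n, sum(ms) / n, sum(us) / n)

def centroid(tfn):
    # Centroid defuzzification of a triangular fuzzy number.
    l, m, u = tfn
    return (l + m + u) / 3.0

def rank_risks(expert_judgments):
    scores = {risk: centroid(average_tfn(js))
              for risk, js in expert_judgments.items()}
    return sorted(scores, key=scores.get, reverse=True), scores

judgments = {  # hypothetical IT-project risk factors, three experts each
    "scope creep":    ["high", "high", "medium"],
    "vendor lock-in": ["medium", "low", "medium"],
    "staff turnover": ["low", "low", "medium"],
}
ranked, scores = rank_risks(judgments)
```

In the paper's setting the fuzzy numbers would instead enter pairwise comparison matrices of a full FAHP; the sketch only shows how qualitative judgments become a crisp priority ordering.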
Abstract: This paper focuses on the robust design and optimization of industrial production wastes. Past literature was reviewed for a case study of Clamason Industries Limited (CIL), a leading ladder-tops manufacturer. A painstaking study of the firm's practices on the shop floor revealed that over-production, waiting time, excess inventory, and defects are the major wastes impeding its progress and profitability. Design-Expert 8 software was used to apply Taguchi robust design and response surface methodology in order to model, analyse and optimise the cost of wastes in CIL. Waiting time and over-production rank first and second in contributing to the cost of wastes in CIL. For minimal waste cost, the control factors of over-production, waiting time, defects and excess inventory must be set at 0.30, 390.70, 4 and 55.70 respectively for CIL. The optimal value of the cost of wastes for the months studied was 22.3679. Finally, it is recommended that, to enhance profitability and customer satisfaction, the company adopt Shigeo Shingo's Single Minute Exchange of Dies (SMED), which will immediately tackle the waste of waiting by drastically reducing setup times.
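Since waste cost is a response to be minimized, the Taguchi analysis behind results like these rests on the smaller-the-better signal-to-noise ratio, S/N = -10 log10(mean(y_i^2)). A minimal worked example with invented waste-cost observations (not CIL's data):

```python
# Smaller-the-better S/N ratio from Taguchi robust design: the factor
# setting with the HIGHER S/N ratio gives the lower, more robust response.
import math

def sn_smaller_the_better(responses):
    # S/N = -10 * log10(mean of squared responses)
    return -10.0 * math.log10(sum(y * y for y in responses) / len(responses))

# Hypothetical waste-cost observations under two factor settings.
setting_a = [22.0, 24.0, 23.0]
setting_b = [30.0, 35.0, 28.0]
```

Here setting_a, with the smaller and less variable costs, yields the higher S/N ratio and would be preferred.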
Abstract: This paper proposes an innovative methodology for Acceptance Sampling by Variables, a particular category of Statistical Quality Control dealing with the assurance of product quality. Our contribution lies in the exploitation of machine learning techniques to address the complexity and remedy the drawbacks of existing approaches. More specifically, the proposed methodology exploits Artificial Neural Networks (ANNs) to aid decision making about the acceptance or rejection of an inspected sample. For any type of inspection, the ANNs are trained with data from the corresponding tables of a standard's sampling plan schemes. Once trained, the ANNs can give closed-form solutions for any acceptance quality level and sample size, thus automating the reading of the sampling plan tables without any need to compromise with the values of the specific standard chosen each time. The proposed methodology provides quality control engineers with flexibility during the inspection of their samples, allowing specific needs to be taken into account, while it also reduces the time and cost required for these inspections. Its applicability and advantages are demonstrated through two numerical examples.
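The idea of training a network on sampling plan table entries can be sketched as follows. This is a toy illustration, not the paper's trained networks: the one-hidden-layer network, the learning rate, and the table excerpt (normalized sample size versus acceptance number) are all invented and do not reproduce any real standard's values.

```python
# Toy one-hidden-layer network fitted by batch gradient descent to a few
# invented sampling-plan table points, to show how an ANN can interpolate
# between table entries (illustrative only).
import math, random

random.seed(0)
# Invented table excerpt: normalized sample size -> acceptance number.
table = [(0.1, 0.0), (0.3, 1.0), (0.5, 2.0), (0.7, 3.0), (0.9, 5.0)]

H = 4  # hidden units
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    # tanh hidden layer, linear output.
    h = [math.tanh(w1[i] * x + b1[i]) for i in range(H)]
    return sum(w2[i] * h[i] for i in range(H)) + b2, h

def loss():
    # Mean squared error over the table entries.
    return sum((forward(x)[0] - y) ** 2 for x, y in table) / len(table)

lr = 0.05
initial = loss()
for _ in range(2000):  # plain batch gradient descent
    gw1 = [0.0] * H; gb1 = [0.0] * H; gw2 = [0.0] * H; gb2 = 0.0
    for x, y in table:
        out, h = forward(x)
        d = 2.0 * (out - y) / len(table)
        for i in range(H):
            gw2[i] += d * h[i]
            dh = d * w2[i] * (1.0 - h[i] ** 2)  # backprop through tanh
            gw1[i] += dh * x
            gb1[i] += dh
        gb2 += d
    for i in range(H):
        w1[i] -= lr * gw1[i]; b1[i] -= lr * gb1[i]; w2[i] -= lr * gw2[i]
    b2 -= lr * gb2
final = loss()
```

Once fitted, forward(x) answers queries for intermediate sample sizes that fall between table rows, which is the automation the abstract describes.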
Abstract: As a company's competitiveness depends more and more on its relationships with its stakeholders, the topic of company-stakeholder fit is becoming increasingly important. This fit affects the extent to which a stakeholder perceives the company's CSR commitment, values and behaviors and, therefore, the stakeholder's identification with the company and his/her loyalty to it. Consequently, it is important to measure the alignment, or the gap, between stakeholder CSR demands, values, preferences and perceptions and the company's disclosed CSR commitment, values and policies. In this paper, in order to assess the company-stakeholder fit regarding corporate responsibility, an innovative CSR fit positioning matrix is proposed. This matrix is based on the measurement of the company's disclosed CSR commitment and the stakeholder's perceived and required commitment. The matrix is part of a more complex methodology based on Global Reporting Initiative (GRI) indicators, content analysis and stakeholder questionnaires. This methodology provides appropriate indications for helping companies achieve CSR company-stakeholder fit by leveraging both CSR commitment and communication. Moreover, it could be used by top management for comparing different companies and stakeholders, and for planning specific CSR strategies, policies and activities.
Abstract: This paper analyzes the patterns of Monte Carlo data for a large number of variables and minterms in order to characterize circuit path length behavior. We propose models trained on the shortest path lengths derived from a wide range of binary decision diagram (BDD) simulations. The models were created using a feedforward neural network (NN) modeling methodology. Experimental results for the ISCAS benchmark circuits show an RMS error of 0.102 for the shortest-path-length complexity estimates predicted by the NN model (NNM). Use of such a model can help reduce the time complexity of very large scale integrated (VLSI) circuits and of related computer-aided design (CAD) tools that use BDDs.
Abstract: A lack of resources for road infrastructure financing is a problem that currently affects not only eastern European economies but also many other countries, especially in relation to the impact of the global financial crisis. In this context, we speak of the so-called short-investment problem, the result of a long-term lack of investment resources. Based on an analysis of road infrastructure financing in the Czech Republic, this article points out weaknesses of the current system and proposes a long-term planning methodology supported by a systems approach. Within this methodology, and using the created system dynamics model, the article predicts the development of the short-investment problem in the country and, in reaction to the downward trend of certain sources, presents various scenarios resulting from changes in the structure of financial sources. In the discussion, the article focuses more closely on the possible introduction of a tax on vehicles to replace taxes with declining revenue streams, and estimates its approximate rate in relation to reaching various solutions of the short-investment problem in time.
Abstract: As the number of mobile service subscribers increases, mobile content services are becoming more and more varied. Mobile content development therefore needs not only content design but also guidelines specific to mobile devices, and when mobile content is developed, it is important to work within the limits and restrictions of the mobile platform. The restrictions of mobile devices are a small browser and screen size, a limited download size and uncomfortable navigation. Therefore, guidelines for each type of mobile content are presented to support user usability, ease of development and consistency of rules. This paper proposes a methodology consisting of content-specific mobile guidelines, and a mobile web site is developed according to the proposed guidelines.
Abstract: Drilling is the most common machining operation, and it accounts for the highest machining cost in many manufacturing activities, including automotive engine production. The outcome of this operation depends upon many factors, including the use of a proper cutting tool geometry, the cutting tool material and the type of coating used to improve hardness and resistance to wear, as well as the cutting parameters. With the availability of a large array of tool geometries, materials and coatings, it has become a challenging task to select the best tool and cutting parameters that would result in the lowest machining cost or the highest profit rate. This paper describes an algorithm developed to help achieve good performance in drilling operations by automatically determining the proper cutting tools and cutting parameters. It also helps determine machining sequences resulting in a minimum of tool changes, which eventually reduces machining time and cost where multiple tools are used.
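One part of the sequencing idea described above can be sketched simply. This is an illustrative sketch, not the paper's algorithm: once a tool has been assigned to each hole, machining all holes that share a tool consecutively minimizes tool changes. The hole and tool names are invented.

```python
# Reorder drilling operations so holes using the same tool are machined
# consecutively, so each tool is mounted exactly once (illustrative only).

def sequence_by_tool(operations):
    # Group operations by their assigned tool, preserving first-seen order.
    order, groups = [], {}
    for hole, tool in operations:
        if tool not in groups:
            groups[tool] = []
            order.append(tool)
        groups[tool].append(hole)
    return [(hole, tool) for tool in order for hole in groups[tool]]

def count_tool_changes(sequence):
    # A tool change occurs whenever consecutive operations differ in tool.
    return sum(1 for a, b in zip(sequence, sequence[1:]) if a[1] != b[1])

# Hypothetical job: four holes alternating between two drill diameters.
ops = [("h1", "T8mm"), ("h2", "T5mm"), ("h3", "T8mm"), ("h4", "T5mm")]
resequenced = sequence_by_tool(ops)
```

On this example the naive order needs three tool changes, while the grouped order needs only one; a full implementation would also weigh cutting parameters and tooling cost, as the abstract notes.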
Abstract: The spiral development model has been used successfully in many commercial systems and in a good number of defense systems. This is due to its cost-effective incremental commitment of funds (via an analogy of the spiral model to stud poker) and to the fact that it can be used to develop hardware or to integrate software, hardware, and systems. To support adaptive, semantic collaboration between domain experts and knowledge engineers, a new knowledge engineering process, called Spiral_OWL, is proposed. This model is based on the idea of iterative refinement, annotation and structuring of the knowledge base. The Spiral_OWL model is generated based on the spiral model and on knowledge engineering methodology. A central paradigm of the Spiral_OWL model is its concentration on risk-driven determination of the knowledge engineering process. The collaboration aspect comes into play during the knowledge acquisition and knowledge validation phases. The design rationale for the Spiral_OWL model is an easy-to-implement, well-organized, iterative development cycle that expands like a spiral.
Abstract: This paper gives an overview of how an OWL ontology has been created to represent the template knowledge models defined in CML that are provided by CommonKADS. CommonKADS is a mature knowledge engineering methodology which proposes the use of template knowledge models for knowledge modelling. The aim of developing this ontology is to present the template knowledge models in a knowledge representation language that can be easily understood and shared in the knowledge engineering community. Hence OWL is used, as it has become a standard for ontologies and already has user-friendly tools for viewing and editing.
Abstract: Smoothing, or filtering, of data is the first preprocessing step for noise suppression in many applications involving data analysis. The moving average is the most popular method of smoothing data, and its generalization led to the development of the Savitzky-Golay filter. Many window smoothing methods have been developed by convolving the data with different window functions for different applications; the most widely used window functions are the Gaussian and the Kaiser windows. Function approximation of the data by polynomial regression, Fourier expansion or wavelet expansion also yields smoothed data, and wavelets smooth the data to a great extent by thresholding the wavelet coefficients. Almost all smoothing methods destroy peaks, flattening them as the support of the window is increased. In certain applications it is desirable to retain peaks while smoothing the data as much as possible. In this paper we present a methodology, called peak-wise smoothing, that smooths the data to any desired level without losing the major peak features.
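The peak-preserving idea can be illustrated with a minimal sketch, which is not the paper's exact algorithm: smooth with a plain moving average, then copy back the original samples at detected local maxima so that major peaks are not flattened. The signal values are invented.

```python
# Peak-preserving smoothing sketch (illustrative only): moving-average
# smoothing everywhere except at local maxima, which are retained as-is.

def moving_average(x, half_width):
    # Centered moving average with a shrinking window at the edges.
    n = len(x)
    out = []
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

def peak_wise_smooth(x, half_width=2):
    smoothed = moving_average(x, half_width)
    for i in range(1, len(x) - 1):
        if x[i] > x[i - 1] and x[i] > x[i + 1]:  # simple local maximum
            smoothed[i] = x[i]                   # retain the peak sample
    return smoothed

# Invented noisy signal with one sharp peak at index 4.
signal = [0.0, 0.1, 0.0, 0.2, 5.0, 0.1, 0.0, 0.1, 0.2, 0.0]
result = peak_wise_smooth(signal)
```

A plain moving average flattens the peak at index 4 to roughly a fifth of its height, while the peak-wise variant keeps it intact; a fuller method would also distinguish major peaks from noise spikes.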
Abstract: This paper presents the mathematical modeling of a single-input single-output Timoshenko smart beam. This mathematical model is then used to design a multirate-output-feedback-based discrete sliding mode controller using Bartoszewicz's law to suppress the flexural vibrations, with the first two dominant vibratory modes retained. Here, an application of discrete sliding mode control in smart systems is presented. The algorithm uses a fast-output-sampling-based sliding mode control strategy that avoids the use of switching in the control input and hence avoids chattering. This method does not need measurement of the system states for feedback, as it makes use of only the output samples for designing the controller. Thus, this methodology is more practical and easy to implement.
Abstract: In the modern telecommunications industry, demand and supply chain management (DSCM) needs reliable design and versatile tools to control the material flow. The objective of efficient DSCM is to reduce inventory, lead times and related costs in order to assure reliable and on-time deliveries from the manufacturing units towards the customers. In this paper, a multi-rate expert-system-based methodology is proposed for developing simulation tools that would enable optimal DSCM for a multi-region, high-volume and high-complexity manufacturing environment.
Abstract: The permanent magnet synchronous motor (PMSM) is useful in many applications, and vector control is a popular scheme for controlling it. In this paper, an optimal vector control for the PMSM is first designed and its results are compared with those of conventional vector control. It is then assumed that the measurements are noisy, and linear quadratic Gaussian (LQG) methodology is used to filter the noise. The results of the noisy optimal vector control and the filtered optimal vector control are compared with each other. The nonlinearity of the PMSM and the presence of an inverter in its control circuit make the system nonlinear and time-variant. By deriving an average model, the system becomes nonlinear but time-invariant, and the nonlinear system is then converted to a linear one by linearizing the model around the average values. This model is used to optimize the vector control, and the two optimal vector controls are compared with each other. Simulation results show that the performance of the control system and its robustness to noise are highly improved.
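The filtering half of the LQG methodology can be sketched with a scalar Kalman filter. This is a minimal illustration with invented values, not the paper's PMSM model: a noisy measurement of a hypothetical steady-state current is recursively filtered.

```python
# Scalar Kalman filter sketch (illustrative values only): recursively
# estimate a constant state from measurements corrupted by Gaussian noise.
import random

random.seed(1)
true_value = 2.0      # hypothetical steady-state current, in A
q, r = 1e-5, 0.25     # process and measurement noise variances (assumed)
x_hat, p = 0.0, 1.0   # initial state estimate and its variance

for _ in range(200):
    z = true_value + random.gauss(0.0, r ** 0.5)  # noisy measurement
    p = p + q                                     # predict (static model)
    k = p / (p + r)                               # Kalman gain
    x_hat = x_hat + k * (z - x_hat)               # measurement update
    p = (1.0 - k) * p                             # variance update
```

In the full LQG design this filter (a Kalman state estimator) is combined with a linear quadratic regulator acting on the estimated state rather than the noisy measurements.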
Abstract: In this research, the diffusion of innovation regarding smartphone usage is analysed through consumer behaviour theory. The research aims to determine whether a pattern surrounding the diffusion of innovation exists. As the methodology, an empirical study of the switch from a conventional cell phone to a smartphone was performed. Specifically, a questionnaire survey was completed by general consumers, and the situational and behavioural characteristics of switching from a cell phone to a smartphone were analysed. In conclusion, we found that the speed of the diffusion of innovation, the consumer behaviour characteristics, and the utilities of the product vary according to the stage of the product life cycle.
Abstract: The discrete choice model is the most widely used methodology for studying travelers' mode choice and demand. However, calibrating a discrete choice model requires extensive questionnaire surveys. In this study, an aggregative model is proposed instead. Historical data on passenger volumes for high speed rail and domestic civil aviation are employed to calibrate and validate the model, and different model forms are compared in order to propose the best one. The results show that systems of equations forecast better than a single equation does, and that models including the external variable, oil price, are better than models based on a closed-system assumption.
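For contrast with the aggregative approach, the discrete choice baseline mentioned above can be illustrated by a binary logit mode split. This is a minimal sketch with invented coefficients and fares, not the paper's calibrated model: utilities are linear in travel time and cost, and the choice probability follows the logistic function.

```python
# Binary logit mode-split sketch for rail versus air (illustrative only):
# P(rail) = exp(U_rail) / (exp(U_rail) + exp(U_air)).
import math

def utility(time_h, cost, beta_time=-0.8, beta_cost=-0.002):
    # Assumed linear utility; both coefficients are invented placeholders.
    return beta_time * time_h + beta_cost * cost

def mode_split(rail, air):
    # rail, air: (travel time in hours, fare) tuples.
    u_rail, u_air = utility(*rail), utility(*air)
    p_rail = math.exp(u_rail) / (math.exp(u_rail) + math.exp(u_air))
    return p_rail, 1.0 - p_rail

# Hypothetical corridor: rail is slower but much cheaper than air.
p_rail, p_air = mode_split(rail=(1.5, 900.0), air=(1.0, 1800.0))
```

Calibrating the two beta coefficients is exactly the step that demands the questionnaire surveys the abstract seeks to avoid, which motivates the aggregative model based on historical volumes.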