Abstract: A fault detection and identification (FDI) technique is
presented to create a fault tolerant control system (FTC). The fault
detection is achieved by monitoring the position of the light source
using an array of light sensors. When a decision is made about the
presence of a fault, an identification process is initiated to locate the
faulty component and reconfigure the controller signals. The signals
provided by the sensors are predictable; therefore the existence of a
fault is easily identified. Identification of the faulty sensor is based on
the dynamics of the frame. The technique is not restricted to a
particular type of controller, and the results show consistency.
Abstract: Interaction effects of xanthan gum (XG), carboxymethyl
cellulose (CMC), and locust bean gum (LBG) on the flow properties
of oil-in-water emulsions were investigated by a mixture design
experiment. Blends of XG, CMC and LBG were prepared according
to an augmented simplex-centroid mixture design (10 points) and used
at 0.5% (wt/wt) in the emulsion formulations. An appropriate
mathematical model was fitted to express each response as a function
of the proportions of the blend components, enabling empirical
prediction of the response to any combination of the
components. The synergistic interaction effect of the ternary
XG:CMC:LBG blends at approximately 33-67% XG levels was
shown to be much stronger than that of the binary XG:LBG blend at
50% XG level (p < 0.05). Nevertheless, an antagonistic interaction
effect became significant as CMC level in blends was more than 33%
(p < 0.05). Yield stress and apparent viscosity (at 10 s⁻¹) responses
were successfully fitted with a special quartic model, while flow
behaviour index and consistency coefficient were fitted with a full
quartic model (adjusted R² ≥ 0.90). This study found that a mixture
design approach could serve as a valuable tool in better elucidating
and predicting the interaction effects beyond the conventional
two-component blends.
Abstract: This paper presents a Simulated Annealing based
approach to estimating solar cell model parameters. A single-diode
solar cell model is used in this study to validate the outcomes of the
proposed approach. The developed technique is used to estimate different
model parameters such as generated photocurrent, saturation current,
series resistance, shunt resistance, and ideality factor that govern the
current-voltage relationship of a solar cell. A practical case study is
used to test and verify that the various parameters of the single-diode
solar cell model are estimated accurately and consistently. A comparative
study among different parameter estimation techniques is presented
to show the effectiveness of the developed approach.
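The combination described above — the implicit single-diode equation and a simulated-annealing search over its five parameters — can be sketched as follows. This is a minimal illustration under assumed values (the thermal voltage `VT`, the ±5% perturbation width, and the geometric cooling schedule are all assumptions), not the authors' implementation:

```python
import math
import random

VT = 0.02585  # thermal voltage at ~300 K (V), assumed constant here


def diode_residual(v, i, iph, i0, rs, rsh, n):
    """Residual of the implicit single-diode equation; zero for a consistent (V, I) pair.

    iph: photocurrent, i0: saturation current, rs: series resistance,
    rsh: shunt resistance, n: ideality factor.
    """
    return (iph
            - i0 * (math.exp((v + i * rs) / (n * VT)) - 1.0)
            - (v + i * rs) / rsh
            - i)


def cost(params, data):
    """Root-mean-square of the residuals over measured (V, I) points."""
    try:
        return math.sqrt(sum(diode_residual(v, i, *params) ** 2
                             for v, i in data) / len(data))
    except OverflowError:
        return float("inf")  # reject wildly wrong candidates


def anneal(data, start, steps=20000, t0=1.0, alpha=0.9995, rng=None):
    """Plain simulated annealing: multiplicative random perturbations,
    Metropolis acceptance, geometric cooling."""
    rng = rng or random.Random(0)
    cur, c_cur = list(start), cost(start, data)
    best, c_best = list(cur), c_cur
    t = t0
    for _ in range(steps):
        cand = [p * (1.0 + rng.uniform(-0.05, 0.05)) for p in cur]
        c_new = cost(cand, data)
        if c_new < c_cur or rng.random() < math.exp((c_cur - c_new) / t):
            cur, c_cur = cand, c_new
            if c_cur < c_best:
                best, c_best = list(cur), c_cur
        t *= alpha
    return best, c_best
```

Because the diode equation is implicit in the current, the sketch scores candidates by the residual of the equation itself rather than by an explicit I(V) evaluation, which avoids an inner root-finding loop at each annealing step.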
Abstract: The Artificial Bee Colony (ABC) algorithm is a relatively new swarm intelligence technique for clustering. It produces higher
quality clusters than other population-based algorithms, but suffers from poor energy efficiency, inconsistent cluster quality, and typically slower convergence. Inspired by the energy-saving foraging behavior of natural honey bees, this paper presents a Quality and Quantity Aware Artificial Bee Colony (Q2ABC) algorithm to improve the cluster quality, energy efficiency and convergence speed of the original ABC. To evaluate the performance of the Q2ABC algorithm, experiments were conducted on a suite of ten benchmark UCI datasets. The results demonstrate that Q2ABC outperformed the ABC and K-means algorithms in the quality of the clusters delivered.
Abstract: The effect of an artificial pozzolan (waste brick) on the
physico-chemical properties of manufactured cement was
investigated. The waste brick is a by-product of brick manufacturing.
It was used in proportions of 0%, 5%, 10%, 15% and 20%
by mass of cement to study its effect on the physico-chemical
properties of cement incorporating the artificial pozzolan. The
physico-chemical properties of the cement in the anhydrous and hydrated
states (chemical composition, specific weight, fineness, consistency of
the cement paste and setting times) were studied. The experimental
results show that the quantity of pozzolanic admixture
(waste brick) in the manufactured cement is the principal parameter
influencing the variation of the physico-chemical properties of the
tested cement.
Abstract: This work concerns the evolution and maintenance
of an ontological resource in relation to the evolution of the corpus
of texts from which it was built.
The knowledge forming a text corpus, especially in dynamic domains,
is in continuous evolution. When a change in the corpus occurs, the
domain ontology must evolve accordingly. Most methods manage
ontology evolution independently from the corpus from which it is
built; in addition, they treat evolution just as a process of knowledge
addition, not considering other knowledge changes. We propose a
methodology for managing an evolving ontology from a text corpus
that evolves over time, while preserving the consistency and the
persistence of this ontology.
Our methodology is based on the changes made to the corpus to
reflect the evolution of the considered domain (augmented surgery
in our case). In this context, the results of text mining techniques,
as well as the ARCHONTE method slightly modified, are used to
support the evolution process.
Abstract: We introduce a logic-based framework for database
updating under constraints. In our framework, the constraints are
represented as an instantiated extended logic program. When performing
an update, database consistency may be violated. We provide
an approach to maintaining database consistency, and study the
conditions under which the maintenance process is deterministic. We
show that the complexity of the computations and decision problems
presented in our framework is in each case polynomial.
Abstract: The present work analyses different parameters of pressure die casting to minimize casting defects. Pressure die casting is usually applied for casting aluminium alloys. Good surface finish with the required tolerances and dimensional accuracy can be achieved by optimizing controllable process parameters such as solidification time, molten temperature, filling time, injection pressure and plunger velocity. Moreover, by selecting optimum process parameters, pressure die casting defects such as porosity, insufficient spread of molten material and flash are also minimized. Therefore, a pressure die cast component, a carburetor housing of aluminium alloy (Al2Si2O5), has been considered. The effects of the selected process parameters on casting defects and the subsequent setting of parameter levels have been accomplished by Taguchi's parameter design approach. The experiments have been performed as per the combinations of levels of the different process parameters suggested by an L18 orthogonal array. Analyses of variance have been performed for the mean and the signal-to-noise ratio to estimate the percent contribution of the different process parameters. A confidence interval has also been estimated at the 95% confidence level, and three confirmation experiments have been performed to validate the optimum levels of the different parameters. An overall 2.352% reduction in defects has been observed with the suggested optimum process parameters.
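The signal-to-noise analysis at the core of Taguchi's parameter design can be sketched as follows. This is a generic illustration with made-up defect counts and a single hypothetical factor (injection pressure), not the paper's L18 experiment:

```python
import math


def sn_smaller_is_better(ys):
    """Taguchi signal-to-noise ratio for a 'smaller is better' response,
    e.g. number of casting defects: SN = -10 * log10(mean(y^2))."""
    return -10.0 * math.log10(sum(y * y for y in ys) / len(ys))


# Hypothetical replicated defect counts at two injection-pressure levels.
runs = {
    "pressure_low":  [12.0, 14.0, 11.0],
    "pressure_high": [7.0, 6.0, 8.0],
}

# Average S/N per level; the level with the HIGHER S/N is preferred.
sn = {level: sn_smaller_is_better(ys) for level, ys in runs.items()}
best_level = max(sn, key=sn.get)
```

In a full Taguchi study the same per-level S/N averaging is repeated for every factor of the orthogonal array, and analysis of variance on the S/N values then apportions the percent contribution of each factor.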
Abstract: UML modeling of complex distributed systems is often a great challenge due to the large number of parallel real-time operating components. In this paper, the problems of verification of such systems are discussed. ECPN, an Extended Colored Petri Net, is defined to formally describe state transitions of components and interactions among components. The relationship between sequence diagrams and Free Choice Petri Nets is investigated; Free Choice Petri Net theory helps verify the liveness of sequence diagrams. By converting sequence diagrams to ECPNs and then comparing the behaviors of the sequence diagram ECPNs and statecharts, the consistency among models is analyzed. Finally, a verification process for an example model is demonstrated.
Abstract: Large-scale systems such as computational Grids are
distributed computing infrastructures that can provide globally
available network resources. The evolution of information processing
systems in Data Grids is characterized by a strong decentralization of
data across several sites, whose objective is to ensure the availability
and reliability of the data in order to provide fault tolerance
and scalability, which is not possible without the use of
replication techniques. Unfortunately, the use of these techniques
has a high cost, because consistency must be maintained
between the distributed data. Nevertheless, agreeing to live with
certain imperfections can improve the performance of the system by
improving concurrency. In this paper, we propose a multi-layer protocol
combining the pessimistic and optimistic approaches, conceived
for data consistency maintenance in large-scale systems. Our
approach is based on a hierarchical representation model with three
layers, which serves a dual purpose: it first makes
it possible to reduce response times compared to a completely
pessimistic approach, and second to improve the quality
of service compared to an optimistic approach.
Abstract: On the basis of Bayesian inference using the
maximizer of the posterior marginal (MPM) estimate, we carry out phase
unwrapping using multiple interferograms via generalized mean-field
theory. For numerical calculations on a typical wave-front in remote
sensing using synthetic aperture radar interferometry, the phase
diagram in hyper-parameter space clarifies that the present method
succeeds in phase unwrapping perfectly under the constraint of the
surface-consistency condition, if the interferograms are not corrupted
by any noise. We also find that the prior is useful for extending the
phase region in which phase unwrapping succeeds under the constraint
of the surface-consistency condition. These results are quantitatively
confirmed by Monte Carlo simulation.
Abstract: This paper presents the development of a hybrid
thermal model for the EVO Electric AFM 140 Axial Flux Permanent
Magnet (AFPM) machine as used in hybrid and electric vehicles. The
adopted approach is based on a hybrid lumped parameter and finite
difference method. The proposed method divides each motor
component into regular elements which are connected together in a
thermal resistance network representing all the physical connections
in all three dimensions. The element shape and size are chosen
according to the component geometry to ensure consistency. The
fluid domain is lumped into one region with averaged heat transfer
parameters connecting it to the solid domain. Some model parameters
are obtained from Computational Fluid Dynamics (CFD) simulation and
empirical data. The hybrid thermal model is described by a set of
coupled linear first-order differential equations, which is discretised
and solved iteratively to obtain the temperature profile. The
computational cost is low and thus the model is suitable for
transient temperature predictions. The maximum error in temperature
prediction is 3.4% and the mean error is consistently lower than the
mean error due to uncertainty in measurements. The details of the
model development, temperature predictions and suggestions for
design improvements are presented in this paper.
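The lumped-parameter side of such a model — node temperatures coupled through a thermal resistance network and advanced by explicit time stepping — can be sketched as follows. This is a minimal three-node illustration with assumed capacitances, resistances and heat inputs, not the EVO AFM 140 model (which is solved iteratively rather than by plain explicit stepping):

```python
def step_temperatures(T, C, R, Q, dt):
    """One explicit-Euler step of a lumped thermal network.

    T: node temperatures (K or degC), C: thermal capacitances (J/K),
    R: R[i][j] thermal resistance between nodes i and j (K/W),
       None where nodes are not connected,
    Q: internal heat generation per node (W).
    Governing equations: C_i * dT_i/dt = Q_i + sum_j (T_j - T_i) / R_ij.
    """
    n = len(T)
    T_next = list(T)
    for i in range(n):
        flow = Q[i]
        for j in range(n):
            if i != j and R[i][j] is not None:
                flow += (T[j] - T[i]) / R[i][j]
        T_next[i] = T[i] + dt * flow / C[i]
    return T_next


# Assumed example: winding (heated) -> housing -> ambient (huge capacitance).
T = [20.0, 20.0, 20.0]
C = [50.0, 200.0, 1e12]
R = [[None, 0.5, None],
     [0.5, None, 1.0],
     [None, 1.0, None]]
Q = [10.0, 0.0, 0.0]
```

The explicit scheme is stable only while `dt` stays small relative to the smallest `R*C` product in the network; the implicit/iterative solution used in the paper removes that restriction for stiff networks.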
Abstract: The driving forces of international markets are changing
continuously; therefore, companies need to gain a competitive edge in
such markets. Improving the company's products, processes and
practices is no longer auxiliary. Lean production is a production
management philosophy that consolidates work tasks with minimum
waste resulting in improved productivity. Lean production practices
can be mapped into many production areas. One of these is
Manufacturing Equipment and Technology (MET). Many lean
production practices can be implemented in MET, namely, specific
equipment configurations, total preventive maintenance, visual
control, new equipment/ technologies, production process
reengineering, and a shared vision of perfection. The purpose of this
paper is to investigate the implementation level of these six practices
in Jordanian industries. To achieve this, a questionnaire survey has
been designed according to a five-point Likert scale. The questionnaire
was validated through a pilot study and expert review. A sample
of 350 Jordanian companies was surveyed, with a response rate of
83%. The respondents were asked to rate the extent of
implementation of each of the practices. A conceptual relationship
model is developed, hypotheses are proposed, and consequently the
essential statistical analyses are then performed. An assessment tool
that enables management to monitor the progress and
effectiveness of lean practice implementation is designed and
presented. The results show that the average
implementation level of lean practices in MET is 77%, that Jordanian
companies are successfully implementing the considered lean
production practices, and that the presented model has a Cronbach's alpha
value of 0.87, which is good evidence of model consistency and
validates the results.
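The internal-consistency statistic reported above can be computed directly from item-level survey scores. A minimal sketch with hypothetical Likert responses (not the survey's data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a questionnaire scale.

    items: one list per item (question), each holding the scores of the
    same respondents in the same order.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores).
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col[r] for col in items) for r in range(n)]
    return k / (k - 1) * (1.0 - sum(var(col) for col in items) / var(totals))
```

Values near 1 indicate that the items move together (high internal consistency); 0.87, as reported for the model above, is conventionally regarded as good.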
Abstract: As the number of mobile service subscribers increases,
mobile content services are becoming more and more varied. Mobile
content development therefore needs not only content design but also
guidelines specific to mobile. When mobile content is developed, it
is important to overcome the limits and restrictions of the mobile
platform: small browsers and screen sizes, limited
download sizes and uncomfortable navigation. Guidelines for each type
of mobile content are therefore presented, addressing usability, ease of
development and consistency of rules. This paper proposes a
methodology consisting of guidelines for each type of mobile content,
and a mobile web site is developed following the proposed guidelines.
Abstract: We constructed a method of phase unwrapping for a typical wave-front by utilizing the maximizer of the posterior marginal (MPM) estimate corresponding to equilibrium statistical mechanics of the three-state Ising model on a square lattice on the basis of an analogy between statistical mechanics and Bayesian inference. We investigated the static properties of an MPM estimate from a phase diagram using Monte Carlo simulation for a typical wave-front with synthetic aperture radar (SAR) interferometry. The simulations clarified that the surface-consistency conditions were useful for extending the phase where the MPM estimate was successful in phase unwrapping with a high degree of accuracy and that introducing prior information into the MPM estimate also made it possible to extend the phase under the constraint of the surface-consistency conditions with a high degree of accuracy. We also found that the MPM estimate could be used to reconstruct the original wave-fronts more smoothly, if we appropriately tuned hyper-parameters corresponding to temperature to utilize fluctuations around the MAP solution. Also, from the viewpoint of statistical mechanics of the Q-Ising model, we found that the MPM estimate was regarded as a method for searching the ground state by utilizing thermal fluctuations under the constraint of the surface-consistency condition.
Abstract: The aim of this paper is to investigate the influence of
market share and diversification on the performance of nonlife insurers.
The underlying relationships have been investigated in different
industries and different disciplines (economics, management, etc.); still,
no consistency exists in either the magnitude or the statistical
significance of the relationship between market share (and
diversification as well) on one side and companies' performance on
the other. Moreover, the direction of the relationship is also
somewhat questionable: while some authors find this relationship to
be positive, others reveal a negative association. In order to test
the influence of market share and diversification on companies'
performance in the Croatian nonlife insurance industry for the period
from 1999 to 2009, we designed an empirical model in which we
included the following independent variables: firms' profitability
from previous years, market share, diversification, and control
variables (i.e. ownership, industrial concentration, GDP per capita,
inflation). Using the two-step generalized method of moments
(GMM) estimator, we found evidence of a positive and statistically
significant influence of both market share and diversification on
insurers' profitability.
Abstract: One important problem in today's organizations is the
existence of non-integrated information systems, inconsistency, and a
lack of suitable correlation between legacy and modern systems.
One main solution is to transfer the local databases into a global one.
In this regard, we need to extract the data structures from the legacy
systems and integrate them with the new technology systems. In
legacy systems, huge amounts of data are stored in legacy
databases. They require particular attention since they need more
efforts to be normalized, reformatted and moved to the modern
database environments. Designing the new integrated (global)
database architecture and applying the reverse engineering requires
data normalization. This paper proposes the use of database reverse
engineering in order to integrate legacy and modern databases in
organizations. The suggested approach consists of methods and
techniques for generating data transformation rules needed for the
data structure normalization.
Abstract: The present paper conceptualizes the technique of
release consistency, which is inseparable from the concept of
user-defined synchronization. A programming model built on
objects and classes is illustrated and demonstrated. The essence of
the paper is phases, events and parallel computing execution. The
technique by which values written to shared variables are made visible is
implemented. The second part of the paper consists of the implementation
of user-defined high-level synchronization primitives and a system
architecture with memory protocols. Techniques that are central to
deciding on validating and invalidating a stale page are also
proposed.
Abstract: In this paper, we propose a method for detecting consistency violations between UML state machine diagrams and communication diagrams using Alloy. Using the input language of Alloy, the proposed method expresses the system behaviors described by state machine diagrams, the message sequences described by communication diagrams, and a consistency property. As a result of applying the method to an example system, we confirmed that consistency violations could be detected correctly using Alloy.
Abstract: The mixed oxide nuclear fuel (MOX) of U and Pu contains several percent of fission products and minor actinides, such as neptunium, americium and curium. It is important to accurately determine the decay heat from curium isotopes, as they contribute significantly to it in MOX fuel. This heat generation can cause samples to melt very quickly if excessive quantities of curium are present. In the present paper, we introduce a new approach that can predict the decay heat from curium isotopes. This work is part of a project funded by King Abdulaziz City for Science and Technology (KACST) under the Long-Term Comprehensive National Plan for Science, Technology and Innovation, and takes place at King Abdulaziz University (KAU), Saudi Arabia. The approach is based on the numerical solution of the coupled linear differential equations that describe the decays and buildups of many nuclides, in order to calculate the decay heat produced after shutdown. The results show the consistency and reliability of the approach.
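The coupled linear decay/buildup equations described above (the Bateman equations, here for a simple linear chain) can be integrated numerically as in the following sketch. The decay constants, amounts and energies are assumed illustrative values, not the project's curium data, and a plain explicit-Euler step stands in for whatever solver the paper uses:

```python
def decay_chain(n0, lams, t, steps=100000):
    """Integrate dN_i/dt = -lam_i * N_i + lam_{i-1} * N_{i-1}
    for a linear decay chain with explicit Euler.

    n0: initial nuclide amounts (atoms), lams: decay constants (1/s),
    t: total time (s). Returns the amounts at time t.
    """
    dt = t / steps
    n = list(n0)
    for _ in range(steps):
        rates = [lams[i] * n[i] for i in range(len(n))]  # activities
        for i in range(len(n)):
            feed = rates[i - 1] if i > 0 else 0.0  # buildup from parent
            n[i] += dt * (feed - rates[i])
    return n


def decay_heat(n, lams, energies):
    """Total decay heat: sum of activity * recoverable energy per decay."""
    return sum(l * x * e for l, x, e in zip(lams, n, energies))
```

For a two-member chain this can be checked against the closed-form Bateman solution, which is what the test below does; production codes handle branching chains and many more nuclides, but the structure of the linear system is the same.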