Abstract: Test automation allows difficult and time-consuming
manual software testing tasks to be performed efficiently, quickly, and
repeatedly. However, the development and maintenance of automated
tests is expensive, so proper prioritization of what to automate first is
needed. This paper describes a simple yet efficient approach to such
prioritization of test cases based on the effort needed for both manual
execution and software test automation. The suggested approach is
very flexible because it allows working with a variety of assessment
methods, and adding or removing new candidates at any time. The
theoretical ideas presented in this article have been successfully
applied in real-world situations in several software companies by the
authors and their colleagues, including the testing of real estate websites,
cryptographic and authentication solutions, and an OSGi-based middleware
framework deployed in various systems for smart homes, connected cars,
production plants, sensors, home appliances, car head units and engine
control units (ECUs), vending machines, medical devices, industrial
equipment, and other devices that either contain or are connected to an
embedded service gateway.
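The prioritization idea described above can be sketched in a few lines. The scoring formula below (manual effort saved per run, times expected runs, divided by one-off automation effort) and the example test cases are assumptions for illustration, not the authors' published method:

```python
# Hypothetical prioritization sketch: rank automation candidates by the
# ratio of cumulative manual-execution effort saved to automation effort.
def prioritize(candidates):
    """candidates: list of (name, manual_hours_per_run, expected_runs, automation_hours)."""
    def score(c):
        name, manual, runs, automation = c
        return (manual * runs) / automation  # hours saved per hour invested
    return sorted(candidates, key=score, reverse=True)

tests = [
    ("login smoke", 0.5, 200, 8),    # frequent, cheap to automate
    ("report export", 2.0, 10, 40),  # rare, expensive to automate
    ("payment flow", 1.0, 100, 25),
]
ranked = prioritize(tests)
print([name for name, *_ in ranked])  # → ['login smoke', 'payment flow', 'report export']
```

Because candidates are scored independently, new candidates can be added or removed at any time without re-assessing the rest, which matches the flexibility claimed above.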
Abstract: Bottom ash from Municipal Solid Waste Incineration
(MSWI) can be viewed as a typical granular material because these
industrial by-products result from the incineration of various
domestic wastes. MSWI bottom ash is mainly used in road
engineering as a substitute for traditional natural aggregates. As the
characterization of their mechanical behavior is essential for their use,
specific studies have been conducted over the past few years. In
the first part of this paper, the mechanical behavior of MSWI bottom
ash is studied with triaxial tests. After analysis of the experimental
results, the simulation of triaxial tests is carried out using the
software package CESAR-LCPC. As a first approach to modeling
this new class of material, the Mohr-Coulomb model was chosen to
describe the evolution of the material under the influence of external
mechanical actions.
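The Mohr-Coulomb criterion mentioned above relates shear strength on the failure plane to normal stress. A minimal sketch follows; the cohesion and friction-angle values are placeholders, not the paper's measured MSWI bottom-ash properties:

```python
import math

# Mohr-Coulomb failure criterion: tau_f = c + sigma_n * tan(phi).
# Parameter values below are illustrative placeholders only.
def mohr_coulomb_shear_strength(sigma_n, cohesion, friction_angle_deg):
    """Shear strength on the failure plane for normal stress sigma_n (same units as cohesion)."""
    return cohesion + sigma_n * math.tan(math.radians(friction_angle_deg))

tau_f = mohr_coulomb_shear_strength(sigma_n=100.0, cohesion=10.0, friction_angle_deg=35.0)
print(round(tau_f, 1))  # → 80.0 (kPa, for these placeholder inputs)
```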
Abstract: Maintenance and design engineers are greatly concerned
with the functioning of rotating machinery because of vibration.
Improper functioning in rotating machinery often originates from
damage to rolling element bearings, and advanced technologies are
required to monitor bearing health efficiently and effectively.
Avoiding vibration under running conditions is a complicated
process, so vibration monitoring should be carried out with suitable
sensors/transducers to recognize the level of bearing damage during
machine operation. Various issues arising in rotating systems are
interlinked with bearing faults. This paper presents an approach for
fault diagnosis of bearings using neural networks and
time/frequency-domain vibration analysis.
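The frequency-domain side of such an analysis typically amounts to locating characteristic spectral lines in the vibration signal. The toy example below recovers the dominant frequency from a simulated signal; the frequencies and amplitudes are made up for illustration and are not from the paper:

```python
import numpy as np

# Toy frequency-domain vibration analysis: find the dominant spectral line
# in a simulated signal (all frequencies/amplitudes are invented).
fs = 1000.0                               # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
signal = (np.sin(2 * np.pi * 50 * t)            # shaft-rotation component
          + 0.5 * np.sin(2 * np.pi * 157 * t)   # bearing-fault component
          + 0.1 * np.random.default_rng(0).standard_normal(t.size))

spectrum = np.abs(np.fft.rfft(signal))    # magnitude spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant = freqs[np.argmax(spectrum)]
print(dominant)  # → 50.0 (Hz)
```

In a diagnosis pipeline, spectral features like these peak locations and magnitudes would then be fed to the neural network classifier.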
Abstract: The cochlear implant (CI), whose implantation became a
routine procedure over the last decades, is an electronic device that
provides a sense of sound for patients who are severely or profoundly deaf.
The optimal success of this implantation depends on the electrode
technology and deep insertion techniques. However, this manual
insertion procedure may cause mechanical trauma which can lead to
severe destruction of the delicate intracochlear structure.
Accordingly, future improvement of cochlear electrode insertion
requires reducing the excessive forces applied during implantation,
which cause tissue damage and trauma.
This study examined the tool-tissue interaction of a large prototype-scale
digit embedded with a distributive tactile sensor, based upon a cochlear
electrode, together with a large prototype-scale cochlea phantom
simulating the human cochlea, which could inform small-scale digit
requirements. The digit, with distributive tactile sensors embedded in a
silicon substrate, was inserted into the cochlea phantom to measure
digit/phantom interaction and the position of the digit, in order to
minimize tissue damage and trauma during cochlear electrode insertion.
The digit provided tactile information from the digit-phantom insertion
interaction, such as contact status, tip penetration, obstacles, relative
shape and location, contact orientation, and multiple contacts. The tests
demonstrated that even devices of such a relatively simple, low-cost
design have the potential to improve cochlear implant surgery and other
lumen mapping applications by providing tactile sensory feedback and
thus controlling the insertion through sensing and control of the tip of
the implant. With this approach, the surgeon could minimize tissue
damage and potential damage to the delicate structures within the
cochlea caused by current manual electrode insertion. The approach can
also be applied to other minimally invasive surgery applications, as well
as diagnosis and path navigation procedures.
Abstract: The manufacturing process is considered one of the
most important activities in a business process. It correlates with the
productivity and quality of the product, enabling industries to fulfill
customer demand. With increasing customer demand, industries must
improve their manufacturing capabilities, such as shortening lead times
and reducing waste in their processes. Lean manufacturing is considered
one of the main tools for waste elimination in the manufacturing and
service industries. Workforce development is one practice in lean
manufacturing that can reduce waste generated by operators, such as the
waste of unnecessary motion. An anthropometric approach is proposed
to determine recommended measurements for the operator's work area.
The method takes dimensions from Indonesian people related to the
piston workstation. The result of this research is a new design for the
work area that considers ergonomic aspects.
Abstract: Background modeling and subtraction in video
analysis has been widely used as an effective method for moving
objects detection in many computer vision applications. Recently, a
large number of approaches have been developed to tackle different
types of challenges in this field. However, dynamic backgrounds and
illumination variations are the most frequently occurring problems in
practice. This paper presents a two-layer model based on the codebook
algorithm incorporating the local binary pattern (LBP) texture measure,
targeted at handling dynamic background and illumination variation
problems. More specifically, the first layer is built on a block-based
codebook combined with an LBP histogram and the mean value of each
RGB color channel. Because the LBP features are invariant with respect
to monotonic gray-scale changes, this layer can produce block-wise
detection results with considerable tolerance of illumination variations.
A pixel-based codebook is then employed to refine the output of the first
layer and further eliminate false positives. As a result, the proposed
approach can greatly improve accuracy under dynamic background and
illumination changes.
Experimental results on several popular background subtraction
datasets demonstrate very competitive performance compared to
previous models.
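The LBP texture measure at the heart of the first layer can be sketched very compactly: each pixel is encoded by thresholding its eight neighbours against its own value. The code below is a minimal single-pixel sketch, not the paper's block-histogram implementation:

```python
# Minimal 8-neighbour local binary pattern (LBP) for one interior pixel.
def lbp_code(img, r, c):
    center = img[r][c]
    # clockwise neighbour offsets, starting at the top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if img[r + dr][c + dc] >= center:  # threshold neighbour against center
            code |= 1 << bit
    return code

img = [[52, 60, 49],
       [53, 50, 51],
       [48, 40, 50]]
print(lbp_code(img, 1, 1))  # → 155
```

Note that adding any constant to every pixel leaves the code unchanged, which is exactly the monotonic gray-scale invariance that gives the first layer its tolerance to illumination variations.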
Abstract: The River Hindon is an important river catering to the
demands of the highly populated rural and industrial clusters of western
Uttar Pradesh, India. The water quality of the river Hindon is deteriorating at
an alarming rate due to various industrial, municipal and agricultural
activities. The present study aimed at identifying the pollution
sources and quantifying the degree to which these sources are
responsible for the deteriorating water quality of the river. Various
water quality parameters, like pH, temperature, electrical
conductivity, total dissolved solids, total hardness, calcium, chloride,
nitrate, sulphate, biological oxygen demand, chemical oxygen
demand, and total alkalinity were assessed. Water quality data
obtained from eight study sites over one year was subjected to two
multivariate techniques, namely principal component analysis and
cluster analysis. Principal component analysis was applied with the aim
of assessing spatial variability and identifying the sources responsible
for the water quality of the river. Three varifactors were obtained after
varimax rotation of the initial principal components. Cluster analysis
was carried out to group sampling stations with similar characteristics,
which grouped the eight different sites into two clusters. The study reveals that the
anthropogenic influence (municipal, industrial, waste water and
agricultural runoff) was the major source of river water pollution.
Thus, this study illustrates the utility of multivariate statistical
techniques for analysis and elucidation of multifaceted data sets,
recognition of pollution sources/factors and understanding
temporal/spatial variations in water quality for effective river water
quality management.
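The PCA step used above can be sketched on a synthetic (samples x parameters) matrix; the data below are simulated with one dominant latent pollution factor, and the varimax rotation is omitted. None of this is the study's actual water-quality data:

```python
import numpy as np

# PCA sketch on a synthetic water-quality matrix (40 samples x 3 parameters)
# driven by one latent "pollution" factor; real data and varimax are omitted.
rng = np.random.default_rng(1)
latent = rng.standard_normal((40, 1))                # one dominant factor
X = np.hstack([latent * w for w in (1.0, 0.8, 0.6)]) \
    + 0.05 * rng.standard_normal((40, 3))            # small measurement noise

Xc = X - X.mean(axis=0)                              # centre each parameter
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)    # principal components
explained = S**2 / np.sum(S**2)                      # explained-variance ratios
print(explained.round(3))  # first component dominates
```

A single dominant component like this is how a shared source (here, the simulated anthropogenic factor) shows up in the analysis.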
Abstract: A physical model for guiding waves in photorefractive
media is studied. The propagation of a cos-Gaussian beam, a special
case of sinusoidal-Gaussian beams, in a photorefractive crystal is
simulated numerically in one dimension using the Crank-Nicolson
method. Results show that the beam profile deforms as energy
transfers from the center to the tails during propagation. This
simulation approach is of significant interest for applications in optical
telecommunications. The results are presented graphically and
discussed.
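A minimal Crank-Nicolson sketch for linear one-dimensional paraxial propagation of a cos-Gaussian beam is given below. The photorefractive response is omitted, and all grid and beam parameters are illustrative rather than taken from the paper; the sketch only demonstrates the numerical scheme:

```python
import numpy as np

# Crank-Nicolson for linear 1-D paraxial propagation, i dE/dz = -(1/(2k)) d2E/dx2,
# applied to a cos-Gaussian input beam. All parameter values are illustrative.
N, L, k, dz = 200, 20.0, 10.0, 0.05
dx = L / N
x = np.linspace(-L / 2, L / 2, N)
E = np.cos(2 * x) * np.exp(-x**2)            # cos-Gaussian input profile

# second-derivative matrix with Dirichlet boundaries
D2 = (np.diag(np.ones(N - 1), -1) - 2 * np.eye(N) + np.diag(np.ones(N - 1), 1)) / dx**2
A = 1j * dz / (4 * k) * D2                   # half-step propagation operator
M_left, M_right = np.eye(N) - A, np.eye(N) + A

power0 = np.sum(np.abs(E) ** 2)
for _ in range(50):                          # advance 50 steps in z
    E = np.linalg.solve(M_left, M_right @ E)
print(round(float(np.sum(np.abs(E) ** 2) / power0), 6))  # → 1.0 (scheme conserves power)
```

The scheme is a Cayley transform of a Hermitian operator, so it conserves beam power to round-off, one of the main reasons Crank-Nicolson is a common choice for beam-propagation problems.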
Abstract: Wavelength Division Multiplexing (WDM)
technology is the most promising technology for the proper
utilization of the huge raw bandwidth provided by an optical fiber. One
of the key problems in implementing the all-optical WDM network is
the packet contention. This problem can be solved by several
different techniques. In the time-domain approach, packet contention
can be reduced by incorporating Fiber Delay Lines (FDLs) as optical
buffers in the switch architecture. Different types of buffering
architectures have been reported in the literature. In the present paper, a
comparative performance analysis of the three most popular FDL
architectures is presented in order to determine the best contention
resolution performance. The analysis is further extended to consider
the effect of different fiber non-linearities on the network
performance.
Abstract: Online measurement of the product quality is a
challenging task in cement production, especially in the production of
Celitement, a novel environmentally friendly hydraulic binder. The
mineralogy and chemical composition of clinker in ordinary Portland
cement production is measured by X-ray diffraction (XRD) and
X-ray fluorescence (XRF), where only crystalline constituents can be
detected. However, only a small part of the Celitement components can
be measured via XRD, because most constituents have an amorphous
structure. This paper describes the development of algorithms
suitable for an on-line monitoring of the final processing step of
Celitement based on NIR data. For calibration, intermediate products
were dried at different temperatures and ground for varying durations.
The products were analyzed using XRD and thermogravimetric analysis
together with NIR spectroscopy to investigate the dependency between
the drying and milling processes on the one hand and the NIR signal on
the other. As a result, different characteristic parameters have been
defined. A short
overview of the Celitement process and the challenging tasks of the
online measurement and evaluation of the product quality will be
presented. Subsequently, methods for systematic development of
near-infrared calibration models and the determination of the final
calibration model will be introduced. The application of the model on
experimental data illustrates that NIR-spectroscopy allows for a quick
and sufficiently exact determination of crucial process parameters.
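At its core, such a calibration model maps a measured spectrum to a process parameter. The sketch below fits a plain least-squares calibration on simulated spectra; the data are synthetic, and the abstract does not specify which regression method Celitement's models actually use:

```python
import numpy as np

# Toy NIR calibration sketch: fit a linear model from simulated spectra to a
# process parameter and predict it for a new sample. Data are synthetic,
# not Celitement measurements.
rng = np.random.default_rng(2)
n_samples, n_wavelengths = 200, 50
true_coef = np.zeros(n_wavelengths)
true_coef[[10, 25]] = [0.8, -0.4]                    # two informative bands

X = rng.standard_normal((n_samples, n_wavelengths))  # calibration spectra
y = X @ true_coef + 0.01 * rng.standard_normal(n_samples)  # reference values

coef, *_ = np.linalg.lstsq(X, y, rcond=None)         # least-squares calibration
x_new = rng.standard_normal(n_wavelengths)           # spectrum of a new sample
err = float(x_new @ coef - x_new @ true_coef)        # prediction error
print(round(err, 3))  # close to zero for this well-determined toy problem
```

Real NIR calibrations typically have far more wavelengths than samples, which is why latent-variable methods such as PLS are common; the least-squares version above is only a minimal stand-in.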
Abstract: The question of legal liability for injury arising out
of the import and introduction of GM food emerges as a crucial
issue confronting the promotion of GM food and its derivatives. There is
a strong possibility that commercialized GM food from an exporting
country will enter an importing country where the status of approval is
not the same. This necessitates a liability mechanism to address any
damage that occurs during transboundary movement or at the market.
There was widespread consensus to develop the Cartagena Protocol on
Biosafety and to provide a dedicated regime on liability and redress in
the form of the Nagoya-Kuala Lumpur Supplementary Protocol on
Liability and Redress ('N-KL Protocol') at the international level.
However, national legal frameworks based on this protocol are not
adequately established in the prevailing food legislation of developing
countries. A developing economy like India is willing to import GM
food and its derivatives after the successful commercialization of Bt
Cotton in 2002. As a party to the N-KL Protocol, it is indispensable for
India to formulate a legal framework and to address safety, liability, and
regulatory issues surrounding GM foods in conformity with the
provisions of the Protocol. The liability mechanism is also important
where risk assessment and risk management are still at the
implementation stage. Moreover, the country is facing GM infiltration
issues with its neighbor Bangladesh. As a precautionary approach, there
is a need to formulate rules and procedures of legal liability to address
any damage that occurs in transboundary trade. In this context, the
proposed work will attempt to analyze the liability regime in the
existing Food Safety and Standards Act, 2006 in terms of applicability
and domestic compliance, and to suggest legal and policy options for
regulatory authorities.
Abstract: This article discusses the migration of relational
databases (RDB) to XML documents (schema and data) based on
metadata and semantic enrichment, which takes the flattened shape of
the RDB and enriches it with the object concept. The integration and
exploitation of the object concept in XML uses a syntax that allows the
conformity of the XML document to be verified at creation time. The
information extracted from the RDB is therefore analyzed and filtered
in order to fit the structure of the XML files and the associated object
model. The XML documents are built dynamically from the results of
SQL queries. A prototype was implemented to perform the migration
automatically, demonstrating the effectiveness of this approach.
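The core step of building an XML document dynamically from a SQL result set can be sketched as follows. The table and columns are illustrative and are not from the paper's prototype, which also carries schema metadata and object-model enrichment not shown here:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Minimal RDB-to-XML sketch: run a SQL query and build the XML document
# dynamically from the result set (illustrative table, not the paper's).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO employee VALUES (?, ?)", [(1, "Ada"), (2, "Alan")])

cursor = conn.execute("SELECT id, name FROM employee ORDER BY id")
root = ET.Element("employees")
columns = [d[0] for d in cursor.description]   # column names drive element names
for row in cursor:
    record = ET.SubElement(root, "employee")
    for col, value in zip(columns, row):
        ET.SubElement(record, col).text = str(value)

xml_str = ET.tostring(root, encoding="unicode")
print(xml_str)
```

Driving the element names from `cursor.description` is what makes the document construction dynamic: the same loop works for any query without hard-coding the schema.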
Abstract: A myriad of environmental issues face the Nigerian
industrial region, resulting from oil and gas production, mining,
manufacturing, and domestic wastes. Amidst these, much effort has
been directed at the Nigerian oil-producing region by stakeholders,
because of the region's impact on the wider Nigerian economy.
Although collaborative environmental management has been noted as
an effective approach in managing environmental issues, little
attention has been given to the roles and practices of stakeholders in
effecting a collaborative environmental management framework for
the Nigerian oil-producing region. This paper produces a framework
to expand and deepen knowledge relating to stakeholders' collaborative
roles in managing environmental issues in the Nigerian oil-producing
region. The knowledge is derived from analysis of
stakeholders’ practices – studied through multiple case studies using
document analysis. Selected documents of key stakeholders –
Nigerian government agencies, multi-national oil companies and host
communities, were analyzed. Open and selective coding was
employed manually during document analysis of data collected from
the offices and websites of the stakeholders. The findings showed
that the stakeholders have a range of roles, practices, interests, drivers
and barriers regarding their collaborative roles in managing
environmental issues. While they have interests in efficient resource
use, compliance with standards, sharing of responsibilities, generation
of new solutions, and shared objectives, there is evidence of major
barriers, including resource allocation, disjointed policy, ineffective
monitoring, diverse socio-economic interests, lack of stakeholder
commitment, and limited knowledge sharing. However,
host communities hold deep concerns over the collaborative roles of
stakeholders for economic interests, particularly, where government
agencies and multi-national oil companies are involved. With these
barriers and concerns, a genuine stakeholders’ collaboration is found
to be limited, and as a result, optimal environmental management
practices and policies have not been successfully implemented in the
Nigerian oil-producing region. A framework is produced that describes
how practices characterizing collaborative environmental management
might be employed to satisfy stakeholders' interests. The
framework recommends critical factors, based on the findings, which
may guide a collaborative environmental management in the oil
producing regions. The recommendations are designed to re-define
the practices of stakeholders in managing environmental issues in the
oil producing regions, not as something wholly new, but as an
approach essential for implementing a sustainable environmental
policy. This research outcome may clarify areas for future research as
well as contribute to industry guidance in the area of collaborative
environmental management.
Abstract: Bezier curves have useful properties for the path
generation problem; for instance, they can generate a reference
trajectory for vehicles that satisfies path constraints. The algorithms
considered join cubic Bezier curve segments smoothly to generate the
path. One useful property of Bezier curves is curvature. In mathematics,
curvature is the amount by which a geometric object deviates from
being flat, or straight in the case of a line. An extrinsic example of
curvature is a circle, whose curvature at any point is equal to the
reciprocal of its radius. The smaller the radius, the higher the curvature,
and thus the more sharply the vehicle needs to turn. In this study, we
use Bezier curves to fit a highway-like curve. We use a different
approach to find the best approximation so that the fit resembles the
highway-like curve. We compute the curvature by analytical
differentiation of the Bezier curve and then compute the maximum
driving speed using the curvature information obtained. Our work rests
on some assumptions: first, that the Bezier curve estimates the real
shape of the curve, which can be verified visually. Even though the
fitting process does not interpolate exactly on the curve of interest, we
believe that the speed estimates are acceptable. We verified our results
against a manual calculation of the curvature from the map.
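The analytical-differentiation step can be made concrete for a single cubic segment: curvature is kappa = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2), and a comfort bound on lateral acceleration gives a speed limit v = sqrt(a_lat / kappa). The control points and the 2 m/s^2 bound below are illustrative assumptions, not the study's data:

```python
import math

# Curvature of a cubic Bezier by analytical differentiation, plus the speed
# limit implied by a lateral-acceleration bound (all values illustrative).
P = [(0.0, 0.0), (30.0, 0.0), (60.0, 30.0), (90.0, 30.0)]  # control points, metres

def d1(t):  # first derivative B'(t)
    return tuple(3 * (1 - t)**2 * (P[1][i] - P[0][i])
                 + 6 * (1 - t) * t * (P[2][i] - P[1][i])
                 + 3 * t**2 * (P[3][i] - P[2][i]) for i in (0, 1))

def d2(t):  # second derivative B''(t)
    return tuple(6 * (1 - t) * (P[2][i] - 2 * P[1][i] + P[0][i])
                 + 6 * t * (P[3][i] - 2 * P[2][i] + P[1][i]) for i in (0, 1))

def curvature(t):
    (xp, yp), (xpp, ypp) = d1(t), d2(t)
    return abs(xp * ypp - yp * xpp) / (xp * xp + yp * yp) ** 1.5

kappa_max = max(curvature(i / 100) for i in range(101))  # sample t in [0, 1]
v_max = math.sqrt(2.0 / kappa_max)   # v = sqrt(a_lat / kappa), a_lat = 2 m/s^2
print(round(1 / kappa_max, 1), round(v_max, 1))  # → 45.0 9.5 (radius m, speed m/s)
```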
Abstract: Meeting customers’ needs and creating quality and value
while reducing costs through supply chain management provides challenges
and opportunities for companies and researchers. In light of these
challenges, modern ideas must help counter them and exploit the
opportunities. Therefore, this paper discusses the impact of quality costs
on revenue sharing as one of the most important incentives for
configuring business networks. This paper develops the quality cost approach to align with the
modern era. It develops a model to measure quality costs which
might enable firms to manage revenue sharing in a supply chain. The
developed model includes five categories; besides the well-known
four categories (namely prevention costs, appraisal costs, internal
failure costs, and external failure costs), a new category has been
developed in this research as a new vision of the relationship between
quality costs and innovations in industry. This new category is
Recycle Cost. This paper also examines whether such quality costs in
supply chains influence the revenue sharing between partners. Using the author's quality cost model, the relationship between
quality costs and revenue sharing among partners is examined using a
case study in an Egyptian manufacturing company which is a part of
a supply chain. This paper argues that the revenue-sharing proportion
allocated to the supplier increases as the supplier's recycle cost
increases, and that the revenue-sharing proportion allocated to the
manufacturer increases as the prevention and appraisal costs increase
and as the failure costs, the manufacturer's recycle costs, and the
suppliers' recycle costs decrease. However, the results present
surprising findings. The purposes of this study are to develop the
quality cost approach and to understand the relationships between
quality costs and revenue sharing in supply chains. Therefore, the present study
contributes to theory and practice by explaining how the cost of
recycling can be incorporated into a quality cost model to better
understand revenue sharing among partners in supply chains.
Abstract: As one of the convenient and noninvasive sensing
approaches, automatic limb girth measurement has been applied to
detect the intention behind human motion from muscle deformation.
Its sensing validity has been demonstrated by preliminary research but
still needs more fundamental study, especially on kinetic
contraction modes. Based on the novel fabric strain sensors, a soft
and smart limb girth measurement system was developed by the
authors’ group, which can measure the limb girth in-motion.
Experiments were carried out on elbow isometric flexion and elbow
isokinetic flexion (biceps’ isokinetic contractions) of 90°/s, 60°/s, and
120°/s for 10 subjects (2 canoeists and 8 ordinary people). After
removal of natural circumferential increments due to elbow position,
the joint torque is found not to be uniformly sensitive to the limb
circumferential strains, but to decline as the elbow joint angle rises,
regardless of the angular speed. Moreover, the maximum joint torque
was found to be an exponential function of the joint's angular speed.
This research contributes substantially to the application of automatic
limb girth measurement during kinetic contractions, and it is useful for
predicting the contraction level of voluntary skeletal muscles.
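An exponential torque-speed relation of the kind reported above, T = a * exp(b * omega), can be fitted by linearising with log(T). The angular speeds and torque maxima below are synthetic numbers for illustration, not the study's measurements:

```python
import numpy as np

# Fit T = a * exp(b * omega) by linear regression on log(T).
# The data points are invented, not the study's measurements.
omega = np.array([60.0, 90.0, 120.0])            # angular speed, deg/s
torque = np.array([42.0, 35.5, 30.0])            # maximum joint torque, N*m

b, log_a = np.polyfit(omega, np.log(torque), 1)  # log T = b*omega + log a
a = np.exp(log_a)
print(round(a, 1), round(b, 4))  # → 58.8 -0.0056
```

The negative exponent reflects the reported decline of maximum torque with rising angular speed.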
Abstract: During the last decades, a number of food crises, such
as Bovine Spongiform Encephalopathy (BSE, "mad-cow" disease),
dioxin in chicken feed, and Foot-and-Mouth Disease (FMD), have
certainly dented the reliability of the food industry. Consequently,
the trend of applying different scientific methods of risk assessment
in food safety has received more attention in academia and practice.
However, a lack of practical approaches considering the entire food
supply chain is evident in the academic literature. In this regard, this
paper aims to apply a risk assessment tool (FMEA), integrated with
human factors, along the entire supply chain of food production, to test
the method in a case study of dairy production, and to analyze its
results.
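The FMEA scoring step behind such an analysis can be sketched as follows: each failure mode gets a Risk Priority Number, RPN = severity x occurrence x detection, and modes are ranked by it. The failure modes and the 1-10 ratings below are invented for illustration and do not come from the case study:

```python
# FMEA sketch: rank failure modes of a (hypothetical) dairy supply chain
# by Risk Priority Number, RPN = severity * occurrence * detection.
failure_modes = [
    # (description, severity, occurrence, detection), all rated 1-10
    ("cold-chain break in transport", 9, 4, 3),
    ("operator hygiene lapse",        7, 5, 6),   # the human-factor mode
    ("mislabelled allergen",          8, 2, 4),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name}: RPN={s * o * d}")   # highest-risk mode printed first
```

Note how the human-factor mode tops the ranking here mainly through its poor detection score, which is the kind of effect integrating human factors into FMEA is meant to surface.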
Abstract: In recent years, electric mobility has become part of the
public discussion. The trend toward fully electrified vehicles instead of
vehicles fueled with fossil energy has notably gained momentum.
Today nearly every big car manufacturer produces and sells fully
electrified vehicles, but electrified vehicles are still not as competitive
as conventionally powered vehicles. As the traction battery represents
the largest cost driver, lowering its price is a crucial objective. In
addition to improvements in product and production processes, a
non-negligible but widely underestimated cost driver of production can
be found in logistics, since neither the production technology nor the
logistics systems are continuous yet. This paper presents an approach to
evaluate cost factors of different designs of load carrier systems. Due to
numerous interdependencies, the combination of cost factors for a
particular scenario is not transparent. This negatively affects actions for
cost reduction, yet cost reduction remains one of the major goals of
simultaneous engineering processes. Therefore, a concurrent and
phase-appropriate cost valuation method is necessary to provide cost
transparency. In this paper, the four phases of this cost valuation
method are defined and explained, based upon a new approach
integrating the logistics development process into integrated product
and process development.
Abstract: One of the global combinatorial optimization
problems in machine learning is feature selection. It is concerned with
removing irrelevant, noisy, and redundant data while keeping the
original meaning of the data. Attribute reduction in rough set theory is
an important feature selection method. Since attribute reduction is an
NP-hard problem, it is necessary to investigate fast and effective
approximate algorithms. In this paper, we propose two feature selection
mechanisms based on memetic algorithms (MAs), which combine the
genetic algorithm with a fuzzy record-to-record travel algorithm and a
fuzzy-controlled great deluge algorithm, to identify a good balance
between local search and genetic search. In order to verify the proposed
approaches, numerical experiments are carried out on thirteen datasets.
The results show that the MA approaches are efficient in solving
attribute reduction problems when compared with other meta-heuristic
approaches.
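The great deluge component of such a hybrid can be sketched on a toy attribute-reduction task: a candidate subset is accepted whenever its quality stays above a gradually tightening "water level". The fitness function and all parameters below are invented for illustration; the paper's fuzzy control of the level is omitted:

```python
import random

# Toy great-deluge local search for attribute reduction: keep a synthetic
# quality score above a rising water level while shrinking the subset.
random.seed(0)
N_FEATURES = 20
RELEVANT = set(range(5))                   # features that actually matter (toy setup)

def fitness(subset):
    # reward covering the relevant features, lightly penalise subset size
    covered = len(RELEVANT & subset) / len(RELEVANT)
    return covered - 0.01 * len(subset)

current = set(range(N_FEATURES))           # start from the full attribute set
best = set(current)
level = fitness(current)                   # initial water level
rate = 0.0005                              # how fast the level chases the best score

for _ in range(2000):
    cand = set(current)
    cand.symmetric_difference_update({random.randrange(N_FEATURES)})  # flip one attribute
    if cand and fitness(cand) >= level:    # accept only moves above the water level
        current = cand
        if fitness(current) > fitness(best):
            best = set(current)
    level += rate * (fitness(best) - level)  # level rises toward the best score

print(sorted(best), round(fitness(best), 2))
```

Because the level never drops below the starting score, moves that discard a relevant attribute are always rejected in this toy setup, while redundant attributes are progressively squeezed out.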
Abstract: Response Surface Methods (RSM) provide
statistically validated predictive models that can then be manipulated
for finding optimal process configurations. Variation transmitted to
responses from poorly controlled process factors can be accounted
for by the mathematical technique of propagation of error (POE),
which facilitates ‘finding the flats’ on the surfaces generated by
RSM. The dual response approach to RSM captures the standard
deviation of the output as well as the average. It accounts for
unknown sources of variation. Dual response plus propagation of
error (POE) provides a more useful model of overall response
variation. In our case, we implemented this technique to predict the
compressive strength of concrete at 28 days of age. Since waiting 28
days is quite time-consuming, while it is important to ensure the quality
control process, this paper investigates the potential of using design of
experiments (DOE-RSM) to predict the compressive strength of
concrete at the 28th day. Data used for this study were obtained from
experimental schemes at the University of Benghazi, Civil Engineering
Department. A total of 114 data sets were used. The ACI mix design
method was utilized for the mix design. No admixtures were used; only
the main concrete mix constituents, such as cement, coarse aggregate,
fine aggregate, and water, were used in all mixes. Different mix
proportions of the ingredients and different water-cement ratios were
used. The proposed mathematical models are capable of predicting the
required compressive strength of concrete from early ages.
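The propagation-of-error idea above can be sketched numerically: factor variation is transmitted through the fitted response surface via a first-order Taylor expansion, sigma_y^2 ~ sum_i (df/dx_i)^2 sigma_i^2, and "flat" regions transmit less variation than steep ones. The quadratic model and sigma values below are invented, not the Benghazi data:

```python
# Propagation of error (POE) through a fitted response surface y = f(x1, x2).
# The model coefficients and factor sigmas are illustrative only.
def f(x1, x2):
    # hypothetical fitted RSM model for 28-day strength (coded factors)
    return 30.0 + 4.0 * x1 + 2.0 * x2 - 1.5 * x1 * x1 - 0.5 * x1 * x2

def poe_sd(x1, x2, s1, s2, h=1e-5):
    # sigma_y ~ sqrt((df/dx1)^2 s1^2 + (df/dx2)^2 s2^2), first-order Taylor;
    # derivatives estimated by central finite differences
    d1 = (f(x1 + h, x2) - f(x1 - h, x2)) / (2 * h)
    d2 = (f(x1, x2 + h) - f(x1, x2 - h)) / (2 * h)
    return (d1 * d1 * s1 * s1 + d2 * d2 * s2 * s2) ** 0.5

# a flat region (small slope) transmits less factor variation than a steep one
print(round(poe_sd(1.2, 0.8, 0.2, 0.2), 2), round(poe_sd(-1.0, 0.0, 0.2, 0.2), 2))  # → 0.28 1.49
```

Comparing transmitted standard deviations like this is exactly how POE "finds the flats" on the response surface: the first setting sits where df/dx1 vanishes, so far less variation reaches the response.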