Abstract: Functionalities and control behavior are both primary
requirements in the design of a complex system. Automata theory plays
an important role in modeling the behavior of a system. Z is an ideal
notation for describing the state space of a system and then
defining operations over it. Consequently, an integration of automata
and Z can be an effective tool for increasing the modeling power for a
complex system. Further, a nondeterministic finite automaton (NFA)
may have different implementations, and therefore the transformation
from diagrams to code needs to be verified. If the formal
specification of an NFA is described before implementing it, then
confidence in the transformation can be increased. In this paper, we
give a procedure for integrating NFA and Z. The complement of a
special type of NFA is defined. Then the union of two NFAs is
formalized after defining their complements. Finally, the formal
construction of the intersection of NFAs is described. The resulting
specification is analyzed and validated using the Z/EVES tool.
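Z specifications themselves are not executable here, but the closure constructions the abstract names can be sketched operationally. Below is a minimal Python illustration (not the paper's Z formalization), assuming the "special type of NFA" is a complete deterministic automaton, so that complementation reduces to flipping the accepting states; intersection is built directly via the product construction rather than via De Morgan's law as in the paper.

```python
from itertools import product

class DFA:
    """Complete deterministic automaton: complementation is safe."""
    def __init__(self, states, alphabet, delta, start, accepting):
        self.states, self.alphabet = states, alphabet
        self.delta, self.start, self.accepting = delta, start, accepting

    def accepts(self, word):
        q = self.start
        for a in word:
            q = self.delta[(q, a)]
        return q in self.accepting

def complement(m):
    # Flip accepting states; valid only for complete deterministic automata.
    return DFA(m.states, m.alphabet, m.delta, m.start, m.states - m.accepting)

def intersection(m1, m2):
    # Product construction: run both machines in lockstep.
    states = set(product(m1.states, m2.states))
    delta = {((p, q), a): (m1.delta[(p, a)], m2.delta[(q, a)])
             for (p, q) in states for a in m1.alphabet}
    accepting = {(p, q) for (p, q) in states
                 if p in m1.accepting and q in m2.accepting}
    return DFA(states, m1.alphabet, delta, (m1.start, m2.start), accepting)

# Example machines: even number of 'a's  /  at least one 'b'
m_even = DFA({0, 1}, {'a', 'b'},
             {(0, 'a'): 1, (1, 'a'): 0, (0, 'b'): 0, (1, 'b'): 1}, 0, {0})
m_hasb = DFA({0, 1}, {'a', 'b'},
             {(0, 'a'): 0, (1, 'a'): 1, (0, 'b'): 1, (1, 'b'): 1}, 0, {1})
both = intersection(m_even, m_hasb)
```

Each operation here mirrors one schema of the integrated specification: the Z version describes the same state spaces and transition relations declaratively.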
Abstract: This paper presents an on-going research work on the
implementation of feature-based machining via macro programming.
Repetitive machining features such as holes, slots, pockets etc. can
readily be encapsulated in macros. Each macro consists of methods
for how to machine the shape as defined by the feature. The macro
programming technique comprises a main program and
subprograms. The main program allows the user to select several
subprograms that contain features and to define their important
parameters. With macros, complex machining routines can be
implemented easily and no post processor is required. A case study
on machining a part comprising planar face, hole and pocket
features using the macro programming technique was carried out. It
is envisaged that the macro programming technique can be extended
to other feature-based machining fields such as the newly developed
STEP-NC domain.
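As an illustration of the idea, the hypothetical Python sketch below models feature macros as parametric subprograms dispatched by a main program. The G-code strings, function names and parameters are invented for illustration and are not taken from the paper's case study.

```python
# Hypothetical sketch: each machining feature is a parametric "macro"
# (subprogram); the main program selects macros and supplies parameters.

def drill_hole(x, y, depth):
    """Macro for a hole feature: canned drilling cycle at (x, y)."""
    return [f"G81 X{x} Y{y} Z{-depth} R2.0 F100"]

def mill_pocket(x, y, width, height, depth):
    """Macro for a rectangular pocket feature (single contour pass)."""
    return [f"G0 X{x} Y{y}",
            f"G1 Z{-depth} F50",
            f"G1 X{x + width}", f"G1 Y{y + height}",
            f"G1 X{x}", f"G1 Y{y}", "G0 Z5.0"]

def main_program(features):
    """Main program: dispatch each selected feature to its macro."""
    macros = {"hole": drill_hole, "pocket": mill_pocket}
    blocks = ["G90 G21"]                      # absolute mode, metric
    for name, params in features:
        blocks += macros[name](**params)
    blocks.append("M30")                      # program end
    return blocks

part = [("hole",   {"x": 10, "y": 10, "depth": 5}),
        ("pocket", {"x": 20, "y": 20, "width": 30, "height": 15, "depth": 3})]
nc_code = main_program(part)
```

Because each feature carries its own machining method, no post processor is needed: the main program emits controller-ready blocks directly.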
Abstract: The theatre-auditorium under investigation appears to be
acoustically inadequate, owing to the highly reflective characteristics
of the materials used in it (marble, painted wood, smooth plaster,
etc.), its architectural and structural features, and its intended,
highly multifunctional use (auditorium, theatre, cinema, musicals,
conference room). This assessment emerges from an analysis of the
existing state carried out with the acoustic simulation software
Ramsete, supported by data obtained through a campaign of on-site
acoustic measurements of the existing state performed with a Svantek
SVAN 957 sound level meter. After the 3D model was completed
according to the specifications required by the forecasting software,
three simulations were carried out: an acoustic simulation of the
existing state and acoustic simulations of two design solutions.
The acoustic improvement found in the first design solution,
compared to the existing state, consists mainly in lowering the
reverberation time towards its most desirable value, while the
clarity indices, the centre time, the lateral efficiency, the bass
ratio (BR) and the speech intelligibility improve significantly. The
acoustic improvement found in the second design solution, compared to
the first, consists mostly in a more uniform distribution of Leq and
in lowering the reverberation time towards its optimum values; the
clarity indices and the lateral efficiency improve further, but at the
expense of a slightly worse BR, while the remaining indices vary only
slightly.
Abstract: In this study, a framework for the verification of well-known seismic codes is utilized. To verify seismic code performance, the damage quantity of RC frames is compared with the target performance. Owing to the randomness of seismic design and earthquake load excitation, fragility curves are developed in this paper. These diagrams are utilized to evaluate the performance level of structures designed according to the seismic codes; they further illustrate the effect of the codes' load combinations and reduction factors on the probability of damage exceedance. Two types of structures are designed according to different seismic codes: structures of very high importance with high ductility, and structures of medium importance with intermediate ductility. The results reveal that a lower damage ratio usually generates a lower probability of exceedance. In addition, the findings indicate that there are buildings with a higher quantity of reinforcement bars that nevertheless have a higher probability of damage exceedance. Life-cycle cost analysis is utilized for comparison and for the final decision-making process.
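A common way to express such fragility curves is the lognormal form, sketched below in Python; the median capacity and dispersion values here are invented for illustration, not taken from the paper.

```python
import math

def fragility(im, theta, beta):
    """Lognormal fragility curve: probability that damage exceeds a
    given damage state when the intensity measure equals `im`.
    theta: median capacity, beta: lognormal dispersion."""
    z = math.log(im / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative parameters (not the paper's) for one damage state:
theta, beta = 0.6, 0.5            # median Sa = 0.6 g, dispersion 0.5
p_low, p_med, p_high = (fragility(sa, theta, beta) for sa in (0.2, 0.6, 1.2))
```

By construction the curve passes through 0.5 at the median capacity, and the probability of exceedance grows monotonically with the intensity measure.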
Abstract: As a part of the evaluation system for R&D programs, the
Korean government has applied feasibility analysis since 2008.
Various professionals put forth a great effort to keep up with the
high degree of freedom of R&D programs and to make contributions to
evolving the feasibility analysis. We analyze diverse R&D programs
from various viewpoints, such as technology, policy, and economics,
integrate the separate analyses, and finally arrive at a definite result:
whether a program is feasible or unfeasible. This paper describes the
concept and method of the feasibility analysis as a decision-making
tool. The analysis unit and content of each criterion, which are key
elements in a comprehensive decision-making structure, are examined.
Abstract: Content-Based Image Retrieval (CBIR) has been
one of the most active research areas in the field of computer vision
over the last 10 years. Many programs and tools have been
developed to formulate and execute queries based on the visual or
audio content and to help browse large multimedia repositories.
Still, no general breakthrough has been achieved with respect to
large varied databases with documents of differing sorts and with
varying characteristics. Many questions with respect to
speed, semantic descriptors or objective image interpretation are
still unanswered. In the medical field, images, and especially
digital images, are produced in ever increasing quantities and used
for diagnostics and therapy. In several articles, content based
access to medical images for supporting clinical decision making
has been proposed that would ease the management of clinical data
and scenarios for the integration of content-based access methods
into Picture Archiving and Communication Systems (PACS) have
been created. This paper gives an overview of soft computing
techniques. New research directions are being defined that can
prove to be useful. Still, there are very few systems that seem to be
used in clinical practice. It needs to be stated as well that the goal
is not, in general, to replace text based retrieval methods as they
exist at the moment.
Abstract: Mathematical models can be used to describe the
dynamics of the spread of infectious disease between susceptibles
and infectious populations. Dengue fever is a re-emerging disease in
the tropical and subtropical regions of the world. Its incidence has
increased fourfold since 1970 and outbreaks are now reported quite
frequently from many parts of the world. In dengue endemic regions,
more cases of dengue infection in pregnancy and infancy are being
found due to the increasing incidence. It has been reported that
dengue infection can be vertically transmitted to infants. Primary
dengue infection is associated with mild to high fever, headache,
muscle pain and skin rash. The immune response includes IgM
antibodies, produced by the 5th day of symptoms, which persist for
30-60 days; IgG antibodies appear on the 14th day and persist for
life. Secondary infections often result in high fever and, in many
cases, hemorrhagic events and circulatory failure. In the present
paper, a mathematical model is proposed to simulate the course of
dengue disease transmission in pregnancy and infancy. Stability
analysis of the equilibrium points is carried out and simulations are
given for different sets of parameters. Moreover, the bifurcation
diagrams of our model are discussed. Control of this disease in
infant cases is introduced in terms of a threshold condition.
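Transmission dynamics of this kind can be sketched with a simple host-vector SIR model integrated by Euler's method; the paper's model additionally distinguishes pregnancy and infancy classes, which are omitted here, and all parameter values below are illustrative assumptions.

```python
# Minimal host-vector SIR sketch (much simpler than the paper's model);
# all parameter values are illustrative assumptions.

def simulate(beta_h=0.4, beta_v=0.3, gamma=0.1, mu_v=0.05,
             days=200, dt=0.1):
    sh, ih = 0.999, 0.001          # susceptible / infectious host fractions
    sv, iv = 0.99, 0.01            # susceptible / infectious vector fractions
    for _ in range(int(days / dt)):
        new_h = beta_h * sh * iv   # hosts infected via vector bites
        new_v = beta_v * sv * ih   # vectors infected from hosts
        sh += dt * (-new_h)
        ih += dt * (new_h - gamma * ih)
        sv += dt * (mu_v * (1.0 - sv) - new_v)
        iv += dt * (new_v - mu_v * iv)
    return sh, ih, iv

# Threshold condition: the infection can invade the disease-free
# equilibrium when beta_h * beta_v > gamma * mu_v (i.e. R0 > 1).
sh, ih, iv = simulate()
```

The comment at the end mirrors the abstract's point: control amounts to driving the threshold quantity below one, for instance by reducing the biting rates or the vector lifespan.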
Abstract: Transportation authorities need to provide the services
and facilities that are critical to every country's well-being and
development. Management of the road network is becoming
increasingly challenging as demands increase and resources are
limited. Public sector institutions are integrating performance
information into budgeting, managing and reporting via
implementing performance measurement systems. In the face of
growing challenges, performance measurement of road networks is
attracting growing interest in many countries. The large scale of
public investments makes the maintenance and development of road
networks an area where such systems are an important assessment
tool. Transportation agencies have been using performance
measurement and modeling as part of pavement and bridge
management systems. Recently the focus has been on extending the
process to applications in road construction and maintenance
systems, operations and safety programs, and administrative
structures and procedures. To eliminate failure and dysfunctional
consequences, this paper presents the importance of obtaining
objective data and of implementing evaluation instruments where
necessary.
Abstract: Online discussions are an important component of
both blended and online courses. This paper examines the varieties of
online discussions and the perils, pitfalls and possibilities of this
rather new technological tool for enhanced learning. The discussion
begins with possible perils and pitfalls inherent in this educational
tool and moves to a consideration of the advantages of the varieties
of online discussions feasible for use in teacher education programs.
Abstract: This paper presents the adaptation of the NOVA knowledge management and intellectual capital measurement model to the needs of the work or research project that must be developed when undertaking a graduate-level master's program. Brackets are added to each of the blocks represented in the original NOVA model, which makes it possible to represent the parties involved in each of them.
Abstract: The principal purpose of this article is to present a new method based on the Adaptive Neuro-Fuzzy Inference System (ANFIS) to generate, from presented data, additional artificial earthquake accelerograms that are compatible with specified response spectra. The proposed method uses the learning abilities of ANFIS to develop knowledge of the inverse mapping from response spectrum to earthquake records. In addition, the wavelet packet transform is used to decompose specified earthquake records, and ANFISs are then trained to relate the response spectrum of the records to their wavelet packet coefficients. Finally, an illustrative example is presented which uses an ensemble of recorded accelerograms to demonstrate the effectiveness of the proposed method.
Abstract: Source code retrieval is of immense importance in the software engineering field. The complex tasks of retrieving and extracting information from source code documents are vital in the development cycle of large software systems. The two main subtasks which result from these activities are code duplication prevention and plagiarism detection. In this paper, we propose a source code retrieval system based on a two-level fingerprint representation capturing, respectively, the structural and the semantic information within a source code. A sequence alignment technique is applied to these fingerprints in order to quantify the similarity between source code portions. The specific purpose of the system is to detect plagiarism and duplicated code between programs written in different programming languages belonging to the same class, such as C, C++, Java and C#. These four languages are supported by the current version of the system, which is designed so that it may be easily adapted to any programming language.
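The alignment step can be sketched as follows. This toy Python version (not the authors' implementation) models fingerprints as token sequences and normalizes a Needleman-Wunsch global alignment score into a similarity ratio; the token names are invented for illustration.

```python
# Sketch of the alignment step only (not the paper's fingerprint
# extraction): fingerprints are modelled as token sequences and a
# global alignment score is turned into a similarity ratio.

def alignment_score(a, b, match=1, mismatch=-1, gap=-1):
    """Needleman-Wunsch global alignment score between two sequences."""
    prev = [j * gap for j in range(len(b) + 1)]
    for i, x in enumerate(a, 1):
        cur = [i * gap]
        for j, y in enumerate(b, 1):
            cur.append(max(prev[j - 1] + (match if x == y else mismatch),
                           prev[j] + gap,        # gap in sequence b
                           cur[j - 1] + gap))    # gap in sequence a
        prev = cur
    return prev[-1]

def similarity(a, b):
    """Normalise the score to [0, 1]; 1 means identical sequences."""
    if not a and not b:
        return 1.0
    return max(0.0, alignment_score(a, b)) / max(len(a), len(b))

# Hypothetical structural fingerprints of two code fragments:
fp1 = ["for", "assign", "call", "ret"]
fp2 = ["for", "assign", "add", "call", "ret"]
```

Because the fingerprints abstract away language-specific syntax, the same similarity measure applies across the supported languages.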
Abstract: There are two major variants of the Simplex
Algorithm: the revised method and the standard, or tableau method.
Today, all serious implementations are based on the revised method
because it is more efficient for sparse linear programming problems.
Nevertheless, there are a number of applications that lead to dense
linear problems, so our aim in this paper is to present some
computational results on a parallel implementation of the dense
Simplex Method. Our implementation runs on an SMP cluster, using the
C programming language and the Message Passing Interface (MPI).
Preliminary computational results on randomly generated dense
linear programs support our approach.
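For orientation, a serial sketch of the standard (tableau) method is given below in Python; the paper's implementation is in C with MPI and parallelizes the dense pivot operations across processes, which is omitted here.

```python
# Serial sketch of the standard tableau Simplex method.
# Solves max c^T x  s.t.  A x <= b, x >= 0  (assuming b >= 0).

def simplex(c, A, b):
    m, n = len(A), len(c)
    # Build the dense tableau with slack variables appended.
    T = [row[:] + [1.0 if i == j else 0.0 for j in range(m)] + [b[i]]
         for i, row in enumerate(A)]
    T.append([-x for x in c] + [0.0] * m + [0.0])    # objective row
    basis = list(range(n, n + m))
    while True:
        # Entering column: most negative reduced cost.
        col = min(range(n + m), key=lambda j: T[-1][j])
        if T[-1][col] >= -1e-9:
            break                                    # optimal
        # Leaving row: minimum ratio test.
        ratios = [(T[i][-1] / T[i][col], i)
                  for i in range(m) if T[i][col] > 1e-9]
        if not ratios:
            raise ValueError("unbounded")
        _, row = min(ratios)
        basis[row] = col
        piv = T[row][col]
        T[row] = [x / piv for x in T[row]]
        for i in range(m + 1):                       # eliminate column
            if i != row and abs(T[i][col]) > 1e-12:
                f = T[i][col]
                T[i] = [x - f * y for x, y in zip(T[i], T[row])]
    x = [0.0] * n
    for i, bv in enumerate(basis):
        if bv < n:
            x[bv] = T[i][-1]
    return T[-1][-1], x

# max 3x + 2y  s.t.  x + y <= 4,  x + 3y <= 6
opt, x = simplex([3.0, 2.0], [[1.0, 1.0], [1.0, 3.0]], [4.0, 6.0])
```

The row-elimination loop dominates the cost of each iteration and works on the whole dense tableau, which is why it is the natural target for the MPI parallelization the paper studies.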
Abstract: Production engineering today is characterized by the
integration of industrial automation and robotics, with a view to
manufacturing products very quickly. The production range is
continuously changing and expanding, and producers have to be
flexible in this regard: they need to offer production capabilities
that can respond to quick change. Engineering product development is
supported by CAD software, and such systems are mainly used for
product design. For manufacturers to remain competitive, the machines
they procure should be capable of responding flexibly to changes in
output. A response to this problem is the development of flexible
manufacturing systems consisting of various automated subsystems. The
integration of flexible manufacturing systems and their subunits
together with product design and engineering is a possible solution
to this issue, and integration is possible through the implementation
of CIM systems. Such a solution, finding a link between CAD and the
ICIM 3000 manufacturing system from Festo Co., is the subject of the
research project described in this contribution. Products can then be
designed in CAD systems and the manufacturing process can be followed
from order to shipping through the development of methods and
processes of integration, which will improve support for product
design parameters by monitoring the production process and by
creating production programs using CAD, and will therefore accelerate
the overall process from design to implementation.
Abstract: In this paper, a new recursive strategy is proposed for determining the $\frac{(n-1)!}{2}$ diagrams of $n$th order. The generalization of the $n$th-order diagram for the cross multiplication method was proposed by Pavlovic and Bankier, but a specific rule for determining the $\frac{(n-1)!}{2}$ diagrams of $n$th order for a square matrix is yet to be discovered. Thus, using a combinatorial approach, the $\frac{(n-1)!}{2}$ diagrams of $n$th order will be presented as $\frac{(n-1)!}{2}$ starter sets. These starter sets are generated by exchanging one element. The advantages of this new strategy are that the discarding process is eliminated and that the signs of the starter sets alternate.
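The count can be checked by brute force: the $\frac{(n-1)!}{2}$ diagrams correspond to circular arrangements of $n$ elements that are distinct up to rotation and reflection. The Python sketch below enumerates them by direct filtering, which is exactly the costly discarding process the paper's recursive exchange strategy avoids.

```python
from itertools import permutations

def starter_sets(n):
    """Enumerate circular arrangements of 1..n that are distinct up to
    rotation and reflection; there are (n-1)!/2 of them for n >= 3.
    (Brute-force filter, not the paper's recursive exchange method.)"""
    seen, out = set(), []
    for p in permutations(range(1, n + 1)):
        # Canonical form: lexicographic minimum over all rotations
        # of p and of its reversal.
        forms = []
        for q in (p, p[::-1]):
            forms += [q[i:] + q[:i] for i in range(n)]
        key = min(forms)
        if key not in seen:
            seen.add(key)
            out.append(p)
    return out

sets4 = starter_sets(4)     # 3!/2 = 3 arrangements
```

The filter examines all $n!$ permutations, so generating the starter sets directly by single-element exchanges, as the paper proposes, is substantially cheaper for larger $n$.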
Abstract: Ethanol has become more attractive in the fuel industry,
either as a fuel itself or as an additive that helps enhance the
octane number and combustibility of gasoline. This research studied
pressure swing adsorption using a cassava-based adsorbent prepared
from a mixture of cassava starch and cassava pulp for dehydration of
ethanol vapor. The apparatus used in the experiments consisted of
double adsorption columns, an evaporator, and a vacuum pump. The
feed solution contained 90-92 %wt of ethanol. Three process
variables: adsorption temperatures (110, 120 and 130°C), adsorption
pressures (1 and 2 bar gauge) and feed vapor flow rate (25, 50 and 75
% valve opening of the evaporator) were investigated. According to
the experimental results, the optimal operating condition for this
system was found to be at 2 bar gauge for adsorption pressure, 120°C
for adsorption temperature and 25% valve opening of the evaporator.
Production of 1.48 grams of ethanol with concentration higher than
99.5 wt% per gram of adsorbent was obtained. PSA with the cassava-based
adsorbent reported in this study could be an alternative method
for production of nearly anhydrous ethanol. Dehydration of ethanol
vapor achieved in this study is due to an interaction between free
hydroxyl group on the glucose units of the starch and the water
molecules.
Abstract: Regression testing is a maintenance activity applied to
modified software to provide confidence that the changed parts are
correct and that the unchanged parts have not been adversely affected
by the modifications. Regression test selection techniques reduce the
cost of regression testing, by selecting a subset of an existing test
suite to use in retesting modified programs. This paper presents the
first general regression-test-selection technique, which is based on
code and allows test cases to be selected for programs written in any
programming language; it also handles incomplete programs. We
also describe RTSDiff, a regression-test-selection system that
implements the proposed technique. The results of empirical
studies performed in four programming languages (Java, C#, C++
and Visual Basic) show that the technique is efficient and effective
in reducing the size of the test suite.
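A minimal sketch of code-based regression test selection is shown below; the test names and line identifiers are invented for illustration, and RTSDiff itself works on analyzed code entities across languages rather than raw line ids.

```python
# Toy code-based regression test selection: keep only the tests whose
# recorded coverage intersects the code changed in this revision.

def select_tests(coverage, changed_lines):
    """coverage: test name -> set of covered line identifiers.
    Return the subset of tests touching any changed line."""
    return {t for t, lines in coverage.items()
            if lines & changed_lines}

# Hypothetical coverage map and change set:
coverage = {
    "test_login":  {"auth.py:10", "auth.py:25"},
    "test_report": {"report.py:5", "report.py:9"},
    "test_export": {"report.py:9", "io.py:3"},
}
changed = {"report.py:9"}          # lines modified in this revision
subset = select_tests(coverage, changed)
```

Tests whose coverage is disjoint from the change set provably cannot behave differently on the modified program, which is what lets such techniques shrink the suite safely.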
Abstract: The aim of this study is to identify the conditions for
implementing reconfigurability by summarizing past flexible
manufacturing systems (FMS) research and drawing overall
conclusions from many separate High Performance Manufacturing
(HPM) studies. Meta-analysis will be applied to links between HPM
programs and their practices related to FMS and manufacturing
performance with particular reference to responsiveness performance.
More specifically, an application of meta-analysis will be made with
reference to two of the main steps towards the development of an
empirically-tested theory: testing the adequacy of the measurement of
variables and testing the linkages between the variables.
Abstract: Rice husk is a lignocellulosic source that can be
converted to ethanol. Three hundred grams of rice husk was mixed
with 1 L of 0.18 N sulfuric acid solution and then heated in an
autoclave. The reaction was expected to proceed at constant
temperature (isothermal), but it had already begun before that
temperature was reached. The first liquid sample was taken at a
temperature of 140°C and sampling was repeated at 5-minute intervals,
so the data obtained cover both the non-isothermal and isothermal
regions. It was observed that degradation has significant effects on
ethanol production. The kinetic constants can be expressed by the
Arrhenius equation, with frequency factors for hydrolysis and sugar
degradation of 1.58 × 10^5 1/min and 2.29 × 10^8 L/mole/min,
respectively, while the activation energies are 64,350 J/mole and
76,571 J/mole. The highest ethanol concentration from fermentation is
1.13% v/v, attained at 220°C.
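As a quick check of the reported kinetics, the Arrhenius constants from the abstract can be evaluated directly; the 140°C evaluation temperature below is chosen only because it is the first sampling temperature mentioned.

```python
import math

# Arrhenius evaluation using the constants reported in the abstract
# (frequency factors and activation energies for hydrolysis and for
# sugar degradation); R is the gas constant in J/(mol K).
R = 8.314

def arrhenius(A, Ea, T_celsius):
    """Rate constant k = A * exp(-Ea / (R * T))."""
    T = T_celsius + 273.15
    return A * math.exp(-Ea / (R * T))

k_hyd = arrhenius(1.58e5, 64350.0, 140.0)   # hydrolysis, 1/min
k_deg = arrhenius(2.29e8, 76571.0, 140.0)   # degradation, L/(mol min)
```

Because the degradation step has the larger activation energy, its rate grows faster with temperature than hydrolysis does, consistent with degradation becoming significant at the higher reaction temperatures.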
Abstract: When binary decision diagrams are formed from
uniformly distributed Monte Carlo data for a large number of
variables, the complexity of the decision diagrams exhibits a
predictable relationship to the number of variables and minterms. In
the present work, a neural network model has been used to analyze the
pattern of shortest path length for larger numbers of Monte Carlo data
points. The neural model shows strong descriptive power for the
ISCAS benchmark data, with an RMS error of 0.102 for the shortest
path length complexity. Therefore, the model can be considered a
method of predicting path length complexities; this is expected to lead
to minimum time complexity of very large-scale integrated circuits
and related computer-aided design tools that use binary decision
diagrams.