Abstract: Analysis of blood vessel mechanics in normal and
diseased conditions is essential for disease research, medical device
design, and treatment planning. In this work, 3D finite element
models of a normal vessel and an atherosclerotic vessel with 50%
plaque deposition were developed. The models were meshed with a
finite number of tetrahedral elements and simulated using actual
blood pressure signals. From the transient analysis performed on
these models, parameters such as total displacement, strain energy
density, and entropy per unit volume were obtained. These
parameters were then used to
develop artificial neural network models for analyzing normal and
atherosclerotic blood vessels. In this paper, the objectives of the
study, the methodology, and the significant observations are presented.
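Although the abstract gives no implementation details, the final classification step might be sketched as follows, assuming a scikit-learn MLP; the three feature columns are the FE-derived parameters named above, and all values are synthetic placeholders, not data from the study:

```python
# Minimal sketch: an MLP classifier over the three FE-derived parameters.
# Feature values below are synthetic placeholders, not data from the study.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Columns: total displacement, strain energy density, entropy per unit volume
X = np.array([
    [0.12, 1.8, 0.031],   # normal vessel (synthetic)
    [0.10, 1.6, 0.028],   # normal vessel (synthetic)
    [0.31, 4.2, 0.074],   # 50% plaque (synthetic)
    [0.29, 4.0, 0.070],   # 50% plaque (synthetic)
])
y = np.array([0, 0, 1, 1])  # 0 = normal, 1 = atherosclerotic

model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                    random_state=0))
model.fit(X, y)
print(model.predict([[0.30, 4.1, 0.072]]))  # should print [1]
```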
Abstract: A prototype model of an emulsion separator was
designed and manufactured. The separator is essentially a cylinder
filled with different fractal modules. The emulsion was fed into the
reactor by a peristaltic pump through an inlet placed at the boundary
between the two phases. For the hydrodynamic design and sizing of
the reactor, assumptions from filtration theory were adopted, and
methods to describe the separation process were developed. Based on
this methodology, and using numerical methods and Autodesk
software, the process was simulated in different operating modes.
The basic hydrodynamic characteristics (speed and performance) for
different types of fractal systems were also determined, along with
design decisions to optimize the reactor.
Abstract: Cognitive science appeared about 40 years ago, in the
wake of the challenge posed by artificial intelligence, as common
territory for several scientific disciplines such as IT, mathematics,
psychology, neurology, philosophy, sociology, and linguistics. The
newborn science was justified by the complexity of the problems
related to human knowledge on the one hand, and on the other by the
fact that none of the above-mentioned sciences could explain mental
phenomena alone. Based on data supplied by experimental sciences
such as psychology and neurology, cognitive science builds models
of the operation of the human mind. These models are implemented
in computer programs and/or electronic circuits (specific to artificial
intelligence), called cognitive systems, whose competences and
performances are compared to human ones, leading to the
reinterpretation of psychological and neurological data and to the
construction of new models. In these processes, while psychology
provides the experimental basis, philosophy and mathematics
provide the level of abstraction that is utterly necessary for
mediating between the sciences mentioned.
The general problematics of the cognitive approach yields two
important types of approach: the computational one, starting from
the idea that mental phenomena can be reduced to calculus
operations on 1s and 0s, and the connectionist one, which considers
the products of thinking to be the result of interaction among all the
component (included) systems. In psychology, measurement in the
computational register uses classical questionnaires and
psychometric tests, generally based on calculus methods.
Considering things from both of the sides that make up cognitive
science, we can notice a gap in the possibilities of measuring
psychological products from the connectionist perspective, which
requires a unitary understanding of the quality-quantity whole. In
such an approach, measurement by calculus proves inefficient. Our
research, carried out for more than 20 years, leads to the conclusion
that measurement by forms properly fits the laws and principles of
connectionism.
Abstract: The ever-growing usage of aspect-oriented
development methodology in the field of software engineering
requires tool support for both research environments and industry. So
far, tool support has been proposed for many activities in
aspect-oriented software development, to automate and facilitate
them. For instance, AJaTS provides a transformation system to
support aspect-oriented development and refactoring. In particular, it
is well established that the abstract interpretation of programs
pursued in static analysis, in any paradigm, is best served by a
high-level program representation such as the Control Flow Graph
(CFG). With such a representation, an analysis can more easily locate
common programmatic idioms for which helpful transformations are
already known, and the association between the input program and
the intermediate representation can be more closely maintained.
However, although current research defines, to some extent, sound
concepts and foundations for control flow analysis of aspect-oriented
programs, it does not provide a concrete tool that can construct the
CFG of such programs on its own. Furthermore, most of these works
focus on other issues in Aspect-Oriented Software Development
(AOSD), such as testing or data flow analysis, rather than on the
CFG itself. Therefore, this study is dedicated to building an
aspect-oriented control flow graph construction tool called
AJcFgraph Builder. The tool can be applied in many software
engineering tasks in the context of AOSD, such as software testing,
software metrics, and so forth.
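As an illustration of the kind of representation such a tool constructs, the following minimal Python sketch builds a CFG for a join point wrapped by before/after advice; the node names and traversal are illustrative, not AJcFgraph Builder's actual data model:

```python
# Minimal sketch of a CFG data structure of the kind such a tool would build.
# Node labels and the woven-advice example are illustrative.
from collections import defaultdict

class CFG:
    def __init__(self):
        self.succ = defaultdict(list)  # node -> successor nodes

    def add_edge(self, src, dst):
        self.succ[src].append(dst)

    def reachable(self, entry):
        """Depth-first traversal, the basis of most static analyses."""
        seen, stack = set(), [entry]
        while stack:
            node = stack.pop()
            if node not in seen:
                seen.add(node)
                stack.extend(self.succ[node])
        return seen

cfg = CFG()
# A join point wrapped by before/after advice, as in woven AspectJ code.
cfg.add_edge("entry", "before_advice")
cfg.add_edge("before_advice", "method_body")
cfg.add_edge("method_body", "after_advice")
cfg.add_edge("after_advice", "exit")
print(cfg.reachable("entry"))
```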
Abstract: Many high-risk pathogens that cause disease in
humans are transmitted through various food items. Food-borne
disease constitutes a major public health problem. Assessment of the
quality and safety of foods is important in human health. Rapid and
easy detection of pathogenic organisms will facilitate precautionary
measures to maintain healthy food. The Polymerase Chain Reaction
(PCR) is a handy tool for rapid detection of low numbers of bacteria.
We have designed gene-specific primers for the most common
food-borne pathogens, such as Staphylococci, Salmonella, and E. coli.
Bacteria were isolated from food samples from various food outlets
and identified using gene-specific PCR. We identified Staphylococci,
Salmonella, and E. coli O157 in various food samples using
gene-specific primers in a rapid, direct PCR technique. This study
helps us obtain a complete picture of the various pathogens that
threaten to cause and spread food-borne disease, and it would also
enable the establishment of a routine procedure and methodology for
rapid identification of food-borne bacteria by direct PCR. It will also
enable us to assess the effectiveness of the food safety measures
currently taken by food manufacturers and exporters.
Abstract: If an organization like Mellat Bank wants to identify its
customer market completely in order to reach its specified goals, it
can segment the market and offer the right product package to the
right segment. Our objective is to offer a segmentation model for the
Iranian banking market from Mellat Bank's point of view. The
methodology of this project combines "segmentation on the basis of
four part-quality variables" and "segmentation on the basis of
difference in means". The required data were gathered from
e-systems and the researcher's personal observation. Finally, the
research proposes that the organization first form a four-dimensional
matrix of 756 segments using four variables, named value-based,
behavioral, activity style, and activity level; second, calculate the
mean profit for every cell of the matrix at two distinct work levels
(α1: normal conditions and α2: high-pressure conditions) and
compare the segments by checking two conditions, (1) homogeneity
of every segment with its sub-segments and (2) heterogeneity with
other segments; the necessary segmentation process can then be
carried out. The final proposal (further explained by an operational
example and a feedback algorithm) is to test and update the model,
given the dynamic environment, technology, and banking system.
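The two-step procedure lends itself to a short illustration. The following Python sketch, with hypothetical column names and synthetic records rather than the bank's e-system data, computes the per-cell profit means and a simple heterogeneity test between segments:

```python
# Minimal sketch of the two-step procedure: cell means over a four-variable
# segmentation matrix, then a heterogeneity test. Column names are hypothetical.
import pandas as pd
from scipy import stats

# Synthetic customer records; the study's real data come from bank e-systems.
df = pd.DataFrame({
    "value_based":    ["high", "high", "low", "low", "high", "low"],
    "behavioral":     ["loyal", "loyal", "new", "new", "loyal", "new"],
    "activity_style": ["online", "branch", "online", "branch", "online", "online"],
    "activity_level": ["heavy", "light", "heavy", "light", "heavy", "heavy"],
    "profit":         [120.0, 80.0, 35.0, 20.0, 110.0, 40.0],
})

# Step 1: mean profit per cell of the four-dimensional matrix.
cells = df.groupby(["value_based", "behavioral",
                    "activity_style", "activity_level"])["profit"].mean()
print(cells)

# Step 2 (illustrative): test whether two groups of cells differ in mean
# profit, i.e., heterogeneity between segments.
a = df.query("value_based == 'high'")["profit"]
b = df.query("value_based == 'low'")["profit"]
print(stats.ttest_ind(a, b, equal_var=False))
```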
Abstract: Evolutionary Programming (EP) represents a
methodology of Evolutionary Algorithms (EA) in which mutation is
considered the main reproduction operator. This paper presents a
novel EP approach to Artificial Neural Network (ANN) learning.
The proposed strategy consists of two components: a self-adaptive
component, which contains phenotype information, and a dynamic
component, which is described by the genotype. Self-adaptation is
achieved by the addition of a value, called the network weight, which
depends on the total number of hidden layers and the average number
of neurons per hidden layer. The dynamic component changes its
value depending on the fitness of the chromosome exposed to
mutation. Thus, the mutation step size is controlled by two
components, encapsulated in the algorithm, which adjust it according
to the characteristics of the predefined ANN architecture and the
fitness of the particular chromosome. A comparative analysis of the
proposed approach and classical EP (Gaussian mutation) showed
that a significant acceleration of the evolution process is achieved by
using both phenotype and genotype information in the mutation strategy.
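The abstract does not give the exact scaling formulas; the following Python sketch shows one plausible form of such a mutation operator, with both the phenotype (network weight) and genotype (fitness) terms assumed for illustration:

```python
# Minimal sketch of a mutation step whose size combines a phenotype term
# (the "network weight", derived from the ANN architecture) and a genotype
# term (the chromosome's fitness). Both scaling formulas below are assumed
# for illustration; the abstract does not specify them.
import random

def network_weight(num_hidden_layers, avg_neurons_per_layer):
    # Assumed phenotype term: shrinks as the architecture grows.
    return 1.0 / (num_hidden_layers * avg_neurons_per_layer)

def mutate(weights, fitness, num_hidden_layers, avg_neurons_per_layer):
    phenotype_term = network_weight(num_hidden_layers, avg_neurons_per_layer)
    genotype_term = 1.0 / (1.0 + fitness)   # assumed: fitter -> smaller steps
    sigma = phenotype_term * genotype_term  # combined mutation step size
    return [w + random.gauss(0.0, sigma) for w in weights]

chromosome = [0.5, -0.3, 0.8]               # ANN weights encoded as genes
print(mutate(chromosome, fitness=2.0,
             num_hidden_layers=2, avg_neurons_per_layer=8))
```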
Abstract: The purpose of this paper is to consider the
introduction of online courses to replace the current classroom-based
staff training. The current training is practical, and must be
completed before access to the financial computer system is
authorized. The long-term objective is to measure the efficacy,
effectiveness and efficiency of the training, and to establish whether
a transfer of knowledge back to the workplace has occurred. This
paper begins with an overview explaining the importance of staff
training in an evolving, competitive business environment and
defines the problem facing this particular organization. A summary
of the literature review is followed by a brief discussion of the
research methodology and objective. The implementation of the
alpha version of the online course is then described. This paper may
be of interest to those seeking insights into, or new theory regarding,
practical interventions of online learning in the real world.
Abstract: How Gateway servers can effectively allocate system
resources to process client requests is a challenging problem. In this
paper, we propose an improved scheme for the autonomous
performance of Gateway servers under highly dynamic traffic loads.
We devise a methodology that calculates queue length and waiting
time from Gateway server information in order to reduce response
time variance in the presence of bursty traffic. The most widespread
consideration is performance, because Gateway servers must offer
cost-effective, high-availability services over long periods, and thus
have to be scaled to meet the expected load. Performance
measurements can be the basis for performance modeling and
prediction. With the help of performance models, performance
metrics (such as buffer size estimates and waiting time) can be
determined during the development process. This paper describes the
possible queue models that can be applied to estimate the queue
length and, from it, the final value of the memory size. Both
simulation and experimental studies using synthesized workloads,
together with an analysis of real-world Gateway servers, demonstrate
the effectiveness of the proposed system.
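The abstract does not commit to a single queue model; as one illustration, the following Python sketch computes queue length and waiting time under M/M/1 assumptions, the simplest of the candidate models:

```python
# Minimal sketch: queue length and waiting time under an M/M/1 model, one of
# the possible queue models referred to above (the abstract does not fix one).
def mm1_metrics(arrival_rate, service_rate):
    """Return (mean queue length Lq, mean waiting time Wq) for M/M/1."""
    rho = arrival_rate / service_rate          # server utilization
    if rho >= 1.0:
        raise ValueError("unstable queue: utilization >= 1")
    lq = rho ** 2 / (1.0 - rho)                # mean number waiting
    wq = lq / arrival_rate                     # Little's law: Wq = Lq / lambda
    return lq, wq

# Example: 80 requests/s arriving, server handles 100 requests/s.
lq, wq = mm1_metrics(arrival_rate=80.0, service_rate=100.0)
print(f"queue length ~ {lq:.2f} requests, waiting time ~ {wq*1000:.1f} ms")
# Sizing the buffer from such an estimate of Lq gives the memory-size
# estimate the abstract refers to.
```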
Abstract: Using a methodology grounded in business process
change theory, we investigate the critical success factors that affect
ERP implementation success in the United States and India.
Specifically, we examine the ERP implementation at two case study
companies, one in each country. Our findings suggest that certain
factors that affect the success of ERP implementations are not
culturally bound, whereas some critical success factors depend on the
national culture of the country in which the system is being
implemented. We believe that the understanding of these critical
success factors will deepen the understanding of ERP
implementations and will help avoid implementation mistakes,
thereby increasing the rate of success in culturally different contexts.
Implications of the findings and future research directions for both
academicians and practitioners are also discussed.
Abstract: Testing accounts for a major share of the technical
effort in the software development process. Typically, it
consumes more than 50 percent of the total cost of developing a
piece of software. The selection of software tests is a very important
activity within this process to ensure the software reliability
requirements are met. Generally, tests are run to achieve maximum
coverage of the software code, and very little attention is given to the
achieved reliability of the software. Using an existing methodology,
this paper describes how to use Bayesian Belief Networks (BBNs) to
select unit tests based on their contribution to the reliability of the
module under consideration. In particular, the work examines how the
approach can enhance test-first development by assessing the quality
of test suites resulting from this development methodology and by
providing insight into additional tests that can significantly improve
the achieved reliability. In this way, the method can produce an
optimal selection of inputs and the order in which the tests are
executed to maximize the software reliability. To illustrate this
approach, a belief network is constructed for a modern software
system, incorporating expert opinion, expressed as probabilities, on
the relative quality of the software's elements and on the potential
effectiveness of the software tests. The steps involved in
constructing the Bayesian network are explained, as is a method to
account for the test suite resulting from test-driven development.
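The heart of such a belief network is Bayes' rule applied to test outcomes. The following Python sketch, with illustrative probabilities rather than values from the paper's network, shows how passing unit tests update the belief that a module is reliable:

```python
# Minimal sketch of the Bayesian reasoning behind such a belief network:
# updating the probability that a module is reliable after a passing test.
# All probabilities are illustrative, not values from the paper's network.
def posterior_reliable(prior, p_pass_given_reliable, p_pass_given_faulty):
    """P(reliable | test passed) via Bayes' theorem."""
    p_pass = (p_pass_given_reliable * prior
              + p_pass_given_faulty * (1.0 - prior))
    return p_pass_given_reliable * prior / p_pass

belief = 0.70                      # prior belief that the module is reliable
for _ in range(3):                 # three independent passing unit tests
    belief = posterior_reliable(belief,
                                p_pass_given_reliable=0.99,
                                p_pass_given_faulty=0.60)
    print(f"belief after passing test: {belief:.3f}")
# Tests whose pass/fail outcome moves this belief the most are the ones a
# BBN-based selection would prioritize.
```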
Abstract: Monitoring of ecological systems is one of the major
issues in ecosystem research. The concepts and methodology of
mathematical systems theory provide useful tools to face this
problem. In many cases, state monitoring of a complex ecological
system consists in observation (measurement) of certain state
variables, and the whole state process has to be determined from the
observed data. The solution proposed in the paper is the design of an
observer system, which makes it possible to approximately recover
the state process from its partial observation. The method is
illustrated with a trophic chain of resource-producer-primary
consumer type, and a numerical example is also presented.
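The abstract does not spell out the observer construction; for a linear(ized) system, a standard Luenberger-type design of the kind referred to here takes the following form (a sketch of the general technique, not necessarily the paper's exact design):

```latex
% System dynamics with partial observation, and a Luenberger-type observer
% (a standard construction; the paper's specific design may differ).
\begin{align*}
\dot{x}(t) &= A\,x(t), \qquad y(t) = C\,x(t) \quad \text{(observed variables)} \\
\dot{\hat{x}}(t) &= A\,\hat{x}(t) + L\bigl(y(t) - C\,\hat{x}(t)\bigr)
\end{align*}
% The gain L is chosen so that A - LC is stable; the estimation error
% e = x - \hat{x} then obeys \dot{e} = (A - LC)\,e and decays to zero,
% so the whole state process is approximately recovered from y.
```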
Abstract: This paper proposes a concept for aerocapture with an
aerodynamic-environment-adaptive, variable-geometry flexible
aeroshell that the vehicle deploys. The flexible membrane, composed
of thin-layer film or textile, serves as the aeroshell in order to solve
some of the problems obstructing the realization of the aerocapture
technique.
A multi-objective optimization study is conducted to investigate
solutions and derive design guidelines. As a result, solutions that
avoid excessive aerodynamic heating and enlarge the corridor width
up to 10% are obtained, demonstrating the effectiveness of the
concept. The deformation-use optimum solution changes its drag
coefficient from 1.6 to 1.1 along with the change in dynamic
pressure. Moreover, the optimization results show that the
deformation-use solution requires a membrane whose upper
temperature limit and strain limit exceed 700 K and 120%,
respectively, and whose elasticity (Young's modulus) is of the order
of 10^6 Pa.
Abstract: Simultaneous Saccharification and Fermentation (SSF) of corn flour, a major agricultural product, was studied as the substrate in batch fermentation, using the starch-digesting glucoamylase enzyme derived from Aspergillus niger and the non-starch-digesting, sugar-fermenting yeast Saccharomyces cerevisiae. Experiments based on a Central Composite Design (CCD) were conducted to study the effect of substrate concentration, pH, temperature, and enzyme concentration on ethanol concentration, and these parameters were optimized using Response Surface Methodology (RSM). The optimum values of substrate concentration, pH, temperature, and enzyme concentration were found to be 160 g/l, 5.5, 30°C, and 50 IU, respectively. The effect of inoculum age on ethanol concentration was also investigated. A corn flour solution equivalent to 16% initial starch concentration gave the highest ethanol concentration, 63.04 g/l, after 48 h of fermentation at the optimum pH and temperature. The Monod and logistic models were used for growth kinetics, and the Luedeking-Piret model was used for product formation kinetics.
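For reference, the three kinetic models named above have the following standard forms (parameter symbols follow common usage; the fitted values are not stated in the abstract):

```latex
% Standard forms of the kinetic models named above (conventional symbols:
% S substrate, X biomass, P ethanol; mu_max, K_s, X_max, alpha, beta fitted).
\begin{align*}
\mu &= \frac{\mu_{\max}\, S}{K_s + S} && \text{(Monod growth kinetics)} \\
\frac{dX}{dt} &= \mu_{\max}\, X \left(1 - \frac{X}{X_{\max}}\right) && \text{(logistic growth)} \\
\frac{dP}{dt} &= \alpha\, \frac{dX}{dt} + \beta\, X && \text{(Luedeking--Piret product formation)}
\end{align*}
```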
Abstract: Dengue disease is an infectious vector-borne viral
disease that is commonly found in tropical and sub-tropical regions,
especially in urban and semi-urban areas, around the world and
including Malaysia. There is no currently available vaccine or
chemotherapy for the prevention or treatment of dengue disease.
Therefore, prevention and treatment of the disease depend on vector
surveillance and control measures. Disease risk mapping has been
recognized as an important tool in the prevention and control
strategies for diseases. The choice of statistical model used for
relative risk estimation is important as a good model will
subsequently produce a good disease risk map. Therefore, the aim of
this study is to estimate the relative risk for dengue disease based
initially on the most common statistic used in disease mapping, the
Standardized Morbidity Ratio (SMR), and then on one of the earliest
applications of Bayesian methodology, the Poisson-gamma model.
This paper begins by providing a review of the SMR method, which
we then apply to dengue data of Perak, Malaysia. We then fit an
extension of the SMR method, which is the Poisson-gamma model.
Both results are displayed and compared using graphs, tables, and
maps. The results of the analysis show that the latter method gives
better relative risk estimates than the SMR. The Poisson-gamma
model is demonstrated to overcome the problem the SMR faces
when there are no observed dengue cases in certain regions.
However, covariate adjustment in this model is difficult, and there is
no possibility of allowing for spatial correlation between risks in
adjacent areas. The drawbacks of this model have motivated many
researchers to propose alternative methods for estimating the risk.
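In standard notation, with O_i and E_i the observed and expected case counts in region i, the two estimators compared above are:

```latex
% The two estimators in standard notation: O_i and E_i are the observed and
% expected dengue counts in region i, and theta_i is the relative risk.
\begin{align*}
\widehat{\theta}_i^{\,\text{SMR}} &= \frac{O_i}{E_i} \\
O_i \mid \theta_i \sim \text{Poisson}(E_i\,\theta_i), &\qquad \theta_i \sim \text{Gamma}(\alpha, \beta) \\
\widehat{\theta}_i^{\,\text{PG}} = \mathbb{E}[\theta_i \mid O_i] &= \frac{\alpha + O_i}{\beta + E_i}
\end{align*}
% When O_i = 0 the SMR collapses to zero, whereas the Poisson-gamma posterior
% mean shrinks toward the prior mean alpha/beta, which is the advantage noted.
```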
Abstract: In this study, we propose a network architecture for
providing secure access to information resources of enterprise
network from remote locations in a wireless fashion. Our proposed
architecture offers a very promising solution for organizations that
need a secure, flexible, and cost-effective remote access
methodology. The security of the proposed architecture is based on
Virtual Private Network technology and a special role-based access
control mechanism with location and time constraints. The flexibility
mainly comes from the use of the Internet as the communication
medium, and the cost-effectiveness is due to the possibility of
in-house implementation of the proposed architecture.
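As a rough illustration of a role-based access check with location and time constraints, consider the following Python sketch; the roles, permissions, and constraint format are hypothetical, not the paper's actual scheme:

```python
# Minimal sketch of a role-based access check with the location and time
# constraints the architecture attaches to roles. Role and permission names
# and the constraint format are hypothetical.
from datetime import time

ROLE_CONSTRAINTS = {
    # role: (allowed permissions, allowed locations, allowed time window)
    "accountant": ({"read_ledger", "post_entry"},
                   {"HQ", "branch-12"},
                   (time(8, 0), time(18, 0))),
    "auditor":    ({"read_ledger"},
                   {"HQ"},
                   (time(9, 0), time(17, 0))),
}

def access_allowed(role, permission, location, now):
    perms, locations, (start, end) = ROLE_CONSTRAINTS[role]
    return permission in perms and location in locations and start <= now <= end

# A VPN-authenticated auditor connecting from HQ at 10:30 may read the ledger:
print(access_allowed("auditor", "read_ledger", "HQ", time(10, 30)))         # True
# ...but not from an unlisted location:
print(access_allowed("auditor", "read_ledger", "cafe-wifi", time(10, 30)))  # False
```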
Abstract: The Border Gateway Protocol (BGP) is the standard routing protocol between autonomous systems (AS) in the Internet. In the event of a failure, empirical measurements have shown a considerable delay in BGP convergence. During the convergence time, BGP repeatedly advertises new routes to some destination and withdraws old ones until it reaches a stable state. It has been found that the KEEPALIVE message timer and the HOLD time are two parameters affecting the convergence speed. This paper aims to find the optimum values for the KEEPALIVE timer and the HOLD time that maximally reduce the convergence time without increasing the traffic. The optimal KEEPALIVE message timer value found in this paper is 30 seconds instead of 60 seconds, and the optimal value for the HOLD time is 90 seconds instead of 180 seconds.
Abstract: The main aim of this research is to develop a methodology that encourages people's awareness, knowledge, and understanding of participation in flood management for cultural heritage, through cooperation and interaction among the government, private, and public sectors, based on role-play gaming simulation theory. The approach of this research is to develop a role-play gaming simulation from existing documents, games, and role-playing exercises from several sources, together with existing data on the research site. We found that role-play gaming simulation can be implemented to help improve understanding of the existing problem and of the impact of floods on cultural heritage. The role-play game can be developed into a tool for improving people's knowledge, understanding, and awareness of participation in flood management for cultural heritage; moreover, cooperation among the government, private, and public sectors can be improved through role-play gaming simulation.
Abstract: This paper applies Bayesian Networks to support
information extraction from unstructured, ungrammatical, and
incoherent data sources for semantic annotation. A tool has been
developed that combines ontologies, machine learning, information
extraction, and probabilistic reasoning techniques to support the
extraction process. Data acquisition is performed with the aid of
knowledge specified in the form of an ontology. Due to the
variable size of information available on different data sources, it is
often the case that the extracted data contains missing values for
certain variables of interest. It is desirable in such situations to
predict the missing values. The methodology, presented in this paper,
first learns a Bayesian network from the training data and then uses it
to predict missing data and to resolve conflicts. Experiments have
been conducted to analyze the performance of the presented
methodology. The results look promising, as the methodology
achieves a high degree of precision and recall for information
extraction and reasonably good accuracy for predicting missing
values.
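The missing-value prediction step can be illustrated with a minimal two-variable example; the field names and records below are synthetic, and the real system learns a full Bayesian network rather than a single conditional table:

```python
# Minimal sketch of the prediction step: learn a conditional probability table
# from complete training records, then fill a missing field with its most
# probable value given an observed field. Field names are hypothetical.
from collections import Counter, defaultdict

# Training records extracted from (synthetic) semi-structured postings.
train = [
    {"category": "laptop", "condition": "used"},
    {"category": "laptop", "condition": "used"},
    {"category": "laptop", "condition": "new"},
    {"category": "phone",  "condition": "new"},
    {"category": "phone",  "condition": "new"},
]

# Learn P(condition | category) by counting, i.e., maximum likelihood.
cpt = defaultdict(Counter)
for rec in train:
    cpt[rec["category"]][rec["condition"]] += 1

def predict_condition(category):
    counts = cpt[category]
    return max(counts, key=counts.get)  # most probable value

# A new extraction with a missing 'condition' field:
incomplete = {"category": "laptop", "condition": None}
incomplete["condition"] = predict_condition(incomplete["category"])
print(incomplete)   # {'category': 'laptop', 'condition': 'used'}
```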
Abstract: Rice husk is one of the alternative fuels for Thailand because of its high potential and environmental benefits. Nonetheless, the environmental profile of electricity production from rice husk must be assessed to ensure reduced environmental damage. A 10 MW pilot plant using rice husk as feedstock is the study site. The environmental impacts of the rice husk power plant are evaluated using the Life Cycle Assessment (LCA) methodology. Energy, material, and carbon balances have been determined to trace the system flows. Carbon closure has been used to describe the net amount of CO2 released by the system in relation to the amount recycled between the power plant and the CO2 absorbed by the rice husk. The transportation of rice husk to the power plant has a significant effect on global warming, but not on acidification or photo-oxidant formation. The results showed that the impact potentials of the rice husk power plant are lower than those of conventional plants for most of the categories considered, except for the photo-oxidant formation potential from CO. The high CO emissions from the rice husk power plant may be due to low boiler efficiency and the high moisture content of rice husk. The performance of the study site can be enhanced by improving the combustion efficiency.
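One common formulation of the carbon closure metric in biomass LCA is the following (an assumed form for illustration; the paper may define the metric slightly differently):

```latex
% One common formulation of carbon closure in biomass LCA (an assumed form
% for illustration; the paper may define the metric slightly differently):
\[
\text{Carbon closure } (\%) = 100 \times
  \frac{C_{\text{recycled}}}{C_{\text{recycled}} + C_{\text{net}}}
\]
% C_recycled: CO2 re-absorbed by the growing rice (recycled between the power
% plant and the biomass); C_net: net CO2 released from the system.
```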