Abstract: Overall cost is a significant consideration in any decision-making process. Although many studies have been carried out on overall cost in construction, few have treated the uncertainties of the real life cycle. On the basis of several case studies, a feedback process was performed on the historical data of the studied buildings. This process made it possible to identify some factors causing uncertainty during the operational period. As a result, the research proposes a new method for assessing the overall cost during a part of the building's life cycle, taking into account the building's actual value, its end-of-life value and the influence of the identified life cycle uncertainty factors. The findings are a step towards a higher level of reliability in overall cost evaluation, accounting for some usually unexpected uncertainty factors.
Abstract: Requirements engineering has been the subject of a large volume of research due to the significant role it plays in the software development life cycle. However, the software industry changes much faster than requirements engineering approaches advance. Therefore, this paper aims to systematically review and evaluate current research in requirements engineering and to identify new research trends and directions in this field. In addition, various research methods associated with evaluation-based techniques and empirical studies in the requirements engineering field are highlighted. Finally, challenges and recommendations on future research directions are presented, based on the research team's observations during this study.
Abstract: The aim of this research is to determine how preservice Turkish teachers perceive themselves in terms of problem-solving skills. Students attending the Department of Turkish Language Teaching of Gazi University Education Faculty in the 2005-2006 academic year constitute the study group (n = 270) of this research, in which the survey model was utilized. Data were obtained with the Problem Solving Inventory developed by Heppner & Petersen and a Personal Information Form. Within this research, the Cronbach's alpha reliability coefficient of the scale was found to be .87. Besides, the reliability coefficient obtained by the split-half technique, which splits the odd- and even-numbered items of the scale, was found to be r = .81 (split-half reliability). The findings of the research revealed that preservice Turkish teachers were sufficiently qualified in problem-solving skills, and statistical significance was found in favor of male candidates for the "gender" variable. For the "grade" variable, statistical significance was found in favor of 4th graders.
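For reference, split-half reliability of this kind can be computed directly from the item scores; a minimal Python sketch, assuming a respondents-by-items score matrix (the abstract does not state whether the reported r = .81 includes the Spearman-Brown correction shown on the last line):

```python
import numpy as np

def split_half_reliability(items):
    """items: (n_respondents, n_items) array of inventory item scores.
    Correlates odd- and even-numbered item halves, then applies the
    Spearman-Brown correction to estimate full-length reliability."""
    odd = items[:, 0::2].sum(axis=1)    # items 1, 3, 5, ...
    even = items[:, 1::2].sum(axis=1)   # items 2, 4, 6, ...
    r = np.corrcoef(odd, even)[0, 1]    # half-test correlation
    return 2 * r / (1 + r)              # Spearman-Brown corrected estimate
```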
Abstract: This paper proposes a high-level feature for online Lao handwriting recognition. The feature must be high-level enough that it does not change when characters are written by different people at different speeds and proportions (shorter or longer strokes, heads, tails, loops, curves). In this high-level feature, a character is divided into a sequence of curve segments, where a new segment starts where the curve reverses rotation (between counterclockwise and clockwise). In each segment, the following features are gathered: the cumulative change in curve direction (negative for clockwise), the cumulative curve length, and the cumulative lengths of left-to-right, right-to-left, top-to-bottom and bottom-to-top movement (the cumulative changes along the X and Y axes of the segment). This feature is simple yet robust enough for high-accuracy recognition. The features can be gathered in a single parse of the original time-sampled sequence of X, Y pen locations, without re-sampling. We also experiment with other segmentation points, such as the maximum-curvature points widely used by other researchers. Experimental results show a recognition rate of 94.62%, compared with 75.07% when using maximum-curvature points. This difference is due to the large variation of turning points in handwriting.
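A minimal sketch of such rotation-reversal segmentation and per-segment feature accumulation in Python, under our own assumptions about the representation (the paper's exact definitions may differ):

```python
import math

def segment_features(stroke):
    """stroke: list of (x, y) pen samples in time order. Starts a new segment
    where the curve's turning direction reverses sign, and accumulates the
    features the abstract lists: signed direction change (negative for
    clockwise), curve length, and directional X/Y travel."""
    blank = lambda: dict(turn=0.0, length=0.0, dx_pos=0.0, dx_neg=0.0,
                         dy_pos=0.0, dy_neg=0.0)
    feats, seg, prev_cross = [], blank(), 0.0
    for (x0, y0), (x1, y1), (x2, y2) in zip(stroke, stroke[1:], stroke[2:]):
        ax, ay = x1 - x0, y1 - y0
        bx, by = x2 - x1, y2 - y1
        cross = ax * by - ay * bx            # sign gives rotation direction
        if prev_cross * cross < 0:           # rotation reversed: new segment
            feats.append(seg)
            seg = blank()
        denom = math.hypot(ax, ay) * math.hypot(bx, by) or 1.0
        seg["turn"] += math.asin(max(-1.0, min(1.0, cross / denom)))
        seg["length"] += math.hypot(bx, by)
        seg["dx_pos"] += max(bx, 0.0); seg["dx_neg"] += max(-bx, 0.0)
        seg["dy_pos"] += max(by, 0.0); seg["dy_neg"] += max(-by, 0.0)
        prev_cross = cross or prev_cross     # keep last nonzero sign
    feats.append(seg)
    return feats
```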
Abstract: This work studies the role of the fluctuating density gradient in compressible flows for computational fluid dynamics (CFD). A new anisotropy tensor based on the fluctuating density gradient is introduced and used in an invariant modeling technique to model the turbulent density gradient correlation equation derived from the continuity equation. The modeled equation is decomposed into three groups: one proportional to the mean velocity, one proportional to the mean strain rate, and one proportional to the mean density. The characteristics of the correlation in a wake are extracted from the results of a two-dimensional direct simulation and show a strong correlation with the vorticity in the wake near the body. Thus, it can be concluded that the density gradient correlation is a significant parameter for describing the rapid generation of turbulent properties in compressible flows.
Abstract: In this paper, fully developed flow and heat transfer of viscoelastic materials in curved ducts of square cross section under constant heat flux are investigated. Here, a staggered mesh is used as the computational grid, and the flow and heat transfer variables are allocated on this mesh with the marker-and-cell method. The governing equations are solved numerically with the FTCS (forward-time, centered-space) finite difference method. Furthermore, the Criminale-Ericksen-Filbey (CEF) constitutive equation is used as the viscoelastic model. The CEF constitutive equation is a suitable model for studying steady shear flow of viscoelastic materials, as it can capture the effects of both the first and second normal stress differences. Here, it is shown that the first and second normal stress differences have noticeable and opposite effects on the secondary flow intensity and the mean Nusselt number, which is the main novelty of the current research.
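For readers unfamiliar with the scheme, FTCS in its simplest setting is sketched below for 1D diffusion only; this illustrates the named method, not the paper's coupled curved-duct flow and energy solver:

```python
import numpy as np

# FTCS (forward-time, centered-space) for u_t = alpha * u_xx on a 1D grid.
alpha, dx, dt, nx, nt = 1.0, 0.02, 1e-4, 51, 500
assert alpha * dt / dx**2 <= 0.5        # explicit-scheme stability limit
u = np.zeros(nx)
u[nx // 2] = 1.0                        # initial spike, fixed ends (u = 0)
for _ in range(nt):
    u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
```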
Abstract: Modeling of distributed systems allows us to represent their whole functionality. A working system instance rarely fulfils the whole functionality represented by the model; usually some parts of this functionality need to be accessible only periodically. A reporting system based on the Data Warehouse concept seems to be an intuitive example of a system in which some functionality is required only from time to time. When analyzing the enterprise risk associated with periodical changes of system functionality, we should consider not only the inaccessibility of components (objects) but also of their functions (methods), and the impact of such a situation on system functionality from the business point of view. In the paper we suggest that these risk attributes should be estimated from risk attributes specified at the requirements level (Use Cases in the UML model), on the basis of information about the structure of the model (presented at other levels of the UML model). We argue that it is desirable to consider the influence of periodical changes in requirements on the enterprise risk estimation. Finally, a proposal for such a solution, based on the UML system model, is presented.
Abstract: Potatoes are a good source of carotenoids, which are lipophilic compounds synthesized in plastids from isoprenoids. The aim of this research was to determine the carotenoid content, in relation to colour, of organically and conventionally cultivated potato genotypes before and after a period of storage. In cooperation with the State Priekuli Plant Breeding Institute (Latvia), six potato genotypes were studied: 'Agrie dzeltenie', 'Prelma', 'Imanta', 'S-03135-10', 'S-99108-8' and 'S-01063-5'. All the genotypes were cultivated under three different conditions: organically and under two conventional regimes. The carotenoid content was determined with a spectrophotometer, and the colour with the L*a*b* system. The results of the current research show that after the period of storage the carotenoid content had increased: in conventionally cultivated potatoes it varies from 228.514 to 552.434 μg 100 g⁻¹, while in organically cultivated potato genotypes it varies from 45.485 to 662.699 μg 100 g⁻¹ FW. The colour of the potato flesh changed during storage.
Abstract: The Fuzzy C-means (FCM) clustering algorithm is frequently used in pattern recognition. It has the advantage of giving good modeling results in many cases, although it cannot determine the number of clusters by itself. In the FCM algorithm, most researchers fix the weighting exponent (m) at the conventional value of 2, which might not be appropriate for all applications. Consequently, the main objective of this paper is to use the subtractive clustering algorithm to provide the optimal number of clusters needed by the FCM algorithm, by optimizing the parameters of the subtractive clustering algorithm with an iterative search approach, and then to find an optimal weighting exponent (m) for the FCM algorithm. To get an optimal number of clusters, the iterative search approach is used to find the optimal single-output Sugeno-type Fuzzy Inference System (FIS) model by optimizing the parameters of the subtractive clustering algorithm so as to minimize the least-squares error between the actual data and the Sugeno fuzzy model. Once the number of clusters is optimized, two approaches are proposed to optimize the weighting exponent (m) in the FCM algorithm, namely an iterative search approach and genetic algorithms. The approach is tested on data generated from the original function, and optimal fuzzy models are obtained with minimal error between the real data and the obtained fuzzy models.
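For context, the core FCM iteration that the tuned c and m feed into is standard; a minimal NumPy sketch (the cluster count c and exponent m would come from the paper's subtractive-clustering and search steps, which are not shown):

```python
import numpy as np

def fcm(X, c, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Standard fuzzy C-means on data X of shape (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, X.shape[0]))
    U /= U.sum(axis=0)                              # memberships sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        U_new = d ** (-2.0 / (m - 1.0))             # standard membership update
        U_new /= U_new.sum(axis=0)
        if np.abs(U_new - U).max() < tol:
            return centers, U_new
        U = U_new
    return centers, U
```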
Abstract: Wireless sensor networks have been used in a wide range of applications and have become an attractive research area in recent years. Because of the limited energy storage capability of sensor nodes, energy consumption is one of the most challenging aspects of these networks, and different strategies and protocols deal with it. This paper presents general methods for designing low-power wireless sensor networks. Different sources of energy consumption in these networks are discussed, and techniques for reducing energy consumption are presented.
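As a back-of-the-envelope illustration of why such techniques matter, consider duty cycling with assumed current draws (illustrative numbers, not values from the paper):

```python
# Average current and battery life of a node that wakes periodically.
I_active_mA, I_sleep_mA = 20.0, 0.005   # assumed radio-on vs. deep-sleep draw
duty = 0.01                             # node awake 1% of the time
battery_mAh = 2400                      # roughly two AA cells
I_avg = duty * I_active_mA + (1 - duty) * I_sleep_mA
print(f"average {I_avg:.3f} mA -> about {battery_mAh / I_avg / 24:.0f} days")
```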
Abstract: Artificial Immune Systems have been applied as heuristic algorithms for decades. Nevertheless, many of these applications took advantage of the algorithm's strengths but seldom proposed approaches for enhancing its efficiency. In this paper, a Self-evolving Artificial Immune System is proposed, developing the T and B cells of the immune system and building a self-evolving mechanism that adapts to the complexities of different problems. This research focuses on enhancing the efficiency of clonal selection, which is responsible for producing high-affinity antibodies to resist invading antigens. T and B cells are the main mechanisms by which clonal selection produces different combinations of antibodies. Therefore, the development of the T and B cells influences the efficiency of clonal selection in searching for better solutions. Furthermore, for better cooperation between the two cells, a co-evolutionary strategy is applied to coordinate them for more effective production of antibodies. This work finally adopts flow-shop scheduling instances from the OR-Library to validate the proposed algorithm.
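For orientation, a plain clonal selection baseline on the permutation flow-shop problem might look like the sketch below; the paper's self-evolving T/B-cell development and co-evolution mechanisms are not reproduced here:

```python
import random

def makespan(perm, proc):
    """Permutation flow-shop makespan; proc[j][k] = time of job j on machine k."""
    t = [0.0] * len(proc[0])
    for j in perm:
        t[0] += proc[j][0]
        for k in range(1, len(t)):
            t[k] = max(t[k], t[k - 1]) + proc[j][k]
    return t[-1]

def clonal_selection(proc, pop_size=20, elites=5, gens=300, seed=1):
    random.seed(seed)
    n = len(proc)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]  # antibodies
    for _ in range(gens):
        pop.sort(key=lambda p: makespan(p, proc))   # affinity ~ 1 / makespan
        clones = []
        for ab in pop[:elites]:                     # clone the best antibodies
            for _ in range(elites):
                c = ab[:]
                i, j = random.sample(range(n), 2)   # hypermutation: random swap
                c[i], c[j] = c[j], c[i]
                clones.append(c)
        pop = sorted(pop + clones, key=lambda p: makespan(p, proc))[:pop_size]
    return pop[0], makespan(pop[0], proc)

# Tiny example: 4 jobs x 3 machines.
proc = [[5, 2, 4], [3, 6, 2], [6, 3, 5], [2, 4, 3]]
print(clonal_selection(proc))
```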
Abstract: In this work, we consider a deterministic model for the transmission of leptospirosis, which is currently spreading in the Thai population. An SIR model incorporating the features of this disease is applied to epidemiological data from Thailand. The numerical solutions of the SIR equations are seen to be in good agreement with the real empirical data. Further improvements are discussed.
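For reference, the standard SIR system underlying such a model reads as follows; the paper's leptospirosis variant may add disease-specific terms (e.g. an animal reservoir), so this is only the textbook form:

```latex
\begin{aligned}
\frac{dS}{dt} &= -\beta S I, &
\frac{dI}{dt} &= \beta S I - \gamma I, &
\frac{dR}{dt} &= \gamma I,
\end{aligned}
\qquad R_0 = \frac{\beta}{\gamma},
```

where S, I and R are the susceptible, infectious and recovered populations, β the transmission rate and γ the recovery rate.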
Abstract: This paper investigates the problems associated with enhancing recovery of light crude oil-water emulsions in an oil field of the Algerian Sahara. Measurements were taken in experiments using a RheoStress RS600 rheometer. Factors such as shear rate, temperature and light-oil concentration were considered for their effect on the viscosity behavior. Experimental measurements were performed in terms of shear stress versus shear rate, yield stress and flow index on mixtures of light crude oil and water. The rheological behavior of the emulsions was non-Newtonian shear-thinning (Herschel-Bulkley). The laboratory experiments showed that some water-in-light-crude-oil emulsions form and remain stable during the oil recovery process. Breaking the emulsions with additives may involve higher costs and could be very expensive. Therefore, further research should be directed at finding solutions to the problems encountered.
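The Herschel-Bulkley model named above relates shear stress to shear rate through the three quantities the abstract measures:

```latex
\tau = \tau_0 + K \,\dot{\gamma}^{\,n}, \qquad n < 1 \ \text{(shear thinning)},
```

where τ₀ is the yield stress, K the consistency index and n the flow index.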
Abstract: Semisolid metal processing uses solid-liquid slurries containing fine, globular solid particles uniformly distributed in a liquid matrix, which can be handled as a solid yet flow like a liquid. In recent years, many methods have been introduced for producing semisolid slurries, since the process is scientifically sound and industrially viable when the feedstock materials have the preferred, so-called thixotropic microstructures. One such process, which requires very low equipment investment and running costs, is the cooling slope. In this research, using a mechanical-stirrer slurry maker constructed by the authors, the effects of mechanical stirring parameters such as stirring time, stirring temperature and stirring speed on the microstructure and mechanical properties of A360 aluminum alloy in semisolid forming are investigated. It is determined that the mold temperature and the holding time of the part at 580 °C have a great effect on the microstructure and mechanical properties (at a stirring temperature of 585 °C, a stirring time of 20 minutes and a stirring speed of 425 rpm). By optimizing the forming parameters, the dendritic microstructure changes to globular and the mechanical properties improve. This is because of the breaking and globularization of the primary α-Al dendrites.
Abstract: Classifying biomedical literature is a difficult and challenging task, especially when a large number of biomedical articles must be organized into a hierarchical structure. In this paper, we present an approach for classifying a collection of biomedical text abstracts downloaded from the Medline database with the help of ontology alignment. To accomplish our goal, we construct two types of hierarchies, the OHSUMED disease hierarchy and the Medline abstract disease hierarchies, from the OHSUMED dataset and the Medline abstracts, respectively. Then, we enrich the OHSUMED disease hierarchy before adapting it to the ontology alignment process for finding probable concepts or categories. Subsequently, we compute the cosine similarity between the vectors of probable concepts (in the "enriched" OHSUMED disease hierarchy) and the vectors in the Medline abstract disease hierarchies. Finally, we assign a category to each new Medline abstract based on the similarity score. The results obtained from the experiments show that the performance of our proposed approach for hierarchical classification is slightly better than that of multi-class flat classification.
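The final assignment step reduces to nearest-category cosine similarity; a minimal sketch, assuming tf-idf style vectors (the vectorization used in the paper is not specified here):

```python
import numpy as np

def cosine(u, v):
    return float(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)

def assign_category(abstract_vec, concept_vecs):
    """concept_vecs: {category: vector of a probable concept found by the
    ontology alignment step}; returns the best-scoring category."""
    return max(concept_vecs, key=lambda c: cosine(abstract_vec, concept_vecs[c]))
```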
Abstract: This paper describes the process of recognizing and classifying brain images as normal or abnormal based on PSO-SVM. Image classification is becoming more important for the medical diagnosis process. In the medical area, especially for diagnosis, the abnormality of the patient is classified, which plays a great role in enabling doctors to diagnose the patient according to the severity of the disease. In the case of DICOM images it is very difficult to achieve optimal recognition and early detection of diseases. Our work focuses on recognition and classification of DICOM images based on a collective digital image processing approach. For optimal recognition and classification, Particle Swarm Optimization (PSO), Genetic Algorithm (GA) and Support Vector Machine (SVM) are used. The collective PSO-SVM approach gives high approximation capability and much faster convergence.
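One common way to couple PSO with an SVM is to let the swarm tune the SVM hyperparameters against cross-validated accuracy; a sketch under that assumption (the paper's exact PSO-SVM pipeline, and the role of the GA, may differ):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def pso_svm(X, y, n_particles=10, iters=20, seed=0):
    """PSO over (log10 C, log10 gamma); fitness = 3-fold CV accuracy."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array([-2.0, -4.0]), np.array([3.0, 1.0])   # search box
    pos = rng.uniform(lo, hi, size=(n_particles, 2))
    vel = np.zeros_like(pos)

    def fitness(p):
        return cross_val_score(SVC(C=10 ** p[0], gamma=10 ** p[1]),
                               X, y, cv=3).mean()

    pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_f.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        f = np.array([fitness(p) for p in pos])
        better = f > pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmax()].copy()
    return 10 ** gbest[0], 10 ** gbest[1]   # best (C, gamma) found
```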
Abstract: In an era of knowledge explosion, the amount of data grows rapidly day by day. Since data storage is a limited resource, reducing the space data occupies during processing becomes a challenging issue. Data compression provides a good solution, lowering the required space. Data mining has found many useful applications in recent years because it can help users discover interesting knowledge in large databases. However, existing compression algorithms are not appropriate for data mining. In [1, 2], two different approaches were proposed to compress databases and then perform the data mining process. However, they lack the ability to decompress the data to their original state and to improve data mining performance. In this research, a new approach called Mining Merged Transactions with the Quantification Table (M2TQT) is proposed to solve these problems. M2TQT uses the relationships between transactions to merge related transactions, and builds a quantification table to prune the candidate itemsets that cannot become frequent, in order to improve the performance of mining association rules. The experiments show that M2TQT performs better than existing approaches.
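As a rough illustration of the merging idea (a hypothetical simplification, not the actual M2TQT data structure): duplicate transactions can be merged with counts so that each merged transaction is scanned only once during support counting:

```python
from collections import Counter
from itertools import combinations

def merge_transactions(db):
    """Merge duplicate transactions; the count table loosely plays the role
    of a quantification table (simplified stand-in for M2TQT)."""
    return Counter(frozenset(t) for t in db)

def frequent_itemsets(db, min_support, k=2):
    counts = Counter()
    for t, n in merge_transactions(db).items():   # one scan per merged transaction
        for iset in combinations(sorted(t), k):
            counts[iset] += n
    return {i: c for i, c in counts.items() if c >= min_support}

db = [["a", "b", "c"], ["a", "b"], ["a", "b", "c"], ["b", "c"]]
print(frequent_itemsets(db, min_support=2))
```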
Abstract: We introduce a new interactive 3D simulation system of ocular motion and expressions suitable for: (1) character animation applications in game design, film production, HCI (Human-Computer Interface), conversational animated agents, and virtual reality; (2) medical applications (ophthalmic, neurological and muscular pathologies: research and education); and (3) real-time simulation of unconscious cognitive and emotional responses (for use, e.g., in psychological research). The system comprises: (1) a physiologically accurate parameterized 3D model of the eyes, eyelids, and eyebrow regions; and (2) a prototype device for real-time control of eye motions and expressions, including unconsciously produced expressions, for the applications in (1), (2), and (3) above. The 3D eye simulation system, created using state-of-the-art computer animation technology and 'optimized' for use with an interactive and web-deliverable platform, is, to our knowledge, the most advanced and realistic available so far for applications in character animation and medical pedagogy.
Abstract: Today, building automation is advancing from simple monitoring and control tasks for lighting and heating towards more and more complex applications that require a dynamic perception and interpretation of the different scenes occurring in a building. Current approaches cannot handle these newly emerging demands. In this article, a bionically inspired approach for multimodal, dynamic scene perception and interpretation is presented, based on neuroscientific and neuropsychological research findings about the perceptual system of the human brain. The approach rests on data from diverse sensory modalities being processed in a so-called neuro-symbolic network. With its parallel structure, and with basic elements that are information processing and storage units at the same time, it provides a very efficient method for scene perception, overcoming the problems and bottlenecks of classical dynamic scene interpretation systems.
Abstract: In this paper, we propose a novel algorithm for delineating the endocardial wall in human heart ultrasound scans. We assume that the gray levels in the ultrasound images are independent and identically distributed random variables following different Rician Inverse Gaussian (RiIG) distributions. Both synthetic and real clinical data will be used to test the algorithm. Algorithm performance will be evaluated using, first, an expert radiologist's evaluation of a soft copy of an ultrasound scan during the scanning process and, second, a doctor's conclusions after examining a printed copy of the same scan. Successful implementation of this algorithm should make it possible to differentiate normal from abnormal soft tissue and to help identify a disease, its stage, and how best to treat the patient. We hope that an automated system using this algorithm will be useful in public hospitals, especially in Third World countries where shortages of skilled radiologists and of ultrasound machines are common. These public hospitals are usually the first and last stop for most patients in these countries.
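Under the stated i.i.d. assumption, delineation reduces at the pixel level to a likelihood comparison between two fitted models; a generic sketch (the RiIG density itself is not available in SciPy, so any fitted densities with this interface stand in for it here):

```python
import numpy as np
from scipy import stats

def ml_segment(img, pdf_blood, pdf_tissue):
    """Label each pixel by the larger likelihood under two fitted models.
    In the paper both would be RiIG densities fitted to training regions."""
    return (pdf_blood(img) > pdf_tissue(img)).astype(np.uint8)

# Illustration with gamma stand-ins for the two fitted models (hypothetical):
blood = stats.gamma(a=2.0, scale=10.0).pdf
tissue = stats.gamma(a=6.0, scale=15.0).pdf
img = np.random.default_rng(0).uniform(0, 255, (64, 64))
mask = ml_segment(img, blood, tissue)
```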