Abstract: The common bean is the most important grain legume for direct human consumption in the world, and BCMV is one of the most serious bean diseases worldwide, reducing both yield and the quality of the harvested product. To determine the best BCMV tolerance index and to identify tolerant genotypes, two experiments were conducted under field conditions. Twenty-five common bean genotypes were sown in two separate RCB designs with three replications under contamination and non-contamination conditions. On the basis of the correlations among indices, GMP, MP and HARM were determined to be the most suitable tolerance indices. Principal components analysis indicated that the first two components together explained 98.52% of the variation in the data. The first and second components were named potential yield and stress susceptibility, respectively. Based on the assessment of the BCMV tolerance indices and on biplot analysis, WA8563-4, WA8563-2 and Cardinal were the genotypes that exhibited potential seed yield under both contamination and non-contamination conditions.
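For reference, the indices singled out above have simple closed forms: MP = (Yp + Ys)/2, GMP = sqrt(Yp·Ys) and HARM = 2·Yp·Ys/(Yp + Ys), where Yp and Ys are a genotype's yields under non-contamination and contamination conditions. A minimal sketch; the yield figures below are hypothetical, not values from the study:

```python
import math

def tolerance_indices(yp, ys):
    """Standard stress tolerance indices from the yield of one genotype
    under non-contamination (yp) and contamination (ys) conditions."""
    return {
        "MP": (yp + ys) / 2.0,              # mean productivity
        "GMP": math.sqrt(yp * ys),          # geometric mean productivity
        "HARM": 2.0 * yp * ys / (yp + ys),  # harmonic mean
        "TOL": yp - ys,                     # tolerance (absolute yield loss)
    }

# Hypothetical yields in kg/ha, not values from the study
idx = tolerance_indices(2500.0, 2000.0)
```

Genotypes scoring high on GMP, MP and HARM simultaneously are the ones combining high potential yield with low susceptibility, which is why these indices correlate with yield under both conditions.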
Abstract: The paper presents the optimization problem for the multi-element synthetic transmit aperture (MSTA) method in ultrasound imaging applications. The optimal transmit aperture size is chosen as a trade-off between lateral resolution, penetration depth and frame rate. Results of the analysis obtained with the developed optimization algorithm are presented; maximum penetration depth and the best lateral resolution at given depths are chosen as the optimization criteria. The optimization algorithm was tested on synthetic aperture data of point reflectors simulated with the Field II program for Matlab® for a 5 MHz, 128-element linear transducer array with 0.48 mm pitch. Visualizations of experimentally obtained synthetic aperture data of a tissue-mimicking phantom and of in vitro measurements of beef liver are also shown. These data were acquired with the SonixTOUCH Research system equipped with a 4 MHz, 128-element linear transducer (0.3 mm element pitch, 0.28 mm element width, 70% fractional bandwidth) excited by a one-cycle sine burst at the transducer's center frequency.
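The trade-off described above can be illustrated with a toy model: a larger transmit sub-aperture improves the diffraction-limited lateral resolution (roughly λz/D) and the emitted energy, while in MSTA the number of emissions per frame, and hence the frame rate, also depends on the sub-aperture size. The sketch below uses the array parameters quoted in the abstract, but the PRF value and the resolution and frame-rate formulas are simplifying assumptions, not the paper's actual algorithm:

```python
# Toy MSTA trade-off sketch; not the paper's optimization algorithm.
C = 1540.0          # assumed speed of sound in tissue, m/s
F0 = 5e6            # transducer center frequency, Hz (from the abstract)
PITCH = 0.48e-3     # element pitch, m (from the abstract)
N_ELEMENTS = 128
PRF = 5000.0        # assumed pulse repetition frequency, Hz

def frame_rate(m):
    """MSTA needs about N/m emissions per frame for sub-aperture size m."""
    emissions = -(-N_ELEMENTS // m)   # ceiling division
    return PRF / emissions

def lateral_resolution(m, depth):
    """Diffraction-limited estimate: wavelength * depth / aperture width."""
    wavelength = C / F0
    aperture = m * PITCH
    return wavelength * depth / aperture

for m in (2, 4, 8, 16):
    print(m, round(frame_rate(m), 2),
          round(lateral_resolution(m, 0.05) * 1e3, 3))  # mm at 50 mm depth
```

Under these assumptions a larger sub-aperture improves both resolution and frame rate, so the binding constraint in practice is the trade-off against beam divergence and penetration, which the paper's algorithm evaluates on simulated and measured data.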
Abstract: The number of frameworks conceived for e-learning constantly increases. Unfortunately, the creators of learning materials and the educational institutions engaged in e-learning adopt a "proprietary" approach, in which the developed products (courses, activities, exercises, etc.) can be exploited only in the framework where they were conceived; using them in other learning environments requires a costly adaptation in terms of time and effort. Each framework proposes courses whose organization, contents, modes of interaction and presentation are the same for all learners; unfortunately, learners are heterogeneous and are not interested in the same information, but only in services or documents adapted to their needs. The current trend for e-learning frameworks is the interoperability of learning materials. Several standards exist (DCMI (Dublin Core Metadata Initiative) [2], LOM (Learning Objects Metadata) [1], SCORM (Shareable Content Object Reference Model) [6][7][8], ARIADNE (Alliance of Remote Instructional Authoring and Distribution Networks for Europe) [9], CANCORE (Canadian Core Learning Resource Metadata Application Profiles) [3]); they all converge on the idea of learning objects. They are also concerned with the adaptation of learning materials to learners' profiles. This article proposes an approach for the composition of courses adapted to the various profiles (knowledge, preferences, objectives) of learners, based on two ontologies (the domain to teach and an educational ontology) and on learning objects.
Abstract: Pakistani doctors (MBBS) are emigrating to developed countries for professional advancement. This study aims to highlight the causes and consequences of doctors' brain drain from Pakistan. Primary data were collected at Mayo Hospital, Lahore by interviewing doctors (n=100) selected through a systematic random sampling technique. The study found that various socio-economic and political conditions act as push and pull factors for the brain drain of doctors in Pakistan. A majority of doctors (83%) identified poor remuneration and the poor professional infrastructure of the health department as push factors; 81% claimed that continuous political instability and threats of terrorism are responsible for the emigration of doctors, and 84% of respondents considered limited opportunities for further study responsible for their emigration. The brain drain of doctors is badly affecting the health sector's policies and programs, standard doctor-patient ratios and the quality of health services.
Abstract: Surface flattening plays a vital role in the field of computer-aided design and manufacture. Surface flattening enables the production of 2D patterns and can be used in design and manufacturing for developing a 3D surface onto a 2D platform, especially in fashion design. This study describes surface flattening based on minimum-energy methods according to the properties of different fabrics. Firstly, using the geometric features of a 3D surface, the less-deformed area can be flattened onto a 2D platform by geodesics. Then, the strain energy that has accumulated in the mesh can be stably released by an approximate implicit method and a revised error function. In some cases, cutting the mesh to further release the energy is a common way to handle the situation and enhance the accuracy of the flattening, which makes the obtained 2D pattern naturally develop significant cracks. When this methodology is applied to a 3D mannequin constructed with feature lines, it enhances the level of computer-aided fashion design. In addition, when different fabrics are used in fashion design, it is necessary to revise the shape of a 2D pattern according to the properties of the fabric. With this model, the outline of 2D patterns can be revised by distributing the strain energy, with different results for different fabric properties. Finally, this research uses some common design cases to illustrate and verify the feasibility of the methodology.
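The energy-release step can be pictured as relaxing a spring network whose rest lengths are the edge lengths measured on the 3D surface. The following is a minimal gradient-descent sketch, not the paper's approximate implicit method, and the mesh data are purely hypothetical:

```python
import math

def release_strain_energy(pos, edges, rest, k=1.0, steps=200, lr=0.05):
    """Reduce sum of 0.5*k*(|edge| - rest_length)^2 over a 2D mesh by
    plain gradient descent. pos: list of [x, y] vertex positions;
    edges: list of (i, j) index pairs; rest: dict mapping (i, j) to the
    rest length measured on the 3D surface."""
    for _ in range(steps):
        grad = [[0.0, 0.0] for _ in pos]
        for (i, j) in edges:
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            d = math.hypot(dx, dy) or 1e-12
            f = k * (d - rest[(i, j)]) / d   # scaled spring term
            grad[i][0] -= f * dx; grad[i][1] -= f * dy
            grad[j][0] += f * dx; grad[j][1] += f * dy
        for p, g in zip(pos, grad):
            p[0] -= lr * g[0]; p[1] -= lr * g[1]
    return pos

# Hypothetical over-stretched edge: it relaxes toward its rest length 1.0
flat = release_strain_energy([[0.0, 0.0], [2.0, 0.0]], [(0, 1)], {(0, 1): 1.0})
```

Making the spring stiffness k depend on fabric properties is the mechanism by which different fabrics produce differently shaped 2D patterns from the same 3D surface.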
Abstract: A new chelating resin was prepared by coupling Amberlite XAD-4 with 1-amino-2-naphthol through an azo spacer. The resulting sorbent was characterized by FT-IR, elemental analysis and thermogravimetric analysis (TGA), and studied for the preconcentration of Fe(II) using flame atomic absorption spectrometry (FAAS) for metal monitoring. The optimum pH for sorption of the iron ions was 6.5. The resin was evaluated through batch binding of the metal ion. Quantitative desorption occurs instantaneously with 0.5 M HNO3. The sorption capacity was found to be 4.1 mmol g-1 of resin for Fe(II) in aqueous solution. The chelating resin can be reused for 10 sorption-desorption cycles without any significant change in sorption capacity. A recovery of 97% was obtained for the metal ions with 0.5 M HNO3 as the eluting agent. The method was applied to the determination of metal ions in an industrial wastewater sample.
Abstract: Many computational techniques have been applied to the solution of the heat conduction problem, among them the finite difference (FD), finite element (FE) and, more recently, meshless methods. FE is commonly used to solve the heat conduction equation through the assembly (summation) of element stiffness matrices and the solution of the final global system of equations. Because of this summation process, the convergence rate is decreased. Hence, in the present paper a Cellular Automata (CA) approach is presented for the solution of the heat conduction problem. Each cell is treated as a fixed point in a regular grid, so that the solution of one large system of equations is replaced by discrete systems of equations of small dimension. Results show that CA can be used for the solution of the heat conduction problem.
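As a concrete illustration of the local cell update such a CA performs, the sketch below advances a 2D temperature grid step by step, each interior cell exchanging heat with its four neighbours; the grid size, boundary values and diffusion number are hypothetical, not taken from the paper:

```python
def ca_heat_step(grid, alpha=0.25):
    """One cellular-automaton update for 2D heat conduction: every
    interior cell moves toward the mean of its four von Neumann
    neighbours; alpha <= 0.25 keeps the explicit update stable."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]          # boundary cells stay fixed
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            lap = (grid[i - 1][j] + grid[i + 1][j] +
                   grid[i][j - 1] + grid[i][j + 1] - 4.0 * grid[i][j])
            new[i][j] = grid[i][j] + alpha * lap
    return new

# Hypothetical 5x6 plate with a hot left edge (100) and cold edges elsewhere
grid = [[100.0 if j == 0 else 0.0 for j in range(6)] for _ in range(5)]
for _ in range(500):
    grid = ca_heat_step(grid)
```

Because each cell only reads its immediate neighbours, no global stiffness matrix is assembled, which is the point of the CA formulation.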
Abstract: Nanostructured iron oxide with rod-like and granular morphologies has been successfully prepared via a solid-state reaction in the presence of NaCl, NaBr, NaI and NaN3, respectively. The added salts not only prevent a drastic increase in the size of the products but also provide suitable conditions for the oriented growth of the primary nanoparticles. Formation mechanisms for these materials by solid-state reaction at ambient temperature are proposed. Photocatalytic experiments with Congo red (CR) demonstrated that the mixture of α-Fe2O3 and Fe3O4 nanostructures was more efficient than the α-Fe2O3 nanostructures alone.
Abstract: As nanotechnology advances, its use for medical purposes in the field of nanomedicine appears increasingly promising; nanorobots for medical diagnostics and treatment could arrive in the near future. This study proposes a swarm-intelligence-based control mechanism for swarms of nanorobots that operate as artificial platelets searching for wounds. The canonical particle swarm optimization algorithm is employed. A simulation of the circulatory system is constructed and used to demonstrate the movement of nanorobots with their essential characteristics and to examine the performance of the proposed control mechanism. The effects of three nanorobot capabilities, namely perception range, maximum velocity and response time, are investigated. The results show that canonical particle swarm optimization can be used to control early-version nanorobots with simple behaviors and actions.
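The canonical particle swarm update combines inertia with attraction toward each particle's personal best and the swarm's global best position. A self-contained sketch follows; the wound location, search bounds and fitness function are hypothetical stand-ins for the paper's circulatory-system simulation:

```python
import random

def pso(fitness, dim, n=20, iters=200, w=0.729, c1=1.49445, c2=1.49445,
        lo=-5.0, hi=5.0, vmax=1.0, seed=1):
    """Canonical (constricted) PSO minimizing `fitness`; the nanorobot
    perception-range and response-time models are not reproduced here."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-vmax, min(vmax, vel[i][d]))  # clamp speed
                pos[i][d] += vel[i][d]
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Hypothetical wound position; fitness is the squared distance to it
wound = [2.0, -1.0]
best, best_f = pso(lambda p: sum((a - b) ** 2 for a, b in zip(p, wound)), dim=2)
```

The velocity clamp `vmax` plays the role of the nanorobots' maximum velocity, which is one of the three capabilities the study varies.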
Abstract: A text-based game is expected to be a low-resource application that delivers good performance compared to graphics-intensive games. Nowadays, however, some online text-based games do not offer performance that is acceptable to users. Therefore, an online text-based game called Star_Quest has been developed in order to analyze its behavior under different performance measurements. Performance metrics such as throughput, scalability, response time and page loading time are captured to characterize the performance of the game. The load-testing techniques used are also disclosed to exhibit the viability of the work. A comparative assessment between the results obtained and accepted performance levels is conducted to determine the performance level of the game. The study reveals that the developed game managed to meet all the performance objectives set forth.
Abstract: Present models and simulation algorithms of intracellular stochastic kinetics are usually based on the premise that diffusion is so fast that the concentrations of all the involved species are homogeneous in space. However, recent experimental measurements of intracellular diffusion constants indicate that the assumption of a homogeneous, well-stirred cytosol is not necessarily valid even for small prokaryotic cells. In this work a mathematical treatment of diffusion that can be incorporated into a stochastic algorithm simulating the dynamics of a reaction-diffusion system is presented. The movement of a molecule A from a region i to a region j of space is represented as a first-order reaction A_i --k--> A_j, where the rate constant k depends on the diffusion coefficient. The diffusion coefficients are modeled as functions of the local concentration of the solutes, their intrinsic viscosities, their frictional coefficients and the temperature of the system. The stochastic time evolution of the system is given by the occurrence of diffusion events and chemical reaction events. At each time step an event (reaction or diffusion) is selected from a probability distribution of waiting times determined by the intrinsic reaction kinetics and the diffusion dynamics. To demonstrate the method, simulation results for the reaction-diffusion system of chaperone-assisted protein folding in the cytoplasm are shown.
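The event-selection scheme described above is the Gillespie direct method extended with diffusion channels. A minimal sketch, with chemical reactions omitted and a constant rate constant instead of the concentration-dependent diffusion coefficients of the paper:

```python
import math
import random

def ssa_diffusion(counts, k_diff, t_end, seed=0):
    """Gillespie-style stochastic simulation in which diffusion between
    adjacent compartments is a first-order reaction A_i -> A_j with rate
    constant k_diff (chemical reaction channels omitted for brevity)."""
    rng = random.Random(seed)
    n = len(counts)
    t = 0.0
    while t < t_end:
        # Propensity of each jump: k_diff * copy number in the compartment
        props = [(k_diff * counts[i], i, j)
                 for i in range(n) if counts[i] > 0
                 for j in (i - 1, i + 1) if 0 <= j < n]
        total = sum(a for a, _, _ in props)
        if total == 0.0:
            break
        t += -math.log(rng.random()) / total   # exponential waiting time
        r = rng.random() * total               # pick one event by propensity
        for a, i, j in props:
            r -= a
            if r <= 0.0:
                counts[i] -= 1
                counts[j] += 1
                break
    return counts

# 100 molecules start in the leftmost of five compartments and spread out
final = ssa_diffusion([100, 0, 0, 0, 0], k_diff=1.0, t_end=5.0)
```

In the full method, reaction channels are simply added to the same propensity list, so one waiting-time distribution governs both reaction and diffusion events.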
Abstract: Many footbridges have natural frequencies that
coincide with the dominant frequencies of the pedestrian-induced
load and therefore they have a potential to suffer excessive vibrations
under dynamic loads induced by pedestrians. Some of the design
standards introduce load models for pedestrian loads applicable for
simple structures. Load modeling for more complex structures, on the
other hand, is most often left to the designer. The main focus of this
paper is on the human induced forces transmitted to a footbridge and
on the ways these loads can be modeled to be used in the dynamic
design of footbridges. Design criteria and load models proposed by widely used standards are also introduced and compared. A dynamic analysis of the suspension bridge in Kolin in the Czech Republic was performed on a detailed FEM model using the
ANSYS program system. An attempt to model the load imposed by a
single person and a crowd of pedestrians resulted in displacements
and accelerations that are compared with serviceability criteria.
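One widely used way to model the single-pedestrian load discussed above is a static weight plus a few harmonics of the pacing frequency. The sketch below uses typical literature coefficients (a 700 N pedestrian, 2 Hz pacing, first dynamic load factor around 0.4); these are illustrative values, not the ones used in the paper's ANSYS analysis:

```python
import math

def pedestrian_force(t, weight=700.0, f_pace=2.0,
                     alphas=(0.4, 0.1, 0.1),
                     phases=(0.0, math.pi / 2, math.pi / 2)):
    """Harmonic model of the vertical force from one walking pedestrian:
    F(t) = G * (1 + sum_i alpha_i * sin(2*pi*i*f*t - phi_i)), with
    typical literature coefficients (all values here are assumptions)."""
    s = 1.0
    for i, (a, phi) in enumerate(zip(alphas, phases), start=1):
        s += a * math.sin(2.0 * math.pi * i * f_pace * t - phi)
    return weight * s
```

Resonance occurs when a footbridge natural frequency coincides with i·f_pace for some harmonic i, which is exactly the situation the abstract warns about.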
Abstract: A business process model describes the process flow of a business and can be seen as a requirement for developing a software application. This paper discusses the BPM2CD guideline
which complements the Model Driven Architecture concept by
suggesting how to create a platform-independent software model in
the form of a UML class diagram from a business process model. An
important step is the identification of UML classes from the business
process model. A technique for object-oriented analysis called
domain analysis is borrowed and key concepts in the business
process model will be discovered and proposed as candidate classes
for the class diagram. The paper enhances this step by using ontology
search to help identify important classes for the business domain. As
ontology is a source of knowledge for a particular domain which
itself can link to ontologies of related domains, the search can give a
refined set of candidate classes for the resulting class diagram.
Abstract: The deep and radical social reforms of the 1990s in many Eastern European countries caused changes in the Information Technology (IT) field. Inefficient information technologies were rapidly replaced with state-of-the-art IT solutions; for example, Eastern European countries now have a high penetration of high-quality, high-speed Internet. The authors took part in the introduction of these changes at Latvia's leading IT research institute. Based on this experience, the authors offer in this paper an IT-services-based model for analyzing these changes and development processes in the higher education and research fields, i.e., for the development of research e-infrastructures. Compared with international practice, such services were developed in Eastern Europe in an untraditional way, which enabled swift and positive technological change.
Abstract: The chemically defined Schlegel's medium was modified to improve cell growth and the production of other metabolites by the fluorescent pseudomonad strain R62. The modified medium does not require pH control, as pH changes remain within ± 0.2 units of the initial pH 7.1 during fermentation. Siderophore production by the fluorescent pseudomonad strain was optimized in the modified medium containing 1% glycerol as the major carbon source, supplemented with 0.05% succinic acid and 0.5% L-tryptophan. Indole-3-acetic acid (IAA) production was higher when L-tryptophan was used at 0.5%. Production of 2,4-diacetylphloroglucinol (DAPG) was higher when the medium was amended with three trace elements. The optimized medium produced 2.28 g/l of dry cell mass and 900 mg/l of siderophore at the end of 36 h of cultivation, while the production levels of IAA and DAPG were 65 mg/l and 81 mg/l, respectively, at the end of 48 h of cultivation.
Abstract: This article outlines the conceptualization and implementation of an intelligent system capable of extracting knowledge from databases. The use of hybridized features of both Rough and Fuzzy Set theory gives the developed system flexibility in dealing with discrete as well as continuous datasets. A raw data set provided to the system is initially transformed into a computer-legible format, followed by pruning of the data set. The refined data set is then processed through various Rough Set operators, which enable the discovery of parameter relationships and interdependencies. The discovered knowledge is automatically transformed into a rule base expressed in Fuzzy terms. Two exemplary cancer repository datasets (for breast and lung cancer) have been used to test the implementation of the proposed framework.
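The Rough Set operators mentioned above rest on lower and upper approximations of a concept under an indiscernibility relation. A minimal sketch with a hypothetical toy partition (not the cancer datasets used in the paper):

```python
def approximations(equiv_classes, target):
    """Rough-set lower and upper approximations of a target concept,
    given the partition of the universe into indiscernibility classes."""
    target = set(target)
    lower, upper = set(), set()
    for block in equiv_classes:
        b = set(block)
        if b <= target:
            lower |= b       # whole class inside: certainly in the concept
        if b & target:
            upper |= b       # class overlaps: possibly in the concept
    return lower, upper

# Hypothetical toy data: records grouped by identical attribute values
blocks = [{1, 2}, {3}, {4, 5}]
positive = {1, 2, 4}         # decision class of interest
lower, upper = approximations(blocks, positive)
```

The boundary region (upper minus lower) contains the records that cannot be classified with certainty; these are natural candidates for the graded, Fuzzy-termed rules the system emits.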
Abstract: The purposes of this study are (i) to investigate the driving factors and barriers to the adoption of Information and Communication Technology (ICT) in Halal logistics and (ii) to develop an ICT adoption framework for Halal logistics service providers (LSPs). The Halal LSPs selected for the study currently use ICT service platforms, such as accounting and management systems, for their Halal logistics business. The study categorizes the factors influencing the adoption decision and process into four groups: technology-related factors, organizational and environmental factors, Halal-assurance-related factors, and government-related factors. The major contribution of this study is the finding that technology-related factors (ICT compatibility with Halal requirements) and Halal-assurance-related factors are the most crucial for Halal LSPs applying ICT for Halal control in transportation operations. Among the government-related factors, the ICT requirements for monitoring Halal included in the Halal Logistics Standard on Transportation (MS2400:2010) are the most influential in the adoption of ICT with the support of the government. In addition, government-related factors are very important in reducing the main barriers and in creating an atmosphere of ICT adoption in the Halal LSP sector.
Abstract: Camera calibration is an indispensable step for augmented reality and image-guided applications in which quantitative information must be derived from the images. Usually, a camera calibration is obtained by taking images of a special calibration object and extracting the image coordinates of the projected calibration marks, enabling the calculation of the projection from 3D world coordinates to 2D image coordinates. Such a procedure thus involves typical steps, including feature point localization in the acquired images, camera model fitting, correction of the distortion introduced by the optics, and finally an optimization of the model's parameters. In this paper we propose to extend this list by a further step concerning the identification of the optimal subset of images yielding the smallest overall calibration error. For this, we present a Monte Carlo based algorithm, along with a deterministic extension, that automatically determines the images yielding an optimal calibration. Finally, we present results showing that the calibration can be significantly improved by automated image selection.
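The Monte Carlo idea can be sketched in a few lines: repeatedly draw a random subset of the images, score it, and keep the best subset seen. In the sketch below the score is simply the mean of hypothetical per-image reprojection errors; the paper's algorithm would instead re-run the calibration for each candidate subset, and its deterministic extension is not reproduced here:

```python
import random

def select_images(per_image_error, n_pick, n_trials=2000, seed=0):
    """Monte Carlo sketch of calibration-image selection: sample random
    subsets and keep the one with the smallest surrogate error."""
    rng = random.Random(seed)
    n = len(per_image_error)
    best_subset, best_err = None, float("inf")
    for _ in range(n_trials):
        subset = rng.sample(range(n), n_pick)
        err = sum(per_image_error[i] for i in subset) / n_pick
        if err < best_err:
            best_subset, best_err = sorted(subset), err
    return best_subset, best_err

# Hypothetical per-image reprojection errors (pixels) for eight images
errors = [0.8, 0.3, 1.5, 0.4, 0.2, 0.9, 0.35, 2.0]
subset, err = select_images(errors, n_pick=3)
```

With a real calibration in the loop, each trial is expensive, which is why a deterministic refinement of the sampled candidates is attractive.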
Abstract: Evolutionary Algorithms are population-based,
stochastic search techniques, widely used as efficient global
optimizers. However, many real life optimization problems often
require finding optimal solution to complex high dimensional,
multimodal problems involving computationally very expensive
fitness function evaluations. Use of evolutionary algorithms in such
problem domains is thus practically prohibitive. An attractive
alternative is to build meta models or use an approximation of the
actual fitness functions to be evaluated. These meta-models are orders of magnitude cheaper to evaluate than the actual function. Many regression and interpolation tools are available to
build such meta models. This paper briefly discusses the
architectures and use of such meta-modeling tools in an evolutionary
optimization context. We further present two evolutionary algorithm
frameworks which involve use of meta models for fitness function
evaluation. The first framework, namely the Dynamic Approximate
Fitness based Hybrid EA (DAFHEA) model [14] reduces
computation time by controlled use of meta-models (in this case
approximate model generated by Support Vector Machine
regression) to partially replace the actual function evaluation by
approximate function evaluation. However, the underlying
assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model. This does not take
into account uncertain scenarios involving noisy fitness functions.
The second model, DAFHEA-II, an enhanced version of the original
DAFHEA framework, incorporates a multiple-model based learning
approach for the support vector machine approximator to handle
noisy functions [15]. Empirical results obtained by evaluating the
frameworks using several benchmark functions demonstrate their
efficiency.
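The core loop shared by such frameworks is: train a cheap model on evaluated points, pre-screen offspring with it, and spend true evaluations only on the promising ones. The sketch below follows that pattern with an inverse-distance-weighted interpolant standing in for DAFHEA's SVM regression, and a cheap test function standing in for the expensive one; everything else (population sizes, mutation scale) is an assumption:

```python
import random

def idw_surrogate(xs, ys):
    """Inverse-distance-weighted interpolant standing in for the SVM
    regression meta-model of DAFHEA (kept dependency-free on purpose)."""
    def predict(x):
        num = den = 0.0
        for xi, yi in zip(xs, ys):
            w = 1.0 / (sum((a - b) ** 2 for a, b in zip(x, xi)) + 1e-9)
            num += w * yi
            den += w
        return num / den
    return predict

def surrogate_ea(expensive_f, dim=2, pop=20, gens=30, lo=-5.0, hi=5.0, seed=3):
    """Evolutionary loop where children are pre-screened by the cheap
    surrogate and only the most promising ones get a true evaluation."""
    rng = random.Random(seed)
    parents = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    archive_x = [p[:] for p in parents]
    archive_y = [expensive_f(p) for p in archive_x]
    for _ in range(gens):
        model = idw_surrogate(archive_x, archive_y)
        children = [[min(hi, max(lo, x + rng.gauss(0.0, 0.3)))
                     for x in rng.choice(parents)] for _ in range(3 * pop)]
        children.sort(key=model)              # cheap surrogate screening
        promising = children[:pop]
        true_y = [expensive_f(c) for c in promising]  # expensive evaluations
        archive_x += promising
        archive_y += true_y
        ranked = sorted(zip(promising, true_y), key=lambda t: t[1])
        parents = [x for x, _ in ranked[:pop // 2]]   # truncation selection
    i = min(range(len(archive_y)), key=archive_y.__getitem__)
    return archive_x[i], archive_y[i]

sphere = lambda p: sum(x * x for x in p)      # cheap stand-in "expensive" f
best, best_f = surrogate_ea(sphere)
```

DAFHEA-II's multiple-model learning for noisy functions would replace the single interpolant with an ensemble trained on resampled data; that refinement is not attempted here.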
Abstract: This conference paper discusses a risk allocation problem for subprime investing banks involving investment in subprime structured mortgage products (SMPs) and Treasuries. In order to solve this problem, we develop a Lévy-process-based model of jump-diffusion type for investment choice between subprime SMPs and Treasuries. This model incorporates subprime SMP losses, against which credit default insurance in the form of credit default swaps (CDSs) can be purchased. In essence, we solve a mean swap-at-risk (SaR) optimization problem for investment which determines the optimal allocation between SMPs and Treasuries subject to credit risk protection via CDSs. In this regard, SaR indicates how much protection investors must purchase from swap protection sellers in order to cover possible losses from SMP default. Here, SaR is defined in terms of value-at-risk (VaR). Finally, we provide an analysis of the aforementioned optimization problem and its connections with the subprime mortgage crisis (SMC).
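Since SaR is defined in terms of VaR, a plain Monte Carlo VaR computation illustrates the underlying quantity. The one-period jump-diffusion return model and every parameter value below are hypothetical placeholders, not the paper's calibrated Lévy model:

```python
import random

def monte_carlo_var(alpha=0.99, n=100_000, mu=0.02, sigma=0.15,
                    jump_prob=0.05, jump_mean=-0.25, jump_sd=0.1, seed=7):
    """Monte Carlo value-at-risk for a toy jump-diffusion return: a
    normal diffusion part plus an occasional negative jump standing in
    for SMP default losses. All parameter values are assumptions."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n):
        r = rng.gauss(mu, sigma)          # diffusion part of the return
        if rng.random() < jump_prob:      # rare default-like jump
            r += rng.gauss(jump_mean, jump_sd)
        losses.append(-r)                 # loss = negative return
    losses.sort()
    return losses[int(alpha * n)]         # empirical alpha-quantile of loss

var99 = monte_carlo_var()
```

SaR then asks how much CDS protection must be bought so that the residual loss distribution's VaR stays within the investor's limit; that optimization layer is specific to the paper and is not sketched here.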