Abstract: Chikungunya virus (CHIKV) is an arbovirus belonging to the family Togaviridae and is transmitted to humans through the bite of mosquitoes (Aedes aegypti and Aedes albopictus). A large outbreak of chikungunya was reported in India between 2006 and 2007, along with several other countries in South-East Asia and, for the first time, in Europe. It was also the first time that a CHIKV outbreak was reported with mortality from Reunion Island, together with increased mortality from Asian countries. CHIKV affects all age groups, and currently there are no specific drugs or vaccines to cure the disease. The need for antiviral agents for the treatment of CHIKV infection and the success of virtual screening against many therapeutically valuable targets led us to carry out structure-based drug design against the Chikungunya nsP2 protease (PDB: 3TRK). High-throughput virtual screening of the publicly available databases ZINC12 and BindingDB was carried out using the OpenEye tools and Schrodinger LLC software packages. The OpenEye Filter program was used to filter the databases, and the filtered outputs were docked using the HTVS protocol implemented in the GLIDE package of Schrodinger LLC. The top hits were further used to enrich similar molecules from the databases through vROCS, a shape-based screening protocol implemented in OpenEye. The adopted approach provided different scaffolds as hits against the CHIKV protease. Three scaffolds, indole, pyrazole, and sulphone derivatives, were selected based on docking score and synthetic feasibility. Derivatives of pyrazole were synthesized and submitted for antiviral screening against CHIKV.
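The hit-selection step of such a pipeline can be sketched as below. The scaffold names and docking scores are purely hypothetical, and the ranking convention (more negative Glide-style scores are better) is an assumption, not a reproduction of the study's data.

```python
# Minimal sketch of hit selection in a virtual screening pipeline:
# rank filtered candidates by docking score (more negative = better,
# assuming Glide-style scoring) and keep the top-scoring scaffolds.
# All molecule names and scores are hypothetical illustrations.

def select_hits(candidates, top_n=3):
    """Rank candidates by docking score (ascending) and return the
    names of the top_n scaffolds."""
    ranked = sorted(candidates, key=lambda c: c["score"])
    return [c["name"] for c in ranked[:top_n]]

candidates = [
    {"name": "indole-1",   "score": -8.2},
    {"name": "pyrazole-4", "score": -7.9},
    {"name": "sulphone-2", "score": -7.5},
    {"name": "decoy-9",    "score": -4.1},
]

print(select_hits(candidates))  # the three best-scoring scaffolds
```

In a real workflow the scores would come from the GLIDE HTVS run, and the selected hits would seed the vROCS shape-based enrichment.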
Abstract: Today's healthcare industry has become more patient-centric than profession-centric, and the quality of healthcare and patient safety are major concerns in modern healthcare facilities. An unplanned extubation (UE) may be detrimental to the patient's life, and the UE rate is thus one of the major indexes of patient safety and healthcare quality. A high UE rate undermines not only healthcare quality and patient safety policy but also the nurses' morale and job satisfaction. The UE problem in a psychiatric hospital is unique and may be a tough challenge for healthcare professionals, because the patients are mostly lacking communication capabilities. This essay reports a project organized to reduce the UE rate from the then-current 2.3% to a lower and satisfactory level in the long-term care units of a psychiatric hospital. The project was conducted between March 1st, 2011 and August 31st, 2011. Based on the error information gathered from various units of the hospital, the team analyzed the root causes and proposed possible solutions at its meetings. Four solutions were then agreed upon by consensus and launched in the units in question. The UE rate has since been reduced to 0.17%. The experience from this project, as well as the procedure and tools adopted, would be a good reference for other hospitals.
Abstract: In recent years, researchers have developed various tools and methodologies for effective clinical decision-making. Among those decisions, chest pain diseases have been one of the most important diagnostic issues, especially in an emergency department. To improve the diagnostic ability of physicians, many researchers have developed diagnostic intelligence by using machine learning and data mining. However, most of the conventional methodologies have generally been based on a single classifier for disease classification and prediction, which shows only moderate performance. This study utilizes an ensemble strategy that combines multiple different classifiers to help physicians diagnose chest pain diseases more accurately than ever. Specifically, the ensemble strategy is applied by integrating decision trees, neural networks, and support vector machines. The ensemble models are applied to real-world emergency data. This study shows that the performance of the ensemble models is superior to that of each single classifier.
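The core ensemble idea can be sketched with a majority vote over several base classifiers. The three rule-based "classifiers" below are toy stand-ins for the decision tree, neural network, and SVM used in the study; the features and thresholds are illustrative assumptions, not values from the paper.

```python
# Sketch of an ensemble by majority vote: each base classifier emits
# a 0/1 prediction, and the ensemble returns the majority decision.
# The rule-based classifiers and thresholds are hypothetical stand-ins.

def classifier_a(x):  # stand-in for a decision tree
    return 1 if x["age"] > 50 else 0

def classifier_b(x):  # stand-in for a neural network
    return 1 if x["troponin"] > 0.04 else 0

def classifier_c(x):  # stand-in for a support vector machine
    return 1 if x["age"] > 60 or x["troponin"] > 0.1 else 0

def ensemble_predict(x):
    """Majority vote of the three base classifiers."""
    votes = classifier_a(x) + classifier_b(x) + classifier_c(x)
    return 1 if votes >= 2 else 0

patient = {"age": 62, "troponin": 0.02}
print(ensemble_predict(patient))  # 1: two of three classifiers agree
```

With real data, the base learners would be trained models rather than fixed rules, but the combination step is the same.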
Abstract: This paper presents a non-invasive 3D eye tracker for optometry clinical applications. Measurements of biomechanical variables in clinical practice have many sources of error associated with traditional procedures such as the cover test (CT), near point of convergence (NPC), eye ductions (ED), eye vergences (EG), and eye versions (ES). Ocular motility should always be tested, but all of these evaluations rely on subjective interpretation by practitioners: the results are based on clinical experience, and repeatability and accuracy are lacking. Optometric-lab is a tool with three analog video cameras triggered and synchronized by one A/D acquisition board. The variables globe rotation angle and velocity can be quantified. Data were recorded at a frequency of 27 Hz, and camera calibration was performed in a known volume, with adjustments for radial image distortion.
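One way the globe rotation velocity could be derived from the frame-by-frame rotation angles at the stated 27 Hz record frequency is a simple finite difference, as sketched below; the angle samples are hypothetical.

```python
# Sketch of estimating globe rotation velocity from angles sampled
# once per video frame at the 27 Hz acquisition rate stated above.
# The angle values are hypothetical illustrations.

FS = 27.0  # sampling frequency, Hz

def angular_velocity(angles_deg, fs=FS):
    """Finite-difference estimate of rotation velocity (deg/s)
    between consecutive frames."""
    return [(b - a) * fs for a, b in zip(angles_deg, angles_deg[1:])]

angles = [0.0, 1.0, 2.5, 4.5]  # degrees, one value per frame
print(angular_velocity(angles))  # [27.0, 40.5, 54.0]
```

A real implementation would first recover the angles from the calibrated three-camera geometry; only the differentiation step is shown here.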
Abstract: An application framework provides a reusable design
and implementation for a family of software systems. Application
developers extend the framework to build their particular
applications using hooks. Hooks are the places identified to show
how to use and customize the framework. Hooks define Framework
Interface Classes (FICs) and their possible specifications, which
helps in building reusable test cases for the implementations of these
classes. In applications developed using gray-box frameworks, FICs
inherit framework classes or use them without inheritance. In this
paper, a test-case generation technique is extended to build test cases
for FICs built for gray-box frameworks. A tool is developed to
automate the introduced technique.
Abstract: This work deals with the problem of tool axis inclination angles in ball-end milling. The tool axis inclination angle contributes to improved functional surface properties (surface integrity: surface roughness, residual stress, micro-hardness, etc.), decreased cutting forces, and improved productivity. When milling with a ball-end milling tool in the standard way, with the workpiece and cutting tool forming a right angle, the cutting speed at the tool tip is zero. At this point the cutting tool only pushes material into the workpiece. Here the following undesirable effects can be observed: chip contraction, increased cutting temperature, increased vibrations, and the creation of a built-up edge. These effects have negative results: low surface quality and decreased tool life (in the worst case, even edge chipping). These effects can be eliminated by tilting the cutting tool or tilting the workpiece.
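The zero-cutting-speed effect described above can be illustrated numerically. The sketch below uses the simplified relation that tilting the tool axis by an angle beta moves the contact point off the tip, giving an effective cutting diameter of roughly D * sin(beta); this neglects the contribution of the depth of cut and is an illustrative approximation, not the paper's model.

```python
# Illustration of why tool-axis inclination removes the zero-speed
# point in ball-end milling: with inclination beta the effective
# cutting diameter is approximately D * sin(beta) (depth of cut
# neglected), so the cutting speed at the contact point is non-zero.

import math

def effective_cutting_speed(D_mm, rpm, beta_deg):
    """Approximate cutting speed (m/min) at the contact point of a
    ball-end mill of diameter D_mm tilted by beta_deg."""
    d_eff = D_mm * math.sin(math.radians(beta_deg))
    return math.pi * d_eff * rpm / 1000.0

print(effective_cutting_speed(10, 8000, 0))   # 0.0 -> tip only pushes material
print(effective_cutting_speed(10, 8000, 15))  # non-zero speed once tilted
```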
Abstract: One of the most important problems in the production planning of a flexible manufacturing system (FMS) is the machine tool selection and operation allocation problem, which directly influences production costs and times. This paper considers minimizing machining cost, set-up cost, and material handling cost as a multi-objective problem in a flexible manufacturing system environment. We present a 0-1 integer linear programming model for the multi-objective machine tool selection and operation allocation problem. Due to the large-scale nature of the problem, solving it to optimality in a reasonable time is infeasible, so a Pareto ant colony optimization (P-ACO) approach is developed to solve the multi-objective problem in reasonable time. Experimental results indicate the effectiveness of the proposed algorithm.
Abstract: This paper presents software tools that convert the C/C++ floating point source code for a DSP algorithm into a fixed-point simulation model that can be used to evaluate the numerical performance of the algorithm on several different fixed point platforms including microprocessors, DSPs, and FPGAs. The tools use a novel system for maintaining binary point information so that the conversion from floating point to fixed point is automated and the resulting fixed point algorithm achieves the maximum possible precision. A configurable architecture is used during the simulation phase so that the algorithm can produce a bit-exact output for several different target devices.
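The core quantization step of such a float-to-fixed conversion can be sketched as below: a value is mapped to a Qm.n fixed point representation by scaling with 2^n and rounding to an integer. The word lengths and sample values are illustrative, not taken from the paper's tools.

```python
# Sketch of float-to-fixed quantization: represent x as an integer
# holding x * 2**frac_bits (Q-format with frac_bits fractional bits).
# Word length and sample values are illustrative assumptions.

def to_fixed(x, frac_bits):
    """Quantize x to the integer nearest to x * 2**frac_bits."""
    return int(round(x * (1 << frac_bits)))

def to_float(q, frac_bits):
    """Recover the (quantized) floating point value."""
    return q / (1 << frac_bits)

q = to_fixed(0.712, 8)      # 8 fractional bits
print(q, to_float(q, 8))    # 182 0.7109375
```

The difference between the original value and `to_float(q, frac_bits)` is the quantization error that such a simulation model lets a designer evaluate per target word length.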
Abstract: This paper presents a new strategy for the identification and classification of pathological voices using a hybrid method based on the wavelet transform and neural networks. After speech acquisition from a patient, the speech signal is analysed in order to extract acoustic parameters such as the pitch, the formants, jitter, and shimmer. The obtained results are compared to normal and standard values by means of a programmable database. Sounds are collected from normal people and patients, and then classified into two different categories. The speech database consists of several pathological and normal voices collected from the national hospital "Rabta-Tunis". The speech processing algorithm is conducted in a supervised mode for the discrimination of normal and pathological voices, and then for the classification between neural and vocal pathologies (Parkinson's, Alzheimer's, laryngeal disorders, dyslexia, etc.). Several simulation results are presented as a function of the disease and are compared with the clinical diagnosis in order to obtain an objective evaluation of the developed tool.
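One of the listed acoustic parameters, local jitter, can be sketched as the mean absolute difference between consecutive pitch periods relative to the mean period. The period values below are hypothetical, not patient data.

```python
# Sketch of local jitter, one of the acoustic parameters mentioned
# above: mean absolute difference of consecutive pitch periods,
# normalized by the mean period. Period values are hypothetical.

def local_jitter(periods):
    """Relative jitter of a sequence of pitch periods (seconds)."""
    diffs = [abs(b - a) for a, b in zip(periods, periods[1:])]
    mean_diff = sum(diffs) / len(diffs)
    mean_period = sum(periods) / len(periods)
    return mean_diff / mean_period

periods = [0.0100, 0.0102, 0.0099, 0.0101]  # seconds
print(round(local_jitter(periods), 4))
```

Shimmer is computed analogously from cycle-to-cycle amplitude differences rather than period differences.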
Abstract: The objective of this work is to make explicit the knowledge on the interactions between chlorophyll-a and nine meroplankton larvae of the epibenthic fauna. The case studied is the Arraial do Cabo upwelling system, in southeastern Brazil, which provides different environmental conditions. To assess this information, a network approach based on probability estimation was used. Comparisons among the generated graphs are made in the light of different water masses, the application of the Shannon biodiversity index, and the closeness and betweenness centrality measures. Our results show the main pattern among different water masses and how the core organisms belonging to the network skeleton are correlated with the main environmental variable. We conclude that the complex network approach is a promising tool for environmental diagnostics.
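The Shannon biodiversity index used in the comparison above is H = -sum(p_i * ln p_i) over the relative abundances of the taxa; a minimal sketch follows, with hypothetical larval counts.

```python
# Sketch of the Shannon biodiversity index applied to taxon counts:
# H = -sum(p_i * ln p_i), where p_i is the relative abundance of
# taxon i. The counts below are hypothetical illustrations.

import math

def shannon_index(counts):
    """Shannon diversity (natural log) of a list of taxon counts."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

counts = [30, 30, 20, 20]  # individuals per meroplankton taxon
print(round(shannon_index(counts), 3))  # 1.366
```

The index is maximal when abundances are even and zero when a single taxon dominates completely, which is what makes it useful for contrasting graphs built under different water masses.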
Abstract: Simulations play a major role in education, not only because they provide realistic models with which students can interact to acquire real-world experience, but also because they constitute safe environments in which students can repeat processes without any risk in order to grasp concepts and theories more easily. Virtual reality is widely recognized as a significant technological advance that can facilitate the learning process through the development of highly realistic 3D simulations supporting immersive and interactive features. The objective of this paper is to analyze the influence of the use of virtual reality in chemistry instruction, as well as to present an integrated web-based learning environment for the simulation of chemical experiments. The proposed application constitutes a cost-effective solution for both schools and universities without appropriate infrastructure and a valuable tool for distance learning and life-long education in chemistry. Its educational objectives are the familiarization of students with the equipment of a real chemical laboratory and the execution of virtual volumetric analysis experiments with the active participation of students.
Abstract: Fossil fuels are the major source for meeting the world's energy requirements, but their rapidly diminishing reserves and adverse effects on our ecological system are of major concern. Renewable energy utilization is the need of the time to meet future challenges, and ocean energy is one of these promising energy resources. Three-fourths of the earth's surface is covered by the oceans. This enormous energy resource is contained in the oceans' waters, the air above the oceans, and the land beneath them. The renewable energy of the ocean is mainly contained in waves, ocean currents, and offshore solar energy. Relatively few efforts have been made to harness this reliable and predictable resource. Harnessing ocean energy requires detailed knowledge of the underlying mathematical governing equations and their analysis. With the advent of extraordinary computational resources, it is now possible to predict the wave climatology in lab simulation. Several techniques have been developed, mostly stemming from the numerical analysis of the Navier-Stokes equations. This paper presents a brief overview of such mathematical models and tools to understand and analyze the wave climatology. Models of the 1st, 2nd, and 3rd generations have been developed to estimate the wave characteristics and assess the power potential. A brief overview of available wave energy technologies is also given. A novel concept of an on-shore wave energy extraction method is presented at the end. The concept is based upon total energy conservation, where the energy of the wave is transferred to a flexible converter to increase its kinetic energy. Squeezing action by the external pressure on the converter body results in increased velocities at the discharge section. The high velocity head can then be used for energy storage or for the direct utility of power generation. This converter utilizes both the potential and kinetic energy of the waves and is designed for on-shore or near-shore application. Increased wave height at the shore due to shoaling effects increases the potential energy of the waves, which is converted to renewable energy. This approach will result in an economical wave energy converter due to near-shore installation and denser waves due to shoaling. The method will be more efficient because it taps both the potential and kinetic energy of the waves.
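The power-potential assessment mentioned above rests on the standard wave energy density per unit sea surface area, E = rho * g * H^2 / 8, where H is the wave height; the sketch below evaluates it for illustrative inputs (the values are not from the paper).

```python
# Sketch of the standard wave energy density per unit surface area,
# E = rho * g * H**2 / 8 (J/m^2), where H is the wave height.
# Input values are illustrative, not results from the paper.

RHO = 1025.0  # seawater density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def wave_energy_density(H):
    """Wave energy per unit sea surface area (J/m^2)."""
    return RHO * G * H**2 / 8.0

print(round(wave_energy_density(2.0), 1))  # J/m^2 for a 2 m wave
```

Because E grows with the square of H, the shoaling-induced increase in wave height near the shore raises the available energy quadratically, which is the basis of the near-shore argument above.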
Abstract: Applying a rigorous process to optimize the elements of a supply-chain network resulted in a reduction of the waiting time for both the service provider and the customer. Different sources of downtime of the hydraulic pressure controller/calibrator (HPC) were causing interruptions in operations. The process examined all the issues to drive greater efficiencies. The issues included inherent design issues with the HPC pump, contamination of the HPC with impurities, and the lead time required for annual calibration in the USA. The HPC is used for the mandatory testing/verification of formation tester/pressure measurement/logging-while-drilling tools by oilfield service providers, including Halliburton. After market study and analysis, it was concluded that the current HPC model is best suited to the oilfield industry. To use the existing HPC model effectively, design and contamination issues were addressed through design and process improvements. An optimum network is proposed after comparing different supply-chain models for calibration lead-time reduction.
Abstract: Iran is one of the largest producers of dates in the world. However, due to a lack of information about their viscoelastic properties, much of the production is downgraded during harvesting and postharvest processes. In this study, the effects of the temperature and moisture content of the product on stress relaxation characteristics were investigated. Freshly harvested dates (cv. Kabkab) at the tamar stage were put in a controlled environment chamber to obtain different temperature levels (25, 35, 45, and 55 °C) and moisture contents (8.5, 8.7, 9.2, 15.3, 20, and 32.2% d.b.). A TA.XT2 texture analyzer (Stable Micro Systems, UK) was used to apply uniaxial compression tests. A chamber capable of controlling temperature was designed and fabricated around the plunger of the texture analyzer to control the temperature during the experiments. As a new approach, a CCD camera (A4Tech, 30 fps) was mounted on a cylindrical glass probe to scan and record the contact area between the date and the disk. Afterwards, the pictures were analyzed using the image processing toolbox of Matlab software. Individual date fruits were uniaxially compressed at a speed of 1 mm/s. A constant strain of 30% of the thickness of the date was applied to the horizontally oriented fruit. To select a suitable model for describing the stress relaxation of dates, the experimental data were fitted with three well-known stress relaxation models: the generalized Maxwell, Nussinovitch, and Peleg models. The constants in the mentioned models were determined and correlated with the temperature and moisture content of the product using non-linear regression analysis. It was found that the generalized Maxwell and Nussinovitch models describe the viscoelastic characteristics of date fruits more appropriately than the Peleg model.
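The generalized Maxwell model used in the fitting above predicts a relaxing stress of the form sigma(t) = sigma_e + sum_i sigma_i * exp(-t / tau_i). The sketch below evaluates this form; the element parameters are hypothetical, not fitted values from the study.

```python
# Sketch of the generalized Maxwell stress relaxation model:
# sigma(t) = sigma_e + sum_i sigma_i * exp(-t / tau_i),
# with an equilibrium stress sigma_e and (sigma_i, tau_i) elements.
# The parameter values below are hypothetical illustrations.

import math

def maxwell_stress(t, sigma_e, elements):
    """Relaxing stress at time t; elements is a list of
    (sigma_i, tau_i) pairs."""
    return sigma_e + sum(s * math.exp(-t / tau) for s, tau in elements)

elements = [(40.0, 5.0), (25.0, 60.0)]  # (kPa, s) pairs, illustrative
print(round(maxwell_stress(0.0, 10.0, elements), 2))   # 75.0 at t = 0
print(round(maxwell_stress(300.0, 10.0, elements), 2)) # decays toward sigma_e
```

In the study, such constants were obtained by non-linear regression on the measured relaxation curves and then correlated with temperature and moisture content.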
Abstract: Intercropping is one of the factors of sustainable agriculture. The SPAD meter can be used to predict the nitrogen index reliably, and it may also be a useful tool for assessing the relative impact of weeds on crops. In order to study the effect of weeds on the SPAD value of corn (Zea mays L.), sweet basil (Ocimum basilicum L.), and borage (Borago officinalis L.) in an intercropping system, a factorial experiment was conducted with three replications in 2011. The experimental factors included intercropping of corn with sweet basil and borage in different ratios (100:0, 75:25, 50:50, 25:75, and 0:100 corn to borage or sweet basil) and weed infestation (weed control and weed interference). The results showed that intercropping of corn with sweet basil and borage increased the SPAD value of corn compared to monoculture under weed interference conditions. The sweet basil SPAD value in weed control treatments (43.66) was higher than in weed interference treatments (40.17). Corn could increase the borage SPAD value compared to monoculture in weed interference treatments.
Abstract: This paper and its companion (Part 2) deal with the modeling and optimization of two NP-hard problems in the production planning of a flexible manufacturing system (FMS): the part type selection problem and the loading problem. The part type selection problem and the loading problem are strongly related and heavily influence the system's efficiency and productivity. The complexity of the problems increases when flexibilities of operations, such as the possibility of an operation being processed on alternative machines with alternative tools, are considered. These problems have been modeled and solved simultaneously by using real-coded genetic algorithms (RCGA), which use an array of real numbers as the chromosome representation. These real numbers can be converted into the part type sequence and the machines that are used to process the part types. This first part of the paper focuses on the modeling of the problems and discusses how the novel chromosome representation can be applied to solve them. The second part will discuss the effectiveness of the RCGA in solving various test bed problems.
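One common way to convert an array of real numbers into a part type sequence, sketched below, is random-keys decoding: the part indices are ordered by their real-valued keys. This is an illustrative decoding consistent with the description above, not necessarily the exact scheme of the paper.

```python
# Sketch of "random keys" decoding for a real-coded GA chromosome:
# each gene is a real number, and sorting part indices by gene value
# yields a valid part type sequence. Key values are illustrative.

def decode_sequence(keys):
    """Return part type indices ordered by ascending key value."""
    return sorted(range(len(keys)), key=lambda i: keys[i])

chromosome = [0.71, 0.12, 0.95, 0.33]
print(decode_sequence(chromosome))  # [1, 3, 0, 2]
```

A useful property of this decoding is that standard real-valued crossover and mutation always produce feasible sequences, since any key vector decodes to a valid permutation.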
Abstract: In the present paper, the three-dimensional temperature field of the tool is determined during machining and compared with experimental work on a C45 workpiece using carbide cutting tool inserts. During metal cutting operations, high temperatures are generated at the tool cutting edge, which influence the rate of tool wear. Temperature is one of the most important characteristics of machining processes, since many parameters such as cutting speed, surface quality, and cutting forces depend on it, and high temperatures can cause high mechanical stresses which lead to early tool wear and reduced tool life. Therefore, considerable attention is paid to determining tool temperatures. The experiments were carried out under dry and orthogonal machining conditions. The results show that the increase in tool temperature depends on the depth of cut and especially on the cutting speed in the high range of cutting conditions.
Abstract: The paper discusses the results obtained in predicting the reinforcement in a singly reinforced beam using neural networks (NN), support vector machines (SVMs), and tree based models. A major advantage of SVMs over NNs is that they minimize a bound on the generalization error of the model rather than a bound on the mean square error over the data set, as done in NNs. The tree based approach divides the problem into a small number of sub-problems to reach a conclusion. A dataset was created for different beam parameters by calculating the reinforcement using the limit state method, for model creation and validation. The results from this study suggest a remarkably good performance of the tree based and SVM models. Further, this study found that these two techniques work well, and even better than neural network methods. A comparison of predicted values with actual values suggests a very good correlation coefficient with all four techniques.
Abstract: In this paper, a design methodology to implement a low-power and high-speed 2nd order recursive digital Infinite Impulse Response (IIR) filter is proposed. Since IIR filters suffer from a large number of constant multiplications, the proposed method replaces the constant multiplications with addition/subtraction and shift operations. The proposed new 6T adder cell is used as the Carry-Save Adder (CSA) to implement the addition/subtraction operations in the design of the recursive section of the IIR filter, to reduce the propagation delay. Furthermore, high-level algorithms designed for optimizing the number of CSA blocks are used to reduce the complexity of the IIR filter. The DSCH3 tool is used to generate the schematic of the proposed 6T CSA based shift-adds architecture design, and it is analyzed using the Microwind CAD tool to synthesize low-complexity and high-speed IIR filters. The proposed design outperforms MUX-12T and MCIT-7T based CSA adder filter designs in terms of power, propagation delay, area, and throughput. It is observed from the experimental results that the proposed 6T based design method can find better IIR filter designs in terms of power and delay than those obtained by using efficient general multipliers.
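The shift-adds idea above can be illustrated with a single constant: for example, 23x = 16x + 8x - x needs only two shifted copies of x, one addition, and one subtraction instead of a general multiplier. The decomposition below is a hand-picked illustration, not a coefficient from the filter design.

```python
# Sketch of replacing a constant multiplication by shifts and
# additions/subtractions, the core of the shift-adds architecture:
# 23*x = (x << 4) + (x << 3) - x. The constant 23 is illustrative.

def mul23_shift_add(x):
    """Compute 23*x using only shifts and add/subtract."""
    return (x << 4) + (x << 3) - x  # 16x + 8x - x

for x in (1, 7, -5):
    assert mul23_shift_add(x) == 23 * x
print(mul23_shift_add(7))  # 161
```

In hardware, each such addition/subtraction maps to a CSA block, which is why minimizing the number of add/subtract terms per coefficient reduces the filter's area and delay.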
Abstract: Nowadays, we are facing network threats that cause enormous damage to the Internet community day by day. In this situation, more and more people try to protect their network security using traditional mechanisms such as firewalls, Intrusion Detection Systems, etc. Among them, the honeypot is a versatile tool for the security practitioner: honeypots are tools that are meant to be attacked or interacted with in order to gather more information about attackers, their motives, and their tools. In this paper, we describe the usefulness of low-interaction and high-interaction honeypots and compare them. We then propose a hybrid honeypot architecture that combines low- and high-interaction honeypots to mitigate the drawbacks of each. In this architecture, the low-interaction honeypot is used as a traffic filter. Activities like port scanning can be effectively detected by the low-interaction honeypot and stopped there. Traffic that cannot be handled by the low-interaction honeypot is handed over to the high-interaction honeypot. In this case, the low-interaction honeypot acts as a proxy, whereas the high-interaction honeypot offers an optimal level of realism. To prevent the high-interaction honeypot from infection, a containment environment (VMware) is used.
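The traffic-filter role of the low-interaction honeypot can be sketched as a simple port-scan detector: a source that touches many distinct destination ports is flagged and stopped, and only the remaining traffic is handed over to the high-interaction honeypot. The threshold and the packet log below are illustrative assumptions, not details from the paper.

```python
# Sketch of the low-interaction honeypot's traffic-filter role:
# flag sources touching many distinct ports as scanners and hand
# only the remaining traffic to the high-interaction honeypot.
# Threshold and packet log are hypothetical.

from collections import defaultdict

SCAN_THRESHOLD = 10  # distinct ports before a source is flagged

def split_traffic(packets):
    """Split (src_ip, dst_port) pairs into (scanners, handover)."""
    ports_by_src = defaultdict(set)
    for src, port in packets:
        ports_by_src[src].add(port)
    scanners = {s for s, p in ports_by_src.items() if len(p) >= SCAN_THRESHOLD}
    handover = [(s, p) for s, p in packets if s not in scanners]
    return scanners, handover

packets = [("10.0.0.9", p) for p in range(20, 31)] + [("10.0.0.5", 80)]
scanners, handover = split_traffic(packets)
print(scanners, handover)  # 10.0.0.9 is flagged; 10.0.0.5 is handed over
```

A production filter would work on live flows rather than a static list, but the division of labor is the same: cheap detection in front, expensive realism behind.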