Abstract: A learning content management system (LCMS) is an
environment to support web-based learning content development.
The primary function of the system is to manage the learning process
as well as to generate content customized to meet the unique
requirements of each learner. Among the supporting tools offered by
several vendors, we propose to enhance LCMS functionality with an
induction ability that individualizes the presented content. Our
induction technique is based on rough set theory. The induced rules
are intended to serve as supportive knowledge for guiding
content-flow planning. They can also be used as decision rules to
help content developers manage the content delivered to each
individual learner.
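The rule-induction step described above can be sketched in a few lines. This is a minimal illustration of the rough-set idea (grouping examples into equivalence classes over the condition attributes, and emitting a certain rule whenever a class is consistent on the decision attribute), not the authors' implementation; the learner attributes shown are invented for the example.

```python
from collections import defaultdict

def induce_rules(table, condition_attrs, decision_attr):
    """Induce certain decision rules from a decision table: examples
    identical on the condition attributes form an indiscernibility
    class; a class that agrees on the decision yields a certain rule."""
    classes = defaultdict(list)
    for row in table:
        key = tuple(row[a] for a in condition_attrs)
        classes[key].append(row[decision_attr])
    rules = []
    for key, decisions in classes.items():
        if len(set(decisions)) == 1:  # consistent class -> certain rule
            cond = dict(zip(condition_attrs, key))
            rules.append((cond, decisions[0]))
    return rules

# Toy learner records: prior score and pace determine the next unit.
table = [
    {"score": "low",  "pace": "slow", "next": "review"},
    {"score": "low",  "pace": "slow", "next": "review"},
    {"score": "high", "pace": "fast", "next": "advanced"},
    {"score": "high", "pace": "slow", "next": "practice"},
]
rules = induce_rules(table, ["score", "pace"], "next")
```

Each rule pairs a condition (attribute values) with a decision, which is the form needed to guide content-flow planning.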
Abstract: A circular knitting machine produces fabric with more than two knitting tools. Variation of yarn tension between the different knitting tools causes different loop lengths of stitches during the knitting process. In this research, a new intelligent method is applied to control the loop length of stitches in the various tools, based on the ideal shape of stitches and the real angle of stitch direction, since different loop lengths cause stitch deformation and deviation of that angle. To measure the deviation of stitch direction against variation of tension, an image processing technique was applied to pictures of different fabrics under constant front light. The rate of deformation was then translated into the compensation of loop-length cam degree needed to correct the stitch deformation. A fuzzy control algorithm was applied to modify the loop length in the knitting tools. The presented method was tested on different knitted fabrics of various structures and yarns. The results show that the presented method is usable for controlling loop-length variation between different knitting tools, based on stitch deformation, for various knitted fabrics with different fabric structures, densities and yarn types.
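The fuzzy loop-length correction can be sketched as a tiny Mamdani-style controller. The membership ranges, the three rules (negative/zero/positive angle deviation mapping to loosen/hold/tighten), and the output cam degrees are all illustrative assumptions, not the paper's tuned values.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def cam_correction(angle_dev):
    """Map a stitch-angle deviation (degrees) to a loop-length cam
    correction (degrees) with three fuzzy rules and a weighted
    centroid over singleton output actions."""
    neg  = tri(angle_dev, -10.0, -5.0, 0.0)   # deviation negative
    zero = tri(angle_dev,  -5.0,  0.0, 5.0)   # deviation near zero
    pos  = tri(angle_dev,   0.0,  5.0, 10.0)  # deviation positive
    strengths = [neg, zero, pos]
    actions   = [-3.0, 0.0, 3.0]              # cam-degree corrections
    total = sum(strengths)
    if total == 0.0:
        return 0.0
    return sum(s * a for s, a in zip(strengths, actions)) / total
```

Deviations between the rule peaks blend the two neighbouring actions, giving a smooth correction rather than a step change.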
Abstract: Machining hard-to-cut materials with uncoated carbide cutting tools in turning not only reduces tool life but also impairs the surface roughness of the product. In this paper, the influence of the hot machining method was studied and is presented in two cases. Case 1: workpiece surface roughness quality with constant cutting parameters and an initial workpiece surface temperature of 300 ºC. Case 2: tool temperature variation when cutting at two speeds, 78.5 m/min and 51 m/min. The workpiece material and tool used in this study were AISI 1060 steel (45 HRC) and uncoated carbide TNNM 120408-SP10 (SANDVIK Coromant), respectively. A gas flame heating source was used to preheat the workpiece surface up to 300 ºC, reducing the yield stress by about 15%. The experimentally obtained results show that the method used can considerably improve the surface quality of the workpiece.
Abstract: The most common forensic activity is searching a hard
disk for strings of data. Nowadays, investigators and analysts
increasingly encounter large, even terabyte-sized data sets when
conducting digital investigations, so a sequential search can take
weeks to complete. There are two primary search methods: index-based
search and bitwise search. Index-based searching is very fast after
the initial indexing, but that indexing takes a long time. In this
paper, we discuss a high-speed bitwise search model for large-scale
digital forensic investigations. We used a pattern-matching board, of
the kind generally used for network security, to search for strings
and complex regular expressions. Our results indicate that in many
cases the use of a pattern-matching board can substantially increase
the performance of digital forensic search tools.
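The bitwise-search idea, stripped of the hardware board, can be sketched in software: stream the disk image in chunks and keep a small overlap between chunks so a match straddling a boundary is not missed. The chunk and overlap sizes are illustrative, and `re` stands in for the board's regular-expression engine.

```python
import io
import re

def search_image(stream, pattern, chunk_size=1 << 16, overlap=64):
    """Scan a binary stream chunk by chunk for a compiled byte
    pattern.  The last `overlap` bytes of each buffer are carried
    into the next round so matches spanning a chunk boundary are
    found; a set deduplicates matches re-seen inside the overlap."""
    offsets = set()
    pos = 0            # absolute stream offset of buf[0]
    buf = b""
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        buf += chunk
        for m in pattern.finditer(buf):
            offsets.add(pos + m.start())
        if len(buf) > overlap:     # slide the window forward
            pos += len(buf) - overlap
            buf = buf[-overlap:]
    return sorted(offsets)

# The keyword deliberately straddles the first 64 KiB chunk boundary.
image = b"\x00" * 65533 + b"secret" + b"\x00" * 1000
hits = search_image(io.BytesIO(image), re.compile(b"secret"))
```

The overlap must be at least one byte shorter than the longest possible match; a hardware matcher avoids this bookkeeping by streaming bytes continuously.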
Abstract: In this paper an efficient implementation of the
RIPEMD-160 hash function is presented. Hash functions are a special
family of cryptographic algorithms used in technological
applications with requirements for security, confidentiality and
validity. Applications such as PKI, IPSec, DSA and MACs incorporate
hash functions and are widely used today. RIPEMD-160 emerged from
the need for algorithms that remain very strong against
cryptanalysis. The proposed hardware implementation can be
synthesized easily for a variety of FPGA and ASIC technologies.
Simulation results, using commercial tools, verified the efficiency
of the implementation in terms of performance and throughput.
Special care has been taken so that the proposed implementation does
not introduce extra design complexity, while functionality is kept
at the required level.
Abstract: High-level synthesis (HLS) is a process that generates a
register-transfer-level design for digital systems from a behavioral
description. There are many HLS algorithms and commercial tools.
However, most of these algorithms consider a behavioral description
of the system in which a single token is presented to the system at
a time. This approach does not exploit extra hardware efficiently,
especially in the design of digital filters, where common operations
may exist between successive tokens. In this paper, we modify the
behavioral description to process multiple tokens in parallel.
Unlike full parallel processing, however, this approach does not
require full hardware replication: it exploits the presence of
common operations between successive tokens. The performance of the
proposed approach is better than sequential processing and
approaches that of full parallel processing as the hardware
resources are increased.
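The sharing of common operations between successive tokens can be illustrated with simple value numbering: an operation whose opcode and (canonicalized) operands were already computed for an earlier token reuses that hardware unit instead of requiring a new one. This is a toy sketch of the principle, not the paper's HLS algorithm, and commutativity of addition is ignored for brevity.

```python
def shared_ops(ops):
    """Count operations sharable via value numbering.  Each op is
    (dest, opcode, src1, src2); `canon` maps names and (opcode,
    operand) keys to the canonical producing name, so a repeated
    computation is detected and counted as shared hardware."""
    canon = {}
    shared = 0
    for dest, op, a, b in ops:
        key = (op, canon.get(a, a), canon.get(b, b))
        if key in canon:
            canon[dest] = canon[key]   # reuse the earlier result
            shared += 1
        else:
            canon[key] = dest
            canon[dest] = dest
    return shared

# Two successive tokens of a filter sharing the partial sum x1 + x2:
ops = [
    ("t1", "add", "x1", "x2"),   # token i
    ("y1", "add", "t1", "x0"),
    ("t2", "add", "x1", "x2"),   # token i+1: same partial sum
    ("y2", "add", "t2", "x3"),
]
```

Here four additions need only three adders, which is the source of the savings relative to full hardware replication.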
Abstract: Chikungunya virus (CHICKV) is an arbovirus belonging to the family Togaviridae and is transmitted to humans through the bite of mosquitoes (Aedes aegypti and Aedes albopictus). A large outbreak of chikungunya was reported in India between 2006 and 2007, along with several other countries in South-East Asia and, for the first time, in Europe. It was also the first time that a CHICKV outbreak was reported with mortality from Reunion Island and increased mortality from Asian countries. CHICKV affects all age groups, and currently there is no specific drug or vaccine to cure the disease. The need for antiviral agents for the treatment of CHICKV infection, and the success of virtual screening against many therapeutically valuable targets, led us to carry out structure-based drug design against the Chikungunya nsP2 protease (PDB: 3TRK). High-throughput virtual screening of the publicly available databases ZINC12 and BindingDB was carried out using the OpenEye tools and Schrodinger LLC software packages. The OpenEye Filter program was used to filter the databases, and the filtered outputs were docked using the HTVS protocol implemented in the GLIDE package of Schrodinger LLC. The top hits were further used to enrich similar molecules from the database through vROCS, a shape-based screening protocol implemented in OpenEye. The approach adopted provided different scaffolds as hits against the CHICKV protease. Three scaffolds, indole, pyrazole and sulphone derivatives, were selected based on docking score and synthetic feasibility. Derivatives of pyrazole were synthesized and submitted for antiviral screening against CHICKV.
Abstract: Today's healthcare industries have become more
patient-centric than profession-centric, so the quality of
healthcare and patient safety are major concerns in modern
healthcare facilities. An unplanned extubation (UE) may be
detrimental to the patient's life and is thus one of the major
indexes of patient safety and healthcare quality. A high UE rate
damages not only healthcare quality and patient-safety policy but
also nurses' morale and job satisfaction. The UE problem in a
psychiatric hospital is unique and can be a tough challenge for
healthcare professionals, since the patients mostly lack
communication capabilities. We report in this paper a project
organized to reduce the UE rate from the then-current 2.3% to a
lower and satisfactory level in the long-term care units of a
psychiatric hospital. The project was conducted between March 1st,
2011 and August 31st, 2011. Based on the error information gathered
from various units of the hospital, the team analyzed the root
causes and proposed possible solutions at its meetings. Four
solutions were then agreed by consensus and launched in the units in
question. The UE rate has now been reduced to 0.17%. The experience
from this project, and the procedures and tools adopted, should be a
good reference for other hospitals.
Abstract: In recent years, researchers have developed various tools
and methodologies for effective clinical decision-making. Among
those decisions, chest pain diseases are one of the important
diagnostic issues, especially in an emergency department. To improve
physicians' diagnostic ability, many researchers have developed
diagnostic intelligence using machine learning and data mining.
However, most conventional methodologies have been based on a single
classifier for disease classification and prediction, which shows
only moderate performance. This study utilizes an ensemble strategy
that combines multiple different classifiers to help physicians
diagnose chest pain diseases more accurately than before.
Specifically, the ensemble integrates decision trees, neural
networks, and support vector machines. The ensemble models are
applied to real-world emergency data. This study shows that the
performance of the ensemble models is superior to that of each
single classifier.
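The core of the ensemble strategy, combining classifier outputs by majority vote, can be sketched as below. The three lambdas are hypothetical stand-ins for the trained decision tree, neural network and SVM (here just threshold rules on a single invented risk feature), not the study's actual models.

```python
from collections import Counter

def majority_vote(classifiers, x):
    """Combine several classifiers' predictions by simple majority
    voting; ties fall back to the first classifier's answer."""
    votes = [clf(x) for clf in classifiers]
    winner, count = Counter(votes).most_common(1)[0]
    return winner if count > len(votes) // 2 else votes[0]

# Stand-ins for a decision tree, a neural net and an SVM:
tree = lambda x: "cardiac" if x > 0.5 else "benign"
nn   = lambda x: "cardiac" if x > 0.4 else "benign"
svm  = lambda x: "cardiac" if x > 0.7 else "benign"
clfs = [tree, nn, svm]
```

With x = 0.6 the tree and the net outvote the SVM, showing how the ensemble can overrule one mistaken classifier.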
Abstract: This paper presents software tools that convert the C/C++ floating-point source code for a DSP algorithm into a fixed-point simulation model that can be used to evaluate the numerical performance of the algorithm on several different fixed-point platforms, including microprocessors, DSPs and FPGAs. The tools use a novel system for maintaining binary point information, so that the conversion from floating point to fixed point is automated and the resulting fixed-point algorithm achieves the maximum possible precision. A configurable architecture is used during the simulation phase so that the algorithm can produce a bit-exact output for several different target devices.
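The float-to-fixed conversion and the automated binary-point placement can be illustrated with a minimal Q-format sketch. This is a generic demonstration of the idea (choose the fewest integer bits that cover the data's peak magnitude, then quantize with saturation), not the paper's tools; word length and helper names are assumptions.

```python
def to_fixed(x, frac_bits, word_bits=16):
    """Quantize a float to a two's-complement fixed-point word with
    `frac_bits` fractional bits, saturating at the word limits."""
    scaled = round(x * (1 << frac_bits))
    lo, hi = -(1 << (word_bits - 1)), (1 << (word_bits - 1)) - 1
    return max(lo, min(hi, scaled))

def to_float(q, frac_bits):
    """Reinterpret a fixed-point word back as a float."""
    return q / (1 << frac_bits)

def best_frac_bits(values, word_bits=16):
    """Place the binary point automatically: use the fewest integer
    bits that still cover max |value|, leaving the most fractional
    bits -- i.e. the maximum possible precision."""
    peak = max(abs(v) for v in values)
    int_bits = 0
    while (1 << int_bits) <= peak:
        int_bits += 1
    return word_bits - 1 - int_bits   # one bit reserved for sign
```

Signals bounded by ±1 get the full Q1.15 precision, while a peak of 3.2 forces two integer bits and Q3.13.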
Abstract: Fossil fuels are the major source for meeting the world's
energy requirements, but their rapidly diminishing reserves and
adverse effects on our ecological system are of major concern.
Renewable energy utilization is the need of the time to meet future
challenges, and ocean energy is one of these promising energy
resources. Three-fourths of the earth's surface is covered by the
oceans. This enormous energy resource is contained in the oceans'
waters, the air above the oceans, and the land beneath them. The
renewable energy of the ocean is mainly contained in waves, ocean
currents and offshore solar energy. Relatively few efforts have been
made to harness this reliable and predictable resource. Harnessing
ocean energy requires detailed knowledge of the underlying governing
equations and their analysis. With the advent of extraordinary
computational resources it is now possible to predict the wave
climatology in lab simulation. Several techniques have been
developed, mostly stemming from numerical analysis of the
Navier-Stokes equations. This paper presents a brief overview of
such mathematical models and tools for understanding and analyzing
wave climatology. Models of the 1st, 2nd and 3rd generations have
been developed to estimate wave characteristics and assess the power
potential. A brief overview of available wave energy technologies is
also given. A novel concept for an on-shore wave energy extraction
method is presented at the end. The concept is based on total energy
conservation, where the energy of the wave is transferred to a
flexible converter to increase its kinetic energy. Squeezing action
by the external pressure on the converter body results in increased
velocities at the discharge section. The high velocity head can then
be used for energy storage or directly for power generation. This
converter utilizes both the potential and the kinetic energy of the
waves and is designed for on-shore or near-shore application.
Increased wave height at the shore due to shoaling increases the
potential energy of the waves, which is converted to renewable
energy. This approach should result in an economical wave energy
converter, due to near-shore installation and denser waves caused by
shoaling, and the method will be more efficient because it taps both
the potential and the kinetic energy of the waves.
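For the power-potential assessment mentioned above, the standard linear-theory estimate of deep-water wave energy flux can be computed directly. The formula below is the textbook deep-water result for a regular wave, not a formula from this paper, and the seawater density value is a common assumption.

```python
import math

def wave_power(H, T, rho=1025.0, g=9.81):
    """Deep-water energy flux per metre of wave crest (W/m) for a
    regular wave of height H (m) and period T (s), from linear wave
    theory: P = rho * g**2 * H**2 * T / (32 * pi).  It accounts for
    both the potential and the kinetic energy carried by the wave."""
    return rho * g**2 * H**2 * T / (32 * math.pi)
```

A 2 m, 8 s wave carries roughly 31 kW per metre of crest, which is the scale of resource the converters discussed here aim to tap.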
Abstract: Applying a rigorous process to optimize the elements of a
supply-chain network resulted in reduced waiting time for both the
service provider and the customer. Different sources of downtime of
the hydraulic pressure controller/calibrator (HPC) were causing
interruptions in operations. The process examined all the issues in
order to drive greater efficiencies. The issues included inherent
design issues with the HPC pump, contamination of the HPC with
impurities, and the lead time required for annual calibration in the
USA. The HPC is used for mandatory testing/verification of
formation-tester, pressure-measurement and logging-while-drilling
tools by oilfield service providers, including Halliburton.
After market study and analysis, it was concluded that the current
HPC model is best suited to the oilfield industry. To use the
existing HPC model effectively, the design and contamination issues
were addressed through design and process improvements. An optimum
network is proposed after comparing different supply-chain models
for calibration lead-time reduction.
Abstract: This paper and its companion (Part 2) deal with modeling
and optimization of two NP-hard problems in the production planning
of flexible manufacturing systems (FMS): the part type selection
problem and the loading problem. The two problems are strongly
related and heavily influence the system's efficiency and
productivity. The problems become even harder when flexibilities of
operations, such as the possibility of an operation being processed
on alternative machines with alternative tools, are considered.
These problems have been modeled and solved simultaneously using
real-coded genetic algorithms (RCGA), which use an array of real
numbers as the chromosome representation. These real numbers can be
converted into a part type sequence and the machines used to process
the part types. This first part of the papers focuses on modeling
the problems and discusses how the novel chromosome representation
can be applied to solve them. The second part will discuss the
effectiveness of the RCGA in solving various test bed problems.
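The conversion of real-valued genes into a part type sequence and machine assignment can be sketched with a random-keys decoding. This is a hypothetical decoding in the spirit of the RCGA representation described, not the authors' exact scheme; the gene layout is an assumption for illustration.

```python
def decode(chromosome, n_parts, n_machines):
    """Decode a real-valued chromosome: the first n_parts genes act
    as sort keys ("random keys") giving the part type sequence; the
    next n_parts genes each select a machine by scaling into the
    machine index range."""
    keys = chromosome[:n_parts]
    sequence = sorted(range(n_parts), key=lambda i: keys[i])
    machines = [int(g * n_machines) % n_machines
                for g in chromosome[n_parts:2 * n_parts]]
    return sequence, machines

# 3 part types, 3 machines, chromosome of 6 genes in [0, 1):
seq, mach = decode([0.7, 0.1, 0.4, 0.9, 0.2, 0.6], 3, 3)
```

Because any array of reals decodes to a valid sequence and assignment, standard real-coded crossover and mutation never produce infeasible chromosomes, which is the usual appeal of this encoding.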
Abstract: The paper discusses the results obtained in predicting the
reinforcement in a singly reinforced beam using neural networks
(NN), support vector machines (SVMs) and tree-based models. A major
advantage of SVMs over NNs is that they minimize a bound on the
generalization error of the model rather than a bound on the mean
square error over the data set, as done in NNs. The tree-based
approach divides the problem into a small number of subproblems in
order to reach a conclusion. A data set was created for different
beam parameters, with the reinforcement calculated by the limit
state method, for model creation and validation. The results from
this study suggest remarkably good performance of the tree-based and
SVM models. Further, this study found that these two techniques work
well, and even better than neural network methods. A comparison of
predicted values with actual values shows a very good correlation
coefficient for all four techniques.
Abstract: Nowadays, we face network threats that cause enormous
damage to the Internet community day by day. In this situation, more
and more people try to protect their networks using traditional
mechanisms such as firewalls and intrusion detection systems. Among
these, the honeypot is a versatile tool for the security
practitioner: honeypots are tools that are meant to be attacked or
interacted with in order to gain more information about attackers,
their motives and their tools. In this paper, we describe the
usefulness of low-interaction and high-interaction honeypots and
compare them. We then propose a hybrid honeypot architecture that
combines low- and high-interaction honeypots to mitigate the
drawbacks of each. In this architecture, the low-interaction
honeypot is used as a traffic filter. Activities such as port
scanning can be effectively detected by the low-interaction honeypot
and stopped there. Traffic that cannot be handled by the
low-interaction honeypot is handed over to the high-interaction
honeypot. In this case, the low-interaction honeypot acts as a
proxy, whereas the high-interaction honeypot offers an optimal level
of realism. To prevent the high-interaction honeypot from infection,
a containment environment (VMware) is used.
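The filter-then-hand-over decision at the heart of the hybrid architecture can be sketched as a per-source routing rule. The scan heuristic (many distinct ports, no payload) and the thresholds are illustrative assumptions, not the paper's detection logic.

```python
def route(events):
    """Decide, per source address, whether traffic is absorbed by the
    low-interaction honeypot or handed to the high-interaction one.
    A source touching many distinct ports while sending no payload
    looks like a port scan and stops at the filter; anything
    delivering payload is proxied on for full-realism interaction."""
    by_src = {}
    for src, port, payload_len in events:
        ports, seen = by_src.get(src, (set(), 0))
        ports.add(port)
        by_src[src] = (ports, seen + payload_len)
    decisions = {}
    for src, (ports, seen) in by_src.items():
        if len(ports) >= 10 and seen == 0:
            decisions[src] = "low"    # scan: detected and stopped here
        else:
            decisions[src] = "high"   # hand over for deep interaction
    return decisions

# One scanner sweeping 12 ports, one attacker sending payload on 22:
events = [("10.0.0.1", p, 0) for p in range(12)] + [("10.0.0.2", 22, 120)]
decisions = route(events)
```

Only the second source reaches the costly high-interaction system, which is exactly the load reduction the filter is meant to provide.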
Abstract: Soft topological spaces are considered mathematical tools for dealing with uncertainties, and a fuzzy topological space is a special case of a soft topological space. The purpose of this paper is to study soft topological spaces. We introduce some new concepts in soft topological spaces, such as soft first-countable spaces, soft second-countable spaces and soft separable spaces, and explore some basic properties of these concepts.
Abstract: Research on brain-computer interfaces (BCI) has increased
recently. Functional near-infrared spectroscopy (fNIRS) is one of
the latest technologies that utilize light in the near-infrared
range to determine brain activity. Because near-infrared technology
allows the design of safe, portable, wearable, non-invasive and
wireless monitoring systems, fNIRS monitoring of brain hemodynamics
can be of value in helping to understand brain tasks. In this paper,
we present results of fNIRS signal analysis indicating that there
exist distinct patterns of hemodynamic response from which brain
tasks can be recognized, toward developing a BCI. We applied two
different mathematical tools: wavelet analysis for preprocessing, as
signal filtering and feature extraction, and neural networks as a
classification module for recognizing brain tasks. We also discuss
and compare with other methods; our proposal performs better, with
an average classification accuracy of 99.9%.
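The wavelet preprocessing step can be illustrated with one level of the Haar transform, the simplest wavelet: pairwise averages give a smoothed (low-pass) approximation of the signal, and pairwise differences give the detail coefficients used as features. The paper does not state which wavelet family it used, so Haar here is an assumption for illustration only.

```python
def haar_step(signal):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation) and pairwise differences (detail).  Assumes an
    even-length input."""
    approx = [(signal[i] + signal[i + 1]) / 2
              for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2
              for i in range(0, len(signal), 2)]
    return approx, detail
```

Applying the step recursively to the approximation yields a multi-resolution decomposition; the coefficients would then feed the neural-network classifier.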
Abstract: Globalization, supported by information and communication
technologies, changes the rules of competitiveness and increases the
significance of information, knowledge and network cooperation. In
line with this trend, the need for efficient trust-building tools
has emerged; the absence of trust-building mechanisms and strategies
has been identified in several studies. Through trust development,
participation in e-business networks and usage of network services
will increase and provide SMEs with new economic benefits. This work
focuses on the development of effective trust-building strategies
for electronic business network platforms. Based on the
identification of trust-building mechanisms, a questionnaire-based
analysis of their significance and of the minimum level of
requirements was conducted. In the paper, we confirm that trust
depends on e-skills, which play a crucial role in achieving a higher
level of trust in the more sophisticated and complex trust-building
ICT solutions.
Abstract: This research contribution advances the idea of a collaborative environment for the execution of student satellite projects against the backdrop of project management principles. The recent past has witnessed a technological shift in the aerospace industry from big satellite projects to small spacecraft, especially for earth observation and communication purposes. This vibrant shift has energized academia and industry to share their resources and to create a win-win paradigm of mutual success and technological development, along with human resource development in the field of aerospace. Small student satellites are the latest trend in academia: more than 100 CUBESAT projects have been executed successfully all over the globe, and many new student satellite projects are in the development phase. Small satellite project management requires the application of specific knowledge, skills, tools and techniques to achieve the defined mission requirements. The authors present a detailed outline for the project management of student satellites and describe the role of industry in collaborating with academia to obtain optimal results in an academic environment.
Abstract: As hardware technology advances, the cost of storage is
decreasing, so there is an urgent need for new techniques and tools
that can intelligently and automatically assist us in transforming
data into useful knowledge. Different data mining techniques have
been developed that are helpful for handling such large databases
[7]. Data mining is also finding a role in the field of
biotechnology. Pedigree means the associated ancestry of a crop
variety. Genetic diversity is the variation in the genetic
composition of individuals within or among species, and it depends
upon the pedigree information of the varieties. Parents at lower
hierarchic levels have more weightage for predicting genetic
diversity than those at upper hierarchic levels; the weightage
decreases as the level increases. For crossbreeding, the two
varieties should be as genetically diverse as possible, so as to
incorporate the useful characters of both varieties in the newly
developed variety. This paper discusses searching and analyzing
different possible pairs of varieties, selected on the basis of
morphological characters, climatic conditions and nutrients, so as
to obtain the most optimal pair that can produce the required
crossbred variety. An algorithm was developed to determine the
genetic diversity between the selected wheat varieties, and a
cluster analysis technique is used for retrieving the results.
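The level-weighted diversity idea can be sketched as a simple pedigree comparison: shared ancestors reduce the diversity score, weighted by the reciprocal of their hierarchic level so that more recent (lower-level) parents count more. The scoring function and the variety names are illustrative inventions, not the paper's algorithm or data.

```python
def diversity(pedigree_a, pedigree_b):
    """Score genetic diversity between two varieties whose pedigrees
    are given as {ancestor: level}, level 1 being the immediate
    parents.  Each shared ancestor adds 1/level of overlap (deeper
    levels weigh less); the score is 1.0 for no shared ancestry and
    falls toward 0.0 as overlap grows."""
    overlap = 0.0
    for ancestor, level in pedigree_a.items():
        if ancestor in pedigree_b:
            # use the deeper of the two levels, i.e. the lower weight
            overlap += 1.0 / max(level, pedigree_b[ancestor])
    return 1.0 / (1.0 + overlap)

# Illustrative wheat pedigrees sharing one level-2 ancestor:
a = {"VarP1": 1, "SharedGP": 2}
b = {"VarP2": 1, "SharedGP": 2}
```

Candidate pairs would then be ranked by this score (after the morphological, climatic and nutrient filtering) before cluster analysis groups similar varieties.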