Abstract: One major issue that is regularly cited as a block to
the widespread use of online assessments in eLearning is that of the
authentication of the student and the level of confidence that an
assessor can have that the assessment was actually completed by that
student. Currently, this issue is either ignored, in which case
confidence in the assessment and any ensuing qualification is
damaged, or else assessments are conducted at central, controlled
locations at specified times, losing the benefits of the distributed
nature of the learning programme. Particularly as we move towards
constructivist models of learning, with intentions towards achieving
heutagogic learning environments, the benefits of a properly
managed online assessment system are clear. Here we discuss some
of the approaches that could be adopted to address these issues,
looking at the use of existing security and biometric techniques,
combined with some novel behavioural elements. These approaches
offer the opportunity to validate the student on accessing an
assessment, on submission, and also during the actual production of
the assessment. These techniques are currently under development in
the DECADE project, and future work will evaluate and report their
use.
Abstract: In this study, a synthetic pathway was created by
assembling genes from Clostridium butyricum and Escherichia coli
in different combinations. Among the genes were dhaB1 and dhaB2
from C. butyricum VPI1718 coding for glycerol dehydratase (GDHt)
and its activator (GDHtAc), respectively, involved in the conversion
of glycerol to 3-hydroxypropionaldehyde (3-HPA). The yqhD gene
from E. coli BL21 was also included, which codes for an NADPH-dependent
1,3-propanediol oxidoreductase isoenzyme (PDORI)
reducing 3-HPA to 1,3-propanediol (1,3-PD). Molecular modeling
analysis indicated that the conformation of fusion protein of YQHD
and DHAB1 was favorable for direct molecular channeling of the
intermediate 3-HPA. According to the simulation results, the yqhD
and dhaB1 genes were assembled upstream of dhaB2 to express
a fusion protein, yielding the recombinant strain E. coli BL21
(DE3)//pET22b+::yqhD-dhaB1_dhaB2 (strain BP41Y3). Strain
BP41Y3 gave a 10-fold higher 1,3-PD concentration than E. coli BL21
(DE3)//pET22b+::yqhD-dhaB1_dhaB2 (strain BP31Y2) expressing
the recombinant enzymes simultaneously but in a non-fusion mode.
This is the first report using a gene fusion approach to enhance the
biological conversion of glycerol to the value-added compound 1,3-PD.
Abstract: In general, image-based 3D scenes can now be found in many popular vision systems, computer games and virtual reality tours, so it is important to segment the region of interest (ROI) from input scenes as a preprocessing step for geometric structure detection in a 3D scene. In this paper, we propose a method for segmenting the ROI based on tensor voting and a Dirichlet process mixture model. In particular, to estimate geometric structure information for a 3D scene from a single outdoor image, we apply tensor voting and the Dirichlet process mixture model to image segmentation. Tensor voting is used based on the fact that homogeneous regions in an image usually lie close together on a smooth region, so the tokens corresponding to the centers of these regions have high saliency values. The proposed approach is a novel nonparametric Bayesian segmentation method that uses a Gaussian Dirichlet process mixture model to automatically segment various natural scenes. Finally, our method labels regions of the input image into coarse categories: "ground", "sky", and "vertical" for 3D applications. The experimental results show that our method successfully segments coarse regions in many complex natural scene images for 3D.
Abstract: Iterative learning control aims to achieve zero tracking
error of a specific command. This is accomplished by iteratively
adjusting the command given to a feedback control system, based on
the tracking error observed in the previous iteration. One would like
the iterations to converge to zero tracking error in spite of any error
present in the model used to design the learning law. First, this need
for stability robustness is discussed, and then the need for robustness
of well-behaved transients. Methods of
producing the needed robustness to parameter variations and to
singular perturbations are presented. Then a method involving
reverse-time runs is given that lets the real-world behavior produce the
ILC gains in such a way as to eliminate the need for a mathematical
model. Since the real world is producing the gains, there is no issue
of model error. Provided the world behaves linearly, the approach
gives an ILC law with both stability robustness and good transient
robustness, without the need to generate a model.
Abstract: Electrocardiogram (ECG) segmentation is necessary to help reduce the time-consuming task of manually annotating ECGs. Several algorithms have been developed to segment the ECG automatically. We first review several such methods, and then present a new single-lead segmentation method based on adaptive piecewise constant approximation (APCA) and piecewise derivative dynamic time warping (PDDTW). The results are tested on the QT database. We compared our results to Laguna's two-lead method. Our proposed approach has a comparable mean error, but yields a slightly higher standard deviation than Laguna's method.
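The core of the alignment step is dynamic time warping. Below is a minimal DTW sketch; PDDTW, as described in the abstract, applies the same recursion to the derivative of each signal, approximated here by a first difference. The sequences are illustrative, not ECG data.

```python
# Minimal dynamic time warping (DTW) sketch with the classic O(n*m) recursion.
def dtw(a, b):
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # cheapest way to arrive: insertion, deletion, or match
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def derivative(x):
    # first difference, standing in for the derivative step of PDDTW
    return [x[i + 1] - x[i] for i in range(len(x) - 1)]
```

A PDDTW-style cost between two beats a and b would then be `dtw(derivative(a), derivative(b))`.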
Abstract: This article applies the monthly final energy yield and
failure data of 202 PV systems installed in Taiwan to analyze PV
operational performance and system availability. This
data is collected by Industrial Technology Research Institute through
manual records. Bad data detection and failure data estimation
approaches are proposed to guarantee the quality of the received
information. The performance ratio value and system availability are
then calculated and compared with those of other countries. It is
indicated that the average performance ratio of Taiwan's PV systems
is 0.74 and the availability is 95.7%. These results are similar to
those of Germany, Switzerland, Italy and Japan.
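The two reported metrics can be computed as follows. The performance ratio is final yield over reference yield (the IEC 61724 definition); the input numbers below are illustrative, not values from the Taiwan dataset.

```python
# Performance ratio (PR) = final yield / reference yield (IEC 61724).
# All input numbers are illustrative, not from the 202-system dataset.
def performance_ratio(energy_kwh, p_rated_kw, irradiation_kwh_m2, g_stc_kw_m2=1.0):
    y_f = energy_kwh / p_rated_kw              # final yield, kWh/kWp
    y_r = irradiation_kwh_m2 / g_stc_kw_m2     # reference yield, h (G_STC = 1 kW/m2)
    return y_f / y_r

def availability(uptime_hours, total_hours):
    # fraction of time the system was operational
    return uptime_hours / total_hours

pr = performance_ratio(energy_kwh=333.0, p_rated_kw=3.0, irradiation_kwh_m2=150.0)
av = availability(uptime_hours=8383.0, total_hours=8760.0)
```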
Abstract: The use of hard and brittle materials has become
increasingly extensive in recent years. Therefore, processing these
materials for parts fabrication has become a challenging
problem. It is time-consuming to machine hard, brittle materials
with the traditional metal-cutting technique that uses abrasive
wheels, and the tool suffers excessive wear as
well. However, if ultrasonic energy is applied to the machining
process and coupled with the use of hard abrasive grits, hard and
brittle materials can be effectively machined. Ultrasonic machining
process is mostly used for the brittle materials. The present research
work has developed models using finite element approach to predict
the mechanical stresses and strains produced in the tool during the
ultrasonic machining process. The flow behavior of abrasive
slurry coming out of the nozzle has been studied for simulation using
ANSYS CFX module. Abrasives of different grit sizes have been used
for the experimental work.
Abstract: In the present study, the anti-inflammatory and
antinociceptive effects of Vitex hydro-alcoholic extract were
evaluated in male mice. In the inflammation test, mice were divided into 7 groups:
the first group was the control. The second group, the positive control
group, received dexamethasone (15 mg/kg), and the other five groups
received different doses of hydroalcohol extract of Vitex fruit (265,
365, 465, 565, and 665 mg/kg). Inflammation was produced by
xylene-induced ear edema. The formalin test was used to evaluate the
antinociceptive effect of the extract. In this test, mice were divided into 7
groups: control, morphine (10 mg/kg) as the positive control group, and
Vitex extract groups (265, 365, 465, 565, and 665 mg/kg). All drugs
were administered intraperitoneally, 30 min before each test. The data
were analyzed using one-way ANOVA followed by the Tukey-Kramer
multiple comparison test. Results have shown significant
anti-inflammatory effects of the extract at all doses as compared with the control
(P
Abstract: This paper shows a simple and effective approach to
the design and implementation of Industrial Information Systems
(IIS) aimed at controlling the characteristics of each individual product manufactured in a production line, as well as its manufacturing conditions. The particular products considered in this work are large steel strips that are coiled just after manufacture. However, the approach is directly applicable to coiled strips in other industries, such as
paper, textile, aluminum, etc. These IIS provide very detailed information on each manufactured product, which complements the general information managed by the ERP system of the production line. Despite the high importance of this type of IIS in guaranteeing and improving the quality of the products manufactured in many industries, there are very few works about them in the technical literature. For this reason, this paper represents an important contribution to the development of this type of IIS, providing guidelines for their design, implementation and exploitation.
Abstract: Sickness absence represents a major economic and
social issue. Analysis of sick leave data is a recurrent challenge for analysts because of the complexity of the data structure, which is
often time-dependent, highly skewed and clumped at zero. Ignoring these features when making statistical inference is likely to be inefficient
and misguided. Traditional approaches do not address these problems. In this study, we discuss model methodologies in terms of statistical techniques for addressing the difficulties with sick leave data. We also introduce and demonstrate a new method by performing a longitudinal assessment of long-term absenteeism using
a large registration dataset as a working example available from the Helsinki Health Study for municipal employees from Finland during the period of 1990-1999. We present a comparative study on model
selection and a critical analysis of the temporal trends, the occurrence
and degree of long-term sickness absences among municipal employees. The strengths of this working example include the large
sample size over a long follow-up period, providing strong evidence in support of the new model. Our main goal is to propose a way to
select an appropriate model and to introduce a new methodology for analysing sickness absence data as well as to demonstrate model
applicability to complicated longitudinal data.
Abstract: Rule discovery is an important technique for mining knowledge from large databases. The use of objective measures for discovering interesting rules leads to another data mining problem, although of reduced complexity. Data mining researchers have studied subjective measures of interestingness to reduce the volume of discovered rules and ultimately improve the overall efficiency of the KDD process. In this paper we study the novelty of discovered rules as a subjective measure of interestingness. We propose a hybrid approach that uses objective and subjective measures to quantify the novelty of discovered rules in terms of their deviations from known rules. We analyze the types of deviation that can arise between two rules and categorize the discovered rules according to a user-specified threshold. We implement the proposed framework and experiment with some public datasets. The experimental results are quite promising.
Abstract: Color image quantization (CQ) is an important
problem in computer graphics and image processing. The aim of
quantization is to reduce colors in an image with minimum distortion.
Clustering is a widely used technique for color quantization; all
colors in an image are grouped into small clusters. In this paper, we
propose a new hybrid approach for color quantization using the firefly
algorithm (FA) and the K-means algorithm. The firefly algorithm is a
swarm-based algorithm that can be used for solving optimization problems.
The proposed method overcomes the drawbacks of both
algorithms, such as the local-optima convergence problem of K-means
and the premature convergence of the firefly algorithm. Experiments on
three commonly used images and the comparison results show that the
proposed algorithm surpasses both the baseline K-means
clustering and the original firefly algorithm.
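The K-means half of the hybrid can be sketched as follows; the firefly-based seeding that the paper combines with it is omitted here (a simple deterministic initialization is used instead), and the pixel data and K are illustrative.

```python
# K-means color quantization sketch (the K-means stage only; the firefly
# seeding from the paper is replaced by a deterministic initialization).
def kmeans(pixels, k, iters=20):
    # spread initial centers across the pixel list (illustrative choice)
    centers = [pixels[i * len(pixels) // k] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in pixels:
            # assign each pixel to the nearest center (squared RGB distance)
            i = min(range(k),
                    key=lambda c: sum((p[d] - centers[c][d]) ** 2 for d in range(3)))
            clusters[i].append(p)
        for c, members in enumerate(clusters):
            if members:   # recompute each center as its cluster mean
                centers[c] = tuple(sum(m[d] for m in members) / len(members)
                                   for d in range(3))
    return centers

# two well-separated synthetic color blobs -> centers near the blob means
pixels = ([(10 + i % 3, 10, 10) for i in range(50)]
          + [(200, 200 + i % 3, 200) for i in range(50)])
centers = sorted(kmeans(pixels, 2))
```

Quantizing an image then means replacing each pixel by its nearest center.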
Abstract: In this paper, we apply and compare two generalized estimating equation approaches to the analysis of car breakdown data in Mauritius. The number of breakdowns experienced by a machine is a highly under-dispersed count random variable, and its value can be attributed to factors related to the mechanical input and output of that machine. Analyzing such under-dispersed count observations as a function of explanatory factors has been a challenging problem. In this paper, we aim at estimating the effects of various factors on the number of breakdowns experienced by a passenger car, based on a study performed in Mauritius over a year. We remark that the number of passenger car breakdowns is highly under-dispersed. These data are therefore modelled and analyzed using a Com-Poisson regression model. We use two types of quasi-likelihood estimation approaches to estimate the parameters of the model: the marginal and joint generalized quasi-likelihood estimating equation approaches. The under-dispersion parameter is estimated to be around 2.14, justifying the appropriateness of the Com-Poisson distribution in modelling the under-dispersed count responses recorded in this study.
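The role of the dispersion parameter can be illustrated directly from the Com-Poisson pmf, P(Y = y) proportional to lambda^y / (y!)^nu: with nu > 1 (the abstract's estimate is about 2.14) the variance falls below the mean. The rate parameter below is an illustrative assumption.

```python
import math

# Conway-Maxwell-Poisson pmf sketch: P(Y=y) = lambda^y / ((y!)^nu * Z).
# nu > 1 yields under-dispersion (variance < mean); lambda is illustrative.
def com_poisson_pmf(lam, nu, support=50):
    weights = [lam ** y / math.factorial(y) ** nu for y in range(support)]
    z = sum(weights)                     # truncated normalizing constant
    return [w / z for w in weights]

pmf = com_poisson_pmf(lam=4.0, nu=2.14)
mean = sum(y * p for y, p in enumerate(pmf))
var = sum((y - mean) ** 2 * p for y, p in enumerate(pmf))
```

For nu = 1 the distribution reduces to the ordinary Poisson, where mean and variance coincide.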
Abstract: A wireless sensor network (WSN) consists of a set of battery-powered nodes, which collaborate to perform sensing tasks in a given environment. Each node in a WSN should be capable of acting for long periods of time with little or no external management. One requirement for this independence is that, in the presence of adverse conditions, the sensor nodes must be able to configure themselves. Hence, to determine the existence of unusual events in their surroundings, the nodes should make use of awareness mechanisms. This work approaches the problem by considering the possible unusual events as diseases, thus making it possible to diagnose them through their symptoms, namely, their side effects. Considering these awareness mechanisms as a foundation for high-level monitoring services, this paper also shows how these mechanisms are included in the preliminary design of an intrusion detection system.
Abstract: Vehicle suspension design must fulfill
some conflicting criteria. Among these is ride comfort,
which is attained by minimizing the acceleration
transmitted to the sprung mass via the suspension spring
and damper. Good handling of a vehicle is also a
desirable property; it requires a stiff suspension and
is therefore in conflict with good ride.
Among the other desirable features of a suspension is
the minimization of the maximum travel of the suspension.
This travel, called the suspension working space in the
vehicle dynamics literature, is also a design constraint
and it favors good ride. In this research, a full-car
8-degrees-of-freedom model has been developed, and the
three above-mentioned criteria, namely ride, handling
and working space, have been adopted as objective
functions. The Multi-Objective Programming (MOP)
discipline has been used to find the Pareto front, and
some reasoning is used to choose a design point among
these non-dominated points of the Pareto front.
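Extracting the Pareto front from a set of sampled designs reduces to a dominance check, sketched below. Each design is scored by three objectives to minimize (ride, handling, working space); the numeric values are illustrative, not outputs of the 8-DOF model.

```python
# Pareto front extraction sketch for minimization objectives.
def dominates(a, b):
    # a dominates b if it is no worse in every objective and strictly
    # better in at least one
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    # keep only the non-dominated points
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# illustrative (ride, handling, working-space) scores
designs = [(1.0, 3.0, 2.0), (2.0, 2.0, 2.0), (3.0, 1.0, 2.0), (3.0, 3.0, 3.0)]
front = pareto_front(designs)
```

A design point is then chosen among the members of `front` by additional reasoning (e.g. weighting the objectives), as the abstract describes.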
Abstract: The traditional Failure Mode and Effects Analysis
(FMEA) uses Risk Priority Number (RPN) to evaluate the risk level
of a component or process. The RPN index is determined by
calculating the product of severity, occurrence and detection indexes.
The most critically debated disadvantage of this approach is that
various sets of these three indexes may produce an identical value of
RPN. This research paper seeks to address the drawbacks in
traditional FMEA and to propose a new approach to overcome these
shortcomings. The Risk Priority Code (RPC) is used to prioritize
failure modes, when two or more failure modes have the same RPN.
A new method is proposed to prioritize failure modes, when there is a
disagreement in ranking scale for severity, occurrence and detection.
An Analysis of Variance (ANOVA) is used to compare means of
RPN values. SPSS (Statistical Package for the Social Sciences)
statistical analysis package is used to analyze the data. The results
presented are based on two case studies. It is found that the proposed
methodology resolves the limitations of the traditional FMEA
approach.
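The ambiguity that motivates the Risk Priority Code is easy to demonstrate: distinct (severity, occurrence, detection) triples can collapse to the same RPN. The index values below are illustrative.

```python
# RPN = severity * occurrence * detection (each typically rated 1-10).
# Different triples can yield an identical RPN, which is the ambiguity
# the paper's RPC is meant to break.
def rpn(severity, occurrence, detection):
    return severity * occurrence * detection

a = rpn(severity=9, occurrence=2, detection=5)
b = rpn(severity=3, occurrence=6, detection=5)
```

Here a severe but rare failure and a mild but frequent one receive the same priority, even though their risk profiles clearly differ.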
Abstract: Resins are used in nuclear power plants for water
ultrapurification. Two approaches are considered in this work:
column experiments and simulations. A software called OPTIPUR
was developed, tested and used. The approach simulates
one-dimensional reactive transport in a porous medium, with
convective-dispersive transport between particles and diffusive transport within
the boundary layer around the particles. The transfer limitation in the
boundary layer is characterized by the mass transfer coefficient
(MTC). The influences on the MTC were measured experimentally:
the inlet concentration does not influence the MTC, whereas the
Darcy velocity does. This is consistent
with results obtained using the correlation of Dwivedi & Upadhyay.
With the MTC, knowing the number of exchange sites and the relative
affinity, OPTIPUR can simulate the column outlet concentration
versus time. Then, the duration of use of resins can be predicted in
conditions of a binary exchange.
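The cited correlation can be sketched as follows, using a commonly quoted form of Dwivedi & Upadhyay for packed beds, eps*jD = 0.765/Re^0.82 + 0.365/Re^0.386 with jD = (k/u)*Sc^(2/3). The fluid and bed parameters below are illustrative assumptions, not OPTIPUR inputs.

```python
# Mass transfer coefficient from a commonly quoted form of the
# Dwivedi & Upadhyay packed-bed correlation (illustrative parameters).
def mtc_dwivedi_upadhyay(u, d_p, rho, mu, diff, eps):
    re = rho * u * d_p / mu                      # particle Reynolds number
    sc = mu / (rho * diff)                       # Schmidt number
    j_d = (0.765 / re ** 0.82 + 0.365 / re ** 0.386) / eps
    return j_d * u / sc ** (2.0 / 3.0)           # k in m/s

# doubling the Darcy velocity should raise the MTC, consistent with the
# experimental observation reported above
common = dict(d_p=5e-4, rho=1000.0, mu=1e-3, diff=1e-9, eps=0.4)
k_slow = mtc_dwivedi_upadhyay(u=1e-3, **common)
k_fast = mtc_dwivedi_upadhyay(u=2e-3, **common)
```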
Abstract: Software reliability prediction provides a great opportunity to measure the software failure rate at any point throughout system test. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software. In this article we focus on a software reliability model, assuming that there is time redundancy, the value of which (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the given mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be treated as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of an instruction sequence transmission, which consists of a random number of basic blocks. We consider the system software unreliable; the time between adjacent failures has an exponential distribution.
Abstract: The fatty acid composition of the lipid fractions of 16
microalgae strains, isolated from different basins of Kazakhstan
and characterized by stable active growth in the laboratory, was
analyzed. Three species of green microalgae (Oocystis
rhomboideus, Chlorococcum infusionum, Dictyochlorella globosa)
and three species of diatoms (Synedra sp., Nitzshia sp., Pleurosigma
attenuatum) are characterized by a high content of lipids and are
promising for further study as a source of polyunsaturated fatty acids.
Abstract: The RR interval series is non-stationary and unevenly
spaced in time. Estimating its power spectral density (PSD) using
traditional techniques like the FFT requires resampling at uniform
intervals. Researchers have used different interpolation
techniques as resampling methods, all of which
introduce a low-pass filtering effect in the power spectrum. The
Lomb transform is a means of obtaining PSD estimates directly from an
irregularly sampled RR interval series, thus avoiding resampling. In
this work, the superiority of the Lomb transform method over the
FFT-based approach, with linear and cubic-spline interpolation as
resampling methods, has been established in terms of the
reproduction of exact frequency locations as well as the relative
magnitudes of each spectral component.
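The classical Lomb periodogram evaluates the PSD directly on the uneven sample times, with no resampling step. The sketch below uses a synthetic unevenly sampled sine rather than an RR series; the frequency grid and signal are illustrative.

```python
import math
import random

# Classical Lomb periodogram for unevenly sampled data.
def lomb(t, y, freqs):
    n = len(y)
    ybar = sum(y) / n
    yc = [v - ybar for v in y]                    # mean-centered samples
    pxx = []
    for f in freqs:
        w = 2.0 * math.pi * f
        # time offset tau makes the cosine and sine terms orthogonal
        tau = math.atan2(sum(math.sin(2 * w, ) if False else math.sin(2 * w * ti) for ti in t),
                         sum(math.cos(2 * w * ti) for ti in t)) / (2 * w)
        c = [math.cos(w * (ti - tau)) for ti in t]
        s = [math.sin(w * (ti - tau)) for ti in t]
        cterm = sum(v * ci for v, ci in zip(yc, c)) ** 2 / sum(ci * ci for ci in c)
        sterm = sum(v * si for v, si in zip(yc, s)) ** 2 / sum(si * si for si in s)
        pxx.append(0.5 * (cterm + sterm))
    return pxx

# unevenly sampled 0.3 Hz sine over 100 s (illustrative test signal)
rng = random.Random(1)
t = sorted(rng.uniform(0.0, 100.0) for _ in range(300))
y = [math.sin(2 * math.pi * 0.3 * ti) for ti in t]
freqs = [0.05 * k for k in range(1, 20)]          # 0.05 .. 0.95 Hz grid
pxx = lomb(t, y, freqs)
peak = freqs[pxx.index(max(pxx))]                 # frequency of the spectral peak
```

The periodogram peaks at the true 0.3 Hz, with no interpolation-induced low-pass distortion of the higher-frequency bins.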