Abstract: A new combinatorial model for analyzing and interpreting an electrocardiogram (ECG) is presented. One application of the model is QRS peak detection, demonstrated here with an online algorithm that is shown to be both space- and time-efficient. Experimental results on the MIT-BIH Arrhythmia database show that this novel approach is promising. Further uses of the approach are discussed, such as exploiting its small memory requirements and interpreting large amounts of pre-recorded ECG data.
Abstract: Fishbone of Nile Tilapia (Tilapia nilotica), a waste product of frozen Nile Tilapia fillet factories, is a potential calcium source. To increase the value of fish bone powder, this study investigated the effect of Tilapia bone flour (TBF) addition (5, 10, and 15% by flour weight) on the cooking quality, texture, and sensory attributes of noodles. The results indicated that the tensile strength, color value (a*), and water absorption of the noodles significantly decreased (p ≤ 0.05) as the level of TBF increased from 0 to 15%, while cooking loss, cooking time, and the color values L* and b* significantly increased (p ≤ 0.05). Sensory evaluation indicated that noodles with 5% TBF received the highest overall acceptability score.
Abstract: Aspheric optical components are an alternative to conventional lenses in imaging systems for the visible range. Spherical lenses introduce aberrations and are therefore unable to focus all the light into a single point, whereas aspherical lenses correct these aberrations and provide better resolution even in compact systems incorporating a small number of lenses.
Metrology of these components is very difficult, especially as resolution requirements increase, and the inadequacy or complexity of conventional tools requires the development of specific characterization approaches.
This work addresses that problem: its objectives are the study and comparison of different methods used to measure the surface radii of hybrid aspherical lenses.
Abstract: In this paper, the development of software to encrypt messages with asymmetric cryptography is presented. In particular, the RSA (Rivest, Shamir, and Adleman) algorithm is used to encrypt alphanumeric information. The software allows the user to generate different public keys from two prime numbers provided by the user; the user must then select a public key to generate the corresponding private key. To encrypt information, the user must provide the public key of the recipient as well as the message to be encrypted. The generated ciphertext can be sent through an insecure channel, since it would be very difficult for an intruder or attacker to interpret. At the other end of the communication, the recipient can decrypt the original message by providing his/her public key and the corresponding private key.
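As background, the textbook RSA flow described above (key generation from two user-supplied primes, encryption with the recipient's public key, decryption with the matching private key) can be sketched as follows. This is an illustrative toy, not the paper's software; per-character encryption with small primes, as shown here for clarity, is insecure in practice.

```python
# Minimal RSA sketch: keys from two primes, encrypt/decrypt per character.
from math import gcd

def make_keys(p, q, e=None):
    n, phi = p * q, (p - 1) * (q - 1)
    if e is None:  # pick the smallest valid odd public exponent
        e = next(k for k in range(3, phi, 2) if gcd(k, phi) == 1)
    d = pow(e, -1, phi)  # modular inverse of e mod phi (Python 3.8+)
    return (n, e), (n, d)  # public key, private key

def encrypt(msg, pub):
    n, e = pub
    # Per-character encryption: for illustration only, insecure in practice.
    return [pow(ord(ch), e, n) for ch in msg]

def decrypt(cipher, priv):
    n, d = priv
    return "".join(chr(pow(c, d, n)) for c in cipher)
```

For example, with the primes 61 and 53 the modulus is n = 3233, and any ASCII message round-trips through `encrypt` and `decrypt`.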
Abstract: The demand for hydrocarbons has increased both the construction of pipelines and the need to protect the physical and mechanical integrity of the existing infrastructure. Corrosion is the main cause of pipeline failures and is mostly produced by acid (HCOOCH3). On this basis, a CFD code was used to study the corrosion of the internal wall of a hydrocarbon pipeline. Under these conditions, the corrosion phenomenon shows a growing deposit, which causes defect damage (welding or fabrication) at diverse positions along the pipeline. The solution to pipeline corrosion is based on reducing the naphthenic acid content.
Abstract: The balancing numbers are natural numbers n satisfying the Diophantine equation 1 + 2 + 3 + · · · + (n − 1) = (n + 1) + (n + 2) + · · · + (n + r); r is the balancer corresponding to the balancing number n. The nth balancing number is denoted by B_n, and the sequence {B_n}_{n=1}^∞ satisfies the recurrence relation B_{n+1} = 6B_n − B_{n−1}. The balancing numbers possess some curious properties, some like those of the Fibonacci numbers and some others more interesting. This paper is a study of the recurrent sequence {x_n}_{n=1}^∞ satisfying the recurrence relation x_{n+1} = Ax_n − Bx_{n−1} and possessing some curious properties like the balancing numbers.
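To illustrate the definitions above (a quick sketch, not code from the paper), one can generate balancing numbers from the recurrence B_{n+1} = 6B_n − B_{n−1} and verify each against the Diophantine equation by solving for its balancer r. The seed B_1 = 1, B_2 = 6 is one common convention; the paper may index the sequence differently.

```python
def balancing_numbers(count):
    """First `count` terms of B_{n+1} = 6*B_n - B_{n-1}, seeded with
    B_1 = 1, B_2 = 6 (a common convention; some authors start at 6)."""
    seq = [1, 6]
    while len(seq) < count:
        seq.append(6 * seq[-1] - seq[-2])
    return seq[:count]

def balancer(n):
    """Return r with 1 + ... + (n-1) == (n+1) + ... + (n+r), else None."""
    left = n * (n - 1) // 2          # sum of 1 .. n-1
    r, right = 0, 0
    while right < left:
        r += 1
        right += n + r               # add the next term (n + r)
    return r if right == left else None
```

For instance, n = 6 balances with r = 2 (1+2+3+4+5 = 15 = 7+8), while a non-balancing number such as 5 yields no integer balancer.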
Abstract: In this paper, a novel corner detection method is
presented to stably extract geometrically important corners.
Intensity-based corner detectors such as the Harris detector can detect
corners in noisy environments but yield inaccurate corner positions and
miss corners with obtuse angles. Edge-based corner detectors such
as Curvature Scale Space can detect structural corners but show
unstable corner detection due to incomplete edge detection in noisy
environments. The proposed image-based direct curvature estimation
can overcome limitations in both inaccurate structural corner detection
of the Harris corner detector (intensity-based) and the unstable corner
detection of Curvature Scale Space caused by incomplete edge
detection. Various experimental results validate the robustness of the
proposed method.
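For context on the intensity-based baseline discussed above, a minimal Harris response computation looks like the following. This assumes the standard det(M) − k·trace²(M) formulation with a simple box window; it is not the paper's proposed direct curvature estimator.

```python
import numpy as np

def harris_response(img, win=3, k=0.04):
    """Harris corner response map: positive at corners, negative on edges,
    near zero in flat regions."""
    Iy, Ix = np.gradient(img.astype(float))   # central-difference gradients

    def box(a):                               # separable box (mean) filter
        kern = np.ones(win) / win
        a = np.apply_along_axis(np.convolve, 1, a, kern, 'same')
        return np.apply_along_axis(np.convolve, 0, a, kern, 'same')

    # Windowed second-moment matrix entries
    Sxx, Syy, Sxy = box(Ix * Ix), box(Iy * Iy), box(Ix * Iy)
    return (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2
```

On a synthetic white square, the response peaks at the square's corners, while straight edges score negatively, which is exactly the behavior (good noise robustness, but reliance on both gradient directions) that the abstract contrasts with edge-based detectors.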
Abstract: This study was necessitated by the extension agent (agricultural information disseminator) to farm-family ratio of 1:3500, which threatens agricultural production capacity in developing countries. From the 4 zones in the state, with 24 extension agents in each zone, 4 extension agents using cell phones, 120 farmers using cell phones, and 120 farmers not using cell phones were purposively selected, giving the 240 farmers that participated in the research. Data were collected using an interview guide and analyzed using frequencies, percentages, and t-tests. The frequency of contact with agricultural information centers revealed that farmers served by cell-phone-using agents had a greater mean contact score (x̄ = 41.43) than the low mean (x̄ = 19.32) recorded by farmers receiving agricultural information from extension agents not using cell phones, and the difference in their production was statistically significant at P < 0.05. Use of cell phones increases extension agent contact and increases farmers' production capacity.
Abstract: The effects of commercial or bovine yeasts on the performance and blood variables of broiler chickens intoxicated with aflatoxin were investigated. Four hundred eighty broilers (Arbor Acres; 3 wk old) were randomly assigned to 4 groups. Each group (120 broiler chickens) was further randomly divided into 6 replicates of 20 chickens. The treatments were a control diet without additives (treatment 1), 250 ppb AFB1 (treatment 2), commercial yeast, Saccharomyces cerevisiae (CY, 2.5 × 10⁷ CFU/g) + 250 ppb AFB1 (treatment 3), and bovine yeast, Saccharomyces cerevisiae (BY, 2.5 × 10⁷ CFU/g) + 250 ppb AFB1 (treatment 4). A completely randomized design (CRD) was used in the experiment. Feed consumption and body weight were recorded every five days. On day 42, carcass compositions were determined from 30 birds per treatment. When the chicks were sacrificed, a 3-4 ml blood sample was taken and stored frozen at -20°C for serum chemical analysis to determine the effects of the diets on blood chemistry (total protein, albumin, glucose, urea, cholesterol, and triglycerides). There were no significant differences in ADFI among the treatments (P > 0.05). However, BWG, FCR, and mortality were highly significantly different (P
Abstract: The experiment was performed to study the relationship between excreta viscosity and nitrogen-corrected true metabolisable energy quantities of soybean meals using the conventional addition method (CAM) in adult cockerels for 7 d: a 3-d pre-experiment period and a 4-d experiment period. Results indicated that the differences between the excreta viscosity values were (P
Abstract: We propose a new normalized LMS (NLMS) algorithm, which gives satisfactory performance in certain applications in comparison with the conventional NLMS recursion. This new algorithm can be treated as a block-based simplification of the NLMS algorithm with a significantly reduced number of multiply-and-accumulate as well as division operations. It is also shown that such a recursion can be easily implemented in block floating point (BFP) arithmetic, treating the implementational issues much more efficiently. In particular, the core challenges of a BFP realization of such adaptive filters are mainly considered in this regard. A global upper bound on the step size control parameter of the new algorithm due to the BFP implementation is also proposed to prevent overflow in the filtering as well as the weight updating operations jointly.
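For reference, the conventional NLMS recursion that the proposed block-based algorithm simplifies is sketched below. This is the standard textbook form, not the paper's new algorithm or its BFP realization; note the per-sample division by the input energy, which is one of the costs the paper aims to reduce.

```python
import numpy as np

def nlms(x, d, taps, mu=1.0, eps=1e-8):
    """Conventional NLMS adaptive filter: filter input x toward desired
    signal d, updating weights with an energy-normalized step."""
    w = np.zeros(taps)
    y = np.zeros(len(x))
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]      # [x[n], x[n-1], ..., x[n-taps+1]]
        y[n] = w @ u                         # filter output
        e = d[n] - y[n]                      # a priori error
        w += (mu / (eps + u @ u)) * e * u    # normalized weight update
    return w, y
```

As a sanity check, identifying a short FIR system from its noiseless input/output drives the weights to the true impulse response.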
Abstract: Data Envelopment Analysis (DEA) is one of the most widely used techniques for evaluating the relative efficiency of a set of homogeneous decision making units. Traditionally, it assumes that input and output variables are known in advance, ignoring the critical issue of data uncertainty. In this paper, we deal with the problem of efficiency evaluation under uncertain conditions by adopting the general framework of stochastic programming. We assume that output parameters are represented by discretely distributed random variables, and we propose two different models defined according to a risk-neutral and a risk-averse perspective. The models have been validated on a real case study concerning the evaluation of the technical efficiency of a sample of individual firms operating in the Italian leather manufacturing industry. Our findings show the validity of the proposed approach as an ex-ante evaluation technique, providing the decision maker with useful insights depending on his or her degree of risk aversion.
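As background to the deterministic setting that the paper extends, a standard input-oriented CCR efficiency score can be computed with one linear program per unit. This is a generic DEA sketch (using SciPy's `linprog`), not the paper's stochastic models.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0.
    X: (m inputs x n units), Y: (s outputs x n units).
    Solves: min theta  s.t.  sum_j lam_j x_ij <= theta * x_i,j0,
            sum_j lam_j y_rj >= y_r,j0,  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lam_1..lam_n]
    A_in = np.hstack([-X[:, [j0]], X])          # inputs of composite <= theta*x0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # outputs of composite >= y0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, j0]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return float(res.fun)                       # theta in (0, 1]; 1 = efficient
```

With a single input and output, the score reduces to each unit's output/input ratio divided by the best ratio, which makes small examples easy to verify by hand.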
Abstract: The paper focuses on the area of context modeling with respect to the specification of context-aware systems supporting ubiquitous applications. The proposed approach, followed within the SIMPLICITY IST project, uses a high-level system ontology to derive context models for system components, which are consequently mapped to the system's physical entities. For the definition of user- and device-related context models in particular, the paper suggests a standards-based process consisting of an analysis phase using the Common Information Model (CIM) methodology, followed by an implementation phase that defines 3GPP-based components. The benefits of this approach are further depicted by preliminary examples of XML grammars defining profiles and components, and component instances, coupled with descriptions of respective ubiquitous applications.
Abstract: This study was conducted to obtain the views of senior-class Elementary Education Mathematics preservice teachers on proving. Data were obtained via surveys and interviews carried out with 104 preservice teachers. According to the findings, although the preservice teachers have positive views about the use of proving in mathematics teaching, their experience of proving is limited to their courses, and they regard proving as work done only for exams. Furthermore, they expressed in the interviews that proving is difficult for them, and for this reason they prefer memorizing instead of learning.
Abstract: The purposes of this study are 1) to study the frequent English writing errors of students registered in the course Reading and Writing English for Academic Purposes II, and 2) to find out the results of writing error correction using coded indirect corrective feedback and writing error treatments. The sample comprised 28 second-year English major students of the Faculty of Education, Suan Sunandha Rajabhat University. The tool for the experimental study was the lesson plan of the course Reading and Writing English for Academic Purposes II, and the tool for data collection was 4 writing tests of short texts. The research findings disclose that the frequent English writing errors found in this course comprise 7 types of grammatical errors, namely sentence fragments, subject-verb agreement, wrong verb tense forms, singular or plural noun endings, run-on sentences, wrong verb pattern forms, and lack of parallel structure. Moreover, the results of writing error correction using coded indirect corrective feedback and error treatment reveal an overall reduction of the frequent English writing errors and an increase in students' achievement in the writing of short texts, significant at the .05 level.
Abstract: Document image processing has become an increasingly important technology in the automation of office documentation tasks. During document scanning, skew is inevitably introduced into the incoming document image. Since algorithms for layout analysis and character recognition are generally very sensitive to page skew, skew detection and correction in document images are critical steps before layout analysis. In this paper, a novel skew detection method is presented for binary document images. The method considers selected characters of the text, which are subjected to thinning and the Hough transform to estimate the skew angle accurately. Several experiments have been conducted on various types of documents, such as English documents, journals, textbooks, documents in different languages, documents with different fonts, and documents with different resolutions, to reveal the robustness of the proposed method. The experimental results reveal that the proposed method is accurate compared with well-known existing methods.
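A minimal Hough-style skew estimator in the spirit described above can be sketched as follows. This is an illustrative sketch assuming the usual ρ = x·cos θ + y·sin θ line parameterization; it omits the character selection and thinning steps that the paper's actual method uses.

```python
import numpy as np

def estimate_skew(xs, ys, max_skew=10.0, step=0.5, rho_bin=1.0):
    """Estimate page skew (degrees) from foreground pixel coordinates.
    For each candidate angle, project pixels onto the text-line normal
    (a Hough vote over rho = x*cos(theta) + y*sin(theta)); the angle whose
    rho histogram is most sharply peaked aligns with the text lines."""
    best_angle, best_score = 0.0, -1.0
    for skew in np.arange(-max_skew, max_skew + step, step):
        theta = np.deg2rad(90.0 + skew)        # normal direction of text lines
        rho = xs * np.cos(theta) + ys * np.sin(theta)
        edges = np.arange(rho.min(), rho.max() + rho_bin, rho_bin)
        counts, _ = np.histogram(rho, bins=edges)
        score = counts.var()                   # peaked histogram -> aligned
        if score > best_score:
            best_score, best_angle = score, skew
    return best_angle
```

On synthetic "text lines" of pixels tilted by a known angle, the estimator recovers that angle to within its search resolution.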
Abstract: The effect of moisture content and loading rate on the mechanical strength of 12 brown rice grain varieties was determined. The results showed that the rupture force of brown rice grain decreased with increasing moisture content and loading rate. The highest rupture force value was obtained at a moisture content of 8% (w.b.) and a loading rate of 10 mm/min, while the lowest rupture force corresponded to a moisture content of 14% (w.b.) and a loading rate of 15 mm/min. The 12 varieties were divided into three groups, namely local short grain varieties, local long grain varieties, and improved long grain varieties. It was observed that the rupture strengths of the three groups were statistically different from each other (P
Abstract: Pineapples can be classified using an index with seven levels of maturity based on the green and yellow color of the skin. As the pineapple ripens, the skin changes from pale green to a golden or yellowish color. A current issue in agriculture is that farmers are unable to distinguish between the indexes of pineapple maturity correctly and effectively. There are several reasons why farmers cannot properly follow the guidelines provided by the Federal Agriculture Marketing Authority (FAMA); one is that, because inspection is done manually by experts, there are no specific and universal guidelines for farmers to adopt, owing to the experts' differing points of view when sorting pineapples based on their knowledge and experience. Therefore, an automatic system would help farmers to identify pineapple maturity effectively and would serve as a universal indicator for farmers.
Abstract: In illumination-variant face recognition, existing methods that extract the face albedo as a light-normalized image may lose extensive facial details, since the light template is discarded. To improve on this, a novel approach for realistic facial texture reconstruction that combines the original image and the albedo image is proposed. First, light subspaces of different identities are established from the given reference face images; then, by projecting the original and albedo images into each light subspace respectively, texture reference images with the corresponding lighting are reconstructed and two texture subspaces are formed. According to the projections in the texture subspaces, facial texture with normal light can be synthesized. Because the original image is combined with the face albedo, facial details are preserved. In addition, image partitioning is applied to improve the synthesis performance. Experiments on the Yale B and CMU PIE databases demonstrate that this algorithm outperforms the others both in image representation and in face recognition.
Abstract: In many applications, it is a priori known that the
target function should satisfy certain constraints imposed by, for
example, economic theory or a human decision maker. Here we
consider partially monotone problems, where the target variable
depends monotonically on some of the predictor variables but not all.
We propose an approach to build partially monotone models based
on the convolution of monotone neural networks and kernel
functions. The results from simulations and a real case study on
house pricing show that our approach has significantly better
performance than partially monotone linear models. Furthermore, the
incorporation of partial monotonicity constraints not only leads to
models that are in accordance with the decision maker's expertise,
but also reduces considerably the model variance in comparison to
standard neural networks with weight decay.
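One common way to hard-wire partial monotonicity, in the spirit of the monotone networks mentioned above, is to constrain the weights attached to the monotone inputs to be nonnegative, for instance by exponentiating them, so the output is guaranteed nondecreasing in those inputs. This is a generic sketch, not the authors' exact construction combining monotone networks with kernel functions.

```python
import numpy as np

def partially_monotone_mlp(x_mono, x_free, params):
    """One-hidden-layer network, nondecreasing in every column of x_mono.
    Weights touching x_mono (and the output layer) pass through exp(), so
    they are strictly positive; tanh is increasing, hence the composition
    is monotone in x_mono. x_free enters with unconstrained weights."""
    W_mono, W_free, b1, w2, b2 = params
    h = np.tanh(x_mono @ np.exp(W_mono) + x_free @ W_free + b1)
    return h @ np.exp(w2) + b2
```

A quick numerical check with random parameters confirms the output never decreases as the monotone input increases, regardless of the unconstrained inputs.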