Abstract: The use of renewable energy sources is becoming increasingly necessary and attractive. The wider application of renewable energy devices at the domestic, commercial and industrial levels has not only raised awareness but also significantly increased installed capacity. In addition, biomass, principally in the form of wood, has been used as a source of energy by humans for a long time. Gasification is a process that converts a solid carbonaceous fuel into combustible gas by partial combustion. Gasifier models are run under various operating conditions, and the parameters kept in each model differ. This study applied experimental data with three inputs, namely biomass consumption, temperature at the combustion zone and ash discharge rate, and one output, the gas flow rate. In this paper, a neural network was used to identify a gasifier system model suited to the experimental data. The results show that a neural network can be used to attain this identification.
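The identification step described above can be sketched roughly as follows: a small feed-forward network mapping the three inputs (biomass consumption, combustion-zone temperature, ash discharge rate) to the single output (gas flow rate). The training data below are synthetic placeholders rather than the paper's measurements, and the network size and learning rate are arbitrary illustrative choices.

```python
# Minimal sketch of neural-network system identification: one hidden
# tanh layer trained by gradient descent on a synthetic input-output map.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 200 samples, 3 inputs, 1 output (gas flow rate).
X = rng.uniform(0.0, 1.0, size=(200, 3))
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2]).reshape(-1, 1)

W1 = rng.normal(0, 0.5, size=(3, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0, 0.5, size=(8, 1)); b2 = np.zeros((1, 1))
lr = 0.1
for _ in range(5000):
    H = np.tanh(X @ W1 + b1)           # hidden activations
    out = H @ W2 + b2                  # linear output layer
    err = out - y
    # Backpropagate the mean-squared-error gradient.
    gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0, keepdims=True)
    dH = (err @ W2.T) * (1 - H ** 2)
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0, keepdims=True)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"training MSE: {mse:.6f}")
```

In practice the measured gasifier data would replace the synthetic `X` and `y` above.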
Abstract: Assessing bioequivalence is a one-sided hypothesis-testing process. Bootstrap and modified large-sample (MLS) methods are considered for studying individual bioequivalence (IBE); the type I error and power of the hypothesis tests are simulated and compared with the FDA (2001) method. The results show that the modified large-sample method is equivalent to the method of FDA (2001).
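As a rough illustration of the bootstrap side of such a comparison, the sketch below runs a one-sided bootstrap test of whether an equivalence criterion stays below a limit. The data, the criterion function, and the limit are invented placeholders and do not reproduce the FDA (2001) IBE criterion.

```python
# One-sided bootstrap sketch: resample the data, recompute a criterion,
# and conclude "equivalent" only if its upper confidence bound is below
# a regulatory limit. All quantities here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
# Placeholder "test minus reference" log-responses for 20 subjects.
d = rng.normal(loc=0.02, scale=0.10, size=20)

def criterion(x):
    # Illustrative aggregate criterion: squared mean difference plus variance.
    return np.mean(x) ** 2 + np.var(x, ddof=1)

limit = 0.05  # illustrative regulatory limit, not the FDA value
boot = np.array([criterion(rng.choice(d, size=len(d), replace=True))
                 for _ in range(2000)])
upper95 = float(np.quantile(boot, 0.95))  # one-sided 95% upper bound
print(f"upper bound {upper95:.4f}, equivalent: {upper95 < limit}")
```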
Abstract: In July 2012, an indoor/outdoor monitoring
programme was undertaken in two university sports facilities: a
fronton and a gymnasium. Comfort parameters (temperature, relative
humidity, CO and CO2) and total volatile organic compounds
(VOCs) were continuously monitored. Concentrations of NO2,
carbonyl compounds and individual VOCs were obtained. Low
volume samplers were used to collect particulate matter (PM10). The
minimum ventilation rates stipulated for acceptable indoor air quality
were observed in both sports facilities. It was found that cleaning
activities may have a large influence on the VOC levels. Acrolein
was one of the most abundant carbonyl compounds, showing
concentrations above the recommended limit. Formaldehyde was
detected at levels lower than those commonly reported for other
indoor environments. The PM10 concentrations obtained during the
occupancy periods ranged between 38 and 43 μg m-3 in the fronton and
from 154 to 198 μg m-3 in the gymnasium.
Abstract: The precision of heat flux simulation influences the temperature field and test error in thermal balance (TB) tests, and also reflects the test capability for spacecraft development. This paper describes TB tests of a small satellite using a solar simulator, electric heaters and calrod heaters to evaluate the differences among the three methods. Under the same boundary conditions, the calrod heater cases were about 6 °C higher than the solar simulator and electric heater cases for the non-external-heat-flux cases (extreme low-temperature cases), while the calrod heater and electric heater cases were 5-7 °C and 2-3 °C lower than the solar simulator cases, respectively, for the high-temperature cases. The results show that the solar simulator is better than calrod heaters owing to its better collimation, homogeneity and stability.
Abstract: The POD-assisted projective integration method based on the equation-free framework is presented in this paper. The method essentially relies on the slow manifold governing the given system. We have applied two variants, the "on-line" and "off-line" methods, to solve the one-dimensional viscous Burgers' equation. For the on-line method, we compute the slow manifold by extracting the POD modes and use them on the fly along the projective integration process, without assuming knowledge of the underlying slow manifold. In contrast, for the off-line method the underlying slow manifold must be computed prior to the projective integration process. The projective step is performed by the forward Euler method. Numerical experiments show that for a non-periodic system the on-line method is more efficient than the off-line method. Moreover, the on-line approach is more realistic when applying the POD-assisted projective integration method to arbitrary systems. The critical value of the projective time step, which directly limits the efficiency of both methods, is also shown.
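The two on-line ingredients, POD extraction from snapshots and a projective forward-Euler step, can be sketched roughly as below. A stiff linear toy system stands in for the viscous Burgers' equation, and the inner integrator, mode count, and projective step size are arbitrary illustrative choices.

```python
# Sketch of on-line POD-assisted projective integration: POD modes are
# extracted on the fly from inner-integration snapshots via SVD, then a
# large forward-Euler step is taken in the reduced coordinates.
import numpy as np

n, k = 50, 3                             # state dimension, POD modes kept
A = -np.diag(np.arange(1.0, n + 1.0))    # stiff linear toy system du/dt = A u
u = np.ones(n)
dt = 0.005

# Inner integration: a few small explicit Euler steps, storing snapshots.
snaps = []
for _ in range(10):
    u = u + dt * (A @ u)
    snaps.append(u.copy())
S = np.array(snaps).T                    # snapshot matrix, one column each

# POD modes on the fly: leading left singular vectors of the snapshots.
U, _, _ = np.linalg.svd(S, full_matrices=False)
Phi = U[:, :k]                           # assumed basis for the slow manifold

# Projective step: estimate the slow derivative in POD coordinates from
# the last two snapshots and extrapolate with one large Euler step.
a_prev, a_last = Phi.T @ S[:, -2], Phi.T @ S[:, -1]
da = (a_last - a_prev) / dt
Dt = 20 * dt                             # projective time step
a_proj = a_last + Dt * da
u_proj = Phi @ a_proj                    # lift back to the full state
print(f"projected state norm: {np.linalg.norm(u_proj):.4f}")
```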
Abstract: Current image-based individual human recognition
methods, such as fingerprints, face, or iris biometric modalities
generally require a cooperative subject, views from certain aspects,
and physical contact or close proximity. These methods cannot
reliably recognize non-cooperating individuals at a distance in the
real world under changing environmental conditions. Gait, which
concerns recognizing individuals by the way they walk, is a relatively
new biometric without these disadvantages. The inherent gait
characteristic of an individual makes it irreplaceable and useful in
visual surveillance.
In this paper, an efficient gait recognition system for human identification is proposed, based on two extracted features: the width vector of the binary silhouette and the MPEG-7 region-based shape descriptors. In the proposed method, foreground objects, i.e., humans and other moving objects, are extracted by estimating background information with a Gaussian Mixture Model (GMM); subsequently, a median filtering operation is performed to remove noise from the background-subtracted image. A moving-target classification algorithm based on shape and boundary information is used to separate human beings (i.e., pedestrians) from other foreground objects (e.g., vehicles). Subsequently, the width vector of the outer contour of the binary silhouette and the MPEG-7 Angular Radial Transform coefficients are taken as the feature vector. Next, Principal Component Analysis (PCA) is applied to the selected feature vector to reduce its dimensionality. The extracted feature vectors are used to train a Hidden Markov Model (HMM) for the identification of individuals. The proposed system is evaluated on gait sequences, and the experimental results show the efficacy of the proposed algorithm.
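The dimensionality-reduction step can be illustrated in isolation: PCA applied to per-frame width-vector features before HMM training. The 64-element width vectors below are random placeholders rather than real silhouette data, and the choice of 8 retained components is arbitrary.

```python
# PCA sketch: project mean-centred per-frame feature vectors onto the
# leading eigenvectors of their covariance matrix.
import numpy as np

rng = np.random.default_rng(1)

# 100 frames, each a 64-row silhouette width vector (placeholder data
# with one dominant direction of variation added for realism).
frames = rng.normal(size=(100, 64))
frames += 5.0 * np.outer(rng.normal(size=100), rng.normal(size=64))

mean = frames.mean(axis=0)
centred = frames - mean
cov = centred.T @ centred / (len(frames) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)       # ascending eigenvalue order
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order[:8]]           # keep the top 8 components

reduced = centred @ components               # 100 x 8 reduced feature vectors
print(reduced.shape)
```

The `reduced` vectors would then be the per-frame observations fed to the HMM.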
Abstract: To evaluate the ability to predict xerostomia after
radiotherapy, we constructed and compared neural network and
logistic regression models. In this study, 61 patients who completed a
questionnaire about their quality of life (QoL) before and after a full
course of radiation therapy were included. Based on this questionnaire, statistical data about the condition of the patients' salivary glands were obtained, and these variables were used as inputs to the neural network and logistic regression models in order to predict the probability of xerostomia. Seven variables were then selected from the statistical data according to Cramer's V and point-biserial correlation values and were used to train each model; the respective outputs were 0.88 and 0.89 for AUC, 9.20 and 7.65 for SSE, and 13.7% and 19.0% for MAPE. These
parameters demonstrate that both neural network and logistic
regression methods are effective for predicting conditions of parotid
glands.
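The AUC values quoted above can be computed, for any such model, from its predicted probabilities via the Mann-Whitney formulation: the AUC is the fraction of positive-negative pairs that the model ranks correctly. The labels and scores below are illustrative, not the patient data.

```python
# AUC sketch: probability that a random positive case gets a higher
# predicted xerostomia score than a random negative case.
import numpy as np

labels = np.array([1, 1, 1, 0, 0, 0, 0, 1])                 # 1 = xerostomia
scores = np.array([0.9, 0.8, 0.35, 0.4, 0.2, 0.1, 0.45, 0.7])

pos = scores[labels == 1]
neg = scores[labels == 0]
# Pairwise comparison; ties count one half (Mann-Whitney U formulation).
wins = ((pos[:, None] > neg[None, :]).sum()
        + 0.5 * (pos[:, None] == neg[None, :]).sum())
auc = wins / (len(pos) * len(neg))
print(f"AUC = {auc:.3f}")   # 14 of 16 pairs correct -> 0.875
```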
Abstract: Characteristics and sonocatalytic activity of zeolite
Y catalysts loaded with TiO2 using impregnation and ion exchange
methods for the degradation of amaranth dye were investigated.
The ion-exchange method was used to encapsulate the TiO2 into
the internal pores of the zeolite while the incorporation of TiO2
mostly on the external surface of zeolite was carried out using the
impregnation method. Different characterization techniques were
used to elucidate the physicochemical properties of the produced
catalysts. The framework of zeolite Y remained virtually
unchanged after the encapsulation of TiO2 while the crystallinity of
zeolite decreased significantly after the incorporation of 15 wt% of
TiO2. The sonocatalytic activity was enhanced by TiO2 incorporation, with maximum degradation efficiencies of 50% and 68% for the encapsulated titanium and the titanium loaded onto the zeolite, respectively, after 120 min of reaction. Catalyst characteristics and sonocatalytic behavior were significantly affected by the preparation method and the location of the TiO2 introduced into the zeolite structure. The sonocatalytic behavior was successfully correlated with the characteristics of the catalysts used.
Abstract: Surface flattening plays a vital role in the field of computer-aided design and manufacture. Surface flattening enables the production of 2D patterns and can be used in design and manufacturing for developing a 3D surface onto a 2D platform, especially in fashion design. This study describes surface flattening based on minimum-energy methods according to the properties of different fabrics. Firstly, using the geometric features of a 3D surface, the less deformed area can be flattened onto a 2D platform by geodesics. Then, the strain energy accumulated in the mesh can be stably released by an approximate implicit method and a revised error function. In some cases, cutting the mesh to further release the energy is a common way to handle the situation and enhance the accuracy of the surface flattening, although this makes the obtained 2D pattern naturally develop significant cracks. When this methodology is applied to a 3D mannequin constructed with feature lines, it enhances the level of computer-aided fashion design. Moreover, when different fabrics are used in fashion design, it is necessary to revise the shape of the 2D pattern according to the properties of the fabric. With this model, the outline of 2D patterns can be revised by distributing the strain energy, with different results according to different fabric properties. Finally, this research uses some common design cases to illustrate and verify the feasibility of this methodology.
Abstract: Many computational techniques have been applied to the solution of the heat conduction problem, among them the finite difference (FD), finite element (FE) and, more recently, meshless methods. FE is commonly used to solve the heat conduction equation, based on summing the stiffness matrices of the elements and solving the final system of equations. Because of this summation process, the convergence rate is decreased. Hence, in the present paper a Cellular Automata (CA) approach is presented for the solution of the heat conduction problem. Each cell is considered as a fixed point in a regular grid, so that the solution of one large system of equations is replaced by discrete systems of equations of small dimension. Results show that CA can be used for the solution of the heat conduction problem.
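The CA idea can be sketched as a local update rule on a 1-D grid: each interior cell repeatedly replaces its temperature using only its two neighbours, which for this particular rule coincides with an explicit finite-difference step. Grid size, boundary temperatures, and the update weight are illustrative choices, not the paper's setup.

```python
# Cellular-automata-style heat conduction: iterate a local neighbour
# rule on a regular grid until it reaches the conductive steady state.
import numpy as np

n = 21
T = np.zeros(n)
T[0], T[-1] = 100.0, 0.0     # fixed boundary temperatures
alpha = 0.25                  # local update weight (stability needs <= 0.5)

for _ in range(5000):         # iterate the local rule to steady state
    # Simultaneous (Jacobi-style) update of all interior cells from
    # their left and right neighbours only.
    T[1:-1] = T[1:-1] + alpha * (T[2:] - 2 * T[1:-1] + T[:-2])

# The steady state of 1-D conduction with fixed ends is a linear profile,
# so the midpoint converges to 50 degrees.
print(T[n // 2])
```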
Abstract: This paper deals with the conceptual design of the
new aeroelastic demonstrator for the whirl flutter simulation. The
paper gives a theoretical background of the whirl flutter phenomenon
and describes the events of the whirl flutter occurrence in the
aerospace practice. The second part focuses on the experimental research of whirl flutter on aeroelastically similar models. Finally, the concept of the new aeroelastic demonstrator is described. The
demonstrator represents the wing and engine of the twin turboprop
commuter aircraft including a driven propeller. It allows the changes
of the main structural parameters influencing the whirl flutter
stability characteristics. It is intended for the experimental
investigation of the whirl flutter in the wind tunnel. The results will
be utilized for validation of analytical methods and software tools.
Abstract: It is known that if harmonic spectra are decreased, acoustic noise also decreases. Hence, this paper deals with a new random switching strategy using a DSP TMS320F2812 to decrease the harmonic spectra of a single-phase switched reluctance motor. The proposed method, which combines a random turn-on/turn-off angle technique with a random pulse-width modulation technique, is presented. A
harmonic spread factor (HSF) is used to evaluate the random
modulation scheme. Experimental results confirm the effectiveness of the new method: the harmonic intensity of the output voltage is lower for the proposed method than for conventional methods.
Abstract: A mobile agent is software that performs actions autonomously and independently on behalf of a person or an organization. Mobile agents are used for information searching, information retrieval, filtering, intruder recognition in networks, and so on. One of the important issues concerning mobile agents is their security; different security issues must be considered for the effective and secure usage of mobile agents. One of those issues is the protection of mobile agents' integrity.
In this paper, after reviewing the existing methods, the advantages and disadvantages of each are examined. Given that each method has its own advantages and disadvantages, it appears that by combining these methods one can arrive at a better method for protecting the integrity of mobile agents. Therefore, such a method is proposed in this paper and then evaluated against the existing methods. Finally, the method is simulated, and its results indicate an improved possibility of protecting mobile agents' integrity.
Abstract: Biometric techniques are gaining importance for
personal authentication and identification as compared to the
traditional authentication methods. Biometric templates are
vulnerable to a variety of attacks due to their inherent nature. When a person's biometric is compromised, his identity is lost. In contrast to a password, a biometric is not revocable. Therefore, providing security for the stored biometric template is crucial. Crypto-biometric systems are authentication systems that blend the ideas of cryptography and biometrics. The fuzzy vault is a proven crypto-biometric construct used to secure biometric templates. However, the fuzzy vault suffers from certain limitations, such as non-revocability and cross-matching, and its security is affected by the non-uniform nature of biometric data. When hardened with a password, the fuzzy vault overcomes these limitations: the password provides an additional layer of security and enhances user privacy.
Retina has certain advantages over other biometric traits. Retinal
scans are used in high-end security applications like access control to
areas or rooms in military installations, power plants, and other high
risk security areas. This work applies the idea of fuzzy vault for
retinal biometric templates. Multimodal biometric systems perform better than unimodal biometric systems. The proposed multimodal biometric fuzzy vault combines feature points from the retina and the fingerprint. The combined vault is
hardened with a user password for achieving a high level of security.
The security of the combined vault is measured using min-entropy.
The proposed password-hardened multi-biometric fuzzy vault is
robust against attacks on stored biometric templates.
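The min-entropy measure mentioned above is straightforward to state: for a discrete distribution it is H_min = -log2(max_i p_i), the number of bits of uncertainty faced by an attacker's single best guess. The guessing distribution below is a made-up example, not the vault analysis from the paper.

```python
# Min-entropy sketch: H_min = -log2 of the most probable outcome,
# applied to an illustrative attacker guessing distribution over
# candidate vault unlockings (assumed values, not real vault data).
import math

p = [0.25, 0.25, 0.125, 0.125, 0.125, 0.125]   # must sum to 1
h_min = -math.log2(max(p))
print(f"min-entropy = {h_min} bits")            # -log2(0.25) = 2.0 bits
```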
Abstract: In this work, we improve a previously developed
segmentation scheme aimed at extracting edge information from
speckled images using a maximum likelihood edge detector. The
scheme was based on finding a threshold for the probability density
function of a new kernel defined as the arithmetic mean-to-geometric
mean ratio field over a circular neighborhood set and, in a general
context, is founded on a likelihood random field model (LRFM). The
segmentation algorithm was applied to discriminated speckle areas
obtained using simple elliptic discriminant functions based on
measures of the signal-to-noise ratio with fractional order moments.
A rigorous stochastic analysis was used to derive an exact expression
for the cumulative distribution function of the random field. Based on this, an accurate probability
of error was derived and the performance of the scheme was
analysed. The improved segmentation scheme performed well for
both simulated and real images and showed superior results to those
previously obtained using the original LRFM scheme and standard
edge detection methods. In particular, the false alarm probability was
markedly lower than that of the original LRFM method with
oversegmentation artifacts virtually eliminated. The importance of
this work lies in the development of a stochastic-based segmentation,
allowing an accurate quantification of the probability of false
detection. Non-visual quantification of misclassification in speckled
medical ultrasound images is relatively new and of interest to
clinicians.
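The mean-ratio kernel at the heart of the scheme can be sketched directly: the arithmetic-to-geometric mean ratio evaluated over a sliding neighbourhood, which by the AM-GM inequality is at least 1 and grows with local inhomogeneity. A square window is used here instead of the paper's circular neighbourhood set, and the speckle image is synthetic.

```python
# AM/GM ratio field sketch over a synthetic speckle-like image.
import numpy as np

rng = np.random.default_rng(7)
img = rng.exponential(scale=1.0, size=(32, 32)) + 1e-6  # positive intensities

w = 5                        # window size (assumed; paper uses a circular set)
r = w // 2
ratio = np.ones_like(img)    # border cells left at the neutral value 1
for i in range(r, img.shape[0] - r):
    for j in range(r, img.shape[1] - r):
        patch = img[i - r:i + r + 1, j - r:j + r + 1]
        am = patch.mean()                      # arithmetic mean
        gm = np.exp(np.log(patch).mean())      # geometric mean
        ratio[i, j] = am / gm                  # >= 1 by the AM-GM inequality

print(f"max AM/GM ratio: {ratio.max():.3f}")
```

Thresholding this `ratio` field is the step the paper's likelihood analysis makes rigorous.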
Abstract: The coalescer process is one of the methods for oily-water treatment: it increases the oil droplet size in order to enhance the separation velocity and thus achieve effective separation. However, the presence of surfactants in an oily emulsion can limit these mechanisms because of the small oil droplet sizes associated with a stabilized emulsion. The purpose of this research is therefore to improve the efficiency of the coalescer process for treating stabilized emulsions. The effects of bed type, bed height, liquid flow rate and staged coalescer (step-bed) operation on the treatment efficiencies in terms of COD values were studied. Note that the treatment efficiency was estimated experimentally from the COD values and the oil droplet size distribution. The study has shown that the plastic media attach to oil particles more effectively than the stainless ones because of their hydrophobic properties. Furthermore, a suitable bed height (3.5 cm) and step bed (3.5 cm with 2 steps) were necessary in order to obtain good coalescer performance. The application of the step-bed coalescer process in the reactor provided higher treatment efficiencies in terms of COD removal than those obtained with the classical process. The proposed model for predicting the area under the curve, and thus the treatment efficiency, based on the single-collector efficiency (ηT) and the attachment efficiency (α), shows relatively good agreement between the experimental and predicted treatment efficiencies in this study.
Abstract: This paper describes a simple implementation of a homotopy (also called continuation) algorithm for determining the proper resistance of a resistor to dissipate energy at a specified rate in an electric circuit. The homotopy algorithm can be considered a development of classical methods in numerical computing, such as the Newton-Raphson and fixed-point methods. In homotopy methods, an embedding parameter is used to control the convergence. The method proposed in this work utilizes a special homotopy called the Newton homotopy. A numerical example solved in MATLAB is given to show the effectiveness of the proposed method.
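A minimal sketch of a Newton homotopy, under the standard construction H(x, t) = f(x) - (1 - t) f(x0): the starting point x0 solves H = 0 at t = 0, and tracking the solution as the embedding parameter t moves to 1 yields a root of f. The scalar test function below is illustrative, not the paper's circuit equation, and Python stands in for the paper's MATLAB.

```python
# Newton homotopy sketch: march the embedding parameter t from 0 to 1,
# applying a few Newton corrections to H(x, t) = f(x) - (1 - t) f(x0)
# at each step. Note dH/dx = f'(x), so the original Jacobian is reused.

def f(x):
    return x ** 3 - 2 * x - 5     # classic test equation, root near 2.0946

def fprime(x):
    return 3 * x ** 2 - 2

x0 = 1.0                           # arbitrary starting guess
fx0 = f(x0)
x = x0
steps = 50
for k in range(1, steps + 1):
    t = k / steps
    for _ in range(5):             # Newton corrections at this t
        x = x - (f(x) - (1 - t) * fx0) / fprime(x)

print(f"root: {x:.6f}, residual: {abs(f(x)):.2e}")
```

At t = 1 the homotopy residual (1 - t) f(x0) vanishes, so `x` is a root of the original `f`.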
Abstract: Web usage mining has become a popular research
area, as a huge amount of data is available online. These data can be
used for several purposes, such as web personalization, web structure
enhancement, web navigation prediction etc. However, the raw log
files are not directly usable; they have to be preprocessed in order to
transform them into a suitable format for different data mining tasks.
One of the key issues in the preprocessing phase is to identify web
users. Identifying users based on web log files is not a
straightforward problem, thus various methods have been developed.
There are several difficulties that have to be overcome, such as client
side caching, changing and shared IP addresses and so on. This paper
presents three different methods for identifying web users. Two of
them are the most commonly used methods in web log mining
systems, whereas the third one is our novel approach that uses a
complex cookie-based method to identify web users. Furthermore, we
also take steps towards identifying the individuals behind the
impersonal web users. To demonstrate the efficiency of the new
method we developed an implementation called Web Activity
Tracking (WAT) system that aims at a more precise distinction of
web users based on log data. We present some statistical analysis
created by the WAT on real data about the behavior of the Hungarian
web users and a comprehensive analysis and comparison of the three
methods.
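One of the two commonly used identification heuristics mentioned above can be sketched in a few lines: treating each distinct (IP address, user agent) pair in the log as a separate web user, which partially compensates for shared IP addresses. The log records below are invented examples; the cookie-based WAT method itself is not reproduced here.

```python
# Web user identification sketch: group log entries by the
# (IP address, user agent) pair and treat each group as one user.
log = [
    ("10.0.0.1", "Mozilla/5.0 (Windows)", "/index.html"),
    ("10.0.0.1", "Mozilla/5.0 (Windows)", "/news.html"),
    ("10.0.0.1", "Opera/9.80 (Linux)",    "/index.html"),  # same IP, new agent
    ("10.0.0.2", "Mozilla/5.0 (Windows)", "/index.html"),
]

users = {}
for ip, agent, path in log:
    users.setdefault((ip, agent), []).append(path)

print(f"identified {len(users)} users")   # 3 users despite only 2 IPs
for key, pages in users.items():
    print(key, pages)
```

Client-side caching and proxies still confuse this heuristic, which is what motivates the cookie-based approach.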
Abstract: Parsing is important in Linguistics and Natural
Language Processing to understand the syntax and semantics of a
natural language grammar. Parsing natural language text is
challenging because of the problems like ambiguity and inefficiency.
Also the interpretation of natural language text depends on context
based techniques. A probabilistic component is essential to resolve
ambiguity in both syntax and semantics thereby increasing accuracy
and efficiency of the parser. Tamil language has some inherent
features which are more challenging. In order to obtain the solutions,
lexicalized and statistical approach is to be applied in the parsing
with the aid of a language model. Statistical models mainly focus on the semantics of the language and are suitable for large-vocabulary tasks, whereas structural methods focus on syntax and model small-vocabulary tasks. A statistical language model based on trigrams has been built for the Tamil language with a medium vocabulary of 5000 words. Though statistical parsing gives better performance through trigram probabilities and a large vocabulary size, it has some disadvantages, such as a focus on semantics rather than syntax and a lack of support for the free ordering of words and long-term relationships. To overcome these disadvantages, a structural component is to be incorporated into statistical language models, which leads to the implementation of hybrid language models. This paper attempts to build a phrase-structured hybrid language model that resolves the above-mentioned disadvantages. In the development of the hybrid language model, a new part-of-speech tagset for the Tamil language has been developed, with more than 500 tags giving wide coverage. A phrase-structured Treebank has been developed with 326 Tamil sentences covering more than 5000 words. A hybrid language model has been trained on the phrase-structured Treebank using the immediate-head parsing technique. A lexicalized and statistical parser employing this hybrid language model and the immediate-head parsing technique gives better results than pure grammar and trigram-based models.
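The statistical component can be illustrated with a toy trigram model: maximum-likelihood counts with add-one smoothing. The English toy corpus below is a stand-in, not the paper's Tamil vocabulary or Treebank, and the paper's actual smoothing scheme is not stated in the abstract.

```python
# Trigram language model sketch: smoothed conditional probabilities
# P(w3 | w1, w2) estimated from n-gram counts over a toy corpus.
from collections import Counter

corpus = "the cat sat on the mat the cat ran".split()
vocab = set(corpus)

trigrams = Counter(zip(corpus, corpus[1:], corpus[2:]))
bigrams = Counter(zip(corpus, corpus[1:]))

def p_trigram(w1, w2, w3):
    # Add-one (Laplace) smoothing over the vocabulary.
    return (trigrams[(w1, w2, w3)] + 1) / (bigrams[(w1, w2)] + len(vocab))

# (the, cat) occurs twice, (the, cat, sat) once, |V| = 6:
# P = (1 + 1) / (2 + 6) = 0.25
print(f"P(sat | the, cat) = {p_trigram('the', 'cat', 'sat'):.3f}")
```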
Abstract: In this study, workplace environmental monitoring
systems were established using USN (Ubiquitous Sensor Networks) and LabVIEW. Although existing direct sampling methods yield accurate values at the time points of measurement, they are disadvantageous in that continuous management and supervision are difficult and costs are high. Therefore, the efficiency and reliability of workplace management by supervisors are relatively low when those methods are used. In this study, systems were established so that information on workplace environmental factors such as temperature, humidity and noise is measured and transmitted to a PC in real time, enabling supervisors to monitor workplaces through LabVIEW on the PC.
When an accident occurs in a workplace, supervisors can respond
immediately through the monitoring system, and this system
enables integrated workplace management and the prevention of
safety accidents. By introducing these monitoring systems, safety
accidents due to harmful environmental factors in workplaces can be
prevented and these monitoring systems will be also helpful in finding
out the correlation between safety accidents and occupational diseases
by comparing and linking databases established by this monitoring
system with existing statistical data.