Abstract: In the present study, schwertmannite (an iron oxyhydroxide) was selected as an adsorbent for the defluoridation of water. The adsorbent was prepared by a wet chemical process and characterized by SEM, XRD and BET. The fluoride adsorption
efficiency of the prepared adsorbent was determined with respect to
contact time, initial fluoride concentration, adsorbent dose and pH of
the solution. The batch adsorption data revealed that the fluoride
adsorption efficiency was highly influenced by the studied factors.
Equilibrium was attained within one hour of contact time, indicating fast kinetics, and the adsorption data followed a pseudo-second-order kinetic model. Equilibrium isotherm data were fitted to both the Langmuir and Freundlich isotherm models over a concentration range of 5-30 mg/L. The adsorption system followed the Langmuir isotherm model with a maximum adsorption capacity of 11.3 mg/g. The high adsorption capacity of schwertmannite points towards the potential of this adsorbent for fluoride removal from aqueous media.
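A maximum capacity such as the reported 11.3 mg/g is commonly obtained from the linearised Langmuir form Ce/qe = Ce/qmax + 1/(KL·qmax). A minimal sketch in Python, with synthetic equilibrium data standing in for the batch measurements (which are not given here):

```python
import numpy as np

def langmuir_fit(Ce, qe):
    """Fit the linearised Langmuir isotherm Ce/qe = Ce/qmax + 1/(KL*qmax)
    by ordinary least squares; returns (qmax, KL)."""
    slope, intercept = np.polyfit(Ce, Ce / qe, 1)
    return 1.0 / slope, slope / intercept

# Synthetic equilibrium data generated from a known isotherm
# (qmax = 11.3 mg/g, KL = 0.5 L/mg), standing in for real batch data
Ce = np.linspace(1.0, 30.0, 10)             # equilibrium concentration, mg/L
qe = 11.3 * 0.5 * Ce / (1.0 + 0.5 * Ce)     # adsorbed amount, mg/g
qmax, KL = langmuir_fit(Ce, qe)
print(round(qmax, 2), round(KL, 2))   # → 11.3 0.5
```

On exact Langmuir data the fit recovers the parameters; with real measurements the regression residuals indicate how well the Langmuir model describes the system.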
Abstract: The purpose of this study is to determine primary mathematics student teachers' views on the use of instructional technology tools in the learning process and to reveal how sample presentations of different mathematical concepts affect their views. This is a qualitative study involving twelve mathematics students from a public university. The data were gathered from two semi-structured interviews. The first was conducted at the beginning of the study. After that, the presentations prepared by the researchers were shown to the participants. These presentations contain animations, Geometer's Sketchpad activities, video clips, spreadsheets, and PowerPoint presentations. The last interview was conducted after these presentations. The interview data were transcribed and, through content analysis, read and reread to explore the major themes. Findings revealed that the students' views changed in this process and that they came to believe that instructional technology tools should be used in their classrooms.
Abstract: Ensemble learning algorithms such as AdaBoost and Bagging have been an active research topic and have shown improved classification results on several benchmark data sets, mainly with decision trees as their base classifiers. In this paper we apply these meta-learning techniques to classifiers such as random forests, neural networks and support vector machines. The data sets are from MAGIC, a Cherenkov telescope experiment. The task is to separate gamma signals from the overwhelming background of hadron and muon signals, a rare-class classification problem. We compare the individual classifiers with their ensemble counterparts and discuss the results. The experiments were carried out with the WEKA machine learning toolkit.
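The core idea of Bagging, one of the ensemble methods compared here, can be sketched without WEKA: train each base classifier on a bootstrap resample of the training set and combine by majority vote. The decision-stump base learner and one-feature toy data below are illustrative stand-ins, not the paper's MAGIC features:

```python
import random

def stump_train(X, y):
    """Train a one-feature decision stump: pick the threshold and sign
    that minimise training error on (X, y)."""
    best_t, best_err, best_sign = X[0][0], 1.1, 1
    for t in sorted(x[0] for x in X):
        for sign in (1, -1):
            err = sum(1 for x, lab in zip(X, y)
                      if (1 if sign * (x[0] - t) > 0 else 0) != lab) / len(y)
            if err < best_err:
                best_t, best_err, best_sign = t, err, sign
    return best_t, best_sign

def stump_predict(model, x):
    t, sign = model
    return 1 if sign * (x[0] - t) > 0 else 0

def bagging(X, y, n_models=15, seed=0):
    """Bagging: each stump is trained on a bootstrap resample of the
    training set; the ensemble predicts by majority vote."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        models.append(stump_train([X[i] for i in idx], [y[i] for i in idx]))
    def predict(x):
        votes = sum(stump_predict(m, x) for m in models)
        return 1 if votes > n_models / 2 else 0
    return predict

# Toy stand-in for "gamma vs background" on a single feature
X = [[v] for v in (0.1, 0.2, 0.3, 0.4, 1.1, 1.2, 1.3, 1.4)]
y = [0, 0, 0, 0, 1, 1, 1, 1]
predict = bagging(X, y)
print(predict([0.1]), predict([1.4]))
```

For rare-class problems like the gamma/hadron task, the resampling step is often combined with class reweighting so the minority class is not drowned out.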
Abstract: One of the most important requirements for the
operation and planning activities of an electrical utility is the
prediction of load for the next hour to several days out, known as
short term load forecasting. This paper presents the development of
an artificial neural network based short-term load forecasting model.
The model can forecast daily load profiles with a lead time of one day, i.e., for the next 24 hours. In this method, the days of the year are divided into groups using the average temperature, and the groups are formed according to the linearity of the load curve. The final forecast for each group is obtained by considering weekdays and weekends separately. This paper also investigates the effects of temperature and humidity on the consumption curve. To forecast the load curve of holidays, the peak and valley are forecast first, and then the neural network forecast is reshaped with the new data. The ANN-based load models are trained using hourly historical load data and daily historical max/min temperature and humidity data. The results of testing the system on data from the Yazd utility are reported.
Abstract: In this paper, a new approach for design of a fully
differential second order current mode continuous-time sigma-delta
modulator is presented. For circuit implementation, square root
domain (SRD) translinear loop based on floating-gate MOS
transistors operating in the saturation region is employed. The modulator features a low supply voltage, low power consumption (8 mW) and a high dynamic range (55 dB). Simulation results confirm that this design is suitable for data converters.
Abstract: Electrocardiogram (ECG) segmentation is necessary
to help reduce the time-consuming task of manually annotating ECGs. Several algorithms have been developed to segment the ECG automatically. We first review several such methods, and then present a new single-lead segmentation method based on adaptive piecewise constant approximation (APCA) and piecewise derivative dynamic time warping (PDDTW). The results are tested on the QT database. We compared our results to Laguna's two-lead method. Our proposed approach has a comparable mean error, but yields a slightly higher standard deviation than Laguna's method.
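Dynamic time warping, the alignment machinery underlying PDDTW, can be sketched as the classic O(nm) dynamic program; the derivative variant would compare local slopes rather than raw samples, and the beat shapes below are invented for illustration:

```python
def dtw(a, b):
    """Classic dynamic time warping distance between two 1-D sequences.
    (The derivative variant used in PDDTW aligns local slopes instead of
    raw sample values.)"""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# An invented "beat" and a time-stretched copy of it
beat = [0, 1, 3, 1, 0]
stretched = [0, 1, 1, 3, 3, 1, 0]
print(dtw(beat, stretched))   # → 0.0 (warping absorbs the stretch)
```

Because the warping path may duplicate samples, a purely time-stretched copy aligns at zero cost, which is exactly why DTW suits beats of varying duration.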
Abstract: The dorsal hand vein pattern is an emerging biometric that has lately been attracting the attention of researchers. Research is being carried out on existing techniques in the hope of improving them or finding more efficient ones. In this work, Principal Component Analysis (PCA), a successful method originally applied to the face biometric, is modified using Cholesky decomposition and the Lanczos algorithm to extract dorsal hand vein features. This modified technique decreases the number of computations and hence the processing time. The eigenveins were successfully computed and projected onto the vein space. The system was tested on a database of 200 images, using a threshold value of 0.9 to obtain the False Acceptance Rate (FAR) and False Rejection Rate (FRR). This modified algorithm is desirable when developing a biometric security system, since it significantly decreases the matching time.
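The usual way the cost of PCA on high-dimensional images is reduced, in the same spirit as the Cholesky/Lanczos modification (whose exact algorithm is not reproduced here), is the snapshot method: eigendecompose the small N×N Gram matrix instead of the D×D covariance. A sketch with random stand-in images:

```python
import numpy as np

def pca_eigenveins(images, k):
    """PCA via the small Gram matrix (the 'snapshot' trick): for N images
    of dimension D with N << D, eigendecompose the N x N matrix A A^T
    instead of the D x D covariance, then map the eigenvectors back to
    image space -- the same kind of computational saving the paper seeks
    with Cholesky decomposition and the Lanczos algorithm."""
    A = images - images.mean(axis=0)           # centred data, shape (N, D)
    G = A @ A.T                                # small N x N Gram matrix
    vals, vecs = np.linalg.eigh(G)             # eigenvalues in ascending order
    top = np.argsort(vals)[::-1][:k]
    eigenveins = A.T @ vecs[:, top]            # back to image space, (D, k)
    eigenveins /= np.linalg.norm(eigenveins, axis=0)
    return eigenveins

rng = np.random.default_rng(0)
images = rng.random((10, 64 * 64))             # 10 stand-in "vein images"
E = pca_eigenveins(images, 3)
proj = (images - images.mean(axis=0)) @ E      # coordinates in "vein space"
print(E.shape, proj.shape)
```

Matching then reduces to comparing the low-dimensional `proj` coordinates, which is what makes the approach fast at verification time.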
Abstract: The modeling of the inelastic behavior of plastic materials requires measurements providing information on the material response to different multiaxial loading conditions. Different triaxiality conditions and values of the Lode parameter have to be covered for a comprehensive description of the material's plastic behavior. Sample geometries providing material plastic behavior over the range of interest are proposed with the use of FEM analysis. Round samples with three different notches and a smooth surface are used together with butterfly-type samples tested at angles ranging from 0 to 90°. Identification of ductile damage parameters is carried out on the basis of the experimental data obtained for an austenitic stainless steel. The obtained plastic damage parameters are subsequently applied to FEM simulation of the notched CT samples normally used for fracture mechanics testing, and the simulation results are compared with real tests.
Abstract: Excilamps are new UV sources with great potential
for application in wastewater treatment. In the present work, a XeBr
excilamp emitting radiation at 283 nm has been used for the
photodegradation of 4-chlorophenol within a range of concentrations
from 50 to 500 mg/L. Total removal of 4-chlorophenol was achieved for all concentrations assayed. The two main photoproduct intermediates formed during the photodegradation process, benzoquinone and hydroquinone, although not completely removed, remained at very low residual concentrations. Such concentrations are insignificant compared to the initial 4-chlorophenol concentrations and are non-toxic. In order to simulate the process and its scale-up, a kinetic model has been developed and validated against the experimental data.
Abstract: Virtual Assembly (VA) is one of the key technologies
in advanced manufacturing field. It is a promising application of
virtual reality in design and manufacturing field. It has drawn much
interest from industries and research institutes in the last two decades.
This paper describes a process for integrating an interactive Virtual
Reality-based assembly simulation of a digital mockup with the
CAD/CAM infrastructure. The necessary hardware and software
preconditions for the process are explained so that it can easily be adopted by non-VR experts. The article outlines how assembly
simulation can improve the CAD/CAM procedures and structures;
how CAD model preparations have to be carried out and which
virtual environment requirements have to be fulfilled. The issue of
data transfer is also explained in the paper. Other challenges and requirements, such as anti-aliasing and collision detection, are also explained. Finally, a VA simulation has been carried out for a ball
valve assembly and a car door assembly with the help of Vizard
virtual reality toolkit in a semi-immersive environment and their
performance analysis has been done on different workstations to
evaluate the importance of graphical processing unit (GPU) in the
field of VA.
Abstract: The policies governing the business of any organization are well reflected in its business rules. The business
rules are implemented by data validation techniques, coded during
the software development process. Any change in business
policies results in change in the code written for data validation
used to enforce the business policies. Implementing the change in
business rules without changing the code is the objective of this
paper. The proposed approach enables users to create rule sets at
run time once the software has been developed. The newly defined
rule sets by end users are associated with the data variables for
which the validation is required. The proposed approach facilitates
the users to define business rules using all the comparison
operators and Boolean operators. Multithreading is used to
validate the data entered by end user against the business rules
applied. The evaluation of the data is performed by a newly
created thread using an enhanced form of the RPN (Reverse Polish
Notation) algorithm.
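A minimal sketch of the pipeline described here, under assumed rule syntax and variable names: an infix rule defined at run time is converted to RPN by the shunting-yard algorithm and then evaluated against user data on a worker thread. The paper's "enhanced" RPN algorithm is not specified, so plain RPN evaluation is shown:

```python
import operator, queue, threading

# operator table: (function, precedence); comparisons bind tighter than AND/OR
OPS = {">": (operator.gt, 2), "<": (operator.lt, 2), ">=": (operator.ge, 2),
       "<=": (operator.le, 2), "==": (operator.eq, 2), "!=": (operator.ne, 2),
       "AND": (lambda a, b: a and b, 1), "OR": (lambda a, b: a or b, 0)}

def to_rpn(tokens):
    """Shunting-yard: convert an infix rule to Reverse Polish Notation."""
    out, stack = [], []
    for tok in tokens:
        if tok in OPS:
            while stack and stack[-1] in OPS and OPS[stack[-1]][1] >= OPS[tok][1]:
                out.append(stack.pop())
            stack.append(tok)
        elif tok == "(":
            stack.append(tok)
        elif tok == ")":
            while stack[-1] != "(":
                out.append(stack.pop())
            stack.pop()
        else:
            out.append(tok)
    out.extend(reversed(stack))
    return out

def eval_rpn(rpn, data):
    """Evaluate an RPN rule; bare tokens are field names looked up in
    `data`, or numeric literals."""
    st = []
    for tok in rpn:
        if tok in OPS:
            b, a = st.pop(), st.pop()
            st.append(OPS[tok][0](a, b))
        else:
            st.append(data[tok] if tok in data else float(tok))
    return st[0]

# A hypothetical rule entered at run time, validated on a worker thread
rule = to_rpn("age >= 18 AND salary > 3000 OR vip == 1".split())
result = queue.Queue()
t = threading.Thread(target=lambda: result.put(
    eval_rpn(rule, {"age": 25, "salary": 2500, "vip": 1})))
t.start(); t.join()
verdict = result.get()
print(verdict)   # → True
```

Because the rule is data rather than code, changing a business policy means editing the rule string, not recompiling the validation logic.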
Abstract: This paper presents the result of three senior capstone
projects at the Department of Computer Engineering, Prince of
Songkla University, Thailand. These projects focus on developing an
examination management system for the Faculty of Engineering in order to manage both the examination room assignments and the proctor assignments for each room.
The current version of the software is a web-based application. The
developed software allows the examination proctors to select their
scheduled time online while each subject is assigned to each available
examination room according to its type and the room capacity. The
developed system is evaluated using real data by prospective users of
the system. Several suggestions for further improvements are given
by the testers. Even though the features of the developed software are
not superior, the development process can serve as a case study for a project-based teaching style. Furthermore, the process of developing this
software can show several issues in developing an educational
support application.
Abstract: We have developed a computer program consisting of six subtests assessing children's hand dexterity, applicable in rehabilitation medicine. We have carried out a normative study on a representative sample of 285 children aged from 7 to 15 (mean age 11.3) and have proposed clinical standards for three age groups (7-9, 9-11, 12-15 years). We have shown the statistical significance of differences among the corresponding mean values of the task completion time. We have also found a strong correlation between the task completion time and the age of the subjects, and we have performed test-retest reliability checks on a sample of 84 children, giving high values of the Pearson coefficients for the dominant and non-dominant hand in the ranges 0.74-0.97 and 0.62-0.93, respectively.
A new MATLAB-based programming tool aimed at the analysis of cardiologic RR intervals and blood pressure descriptors has also been developed. For each data set, ten different parameters are extracted: 2 in the time domain, 4 in the frequency domain and 4 from Poincaré plot analysis. In addition, twelve different parameters of baroreflex sensitivity are calculated. All these data sets can be visualized in the time domain together with their power spectra and Poincaré plots. If
available, the respiratory oscillation curves can be also plotted for
comparison. Another application processes biological data obtained
from BLAST analysis.
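The Poincaré plot parameters mentioned above are conventionally SD1 and SD2, the spreads of successive RR-interval pairs perpendicular to and along the identity line; a small sketch in Python (the RR values below are invented, and the exact parameter set of the MATLAB tool is not specified here):

```python
import math

def poincare_sd(rr):
    """Poincaré plot descriptors of an RR-interval series: SD1 is the
    spread of successive pairs perpendicular to the identity line
    (short-term variability), SD2 the spread along it (long-term)."""
    pairs = list(zip(rr[:-1], rr[1:]))
    perp = [(b - a) / math.sqrt(2) for a, b in pairs]    # across the line
    along = [(a + b) / math.sqrt(2) for a, b in pairs]   # along the line
    def sd(v):
        m = sum(v) / len(v)
        return math.sqrt(sum((u - m) ** 2 for u in v) / (len(v) - 1))
    return sd(perp), sd(along)

rr = [812, 797, 805, 821, 809, 798, 815, 803]   # invented RR intervals, ms
sd1, sd2 = poincare_sd(rr)
print(round(sd1, 1), round(sd2, 1))
```

A perfectly regular ramp of RR intervals collapses SD1 toward zero, which is why SD1 is read as beat-to-beat (short-term) variability.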
Abstract: Key performance indicators (KPIs) are used for post-result evaluation in the construction industry, and they normally do not have provisions for changes. This paper proposes a set of
dynamic key performance indicators (d-KPIs) which predicts the
future performance of the activity being measured and presents the
opportunity to change practice accordingly. Critical to the
predictability of a construction project is the ability to achieve
automated data collection. This paper proposes an effective way to
collect the process and engineering management data from an
integrated construction management system. The d-KPI matrix,
consisting of various indicators under seven categories, developed
from this study can be applied to close monitoring of the
development projects of aged-care facilities. The d-KPI matrix also
enables performance measurement and comparison at both project
and organization levels.
Abstract: In this paper, an improved PDLZW implementation with a new dictionary-updating technique is proposed. A
unique dictionary is partitioned into hierarchical variable word-width
dictionaries. This allows us to search through dictionaries in parallel.
Moreover, the barrel shifter is adopted for loading a new input string
into the shift register in order to achieve a faster speed. However,
the original PDLZW uses a simple FIFO update strategy, which is
not efficient. Therefore, a new window based updating technique
is implemented to better classify the difference in how often each
particular address in the window is referred. The freezing policy
is applied to the address most often referred, which would not be
updated until all the other addresses in the window have the same
priority. This guarantees that the more often referred addresses would
not be updated until their time comes. This updating policy leads
to an improvement on the compression efficiency of the proposed
algorithm while still keep the architecture low complexity and easy
to implement.
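For reference, the sequential LZW baseline that PDLZW parallelises can be sketched as below; the simple append-on-miss dictionary update in this sketch is exactly what the paper replaces with its windowed update and freezing policy:

```python
def lzw_encode(data):
    """Plain LZW: grow the longest known prefix, emit its code on a miss.
    (The append-on-miss update here is the simple policy that the paper's
    windowed/freezing strategy replaces.)"""
    dictionary = {bytes([i]): i for i in range(256)}
    w, out = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc
        else:
            out.append(dictionary[w])
            dictionary[wc] = len(dictionary)
            w = bytes([byte])
    if w:
        out.append(dictionary[w])
    return out

def lzw_decode(codes):
    """Inverse of lzw_encode; rebuilds the dictionary on the fly."""
    dictionary = {i: bytes([i]) for i in range(256)}
    w = dictionary[codes[0]]
    out = [w]
    for code in codes[1:]:
        entry = dictionary.get(code, w + w[:1])   # the cScSc corner case
        out.append(entry)
        dictionary[len(dictionary)] = w + entry[:1]
        w = entry
    return b"".join(out)

msg = b"TOBEORNOTTOBEORTOBEORNOT"
codes = lzw_encode(msg)
print(len(codes), lzw_decode(codes) == msg)
```

PDLZW splits this single dictionary into variable word-width partitions searched in parallel; the update policy then decides which partition entry gets recycled when a partition is full.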
Abstract: A special case of floating point data representation is the block floating point format, where a block of operands is forced to share a joint exponent term. This paper deals with the finite wordlength properties of this data format. The theoretical errors associated with the block floating point quantization process are investigated with the help of error distribution functions. A fast and easy approximation formula for calculating the signal-to-noise ratio of quantization to block floating point format is derived.
This representation is found to be a useful compromise between fixed point
and floating point format due to its acceptable numerical error properties over
a wide dynamic range.
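A block floating point quantiser in the sense described, one shared exponent per block plus fixed-point mantissas, together with the resulting measured SNR, can be sketched as follows (the block values and mantissa word length are arbitrary examples, and this is the direct SNR measurement, not the paper's approximation formula):

```python
import math

def bfp_quantize(block, mantissa_bits):
    """Quantise a block to block floating point: one shared exponent taken
    from the largest magnitude in the block, then every value rounded to a
    fixed-point mantissa of `mantissa_bits` bits (sign included)."""
    peak = max(abs(v) for v in block)
    exp = math.floor(math.log2(peak)) + 1 if peak > 0 else 0
    scale = 2.0 ** (exp - (mantissa_bits - 1))   # one mantissa LSB
    return [round(v / scale) * scale for v in block], exp

def snr_db(x, q):
    """Signal-to-quantisation-noise ratio in dB."""
    sig = sum(v * v for v in x)
    err = sum((a - b) ** 2 for a, b in zip(x, q))
    return 10 * math.log10(sig / err) if err else float("inf")

block = [0.9, -0.41, 0.13, 0.07, -0.55, 0.32, -0.2, 0.01]
q, exp = bfp_quantize(block, 8)
print(exp, round(snr_db(block, q), 1))
```

The compromise the abstract describes is visible here: the shared exponent gives floating-point-like dynamic range across blocks, while within a block the error behaves like fixed-point quantisation noise, degrading for small values that share a large peak's exponent.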
Abstract: Customer satisfaction carries as much importance for the textile sector as it does for other sectors. Especially considering that gaining new customers costs four times more than keeping existing customers from leaving, it can be seen that customer satisfaction plays a great role for firms. In this study, the independent variables affecting customer satisfaction are chosen as brand image, perceived service quality and perceived product quality. Using these independent variables, it is investigated whether Turkish textile consumers' perception of customer satisfaction differs by gender. The SPSS program was used for the data analysis of this research.
Abstract: This paper proposes a method that discovers time series event patterns from textual data with time information. The patterns are composed of sequences of events, where each event is extracted from the textual data and is a characteristic piece of content such as a company name, an action, or a customer's impression. The method introduces seven types of time constraints based on an analysis of the textual data, and evaluates these constraints when the frequency of a time series event pattern is calculated. We can flexibly define the time constraints for interesting combinations of events and discover valid time series event patterns which satisfy these conditions. The paper applies the method to daily business reports collected by a sales force automation system and verifies its effectiveness through numerical experiments.
Abstract: We develop new nonlinear methods of immunofluorescence analysis as a sensitive technique for the respiratory burst reaction of DNA fluorescence due to oxidative activity in peripheral blood neutrophils. Histograms in flow cytometry experiments represent the frequency of fluorescence flashes as a function of fluorescence intensity. We used the Shannon-Weaver index to define neutrophil biodiversity and the Hurst index to define fractal correlations in immunofluorescence for different donors, as the basic quantitative criteria for medical diagnostics of health status. We analyze the frequencies of flashes, information, Shannon entropies and their fractals in immunofluorescence networks under reduction of the histogram range. We found a number of simple universal correlations for biodiversity, information and the Hurst index in the diagnostics and classification of pathologies for a wide spectrum of diseases. In addition, a clear criterion of general immunity and human health status is determined in the form of yes/no answers. These answers are based on peculiarities of the information in immunofluorescence networks and the biodiversity of neutrophils. Experimental data analysis has shown the existence of homeostasis for the information entropy in the oxidative activity of DNA in neutrophil nuclei for all donors.
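The Shannon-Weaver index used here for neutrophil biodiversity is computed directly from the normalised fluorescence-intensity histogram; a minimal sketch with invented bin counts:

```python
import math

def shannon_index(histogram):
    """Shannon-Weaver diversity index H = -sum p_i ln p_i over the
    fluorescence-intensity histogram bins (empty bins contribute 0)."""
    total = sum(histogram)
    return -sum((c / total) * math.log(c / total) for c in histogram if c > 0)

uniform = [25, 25, 25, 25]   # flashes spread evenly over 4 intensity bins
skewed = [97, 1, 1, 1]       # flashes concentrated in a single bin
print(round(shannon_index(uniform), 3),
      round(shannon_index(skewed), 3))   # → 1.386 0.168
```

H is maximal (ln of the bin count) for an even spread of flashes and falls toward zero as the histogram concentrates, which is what makes it usable as a biodiversity measure across donors.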
Abstract: This study proposes a novel recommender system to
provide advertisements for context-aware services. Our proposed model is designed to apply a modified collaborative filtering (CF) algorithm with regard to several dimensions of mobile-device personalization: location, time and the user's needs type. In particular, we employ a classification rule to identify the user's needs type using a decision tree algorithm. In addition, we collect primary data from mobile phone users and apply them to the proposed model to validate its effectiveness. Experimental results show that the proposed system provides more accurate and satisfactory advertisements than the comparison systems.
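Plain user-based collaborative filtering, the baseline that the proposed model modifies, predicts an unseen rating as a similarity-weighted average over other users; the ratings matrix below is an invented toy example, and a context-aware variant would additionally filter the neighbourhood by location, time and needs type:

```python
import math

def cosine(u, v):
    """Cosine similarity between two rating vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def predict(ratings, user, item):
    """User-based CF: similarity-weighted average of the other users'
    ratings for `item` (0 = unrated)."""
    num = den = 0.0
    for other, row in enumerate(ratings):
        if other == user or row[item] == 0:
            continue
        s = cosine(ratings[user], row)
        num += s * row[item]
        den += abs(s)
    return num / den if den else 0.0

# rows = users, columns = ads; 0 = ad not yet shown/rated (invented data)
ratings = [[5, 3, 0, 1],
           [4, 0, 4, 1],
           [1, 1, 5, 5],
           [5, 4, 1, 0]]
r = predict(ratings, 0, 2)   # predicted rating of ad 2 for user 0
print(round(r, 2))
```

Ranking candidate advertisements by this predicted rating, within the context-filtered neighbourhood, yields the recommendation list the abstract evaluates.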