Abstract: The objective of this study was to determine the accuracy of fetal weight estimation by Johnson's method and to compare it with actual birth weight. The sample group was 126 infants delivered in Dan KhunThot hospital from January to March 2012. Fetal weight was estimated by measuring fundal height according to Johnson's method. The information was collected by studying historical delivery records and then analyzed using the statistics of frequency, percentage, mean, and standard deviation. Finally, the difference was analyzed by a paired t-test. The results showed an average actual birth weight of 3,093.57 ± 391.03 g (mean ± SD) and an average estimated fetal weight by Johnson's method of 3,455 ± 454.55 g; the average estimate exceeded the average actual birth weight by 384.09 g. When the infants were classified according to birth weight, it was found that in the high birth weight group (> 4,000 g) actual birth weight was more than estimated fetal weight. A difference between actual birth weight and estimated fetal weight was found, with the minimum in the high birth weight group (> 4,000 g), the appropriate birth weight group (2,500-3,999 g), and the low birth weight group (
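The estimation step above can be sketched from the commonly stated form of Johnson's formula, EFW (g) = (fundal height in cm − n) × 155, where n = 12 when the presenting part is above the ischial spines and n = 11 when it is at or below them; the station parameter here is a simplifying assumption, not a detail taken from the abstract.

```python
def johnson_efw(fundal_height_cm: float, station_above_spines: bool = True) -> float:
    """Estimate fetal weight in grams from symphysis-fundal height (Johnson's formula)."""
    n = 12 if station_above_spines else 11
    return (fundal_height_cm - n) * 155

print(johnson_efw(34))         # (34 - 12) * 155 = 3410 g
print(johnson_efw(34, False))  # (34 - 11) * 155 = 3565 g
```

The comparison in the study then reduces to pairing each such estimate with the recorded birth weight and testing the differences.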
Abstract: Drought is one of the most damaging climate-related
hazards; it is generally considered a prolonged absence of
precipitation. This normal and recurring climate phenomenon has
plagued civilizations throughout history because of its negative
impacts on the economic, environmental, and social sectors. Drought
characteristics are thus recognized as important factors in water
resources planning and management. The purpose of this study is to
detect the changes in drought frequency, persistence and severity
in the Ruhr river basin. The frequency of drought events was
calculated using the Standardized Precipitation Index (SPI). The
data used are daily precipitation records from seven meteorological
stations covering the period 1961-2007. The main benefit of
applying this index is its versatility: only rainfall data are required
to derive five major dimensions of a drought: duration, intensity,
severity, magnitude, and frequency. Furthermore, drought can be
calculated at different time steps. In this study, the SPI was calculated
for 1, 3, 6, 9, 12, and 24 months. Several drought events were detected
in the covered period; these events include mild, moderate, and severe
droughts. Both positive and negative trends in the SPI values were
also observed.
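The SPI computation described above can be sketched as follows. The standard SPI fits a gamma distribution to precipitation aggregated over the chosen window; this distribution-free sketch substitutes an empirical (plotting-position) CDF for the gamma fit, then maps probabilities to standard-normal scores. The window length, data, and class boundaries (McKee-style) are illustrative.

```python
from statistics import NormalDist

def aggregate(precip, window):
    """Running sums of precipitation over `window` consecutive periods."""
    return [sum(precip[i - window + 1:i + 1]) for i in range(window - 1, len(precip))]

def spi(precip, window=3):
    """Map each windowed sum to a standard-normal score via its empirical CDF."""
    sums = aggregate(precip, window)
    n = len(sums)
    srt = sorted(sums)
    nd = NormalDist()
    # Weibull plotting position rank/(n+1) keeps probabilities strictly in (0, 1);
    # ties share the lowest rank.
    return [nd.inv_cdf((srt.index(v) + 1) / (n + 1)) for v in sums]

def drought_class(s):
    """Drought classes for negative SPI values, after McKee et al."""
    if s <= -2.0:
        return "extreme"
    if s <= -1.5:
        return "severe"
    if s <= -1.0:
        return "moderate"
    if s < 0.0:
        return "mild"
    return "no drought"
```

Running `spi` at several windows (1, 3, 6, ... months) reproduces the multi-time-step view the abstract describes.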
Abstract: This paper discusses the issue of tribal development,
displacement, rehabilitation and resettlement policies, and
implementation in the agency (scheduled / tribal) areas of the West
Godavari District, Andhra Pradesh State, India. This study is based
on action anthropological approach, conducted among the displaced
tribal communities i.e. Konda Reddis and Nayakapods of this region,
under the 'Kovvada Reservoir' Project. These groups are
traditionally shifting cultivators and popularly known as the
Primitive Tribal Groups (PTGs) in the government records. This
paper also focuses on the issues of tribal displacement and land
alienation due to construction of the Kovvada reservoir, without
proper rehabilitation and resettlement, although there are well
defined guidelines, procedures and norms for the rehabilitation of
Project Affected Persons (PAPs). It is necessary, to begin with, to
provide an overview of the issues in tribal development and the
policies related to displacement and rehabilitation in the Indian
context as background to the Kovvada Reservoir Project, the subject
of this study.
Abstract: Probability-based identity disclosure risk
measurement may give the same overall risk for different
anonymization strategies of the same dataset. Some entities in the
anonymized dataset may have higher identification risks than
others. Individuals are more concerned about risks higher than the
average and are more interested in knowing whether they may be
under higher risk. A notion of overall risk in the above
measurement method does not indicate whether some of the involved
entities have a higher identity disclosure risk than the others. In this
paper, we introduce an identity disclosure risk measurement
method that not only implies overall risk, but also indicates whether
some of the members have higher risk than the others. The proposed
method quantifies the overall risk based on the individual risk values,
the percentage of the records that have a risk value higher than the
average, and how much larger the higher risk values are compared to
the average. We have analyzed the disclosure risks for different
disclosure control techniques applied to original microdata and
present the results.
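A minimal sketch of the quantities the abstract names, under an assumed individual-risk model: each record's risk is taken as 1 / |equivalence class| over its quasi-identifiers (a standard prosecutor-risk model; the paper's exact formula is not reproduced here). The summary reports the average risk, the share of records above average, and how much larger those higher risks are than the average.

```python
from collections import Counter

def individual_risks(records, quasi_ids):
    """Prosecutor-style risk: 1 / size of the record's quasi-identifier class."""
    keys = [tuple(r[q] for q in quasi_ids) for r in records]
    sizes = Counter(keys)
    return [1.0 / sizes[k] for k in keys]

def risk_summary(risks):
    """Overall risk plus indicators of members at higher-than-average risk."""
    avg = sum(risks) / len(risks)
    higher = [r for r in risks if r > avg]
    share = len(higher) / len(risks)
    excess = (sum(higher) / len(higher)) / avg if higher else 1.0
    return {"average": avg, "share_above_avg": share, "excess_ratio": excess}
```

On a dataset where three records share quasi-identifier values and one is unique, the unique record's risk of 1.0 dominates, and the summary makes that visible rather than hiding it in the average.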
Abstract: The principal purpose of this article is to present a new method based on the Adaptive Neural Network Fuzzy Inference System (ANFIS) to generate additional artificial earthquake accelerograms from presented data, which are compatible with specified response spectra. The proposed method uses the learning abilities of ANFIS to develop knowledge of the inverse mapping from the response spectrum to earthquake records. In addition, the wavelet packet transform is used to decompose specified earthquake records, and ANFISs are then trained to relate the response spectra of the records to their wavelet packet coefficients. Finally, an illustrative example is presented which uses an ensemble of recorded accelerograms to demonstrate the effectiveness of the proposed method.
Abstract: This paper explores an application of an adaptive learning mechanism for robots based on the natural immune system. Most of the research carried out so far is based either on the innate or the adaptive characteristics of the immune system; we present a combination of these to achieve behavior arbitration wherein a robot learns to detect vulnerable areas of a track and adapts to the required speed over such portions. The test bed comprises two Lego robots deployed simultaneously on two predefined near-concentric tracks, with the outer robot capable of helping the inner one when it misaligns. The helper robot works in a damage-control mode by realigning itself to guide the other robot back onto its track. The panic-stricken robot records the conditions under which it was misaligned and learns to detect and adapt to similar conditions, thereby making the overall system immune to such failures.
Abstract: In this report we present a rule-based approach to
detect anomalous telephone calls. The method described here uses
subscriber usage CDR (call detail record) data sampled over two
observation periods: study period and test period. The study period
contains call records of customers' non-anomalous behaviour.
Customers are first grouped according to their similar usage
behaviour (e.g., average number of local calls per week). For
customers in each group, we develop a probabilistic model to describe
their usage. Next, we use maximum likelihood estimation (MLE) to
estimate the parameters of the calling behaviour. Then we determine
thresholds by calculating acceptable change within a group. MLE is
used on the data in the test period to estimate the parameters of the
calling behaviour. These parameters are compared against thresholds.
Any deviation beyond the threshold is used to raise an alarm. This
method has the advantage of identifying local anomalies as compared
to techniques which identify global anomalies. The method is tested
for 90 days of study data and 10 days of test data of telecom
customers. For medium to large deviations in the data in test window,
the method is able to identify 90% of anomalous usage with less than
1% false alarm rate.
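The pipeline above can be sketched in a few lines, under the assumption of a Poisson model for weekly call counts (the abstract leaves the distribution family unstated): the MLE of a Poisson rate is the sample mean, the group threshold captures acceptable within-group change, and a test-period estimate beyond the threshold raises an alarm. The value of k is an illustrative choice.

```python
import math

def poisson_mle(counts):
    """The MLE of a Poisson rate is the sample mean of the counts."""
    return sum(counts) / len(counts)

def group_threshold(member_rates, k=3.0):
    """Acceptable change within a group: mean rate +/- k standard deviations."""
    mean = sum(member_rates) / len(member_rates)
    var = sum((r - mean) ** 2 for r in member_rates) / len(member_rates)
    return mean, k * math.sqrt(var)

def is_anomalous(test_counts, group_mean, tolerance):
    """Alarm when the test-period estimate deviates beyond the group threshold."""
    return abs(poisson_mle(test_counts) - group_mean) > tolerance
```

Because the threshold is computed per group, a customer whose usage is unusual only relative to similar customers is still flagged, which is the local-anomaly advantage the abstract claims.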
Abstract: In multi-hop wireless systems, such as ad hoc and
sensor networks where mobile ad hoc network applications are
deployed, security emerges as a central requirement. A particularly
devastating attack is known as the wormhole attack, in which two or
more malicious colluding nodes create a higher-level virtual tunnel in
the network that is employed to transport packets between the tunnel
end points. These tunnels emulate shorter links in the network: an
adversary records transmitted packets at one location in the network,
tunnels them to another location, and retransmits them into the
network. The wormhole attack is possible even if the attacker has not
compromised any hosts and even if all communication provides
authenticity and confidentiality. In this paper, we analyze the nature
of the wormhole attack in ad hoc and sensor networks and existing
defense mechanisms that detect wormhole attacks without requiring
any specialized hardware. This analysis can help in establishing a
method to reduce the refresh time and the response time.
Abstract: Road authorities have confronted problems in
maintaining the serviceability of road infrastructure systems using
various traditional methods of contracting. As a solution to these
problems, many road authorities have started contracting out road
maintenance works to the private sector based on performance
measures. This contracting method is named Performance-Based
Maintenance Contracting (PBMC). It is considered more
cost-effective than other traditional methods of contracting, and it
has a substantial record of success in many developed and developing
countries over the last two decades. This paper discusses and
analyses the potential issues to be considered before the introduction
of PBMC in a country.
Abstract: Matlab-based software for logistic regression is developed to enhance the process of teaching quantitative topics and to assist researchers in analyzing the wide range of applications in which categorical data are involved. The software offers an option of performing stepwise logistic regression to select the most significant predictors. The software includes a feature to detect influential observations in the data, and investigates the effect of dropping or misclassifying an observation on a predictor variable. The input data may consist either of a set of individual responses (yes/no) with the predictor variables or of grouped records summarizing the various categories for each unique set of predictor variables' values. Graphical displays are used to output various statistical results and to assess the goodness of fit of the logistic regression model. The software recognizes possible convergence constraints when present in the data, and the user is notified accordingly.
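The model the software fits can be sketched minimally (the software itself is Matlab-based with richer numerics: stepwise selection, influence diagnostics, grouped-data input). This illustrative single-predictor version fits y ~ sigmoid(b0 + b1·x) to individual yes/no responses by gradient ascent on the log-likelihood.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, iters=5000):
    """Gradient ascent on the Bernoulli log-likelihood for y ~ sigmoid(b0 + b1*x)."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            err = y - sigmoid(b0 + b1 * x)   # residual drives the gradient
            g0 += err
            g1 += err * x
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

xs = [0, 1, 2, 3, 4, 5]
ys = [0, 0, 0, 1, 1, 1]       # individual yes/no responses
b0, b1 = fit_logistic(xs, ys)
```

The convergence constraints the abstract mentions show up exactly on data like this: the classes are perfectly separable, so the coefficients grow without bound and production software must detect that condition.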
Abstract: We introduce a novel approach to measuring how
humans learn based on techniques from information theory and
apply it to the oriental game of Go. We show that the total amount
of information observable in human strategies, called the strategic
information, remains constant for populations of players of differing
skill levels for well studied patterns of play. This is despite the very
large amount of knowledge required to progress from the recreational
players at one end of our spectrum to the very best and most
experienced players in the world at the other and is in contrast to
the idea that having more knowledge might imply more 'certainty'
in what move to play next. We show this is true for very local
up to medium-sized board patterns, across a variety of different
moves, using 80,000 game records. Consequences for theoretical and
practical AI are outlined.
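The information-theoretic measurement behind this can be illustrated with Shannon entropy: for a given local board pattern, collect the empirical distribution of moves chosen in game records and compute its entropy in bits. The move names and counts below are hypothetical, not taken from the paper's 80,000-game dataset.

```python
import math
from collections import Counter

def shannon_entropy(move_counts):
    """Entropy (bits) of the empirical distribution of moves for one pattern."""
    total = sum(move_counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in move_counts.values())

# Hypothetical counts of moves observed after one local pattern:
counts = Counter({"D4": 50, "Q16": 30, "C3": 20})
h = shannon_entropy(counts)   # between 0 and log2(3) bits
```

If stronger players were simply more "certain", entropy would fall with skill; the paper's finding is that this aggregate stays roughly constant across skill levels.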
Abstract: The magneto-rheological (MR) fluid damper is a
semi-active control device that has recently received more attention
from the vibration control community. However, the inherent
hysteretic and highly nonlinear dynamics of the MR fluid damper are
one of the challenging aspects of employing its unique characteristics.
The combination of an artificial neural network (ANN) and a fuzzy
logic system (FLS) has been used to imitate more precisely the
behavior of this device. However, the derivative-based nature of
adaptive networks causes some deficiencies. Therefore, in this paper,
a novel approach that employs a genetic algorithm, as a
derivative-free algorithm, to enhance the capability of fuzzy systems
is proposed. The proposed method is used to model the MR damper.
The results are compared with an adaptive neuro-fuzzy inference
system (ANFIS) model, which is one of the well-known approaches in
the soft computing framework, and with two of the best parametric
models of the MR damper. Data are generated based on a benchmark
program by applying a number of well-known earthquake records.
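The derivative-free optimization idea can be shown with a minimal genetic-algorithm loop. The paper applies this to fuzzy-system parameters; here, purely for illustration, the GA tunes two parameters (a, b) of a toy model y = a·x + b against data, using only fitness evaluations and no gradients.

```python
import random

random.seed(0)  # deterministic run for the illustration

def fitness(params, xs, ys):
    """Negative squared error; higher is better. No derivatives are used."""
    a, b = params
    return -sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))

def evolve(xs, ys, pop_size=30, generations=100):
    pop = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(p, xs, ys), reverse=True)
        parents = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a1, b1 = random.choice(parents)
            a2, b2 = random.choice(parents)
            children.append(((a1 + a2) / 2 + random.gauss(0, 0.1),  # crossover
                             (b1 + b2) / 2 + random.gauss(0, 0.1))) # + mutation
        pop = parents + children
    return max(pop, key=lambda p: fitness(p, xs, ys))

xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]   # generated by y = 2x + 1
a, b = evolve(xs, ys)
```

Because selection, crossover, and mutation only need fitness values, the same loop works when the parameters encode fuzzy membership functions whose error surface is not differentiable.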
Abstract: The paper deals with an analysis of visibility records collected from 210 European airports to obtain a realistic estimation of the availability of Free Space Optical (FSO) data links. Commercially available optical links usually operate in the 850 nm waveband; thus the influence of the atmosphere on the optical beam and on visible light is similar. Long-term visibility records represent an invaluable source of data for the estimation of the quality of service of FSO links. The model used characterizes both the statistical properties of fade depths and the statistical properties of individual fade durations. Results are presented for Italy, France, and Germany.
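The visibility-to-attenuation step underlying such availability estimates is often done with the Kruse model, α (dB/km) = (3.91 / V) · (λ / 550 nm)^(−q); whether this paper uses Kruse or a variant is not stated in the abstract, so this is a hedged sketch with the commonly quoted q thresholds.

```python
def kruse_attenuation_db_per_km(visibility_km, wavelength_nm=850.0):
    """Specific attenuation (dB/km) from visibility via the Kruse model."""
    v = visibility_km
    if v > 50.0:
        q = 1.6                          # high visibility
    elif v > 6.0:
        q = 1.3                          # average visibility
    else:
        q = 0.585 * v ** (1.0 / 3.0)     # low visibility
    return (3.91 / v) * (wavelength_nm / 550.0) ** (-q)
```

Applying this record-by-record to long-term airport visibility data yields the fade-depth statistics from which link availability is estimated.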
Abstract: The wavelet transform is one of the most important
methods used in signal processing. In this study, we introduce the
frequency-energy characteristics of local earthquakes using the
discrete wavelet transform. The frequency-energy characteristics
were analyzed depending on the difference between P and S wave
arrival times and the noise within the records. We found that local
earthquakes have similar characteristics. If the frequency-energy
characteristics can be determined accurately, this gives a hint for
calculating P and S wave arrival times. It can be seen that the
wavelet transform provides a successful approximation for this. In
this study, approximately 100 earthquakes with 500 records were
analyzed.
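A frequency-energy profile of the kind described can be sketched with a discrete wavelet transform implemented directly; the Haar wavelet below is an illustrative choice (the abstract does not name the wavelet family). At each level the signal splits into approximation and detail coefficients, and the detail energy per level forms the profile.

```python
import math

def haar_step(signal):
    """One level of the orthonormal Haar transform: pairwise sums/differences."""
    approx = [(signal[i] + signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def detail_energies(signal, levels=3):
    """Detail-coefficient energy per level: the frequency-energy profile."""
    energies = []
    current = list(signal)
    for _ in range(levels):
        current, detail = haar_step(current)
        energies.append(sum(d * d for d in detail))
    return energies
```

Because the transform is orthonormal, the detail energies plus the final approximation energy sum to the signal energy, so the profile is a true decomposition of where the energy sits in frequency.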
Abstract: Nosocomial (i.e., hospital-acquired) infections
(NI) are a major cause of morbidity and mortality in hospitals. The
NI rate is higher in intensive care units (ICU) than in the general
ward because patients there have severe symptoms and poor
immunity and undergo many invasive therapies. Contact behavior
between health caregivers and patients is one of the infection
factors. It is difficult to obtain complete contact records by the
traditional method of retrospective analysis of medical records.
This paper establishes a contact history inferential model
(CHIM) intended to extend the use of proximity sensing with
radio frequency identification (RFID) technology to transform all
proximity events between health caregivers and patients into
clinical events (close-in events, contact events, and invasive
events). The results of the study indicated that the CHIM can infer
proximity care activities into close-in events and contact events.
The infection control team could redesign and build an optimal
workflow in the ICU according to the patient-specific contact
history provided by our automatic tracing system.
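The inference step can be sketched as a rule that maps a raw RFID proximity reading to a clinical event type by read distance and dwell time. The thresholds below are purely illustrative assumptions, not the paper's calibrated values, and invasive events (which depend on the therapy performed) are left out.

```python
def classify_event(distance_m, duration_s):
    """Map one caregiver-patient proximity reading to a clinical event type.

    Thresholds are hypothetical: close, sustained presence is treated as a
    contact event; brief nearby presence as a close-in event.
    """
    if distance_m <= 0.5 and duration_s >= 60:
        return "contact event"
    if distance_m <= 1.5 and duration_s >= 10:
        return "close-in event"
    return "transient"
```

Aggregating these classified events per patient yields the patient-specific contact history the infection control team would consult.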
Abstract: In recent years image watermarking has become an
important research area in data security, confidentiality and image
integrity. Many watermarking techniques were proposed for medical
images. However, medical images, unlike most images, require
extreme care when embedding additional data within them because
the additional information must not affect the image quality and
readability. Also, medical records, electronic or not, are bound by
medical secrecy; for that reason, the records must be confidential.
To fulfill those requirements, this paper presents a lossless
watermarking scheme for DICOM images. The proposed fragile
scheme combines two reversible techniques based on difference
expansion for patient's data hiding and protecting the region of
interest (ROI) with tamper detection and recovery capability.
Patient's data are embedded into ROI, while recovery data are
embedded into region of non-interest (RONI). The experimental
results show that the original image can be exactly extracted from the
watermarked one in case of no tampering. In case of tampered ROI,
tampered area can be localized and recovered with a high quality
version of the original area.
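The reversible primitive the scheme builds on, Tian-style difference expansion, can be sketched on a single pixel pair: a bit is hidden in the expanded difference, and both the bit and the original pair are recovered exactly. Overflow checks (keeping values in the 0-255 pixel range) are omitted for brevity.

```python
def embed(a, b, bit):
    """Hide one bit in the expanded difference of an integer pair."""
    l = (a + b) // 2        # integer average, preserved by the embedding
    h = a - b               # difference
    h2 = 2 * h + bit        # expanded difference now carries the bit
    return l + (h2 + 1) // 2, l - h2 // 2

def extract(a2, b2):
    """Recover the bit and the original pair exactly."""
    l = (a2 + b2) // 2      # the integer average survives embedding
    h2 = a2 - b2
    bit = h2 & 1            # the bit sits in the low-order position
    h = h2 >> 1             # undo the expansion (floor division)
    return l + (h + 1) // 2, l - h // 2, bit
```

Exact recovery is what makes the watermarking lossless: in the absence of tampering, stripping the embedded data returns the original DICOM pixels bit for bit.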
Abstract: The National Biodiversity Database System (NBIDS) has
been developed for collecting Thai biodiversity data. The goal of this
project is to provide advanced tools for querying, analyzing,
modeling, and visualizing patterns of species distribution for
researchers and scientists. NBIDS records two types of datasets:
biodiversity data and environmental data. Biodiversity data are
species presence data and species status. The attributes of
biodiversity data can be further classified into two groups: universal
and project-specific attributes. Universal attributes are attributes
that are common to all of the records, e.g., X/Y coordinates, year,
and collector name. Project-specific attributes are attributes that are
unique to one or a few projects, e.g., flowering stage. Environmental
data include atmospheric data, hydrology data, soil data, and land
cover data collected using GLOBE protocols. We have developed
web-based tools for data entry. Google Earth KML and ArcGIS were
used as tools for map visualization. webMathematica was used for
simple data visualization and also for advanced data analysis and
visualization, e.g., spatial interpolation and statistical analysis.
NBIDS will be used by park rangers at Khao Nan National Park, and
by researchers.
Abstract: Sleep stage scoring is the process of classifying the
stage of sleep in which the subject is. Sleep is classified into
two states based on a constellation of physiological parameters.
The two states are non-rapid eye movement (NREM) and
rapid eye movement (REM). NREM sleep is further classified into
four stages (1-4). These states and the state of wakefulness are
distinguished from each other based on brain activity. In this
work, a classification method for automated sleep stage scoring
based on a single EEG recording using wavelet packet decomposition
was implemented. Thirty-two polysomnographic recordings from the
MIT-BIH database were used for training and validation of the
proposed method. A single EEG recording was extracted and
smoothed using a Savitzky-Golay filter. Wavelet packet
decomposition up to the fourth level based on a 20th-order
Daubechies filter was used to extract features from the EEG signal. A
feature vector of 54 features was formed. It was reduced to a size of
25 using the gain ratio method and fed into a classifier of regression
trees. The regression trees were trained using 67% of the available
records. The records for training were selected based on
cross-validation of the records. The remaining records were used for
testing the classifier. The overall correct rate of the proposed method
was found to be around 75%, which is acceptable compared to the
techniques in the literature.
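The gain-ratio step used to cut the feature vector from 54 to 25 features can be sketched as follows: the information gain of a (discretized) feature about the class label, normalized by the feature's own split information. The data below are illustrative; the paper applies this to wavelet-packet features of the EEG.

```python
import math
from collections import Counter

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def gain_ratio(feature_values, labels):
    """Information gain of a discrete feature, normalized by split information."""
    base = entropy(labels)
    total = len(labels)
    cond = 0.0          # class entropy remaining after the split
    split_info = 0.0    # entropy of the split itself (penalizes many values)
    for v in set(feature_values):
        subset = [l for f, l in zip(feature_values, labels) if f == v]
        w = len(subset) / total
        cond += w * entropy(subset)
        split_info -= w * math.log2(w)
    gain = base - cond
    return gain / split_info if split_info > 0 else 0.0
```

Ranking all 54 features by this score and keeping the top 25 reproduces the selection step before the regression-tree classifier.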
Abstract: This paper describes an enhanced cookie-based
method for counting the visitors of web sites by using a web log
processing system that aims to cope with the ambitious goal of
creating countrywide statistics about the browsing practices of real
human individuals. The focus is put on describing a new, more
efficient way of detecting the human beings behind web users by
placing different identifiers on the client computers. We briefly
introduce our processing system designed to handle the massive
amount of data records continuously gathered from the most
important content providers of Hungary. We conclude by showing
statistics of
different time spans comparing the efficiency of multiple visitor
counting methods to the one presented here, and some interesting
charts about content providers and web usage based on real data
recorded in 2007 will also be presented.
Abstract: In this paper we present a novel approach to wavelet compression of electrocardiogram (ECG) signals based on the set partitioning in hierarchical trees (SPIHT) coding algorithm. The SPIHT algorithm has achieved prominent success in image compression. Here we use a modified version of SPIHT for one-dimensional signals. We applied the wavelet transform with the SPIHT coding algorithm to different records of the MIT-BIH database. The results show the high efficiency of this method in ECG compression.