Abstract: Pipeline exploration is one of many applications of bio-mimetic robots. Such a robot may work in common buildings, for example between ceilings and in ducts, as well as in the complicated and massive pipeline systems of large industrial plants. The bio-mimetic robot locates any troubled area or malfunction and then reports its data. Importantly, it can not only prepare for but also react to abnormal routes in the pipeline. Pipeline monitoring tasks require special types of mobile robots: for effective movement along a pipeline, the robot's locomotion should resemble that of insects or crawling animals. While moving along the pipes, a pipeline monitoring robot has the important task of recognizing the shape of the approaching path. In this paper we propose an effective solution to this pipeline pattern recognition problem, based on fuzzy classification rules applied to the measured IR distance data.
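As a rough illustration of the idea (not the authors' actual rule base), a fuzzy classifier over IR distance readings can be sketched in a few lines; the membership functions, shape categories, and thresholds below are illustrative assumptions:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify_path(distance_cm):
    """Toy rule base: a short IR reading suggests an elbow just ahead,
    a medium reading a branch, a long reading a straight run."""
    memberships = {
        "elbow":    tri(distance_cm, 0, 10, 25),
        "branch":   tri(distance_cm, 15, 30, 45),
        "straight": tri(distance_cm, 35, 60, 1000),
    }
    return max(memberships, key=memberships.get)

print(classify_path(8))   # a close obstacle reads as an elbow
print(classify_path(60))  # a long clear reading reads as a straight run
```

In a real rule base the classifier would combine several IR sensors and overlapping memberships; the single-sensor maximum rule here only shows the mechanism.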
Abstract: We present in this paper a new approach to specific JPEG steganalysis and propose studying the statistics of the compressed DCT coefficients. Traditionally, steganographic algorithms try to preserve the statistics of the DCT and of the spatial domain, but they cannot preserve both while also controlling the alteration of the compressed data. We have noticed a deviation in the entropy of the compressed data after a first embedding; this deviation is greater when the image is a cover medium than when it is a stego image. To observe this deviation, we introduce new statistical features and combine them with the Multiple Embedding Method. This approach is motivated by the avalanche criterion of the JPEG lossless compression step, which makes it possible to design detectors whose detection rates are independent of the payload. Finally, we designed a Fisher-discriminant-based classifier for the well-known steganographic algorithms Outguess, F5, and Hide and Seek. The experimental results we obtained show the efficiency of our classifier for these algorithms. Moreover, it is also designed to work at low embedding rates (< 10^-5) and, according to the avalanche criterion of the RLE and Huffman compression steps, its efficiency is independent of the quantity of hidden information.
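A minimal sketch of the kind of statistic involved: the Shannon entropy of the entropy-coded byte stream, whose deviation under re-embedding the detector exploits. The toy data and the byte-level feature choice are illustrative, not the paper's exact feature set:

```python
import math
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy, in bits per byte, of a compressed byte stream."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Re-embedding perturbs the entropy-coded stream almost at random (the
# avalanche effect of the RLE/Huffman step), so the entropy of a cover
# file moves more on a first embedding than a stego file does on a
# second one -- the deviation on which the detector is built.
toy_stream = bytes(range(64)) * 4          # toy stand-in for compressed data
print(round(byte_entropy(toy_stream), 2))  # 6.0: 64 equiprobable symbols
```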
Abstract: We propose three new algorithms, based on minimizing the autocorrelation of transmitted symbols and on the SLM approach, that are computationally less demanding. In the first algorithm, the autocorrelation of the complex data sequence is minimized to a value of 1, which reduces the PAPR. The second algorithm generates multiple random sequences from the sequence produced by the first algorithm, all with the same autocorrelation value of 1; among these, the sequence with the minimum PAPR is transmitted. The third algorithm extends the second and requires minimal side information to be transmitted: multiple sequences are generated by modifying a fixed number of complex numbers in an OFDM data sequence using only one factor. The multiple sequences represent the same data sequence, and the one giving the minimum PAPR is transmitted. Simulation results for a 256-subcarrier OFDM system show that the proposed algorithms achieve a significant reduction in PAPR.
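A baseline plain-SLM sketch (not the proposed reduced-complexity algorithms) shows the selection step that all three build on: generate candidate phase-rotated versions of the symbol and transmit the one with the lowest PAPR. The ±1 phase alphabet and candidate count are illustrative assumptions:

```python
import cmath
import math
import random

random.seed(0)

def ifft(X):
    """Naive inverse DFT, O(N^2); adequate for a 256-point sketch."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def papr_db(x):
    """Peak-to-average power ratio of a time-domain symbol, in dB."""
    p = [abs(v) ** 2 for v in x]
    return 10 * math.log10(max(p) / (sum(p) / len(p)))

def slm(symbols, n_candidates=8):
    """Plain SLM: rotate the frequency-domain symbols by random +/-1 phase
    sequences and keep the candidate with the lowest PAPR. The identity
    sequence is kept as a baseline, so the result is never worse."""
    best = ifft(symbols)
    for _ in range(n_candidates - 1):
        phases = [random.choice([1, -1]) for _ in symbols]
        cand = ifft([s * p for s, p in zip(symbols, phases)])
        if papr_db(cand) < papr_db(best):
            best = cand
    return best

N = 256  # subcarrier count used in the paper's simulations
qpsk = [random.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) for _ in range(N)]
plain = ifft(qpsk)
reduced = slm(qpsk)
print(papr_db(reduced) <= papr_db(plain))  # True by construction
```

The paper's contribution lies in generating the candidate set more cheaply (via autocorrelation minimization and single-factor modifications) than the brute-force loop above.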
Abstract: This paper reviews recent studies, and particularly the effects of climate change in the North Tropical Atlantic, by examining the atmospheric conditions that prevailed in 2005: the Coral Bleaching HotSpot and Hurricane Katrina. With the aim of better understanding and estimating the impact of the physical phenomenon, i.e. the Thermal Oceanic HotSpot (TOHS), isotopic studies of δ18O and δ13C on marine animals from Guadeloupe (French Caribbean island) were carried out. Recorded measurements show Sea Surface Temperatures (SST) up to 35°C in August, much higher than the 32°C recorded by NOAA satellites. After reviewing the process that led to the formation of Hurricane Katrina, which hit New Orleans on August 29, 2005, it is shown that the climatic conditions in the Caribbean from August to October 2005 influenced Katrina's evolution. This TOHS is a combined effect of various phenomena and represents an additional factor to be considered when estimating future climate change.
Abstract: In this work a new platform for mobile-health systems is presented. The system's target application is to provide decision support to rescue corps or military medical personnel in combat areas. The software architecture relies on a distributed client-server system that manages a hierarchy of wireless ad-hoc networks in which several different types of client operate, each characterized by different hardware and software requirements. The lower hierarchy levels rely on a network of fully custom devices that store clinical information and patient status; these devices are designed to form an ad-hoc network operating in the 2.4 GHz ISM band and complying with the IEEE 802.15.4 (ZigBee) standard. Medical personnel may interact with such devices, called MICs (Medical Information Carriers), by means of a PDA (Personal Digital Assistant) or an MDA (Medical Digital Assistant), transmit the information stored in their local databases, and issue service requests to the upper hierarchy levels using the IEEE 802.11 a/b/g (WiFi) standard. The server acts as a repository that stores both medical evacuation forms and associated events (e.g., a teleconsulting request). All the actors participating in the diagnostic or evacuation process may asynchronously access this repository, update its content, or generate new events. The designed system aims to optimise and improve the spreading and flow of information among all the system components, with the goal of improving both diagnostic quality and the evacuation process.
Abstract: Time series analysis often requires data that represent the evolution of an observed variable in equidistant time steps, and sampling is applied to collect such data. While continuous signals may be sampled, analyzed and reconstructed according to Shannon's sampling theorem, time-discrete signals have to be dealt with differently. In this article we consider the discrete-event simulation (DES) of job-shop systems and study the effects of different sampling rates on data quality, regarding completeness and accuracy of the reconstructed inventory evolutions. In doing so, we discuss deterministic as well as non-deterministic behavior of system variables. Error curves are deployed to illustrate the sampling rate's impact and to derive recommendations for its well-founded choice.
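The effect under study can be illustrated with a toy example (the event list and inventory levels below are hypothetical, not from the paper's job-shop model): a piecewise-constant inventory level produced by discrete events is sampled equidistantly, and changes shorter than the sampling interval are lost:

```python
def inventory_at(t, events):
    """Piecewise-constant inventory level from a sorted (time, level) event list."""
    level = 0
    for et, lv in events:
        if et <= t:
            level = lv
        else:
            break
    return level

def sample(events, t_end, dt):
    """Equidistant samples, as classical time-series analysis requires."""
    n = int(t_end / dt) + 1
    return [inventory_at(i * dt, events) for i in range(n)]

# Hypothetical event list: the brief excursion to level 5 (t = 1.2 .. 1.4)
# is shorter than either sampling interval and vanishes from both series,
# illustrating the completeness loss discussed above.
events = [(0, 2), (1.2, 5), (1.4, 3), (3.0, 4)]
print(sample(events, 4, 0.5))  # [2, 2, 2, 3, 3, 3, 4, 4, 4]
print(sample(events, 4, 2.0))  # [2, 3, 4]
```

Only a sampling interval shorter than the briefest event gap (here dt < 0.2) would capture the excursion, which is the kind of trade-off the error curves quantify.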
Abstract: This paper examines, from a macroeconomic perspective, whether immigration has a positive influence on the duration of unemployment. We also analyse whether the degree of labor market integration can influence migration. The integration of immigrants into the labor market is a recurrent theme in work on the economic consequences of immigration; to our knowledge, however, no researchers have studied the impact of immigration on unemployment duration, and vice versa. Using two research methodologies (panel estimations, OLS and 2SLS, and panel cointegration techniques), we show that, for 14 OECD destination countries, migration seems to influence short-term unemployment positively and long-term unemployment negatively. In addition, immigration seems to be conditioned by the structural and institutional characteristics of the labour market.
Abstract: Support vector machines (SVMs) are considered to be
the best machine learning algorithms for minimizing the predictive
probability of misclassification. However, their drawback is that for
large data sets the computation of the optimal decision boundary is a
time-consuming function of the size of the training set. Hence several
methods have been proposed to speed up the SVM algorithm. Here
three methods used to speed up the computation of the SVM
classifiers are compared experimentally using a musical genre
classification problem. The simplest method pre-selects a random
sample of the data before the application of the SVM algorithm. Two
additional methods use proximity graphs to pre-select data that are
near the decision boundary. One uses k-Nearest Neighbor graphs and
the other Relative Neighborhood Graphs to accomplish the task.
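As a sketch of the boundary pre-selection idea (using a plain kNN disagreement test as a stand-in for the paper's proximity-graph constructions), points whose nearest neighbours all share their label can be dropped before training:

```python
def near_boundary(points, labels, k=2):
    """Keep index i when any of its k nearest neighbours carries a
    different label, i.e. the point plausibly lies near the boundary."""
    keep = []
    for i, p in enumerate(points):
        dists = sorted((sum((a - b) ** 2 for a, b in zip(p, q)), j)
                       for j, q in enumerate(points) if j != i)
        if any(labels[j] != labels[i] for _, j in dists[:k]):
            keep.append(i)
    return keep

# Two 1-D clusters: only the two facing edge points survive pre-selection,
# so the SVM would train on 2 points instead of 6.
pts = [(0.0,), (1.0,), (2.0,), (3.0,), (4.0,), (5.0,)]
labs = [0, 0, 0, 1, 1, 1]
print(near_boundary(pts, labs, k=2))  # [2, 3]
```

The surviving points are those most likely to become support vectors, which is why training on the reduced set can approximate the full SVM at a fraction of the cost.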
Abstract: A sensor network consists of densely deployed sensor nodes. Energy optimization is one of the most important aspects of sensor application design, and data acquisition and in-network aggregation techniques should be energy efficient. Due to the cross-layer design and the resource-limited, noisy nature of Wireless Sensor Networks (WSNs), it is challenging to study the performance of these systems in a realistic setting. In this paper, we propose optimizing queries by aggregating data and exploiting data redundancy to reduce energy consumption without requiring all sensed data, and by using the directed diffusion communication paradigm to achieve power savings, robust communication and in-network data processing. To estimate per-node power consumption, the PowerTOSSIM mica2 energy model is used, which provides scalable and accurate results. The performance analysis shows that the proposed methods outperform existing methods in terms of energy consumption in wireless sensor networks.
Abstract: A generalised relational data model is formalised for
the representation of data with nested structure of arbitrary depth. A
recursive algebra for the proposed model is presented. All the
operations are formally defined. The proposed model is proved to be a superset of the conventional relational model (CRM). The functionality and validity of the model are demonstrated by a prototype implementation written in the functional programming language Miranda.
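To make the contrast with the CRM concrete, here is an illustrative Python sketch (the paper's implementation is in Miranda) of the two characteristic operators of a nested model, nest and unnest; the relation and attribute names are made up:

```python
def nest(relation, group_key, nested_name):
    """Group tuples on group_key; the remaining attributes of each tuple
    become elements of a set-valued (nested) attribute."""
    out = {}
    for row in relation:
        key = tuple((a, row[a]) for a in group_key)
        rest = frozenset((a, v) for a, v in row.items() if a not in group_key)
        out.setdefault(key, set()).add(rest)
    return [dict(key) | {nested_name: inner} for key, inner in out.items()]

def unnest(relation, nested_name):
    """Inverse of nest: expand every element of the nested attribute
    back into a flat tuple, recovering a CRM-style relation."""
    return [{a: v for a, v in row.items() if a != nested_name} | dict(inner)
            for row in relation for inner in row[nested_name]]

emp = [{"dept": "R&D", "name": "Ada"}, {"dept": "R&D", "name": "Alan"},
       {"dept": "Ops", "name": "Grace"}]
nested = nest(emp, ["dept"], "staff")   # two rows, one per department
flat = unnest(nested, "staff")          # back to three flat rows
print(sorted(r["name"] for r in flat))  # ['Ada', 'Alan', 'Grace']
```

A recursive algebra of the kind the paper formalises applies such operators at arbitrary nesting depth; this sketch shows only one level.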
Abstract: The objectives of this research are to investigate the management patterns of Nakhon Pathom lodging entrepreneurs under the sufficiency economy approach, to identify the threats that affect this sector, and to design a fitting management model that sustains their businesses in the Nakhon Pathom style. What will happen if they do not use this approach? Will they face a financial crisis? The data and information were collected through informal discussions with 12 managers and 400 questionnaires, using a mixed method of qualitative and quantitative research; Bent Flyvbjerg's phronesis is utilized for the analysis. Our research aims to show that the sufficiency economy can help small business firms solve their problems. We expect the results to provide a financial model that addresses many of the entrepreneurs' problems and can serve as a model for other provinces of Thailand.
Abstract: The development of systems to aid medical diagnosis is not easy because of inhomogeneities in MRI data, the variability of the data from one sequence to another, and other sources of distortion that accentuate this difficulty. A new automatic, contextual, adaptive and robust segmentation procedure based on MRI brain tissue classification is described in this article. A first phase consists in estimating the probability density of the data by the Parzen-Rosenblatt method. The classification procedure is completely automatic and makes no assumptions about either the number of clusters or their prototypes, since the latter are detected automatically by a mathematical morphology operator, skeleton by influence zones (SKIZ) detection. The problem of initializing the prototypes, as well as their number, is thus transformed into an optimization problem. Moreover, the procedure is adaptive, since it takes into consideration the contextual information present at every voxel through an adaptive and robust non-parametric Markov random field (MRF) model. The number of misclassifications is reduced by using the Maximum Posterior Marginal (MPM) minimization criterion.
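The first phase, Parzen-Rosenblatt density estimation, can be sketched as follows; the Gaussian kernel, the bandwidth, and the one-dimensional toy intensities are illustrative assumptions rather than the paper's settings:

```python
import math

def parzen(samples, h):
    """Parzen-Rosenblatt estimate with a Gaussian kernel:
    p(x) = (1/n) * sum_i K((x - x_i) / h) / h."""
    n = len(samples)
    norm = h * math.sqrt(2 * math.pi)
    def p(x):
        return sum(math.exp(-0.5 * ((x - xi) / h) ** 2) / norm
                   for xi in samples) / n
    return p

# Toy voxel intensities drawn from two tissue classes: the estimated
# density is bimodal, higher at a class mode than in the gap between them.
intensities = [0.10, 0.12, 0.15, 0.80, 0.82, 0.85]
p = parzen(intensities, h=0.05)
print(p(0.12) > p(0.5))  # True
```

The modes of such an estimate are what a SKIZ-style morphological operator can then detect automatically as cluster prototypes.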
Abstract: Power Spectral Density (PSD) computed by taking the Fourier transform of the auto-correlation function (Wiener-Khintchine theorem) gives better results for noisy data than the Periodogram approach. However, the computational complexity of the Wiener-Khintchine approach is higher than that of the Periodogram approach. For the computation of the short-time Fourier transform (STFT), this problem becomes even more prominent, since the PSD must be recomputed after every shift of the analysis window. In this paper, a recursive version of the Wiener-Khintchine theorem is derived using the sliding DFT approach for the computation of the STFT. The computational complexity of the proposed recursive Wiener-Khintchine algorithm, for a window size of N, is O(N).
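The sliding-DFT building block on which such a derivation rests can be sketched directly: each DFT bin is updated with one complex multiply when the window shifts by one sample, giving O(N) per shift across all N bins. The signal values below are arbitrary:

```python
import cmath

def dft_bin(x, k):
    """Direct DFT of bin k over the current window (reference computation)."""
    N = len(x)
    return sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))

def sliding_dft_update(Xk, k, N, oldest, newest):
    """Recursive update when the window slides by one sample:
    X_k <- (X_k - x_oldest + x_newest) * e^{j 2 pi k / N}."""
    return (Xk - oldest + newest) * cmath.exp(2j * cmath.pi * k / N)

N, k = 8, 2
signal = [0.5, 1.0, -0.3, 0.7, 0.2, -1.0, 0.4, 0.9, -0.6, 0.1]
Xk = dft_bin(signal[:N], k)                              # first window
Xk = sliding_dft_update(Xk, k, N, signal[0], signal[N])  # slide by one sample
ref = dft_bin(signal[1:N + 1], k)                        # direct recomputation
print(abs(Xk - ref) < 1e-9)  # True: the recursion matches the direct DFT
```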
Abstract: This paper shows the possibility of using holographic interferometry to measure the temperature field in moving fluids. There are several methods for identifying velocity fields in fluids, such as LDA, PIV and hot-wire anemometry, but measuring the temperature field in moving fluids is very difficult. One of the frequently used methods is Constant Current Anemometry (CCA), a point temperature measurement method with which data can be acquired at frequencies up to 1000 Hz. This frequency can be a limiting factor for using CCA in fluids where the temperature changes rapidly. This shortcoming of CCA measurements can be overcome by optical methods such as holographic interferometry. It is necessary to employ a special holographic setup with double sensitivity, instead of the commonly used Mach-Zehnder type of holographic interferometer, in order to attain parameters sufficient for the studied case; this setup is less light-efficient than the Mach-Zehnder type, but has double the sensitivity. A special technique for acquiring and phase-averaging the holographic interferometry results is also presented. The results of the holographic interferometry experiments are compared with the temperature field obtained by the CCA method.
Abstract: The double heterogeneity of randomly located pebbles in the core and of Coated Fuel Particles (CFPs) within the pebbles is a specific feature of pebble bed reactors, and it is usually neglected because it is difficult to model with the capabilities of the MCNP code. In this study, the characteristics of HTR-10, the Tsinghua University research reactor, are used, and not only the double heterogeneity but also truncated CFPs and pebbles are considered. First, 8335 CFPs are distributed randomly in a pebble; then the reactor core is filled with these fuel pebbles and graphite moderator pebbles such that a 57:43 ratio of fuel to moderator pebbles is established. Finally, four different core configurations are modeled: a Simple Cubic (SC) structure with truncated pebbles, an SC structure without truncated pebbles, a Simple Hexagonal (SH) structure without truncated pebbles, and an SH structure with truncated pebbles. Results such as the effective multiplication factor (Keff) and critical height are compared with available data.
Abstract: This paper proposes a new concept for developing a collaborative design system. The conceptual framework applies a simulation of supply chain management to collaborative design, called the 'SCM-Based Design Tool'. The system is developed particularly to support design activities and to integrate all facilities together, with the aim of increasing design productivity and creativity; designers and customers can therefore collaborate through the system from the conceptual design stage onward. JAG, a Jewelry Art Generator based on artificial intelligence techniques, is integrated into the system. Moreover, the proposed system supports users with decision making and data propagation, covering the process from raw material supply to product delivery. Data management and information sharing are visually supported for designers and customers via the user interface. The system is developed in a Web-assisted product development environment. A prototype system is presented as a demonstration for the Thai jewelry industry, but it is applicable to other industries.
Abstract: Existing methods that store and reproduce the animation data of all frames, as in vertex animation, cannot be used in mobile device environments because they use large amounts of memory. 3D animation data reduction methods aimed at solving this problem have therefore been studied extensively, and we propose a new one as follows. First, we find and remove the frames in which motion changes are small, and store only the animation data of the remaining frames (those involving large motion changes). When the animation is played, the removed frame areas are reconstructed by interpolating the remaining frames. Our key contribution is to calculate the accelerations of the joints in individual frames, and the standard deviations of these accelerations, from the joint locations in the 3D model, in order to find and delete the frames in which motion changes are small. Our method can reduce data sizes by approximately 50% or more while providing quality not much lower than that of the original animations. It is therefore expected to be useful in mobile device environments and other environments in which memory is limited.
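A one-dimensional sketch of the selection-and-reconstruction step (a fixed acceleration threshold stands in for the paper's standard-deviation criterion, and the "joint" track values are made up):

```python
def accelerations(track):
    """Second finite difference of position at each interior frame."""
    return [track[i - 1] - 2 * track[i] + track[i + 1]
            for i in range(1, len(track) - 1)]

def keep_frames(track, thresh):
    """Keep the endpoints plus every frame whose acceleration magnitude
    exceeds the threshold; the other frames are dropped."""
    acc = accelerations(track)
    return [0] + [i for i in range(1, len(track) - 1)
                  if abs(acc[i - 1]) > thresh] + [len(track) - 1]

def reconstruct(track, kept):
    """Rebuild dropped frames by linear interpolation between kept frames."""
    out = list(track)
    for a, b in zip(kept, kept[1:]):
        for i in range(a + 1, b):
            t = (i - a) / (b - a)
            out[i] = track[a] * (1 - t) + track[b] * t
    return out

track = [0, 1, 2, 3, 4, 10, 16, 17, 18]  # steady motion, a jump, steady again
kept = keep_frames(track, thresh=0.5)
print(kept)                               # [0, 4, 6, 8]: 4 of 9 frames kept
print(reconstruct(track, kept) == track)  # True: exact for this toy track
```

Because the dropped frames here lie on straight segments, interpolation recovers them exactly while storing under half the frames; real joint trajectories incur a small error instead, which is the quality/size trade-off the paper measures.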
Abstract: This paper focuses on a novel method for semantic
searching and retrieval of information about learning materials.
Metametadata encapsulate metadata instances by using the properties
and attributes provided by ontologies rather than describing learning
objects. A novel metametadata taxonomy has been developed which
provides the basis for a semantic search engine to extract, match and
map queries to retrieve relevant results. The use of ontological views
is a foundation for viewing the pedagogical content of metadata
extracted from learning objects by using the pedagogical attributes
from the metametadata taxonomy. Using the ontological approach and metametadata (based on the metametadata taxonomy), we present a novel semantic searching mechanism. These three strands – the taxonomy, the ontological views, and the search algorithm – are incorporated into a novel architecture (OMESCOD) which has been implemented.
Abstract: Does the spatial perspective provide a common thread for rural sociology? Have rural sociologists succeeded in bringing order to their data using spatial analysis models and techniques? A trial answer to such questions, as touchstones of theoretical and applied sociological studies in rural areas, is the point at issue in the present paper. Spatial analyses have changed the way rural sociologists approach scientific problems. Rural sociology is spatial by nature because much, if not most, of its research topics have a spatial "awareness". However, such spatial awareness is not quite the same as spatial analysis, because it is not typically associated with underlying theories and hypotheses about spatial patterns that are designed to be tested for their specific spatial content. This paper presents pressing issues for future research to reintroduce mainstream rural sociology to the concept of space.
Abstract: Quick training algorithms and accurate solution procedures for incremental learning aim at improving the efficiency of SVR training, but both have disadvantages: the former do not converge for a changeable training set, and the latter are inefficient for a massive dataset. To handle these problems, a new training algorithm for a changeable training set, named the Approximation Incremental Training Algorithm (AITA), is proposed. This paper explores the reason for the non-convergence theoretically, discusses the realization of AITA, and finally demonstrates the benefits of AITA in both precision and efficiency.