Abstract: In this paper, we evaluate the performance of wavelet-based coding algorithms, namely 3D QT-L, 3D SPIHT and JPEG2K. In the first step, we carry out an objective comparison between the three coders. For this purpose, eight MRI head-scan test sets of 256 x 256 x 124 voxels have been used. Results show the superior performance of the 3D SPIHT algorithm, whereas 3D QT-L outperforms JPEG2K. The second step consists of evaluating the robustness of the 3D SPIHT and JPEG2K coding algorithms over wireless transmission. The compressed datasets are transmitted over an AWGN or a Rayleigh wireless channel. Results show the superiority of JPEG2K under both channel models; in fact, JPEG2K proves more robust to coding errors. We therefore conclude that error-correcting codes are necessary to protect the transmitted medical information.
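The abstract does not detail how the channel experiment is set up; as a rough, hedged illustration, the sketch below simulates BPSK transmission of a bitstream over an AWGN channel and counts bit errors. The modulation, SNR values and random bitstream are assumptions for illustration, not the paper's actual configuration.

```python
import math
import random

def awgn_bit_errors(bits, snr_db, seed=0):
    """Transmit BPSK symbols over an AWGN channel and count bit errors."""
    rng = random.Random(seed)
    snr = 10 ** (snr_db / 10)           # linear SNR (Es/N0)
    sigma = math.sqrt(1 / (2 * snr))    # noise std dev per real dimension
    errors = 0
    for b in bits:
        symbol = 1.0 if b else -1.0             # BPSK mapping
        received = symbol + rng.gauss(0, sigma)
        if (received > 0) != bool(b):           # hard decision
            errors += 1
    return errors

rng = random.Random(1)
bits = [rng.randrange(2) for _ in range(10000)]  # stand-in compressed stream
# errors grow as the SNR drops
print(awgn_bit_errors(bits, 10), awgn_bit_errors(bits, 0))
```

A Rayleigh channel would additionally multiply each symbol by a random fading gain before adding the noise; the same error-counting harness applies.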
Abstract: CT assessment of the postoperative spine is challenging in the presence of metal streak artifacts that can deteriorate the quality of CT images. In this paper, we studied the influence of different acquisition parameters on the magnitude of metal streaking. A water-bath phantom was constructed with a metal insertion similar to that encountered in postoperative spine assessment. The phantom was scanned with different acquisition settings, and the acquired data were reconstructed using various reconstruction settings. Standardized ROIs were defined within the streaking region for image analysis. The results show that increased kVp and mAs enhanced SNR values by reducing image noise. A sharper kernel enhanced image quality compared with a smooth kernel, but produced more noise in the images, with higher CT-number fluctuation. The noise between the two kernels was significantly different (P
Abstract: In visual servoing systems, the data obtained by vision is used for controlling robots. In this project, the simulator previously proposed for simulating the performance of a 6R robot was first examined in terms of software and testing, and its existing defects were remedied. In the first version of the simulation, the robot was directed toward the target object only with a position-based method, using two cameras in the environment. In the new version of the software, three cameras are used simultaneously. The camera installed as eye-in-hand on the end-effector of the robot is used for visual servoing with a feature-based method. The target object is recognized according to its characteristics, and the robot is directed toward the object by an algorithm similar to the function of human eyes. Then, the function and accuracy of the robot's operation are examined through a position-based visual servoing method using the two cameras installed as eye-to-hand in the environment. Finally, the obtained results are tested against the ANSI/RIA R15.05-2 standard.
Abstract: The paper discusses the mathematics of pattern indexing and its applications to the recognition of visual patterns found in video clips. It is shown that (a) pattern indexes can be represented by collections of inverted patterns, and (b) solutions to pattern classification problems can be found as intersections and histograms of inverted patterns, so that matching of the original patterns is avoided.
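As a toy illustration of claim (b), the sketch below builds an inverted index over hypothetical feature-set patterns and classifies a query via a vote histogram computed from the inverted lists, so the original patterns are never matched directly. The patterns and feature names are invented for illustration.

```python
from collections import Counter

# Hypothetical toy patterns described as sets of discrete features.
patterns = {
    "walk": {"f1", "f2", "f3"},
    "run":  {"f2", "f3", "f4"},
    "jump": {"f1", "f4", "f5"},
}

# Build the inverted index: feature -> ids of patterns containing it.
inverted = {}
for pid, feats in patterns.items():
    for f in feats:
        inverted.setdefault(f, set()).add(pid)

def classify(query_feats):
    """Vote via the inverted lists: a histogram of how many query
    features each stored pattern shares; originals are never scanned."""
    votes = Counter()
    for f in query_feats:
        for pid in inverted.get(f, ()):
            votes[pid] += 1
    return votes.most_common(1)[0][0] if votes else None

print(classify({"f2", "f3", "f4"}))  # -> run
```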
Abstract: Wireless sensor networks (WSN) consist of many sensor nodes deployed in unattended environments, such as military sites, in order to collect important information. It is very important to implement a secure protocol that prevents the forwarding of forged data and the modification of aggregated data, while keeping the delay and the communication, computation and storage overhead low. This paper presents a new protocol for concealed data aggregation (CDA). In this protocol, the network is divided into virtual cells, and the nodes within each cell produce a shared key for sending and receiving concealed data among themselves. Because data aggregation within each cell is performed locally and a secure authentication mechanism is implemented, the data aggregation delay is very low and malicious nodes cannot produce false data in the network. To evaluate the performance of our proposed protocol, we present computational models that demonstrate its performance and low overhead.
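The abstract does not reproduce its encryption scheme; the sketch below shows one standard CDA building block, additively homomorphic encryption in the style of Castelluccia et al., where the aggregator sums ciphertexts without learning any individual reading. The modulus, keys and sensor readings are illustrative assumptions.

```python
import random

M = 10**6  # modulus; must exceed the largest possible aggregate

def encrypt(m, key):
    # each node masks its reading with its key (mod M)
    return (m + key) % M

def aggregate(ciphertexts):
    # the aggregator only adds ciphertexts; it sees no plaintext
    return sum(ciphertexts) % M

def decrypt(agg, keys):
    # the sink, knowing all node keys, removes the combined mask
    return (agg - sum(keys)) % M

rng = random.Random(0)
readings = [21, 22, 19, 24]                  # node sensor values
keys = [rng.randrange(M) for _ in readings]  # per-node keys
agg = aggregate(encrypt(m, k) for m, k in zip(readings, keys))
print(decrypt(agg, keys))  # -> 86, the true sum, never seen in the clear
```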
Abstract: Design is the primordial phase in the realization of a computer system. Several tools have been used to help designers describe their software. These tools have enjoyed great success in the relational-database domain, since they can generate the SQL script modeling a database from an Entity/Association model. However, with the evolution of the computing domain, relational databases have shown their limits, and the object-relational model has become more and more widely used. Current design tools support neither all the new concepts introduced by this model nor the syntax of the SQL3 language. We propose in this paper a tool, called NAVIGTOOLS, that assists in the design and implementation of object-relational databases and allows the user to generate the script modeling his or her database in the SQL3 language. This tool is based on the Entity/Association and navigational models for modeling object-relational databases.
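As a hypothetical illustration of the kind of output such a tool produces, the sketch below generates an SQL3 (SQL:1999) object type and typed table from a toy entity description. The input format, entity and type names are invented for illustration and are not NAVIGTOOLS' actual model.

```python
def to_sql3(entity):
    """Emit an SQL:1999 structured type plus a typed table
    for one entity described as a name and attribute list."""
    name, attrs = entity["name"], entity["attributes"]
    cols = ",\n    ".join(f"{a} {t}" for a, t in attrs)
    return (
        f"CREATE TYPE {name}_t AS (\n    {cols}\n) NOT FINAL;\n"
        f"CREATE TABLE {name} OF {name}_t;"
    )

# Toy entity, roughly what an Entity/Association model might yield.
person = {"name": "Person",
          "attributes": [("id", "INTEGER"), ("name", "VARCHAR(40)")]}
print(to_sql3(person))
```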
Abstract: In many data mining applications, it is a priori known
that the target function should satisfy certain constraints imposed
by, for example, economic theory or a human decision maker. In this
paper we consider partially monotone prediction problems, where the
target variable depends monotonically on some of the input variables
but not on all. We propose a novel method to construct prediction
models, where monotone dependencies with respect to some of
the input variables are preserved by construction. Our
method belongs to the class of mixture models. The basic idea is to
convolute monotone neural networks with weight (kernel) functions
to make predictions. By using simulation and real case studies,
we demonstrate the application of our method. To obtain a sound
assessment of the performance of our approach, we use standard
neural networks with weight decay and partially monotone linear
models as benchmark methods for comparison. The results show that
our approach outperforms partially monotone linear models in terms
of accuracy. Furthermore, the incorporation of partial monotonicity
constraints not only leads to models that are in accordance with the
decision maker's expertise, but also reduces considerably the model
variance in comparison to standard neural networks with weight
decay.
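The paper's method convolutes monotone neural networks with kernel weight functions; the minimal sketch below (all parameter values invented) only illustrates the underlying ingredient of monotonicity "by construction", here obtained by passing selected weights through a softplus so they stay non-negative.

```python
import math

def softplus(z):
    # smooth non-negativity transform for constrained weights
    return math.log1p(math.exp(z))

def forward(x1, x2, params):
    """One-hidden-layer network, monotone non-decreasing in x1 by
    construction (non-negative weights along the x1 path); x2 enters
    unconstrained."""
    h = []
    for a, b, c in params["hidden"]:
        # softplus(a) >= 0 keeps each unit non-decreasing in x1
        h.append(math.tanh(softplus(a) * x1 + b * x2 + c))
    out = params["bias"]
    for w, hi in zip(params["out"], h):
        out += softplus(w) * hi     # non-negative output weights too
    return out

params = {"hidden": [(0.5, -0.3, 0.1), (1.0, 0.2, -0.2)],
          "out": [0.4, 0.7], "bias": 0.0}

# Monotonicity check in x1 at a fixed x2:
ys = [forward(x, 0.3, params) for x in (0.0, 0.5, 1.0, 2.0)]
print(all(a <= b for a, b in zip(ys, ys[1:])))  # -> True
```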
Abstract: The benefits of physical activity for children are promoted widely and well understood; however, the factors which impact on children's beliefs and attitudes towards physical education need to be explored in more detail. The purpose of this study was to evaluate how primary school children value and perceive their involvement in physical education (PE) classes through the use of drawings. While this type of data collection has been used previously to determine a child's response to specific health education classes, such as drug education, to the best of our knowledge it has not been used in the context of PE. Results from this study showed that kindergarten children found PE classes fun and engaging. Children in Year 4 and Year 6 were less satisfied with PE classes because of the activities offered, the lack of opportunity to play sport, and the perception that teachers did not appear to value this area of the curriculum.
Abstract: The novelty of this research is the application of a new fault detection and isolation (FDI) technique for the supervision of sensor networks in a transportation system. In measurement systems, it is necessary to detect all types of faults and failures based on a predefined algorithm. Recent advances in artificial neural networks (ANN) have led to their use for some FDI purposes. In this paper, the application of new probabilistic neural network features for data approximation and data classification is considered for the plausibility check in temperature measurement. For this purpose, a two-phase FDI mechanism was designed for residual generation and evaluation.
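A minimal sketch of such a two-phase mechanism (residual generation, then residual evaluation) for a temperature plausibility check; a simple moving-average predictor stands in for the paper's probabilistic neural network, purely for illustration, and the temperature trace and threshold are invented.

```python
def residuals(measured, window=3):
    """Phase 1: residual = measurement - model prediction
    (here the model is a trailing moving average)."""
    res = []
    for i in range(window, len(measured)):
        pred = sum(measured[i - window:i]) / window
        res.append(measured[i] - pred)
    return res

def evaluate(res, thresh=2.0):
    """Phase 2: flag residual indices that exceed the threshold."""
    return [i for i, r in enumerate(res) if abs(r) > thresh]

temps = [20.0, 20.1, 20.2, 20.1, 35.0, 20.2, 20.1]  # one faulty spike
print(evaluate(residuals(temps)))  # -> [1, 2, 3]
```

The spike at 35.0 is flagged directly (residual index 1), and it also corrupts the two following predictions, which is why indices 2 and 3 are flagged as its aftermath.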
Abstract: Imperfect knowledge cannot be avoided all the time. Imperfections may have several forms; uncertainties, imprecision and incompleteness. When we look to classification of methods for the management of imperfect knowledge we see fuzzy set-based techniques. The choice of a method to process data is linked to the choice of knowledge representation, which can be numerical, symbolic, logical or semantic and it depends on the nature of the problem to be solved for example decision support, which will be mentioned in our study. Fuzzy Logic is used for its ability to manage imprecise knowledge, but it can take advantage of the ability of neural networks to learn coefficients or functions. Such an association of methods is typical of so-called soft computing. In this study a new method was used for the management of imprecision for collected knowledge which related to economic analysis of construction industry in Turkey. Because of sudden changes occurring in economic factors decrease competition strength of construction companies. The better evaluation of these changes in economical factors in view of construction industry will made positive influence on company-s decisions which are dealing construction.
Abstract: The H.264/AVC standard is a highly efficient video codec providing high-quality video at low bit-rates. Because it employs advanced techniques, its computational complexity is high. This complexity is the major problem in the implementation of a real-time encoder and decoder. Parallelism is one approach to this problem and can be implemented on a multi-core system. We analyze macroblock-level parallelism, which ensures the same bit rate with high concurrency across processors. In order to reduce the encoding time, a dynamic data partition based on macroblock regions is proposed. This data partition has advantages in load balancing and data-communication overhead. Using the data partition, the encoder obtains more than a 3.59x speed-up on a four-processor system. This work can be applied to other multimedia processing applications.
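The dynamic partition itself is not detailed in the abstract; the sketch below only illustrates the general idea, cutting a frame's macroblock rows into contiguous regions balanced by a per-row cost estimate and processing the regions concurrently. The frame size, cost function and placeholder "encoder" are assumptions, not the paper's scheme.

```python
from concurrent.futures import ThreadPoolExecutor

MB_ROWS, MB_COLS = 8, 11   # hypothetical frame size in macroblocks

def row_cost(r):
    # stand-in complexity estimate per macroblock row (a real encoder
    # might use previous-frame statistics here)
    return 1 + (r % 3)

def partition_rows(n_rows, n_workers):
    """Greedy dynamic partition: cut contiguous row regions whose
    accumulated cost reaches an equal share of the total."""
    costs = [row_cost(r) for r in range(n_rows)]
    target = sum(costs) / n_workers
    regions, cur, acc = [], [], 0.0
    for r in range(n_rows):
        cur.append(r)
        acc += costs[r]
        if acc >= target and len(regions) < n_workers - 1:
            regions.append(cur)
            cur, acc = [], 0.0
    if cur:
        regions.append(cur)
    return regions

def encode_region(rows):
    # placeholder for real macroblock encoding work
    return sum(1 for _ in rows for _ in range(MB_COLS))

regions = partition_rows(MB_ROWS, 4)
with ThreadPoolExecutor(max_workers=4) as pool:
    encoded = sum(pool.map(encode_region, regions))
print(encoded)  # -> 88 (all 8 x 11 macroblocks processed)
```

A real H.264 encoder must additionally respect intra-prediction dependencies between neighbouring macroblocks, which this sketch ignores.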
Abstract: The purpose of the present study is to analyze the effect of the target plate's curvature on the heat transfer in laminar confined impinging jet flows. Numerical results from a two-dimensional compressible finite-volume solver are compared between three different shapes of impinging plates: flat, concave and convex. The remarkable result of this study is that the stagnation Nusselt number, in the laminar range of the Reynolds number based on the slot width, is maximum for the convex surface and minimum for the concave plate. These results contradict previous data in the literature stating that the stagnation Nusselt number is greater for a concave surface than for a flat-plate configuration.
Abstract: Emerging bio-engineering fields such as brain-computer interfaces, neuroprosthesis devices, and the modeling and simulation of neural networks have led to increased research activity in algorithms for the detection, isolation and classification of action potentials (AP) from noisy data trains. Current techniques in the field of 'unsupervised, no-prior-knowledge' biosignal processing include energy operators, wavelet detection and adaptive thresholding. These tend to be biased towards larger AP waveforms; APs may be missed owing to deviations in spike shape and frequency, and correlated noise spectra can cause false detections. Such algorithms also tend to suffer from large computational expense.
A new signal detection technique based upon the ideas of phase-space diagrams and trajectories is proposed, using a delayed copy of the AP to highlight discontinuities relative to background noise. This idea has been used to create algorithms that are computationally inexpensive and address the above problems.
Distinct APs have been picked out and manually classified from real physiological data recorded from a cockroach. To facilitate testing of the new technique, an autoregressive moving average (ARMA) noise model has been constructed based upon the background noise of the recordings. Together with the classified AP waveforms, this model enables the generation of realistic neuronal data sets at arbitrary signal-to-noise ratio (SNR).
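A minimal sketch of the delayed-copy idea: in the phase space (x[n], x[n-d]), background noise stays in a compact cloud while an AP traces a large excursion, so thresholding the excursion size is cheap. The delay, threshold factor and synthetic trace below are assumptions for illustration, not the recorded cockroach data or the paper's parameters.

```python
import math
import random

def detect_spikes(x, d=3, k=4.0):
    """Flag samples whose phase-space excursion |x[n] - x[n-d]|
    exceeds mean + k * std of all excursions."""
    diffs = [abs(x[n] - x[n - d]) for n in range(d, len(x))]
    mu = sum(diffs) / len(diffs)
    sd = math.sqrt(sum((v - mu) ** 2 for v in diffs) / len(diffs))
    thresh = mu + k * sd
    return [n + d for n, v in enumerate(diffs) if v > thresh]

# Synthetic trace: Gaussian noise with one inserted AP-like bump.
rng = random.Random(0)
x = [rng.gauss(0, 0.2) for _ in range(400)]
for i, amp in enumerate((1.5, 3.0, 1.5, -2.0, -1.0)):
    x[200 + i] += amp

print(detect_spikes(x))  # indices clustered around sample 200
```

Unlike amplitude thresholding, the statistic reacts to the rapid trajectory change itself, which is what makes the phase-space view robust to slow baseline drift.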
Abstract: In the present work, the problem of reconstructing the ITER fusion-plasma neutron source parameters using only the Vertical Neutron Camera data was solved. The possibility of neutron source parameter reconstruction was estimated by numerical simulations, and the adequacy of the mathematical model was analyzed. The neutron source was specified in a parametric form. A numerical analysis of the solution's stability with respect to data distortion was carried out. The influence of the data errors on the reconstructed parameters is shown:
• … is reconstructed with errors of less than 4% at all examined values of δ (up to 60%);
• … is determined with errors of less than 10% when δ does not exceed 5%;
• … is reconstructed with a relative error of more than 10%;
• the integral intensity of the neutron source is determined with a 10% error while the δ error is less than 15%;
where δ is the error of the signal measurements, (R0, Z0) is the plasma center position, and … is the parameter of the neutron source profile.
Abstract: This research aimed to study the competency of health and wellness hotels and resorts (HWHR) in developing the use of local natural resources and wisdom in line with the national health and wellness tourism (HWT) strategy, by comparing two independent samples, one from Aumpur Muang, Ranong province and one from Aumpur Muang, Chiangmai province. It also aimed to suggest a direct path for leading such organizations to sustainable success.
The research was conducted using a mixed methodology: both quantitative and qualitative data were used. Data on the competency of HWHR in developing the use of local natural resources for HWT promotion were collected via 300 questionnaires from 6 hotels and resorts in the 2 areas: 3 in Aumpur Muang, Ranong province and another 3 in Aumpur Muang, Chiangmai province.
The study of HWHR competency in developing the use of local natural resources and wisdom in line with the national HWT strategy can be divided into four main areas: food and beverage service, tourism activity, environmental service, and value adding. The total competency of the Chiangmai sample scored significantly higher than that of the Ranong sample (p-value 0.01), while in the area of safety, Chiangmai's competency scored significantly higher than Ranong's (p-value 0.05). The other areas were not rated differently. Since Chiangmai performs better, it can serve as a role model for developing an HWHR or HWT destination.
In the qualitative part of the research, a content analysis of the business and its environments was carried out. Four stages of strategic development and planning, from the smallest scale up to the largest (national) scale, were discussed, and an HWT evolution model and a strategy for the lodging business were suggested. All these stages must work together harmoniously. The most distinctive result illustrates the need for human resource development as the key to creating an identity of Thainess in health and wellness service provision. This will add value to the services and differentiate them from those of other competitors. The creation of a Thai health and wellness brand may increase customer loyalty, which is agreed to be a path toward sustainable development.
Abstract: In this paper we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of cryptographic algorithms are of great interest. Besides some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we computer-prove. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities). These are given in the formal language of the formal proof system Isabelle/HOL. Moreover, we computer-prove Bayes' formula. We then describe an application of the presented formalized probability distributions to cryptography. Furthermore, this paper shows that computer proofs of complex cryptographic functions are possible, by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards the computer verification of cryptographic primitives and describe a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research if the corresponding basic mathematical knowledge is available in a database.
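For readers unfamiliar with the algorithm being verified, the Miller-Rabin test can be sketched as follows; this is an ordinary, unverified Python rendering of the standard test, not the paper's Isabelle/HOL implementation.

```python
import random

def miller_rabin(n, rounds=20, rng=random.Random(0)):
    """Probabilistic primality test: False means n is certainly
    composite; True means n is probably prime (error < 4**-rounds)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7):          # dispose of small factors first
        if n % p == 0:
            return n == p
    # write n - 1 = 2^s * d with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = rng.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False            # a is a witness of compositeness
    return True

print([n for n in range(2, 40) if miller_rabin(n)])
# -> [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]
```

The formal-verification effort described above is precisely about proving properties of such an implementation (e.g. the error bound) mechanically rather than by inspection.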
Abstract: Object-relational databases (ORDB) are more complex in nature than traditional relational databases because they combine the characteristics of object-oriented concepts with the relational features of conventional databases. The design of an ORDB demands an efficient, quality schema that considers structural, functional and componential traits. This internal quality of the schema is assured by metrics that measure the relevant attributes. This is extended to substantiate the understandability, usability and reliability of the schema, thus assuring its external quality. This work institutes a formalization of ORDB metrics: metric definition, evaluation methodology and calibration of the metrics. Three ORDB schemas were used to conduct the evaluation and the formalization of the metrics. The metrics are calibrated using content- and criteria-related validity based on their measurability, consistency and reliability. Nominal and summative scales are derived from the evaluated metric values and standardized. Future work pertaining to ORDB metrics forms the concluding note.
Abstract: In order to define a new model of Tunisian foot sizes and to build the most comfortable shoes, Tunisian manufacturers must be able to offer their customers products that fit and adjust to the majority of the target population concerned. Moreover, the use of shoe models coming mainly from other countries causes a mismatch between the foot and the comfort of Tunisian shoes. Since every foot is unique, these models become uncomfortable for the Tunisian foot. We have a set of measurements produced from a 3D scan of the feet of a diverse population (women, men ...) and we analyze this data to define a foot model specific to Tunisian footwear design.
In this paper we propose two new approaches to modeling new foot sizes. We use neural networks, in particular the Kohonen network. Next, we combine neural networks with the concept of half-foot sizes to improve the models already found. Finally, we compare the results obtained by each approach and decide which approach gives the best foot model for producing more comfortable shoes.
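A hedged sketch of the Kohonen idea referred to above: a one-dimensional self-organizing map whose nodes converge toward prototype foot lengths (candidate "sizes"). The synthetic data, node count and learning schedule are illustrative assumptions, not the paper's 3D-scan measurements or training setup.

```python
import random

def train_som(data, n_nodes=5, epochs=200, seed=0):
    """Minimal 1-D Kohonen map: pull the best-matching unit (BMU)
    and its neighbors toward each sample, with decaying learning
    rate and shrinking neighborhood radius."""
    rng = random.Random(seed)
    lo, hi = min(data), max(data)
    nodes = [lo + (hi - lo) * i / (n_nodes - 1) for i in range(n_nodes)]
    for t in range(epochs):
        lr = 0.5 * (1 - t / epochs)                  # decaying rate
        radius = max(1, int(2 * (1 - t / epochs)))   # shrinking radius
        x = rng.choice(data)
        bmu = min(range(n_nodes), key=lambda i: abs(nodes[i] - x))
        for i in range(n_nodes):
            if abs(i - bmu) <= radius:
                nodes[i] += lr * (x - nodes[i])
    return nodes

# Synthetic "foot length" sample (cm), clustered around a few sizes.
rng = random.Random(1)
feet = [rng.gauss(mu, 0.3) for mu in (23, 24.5, 26, 27.5) for _ in range(30)]
sizes = train_som(feet)
print([round(s, 1) for s in sizes])  # ordered prototype lengths
```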
Abstract: A theory for optimal filtering of infinite sets of random
signals is presented. There are several new distinctive features of the
proposed approach. First, a single optimal filter for processing any
signal from a given infinite signal set is provided. Second, the filter is
presented in the special form of a sum with p terms where each term
is represented as a combination of three operations. Each operation
is a special stage of the filtering aimed at facilitating the associated
numerical work. Third, an iterative scheme is incorporated into the
filter structure to provide an improvement in the filter performance at
each step of the scheme. The final step of the scheme concerns signal
compression and decompression. This step is based on the solution of
a new rank-constrained matrix approximation problem. The solution
to the matrix problem is described in this paper. A rigorous error
analysis is given for the new filter.
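The paper's rank-constrained problem is more general than what can be shown here; as a minimal illustration of the low-rank idea behind the compression/decompression step, the sketch below computes the best rank-1 approximation of a matrix (in the Eckart-Young sense) by power iteration. The matrix and iteration count are illustrative assumptions.

```python
import math

def matvec(A, v):
    return [sum(a * b for a, b in zip(row, v)) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def rank1_approx(A, iters=100):
    """Best rank-1 approximation via power iteration on A^T A."""
    v = [1.0] * len(A[0])                 # start vector
    for _ in range(iters):
        w = matvec(transpose(A), matvec(A, v))
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]         # -> dominant right singular vector
    u = matvec(A, v)
    sigma = math.sqrt(sum(x * x for x in u))
    u = [x / sigma for x in u]            # matching left singular vector
    return [[sigma * ui * vj for vj in v] for ui in u]

A = [[3.0, 0.0], [0.0, 1.0]]
B = rank1_approx(A)
print([[round(x, 3) for x in row] for row in B])
# -> [[3.0, 0.0], [0.0, 0.0]]  (only the dominant singular direction kept)
```

Storing only u, sigma and v (compression) and forming sigma * u * v^T on demand (decompression) is the rank-constrained idea this step generalizes.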
Abstract: Longitudinal data typically exhibit changes over time, nonlinear growth patterns, between-subjects variability, and within-subject errors showing heteroscedasticity and dependence. Their exploration is more complicated than that of cross-sectional data. The purpose of this paper is to organize and integrate various visual-graphical techniques for exploring longitudinal data. By applying the proposed methods, investigators can answer research questions that include characterizing or describing the growth patterns at both the group and individual level, identifying the time points where important changes occur as well as unusual subjects, selecting suitable statistical models, and suggesting possible within-error variance structures.