Abstract: In this paper, the authors present the architecture of a multi-agent consultation system for obesity-related problems, which hybridizes expert system (ES) and intelligent agent (IA) technology. The strength of the ES, which is capable of pulling expert knowledge, is consulted and presented to the end user via the autonomous and friendly pushing environment of the intelligent agent.
Abstract: The aim of this paper is to study the internal
stabilization of the Bernoulli-Euler equation numerically. For this,
we consider a square plate subjected to a feedback/damping force
distributed only in a subdomain. An algorithm for obtaining an
approximate solution to this problem was proposed and implemented.
The numerical method used was the Finite Difference Method.
Numerical simulations were performed and showed the behavior of
the solution, confirming the theoretical results that have already been
proved in the literature. In addition, we validated the proposed
numerical scheme, analyzed the numerical error, and studied the decay
of the associated energy.
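As a one-dimensional analogue of this setup, the following is a minimal finite-difference sketch of a damped Euler-Bernoulli beam with the feedback/damping force supported only on a subdomain (the grid size, time step, damping coefficient, and initial data are illustrative choices, not those of the paper):

```python
import math

def beam_energy_decay(n=40, dt=2e-4, steps=5000, damping=8.0):
    """Explicit finite-difference sketch of a damped 1-D Euler-Bernoulli beam
    u_tt + u_xxxx + d(x) u_t = 0 on [0,1] with pinned ends, where the damping
    d(x) acts only on the subdomain [1/2, 1].
    Returns the discrete energy at the initial and final times."""
    dx = 1.0 / n
    d = [damping if i * dx >= 0.5 else 0.0 for i in range(n + 1)]
    u_prev = [math.sin(math.pi * i * dx) for i in range(n + 1)]  # initial shape
    u = u_prev[:]                                                # zero velocity

    def fourth_diff(v):
        # u_xxxx with ghost values enforcing u = u_xx = 0 at both ends
        ext = [-v[1]] + v + [-v[n - 1]]
        inner = [(ext[i] - 4 * ext[i + 1] + 6 * ext[i + 2]
                  - 4 * ext[i + 3] + ext[i + 4]) / dx ** 4 for i in range(n - 1)]
        return [0.0] + inner + [0.0]

    def energy(new, old):
        kinetic = sum(((a - b) / dt) ** 2 for a, b in zip(new, old)) * dx / 2
        ext = [-new[1]] + new + [-new[n - 1]]
        bending = sum(((ext[i] - 2 * ext[i + 1] + ext[i + 2]) / dx ** 2) ** 2
                      for i in range(n + 1)) * dx / 2
        return kinetic + bending

    e0 = energy(u, u_prev)
    for _ in range(steps):
        f = fourth_diff(u)
        # u_next = 2u - u_prev - dt^2 u_xxxx - dt d(x) (u - u_prev)
        u, u_prev = [2 * ui - pi - dt ** 2 * fi - dt * di * (ui - pi)
                     for ui, pi, fi, di in zip(u, u_prev, f, d)], u
    return e0, energy(u, u_prev)
```

With the damping active on half the domain, the discrete energy at the final time should fall well below its initial value, mirroring the decay behavior studied in the paper.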
Abstract: The challenge in image authentication is that in many cases images need to be subjected to non-malicious operations, such as compression, so authentication techniques need to be compression tolerant. In this paper we propose an image authentication system that is tolerant to JPEG lossy compression operations. A scheme for JPEG grey-scale images is proposed, using a data embedding method based on a secret key and a secret mapping vector in the frequency domain. An encrypted feature vector, extracted from the image DCT coefficients, is embedded redundantly and invisibly in the marked image. On the receiver side, the feature vector is derived again from the received image and compared against the extracted watermark to verify the image's authenticity. The proposed scheme is robust against JPEG compression up to a maximum compression of approximately 80%, but sensitive to malicious attacks such as cutting and pasting.
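To make the feature-extraction step concrete, here is a minimal sketch of computing a quantized low-frequency DCT feature per image block (the block size, number of retained coefficients, and quantization step are illustrative assumptions; the secret-key mapping and encryption of the actual scheme are omitted):

```python
import math

def dct2_ortho(block):
    """Orthonormal 2-D DCT-II of a square block (rows, then columns)."""
    n = len(block)
    def dct1(x):
        return [sum(x[i] * math.cos(math.pi / n * (i + 0.5) * k) for i in range(n))
                * math.sqrt((1 if k == 0 else 2) / n) for k in range(n)]
    rows = [dct1(r) for r in block]
    cols = [dct1([rows[i][j] for i in range(n)]) for j in range(n)]
    return [[cols[j][i] for j in range(n)] for i in range(n)]  # transpose back

def feature_vector(block, keep=3, q=16):
    """Coarsely quantized low-frequency DCT coefficients of one image block;
    the coarse quantization step q makes the feature tolerant to the mild
    re-quantization that JPEG compression applies."""
    c = dct2_ortho(block)
    return [round(c[u][v] / q) for u in range(keep) for v in range(keep)]
```

A feature vector of this kind, computed block by block, is stable under moderate lossy compression while changing when block content is cut or pasted.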
Abstract: Multimedia, as it stands now, is perhaps the most
diverse and rich culture around the globe. One of the major needs of
multimedia is a single system that enables people to efficiently
search through their multimedia catalogues. Many domain-specific
systems and architectures have been proposed, but no generic and
complete architecture has been proposed so far. In this paper, we
suggest a generic architecture for multimedia databases. The main
strengths of our architecture, besides being generic, are semantic
libraries to reduce the semantic gap, levels of feature extraction for
more specific and detailed feature extraction according to the classes
defined by the prior level, and the merging of two types of queries,
i.e. text and QBE (Query by Example), for more accurate yet detailed
results.
Abstract: Meshing is the process of discretizing the problem
domain into many subdomains before numerical calculation can be
performed. One of the most popular mesh types is the tetrahedral mesh,
due to its flexibility to fit almost any domain shape. In both 2D and
3D domains, triangular and tetrahedral meshes can be generated using
Delaunay triangulation. The quality of the mesh is an important factor
in performing any Computational Fluid Dynamics (CFD) simulation, as
the results are highly affected by it. Many efforts have been made to
improve mesh quality. This paper describes a mesh generation routine
that has been developed, capable of generating high-quality
tetrahedral cells in arbitrarily complex geometry. A few test cases
from CFD problems are used to test the mesh generator. The resulting
mesh is compared with one generated by a commercial software package.
The results show that no slivers exist in the generated meshes, and
the overall quality is acceptable, since the percentage of bad
tetrahedra is relatively small. Boundary recovery was also
successfully performed, with all missing faces rebuilt.
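A common way to quantify "bad tetrahedra" such as slivers is a normalized volume-to-edge-length shape measure; the following is a sketch of one standard metric (not necessarily the one used by the authors):

```python
import math

def tet_quality(p0, p1, p2, p3):
    """Normalized volume/edge-length quality: 1 for a regular tetrahedron,
    approaching 0 for a degenerate sliver."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    u, v, w = sub(p1, p0), sub(p2, p0), sub(p3, p0)
    vol = abs(dot(u, cross(v, w))) / 6.0                    # tetrahedron volume
    edges = [sub(b, a) for a, b in [(p0, p1), (p0, p2), (p0, p3),
                                    (p1, p2), (p1, p3), (p2, p3)]]
    l_rms = math.sqrt(sum(dot(e, e) for e in edges) / 6.0)  # RMS edge length
    return 6 * math.sqrt(2) * vol / l_rms ** 3
```

Cells scoring below a small threshold under such a measure would be counted as slivers or bad tetrahedra.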
Abstract: Although the usefulness of fuzzy databases has been
pointed out in several works, they are not fully developed in numerous
domains. A task that is mostly disregarded and which is the topic
of this paper is the determination of suitable inequalities for fuzzy
sets in fuzzy query languages. This paper examines which kinds
of fuzzy inequalities exist at all. Afterwards, different procedures
are presented that appear theoretically appropriate. By being applied
to various examples, their strengths and weaknesses are revealed.
Furthermore, an algorithm for an efficient computation of the selected
fuzzy inequality is shown.
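As one example of the kind of procedure compared, the classical possibility-based inequality for discrete fuzzy quantities can be sketched as follows (the sup-min composition is one of several theoretically appropriate choices):

```python
def poss_geq(a, b):
    """Possibility degree that fuzzy quantity A is >= fuzzy quantity B:
    Poss(A >= B) = sup over x >= y of min(A(x), B(y)),
    for discrete membership functions given as {value: membership} dicts."""
    return max((min(ma, mb)
                for x, ma in a.items()
                for y, mb in b.items() if x >= y),
               default=0.0)
```

In a fuzzy query language, such a degree (rather than a crisp True/False) would rank how well each tuple satisfies the inequality condition.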
Abstract: Noise level has critical effects on the diagnostic
performance of the signal-averaged electrocardiogram (SAECG), because
the true starting and end points of the QRS complex may be masked by
residual noise and are sensitive to the noise level. Several studies
and commercial machines have used a fixed number of heart beats
(typically between 200 and 600 beats) or set a predefined noise level
(typically between 0.3 and 1.0 μV) in each of the X, Y and Z leads to
perform SAECG analysis. However, the different criteria or methods
used to perform SAECG cause discrepancies in the noise levels among
study subjects. According to the recommendations of the 1991 ESC, AHA
and ACC Task Force Consensus Document on the use of SAECG, the
determination of onset and offset is closely related to the mean and
standard deviation of the noise sample. Hence, this study performs
SAECG using consistent root-mean-square (RMS) noise levels among study
subjects and analyzes the noise level effects on SAECG. It also
evaluates the differences between normal subjects and chronic renal
failure (CRF) patients in the time-domain SAECG parameters.
The study subjects were composed of 50 normal Taiwanese subjects and
20 CRF patients. During signal-averaged processing, different RMS
noise levels were adjusted to evaluate their effects on three
time-domain parameters: (1) the filtered total QRS duration (fQRSD),
(2) the RMS voltage of the last 40 ms of the QRS (RMS40), and (3) the
duration of the low-amplitude signals below 40 μV (LAS40). The results
demonstrated that reducing the RMS noise level can increase fQRSD and
LAS40 and decrease RMS40, and can further increase the differences in
fQRSD and RMS40 between normal subjects and CRF patients. The SAECG
may also become abnormal due to the reduction of the RMS noise level.
In conclusion, it is essential to establish diagnostic criteria for
SAECG using consistent RMS noise levels to reduce noise level effects.
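The idea of averaging beats until a consistent RMS noise level is reached, rather than averaging a fixed number of beats, can be sketched as follows (the noise-window position and length are illustrative assumptions):

```python
import math

def rms(segment):
    """Root-mean-square amplitude of a noise sample."""
    return math.sqrt(sum(v * v for v in segment) / len(segment))

def average_until_noise(beats, target_uv):
    """Keep averaging aligned beats until the RMS of an assumed noise window
    (here: the last 50 samples of the averaged beat) drops to the target
    level -- averaging to a consistent noise level, not a fixed beat count."""
    n, acc = 0, [0.0] * len(beats[0])
    for beat in beats:
        n += 1
        acc = [a + b for a, b in zip(acc, beat)]
        avg = [a / n for a in acc]
        if rms(avg[-50:]) <= target_uv:
            break
    return avg, n
```

Because averaging N beats reduces uncorrelated noise roughly by the square root of N, subjects with noisier recordings simply accumulate more beats before the same target level is reached.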
Abstract: A lot of work has been done on predicting the fault proneness of software systems. However, the severity of the faults matters more than the number of faults existing in the developed system, as major faults matter most to a developer and need immediate attention. In this paper, we try to predict the level of impact of the faults existing in software systems. Neuro-fuzzy based predictor models are applied to NASA's public-domain defect dataset, coded in the C programming language. Correlation-based Feature Selection (CFS) evaluates the worth of a subset of attributes by considering the individual predictive ability of each feature along with the degree of redundancy between them, so CFS is used to select the metrics most highly correlated with the level of severity of faults. The results are compared with the prediction results of Logistic Model Trees (LMT), which were earlier quoted as the best technique in [17]. The results are recorded in terms of Accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). They show that the neuro-fuzzy based model provides relatively better prediction accuracy than the other models and hence can be used to model the level of impact of faults in function-based systems.
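The CFS criterion described above has a closed form; a small sketch (the standard CFS merit formula; variable names are ours):

```python
import math

def cfs_merit(k, r_cf, r_ff):
    """Correlation-based Feature Selection merit of a k-feature subset, given
    the mean feature-class correlation r_cf and the mean feature-feature
    correlation r_ff. High merit favours subsets whose features correlate
    strongly with the class but weakly with each other."""
    return k * r_cf / math.sqrt(k + k * (k - 1) * r_ff)
```

A search procedure (greedy forward selection, for instance) would evaluate candidate metric subsets by this merit and keep the best one.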
Abstract: Discovering new biological knowledge from high-throughput biological data is a major challenge for bioinformatics today. To address this challenge, we developed a new approach to protein classification. Proteins that are evolutionarily, and thereby functionally, related are said to belong to the same classification. Identifying protein classification is of fundamental importance for documenting the diversity of the known protein universe. It also provides a means to determine the functional roles of newly discovered protein sequences. Our goal is to predict the functional classification of novel protein sequences based on a set of features extracted from each protein sequence. The proposed technique uses datasets extracted from the Structural Classification of Proteins (SCOP) database. A set of spectral-domain features based on the Fast Fourier Transform (FFT) is used. The proposed classifier uses a multilayer back-propagation (MLBP) neural network for protein classification. The maximum classification accuracy is about 91% when applying the classifier to the full four levels of the SCOP database; however, it reaches a maximum of 96% when limiting the classification to the family level. The classification results reveal that the spectral domain contains information that can be used for classification with high accuracy. In addition, the results emphasize that sequence similarity measures are of great importance, especially at the family level.
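A minimal sketch of deriving FFT-based spectral features from a protein sequence (here using the Kyte-Doolittle hydrophobicity scale as the numeric encoding and a direct DFT; the paper's actual encoding and feature set may differ):

```python
import cmath

# Kyte-Doolittle hydrophobicity values, one possible residue encoding.
SCALE = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
         'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
         'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
         'Y': -1.3, 'V': 4.2}

def spectral_features(seq, n_bins=8):
    """Magnitude spectrum of the encoded sequence, averaged into n_bins bins
    so proteins of different lengths yield equal-length feature vectors."""
    x = [SCALE[a] for a in seq]
    n = len(x)
    # magnitudes of DFT coefficients, skipping the DC term
    mags = [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(1, n // 2 + 1)]
    step = len(mags) / n_bins
    return [sum(mags[int(i * step):int((i + 1) * step)])
            / max(1, int((i + 1) * step) - int(i * step))
            for i in range(n_bins)]
```

The resulting fixed-length vector is the kind of input a multilayer back-propagation network can be trained on.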
Abstract: Ontologies are broadly used in the context of networked home environments. With ontologies it is possible to define and store context information, as well as to model different kinds of physical environments. Ontologies are central to networked home environments as they carry the meaning. However, ontologies and the OWL language are complex. Several ontology visualization approaches have been developed to enhance the understanding of ontologies. The domain of networked home environments sets some special requirements for the ontology visualization approach. The visualization tool presented here visualizes ontologies in a domain-specific way. It effectively represents the physical structures and spatial relationships of networked home environments. In addition, it provides extensive interaction possibilities for editing and manipulating the visualization. The tool shortens the path from beginner to intermediate OWL ontology reader by visualizing instances in their actual locations and by making OWL ontologies more interesting, more concrete, and above all easier to comprehend.
Abstract: In this paper a novel scheme for watermarking digital
audio during its compression to MPEG-1 Layer III format is
proposed. For this purpose we slightly modify some of the selected
MDCT coefficients, which are used during MPEG audio
compression procedure. Due to the possibility of modifying different
MDCT coefficients, there will be different choices for embedding the
watermark into audio data, considering robustness and transparency
factors. Our proposed method uses a genetic algorithm to select the
best coefficients to embed the watermark. This genetic selection is
done according to the parameters that are extracted from the
perceptual content of the audio to optimize the robustness and
transparency of the watermark. On the other hand, the watermark
security is increased due to the random nature of the genetic
selection. The information about the selected MDCT coefficients that
carry the watermark bits is saved in a database for future extraction
of the watermark. The proposed method is suitable for online MP3
stores to trace illegal copies of musical artworks. Experimental
results show that the detection ratio of the watermarks at the bitrate
of 128kbps remains above 90% while the inaudibility of the
watermark is preserved.
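The genetic selection of coefficient sets can be sketched generically as follows (a toy GA over index subsets; the paper's fitness would weigh robustness against transparency, while here it is any user-supplied function):

```python
import random

def ga_select(fitness, n_coeffs, n_select, pop=30, gens=60, pmut=0.1, seed=1):
    """Tiny genetic algorithm choosing which n_select of n_coeffs MDCT
    coefficients to modify. Individuals are sorted index tuples; `fitness`
    scores a candidate coefficient set (higher is better)."""
    rng = random.Random(seed)
    def rand_ind():
        return tuple(sorted(rng.sample(range(n_coeffs), n_select)))
    population = [rand_ind() for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop // 2]          # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            child = rng.sample(list(set(a) | set(b)), n_select)  # crossover
            if rng.random() < pmut:                              # point mutation
                child[rng.randrange(n_select)] = rng.randrange(n_coeffs)
            children.append(tuple(sorted(set(child)))
                            if len(set(child)) == n_select else rand_ind())
        population = parents + children
    return max(population, key=fitness)
```

Because the selection is seeded randomly per audio item, an attacker cannot predict which coefficients carry the watermark, which is the security property mentioned above.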
Abstract: Discourse pronominal anaphora resolution must be part of any efficient information processing system, since the reference of a pronoun depends on an antecedent located in the discourse. Contrary to knowledge-poor approaches, this paper shows that syntax-semantic relations are basic to pronominal anaphora resolution. The identification of quantified expressions to which pronouns can be anaphorically related provides further evidence that pronominal anaphora is based on domains of interpretation where asymmetric agreement holds.
Abstract: Because of its excellent properties, the SPIHT
algorithm, which is based on traditional wavelet transform theory, has
received much attention, but it also has its shortcomings. Combining
recent progress in the wavelet domain with the characteristics of the
human visual system, we propose an improved SPIHT algorithm based on
human visual characteristics, built on an analysis of the SPIHT
algorithm. Experiments indicate that coding speed and quality are well
enhanced compared to the original SPIHT algorithm, and the quality at
transmission cut-off is also improved.
Abstract: This paper describes an effective solution to the task
of remotely monitoring super-extended objects (oil and gas pipelines,
railways, national frontiers). The suggested solution is based on the
principle of simultaneously monitoring seismoacoustic and
optical/infrared physical fields. The principle of simultaneous
monitoring of those fields is not new, but in contrast to known
solutions the suggested approach allows super-extended objects to be
monitored with very limited operational costs. So-called C-OTDR
(Coherent Optical Time Domain Reflectometer) systems are used to
monitor the seismoacoustic field, and far-CCTV systems are used to
monitor the optical/infrared field. Simultaneous processing of the
data provided by both systems allows effective detection and
classification of target activities that appear in the vicinity of the
monitored objects. The results of practical usage have shown the high
effectiveness of the suggested approach.
Abstract: In this paper, we develop an accurate and efficient Haar wavelet method for the well-known FitzHugh-Nagumo equation. The proposed scheme can be applied to a wide class of nonlinear reaction-diffusion equations. The power of this manageable method is confirmed. Moreover, the Haar wavelet approach is found to be accurate, simple, fast, flexible, and convenient, with small computational cost, making it computationally attractive.
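The basis underlying such a scheme is easy to build; the following sketches the standard Haar matrix on collocation points (the construction is generic, not specific to this paper):

```python
def haar_matrix(m):
    """m x m Haar matrix whose rows are the Haar functions h_i sampled at the
    collocation points x_l = (l + 0.5)/m, with m a power of two -- the basis
    underlying Haar wavelet collocation schemes."""
    def h(i, x):
        if i == 1:                       # the constant scaling function
            return 1.0
        j = (i - 1).bit_length() - 1     # level: 2**j < i <= 2**(j+1)
        k = i - 2 ** j - 1               # translation within the level
        a, b, c = k / 2 ** j, (k + 0.5) / 2 ** j, (k + 1) / 2 ** j
        if a <= x < b:
            return 1.0
        if b <= x < c:
            return -1.0
        return 0.0
    return [[h(i, (l + 0.5) / m) for l in range(m)] for i in range(1, m + 1)]
```

Expanding the highest derivative in this basis and integrating, as Haar collocation methods do, reduces the nonlinear PDE to an algebraic system at the collocation points.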
Abstract: The spiral development model has been used
successfully in many commercial systems and in a good number of
defense systems. This is due to its cost-effective incremental
commitment of funds (via an analogy of the spiral model to stud poker)
and to the fact that it can be used to develop hardware or to
integrate software, hardware, and systems. To support adaptive,
semantic collaboration between domain experts and knowledge engineers,
a new knowledge engineering process, called Spiral_OWL, is proposed.
This model is based on the idea of iterative refinement, annotation
and structuring of the knowledge base. The Spiral_OWL model is
generated based on the spiral model and knowledge engineering
methodology. A central paradigm of the Spiral_OWL model is the
concentration on risk-driven determination of the knowledge
engineering process. The collaboration aspect comes into play during
the knowledge acquisition and knowledge validation phases. The design
rationale for the Spiral_OWL model is to be easy to implement and well
organized, with an iterative development cycle expanding like a
spiral.
Abstract: An envelope echo signal measurement is proposed in
this paper using echo signal observation from a 200 kHz echo
sounder receiver. The envelope signal without any object is compared
with the envelope signal of a sphere. Steel balls of two diameters
(3.1 cm and 2.2 cm) and air-filled stainless steel balls of two
diameters (4.8 cm and 7.4 cm) were used in this experiment. The
targets were positioned about 0.5 m and 1.0 m from the transducer face
using nylon rope. From the echo observation in the time domain, it is
clearly shown that the echo signal structure differs with the size,
distance and type of metal sphere. The amplitude of the envelope
voltage for the bigger sphere is higher compared to the small sphere,
which confirms that the bigger sphere has a higher target strength
than the small sphere. Although the signal structure without any
object differs from the signal from a sphere, the reflected signal
from the tank floor increases linearly with the sphere size. We
consider that this happens because the object is positioned close to
the tank floor.
Abstract: We provide a maximum norm analysis of a finite
element Schwarz alternating method for a nonlinear elliptic boundary
value problem of the form -Δu = f(u) on two overlapping subdomains
with non-matching grids. We consider a domain which is the union of
two overlapping subdomains, where each subdomain has its own
independently generated grid. The two meshes being mutually
independent on the overlap region, a triangle belonging to one
triangulation does not necessarily belong to the other one. Under a
Lipschitz assumption on the nonlinearity, we establish, on each
subdomain, an optimal L∞ error estimate between the discrete Schwarz
sequence and the exact solution of the boundary value problem.
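For a linear 1-D model problem -u'' = 1 (standing in for -Δu = f(u)), the Schwarz alternating iteration on two overlapping subdomains can be sketched as follows (matching grids here, purely for brevity):

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system; a = sub-, b = main, c = super-diagonal."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        den = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / den if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / den
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def schwarz_poisson(n=100, iters=30):
    """Schwarz alternating method for -u'' = 1 on (0,1), u(0) = u(1) = 0,
    alternately solving on the overlapping subdomains [0, 0.7] and [0.3, 1]
    with Dirichlet data taken from the latest global iterate."""
    h = 1.0 / n
    u = [0.0] * (n + 1)
    subdomains = [(0, int(0.7 * n)), (int(0.3 * n), n)]
    for _ in range(iters):
        for lo, hi in subdomains:
            m = hi - lo - 1                      # interior unknowns
            a = [-1.0 / h ** 2] * m
            b = [2.0 / h ** 2] * m
            c = [-1.0 / h ** 2] * m
            d = [1.0] * m                        # right-hand side f = 1
            d[0] += u[lo] / h ** 2               # boundary data from the
            d[-1] += u[hi] / h ** 2              # current global iterate
            u[lo + 1: hi] = thomas(a, b, c, d)
    return u, h
```

Since the exact solution u(x) = x(1-x)/2 is quadratic, the finite-difference solution is nodally exact, and the iterates contract geometrically at a rate determined by the size of the overlap.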
Abstract: The heart's electric field can be measured anywhere on
the surface of the body (ECG). When individuals touch, one person's
ECG signal can be registered in the other person's EEG and elsewhere
on his body. The aim of this study was to test the hypothesis
that physical contact (hand-holding) between two persons changes their
heart rate variability. The subjects were sixteen healthy females
(age: 20-26) divided into eight sets. Each set consisted of two
friends who had passed the intimacy test of J. Sternberg. The ECG of
the two subjects in each set was acquired for 5 minutes before
hand-holding (as the control condition) and for 5 minutes while they
held hands (as the experimental condition). Heart rate variability
signals were then extracted from the subjects' ECG and analyzed in a
linear feature space (time and frequency domain) and a nonlinear
feature space. Considering the results, we conclude that physical
contact (hand-holding of two friends) increases parasympathetic
activity, as indicated by increases in SD1, SD1/SD2, HF and MF power (p
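The Poincaré descriptors SD1 and SD2 referred to above are computed directly from successive RR intervals; a sketch:

```python
import math
import statistics

def poincare_sd1_sd2(rr):
    """Poincare-plot descriptors of heart rate variability from successive
    RR intervals (ms): SD1 captures short-term, parasympathetically linked
    variability; SD2 captures long-term variability."""
    diff = [b - a for a, b in zip(rr, rr[1:])]
    summ = [b + a for a, b in zip(rr, rr[1:])]
    sd1 = math.sqrt(statistics.pvariance(diff) / 2)
    sd2 = math.sqrt(statistics.pvariance(summ) / 2)
    return sd1, sd2
```

An increase in SD1 (and in the SD1/SD2 ratio), as reported here, is the standard nonlinear marker of increased parasympathetic activity.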
Abstract: This paper presents a novel approach for tuning a unified power flow controller (UPFC) based damping controller in order to enhance the damping of power system low-frequency oscillations. The design problem of the damping controller is formulated as an optimization problem over an eigenvalue-based objective function, which is solved using iteration particle swarm optimization (IPSO). The effectiveness of the proposed controller is demonstrated through eigenvalue analysis and nonlinear time-domain simulation studies under a wide range of loading conditions. The simulation study shows that the controller designed by IPSO performs better than CPSO in finding the solution. Moreover, the system performance analysis under different operating conditions shows that the δE-based controller is superior to the mB-based controller.
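A plain particle swarm optimizer of the kind underlying IPSO can be sketched as follows (a generic PSO minimizing a user-supplied objective; the iteration-PSO variant and the eigenvalue-based objective of the paper are not reproduced, and all parameter values are illustrative):

```python
import random

def pso(objective, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5,
        lo=-5.0, hi=5.0, seed=0):
    """Minimize `objective` over a box with a plain particle swarm:
    each particle tracks its own best (pbest) and the swarm best (gbest)."""
    rng = random.Random(seed)
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    pval = [objective(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        for i, x in enumerate(xs):
            for d in range(dim):
                # inertia + cognitive pull toward pbest + social pull to gbest
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - x[d])
                            + c2 * rng.random() * (gbest[d] - x[d]))
                x[d] = min(hi, max(lo, x[d] + vs[i][d]))
            v = objective(x)
            if v < pval[i]:
                pbest[i], pval[i] = x[:], v
                if v < gval:
                    gbest, gval = x[:], v
    return gbest, gval
```

In the controller-tuning setting, the decision vector would hold the damping-controller gains and time constants, and the objective would be the eigenvalue-based damping measure evaluated over the loading conditions.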