Abstract: The detection of arrhythmias and other life-threatening
cardiac conditions is highly important today, and such analysis
can be accomplished by advanced non-linear processing methods
that accurately characterize the complex signals of heartbeat
dynamics. In this perspective, recent developments in the field of
multiscale information content have led to the Microcanonical
Multiscale Formalism (MMF). We show that this framework provides
several signal analysis techniques that are especially suited to
the study of heartbeat dynamics. In this paper, we present first
results on whether heartbeat dynamics signals possess multiscale
properties, by computing local predictability exponents (LPEs) and
the Unpredictable Points Manifold (UPM), and thereby the
singularity spectrum.
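The local predictability exponent is, roughly, a local scaling (Hölder-type) exponent of the signal. As a minimal illustration of the underlying idea only (not the MMF estimator, which is considerably more elaborate), a local exponent can be read off as the slope of a log-log regression of increment size against scale:

```python
import math

def local_exponent(signal, t, scales=(1, 2, 4, 8)):
    # Log-log regression of local increment size against scale:
    # |s(t + r) - s(t)| ~ r**h(t), so h(t) is the regression slope.
    xs, ys = [], []
    for r in scales:
        if t + r < len(signal):
            inc = abs(signal[t + r] - signal[t]) + 1e-12  # guard against log(0)
            xs.append(math.log(r))
            ys.append(math.log(inc))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

On a smooth linear ramp the increments grow as r to the first power, so the estimated exponent is 1; rougher signal segments yield smaller exponents.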
Abstract: Aerial and satellite images are information-rich but complex to analyze. Many GIS features require fast and reliable extraction of roads and intersections. In this paper, we study efficient and reliable automatic extraction algorithms that address difficulties commonly seen in high-resolution aerial and satellite images yet not well handled by existing solutions, such as blurring, broken or missing road boundaries, lack of road profiles, heavy shadows, and interfering surrounding objects. The new scheme is based on a new construct, the reference circle, to properly identify the pixels that belong to the same road and to use this information to recover the whole road network. This feature is invariant to the shape and direction of roads and tolerates heavy noise and disturbances. Road extraction based on reference circles is much more noise-tolerant and flexible than previous edge-detection-based algorithms. The scheme is able to extract roads reliably from images with complex content and heavy obstructions, such as the high-resolution aerial/satellite images available from Google Maps.
Abstract: High-voltage power transmission lines are the backbone
of electrical power utilities. The stability and continuous
monitoring of this critical infrastructure is pivotal. Nine-Sigma,
representing Eskom Holdings SOC Limited, South Africa, has posed a
major challenge: the proactive detection of fallen power lines and
the real-time measurement of conductor sagging and slipping. The
main objective of this research is to apply RFID technology to
solve this challenge. Various options and technologies, such as
GPS, PLC, image processing and MR sensors, have been reviewed and
their drawbacks identified. The potential of RFID to provide
precise measurements is examined and presented. Future research
will look at magnetic and electrical interference as well as the
corona effect on the technology.
Abstract: The design of a complete expansion that allows for
compact representation of certain relevant classes of signals is a
central problem in signal processing applications. Achieving such a
representation means knowing the signal features for the purposes of
denoising, classification, interpolation and forecasting. Multilayer
Neural Networks are a relatively new class of techniques that are
mathematically proven to approximate any continuous function
arbitrarily well. Radial Basis Function (RBF) Networks, which make
use of Gaussian activation functions, have also been shown to be
universal approximators. In this age of ever-increasing digitization
in the storage, processing, analysis and communication of
information, there are numerous applications where one needs to
construct a continuously defined function or numerical algorithm to
approximate, represent and reconstruct the given discrete data of a
signal. Often one wishes to manipulate the data in a way that
requires information not included explicitly in the data, which is
done through interpolation and/or extrapolation.
Tidal data are a prime example of a time series, and many
statistical techniques have been applied to tidal data analysis and
representation; ANNs are a recent addition to these techniques. In
the present paper we describe the time series representation
capabilities of a special type of ANN, the Radial Basis Function
network, and present the results of tidal data representation using
RBF networks. Tidal data analysis and representation is one of the
important requirements in marine science for forecasting.
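The representation step can be sketched as exact RBF interpolation: place one Gaussian basis function on each sample and solve a linear system for the weights. This is a minimal illustration, not the paper's trained network; the sample values below are made-up stand-ins for tide heights.

```python
import math

def solve(A, b):
    # Gaussian elimination with partial pivoting for the small dense system A w = b.
    n = len(b)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (M[r][n] - sum(M[r][c] * w[c] for c in range(r + 1, n))) / M[r][r]
    return w

def rbf_fit(xs, ys, sigma=1.0):
    # Exact-interpolation weights: one Gaussian basis function centred on each sample.
    phi = lambda a, b: math.exp(-((a - b) ** 2) / (2 * sigma ** 2))
    A = [[phi(xi, xj) for xj in xs] for xi in xs]
    return solve(A, ys)

def rbf_eval(xs, w, sigma, x):
    # Network output: weighted sum of the Gaussian basis functions at x.
    return sum(wi * math.exp(-((x - xi) ** 2) / (2 * sigma ** 2))
               for wi, xi in zip(w, xs))

# Made-up samples standing in for observed tide heights at hourly steps.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 0.8, 0.9, 0.1, -0.8]
w = rbf_fit(xs, ys, sigma=1.0)
```

By construction the fitted network reproduces the training samples exactly; intermediate times are interpolated smoothly by the Gaussian basis.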
Abstract: This paper presents a forgetting factor scheme for variable step-size affine projection algorithms (APA). The proposed scheme uses a forgetting-processed input matrix as the projection matrix of the pseudo-inverse to estimate the system deviation. This method introduces inhomogeneous temporal weights into the projection matrix, which typically models the real error's behavior better than homogeneous temporal weights do. Regularization overcomes the ill-conditioning introduced by both the forgetting process and the increasing size of the input matrix. The algorithm is tested in independent trials with coloured input signals and various parameter combinations. Results show that the proposed algorithm is superior in terms of convergence rate and misadjustment compared to existing algorithms. As a special case, a variable step-size NLMS with a forgetting factor is also presented in this paper.
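The NLMS special case mentioned at the end can be sketched as follows: a standard NLMS update in which the normalizing input power is an exponentially forgotten estimate (factor `lam`) rather than the instantaneous tap-vector norm. This is a simplification, not the paper's full APA scheme, and the channel `h` below is a made-up example.

```python
import random

def nlms_forgetting(x, d, order=4, mu=0.2, lam=0.9, eps=1e-6):
    # NLMS where the normalising power p is an exponentially forgotten
    # estimate of the input energy instead of the instantaneous norm.
    w = [0.0] * order
    p = eps
    for n in range(order, len(x)):
        u = [x[n - 1 - k] for k in range(order)]            # most recent tap first
        e = d[n] - sum(wk * uk for wk, uk in zip(w, u))     # a-priori error
        p = lam * p + (1 - lam) * sum(uk * uk for uk in u)  # forgotten power
        w = [wk + mu * e * uk / (p + eps) for wk, uk in zip(w, u)]
    return w

# Identify a known (made-up) 4-tap FIR channel from noise-free data.
rng = random.Random(1)
h = [0.5, -0.3, 0.2, 0.1]
x = [rng.gauss(0.0, 1.0) for _ in range(3000)]
d = [0.0] * len(x)
for n in range(4, len(x)):
    d[n] = sum(h[k] * x[n - 1 - k] for k in range(4))
w = nlms_forgetting(x, d)
```

With noise-free data the adapted weights converge to the channel taps; the forgetting factor trades tracking speed against the variance of the power estimate.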
Abstract: As many scientific applications require large-scale data processing, the importance of parallel I/O has been increasingly recognized. Collective I/O is one of the notable features of parallel I/O, enabling application programmers to easily handle their large data volumes. In this paper we measure and analyze the performance of original collective I/O and of the subgroup method, a way of using MPI collective I/O effectively. From the experimental results, we found that the subgroup method shows good performance for small data sizes.
Abstract: Tolerance is a tool for achieving social cohesion, particularly among individuals and groups with different values. The aim of this work is to study the characteristics of the ethnic tolerance of the inhabitants of Latvia. Ethnic tolerance is treated as a set of conscious and unconscious orientations of the individual in social interaction and inter-ethnic communication. The work uses the tools of empirical study of ethnic tolerance, which allow the explicit and implicit levels of the emotional component of Latvia's residents to be identified. Explicit measurements were made using self-report techniques, which yielded indices of the ethnic tolerance and the ethnic identity of the participants. The implicit component was studied using methods based on the emotional priming effect. During the processing of the results, indicators of positive and negative implicit attitudes towards members of one's own and other ethnicities were calculated, as well as the explicit parameters of the ethnic tolerance and the ethnic identity of Latvia's residents. The implicit measurements of the attitudes of neighboring ethnic groups toward each other showed a mutual negative attitude, whereas the explicit measurements indicate a neutral attitude. The data obtained contribute to further study of the ethnic tolerance of Latvia's residents.
Abstract: Digital watermarking provides secure multimedia data communication in addition to its copyright protection role. The spread spectrum (SS) modulation principle is widely used in digital watermarking to make multimedia signals robust against various signal-processing operations. Several SS watermarking algorithms have been proposed for multimedia signals, but very few works have discussed the issues responsible for secure data communication and its robustness improvement. The current paper critically analyzes several such factors, namely the properties of the spreading codes, signal decomposition suitable for data embedding, the security provided by the key, the successive bit cancellation method applied at the decoder (which has a great impact on detection reliability), and the secure communication of a significant signal under the camouflage of insignificant signals. Based on this analysis, a robust SS watermarking scheme for secure data communication is proposed in the wavelet domain, and improvements in secure communication and robustness performance are reported through experimental results. The reported results also show improvement in the visual and statistical invisibility of the hidden data.
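The SS principle underlying such schemes can be sketched in a few lines: embed one bit additively with a key-seeded pseudo-noise code and detect it by correlation. This is a time-domain toy version only; the paper's scheme operates in the wavelet domain with further refinements, and the host signal below is made-up.

```python
import random

def pn_sequence(key, n):
    # Key-seeded pseudo-noise spreading code of +/-1 chips.
    rng = random.Random(key)
    return [rng.choice((-1.0, 1.0)) for _ in range(n)]

def embed(host, bit, key, alpha=0.5):
    # Additive spread-spectrum embedding: y = x + alpha * b * c.
    code = pn_sequence(key, len(host))
    return [x + alpha * bit * c for x, c in zip(host, code)]

def detect(received, key):
    # Correlation detector: the sign of <y, c> decides the hidden bit.
    code = pn_sequence(key, len(received))
    corr = sum(y * c for y, c in zip(received, code)) / len(received)
    return 1 if corr >= 0.0 else -1

# Made-up host samples standing in for transform coefficients of real media.
rng = random.Random(7)
host = [rng.gauss(0.0, 1.0) for _ in range(1000)]
```

The host contributes only a small zero-mean term to the correlation (shrinking as the code length grows), so the embedded bit dominates the detector output when the correct key regenerates the code.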
Abstract: Knowledge is indispensable, but voluminous knowledge becomes a bottleneck for efficient processing. A great challenge for data mining is the large number of potential rules generated by the mining process; in fact, the result is sometimes comparable in size to the original data. Traditional data mining pruning activities, such as support thresholds, do not sufficiently reduce the huge rule space. Moreover, many practical applications are characterized by continual change of data and knowledge, making the knowledge more voluminous with each change. The most predominant representation of the discovered knowledge is the standard Production Rule (PR) of the form If P Then D. Michalski & Winston proposed Censored Production Rules (CPRs) as an extension of production rules that exhibits variable precision and supports an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form If P Then D Unless C, where C (the censor) is an exception to the rule. Such rules are employed in situations in which the conditional statement 'If P Then D' holds frequently and the assertion C holds rarely. By using a rule of this type we are free to ignore the exception condition when the resources needed to establish its presence are tight or there is simply no information available as to whether it holds or not. Thus the 'If P Then D' part of the CPR expresses important information, while the Unless C part acts only as a switch that changes the polarity of D to ~D. In this paper a scheme based on the Dempster-Shafer Theory (DST) interpretation of a CPR is suggested for discovering CPRs from the discovered flat PRs. The discovery of CPRs from flat rules results in considerable reduction of the already discovered rules. The proposed scheme incrementally incorporates new knowledge and also reduces the size of the knowledge base considerably with each episode. Examples are given to demonstrate the behaviour of the proposed scheme.
The suggested cumulative learning scheme would be useful in mining data streams.
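The If-P-Then-D-Unless-C structure described above can be sketched directly. This is a toy representation of rule evaluation only; the paper's DST-based discovery scheme is not reproduced here, and the bird/penguin rule is the classic illustrative example.

```python
class CensoredRule:
    # "If P Then D Unless C": the rule fires on P; the censor, when it is
    # known to hold, flips the decision's polarity (D becomes ~D).
    def __init__(self, premise, decision, censor):
        self.premise, self.decision, self.censor = premise, decision, censor

    def infer(self, facts, censor_known=True):
        if not self.premise(facts):
            return None                       # rule does not fire
        if censor_known and self.censor(facts):
            return ("not", self.decision)     # Unless-part switches D to ~D
        return self.decision                  # censor unknown or absent

# Classic example: birds fly, unless the bird is a penguin.
flies = CensoredRule(lambda f: "bird" in f,
                     "flies",
                     lambda f: "penguin" in f)
```

When resources are tight, the caller can skip the censor check (`censor_known=False`) and fall back on the frequently correct If-Then part, which is exactly the trade-off the abstract describes.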
Abstract: Hand gesture recognition is an active area of research
in the vision community, mainly for the purposes of sign language
recognition and Human Computer Interaction. In this paper, we
propose a system to recognize alphabet characters (A-Z) and numbers
(0-9) in real time from stereo color image sequences using Hidden
Markov Models (HMMs). Our system is based on three main stages:
automatic segmentation and preprocessing of the hand regions,
feature extraction, and classification. In the automatic
segmentation and preprocessing stage, color and a 3D depth map are
used to detect the hands, whose trajectory is then tracked using
the Mean-shift algorithm and a Kalman filter. In the feature
extraction stage, 3D combined features of location, orientation and
velocity with respect to Cartesian coordinate systems are used, and
k-means clustering is then employed to obtain the HMM codewords. In
the final classification stage, the Baum-Welch algorithm is used to
fully train the HMM parameters, and gestures of alphabet characters
and numbers are recognized using the Left-Right Banded model in
conjunction with the Viterbi algorithm. Experimental results
demonstrate that our system can successfully recognize hand
gestures with a 98.33% recognition rate.
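The decoding step of the final stage can be sketched as a plain Viterbi search over a Left-Right Banded HMM. The probabilities below are toy values; in the real system the models are trained with Baum-Welch on the k-means codewords of the extracted features.

```python
def viterbi(obs, A, B, pi):
    # Most likely state path through a discrete HMM (here a Left-Right
    # Banded model: each state may only stay or advance to its successor).
    n = len(pi)
    V = [[pi[s] * B[s][obs[0]] for s in range(n)]]
    back = []
    for o in obs[1:]:
        row, ptr = [], []
        for s in range(n):
            prev = max(range(n), key=lambda p: V[-1][p] * A[p][s])
            row.append(V[-1][prev] * A[prev][s] * B[s][o])
            ptr.append(prev)
        V.append(row)
        back.append(ptr)
    state = max(range(n), key=lambda s: V[-1][s])
    path = [state]
    for ptr in reversed(back):   # follow back-pointers to recover the path
        state = ptr[state]
        path.append(state)
    return path[::-1]

A = [[0.5, 0.5, 0.0],   # banded transitions: stay or advance
     [0.0, 0.5, 0.5],
     [0.0, 0.0, 1.0]]
B = [[0.8, 0.1, 0.1],   # state s mostly emits codeword s
     [0.1, 0.8, 0.1],
     [0.1, 0.1, 0.8]]
pi = [1.0, 0.0, 0.0]    # left-right models start in the first state
```

For a codeword sequence that drifts through the states in order, the decoded path advances monotonically, which is what makes the banded topology a natural fit for gesture trajectories.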
Abstract: The paper presents the design concept of a
unit-selection text-to-speech synthesis system for the Slovenian
language. Due to its modular and upgradable architecture, the
system can be used in a variety of speech user interface
applications, ranging from carrier-grade server voice portal
applications and desktop user interfaces to specialized embedded
devices.
Since memory and processing power requirements are important
factors for a possible implementation in embedded devices, the
lexica and speech corpora need to be reduced. We describe a simple
and efficient implementation of a greedy subset selection algorithm
that extracts a compact subset of text sentences with high
coverage. An experiment on a reference text corpus showed that the
subset selection algorithm produced a compact sentence subset with
low redundancy.
The adequacy of the spoken output was evaluated by several
subjective tests, as recommended by the International
Telecommunication Union (ITU).
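The greedy subset selection described above is essentially the greedy set-cover heuristic: repeatedly pick the sentence that covers the most still-uncovered units. A minimal sketch, using character bigrams as a made-up stand-in for the diphone/triphone units of a real speech corpus:

```python
def greedy_subset(sentences, units_of):
    # Greedy set cover: at each step take the sentence with the largest
    # number of not-yet-covered units, until full coverage (or no gain).
    target = set().union(*(units_of(s) for s in sentences))
    covered, chosen = set(), []
    pool = list(sentences)
    while pool and covered != target:
        best = max(pool, key=lambda s: len(units_of(s) - covered))
        gain = units_of(best) - covered
        if not gain:
            break
        chosen.append(best)
        covered |= gain
        pool.remove(best)
    return chosen

def bigrams(s):
    # Toy unit inventory: character bigrams stand in for diphones.
    return {s[i:i + 2] for i in range(len(s) - 1)}

corpus = ["abc", "bcd", "ab", "cd"]
subset = greedy_subset(corpus, bigrams)
```

On this toy corpus two sentences already cover every bigram, so the redundant short sentences are dropped, mirroring the compact low-redundancy subset reported in the experiment.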
Abstract: In the present work, a comparative study of the
microstructure and mechanical properties of as-cast, cast-aged and
forge-aged A356 alloy has been carried out. The study reveals that
the mechanical properties of A356 alloy are highly influenced by
melt treatment and solid-state processing. Cast-aged alloys achieve
the highest strength and hardness compared to as-cast and
forge-aged ones, and those treated with a combined addition of
grain refiners and modifiers achieve the maximum strength and
hardness. Cast-aged A356 alloy also possesses higher wear
resistance than as-cast and forge-aged alloys. Forging improves
both the strength and the ductility of the alloys over as-cast
ones; however, the improvement in ductility is perceptible only for
properly grain-refined and modified alloys. Alloys refined with
0.65% Al-3Ti show the highest improvement in ductility, while those
treated with 0.20% Al-10Sr exhibit less improvement.
Abstract: Memory forensics is important in digital investigation.
It is based on the data stored in physical memory, which involves
memory management and processing time. However, current forensic
tools do not consider efficiency in terms of storage management and
processing time. This paper shows the high redundancy of the data
found in physical memory, which causes inefficiency in processing
time and memory management. The experiment was done using the
Borland C compiler on Windows XP with 512 MB of physical memory.
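The redundancy in question can be measured by hashing fixed-size pages of a raw memory image and counting repeated digests; duplicated pages need only be stored once. This is a simplified sketch, not the paper's tooling, and the image bytes below are made-up.

```python
import hashlib

def page_redundancy(dump, page_size=4096):
    # Hash every page of a raw memory image; identical digests indicate
    # duplicated content that a forensic tool could deduplicate.
    seen, duplicates = {}, 0
    total = len(dump) // page_size
    for i in range(total):
        digest = hashlib.sha256(dump[i * page_size:(i + 1) * page_size]).digest()
        if digest in seen:
            duplicates += 1
        else:
            seen[digest] = i      # remember first occurrence of this page
    return duplicates, total

# Made-up image: three identical zero-filled pages plus one distinct page.
image = b"\x00" * (3 * 4096) + bytes(range(256)) * 16
```

The ratio of duplicated pages to total pages quantifies how much storage and processing a deduplicating tool could save.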
Abstract: The main idea behind in-network aggregation is that,
rather than sending individual data items from sensors to sinks,
multiple data items are aggregated as they are forwarded by the
sensor network. Existing sensor network data aggregation techniques
assume that the nodes are preprogrammed and send data to a central
sink for offline querying and analysis. This approach has two major
drawbacks. First, the system behavior is preprogrammed and cannot
be modified on the fly. Second, the increased energy wastage due to
the communication overhead decreases the overall system lifetime.
Energy conservation is therefore of prime consideration in sensor
network protocols in order to maximize the network's operational
lifetime. In this paper, we give an energy-efficient approach to
query processing by implementing new optimization techniques
applied to in-network aggregation. We first discuss earlier
approaches to sensor data management and highlight their
disadvantages. We then present our approach, "Energy Efficient
Indexed Aggregation" (EEIA), and evaluate it through several
simulations to prove its efficiency, competence and effectiveness.
Abstract: Successful intelligence (SI) is the integrated set of
abilities needed to attain success in life within an individual's
sociocultural context. People are successfully intelligent when
they recognize their strengths and weaknesses, find ways to
strengthen their weaknesses, and maintain or even improve their
strengths. Successfully intelligent people can shape, select, and
adapt to their environments by using a balance of higher-order
thinking abilities: critical, creative, and applicative. Aims: The
purposes of this study were to 1) develop a curriculum that
promotes SI for nursing students, and 2) study the effectiveness of
the developed curriculum. Method: A research and development method
was used for this study. The design was divided into two phases: 1)
curriculum development, composed of three steps (needs assessment,
curriculum development and curriculum field trial), and 2)
curriculum implementation. In the second phase, a pre-experimental
research design (one-group pretest-posttest design) was conducted.
The sample consisted of 49 sophomore nursing students of
Boromarajonani College of Nursing, Surin, Thailand, who enrolled in
the Nursing Care of Health Problems I course in the 2011 academic
year. Data were carefully collected using 4 instruments: 1) a
modified essay questions test (MEQ), 2) a nursing care plan
evaluation form, 3) a group processing observation form (α = 0.74)
and 4) a learning satisfaction evaluation form (α = 0.82). Data
were analyzed using descriptive statistics and content analysis.
Results: The results revealed that the sample had a post-test
average SI score higher than the pre-test average score (mean
difference 5.03, S.D. = 2.84). Fifty-seven percent of the sample
passed the MEQ posttest at the 60-percent criterion. Students
demonstrated strategies for developing a nursing care plan.
Overall, students' satisfaction with the teaching performance was
at a high level (mean = 4.35, S.D. = 0.46). Conclusion: This
curriculum can promote the characteristics of a successfully
intelligent person, and its continuation is highly recommended.
Abstract: This paper presents modern vibration signal-processing
techniques for vehicle gearbox fault diagnosis via wavelet analysis
and the Squared Envelope (SE) technique. Wavelet analysis is
regarded as a powerful tool for the detection of sudden changes in
non-stationary signals, while the Squared Envelope technique has
been used extensively for rolling bearing diagnostics. In the
present work, a scheme using the Squared Envelope technique for the
early detection of gear tooth pitting is presented. The pitting
defect is manufactured on the tooth side of a fifth-speed gear on
the intermediate shaft of a vehicle gearbox. The objective is to
supplement the current techniques of gearbox fault diagnosis based
on the raw vibration and ordered signals. The test stand is
equipped with three dynamometers: the input dynamometer serves as
the internal combustion engine, while the output dynamometers
introduce the load on the flanges of the output joint shafts. The
gearbox used for the experimental measurements is of the type most
commonly used in modern small to mid-sized passenger cars with a
transversely mounted powertrain and front-wheel drive: a five-speed
gearbox with final drive gear and front-wheel differential. The
results show that the proposed methods are effective for detecting
and diagnosing localized gear faults at an early stage under
different operating conditions, and are more sensitive and robust
than current gear diagnostic techniques.
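The Squared Envelope itself is simply the squared magnitude of the analytic signal. A self-contained sketch follows, using a naive O(n²) DFT for clarity (production code would use an FFT) and a synthetic amplitude-modulated tone in place of real gearbox vibration:

```python
import cmath
import math

def squared_envelope(x):
    # Squared magnitude of the analytic signal, obtained by zeroing the
    # negative-frequency half of the spectrum (naive O(n^2) DFT).
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    for k in range(n):
        if 0 < k < n // 2:
            X[k] *= 2          # double the positive frequencies
        elif k > n // 2:
            X[k] = 0           # remove the negative frequencies
    analytic = [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
                for t in range(n)]
    return [abs(a) ** 2 for a in analytic]

# Synthetic AM tone: carrier at bin 10 modulated once per record,
# standing in for a gear-mesh component modulated by a tooth defect.
n = 128
mod = [1.0 + 0.5 * math.cos(2 * math.pi * t / n) for t in range(n)]
sig = [mod[t] * math.cos(2 * math.pi * 10 * t / n) for t in range(n)]
se = squared_envelope(sig)
```

The squared envelope recovers the modulation (here mod squared) while discarding the carrier, which is why periodic impacts from a pitted tooth show up clearly in its spectrum.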
Abstract: A word recognition architecture based on a network of
neural associative memories and hidden Markov models has been
developed. The input stream, composed of subword units such as
word-internal triphones consisting of diphones and triphones, is
provided to the network of neural associative memories by the
hidden Markov models. The word recognition network derives words
from this input stream. The architecture is able to handle
ambiguities at the subword-unit level and can also add new words to
the vocabulary at run time. The architecture is implemented to
perform the word recognition task in a language processing system
for understanding simple command sentences such as "bot show
apple".
Abstract: Environmental impact assessment (EIA) is a procedural tool of environmental management for identifying, predicting, evaluating and mitigating the adverse effects of development proposals. EIA reports usually analyze whether the amounts or concentrations of pollutants obey the relevant standards. In fact, many analytical tools can deepen the analysis of environmental impacts in EIA reports, such as life cycle assessment (LCA) and environmental risk assessment (ERA). Life cycle impact assessment (LCIA) is the step of LCA that introduces the causal relationships between environmental hazards and damage. Incorporating the LCIA concept into ERA as an integrated tool for EIA can extend the focus from the regulatory compliance of environmental impacts to the determination of their significance. Sometimes, when using integrated tools, it is necessary to consider fuzzy situations due to insufficient information; therefore, ERA should be generalized to fuzzy risk assessment (FRA). Finally, the use of the proposed methodology is demonstrated through a case study of the expansion plan of the world's largest plastics processing factory.
Abstract: A high-performance clarification system for the
advanced aqueous reprocessing of FBR spent fuel is discussed.
Dissolver residue causes trouble in the plant operation of
reprocessing. In this study, a new clarification system based on a
hybrid of centrifugation and filtration was proposed to obtain high
separation ability for all components of the insoluble sludge.
Clarification tests of simulated solid species were carried out to
evaluate the clarification performance using a small-scale test
apparatus with a centrifuge and a filter unit. In the centrifugal
clarification test, the effect of the density of the solid species
on the collection efficiency was mainly evaluated. In the
filtration test using a ceramic filter with a pore size of 0.2 μm,
on the other hand, the permeability and filtration rate were
evaluated in addition to the filtration efficiency. As a result,
the collection efficiency of solid species in the new clarification
system was estimated at nearly 100%. In conclusion, high
clarification performance for the dissolver liquor can be achieved
by the hybrid centrifuge and filtration system.
Abstract: Modern simulation solutions in the wind turbine industry have achieved a high degree of complexity and detail in their results. Limitations appear, however, when it is time to validate model results against measurements. Regarding model validation, it is of special interest to identify mode frequencies and to differentiate them from the different excitations. A wind turbine is a complex device, and measurements of any part of the assembly show a great deal of noise. Input excitations are difficult or even impossible to measure due to the stochastic nature of the environment. Traditional techniques for frequency analysis or feature extraction are widely used to analyze wind turbine sensor signals, but have several limitations, especially for non-stationary signals (events). A new technique based on autoregressive analysis is introduced here for this specific application, and a comparison and examples related to different events in wind turbine operation are presented.
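The core of such an autoregressive analysis can be sketched as a least-squares fit of an AR model to a sensor signal; mode frequencies then follow from the roots of the fitted polynomial. Below, an AR(2) with made-up coefficients is recovered from simulated data, standing in for a real turbine measurement.

```python
import random

def fit_ar2(x):
    # Least-squares estimate of x[t] = a1*x[t-1] + a2*x[t-2] + e[t]
    # via the 2x2 normal equations.
    s11 = s12 = s22 = b1 = b2 = 0.0
    for t in range(2, len(x)):
        s11 += x[t - 1] * x[t - 1]
        s12 += x[t - 1] * x[t - 2]
        s22 += x[t - 2] * x[t - 2]
        b1 += x[t] * x[t - 1]
        b2 += x[t] * x[t - 2]
    det = s11 * s22 - s12 * s12
    return (b1 * s22 - b2 * s12) / det, (s11 * b2 - s12 * b1) / det

# Simulate a stationary AR(2) process with known (made-up) coefficients.
rng = random.Random(0)
x = [0.0, 0.0]
for _ in range(5000):
    x.append(0.6 * x[-1] - 0.2 * x[-2] + rng.gauss(0.0, 1.0))
a1, a2 = fit_ar2(x)
```

Refitting the coefficients over a sliding window is one simple way to track the non-stationary events the abstract mentions: a sudden change in the fitted model flags a change in the signal's dynamics.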