Abstract: Antimicrobial resistance is becoming a major factor in virtually all hospital-acquired infections and may soon render many of them untreatable, a serious public health problem. These concerns have led to a major research effort to discover alternative strategies for the treatment of bacterial infection. Nanobiotechnology is an emerging and fast-developing field with potential applications for human welfare. An important area of nanotechnology is the development of reliable and environmentally friendly processes for the synthesis of nanoscale particles through biological systems. The present study reports on the use of a fungal strain of Aspergillus species for the extracellular synthesis of bionanoparticles from a 1 mM silver nitrate (AgNO3) solution. The report focuses on the synthesis of metallic silver bionanoparticles by reduction of aqueous Ag+ ions with the culture supernatants of microorganisms. The bio-reduction of the Ag+ ions in solution was monitored in the aqueous component, and the spectrum of the solution was measured with a UV-visible spectrophotometer. The bionanoscale particles were further characterized by Atomic Force Microscopy (AFM), Fourier Transform Infrared Spectroscopy (FTIR) and thin-layer chromatography. The synthesized bionanoscale particles showed a maximum absorption at 385 nm in the visible region. Atomic Force Microscopy investigation identified that the silver bionanoparticles ranged in size from 250 nm to 680 nm. The work also analyzed the antimicrobial efficacy of the silver bionanoparticles against various multidrug-resistant clinical isolates. The present study emphasizes the applicability of synthesizing metallic nanostructures and of understanding the biochemical and molecular mechanism of nanoparticle formation by the cell filtrate, in order to achieve better control over the size and polydispersity of the nanoparticles. This would help to develop nanomedicines against various multidrug-resistant human pathogens.
Abstract: The importance of software quality is increasing, leading to the development of new sophisticated techniques that can be used in constructing models for predicting quality attributes. One such technique is the Artificial Neural Network (ANN). This paper examined the application of ANN to software quality prediction using Object-Oriented (OO) metrics. Quality estimation here includes estimating the maintainability of software. The dependent variable in our study was maintenance effort. The independent variables were the principal components of eight OO metrics. The results showed that the Mean Absolute Relative Error (MARE) of the ANN model was 0.265. Thus we found that the ANN method was useful in constructing a software quality model.
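The abstract reports MARE without defining it; a minimal sketch of a common definition (the mean of |actual − predicted| / |actual| over all samples), using hypothetical maintenance-effort values rather than the paper's data, is:

```python
def mare(actual, predicted):
    """Mean Absolute Relative Error: mean of |a - p| / |a| over all samples."""
    if len(actual) != len(predicted):
        raise ValueError("length mismatch")
    return sum(abs(a - p) / abs(a) for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical maintenance-effort values (e.g. person-hours) and model predictions.
actual = [10.0, 20.0, 40.0]
predicted = [12.0, 18.0, 38.0]
print(round(mare(actual, predicted), 3))  # → 0.117
```

A MARE of 0.265, as reported for the ANN model, would mean the predictions deviate from the observed effort by about 26.5% on average under this definition.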
Abstract: Independent component analysis (ICA) in the
frequency domain is used for solving the problem of blind source
separation (BSS). However, this method has some problems. For
example, a general ICA algorithm cannot determine the permutation of signals, which is important in frequency-domain ICA. In this paper, we propose an approach to solving the permutation problem. The idea is to effectively combine two conventional approaches. This combination improves the signal separation performance by exploiting the strengths of the conventional approaches. We show simulation results using artificial data.
Abstract: The last decade has shown that the object-oriented concept by itself is not powerful enough to cope with the rapidly changing requirements of ongoing applications. Component-based systems achieve flexibility by clearly separating the stable parts of systems (i.e. the components) from the specification of their composition. In order to realize the reuse of components effectively in CBSD, it is required to measure the reusability of components. However, due to the black-box nature of components, where their source code is not available, it is difficult to use conventional metrics in component-based development, as these metrics require analysis of source code. In this paper, we survey a few existing component-based reusability metrics. These metrics give a broader view of a component's understandability, adaptability, and portability. The paper also describes the analysis, in terms of quality factors related to reusability, contained in an approach that aids significantly in assessing existing components for reusability.
Abstract: In this paper, to optimize the "Characteristic Straight Line Method", which is used in soil displacement analysis, a "best estimate" of the geodetic leveling observations has been achieved by taking into account the concept of height systems. This concept has been discussed in detail, and consequently the concept of "height". In landslide dynamic analysis, the soil is considered as a mosaic of rigid blocks. The soil displacement has been monitored and analyzed by using the "Characteristic Straight Line Method". Its characteristic components have been defined and constructed from a "best estimate" of the topometric observations. In the measurement of elevation differences, we have used the most modern leveling equipment available. Observational procedures have also been designed to provide the most effective method of acquiring data. In addition, systematic errors that cannot be sufficiently controlled by instrumentation or observational techniques are minimized by applying appropriate corrections to the observed data: the level collimation correction minimizes the error caused by non-horizontality of the leveling instrument's line of sight for unequal sight lengths; the refraction correction is modeled to minimize the refraction error caused by temperature (density) variation of air strata; the rod temperature correction accounts for variation in the length of the leveling rod's Invar/LO-VAR® strip resulting from temperature changes; the rod scale correction ensures a uniform scale conforming to the international length standard; and the concept of height systems is introduced, where all types of height (orthometric, dynamic, normal, gravity correction, and equipotential surface) have been investigated. The "Characteristic Straight Line Method" is slightly more convenient than the "Characteristic Circle Method". It permits the evaluation of a displacement of very small magnitude, even when the displacement is an infinitesimal quantity.
The inclination of the landslide is given by the inverse of the distance from reference point O to the "Characteristic Straight Line". Its direction is given by the bearing of the normal directed from point O to the Characteristic Straight Line (Fig. 6). A "best estimate" of the topometric observations was used to measure the elevations of carefully selected points, before and after the deformation. Gross errors have been eliminated by statistical analyses and by comparing heights within local neighborhoods. The results of a test in an area where very interesting land-surface deformation occurs are reported. Monitoring with different options and a qualitative comparison of results based on a sufficient number of check points are presented.
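As a minimal sketch of the geometric step just described, assuming the Characteristic Straight Line is given in the form a·x + b·y + c = 0 in a local frame with x pointing east and y pointing north (the paper's actual parameterization is not stated), the inclination (inverse distance) and the bearing of the normal from O could be computed as:

```python
import math

def characteristic_line_parameters(a, b, c, xo, yo):
    """For a characteristic straight line a*x + b*y + c = 0 and a reference
    point O = (xo, yo), return (inclination, bearing_deg): the inclination is
    taken as the inverse of the perpendicular distance from O to the line, and
    the bearing is that of the normal directed from O toward the line
    (degrees clockwise from north; x = east, y = north)."""
    norm = math.hypot(a, b)
    signed = (a * xo + b * yo + c) / norm   # signed distance from O to the line
    distance = abs(signed)
    # Vector from O to the foot of the perpendicular (the directed normal).
    dx = -signed * a / norm
    dy = -signed * b / norm
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    return 1.0 / distance, bearing

# Hypothetical line x = 10 (i.e. 1*x + 0*y - 10 = 0) with O at the origin:
incl, brg = characteristic_line_parameters(1.0, 0.0, -10.0, 0.0, 0.0)
print(incl, brg)  # distance 10 -> inclination 0.1; normal points east -> 90.0
```

The degenerate case where O lies on the line (zero distance, hence unbounded inclination) is left unhandled in this sketch.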
Abstract: A filter is used to remove undesirable frequency information from a dynamic signal. This paper shows that the Z-notch filtering technique can be applied to remove the noise nuisance from a machining signal. In machining, the noise components were identified from the sound produced by the operation of the machine components themselves, such as the hydraulic system, the motor and the machine environment. By correlating the noise components with the measured machining signal, the components of interest in the measured machining signal, which are less interfered with by noise, can be extracted. Thus, the filtered signal is more reliable to analyse in terms of noise content than the unfiltered signal. Significantly, the I-kaz method, which comprises a three-dimensional graphical representation and the I-kaz coefficient Z∞, could differentiate between the filtered and the unfiltered signal. A larger scattering space and a higher value of Z∞ indicated that the signal was highly interrupted by noise. This method can be utilised as a proactive tool in evaluating the noise content of a signal. The evaluation of noise content, as well as its elimination, is very important, especially for machining-operation fault diagnosis. The Z-notch filtering technique was reliable in extracting the noise component from the measured machining signal with high efficiency. Even though the measured signal was exposed to high noise disruption, the signal generated by the interaction between the cutting tool and the workpiece could still be acquired. Therefore, noise interruption that could change the original signal features and consequently deteriorate the useful sensory information can be eliminated.
Abstract: This paper deals with the localization of wideband sources. We develop a new approach for estimating the wideband source parameters. This method is based on the high-order statistics of the recorded data, in order to eliminate the Gaussian components from the signals received on the various hydrophones. In fact, the noise of the sea bottom is regarded as Gaussian. Thanks to the coherent signal-subspace algorithm based on the cumulant matrix of the received data instead of the cross-spectral matrix, the wideband correlated sources are accurately located in a very noisy environment. We demonstrate the performance of the proposed algorithm on real data recorded during an underwater acoustics experiment.
Abstract: The first generation of Mobile Agent based Intrusion Detection Systems had just two components, namely data collection and a single centralized analyzer. The disadvantage of this type of intrusion detection is that if the connection to the analyzer fails, the entire system becomes useless. In this work, we propose a novel hybrid model for a Mobile Agent based Distributed Intrusion Detection System to overcome this problem. The proposed model has new features such as robustness, the capability of detecting intrusions against the IDS itself, and the capability of updating itself to detect new patterns of intrusion. In addition, our proposed model is also capable of tackling some of the weaknesses of centralized Intrusion Detection System models.
Abstract: This paper is a survey of current component-based software technologies and a description of the promotion and inhibition factors in CBSE. The features that software components inherit are also discussed, and quality assurance issues in component-based software are addressed. Research on the quality model of component-based systems starts with the study of what components are, CBSE, its development life cycle, and the pros and cons of CBSE. Various attributes are studied and compared, keeping in view the study of various existing models for general systems and CBS. When illustrating the quality of a software component, an apt set of quality attributes for the description of the system (or components) should be selected. Finally, the research issues that can be extended are tabulated.
Abstract: Detection of incipient abnormal events is important to improve the safety and reliability of machine operations and to reduce the losses caused by failures. Improper set-up or alignment of parts often leads to severe problems in many machines. The construction of prediction models for faulty conditions is essential in deciding when to perform machine maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of machine measurement data. The calibration model is used to predict two faulty conditions from historical reference data. This approach utilizes genetic algorithm (GA) based variable selection, and we evaluate the predictive performance of several prediction methods using real data. The results show that the calibration model based on supervised probabilistic principal component analysis (SPPCA) yielded the best performance in this work. By adopting a proper variable selection scheme in calibration models, the prediction performance can be improved by excluding non-informative variables from the model building steps.
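A minimal sketch of GA-based variable selection, with each individual encoded as a bitmask over the candidate variables; the fitness function here is a hypothetical toy (the paper's actual fitness, presumably tied to calibration-model error, is not given):

```python
import random

def ga_select(n_vars, fitness, pop_size=20, generations=30, p_mut=0.1, seed=0):
    """Minimal genetic algorithm for variable selection: each individual is a
    bitmask over the candidate variables; higher fitness is better."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]            # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_vars)           # one-point crossover
            child = [bit ^ (rng.random() < p_mut)   # per-bit mutation
                     for bit in a[:cut] + b[cut:]]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Toy fitness: variables 0-2 are "informative", the rest only add noise.
def fitness(mask):
    return sum(mask[:3]) - 0.2 * sum(mask[3:])

best = ga_select(8, fitness)
print(best)
```

In a calibration setting, the fitness would instead score a model trained on the masked variable subset (e.g. negative cross-validated prediction error), which is what rewards excluding non-informative variables.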
Abstract: The objective of this research is to study principal component analysis for the classification of 67 soil samples collected from different agricultural areas in the western part of Thailand. Six soil properties were measured on the soil samples and used as the original variables. Principal component analysis is applied to reduce the number of original variables. A model based on the first two principal components accounts for 72.24% of the total variance. Score plots of the first two principal components were used to map the agricultural areas, divided into horticulture, field crops and wetland. The results showed some relationships between soil properties and agricultural areas. PCA was shown to be a useful tool for the classification of agricultural areas based on soil properties.
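As an illustration of how the variance explained by a leading principal component is obtained, a minimal two-variable sketch using hypothetical soil measurements (not the paper's six-variable data) computes the largest eigenvalue of the 2×2 covariance matrix in closed form:

```python
import math

def explained_variance_2d(xs, ys):
    """Fraction of total variance captured by the first principal component
    for two variables, via the closed-form eigenvalues of the 2x2
    sample covariance matrix."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    s11 = sum((x - mx) ** 2 for x in xs) / (n - 1)
    s22 = sum((y - my) ** 2 for y in ys) / (n - 1)
    s12 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    tr, det = s11 + s22, s11 * s22 - s12 ** 2
    lam1 = (tr + math.sqrt(tr * tr - 4 * det)) / 2   # largest eigenvalue
    return lam1 / tr                                  # trace = total variance

# Hypothetical measurements of two strongly correlated soil properties:
ph = [5.1, 5.8, 6.4, 6.9, 7.3]
organic_matter = [1.2, 1.6, 2.1, 2.4, 2.9]
print(round(explained_variance_2d(ph, organic_matter), 3))
```

With strongly correlated inputs the first component captures nearly all the variance; the paper's 72.24% for two components over six variables reflects the same ratio of leading eigenvalues to the total.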
Abstract: Deprivation indices are widely used in public health studies. These indices are also referred to as indices of inequality or disadvantage. Even though many indices have been built before, it is believed to be less appropriate to apply existing indices to other countries or areas with different socio-economic conditions and different geographical characteristics. The objective of this study is to construct an index based on the geographical and socio-economic factors of Peninsular Malaysia, defined as the weighted household-based deprivation index. This study employed variables based on household items, household facilities, school attendance and education level obtained from the Malaysia 2000 census report. Factor analysis is used to extract the latent variables from the indicators, i.e. to reduce the observable variables to a smaller number of components or factors. Based on the factor analysis, two extracted factors were selected, known as the Basic Household Amenities factor and the Middle-Class Household Item factor. It is observed that the districts with lower index values are located in the less developed states such as Kelantan, Terengganu and Kedah, while the areas with high index values are located in developed states such as Pulau Pinang, W.P. Kuala Lumpur and Selangor.
Abstract: ICA, which is generally used for the blind source separation problem, has been tested for feature extraction in a speech recognition system to replace the phoneme-based approach of MFCC. Applying the cepstral coefficients to ICA as preprocessing has produced a new signal processing approach, which gives much better results than MFCC and ICA separately, for both word and speaker recognition. The mixing matrix A is different before and after MFCC, as expected, since Mel is a nonlinear scale. However, cepstra generated from Linear Predictive Coefficients, being independent, prove to be the right candidates for ICA. Matlab is the tool used for all comparisons. The database used consists of samples from ISOLET.
Abstract: High voltage generators are being subjected to higher voltage ratings and are being designed to operate in harsh conditions. Stator windings are the main component of generators, in which electrical, magnetic and thermal stresses remain the major causes of insulation degradation and accelerated aging. A large number of generators fail due to stator winding problems, mainly insulation deterioration. Insulation degradation assessment plays a vital role in asset life management. Stator failure is mostly catastrophic, causing significant damage to the plant. Other than the generation loss, stator failure involves heavy repair or replacement costs. Electro-thermal analysis is the main tool for improving the design of stator slot insulation. Dielectric parameters such as insulation thickness, spacing, material types, and the geometry of the winding and slot are major design considerations. A very powerful method available for analyzing electro-thermal performance is the Finite Element Method (FEM), which is used in this paper. The analysis of various stator coil and slot configurations is used to design a better dielectric system that reduces electrical and thermal stresses in order to increase the power of the generator within the same core volume. This paper describes the process used to perform the classical design and improvement analysis of stator slot insulation.
Abstract: We demonstrate through a sample application, E-banking, that the Web Service Modelling Language Ontology component can be used as a very powerful object-oriented database design language with logic capabilities. Its conceptual syntax allows the definition of class hierarchies, and its logic syntax allows the definition of constraints on the database. Relations, which are available for modelling relations between three or more concepts, can be connected to logical expressions, allowing the implicit specification of database content. Using a reasoning tool, logic queries can also be made against the database in simulation mode.
Abstract: This paper presents the exergy analysis of a
desalination unit using humidification-dehumidification process.
Here, this unit is considered as a thermal system with three main
components, which are the heating unit by using a solar collector, the
evaporator or the humidifier, and the condenser or the dehumidifier.
In these components, exergy is a measure of the quality or grade of energy, and it can be destroyed. According to the second law of thermodynamics, this destroyed part is due to irreversibilities, which must be determined to obtain the exergetic efficiency of the system.
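A minimal sketch of the component-level bookkeeping, assuming the common ratio form of exergetic efficiency (the abstract does not state the exact definitions used, and the stream values below are illustrative, not the paper's results):

```python
def exergy_efficiency(ex_in, ex_out):
    """Second-law (exergetic) efficiency of a component as Ex_out / Ex_in,
    with the exergy destruction Ex_dest = Ex_in - Ex_out attributed to
    irreversibilities."""
    ex_dest = ex_in - ex_out
    if ex_dest < 0:
        raise ValueError("exergy cannot be created (second law)")
    return ex_out / ex_in, ex_dest

# Hypothetical humidifier stream exergies in kW:
eta, dest = exergy_efficiency(120.0, 84.0)
print(eta, dest)  # 0.7 36.0
```

Repeating this balance for the solar collector, humidifier and dehumidifier, and summing the destruction terms, gives the system-level picture the abstract describes.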
In the current paper a computer program has been developed in Visual Basic to determine the exergy destruction and the exergetic
efficiencies of the components of the desalination unit at variable
operation conditions such as feed water temperature, outlet air
temperature, air to feed water mass ratio and salinity, in addition to
cooling water mass flow rate and inlet temperature, as well as
quantity of solar irradiance.
The results obtained indicate that the exergy efficiency of the humidifier increases with an increasing mass ratio and a decreasing outlet air temperature. On the other hand, the exergy efficiency of the condenser increases with the increase of this ratio and also with the increase of the outlet air temperature.
Abstract: A suburban area is an important area in the development of a city and a country. Russia's economy is going through major transitions. These transitions are rapidly changing the relationship between cities (urban areas), the countryside (rural areas) and the development, growth, and popularity of suburbia. The process of suburbanization is taking place in the biggest cities of Russia, including Krasnoyarsk City. Modern Krasnoyarsk, with a population of about 1 million people, occupies a territory of 34,115 ha. This article examines the functions of the suburban area and connects these functions with the zoning of the suburban territory. The author uses the hierarchy method to select the best conditions for each function in connection with the nature component, transportation and distance from the city. The result of this research is a map of the functional zoning of the suburban area of Krasnoyarsk City. The author uses a variety of factors that have an influence on the suburban area to compare and choose the best conditions.
Keywords: Suburban area, zoning of territory, Krasnoyarsk City.
Abstract: This paper describes a method to improve the robustness of a face recognition system based on the combination of two compensating classifiers. The face images are preprocessed by appearance-based statistical approaches, namely Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA). The LDA features of the face image are taken as the input of a Radial Basis Function Network (RBFN). The proposed approach has been tested on the ORL database. The experimental results show that the LDA+RBFN algorithm has achieved a recognition rate of 93.5%.
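A minimal sketch of the RBFN hidden layer, assumed here to use Gaussian basis functions (the paper does not specify its kernel or parameters; the feature vector and centers below are hypothetical):

```python
import math

def rbf_layer(x, centers, sigma):
    """Gaussian radial-basis activations: one hidden unit per center,
    phi_j(x) = exp(-||x - c_j||^2 / (2 * sigma^2))."""
    activations = []
    for c in centers:
        d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
        activations.append(math.exp(-d2 / (2 * sigma ** 2)))
    return activations

# Hypothetical 2-D LDA feature vector and two class centers:
features = [0.5, 1.0]
centers = [[0.5, 1.0], [3.0, -1.0]]
print(rbf_layer(features, centers, sigma=1.0))  # first unit fires strongest
```

In an RBFN classifier these activations feed a linear output layer, whose weights are typically fit by least squares; the classification decision is the output unit with the largest response.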
Abstract: In this paper, a wavelet-based neural network (WNN) classifier for recognizing EEG signals is implemented and tested on three sets of EEG signals (healthy subjects, patients with epilepsy, and patients with epileptic syndrome during a seizure). First, the Discrete Wavelet Transform (DWT) with Multi-Resolution Analysis (MRA) is applied to decompose the EEG signal at the resolution levels of its components (δ, θ, α, β and γ), and Parseval's theorem is employed to extract the percentage distribution of energy features of the EEG signal at the different resolution levels. Second, a neural network (NN) classifies these extracted features to identify the EEG type according to the percentage distribution of energy features. The performance of the proposed algorithm has been evaluated using a total of 300 EEG signals. The results showed that the proposed classifier is able to recognize and classify EEG signals efficiently.
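As an illustration of the energy-feature extraction step, a minimal sketch using a Haar DWT in place of the paper's unspecified wavelet (the 8-sample segment is a toy stand-in for an EEG signal); because the transform is orthonormal, Parseval's theorem guarantees the band percentages sum to 100:

```python
def haar_dwt(signal):
    """One level of the orthonormal Haar DWT: returns (approximation, detail)
    coefficients; the input length must be even."""
    s = 2 ** -0.5
    approx = [s * (signal[i] + signal[i + 1]) for i in range(0, len(signal), 2)]
    detail = [s * (signal[i] - signal[i + 1]) for i in range(0, len(signal), 2)]
    return approx, detail

def energy_distribution(signal, levels):
    """Percentage of total signal energy in each detail band (finest first)
    plus the final approximation band, using Parseval's theorem."""
    total = sum(x * x for x in signal)
    bands = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        bands.append(100.0 * sum(d * d for d in detail) / total)
    bands.append(100.0 * sum(a * a for a in approx) / total)
    return bands

# Toy 8-sample "EEG" segment, decomposed over 2 levels:
print(energy_distribution([1.0, 2.0, 4.0, 3.0, 0.0, -1.0, -2.0, 0.0], 2))
```

For real EEG, five decomposition levels at a suitable sampling rate align the bands roughly with the δ, θ, α, β and γ rhythms, and the resulting percentage vector is what the NN classifies.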
Abstract: The present study was done primarily to address two major research gaps: firstly, the development of an empirical measure of life meaningfulness for substance users and, secondly, the determination of the psychosocial determinants of life meaningfulness among substance users. The study is divided into two phases: the first phase dealt with the development of the Life Meaningfulness Scale, and the second phase examined the relationship between life meaningfulness and social support, abstinence self-efficacy and depression. Both qualitative and quantitative approaches were used for framing items. A Principal Component Analysis yielded three components: Overall Goal Directedness, Striving for a Healthy Lifestyle and Concern for Loved Ones, which collectively accounted for 42.06% of the total variance. The scale and its subscales were also found to be highly reliable. Multiple regression analyses in the second phase of the study revealed that social support and abstinence self-efficacy significantly predicted life meaningfulness among 48 recovering inmates of a de-addiction center, while level of depression failed to predict life meaningfulness.