Abstract: On one hand, SNMP (Simple Network Management Protocol) allows different enterprise elements connected through the Internet to be integrated into a standardized remote management scheme. On the other hand, as a consequence of the success of Intelligent Houses, they can now be connected through the Internet by means of a residential gateway according to a common standard called OSGi (Open Services Gateway initiative). Due to the specifics of OSGi Service Platforms and their dynamic nature, specific design criteria should be defined to implement SNMP Agents for OSGi in order to integrate them into SNMP remote management. Based on an analysis of the relation between both standards (SNMP and OSGi), this paper shows how OSGi Service Platforms can be included in the SNMP management of a global enterprise, giving implementation details of an SNMP Agent solution and the definition of a new MIB (Management Information Base) for managing OSGi platforms that takes into account the specifics and dynamic nature of OSGi.
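As a hedged illustration of the manager side only (not of the paper's agent or its MIB), the sketch below queries a hypothetical OSGi MIB object with the pysnmp library; the OID, hostname and community string are placeholder assumptions, not values defined by the paper.

from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

# Hypothetical OID for an OSGi-platform MIB object; a real deployment would
# use the OIDs assigned in the paper's OSGi MIB definition.
OSGI_BUNDLE_COUNT_OID = '1.3.6.1.4.1.99999.1.1.0'  # placeholder enterprise OID

errorIndication, errorStatus, errorIndex, varBinds = next(getCmd(
    SnmpEngine(),
    CommunityData('public', mpModel=1),                 # SNMPv2c community
    UdpTransportTarget(('gateway.example.org', 161)),   # residential gateway
    ContextData(),
    ObjectType(ObjectIdentity(OSGI_BUNDLE_COUNT_OID))))

if errorIndication:
    print(errorIndication)
else:
    for varBind in varBinds:
        print(' = '.join(x.prettyPrint() for x in varBind))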
Abstract: Requirements management is critical to successful software delivery and to the project lifecycle. Requirements management and traceability support many software engineering activities, such as impact analysis, coverage analysis, requirements validation and regression testing. In addition, requirements traceability is a recognized component of many software process improvement initiatives. Requirements traceability also helps to control and manage the evolution of a software system. This paper provides an evaluation of current requirements management and traceability tools. Managers and test managers require an appropriate tool for the software under test. We hope the evaluation presented here will help in selecting an efficient and effective tool.
Abstract: Cytogenetic analysis still remains the gold standard method for prenatal diagnosis of trisomy 21 (Down syndrome, DS). Nevertheless, conventional cytogenetic analysis needs live cultured cells and is too time-consuming for clinical application. In contrast, molecular methods such as FISH, QF-PCR, MLPA and quantitative real-time PCR are rapid assays with results available within 24 h. In the present study, we have successfully used a novel MGB TaqMan probe-based real-time PCR assay for rapid diagnosis of trisomy 21 status in Down syndrome samples. We have also compared the results of this molecular method with the corresponding results obtained by cytogenetic analysis. Blood samples obtained from DS patients (n=25) and normal controls (n=20) were tested by quantitative real-time PCR in parallel with standard G-banding analysis. Genomic DNA was extracted from peripheral blood lymphocytes. A high-precision TaqMan probe quantitative real-time PCR assay was developed to determine the gene dosage of DSCAM (target gene on 21q22.2) relative to PMP22 (reference gene on 17p11.2). The DSCAM/PMP22 ratio was calculated according to the formula ratio = 2^(−ΔΔCT). The quantitative real-time PCR was able to distinguish between trisomy 21 samples and normal controls, with gene ratios of 1.49±0.13 and 1.03±0.04 respectively (p value
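A minimal sketch of the 2^(−ΔΔCT) relative dosage calculation described above; the CT values below are made up for illustration, and the ~1.5 ratio for a trisomic sample is only what the reported group means suggest.

# Relative gene dosage by the 2^(-ddCT) method (sketch; illustrative CT values).
def ddct_ratio(ct_target_sample, ct_ref_sample, ct_target_calib, ct_ref_calib):
    dct_sample = ct_target_sample - ct_ref_sample   # DSCAM - PMP22, test sample
    dct_calib = ct_target_calib - ct_ref_calib      # DSCAM - PMP22, normal calibrator
    ddct = dct_sample - dct_calib
    return 2 ** (-ddct)

# Example: a trisomic sample carries ~1.5x the target dose of the calibrator.
ratio = ddct_ratio(24.1, 25.0, 24.7, 25.0)          # hypothetical CT values
print('DSCAM/PMP22 ratio = %.2f' % ratio)           # ~1.52 here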
Abstract: Medical image segmentation based on image smoothing followed by edge detection assumes a great degree of importance in the field of image processing. In this regard, this paper proposes a novel algorithm for medical image segmentation based on robust smoothing, achieved by identifying the type of noise, followed by edge detection, an approach that promises to be a boon in medical image diagnosis. The main objective of this algorithm is to take a particular medical image as input, preprocess it to remove the noise content by employing a suitable filter after identifying the type of noise, and finally carry out edge detection for image segmentation. The algorithm consists of three parts. First, the type of noise present in the medical image is identified as additive, multiplicative or impulsive by analysis of local histograms, and the image is denoised by employing a Median, Gaussian or Frost filter accordingly. Second, edge detection of the filtered medical image is carried out using the Canny edge detection technique. The third part is the segmentation of the edge-detected medical image by the method of Normalized Cut eigenvectors. The method is validated through experiments on real images. The proposed algorithm has been simulated on the MATLAB platform. The simulation results show that the proposed algorithm is very effective: it can deal with low-quality or marginally vague images that have high spatial redundancy, low contrast and considerable noise, and it has potential for practical use in medical image diagnosis.
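A compact sketch of the first two stages of such a pipeline, assuming OpenCV. The extreme-pixel test for impulsive noise and the variance-versus-mean test for multiplicative noise are simplified stand-ins for the paper's local-histogram analysis, and since OpenCV has no Frost filter, a box filter is substituted for that branch.

import cv2
import numpy as np

def denoise_and_detect_edges(img):
    """Crude noise-type test, matching filter, then Canny edges (sketch)."""
    extremes = np.mean((img < 5) | (img > 250))       # near-extreme pixel fraction
    if extremes > 0.05:                               # impulsive (salt & pepper)
        smooth = cv2.medianBlur(img, 5)
    else:
        x = img.astype(np.float32)
        mu = cv2.blur(x, (7, 7))                      # local mean
        var = cv2.blur(x * x, (7, 7)) - mu * mu       # local variance
        slope = np.polyfit(mu.ravel(), var.ravel(), 1)[0]
        if slope > 0.5:                               # variance tracks mean: multiplicative
            smooth = cv2.blur(img, (5, 5))            # stand-in for a Frost filter
        else:                                         # flat variance: additive
            smooth = cv2.GaussianBlur(img, (5, 5), 1.0)
    return cv2.Canny(smooth, 50, 150)

edges = denoise_and_detect_edges(cv2.imread('scan.png', cv2.IMREAD_GRAYSCALE))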
Abstract: This paper presents a detailed analysis of the definition of the intrinsic mode function and proves that Condition 1 of the intrinsic mode function can in fact be deduced from Condition 2. Finally, an improved definition of the intrinsic mode function is given.
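For reference, in the standard definition (Huang et al.), Condition 1 requires that the number of extrema and the number of zero crossings of the function be equal or differ by at most one, while Condition 2 requires that at every point the mean of the envelope defined by the local maxima and the envelope defined by the local minima be zero, i.e. (e_max(t) + e_min(t))/2 = 0.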
Abstract: Decision tree algorithms occupy a very important place among the classification models of data mining. In the literature, algorithms use the entropy concept or the Gini index to form the tree. The shape of the classes and their closeness to each other are some of the factors that affect the performance of the algorithm. In this paper we introduce a new decision tree algorithm which employs a data (attribute) folding method and the variation of the class variables over the branches to be created. A comparative performance analysis has been carried out between the proposed algorithm and C4.5.
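As a hedged illustration of the two split criteria mentioned above (not of the proposed folding method itself), a minimal computation of entropy and the Gini index for a class-count vector:

import numpy as np

def entropy(counts):
    """Shannon entropy of a class-count vector, in bits."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log2(p)).sum())

def gini(counts):
    """Gini index of a class-count vector."""
    p = np.asarray(counts, dtype=float) / np.sum(counts)
    return float(1.0 - np.sum(p ** 2))

print(entropy([5, 5]), gini([5, 5]))   # 1.0, 0.5 for a 50/50 split
print(entropy([9, 1]), gini([9, 1]))   # lower impurity for a purer node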
Abstract: In this study, a fuzzy integrated logical forecasting method (FILF) is extended to multivariate systems by using a vector autoregressive model. The fuzzy time series forecasting (FTSF) method was introduced by Song and Chissom [1]-[2] and later improved by Chen. Unlike the existing literature, the proposed model is compared not only with previous FTS models but also with conventional time series methods such as the classical vector autoregressive model. The cluster optimization is based on the C-means clustering method. An empirical study is performed on the prediction of the chartering rates of a group of dry bulk cargo ships. The root mean squared error (RMSE) metric is used for comparing the results of the methods, and the proposed method outperforms both the traditional FTS methods and the classical time series methods.
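A minimal sketch of the RMSE comparison used above; the charter-rate series and forecasts are placeholders, not the study's data.

import numpy as np

def rmse(actual, forecast):
    """Root mean squared error between actual and forecast series."""
    a, f = np.asarray(actual, float), np.asarray(forecast, float)
    return float(np.sqrt(np.mean((a - f) ** 2)))

# Hypothetical charter-rate series and two competing forecasts.
actual = [100.0, 104.0, 101.0, 98.0]
print(rmse(actual, [101.0, 103.0, 100.0, 99.0]))   # candidate model
print(rmse(actual, [100.0, 100.0, 100.0, 100.0]))  # naive benchmark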
Abstract: An experimental study is presented on the effect of microstructural change on the Portevin-Le Chatelier (PLC) effect behaviour of an Al-2.5%Mg alloy. Tensile tests are performed on as-received and heat-treated (at 400 ºC for 16 hours) samples over a wide range of strain rates. The serrations observed in the stress-time curve are investigated from a statistical analysis point of view. Microstructures of the samples are characterized by optical metallography and X-ray diffraction. It is found that the excess vacancies generated by the heat treatment lead to a decrease in the strain rate sensitivity and an increase in the number of stress drop occurrences per unit time during the PLC effect. Microstructural parameters such as domain size and dislocation density have no appreciable effect on the PLC effect as far as the statistical behavior of the serrations is concerned.
Abstract: In this paper we have numerically analyzed terahertz-range wavelength conversion using nondegenerate four-wave mixing (NDFWM) in an SOA-integrated DFB laser (experiments have been reported by both MIT electronics and Fujitsu research laboratories). For analyzing the semiconductor optical amplifier (SOA), we use the finite-difference beam propagation method (FDBPM) based on the modified nonlinear Schrödinger equation, and for the distributed feedback (DFB) laser we use a coupled-wave approach. We investigated wavelength conversion up to 4 THz probe-pump detuning, with a conversion efficiency of −5 dB at 1 THz probe-pump detuning, for an SOA-integrated quantum-well DFB laser.
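The paper's FDBPM/coupled-wave solver is not reproduced here; as a hedged stand-in, the sketch below propagates a pulse through the scalar nonlinear Schrödinger equation with the standard split-step Fourier method, which illustrates the dispersion/nonlinearity splitting that FDBPM-type SOA models also rely on. All parameter values are illustrative.

import numpy as np

# Split-step Fourier solution of i*dA/dz = (beta2/2)*d2A/dt2 - gamma*|A|^2*A (sketch).
nt, t_max = 1024, 50.0                        # grid points, time window (arbitrary units)
t = np.linspace(-t_max, t_max, nt, endpoint=False)
w = 2 * np.pi * np.fft.fftfreq(nt, d=t[1] - t[0])
beta2, gamma, dz, nz = -1.0, 1.0, 0.01, 500   # illustrative dispersion/nonlinearity

A = 1.0 / np.cosh(t)                               # fundamental soliton input
half_disp = np.exp(0.25j * beta2 * w**2 * dz)      # half-step linear operator
for _ in range(nz):
    A = np.fft.ifft(half_disp * np.fft.fft(A))     # half linear step
    A *= np.exp(1j * gamma * np.abs(A)**2 * dz)    # full nonlinear step
    A = np.fft.ifft(half_disp * np.fft.fft(A))     # half linear step
print('output peak power:', np.max(np.abs(A))**2)  # stays ~1 for the N=1 soliton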
Abstract: One of the most important power quality issues is voltage flicker, which nowadays affects power systems all over the world as wind generators of ever larger capacity are installed. Under unstable wind conditions, variations in output current and voltage give rise to voltage flicker. Hence, the major purpose of this study is to analyze the impact of wind generators on the voltage flicker of a power system. Digital simulation and analysis are carried out for a wind generator operating under various system short-circuit capacities, impedance angles, loadings, and load power factors. The simulation results have been confirmed by field measurements.
Abstract: In recent years, the number of cases of information leaks has been increasing. Companies and research institutions take various actions against information theft and security incidents. One such action is the adoption of crime prevention systems, including monitoring systems based on surveillance cameras. In order to overcome the difficulties of monitoring with multiple cameras, we develop an automatic human tracking system that uses mobile agents across multiple surveillance cameras to track target persons. In this paper, we develop a monitor which confirms that the mobile agents are tracing their target persons, and a simulator of video picture analysis for constructing the tracking algorithm.
Abstract: In this study, the theoretical relationship between pressure and density was investigated for cylindrical hollow fuel briquettes produced from a mixture of fibrous biomass material using a screw press without any chemical binder. The fuel briquettes were made of biomass and other waste material such as spent coffee beans, mielie husks, sawdust and coal fines under pressures of 0.878-2.2 megapascals (MPa). The material was densified into briquettes with an outer diameter of 100 mm, an inner diameter of 35 mm and a length of 50 mm. It was observed that manual screw compression produces briquettes of relatively low density compared to those made using hydraulic compression. The pressure-density relationship was obtained in the form of a power law and compares well with that of other cylindrical solid briquettes made using hydraulic compression. The produced briquettes have a dry density of 989 kg/m3 and contain 26.30% fixed carbon, 39.34% volatile matter, 10.9% moisture and 10.46% ash as per dry proximate analysis. Bomb calorimeter tests have shown the briquettes yielding a gross calorific value of 18.9 MJ/kg.
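A minimal sketch of recovering a power-law pressure-density relationship rho = a*P^b from measurements by a log-log least-squares fit; the data points below are placeholders, not the study's measurements.

import numpy as np

# Hypothetical (pressure MPa, dry density kg/m^3) pairs within the 0.878-2.2 MPa range.
P = np.array([0.878, 1.2, 1.6, 2.0, 2.2])
rho = np.array([830.0, 880.0, 930.0, 970.0, 989.0])

b, log_a = np.polyfit(np.log(P), np.log(rho), 1)   # fit log(rho) = b*log(P) + log(a)
a = np.exp(log_a)
print('rho ~ %.1f * P^%.3f' % (a, b))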
Abstract: The objective of this research was to study the influence of the marketing mix on customers' purchasing behavior. A total of 397 responses were collected from customers who were patrons of the Chatuchak Plaza market. A questionnaire was utilized as the data collection tool. Statistics utilized in this research included frequency, percentage, mean, standard deviation, and multiple regression analysis. Data were analyzed using the Statistical Package for the Social Sciences. The findings revealed that the majority of respondents were male, aged between 25-34 years, held an undergraduate degree, and were married and living together. The average income of respondents was between 10,001-20,000 baht. In terms of occupation, the majority worked for private companies. The analysis disclosed that three marketing mix variables, price (X2), place (X3), and product (X1), had an influence on the frequency of customer purchasing. These three variables explain about 30 percent of the variation in purchase frequency through the equation Y1 = 6.851 + .921(X2) + .949(X3) + .591(X1). It was also found that two marketing mix variables, physical characteristics (X6) and process (X7), had an influence on the amount of customer purchasing. These two variables explain about 17 percent of the variation in purchase amount through the equation Y2 = 2276.88 + 2980.97(X6) + 2188.09(X7).
Abstract: The system development life cycle (SDLC) is a process used during the development of any system. The SDLC consists of four main phases: analysis, design, implementation, and testing. During the analysis phase, a context diagram and data flow diagrams are used to produce the process model of a system. Consistency of the context diagram with the lower-level data flow diagrams is very important in streamlining the development process of a system. However, manually checking consistency from the context diagram to the lower-level data flow diagrams using a checklist is a time-consuming process. At the same time, the limitation of human ability to detect errors is one of the factors that influence the correctness and balancing of the diagrams. This paper presents a tool that automates the consistency check between Data Flow Diagrams (DFDs) based on the rules of DFDs. The tool serves two purposes: as an editor to draw the diagrams and as a checker to check the correctness of the diagrams drawn. The consistency check from the context diagram to the lower-level data flow diagrams is embedded inside the tool to overcome the manual checking problem.
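A minimal sketch of the kind of balancing rule such a checker automates: the inflows and outflows of a parent process must match the flows crossing the boundary of its child diagram. The data structures and names here are assumptions for illustration, not the tool's actual design.

def balanced(parent_inputs, parent_outputs, child_flows_in, child_flows_out):
    """True iff the child DFD's boundary flows match the parent process's flows."""
    missing_in = set(parent_inputs) - set(child_flows_in)
    missing_out = set(parent_outputs) - set(child_flows_out)
    extra_in = set(child_flows_in) - set(parent_inputs)
    extra_out = set(child_flows_out) - set(parent_outputs)
    for label, flows in [('missing input', missing_in),
                         ('missing output', missing_out),
                         ('unmatched child input', extra_in),
                         ('unmatched child output', extra_out)]:
        for f in sorted(flows):
            print('%s: %s' % (label, f))
    return not (missing_in or missing_out or extra_in or extra_out)

# Context-diagram process "Order System" vs. its level-1 decomposition.
print(balanced({'order'}, {'invoice', 'receipt'},
               {'order'}, {'invoice'}))   # False: 'receipt' missing in the child DFD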
Abstract: Image compression plays a vital role in today's communication. The limitation in allocated bandwidth leads to slower communication. To increase the rate of transmission within the limited bandwidth, image data must be compressed before transmission. Basically there are two types of compression: 1) lossy compression and 2) lossless compression. Though lossy compression gives more compression than lossless compression, the accuracy of retrieval is lower for lossy compression than for lossless compression. The JPEG and JPEG2000 image compression systems use Huffman coding for image compression. The JPEG2000 coding system uses the wavelet transform, which decomposes the image into different levels, where the coefficients in each sub-band are uncorrelated with the coefficients of other sub-bands. Embedded Zerotree Wavelet (EZW) coding exploits the multi-resolution properties of the wavelet transform to give a computationally simple algorithm with better performance than existing wavelet transforms. For further improvement of compression applications, other coding methods have recently been suggested. An ANN-based approach is one such method. Artificial neural networks have been applied to many problems in image processing and have demonstrated their superiority over classical methods when dealing with noisy or incomplete data in image compression applications. A performance analysis of different images is proposed, with an analysis of the EZW coding system combined with the error backpropagation algorithm. The implementation and analysis show approximately 30% more accuracy in the retrieved image compared to the existing EZW coding system.
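As a hedged illustration of the wavelet decomposition that EZW builds on (not of EZW itself or of the backpropagation network), a single-level 2D Haar transform in plain NumPy:

import numpy as np

def haar2d(img):
    """One level of the 2D Haar DWT: returns LL, LH, HL, HH sub-bands."""
    x = np.asarray(img, dtype=float)
    # Rows: average and difference of adjacent pixel pairs.
    lo = (x[:, 0::2] + x[:, 1::2]) / np.sqrt(2)
    hi = (x[:, 0::2] - x[:, 1::2]) / np.sqrt(2)
    # Columns: repeat on both row outputs.
    ll = (lo[0::2, :] + lo[1::2, :]) / np.sqrt(2)
    lh = (lo[0::2, :] - lo[1::2, :]) / np.sqrt(2)
    hl = (hi[0::2, :] + hi[1::2, :]) / np.sqrt(2)
    hh = (hi[0::2, :] - hi[1::2, :]) / np.sqrt(2)
    return ll, lh, hl, hh

ll, lh, hl, hh = haar2d(np.random.rand(8, 8))
print(ll.shape)   # (4, 4): coarse approximation; details live in LH/HL/HH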
Abstract: A method of gait identification based on the nearest neighbor classification technique, with motion similarity assessed by dynamic time warping, is proposed. Model-based kinematic motion data, represented by joint rotations coded as Euler angles and unit quaternions, are used. Different pose distance functions in the Euler angle and quaternion spaces are considered. To evaluate the individual features of the subsequent joints' movements during the gait cycle, joint selection is carried out. To examine the proposed approach, a database containing 353 gaits of 25 humans collected in a motion capture laboratory is used. The obtained results are promising. The classification which takes all joints into consideration has an accuracy of over 91%. Analysis of hip joint movements alone allows gaits to be correctly identified with almost 80% precision.
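A minimal sketch of 1-NN gait classification with dynamic time warping over 1D sequences; a fuller implementation would plug the paper's Euler-angle or quaternion pose distances into the cost term, and the gallery data here are synthetic.

import numpy as np

def dtw(a, b):
    """Dynamic time warping distance between two 1D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])          # a pose distance would go here
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def nn_classify(query, gallery):
    """gallery: list of (sequence, subject_id); returns the nearest subject."""
    return min(gallery, key=lambda g: dtw(query, g[0]))[1]

gallery = [(np.sin(np.linspace(0, 6, 60)), 'A'),
           (np.sin(np.linspace(0, 6, 60)) * 1.5, 'B')]
print(nn_classify(np.sin(np.linspace(0, 6, 50)), gallery))   # 'A'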
Abstract: The performance of high-resolution schemes is investigated for unsteady, inviscid and compressible multiphase flows. An Eulerian diffuse interface approach has been chosen for the simulation of multicomponent flow problems. The reduced five-equation and seven-equation models are used with the HLL and HLLC approximations. The authors demonstrate the advantages and disadvantages of both the seven-equation and five-equation models by studying their performance with the HLL and HLLC algorithms on a simple test case. The seven-equation model is based on the two-pressure, two-velocity concept of Baer–Nunziato [10], while the five-equation model is based on the mixture velocity and pressure. The numerical evaluations of the two variants of Riemann solvers have been conducted for the classical one-dimensional air-water shock tube and compared with the analytical solution for error analysis.
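A hedged single-interface sketch of the HLL flux for the 1D Euler equations with simple Davis-type wave speed estimates; the five- and seven-equation multiphase models add volume fractions and (for Baer–Nunziato) separate phase velocities and pressures on top of this building block, and HLLC further restores the contact wave.

import numpy as np

GAMMA = 1.4  # ideal-gas ratio of specific heats (illustrative)

def euler_flux(U):
    """Physical flux of the 1D Euler equations, U = (rho, rho*u, E)."""
    rho, mom, E = U
    u = mom / rho
    p = (GAMMA - 1.0) * (E - 0.5 * rho * u * u)
    return np.array([mom, mom * u + p, (E + p) * u])

def hll_flux(UL, UR):
    """HLL approximate Riemann flux at an interface between states UL, UR."""
    def u_and_a(U):
        rho, mom, E = U
        u = mom / rho
        p = (GAMMA - 1.0) * (E - 0.5 * rho * u * u)
        return u, np.sqrt(GAMMA * p / rho)
    uL, aL = u_and_a(UL)
    uR, aR = u_and_a(UR)
    SL, SR = min(uL - aL, uR - aR), max(uL + aL, uR + aR)   # Davis estimates
    FL, FR = euler_flux(UL), euler_flux(UR)
    if SL >= 0.0:
        return FL
    if SR <= 0.0:
        return FR
    return (SR * FL - SL * FR + SL * SR * (UR - UL)) / (SR - SL)

# Sod-like left/right states: (rho, rho*u, E) with E = p/(GAMMA-1) + 0.5*rho*u^2.
print(hll_flux(np.array([1.0, 0.0, 2.5]), np.array([0.125, 0.0, 0.25])))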
Abstract: Renewable energy systems are becoming a topic of great interest and investment in the world. In recent years, wind power generation has experienced very fast development across the whole world. For the planning and successful implementation of good wind power plant projects, wind potential measurements are required. In these projects, the effective choice of the micro-location for wind potential measurements, the installation of the measurement station with the appropriate measuring equipment, its maintenance, and the analysis of the gathered data on wind potential characteristics are of great importance. In this paper, a wavelet transform has been applied to analyze wind speed data in order to gain insight into the characteristics of the wind and to select suitable locations that could be the subject of wind farm construction. This approach shows that it can be a useful tool in the investigation of wind potential.
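A minimal sketch of applying a continuous wavelet transform to a wind speed record with PyWavelets; the synthetic signal, Morlet wavelet and scale range are assumptions for illustration, not the paper's setup.

import numpy as np
import pywt

# Synthetic hourly wind speed: daily cycle plus random gusts (illustrative only).
hours = np.arange(24 * 30)                         # one month, hourly samples
speed = 7 + 2 * np.sin(2 * np.pi * hours / 24) + np.random.randn(hours.size)

scales = np.arange(1, 64)
coeffs, freqs = pywt.cwt(speed, scales, 'morl', sampling_period=1.0)  # freqs in 1/hour
dominant = freqs[np.argmax(np.mean(np.abs(coeffs), axis=1))]
print('dominant frequency ~ %.4f cycles/hour (period ~ %.1f h)' % (dominant, 1 / dominant))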
Abstract: Three novel and significant contributions are made in this paper. Firstly, a non-recursive formulation of Haar connection coefficients, pioneered by the present authors, is presented, which can be computed very efficiently and avoids stack and memory overflows. Secondly, a generalized approach for the state analysis of singular bilinear time-invariant (TI) and time-varying (TV) systems is presented, vis-à-vis the diversified and complex works reported by different authors. Thirdly, a generalized approach for the parameter estimation of bilinear TI and TV systems is also proposed. The unified framework of the proposed method is very significant in that the digital hardware, once designed, can be used to perform the complex tasks of state analysis and parameter estimation of different types of bilinear systems single-handedly. The simplicity, effectiveness and generalized nature of the proposed method are established by applying it to different types of bilinear systems for the two tasks.
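The authors' non-recursive connection-coefficient formulation is not reproduced here; as a hedged illustration of the underlying basis only, the sketch below builds the orthonormal Haar transform matrix non-recursively from index arithmetic.

import numpy as np

def haar_matrix(N):
    """Orthonormal Haar matrix of size N (N a power of two), built directly."""
    H = np.zeros((N, N))
    H[0, :] = 1.0 / np.sqrt(N)
    t = np.arange(N) / float(N)
    for m in range(1, N):
        p = int(np.floor(np.log2(m)))        # scale index: m = 2**p + q
        q = m - 2 ** p
        amp = 2 ** (p / 2.0) / np.sqrt(N)
        H[m, (t >= q / 2.0**p) & (t < (q + 0.5) / 2.0**p)] = amp
        H[m, (t >= (q + 0.5) / 2.0**p) & (t < (q + 1.0) / 2.0**p)] = -amp
    return H

H = haar_matrix(8)
print(np.allclose(H @ H.T, np.eye(8)))       # True: rows are orthonormal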
Abstract: In recent years, multi-agent systems have emerged as one of the most interesting architectures facilitating distributed collaboration and distributed problem solving. Each node (agent) of the network might pursue its own agenda, exploit its environment, develop its own problem-solving strategy, and establish the required communication strategies. Within each node of the network, one could encounter a diversity of problem-solving approaches. Quite commonly, the agents can carry out their processing at the level of information granules that is most suitable from their local points of view. Information granules can come at various levels of granularity. Each agent could exploit a certain formalism of information granulation, engaging the machinery of fuzzy sets, interval analysis, or rough sets, to name just a few dominant technologies of granular computing. With this in mind, a fundamental issue arises of forming effective interaction linkages between the agents so that they can fully broadcast their findings and benefit from interacting with others.