Abstract: The algorithm represents the DCT coefficients so as to concentrate signal energy and proposes a combination scheme to eliminate the correlation within same-level subbands when encoding DCT-based images. This work adopts the DCT and modifies the SPIHT algorithm to encode the DCT coefficients. The proposed algorithm also provides an enhancement function at low bit rates in order to improve perceptual quality. Experimental results indicate that the proposed technique improves the quality of the reconstructed image in terms of PSNR, with perceptual results close to JPEG2000 at the same bit rate.
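DCT-based coders work because the transform concentrates most of a smooth block's energy into a few low-frequency coefficients, which SPIHT-style set-partitioning then encodes efficiently. As an illustrative sketch (not the paper's coder), a naive orthonormal 2-D DCT-II applied to a smooth 4x4 block shows the energy-compaction effect:

```python
import math

def dct2(block):
    """Naive 2-D DCT-II of a square block (orthonormal scaling)."""
    n = len(block)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = 0.0
            for x in range(n):
                for y in range(n):
                    s += (block[x][y]
                          * math.cos(math.pi * (2 * x + 1) * u / (2 * n))
                          * math.cos(math.pi * (2 * y + 1) * v / (2 * n)))
            cu = math.sqrt(1.0 / n) if u == 0 else math.sqrt(2.0 / n)
            cv = math.sqrt(1.0 / n) if v == 0 else math.sqrt(2.0 / n)
            out[u][v] = cu * cv * s
    return out

# A smooth 4x4 block (a gentle gradient), typical of natural-image content.
block = [[10 + x + y for y in range(4)] for x in range(4)]
coeffs = dct2(block)

# Energy concentrates in the DC coefficient (top-left corner).
total = sum(c * c for row in coeffs for c in row)
dc = coeffs[0][0] ** 2
print(dc / total)  # close to 1 for smooth content
```

By Parseval's relation the total coefficient energy equals the pixel energy, so the printed ratio directly measures how much of the signal a few coefficients carry.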
Abstract: This paper proposes a modeling methodology for the
development of data analysis solutions. The author introduces an
approach to address data warehousing issues at the enterprise level.
The methodology covers the requirements elicitation and analysis
stage as well as the initial design of the data warehouse. The paper
reviews an extended business process model that satisfies the needs of
data warehouse development. The author considers the use of
business process models necessary, as they reflect both enterprise
information systems and business functions, which are important for
data analysis. The described approach divides development into
three steps with different levels of model elaboration, and makes it
possible to gather requirements and present them to business users
in an accessible manner.
Abstract: We consider a heterogeneously mixing stochastic SIR
epidemic process in populations described by a general graph.
Likelihood theory is developed to facilitate statistical inference for the
parameters of the model under complete observation. We show that
these estimators are asymptotically unbiased and Gaussian by
using a martingale central limit theorem.
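The process described above can be simulated directly. The following is a minimal sketch (not the paper's estimator) of a continuous-time SIR epidemic on a graph using Gillespie's algorithm; the complete event history it generates is exactly the kind of observation the likelihood theory assumes:

```python
import random

def sir_on_graph(adj, beta, gamma, initial_infected, seed=0):
    """Continuous-time stochastic SIR on a graph via Gillespie's algorithm.

    adj: dict node -> set of neighbours; beta: per-edge infection rate;
    gamma: per-individual recovery rate.  Returns the final state dict."""
    rng = random.Random(seed)
    state = {v: "S" for v in adj}
    for v in initial_infected:
        state[v] = "I"
    t = 0.0
    while True:
        # Enumerate every possible next event and its rate.
        events = []
        for v, s in state.items():
            if s == "I":
                events.append(("recover", v, gamma))
                for u in adj[v]:
                    if state[u] == "S":
                        events.append(("infect", u, beta))
        total = sum(r for _, _, r in events)
        if total == 0.0:
            return state  # epidemic over: no infectives remain
        t += rng.expovariate(total)
        # Pick an event with probability proportional to its rate.
        x = rng.uniform(0.0, total)
        for kind, v, r in events:
            x -= r
            if x <= 0.0:
                state[v] = "R" if kind == "recover" else "I"
                break

# A small cycle graph with one initial infective.
adj = {i: {(i - 1) % 6, (i + 1) % 6} for i in range(6)}
final = sir_on_graph(adj, beta=1.0, gamma=0.5, initial_infected=[0])
print(sorted(final.values()))
```

Every realization ends with no infectives, and initially infected nodes are recovered by termination, which the assertions below exploit.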
Abstract: Automatic face detection is a complex problem in
image processing. Many methods exist to solve it, such as
template matching, Fisher Linear Discriminant, neural networks,
SVM, and MRC, each achieving success with varying degrees of
accuracy and complexity. The proposed algorithm targets
upright, frontal faces in single gray-scale images with decent
resolution under good lighting conditions. In face recognition,
a single face is matched against single faces from the training
dataset. The author proposes a neural-network-based face
detection algorithm for photographs; any test data that appears
is checked against the online scanned training dataset.
Experimental results show that the algorithm achieves detection
accuracy of up to 95%.
Abstract: The influence of an axial magnetic field (B = 0.48 T) on
the variation of the ionization efficiency coefficient η and the secondary
electron emission coefficient γ with the reduced electric field
E/P is studied over a new range of plane-parallel electrode spacings (0 <
d < 20 cm) and nitrogen working pressures between 0.5 and 20
Pa. The axial magnetic field is produced by an inductive copper
coil of radius 5.6 cm. The experimental breakdown voltage data are
used to estimate the mean Paschen curves under the different working
conditions. The secondary electron emission coefficient is calculated
from the mean Paschen curve and used to determine the minimum
breakdown voltage. A reduction of the discharge voltage of about 25% is
observed on application of the axial magnetic field. At large interelectrode
spacings, the effect of the axial magnetic field becomes more
significant for the obtained values of η but less so for the values
of γ.
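For reference, the classical field-free Paschen law relates breakdown voltage to the product pd and to the secondary emission coefficient γ. A sketch with illustrative constants (not the nitrogen values measured in this work) locates the minimum breakdown voltage analytically:

```python
import math

def paschen_voltage(pd, A, B, gamma):
    """Breakdown voltage from the Paschen law
    V = B*pd / (ln(A*pd) - ln(ln(1 + 1/gamma)))."""
    denom = math.log(A * pd) - math.log(math.log(1.0 + 1.0 / gamma))
    if denom <= 0.0:
        return float("inf")  # no breakdown possible at this pd
    return B * pd / denom

# Illustrative constants, not nitrogen-specific.
A, B, gamma = 15.0, 365.0, 0.01

# Analytic minimum of the curve: pd_min = (e/A) * ln(1 + 1/gamma),
# at which V_min = (B/A) * e * ln(1 + 1/gamma).
pd_min = math.e / A * math.log(1.0 + 1.0 / gamma)
v_min = paschen_voltage(pd_min, A, B, gamma)
print(pd_min, v_min)
```

The curve rises on both sides of pd_min, which is why the measured breakdown data trace out a U-shaped Paschen curve.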
Abstract: Graph-based image segmentation techniques are
considered among the most efficient segmentation techniques
and are mainly used as time- and space-efficient methods for real-time
applications. However, there is a need to improve the
quality of the segmented images obtained from earlier graph-based
methods. This paper proposes an improvement to the graph-based
image segmentation methods already described in the literature. We
contribute to the existing method by proposing the use of a weighted
Euclidean distance to calculate the edge weight, which is the key
element in building the graph. We also propose a slight modification
of the segmentation method already described in the literature, which
results in the selection of more prominent edges in the graph. The
experimental results show an improvement in segmentation
quality compared to existing methods, with a slight
compromise in efficiency.
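One possible form of the weighted Euclidean edge weight is sketched below; the channel weights are illustrative luminance-style values, not necessarily those chosen by the authors:

```python
def edge_weight(p, q, w=(0.30, 0.59, 0.11)):
    """Weighted Euclidean distance between two RGB pixels, used as the
    weight of the graph edge joining neighbouring pixels.  The default
    channel weights are hypothetical luminance-style values."""
    return sum(wi * (a - b) ** 2 for wi, a, b in zip(w, p, q)) ** 0.5

# The same 100-unit difference yields a larger weight in the heavily
# weighted green channel than in the lightly weighted blue channel,
# making green-contrast edges more prominent in the graph.
g_edge = edge_weight((10, 200, 30), (10, 100, 30))
b_edge = edge_weight((10, 200, 30), (10, 200, 130))
print(g_edge, b_edge)
```

Tuning the per-channel weights is precisely the lever that lets the segmentation favour perceptually prominent edges.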
Abstract: Microcirculation is essential for the proper supply of
oxygen and nutritive substances to biological tissue and the
removal of metabolic waste products. The determination of
blood flow in the capillaries is therefore of great interest to clinicians,
for example in the diagnosis of vascular diseases of the upper
extremities. A comparison between the developed non-invasive,
non-contact, whole-field laser speckle contrast imaging (LSCI)
technique and a commercially available laser Doppler blood
flowmeter (LDF) for evaluating blood flow at the fingertip and
elbow is presented here. The LSCI technique gives more
quantitative information on the velocity of blood than the
perfusion values obtained using the LDF.
Abstract: Software effort estimation is the process of predicting
the most realistic effort required to develop or maintain
software based on incomplete, uncertain, and/or noisy input. Effort
estimates may be used as input to project plans, iteration plans, and
budgets. Various models, such as the Halstead, Walston-Felix,
Bailey-Basili, Doty, and GA-based models, have already been used
to estimate software effort for projects. In this study, statistical
models, a Fuzzy-GA hybrid, and Neuro-Fuzzy (NF) inference systems
are evaluated for estimating software effort. The
performance of the developed models was tested on NASA
software project datasets, and the results are compared with the Halstead,
Walston-Felix, Bailey-Basili, Doty, and genetic-algorithm-based
models mentioned in the literature. The NF model achieves the
lowest MMRE and RMSE values, outperforming the Fuzzy-GA
hybrid inference system and the other existing models used for
effort prediction.
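The two evaluation criteria used above are standard. A small sketch with hypothetical effort values shows how competing models are compared on them:

```python
import math

def mmre(actual, predicted):
    """Mean Magnitude of Relative Error: mean of |a - p| / a."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root Mean Squared Error."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted))
                     / len(actual))

# Hypothetical actual efforts (person-months) and two model outputs.
actual = [10.0, 20.0, 40.0]
model_a = [12.0, 18.0, 44.0]
model_b = [15.0, 30.0, 20.0]
print(mmre(actual, model_a), rmse(actual, model_a))
print(mmre(actual, model_b), rmse(actual, model_b))
```

Lower is better on both criteria; MMRE penalizes relative error (so small projects count as much as large ones), while RMSE penalizes absolute error.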
Abstract: In this paper, we start by characterizing the most
important and distinguishing features of wavelet-based watermarking
schemes, after studying the overwhelming number of algorithms
proposed in the literature. The copyright protection application
scenario is considered and, building on the experience gained,
two distinct watermarking schemes were implemented. A detailed
comparison and the obtained results are presented and discussed. We
conclude that Joo's [1] technique is more robust to standard noise
attacks than Dote's [2] technique.
Abstract: In this work we adopt a combination of the Laplace
transform and the decomposition method to find numerical solutions
of a system of multi-pantograph equations. The procedure leads to
rapid convergence of the series to the exact solution after computing
only a few terms. The effectiveness of the method is demonstrated in some
examples by obtaining the exact solution, and in others by computing
the absolute error, which decreases as the number of terms in the series
increases.
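For a single pantograph equation y'(t) = a y(t) + b y(qt) with y(0) = 1, the Laplace-decomposition procedure builds the power-series solution term by term. A minimal sketch of the resulting partial sums, using the Taylor recurrence that the method reproduces (matching coefficients of t^n gives c_{n+1} = (a + b q^n) c_n / (n + 1)):

```python
import math

def pantograph_series(a, b, q, t, n_terms=20):
    """Partial sum of the power series solving y'(t) = a*y(t) + b*y(q*t),
    y(0) = 1, via the recurrence c_{n+1} = (a + b*q**n) * c_n / (n + 1)."""
    c, total, tn = 1.0, 1.0, 1.0
    for n in range(n_terms - 1):
        c = (a + b * q ** n) * c / (n + 1)
        tn *= t
        total += c * tn
    return total

# With b = 0 the equation reduces to y' = a*y, whose exact solution is
# e^{a t}, so the absolute error of the partial sum can be checked directly.
err_few = abs(pantograph_series(-1.0, 0.0, 0.5, 1.0, n_terms=5) - math.exp(-1.0))
err_many = abs(pantograph_series(-1.0, 0.0, 0.5, 1.0, n_terms=20) - math.exp(-1.0))
print(err_few, err_many)
```

As the abstract states, the absolute error decreases as the number of terms in the series increases, which the two printed errors illustrate.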
Abstract: The paper presents a computational tool developed for
the evaluation of technical and economic advantages of an innovative
cleaning and conditioning technology of fluidized bed steam/oxygen
gasifiers outlet product gas. This technology integrates into a single
unit the steam gasification of biomass and the hot gas cleaning and
conditioning system. Both components of the computational tool,
process flowsheet and economic evaluator, have been developed
under the IPSEpro software. The economic model provides information
that can help potential users, especially small and medium-sized
enterprises active in the renewable energy field, to decide the
optimal scale of a plant and to better understand both the potential and
the limits of the system when applied to a wide range of conditions.
Abstract: Principal Component Analysis (PCA) has many
important applications, especially in pattern detection tasks
such as face detection and recognition. For real-time
applications, the response time is therefore required to be as
small as possible. In this paper, a new implementation of PCA
for fast face detection is presented. It is based on cross-correlation
in the frequency domain between the input image and the
eigenvectors (weights). Simulation results show that the proposed
implementation of PCA is faster than the conventional one.
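The speed-up rests on the correlation theorem: cross-correlation in the spatial domain becomes a pointwise product in the frequency domain, replacing a full sliding-window search by one transform pair. A 1-D sketch with a naive DFT (a real implementation would use an FFT over the 2-D image) illustrates the identity:

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def circular_xcorr(a, b):
    """Circular cross-correlation via the correlation theorem:
    xcorr = IDFT(conj(DFT(a)) * DFT(b))."""
    A, B = dft(a), dft(b)
    return [c.real for c in idft([ai.conjugate() * bi for ai, bi in zip(A, B)])]

# A toy "eigenvector" correlated against a circularly shifted copy of
# itself: the correlation peak recovers the shift.
template = [1.0, 2.0, 3.0, 0.0]
image = template[-1:] + template[:-1]  # circular shift by one
scores = circular_xcorr(template, image)
print(scores.index(max(scores)))
```

The peak location identifies where the template best matches, which is exactly the quantity a sliding-window detector searches for.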
Abstract: The healthcare environment is generally perceived as
information rich yet knowledge poor, since there is a lack
of effective analysis tools to discover hidden relationships and trends
in the data. Valuable knowledge can, however, be discovered by
applying data mining techniques in healthcare systems. In this
study, an efficient methodology is presented for extracting significant
patterns from Coronary Heart Disease data warehouses for heart
attack prediction; heart attack unfortunately continues to be a leading cause
of mortality worldwide. For this purpose, we propose to
enumerate dynamically the optimal subsets of reduced features
of high interest using a rough sets technique combined with
dynamic programming, and to validate the classification using a
Random Forest (RF) decision tree to identify risky heart disease
cases. This work is based on a large amount of data collected from
several clinical institutions, drawing on the medical profiles of
patients. Moreover, the experts' knowledge in this field has been
taken into consideration in order to define the disease and its risk
factors, and to establish significant knowledge relationships among
the medical factors. A computer-aided system is developed for this
purpose based on a population of 525 adults. The performance of
the proposed model is analyzed and evaluated against a set of
benchmark techniques applied to this classification problem.
Abstract: The Wavelet-Galerkin finite element method for
solving the one-dimensional heat equation is presented in this work.
Two types of basis functions, the Lagrange and multi-level wavelet
bases, are employed to derive the full form of the matrix system,
and both linear and quadratic bases are considered in the Galerkin
method. The time derivative is approximated by a polynomial time
basis that allows the order of approximation in time to be extended
easily. Our numerical results show that the rates of convergence for
the linear Lagrange and linear wavelet bases are both of order 2,
while the rates for the quadratic Lagrange and quadratic wavelet
bases are approximately of order 4. The results also reveal that the
wavelet basis provides an easy way to improve numerical
resolution: one simply increases the desired number of levels in the
multilevel construction process.
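Convergence orders of the kind reported above are typically measured by halving the mesh and taking log2 of the error ratio. A sketch of that measurement, using piecewise-linear interpolation of sin(x) as a stand-in second-order approximation (not the paper's solver):

```python
import math

def interp_error(n):
    """Approximate max error of piecewise-linear interpolation of sin(x)
    on [0, pi] with n subintervals, sampled at subinterval midpoints."""
    h = math.pi / n
    err = 0.0
    for i in range(n):
        x0, x1 = i * h, (i + 1) * h
        xm = 0.5 * (x0 + x1)                   # midpoint of the subinterval
        lin = 0.5 * (math.sin(x0) + math.sin(x1))  # linear interpolant there
        err = max(err, abs(math.sin(xm) - lin))
    return err

# Observed order of convergence from two grids: log2(e_h / e_{h/2}).
e1, e2 = interp_error(16), interp_error(32)
rate = math.log(e1 / e2, 2)
print(rate)  # close to 2 for a linear basis
```

The same two-grid measurement applied to the quadratic bases would yield a rate near 4, matching the orders quoted in the abstract.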
Abstract: Global warming and climate change have become
major public concerns in recent years, and their consequences
have appeared, or will appear, in most parts of the planet.
Temperature and precipitation are two main parameters in
climatology, and any change in them causes widespread changes in
an ecosystem and its natural and human structure. One of the
important consequences is change in surface and underground water
resources. The Zayanderood watershed, containing the main central
river of Iran, has faced water shortage in recent years, resulting in
drought in the Gavkhuni swamp and in the river itself. Managers and
experts in the provinces that consume Zayanderood water
believe that global warming, decreased rainfall, and climate
change are the main reasons for the reduction in water. Through
statistical investigation of 46 years of annual precipitation and
temperature records from stations inside and outside the Zayanderood
watershed, and by using the Mann-Kendall method, precipitation and
temperature trends in this basin have been analyzed. According to
the results, there was no noticeable decreasing or increasing trend
in precipitation or annual temperature in the basin as a whole during
this period. However, for precipitation, a noticeable decrease was
observed in a small part of the western basin and a noticeable
increase in some parts of the eastern and southern basin.
Furthermore, the investigation of annual temperature trends showed
a noticeable increase in some parts of the western and eastern basin,
as well as a noticeable increasing temperature trend in the central
parts of metropolitan Esfahan.
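The Mann-Kendall test applied above is nonparametric: it counts concordant minus discordant pairs in the series and normalizes by the variance of that count. A minimal sketch, omitting the tie correction:

```python
import math

def mann_kendall_z(series):
    """Mann-Kendall trend statistic Z (no tie correction).
    |Z| > 1.96 indicates a significant monotone trend at the 5% level."""
    n = len(series)
    # S = number of increasing pairs minus number of decreasing pairs.
    s = sum((series[j] > series[i]) - (series[j] < series[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / math.sqrt(var)
    if s < 0:
        return (s + 1) / math.sqrt(var)
    return 0.0

rising = [i + (0.5 if i % 2 else -0.5) for i in range(20)]  # noisy increase
flat = [5.0, 4.0, 6.0, 5.0, 4.0, 6.0, 5.0, 4.0, 6.0, 5.0]
print(mann_kendall_z(rising), mann_kendall_z(flat))
```

Applied per-station to 46 annual values, this is how a "noticeable" (significant) trend would be distinguished from noise.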
Abstract: The principal focus of this study is the
measurement and analysis of labor learning in Pakistan. At the
aggregate economy level the study focuses on labor productivity
movements, and at the large-scale manufacturing level on the cost
structure, isolating the contribution of the learning curve. The
analysis of the S-shaped curve suggests that learning occurs only in
the lower half of the aggregate learning curve, while the other half
shows retardation in learning, and hence retardation in productivity
movements. The study implies the existence of learning economies
in terms of cost reduction: input cost per unit produced decreases by
0.51 percent every time the cumulative production output doubles.
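Under the usual power-law learning-curve model c(x) = c0 * x^(-b), each doubling of cumulative output x multiplies unit cost by 2^(-b), so the reported 0.51 percent reduction per doubling pins down the exponent directly. A small check:

```python
import math

# Cost multiplier per doubling implied by a 0.51% reduction.
progress_ratio = 1.0 - 0.0051

# Learning-curve exponent b from 2**(-b) = progress_ratio.
b = -math.log(progress_ratio, 2)
print(b)

# Sanity check: doubling output from 100 to 200 units reduces unit cost
# by exactly 0.51% under c(x) = x**(-b).
c = lambda x: x ** (-b)
print(1.0 - c(200.0) / c(100.0))
```

The implied exponent is small (roughly 0.007), consistent with the modest learning economies the study reports.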
Abstract: Decision fusion is one of the hot research topics in
classification, aiming to achieve the best possible
performance for the task at hand. In this paper, we
investigate the usefulness of this concept for improving change
detection accuracy in remote sensing. The outputs of
two fuzzy change detectors, based respectively on
simultaneous and comparative analysis of multitemporal
data, are fused using fuzzy integral operators. This
method fuses the objective evidence produced by the
change detectors with respect to fuzzy measures that express
the difference in performance between them. The proposed
fusion framework is evaluated in comparison with some
ordinary fuzzy aggregation operators. Experiments carried
out on two SPOT images show that the fuzzy integral
performed best: it improves the change detection
accuracy while tending to equalize the accuracy rates of
the change and no-change classes.
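Fuzzy integral operators such as the discrete Choquet integral weight coalitions of detectors, not just individual detectors, via a fuzzy measure. A minimal sketch with two hypothetical detectors (the measure values are illustrative, not those learned in the paper):

```python
def choquet(scores, measure):
    """Discrete Choquet integral of detector scores with respect to a
    fuzzy measure.  `measure` maps frozensets of detector names to
    weights in [0, 1], with measure of the empty set 0 and of the full
    set 1."""
    items = sorted(scores.items(), key=lambda kv: kv[1])  # ascending scores
    names = [k for k, _ in items]
    total, prev = 0.0, 0.0
    for i, (name, val) in enumerate(items):
        remaining = frozenset(names[i:])  # detectors scoring >= val
        total += (val - prev) * measure[remaining]
        prev = val
    return total

# A superadditive measure rewards agreement between the two detectors
# more strongly than an additive weighting would.
measure = {
    frozenset(): 0.0,
    frozenset({"simultaneous"}): 0.3,
    frozenset({"comparative"}): 0.3,
    frozenset({"simultaneous", "comparative"}): 1.0,
}
print(choquet({"simultaneous": 0.8, "comparative": 0.6}, measure))
```

With an additive measure the Choquet integral collapses to an ordinary weighted average, which is exactly the "ordinary fuzzy aggregation operator" baseline the fusion framework is compared against.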
Abstract: RFID (Radio Frequency IDentification) systems are
widely used in daily life, for example in transport systems, passports,
automotive applications, animal tracking, human implants, and libraries.
However, the authentication protocols between RF (Radio
Frequency) tags and RF readers have brought about various
privacy problems, such as loss of tag anonymity, tracking, and
eavesdropping. Many researchers have proposed solutions to these
problems, but issues such as location privacy and mutual
authentication remain. In this paper, we expose the problems of
the previous protocols, and then we propose a more secure and
efficient RFID authentication protocol.
Abstract: A series of tests on a cold-formed steel (CFS) wall plate system subjected to an uplift force at the mid-span of the wall plate is presented. The aim of the study was to investigate the behaviour and identify the modes of failure of the CFS wall plate system. Two parameters were considered: 1) different dimensions of the U-bracket at the supports and 2) different sizes of the lipped C-channel. The lipped C-channels used were C07508, C07512 and C10012. The dimensions of the legs of the U-brackets were 50x35 mm and 50x60 mm respectively, where a 25 mm clearance was provided to the connections for specimens with clearance. Results show that specimens with and without clearance experienced the same mode of failure: failure began with the yielding of the connectors, followed by distortional buckling of the wall plate. However, when C075 sections were used as the wall plate, the system behaved differently. There was a large deformation in the wall plate, and failure began with distortional buckling of the wall plate followed by bearing of the connecting plates at the supports (U-brackets). The ultimate strength of the system also decreased dramatically when C075 sections were used.
Abstract: This paper presents a new method for
computing the reliability of a system arranged in a series or
parallel configuration. In this method we estimate the life distribution
function of the whole structure using the asymptotic Extreme Value (EV)
distribution of Type I, or Gumbel theory: the EV distribution in its
minimal form is used to estimate the life distribution function of a
series structure, and in its maximal form for a parallel system. All
parameters are estimated by the method of moments. The reliability
function, the failure (hazard) rate, and the p-th percentile point of each
function are determined. Other important indexes, such as the Mean
Time To Failure (MTTF) and Mean Time To Repair (MTTR), for
non-repairable and renewable systems in both series and parallel
structures are also computed.
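The series/parallel structure enters through the minimum/maximum of the component lifetimes: a series system fails with its first component, a parallel system with its last. A minimal sketch with exponential components (an illustrative choice, whereas the paper fits Gumbel EV distributions) shows the resulting reliability functions and the MTTF by numerical integration:

```python
import math

def r_series(t, lambdas):
    """Series system: fails at the minimum of component lifetimes,
    so R(t) is the product of component reliabilities."""
    p = 1.0
    for lam in lambdas:
        p *= math.exp(-lam * t)
    return p

def r_parallel(t, lambdas):
    """Parallel system: fails at the maximum of component lifetimes,
    so R(t) = 1 - product of component unreliabilities."""
    q = 1.0
    for lam in lambdas:
        q *= 1.0 - math.exp(-lam * t)
    return 1.0 - q

def mttf(r, dt=0.001, t_max=50.0):
    """MTTF = integral of R(t) dt, here by a simple Riemann sum."""
    return sum(r(i * dt) * dt for i in range(int(t_max / dt)))

# Two exponential components: the series MTTF is exactly 1/(l1 + l2),
# and the parallel MTTF is 1/l1 + 1/l2 - 1/(l1 + l2).
lams = [0.5, 1.0]
series_mttf = mttf(lambda t: r_series(t, lams))
parallel_mttf = mttf(lambda t: r_parallel(t, lams))
print(series_mttf, parallel_mttf)
```

The same R(t) functions, with the exponential survival replaced by a fitted Gumbel-min or Gumbel-max survival, give the quantities the paper derives analytically.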