Abstract: Signal compression algorithms are making steady progress. These algorithms are continuously improved by new tools and aim to reduce, on average, the number of bits needed to represent a signal while minimizing the reconstruction error. This article proposes the compression of Arabic speech signals by a hybrid method combining the wavelet transform and linear prediction. The adopted approach rests, on one hand, on decomposing the original signal with analysis filters, followed by a compression stage, and on the other hand, on applying order-5 linear prediction to the compressed signal coefficients. The aim of this approach is to estimate the prediction error, which is then coded and transmitted. A decoding operation reconstitutes the original signal. An adequate choice of the filter bank used for the transform is necessary to increase the compression rate while keeping the distortion imperceptible from an auditory point of view.
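The two stages of such a hybrid scheme can be illustrated with a minimal numpy sketch. This is only a sketch under stated assumptions: a one-level Haar analysis filter bank stands in for the paper's (unspecified) filter bank, and the order-5 predictor is fitted by least squares; all names and parameters are illustrative, not the authors' implementation.

```python
import numpy as np

def haar_analysis(x):
    """One-level Haar analysis filter bank: approximation and detail bands."""
    x = x[: len(x) // 2 * 2]
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass (approximation)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass (detail)
    return a, d

def lp_residual(c, order=5):
    """Order-5 linear prediction of the coefficients; returns the prediction error."""
    # Regression matrix of the `order` past samples, solved by least squares.
    X = np.column_stack([c[order - k - 1 : len(c) - k - 1] for k in range(order)])
    y = c[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ coeffs

# A smooth test signal: the prediction error carries far less energy than
# the coefficients themselves, which is what makes it cheap to code.
t = np.linspace(0, 1, 512, endpoint=False)
signal = np.sin(2 * np.pi * 8 * t)
approx, detail = haar_analysis(signal)
residual = lp_residual(approx, order=5)
print(np.var(residual) < 0.01 * np.var(approx))
```

Coding the low-energy residual instead of the coefficients is the source of the bit-rate saving the abstract refers to.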
Abstract: In this paper, a robust digital image watermarking
scheme for copyright protection applications using the singular value
decomposition (SVD) is proposed. In this scheme, an entropy
masking model is applied to the host image for texture
segmentation. Moreover, the local luminance and texture of the host
image are taken into account in the watermark embedding procedure to
increase the robustness of the watermarking scheme. In contrast to all
existing SVD-based watermarking systems that have been designed
to embed visual watermarks, our system uses a pseudo-random
sequence as a watermark. We have tested the performance of our
method using a wide variety of image processing attacks on different
test images. A comparison is made between the results of our
proposed algorithm with those of a wavelet-based method to
demonstrate the superior performance of our algorithm.
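The core idea of embedding a pseudo-random sequence via SVD can be sketched in a few lines of numpy. This is a minimal non-blind sketch under assumptions of my own: additive embedding in the singular values with a strength `alpha`, and detection that has access to the host image; the entropy-masking and luminance steps of the proposed scheme are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
host = rng.random((64, 64))          # stand-in for a host image block
alpha = 0.05                          # embedding strength (illustrative)

# Embedding: perturb the singular values with a pseudo-random sequence.
U, S, Vt = np.linalg.svd(host, full_matrices=False)
w = np.random.default_rng(42).standard_normal(len(S))   # watermark sequence
marked = U @ np.diag(S + alpha * w) @ Vt

# Non-blind detection: project the marked image back onto the host's
# singular vectors and correlate the recovered sequence with the watermark.
recovered = (np.diag(U.T @ marked @ Vt.T) - S) / alpha
similarity = np.corrcoef(recovered, w)[0, 1]
print(similarity > 0.99)
```

A detection statistic near 1 indicates the watermark is present; under attacks, the correlation degrades gracefully rather than failing outright, which is what makes singular values a robust embedding domain.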
Abstract: In this study, hydroxyapatite (HA) composites were
prepared by adding a 30%CaO-30%P2O5-40%Na2O based glass to
pure HA, in proportions of 2, 5, and 10 wt %. Each composition was
sintered over a range of temperatures. The quantitative phase
analysis was carried out using XRD and the microstructures were
studied using SEM. The density, microhardness, and compressive
strength increased with the amount of glass added. The resulting
composites have chemical compositions similar to the inorganic
constituent of the mineral part of bone, and contain trace
elements such as Na. X-ray diffraction showed no
decomposition of HA into secondary phases; however, the glass
reinforced-HA composites contained a HA phase and variable
amounts of tricalcium phosphate phase, depending on the amount of
bioglass added. The HA-composite material exhibited higher
compressive strength compared to sintered HA. The HA composite
reinforced with 10 wt % bioglass showed the highest bioactivity level.
Abstract: We study in this paper the effect of scene
changes on an image-sequence coding system using the Embedded
Zerotree Wavelet (EZW). The scene change considered here is the
full-motion change that may occur. A special image sequence is generated
in which scene changes occur randomly. Two scenarios are
considered. In the first scenario, the system must provide the
best possible reconstruction quality by managing the
bit rate (BR) when a scene change occurs. In the second scenario,
the system must keep the bit rate as constant as possible by
managing the reconstruction quality. The first scenario is
motivated by the availability of a wide-band transmission
channel, where the bit rate can be increased to keep the
reconstruction quality above a given threshold. The second scenario
concerns narrow-band transmission channels, where an increase
of the bit rate is not possible. In this latter case,
applications for which the reconstruction quality is not a constraint
may be considered. The simulations are performed with five scales
wavelet decomposition using the 9/7-tap filter bank biorthogonal
wavelet. The entropy coding is performed using a specifically
defined binary codebook and the EZW algorithm. Experimental results are
presented and compared to LEAD H263 EVAL. It is shown that when
reconstruction quality is the constraint, the system increases the
bit rate to obtain the required quality. When the bit rate
must be constant, the system is unable to provide the required quality
while a scene change occurs; however, it recovers
the quality once the scene change has passed.
Abstract: Using the logarithmic mean Divisia index (LMDI) decomposition technique, this paper analyzes the change in industrial energy intensity of Fujian Province in China, based on data sets of added value and energy consumption for 35 selected industrial sub-sectors from 1999 to 2009. The change in industrial energy intensity is decomposed into an intensity effect and a structure effect. Results show that the industrial energy intensity of Fujian Province achieved a reduction of 51% over the past ten years. Structural change, a shift in the mix of industrial sub-sectors, made the overwhelming contribution to this reduction, while the impact of improvements in energy efficiency was relatively small. However, the aggregate industrial energy intensity was very sensitive to changes in both the energy intensity and the production share of energy-intensive sub-sectors, such as the production and supply of electric power, steam and hot water. Pathways to reduce industrial energy intensity for energy conservation in Fujian Province are proposed at the end.
Abstract: Decomposition processes take place in landfill
generate leachates that can be categorized mainly of acetogenic and
methanogenic in nature. BOD:COD ratio computed in this study for a
landfill site over a 3 years duration revealed as a good indicator to
identify acetogenic leachate from methanogenic leachate. Correlation
relationships to predict pollutant level taking into consideration of
climatic condition are derived.
Abstract: Levan, an exopolysaccharide, was produced by
Microbacterium laevaniformans and its yield was characterized as a
function of concentrations of date syrup, sucrose and the fermentation
time. The optimum condition for levan production from sucrose was
at concentration of 20% sucrose for 48 h and for date syrup was 25%
for 48 h. The results show that an increase in fermentation time
caused a decrease in the levan production at all concentrations of date
syrup tested. Under these conditions after 48 h in sucrose medium,
levan production reached 48.9 g/L and for date syrup reached 10.48
g/L . The effect of pH on the yield of the purified levan was examined
and the optimum pH for levan production was determined to be 6.0.
Levan was composed mainly of fructose residues when analyzed by
TLC and FT-IR spectroscopy. Date syrup is a cheap substrate widely
available in Iran and has potential for levan production. The thermal
stability of levan was assessed by Thermo Gravimetric Analysis
(TGA) that revealed the onset of decomposition near to 49°C for the
levan produced from sucrose and 51°C for the levan from date syrup.
DSC results showed a single Tg at 98°C for levan produced from
sucrose and 206 °C for levan from date syrup.
Abstract: There have been different approaches to compute the
analytic instantaneous frequency with a variety of background reasoning
and applicability in practice, as well as restrictions. This paper presents an adaptive Fourier decomposition and (α-counting) based
instantaneous frequency computation approach. The adaptive Fourier
decomposition is a recently proposed new signal decomposition
approach. The instantaneous frequency can be computed through the so called mono-components decomposed by it. Due to the fast energy
convergency, the highest frequency of the signal will be discarded by the adaptive Fourier decomposition, which represents the noise of
the signal in most of the situation. A new instantaneous frequency
definition for a large class of so-called simple waves is also proposed
in this paper. Simple wave contains a wide range of signals for which
the concept instantaneous frequency has a perfect physical sense.
The α-counting instantaneous frequency can be used to compute the highest frequency for a signal. Combination of these two approaches one can obtain the IFs of the whole signal. An experiment is demonstrated the computation procedure with promising results.
Abstract: Electromyography (EMG) signal processing has been investigated remarkably regarding various applications such as in rehabilitation systems. Specifically, wavelet transform has served as a powerful technique to scrutinize EMG signals since wavelet transform is consistent with the nature of EMG as a non-stationary signal. In this paper, the efficiency of wavelet transform in surface EMG feature extraction is investigated from four levels of wavelet decomposition and a comparative study between different mother wavelets had been done. To recognize the best function and level of wavelet analysis, two evaluation criteria, scatter plot and RES index are recruited. Hereupon, four wavelet families, namely, Daubechies, Coiflets, Symlets and Biorthogonal are studied in wavelet decomposition stage. Consequently, the results show that only features from first and second level of wavelet decomposition yields good performance and some functions of various wavelet families can lead to an improvement in separability class of different hand movements.
Abstract: In this paper, a new approach for target recognition based on the Empirical mode decomposition (EMD) algorithm of Huang etal. [11] and the energy tracking operator of Teager [13]-[14] is introduced. The conjunction of these two methods is called Teager-Huang analysis. This approach is well suited for nonstationary signals analysis. The impulse response (IR) of target is first band pass filtered into subsignals (components) called Intrinsic mode functions (IMFs) with well defined Instantaneous frequency (IF) and Instantaneous amplitude (IA). Each IMF is a zero-mean AM-FM component. In second step, the energy of each IMF is tracked using the Teager energy operator (TEO). IF and IA, useful to describe the time-varying characteristics of the signal, are estimated using the Energy separation algorithm (ESA) algorithm of Maragos et al .[16]-[17]. In third step, a set of features such as skewness and kurtosis are extracted from the IF, IA and IMF energy functions. The Teager-Huang analysis is tested on set of synthetic IRs of Sonar targets with different physical characteristics (density, velocity, shape,? ). PCA is first applied to features to discriminate between manufactured and natural targets. The manufactured patterns are classified into spheres and cylinders. One hundred percent of correct recognition is achieved with twenty three echoes where sixteen IRs, used for training, are free noise and seven IRs, used for testing phase, are corrupted with white Gaussian noise.
Abstract: Graph decompositions are vital in the study of combinatorial design theory. Given two graphs G and H, an H-decomposition of G is a partition of the edge set of G into disjoint isomorphic copies of H. An n-sun is a cycle Cn with an edge terminating in a vertex of degree one attached to each vertex. In this paper we have proved that the complete graph of order 2n, K2n can be decomposed into n-2 n-suns, a Hamilton cycle and a perfect matching, when n is even and for odd case, the decomposition is n-1 n-suns and a perfect matching. For an odd order complete graph K2n+1, delete the star subgraph K1, 2n and the resultant graph K2n is decomposed as in the case of even order. The method of building n-suns uses Walecki's construction for the Hamilton decomposition of complete graphs. A spanning tree decomposition of even order complete graphs is also discussed using the labeling scheme of n-sun decomposition. A complete bipartite graph Kn, n can be decomposed into n/2 n-suns when n/2 is even. When n/2 is odd, Kn, n can be decomposed into (n-2)/2 n-suns and a Hamilton cycle.
Abstract: In this paper, we propose a new algorithm for joint time-delay and direction-of-arrival (DOA) estimation, here called two-dimensional code acquisition, in an asynchronous directsequence code-division multiple-access (DS-CDMA) array system. This algorithm depends on eigenvector-eigenvalue decomposition of sample correlation matrix, and requires to know desired user-s training sequence. The performance of the algorithm is analyzed both analytically and numerically in uncorrelated and coherent multipath environment. Numerical examples show that the algorithm is robust with unknown number of coherent signals.
Abstract: In this paper a new approach for transmission pricing
is presented. The main idea is voltage angle allocation, i.e.
determining the contribution of each contract on the voltage angle of
each bus. DC power flow is used to compute a primary solution for
angle decomposition. To consider the impacts of system non-linearity
on angle decomposition, the primary solution is corrected in different
iterations of decoupled Newton-Raphson power flow. Then, the
contribution of each contract on power flow of each transmission line
is computed based on angle decomposition. Contract-related flows
are used as a measure for “extent of use" of transmission network
capacity and consequently transmission pricing. The presented
approach is applied to a 4-bus test system and IEEE 30-bus test
system.
Abstract: One of the main image representations in Mathematical Morphology is the 3D Shape Decomposition Representation, useful for Image Compression and Representation,and Pattern Recognition. The 3D Morphological Shape Decomposition representation can be generalized a number of times,to extend the scope of its algebraic characteristics as much as possible. With these generalizations, the Morphological Shape Decomposition 's role to serve as an efficient image decomposition tool is extended to grayscale images.This work follows the above line, and further develops it. Anew evolutionary branch is added to the 3D Morphological Shape Decomposition's development, by the introduction of a 3D Multi Structuring Element Morphological Shape Decomposition, which permits 3D Morphological Shape Decomposition of 3D binary images (grayscale images) into "multiparameter" families of elements. At the beginning, 3D Morphological Shape Decomposition representations are based only on "1 parameter" families of elements for image decomposition.This paper addresses the gray scale inter frame interpolation by means of mathematical morphology. The new interframe interpolation method is based on generalized morphological 3D Shape Decomposition. This article will present the theoretical background of the morphological interframe interpolation, deduce the new representation and show some application examples.Computer simulations could illustrate results.
Abstract: The empirical mode decomposition (EMD) represents any time series into a finite set of basis functions. The bases are termed as intrinsic mode functions (IMFs) which are mutually orthogonal containing minimum amount of cross-information. The EMD successively extracts the IMFs with the highest local frequencies in a recursive way, which yields effectively a set low-pass filters based entirely on the properties exhibited by the data. In this paper, EMD is applied to explore the properties of the multi-year air temperature and to observe its effects on climate change under global warming. This method decomposes the original time-series into intrinsic time scale. It is capable of analyzing nonlinear, non-stationary climatic time series that cause problems to many linear statistical methods and their users. The analysis results show that the mode of EMD presents seasonal variability. The most of the IMFs have normal distribution and the energy density distribution of the IMFs satisfies Chi-square distribution. The IMFs are more effective in isolating physical processes of various time-scales and also statistically significant. The analysis results also show that the EMD method provides a good job to find many characteristics on inter annual climate. The results suggest that climate fluctuations of every single element such as temperature are the results of variations in the global atmospheric circulation.
Abstract: In this work, we incorporated a quartic bond potential
into a coarse-grained bead-spring model to study lubricant adsorption
on a solid surface as well as depletion instability. The surface tension
density and the number density profiles were examined to verify the
solid-liquid and liquid-vapor interfaces during heat treatment. It was
found that both the liquid-vapor interfacial thickness and the
solid-vapor separation increase with the temperatureT* when T*is
below the phase transition temperature Tc
*. At high temperatures
(T*>Tc
*), the solid-vapor separation decreases gradually as the
temperature increases. In addition, we evaluated the lubricant weight
and bond loss profiles at different temperatures. It was observed that
the lubricant desorption is favored over decomposition and is the main
cause of the lubricant failure at the head disk interface in our
simulations.
Abstract: Regenerative Thermal Oxidizer (RTO) is one of the
best solutions for removal of Volatile Organic Compounds (VOC)
from industrial processes. In the RTO, VOC in a raw gas are usually
decomposed at 950-1300 K and the combustion heat of VOC is
recovered by regenerative heat exchangers charged with ceramic
honeycombs. The optimization of the treatment of VOC leads to the
reduction of fuel addition to VOC decomposition, the minimization of
CO2 emission and operating cost as well.
In the present work, the thermal efficiency of the RTO was
investigated experimentally in a pilot-scale RTO unit using toluene as
a typical representative of VOC. As a result, it was recognized that the
radiative heat transfer was dominant in the preheating process of a raw
gas when the gas flow rate was relatively low. Further, it was found
that a minimum heat exchanger volume to achieve self combustion of
toluene without additional heating of the RTO by fuel combustion was
dependent on both the flow rate of a raw gas and the concentration of
toluene. The thermal efficiency calculated from fuel consumption and
the decomposed toluene ratio, was found to have a maximum value of
0.95 at a raw gas mass flow rate of 1810 kg·h-1 and honeycombs height
of 1.5m.
Abstract: Carboneous catalytical methane decomposition is an
attractive process because it produces two valuable products:
hydrogen and carbon. Furthermore, this reaction does not emit any
green house or hazardous gases. In the present study, experiments
were conducted in a thermo gravimetric analyzer using Fluka 05120
as carboneous catalyst to analyze its effectiveness in methane
decomposition. Various temperatures and methane partial pressures
were chosen and carbon mass gain was observed as a function of
time. Results are presented in terms of carbon formation rate,
hydrogen production and catalytical activity. It is observed that there
is linearity in carbon deposition amount by time at lower reaction
temperature (780 °C). On the other hand, it is observed that carbon
and hydrogen formation rates are increased with increasing
temperature. Finally, we observed that the carbon formation rate is
highest at 950 °C within the range of temperatures studied.
Abstract: In digital signal processing it is important to
approximate multi-dimensional data by the method called rank
reduction, in which we reduce the rank of multi-dimensional data from
higher to lower. For 2-dimennsional data, singular value
decomposition (SVD) is one of the most known rank reduction
techniques. Additional, outer product expansion expanded from SVD
was proposed and implemented for multi-dimensional data, which has
been widely applied to image processing and pattern recognition.
However, the multi-dimensional outer product expansion has behavior
of great computation complex and has not orthogonally between the
expansion terms. Therefore we have proposed an alterative method,
Third-order Orthogonal Tensor Product Expansion short for 3-OTPE.
3-OTPE uses the power method instead of nonlinear optimization
method for decreasing at computing time. At the same time the group
of B. D. Lathauwer proposed Higher-Order SVD (HOSVD) that is
also developed with SVD extensions for multi-dimensional data.
3-OTPE and HOSVD are similarly on the rank reduction of
multi-dimensional data. Using these two methods we can obtain
computation results respectively, some ones are the same while some
ones are slight different. In this paper, we compare 3-OTPE to
HOSVD in accuracy of calculation and computing time of resolution,
and clarify the difference between these two methods.
Abstract: The objective is to split a simply connected polygon
into a set of convex quadrilaterals without inserting new
boundary nodes. The presented approach consists in repeatedly
removing quadrilaterals from the polygon. Theoretical results
pertaining to quadrangulation of simply connected polygons are
derived from the usual 2-ear theorem. It produces a quadrangulation
technique with O(n) number of quadrilaterals. The
theoretical methodology is supplemented by practical results
and CAD surface segmentation.