Abstract: Employing a recently introduced unified adaptive filter
theory, we show how the performance of a large number of important
adaptive filter algorithms can be predicted within a general framework
in nonstationary environments. This approach is based on energy conservation
arguments and does not need to assume a Gaussian or white
distribution for the regressors. This general performance analysis can
be used to evaluate the mean square performance of the Least Mean
Square (LMS) algorithm, its normalized version (NLMS), the family
of Affine Projection Algorithms (APA), the Recursive Least Squares
(RLS), the Data-Reusing LMS (DR-LMS), its normalized version
(NDR-LMS), the Block Least Mean Squares (BLMS), the Block
Normalized LMS (BNLMS), the Transform Domain Adaptive Filters
(TDAF) and the Subband Adaptive Filters (SAF) in nonstationary
environments. We also establish general expressions for the
steady-state excess mean-square error in this environment for all these
adaptive algorithms. Finally, we demonstrate through simulations that
these results are useful in predicting the adaptive filter performance.
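Several of the algorithms listed share the stochastic-gradient structure of LMS, w(n+1) = w(n) + μ e(n) u(n). As a minimal reference sketch (the textbook recursion only, not the paper's unified energy-conservation analysis; the filter length, step size and test signal below are illustrative assumptions):

```python
import numpy as np

def lms(x, d, M=4, mu=0.01):
    """LMS: w <- w + mu * e(n) * u(n), with a priori error e(n) = d(n) - w.u(n)."""
    w = np.zeros(M)
    errors = []
    for n in range(M, len(x)):
        u = x[n - M + 1:n + 1][::-1]   # regressor [x(n), x(n-1), ..., x(n-M+1)]
        e = d[n] - w @ u               # a priori estimation error
        w = w + mu * e * u             # stochastic-gradient update
        errors.append(e)
    return w, np.array(errors)

# identify a short FIR channel from noisy observations
rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2, 0.1])
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w, e = lms(x, d)                       # w converges close to h
```

NLMS, APA and the block variants modify only how the update direction and step size are formed, which is why a single framework can cover their mean-square behavior.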
Abstract: Process control and energy conservation are the two
primary reasons for using an adjustable speed drive. However,
voltage sags are the most important power quality problems facing
many commercial and industrial customers. The development of
boost converters has raised much excitement and speculation
throughout the electric industry. Now utilities are looking to these
devices for performance improvement and reliability in a variety of
areas. Examples of these include sags, spikes, or transients in supply
voltage as well as unbalanced voltages, poor electrical system
grounding, and harmonics. In this paper, simulation results are
presented to verify the proposed boost converter topology. The
boost converter provides ride-through capability during voltage
sags and swells, and the input currents are near sinusoidal, which
also eliminates the need for a braking resistor.
Abstract: Information and communication service providers
(ICSP) that are significant in size and provide Internet-based services
take administrative, technical, and physical protection measures via
the information security check service (ISCS). These protection
measures are the minimum actions necessary to secure the stability and
continuity of the information and communication services (ICS) that
they provide. Since information assets are essential to providing ICS,
and deciding the relative importance of target assets for protection is a
critical procedure. The risk analysis model designed to decide the
relative importance of information assets, which is described in this
study, evaluates information assets from many angles, in order to
choose which ones should be given priority when it comes to
protection. Many-sided risk analysis (MSRS) grades the importance of
information assets, based on evaluation of major security check items,
evaluation of the dependency on the information and communication
facility (ICF) and influence on potential incidents, and evaluation of
major items according to their service classification, in order to
identify the ISCS target. MSRS could be an efficient risk analysis
model to help ICSPs to identify their core information assets and take
information protection measures first, so that stability of the ICS can
be ensured.
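The grading described above amounts to a weighted aggregation of the evaluation axes into a priority class. The sketch below is purely illustrative: the weights, score scales and grade thresholds are assumptions, since the paper's actual check items and weights are not reproduced in the abstract.

```python
def asset_grade(security_check, icf_dependency, incident_impact, service_class,
                weights=(0.4, 0.2, 0.2, 0.2)):
    """Weighted aggregation of four evaluation axes into a priority grade.
    Scores are assumed normalized to [0, 1]; weights/thresholds are illustrative."""
    scores = (security_check, icf_dependency, incident_impact, service_class)
    total = sum(w * s for w, s in zip(weights, scores))
    if total >= 0.8:
        return "A"   # core asset: protect first
    if total >= 0.5:
        return "B"
    return "C"

print(asset_grade(0.9, 0.8, 0.9, 0.7))  # asset scoring high on every axis
```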
Abstract: Synthetic juice clarification was carried out using a spiral
wound ultrafiltration (UF) membrane module. The synthetic juice was
clarified under two operating conditions, with and without permeate
recycle, in the turbulent flow regime. The performance
of spiral wound ultrafiltration membrane was analyzed during
clarification of synthetic juice. The synthetic juice was a mixture of
deionized water, sucrose and pectin. The operating conditions were: a
feed flow rate of 10 lpm, a pressure drop of 413.7 kPa and a Reynolds
number of 5000. The permeate sample was analyzed in terms of
volume reduction factor (VRF), viscosity (Pa.s), ⁰Brix, TDS (mg/l),
electrical conductivity (μS) and turbidity (NTU). It was observed that
the permeate flux declined with operating time under both conditions
due to increasing concentration polarization and growth of a gel layer
on the membrane surface. Without permeate recycle, the membrane
fouling rate was faster: the VRF rose to 5, compared with 1.9 with
permeate recycle. The higher VRF is due to adsorption of solute
(pectin) molecules on the membrane surface, and the permeate flux
declined accordingly as the VRF increased. With permeate recycle, the
permeate quality remained within acceptable limits. The fouled
membrane was cleaned by applying different agents (e.g., deionized
water, SDS and EDTA solutions), and the cleaning was evaluated in
terms of permeability recovery.
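The volume reduction factor used above is the ratio of the initial feed volume to the volume remaining as retentate. A one-line worked example (the volumes are illustrative, chosen only to reproduce the no-recycle VRF of 5; the experiment's actual volumes are not given):

```python
# VRF = initial feed volume / volume remaining as retentate
feed_volume = 10.0        # litres charged to the feed tank (assumed)
retentate_volume = 2.0    # litres left after permeate withdrawal (assumed)
vrf = feed_volume / retentate_volume
print(vrf)                # 5.0
```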
Abstract: Business Process Modeling (BPM) is the first and
most important step in business process management lifecycle. Graph
based formalism and rule based formalism are the two most
predominant formalisms on which process modeling languages are
developed. BPM technology continues to face challenges in coping
with dynamic business environments where requirements and goals
are constantly changing at execution time. Graph-based
formalisms have difficulty reacting to dynamic changes in a Business
Process (BP) in its runtime instances. In this research, an adaptive
and flexible framework based on the integration between Object
Oriented diagramming technique and Petri Net modeling language is
proposed in order to support change management techniques for
BPM and to increase the capability of Object Oriented modeling to
represent dynamic changes in runtime instances. The proposed
framework is applied in a higher education environment to
achieve flexible, updatable and dynamic BPs.
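The Petri net half of such an integration rests on the standard enabling/firing rule: a transition fires when every input place holds enough tokens, consuming them and producing tokens on its output places. A minimal sketch (the places and transition names are illustrative, not the paper's higher-education model):

```python
# transition -> (pre-places with required tokens, post-places with produced tokens)
net = {"t_submit": ({"p_draft": 1}, {"p_review": 1})}
marking = {"p_draft": 1, "p_review": 0}

def enabled(t, m):
    pre, _ = net[t]
    return all(m.get(p, 0) >= n for p, n in pre.items())

def fire(t, m):
    # consume tokens from pre-places, produce tokens on post-places
    pre, post = net[t]
    m = dict(m)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

print(fire("t_submit", marking))   # {'p_draft': 0, 'p_review': 1}
```

Because the marking is plain data, a runtime change (adding a transition or place) is just an edit to `net`, which is the flexibility the framework targets.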
Abstract: This paper presents an efficient and reliable optimization technique that combines fuel-cost economic optimization and emission dispatch using the Sigmoid Decreasing Inertia Weight Particle Swarm Optimization (PSO) algorithm to reduce the cost of fuel and the pollutants resulting from fuel combustion, while keeping the generator outputs, bus voltages, shunt capacitors and transformer tap settings within their security boundaries. The performance of the proposed algorithm has been demonstrated on the IEEE 30-bus system with six generating units. The results clearly show that the proposed algorithm achieves better and faster convergence than a linearly decreasing inertia weight.
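The abstract does not reproduce the exact sigmoid schedule; a commonly used sigmoid decreasing inertia weight is sketched below next to the linear schedule it is compared against (the 0.9/0.4 bounds and the sharpness parameter are illustrative assumptions):

```python
import math

def sigmoid_inertia(k, k_max, w_start=0.9, w_end=0.4, sharpness=10.0):
    # stays near w_start early (global exploration),
    # then drops smoothly to w_end late (local refinement)
    x = sharpness * (k / k_max - 0.5)
    return w_end + (w_start - w_end) / (1.0 + math.exp(x))

def linear_inertia(k, k_max, w_start=0.9, w_end=0.4):
    return w_start - (w_start - w_end) * k / k_max

for k in (0, 250, 500, 750, 1000):
    print(k, round(sigmoid_inertia(k, 1000), 3), round(linear_inertia(k, 1000), 3))
```

The sigmoid keeps the inertia high for longer before the drop, which is the usual explanation for the faster convergence reported.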
Abstract: ZnO nanostructures including nanowires, nanorods,
and nanoneedles were successfully deposited on GaAs substrates
by a simple two-step chemical method for the first time. A ZnO
seed layer was first pre-coated on the O2-plasma-treated substrate
by a sol-gel process, followed by the nucleation of ZnO
nanostructures through hydrothermal synthesis. Nanostructures with
different average diameters (15-250 nm), lengths (0.9-1.8 μm) and
densities (0.9-16×10⁹ cm⁻²) were obtained by adjusting the growth
time and the concentration of the precursors. From the reflectivity
spectra, we concluded that ordered, tapered nanostructures are
preferable for photovoltaic applications. ZnO nanoneedles with an
average diameter of 106 nm, a moderate length of 2.4 μm, and a
density of 7.2×10⁹ cm⁻² could be synthesized at a precursor
concentration of 0.04 M for 18 h.
Integrated with the nanoneedle array, the power conversion efficiency
of a single-junction solar cell increased from 7.3% to 12.2%,
corresponding to a 67% improvement.
Abstract: This paper describes the development of a 16-port optical code division multiple access (OCDMA) encoder prototype based on an Arrayed Waveguide Grating (AWG) and optical switches. It can potentially provide high security for data transmission, since all data are transmitted in binary code form. The output signals from the AWG are coded with a binary code assigned by an optical switch before the signal modulates the carrier and is transmitted to the receiver. The 16-port encoder uses 16 double pole double throw (DPDT) toggle switches to switch the polarity of the voltage source between +5 V and -5 V for the 16 optical switches. When +5 V is applied, the optical switch gives code '1', and vice versa. The experimental results showed the insertion loss, crosstalk, uniformity, and optical signal-to-noise ratio (OSNR) for the developed prototype are
Abstract: The advances in multimedia and networking technologies
have created opportunities for Internet pirates, who can easily
copy multimedia contents and illegally distribute them on the Internet,
thus violating the legal rights of content owners. This paper describes
how a simple and well-known watermarking procedure based on a
spread spectrum method and a watermark recovery by correlation can
be improved to effectively and adaptively protect MPEG-2 videos
distributed on the Internet. In fact, the procedure, in its simplest
form, is vulnerable to a variety of attacks. However, its security
and robustness have been increased, and its behavior has been
made adaptive with respect to the video terminals used to open
the videos and the network transactions carried out to deliver them
to buyers. Such an adaptive behavior enables the proposed
procedure to efficiently embed watermarks, and this characteristic
makes the procedure well suited to be exploited in web contexts,
where watermarks usually generated from fingerprinting codes have
to be inserted into the distributed videos “on the fly", i.e. during the
purchase web transactions.
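The underlying spread-spectrum embedding and correlation recovery can be sketched as follows. The host is a generic coefficient vector and the gain `alpha` is an illustrative assumption; this is the "simplest form" baseline the paper improves on, not the adaptive procedure itself.

```python
import numpy as np

def pn_sequence(key, n):
    # pseudo-random ±1 spreading sequence derived from the secret key
    return np.random.default_rng(key).choice([-1.0, 1.0], size=n)

def embed(host, key, alpha=0.5):
    # additive spread-spectrum embedding into the host coefficients
    return host + alpha * pn_sequence(key, len(host))

def detect(signal, key):
    # normalized correlation: close to alpha when the watermark for
    # `key` is present, close to zero for the wrong key
    w = pn_sequence(key, len(signal))
    return float(signal @ w) / len(signal)

host = np.random.default_rng(0).standard_normal(10000) * 2.0
marked = embed(host, key=1234)
present = detect(marked, key=1234)   # well above zero
absent = detect(marked, key=9999)    # near zero: wrong key
```

Thresholding the correlation value is what makes "on the fly" detection of fingerprinting codes cheap at purchase time.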
Abstract: Network exchange is now widely used. However, it still
cannot avoid the problems arising from network exchange. For
example, a buyer may not receive the order even if he/she makes the
payment; conversely, the seller may get nothing even when the
merchandise is sent. Some studies on fair exchange have proposed
protocols designed for efficiency and have exploited the signature
property to specify that two parties agree on the exchange. However,
the information about the purchased item and its price is disclosed in
this way. This paper proposes a new fair network payment protocol
with an off-line trusted third party. The proposed protocol can protect
the buyer's purchase message from being traced. In addition, the
proposed protocol meets the stated requirements. Its most significant
feature is the non-transferability property it achieves.
Abstract: The abundance and availability of rice husk, an agricultural waste, make it a good precursor for activated carbon. In this work, rice husk-based activated carbons were prepared via a base-treated chemical activation process prior to the carbonization process. The effect of carbonization temperature (400, 600 and 800 °C) on the pore structure was evaluated through morphology analysis using a scanning electron microscope (SEM). The sample carbonized at 800 °C showed better evolution and development of pores compared to those carbonized at 400 and 600 °C. The potential of rice husk-based activated carbon as an alternative adsorbent was investigated for the removal of Ni(II), Zn(II) and Pb(II) from single-metal aqueous solutions. The adsorption studies using rice husk-based activated carbon as an adsorbent were carried out as a function of contact time at room temperature, and the metal ions were analyzed using an atomic absorption spectrophotometer (AAS). The ability to remove metal ions from single-metal aqueous solution was found to improve with increasing carbonization temperature. Among the three metal ions tested, Pb(II) gave the highest adsorption on the rice husk-based activated carbon. The results indicate the potential of rice husk as a promising precursor for the preparation of activated carbon for the removal of heavy metals.
Abstract: 16-Mercaptohexadecanoic acid (MHDA) and the tripeptide glutathione conjugated with gold nanoparticles (Au-NPs) are characterized by Fourier Transform Infrared (FTIR) spectroscopy combined with Surface-Enhanced Raman Scattering (SERS) spectroscopy. The Surface Plasmon Resonance (SPR) technique based on FTIR spectroscopy has become an important tool in biophysics and is promising for the study of organic compounds. The FTIR spectrum of MHDA shows a line at 2500 cm⁻¹ attributed to the thiol group, which is modified by the presence of Au-NPs, suggesting the formation of a bond between the thiol group and gold. We can also observe the peaks originating from characteristic chemical groups. A Raman spectrum of the same sample is also promising. Our preliminary experiments confirm that the SERS effect takes place for MHDA connected with Au-NPs and enables us to detect small numbers (fewer than 10⁶ cm⁻²) of MHDA molecules. The combination of the FTIR and SERS spectroscopy methods enables the study of the optical properties of Au-NPs and immobilized biomolecules in the context of bio-nano-sensors.
Abstract: Recent evidence on liquidity and the valuation of securities in capital markets clearly shows the importance of stock market liquidity and firm valuation. In this paper, the relationship between transparency, liquidity, and valuation is studied using data obtained from 70 companies listed on the Tehran Stock Exchange during 2003-2012. In this study, discretionary earnings management was used as a sign of a lack of transparency, and Tobin's Q as the criterion of valuation. The results indicate a significant inverse relationship between earnings management and liquidity. On the other hand, there is a relationship between liquidity and transparency. The results also indicate a significant relationship between transparency and valuation. Transparency has an indirect effect on firm valuation alone or through the liquidity channel. Although the effect of transparency on the value of a firm was reduced by adding the liquidity variable, the cumulative effect of transparency and liquidity increased.
Abstract: QoS routing aims to find paths between senders and
receivers that satisfy the QoS requirements of the application while
using the network resources efficiently; the underlying routing
algorithm must be able to find low-cost paths that satisfy the given
QoS constraints. The problem of finding least-cost constrained paths
is known to be NP-complete, and several algorithms have been
proposed to find near-optimal solutions. However, these heuristics either
impose relationships among the link metrics to reduce the complexity
of the problem, which may limit the general applicability of the
heuristic, or are too costly in terms of execution time to be applicable
to large networks. In this paper, we analyze two algorithms, namely
Characterized Delay Constrained Routing (CDCR) and Optimized
Delay Constrained Routing (ODCR). CDCR takes an approach to
delay-constrained routing that captures the trade-off between cost
minimization and the risk level with respect to the delay constraint.
ODCR uses an adaptive path weight function together with an
additional constraint imposed on the path cost to restrict the search
space, and hence finds a near-optimal solution in much less time.
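Neither CDCR nor ODCR is specified in the abstract; for reference, the underlying delay-constrained least-cost problem can be solved exactly on small graphs with a label-setting search (this generic solver is an assumption for illustration, not either of the paper's algorithms):

```python
import heapq

def delay_constrained_path(graph, src, dst, max_delay):
    """Least-cost path with total delay <= max_delay.
    graph: {u: [(v, cost, delay), ...]}; label-setting search with
    dominance pruning on (cost, delay) labels."""
    best = {}                                # node -> non-dominated labels
    heap = [(0, 0, src, [src])]
    while heap:
        cost, delay, u, path = heapq.heappop(heap)
        if u == dst:
            return cost, delay, path         # first pop of dst is least-cost
        for v, c, d in graph.get(u, []):
            nc, nd = cost + c, delay + d
            if nd > max_delay:
                continue                     # violates the delay bound
            labels = best.setdefault(v, [])
            if any(pc <= nc and pd <= nd for pc, pd in labels):
                continue                     # dominated label: never better
            labels.append((nc, nd))
            heapq.heappush(heap, (nc, nd, v, path + [v]))
    return None                              # no feasible path

g = {"A": [("B", 1, 5), ("C", 3, 1)],
     "B": [("D", 1, 5)],
     "C": [("D", 3, 1)]}
print(delay_constrained_path(g, "A", "D", max_delay=4))
```

The exponential label growth in the worst case is exactly why heuristics like CDCR and ODCR trade optimality for speed on large networks.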
Abstract: Ground-level tropospheric ozone is one of the air
pollutants of most concern. It is mainly produced by photochemical
processes involving nitrogen oxides and volatile organic compounds
in the lower parts of the atmosphere. Ozone levels become
particularly high in regions close to high ozone precursor emissions
and during summer, when stagnant meteorological conditions with
high insolation and high temperatures are common.
In this work, some results are shown from a study of urban ozone
distribution patterns in the city of Badajoz, the largest and most
industrialized city in the Extremadura region (southwest Spain).
Fourteen sampling campaigns, at least one per month, were
carried out to measure ambient air ozone concentrations, during
periods that were selected according to favourable conditions to
ozone production, using an automatic portable analyzer.
Later, to evaluate the ozone distribution across the city, the measured
ozone data were analyzed using geostatistical techniques. First,
the exploratory analysis revealed that the data were normally
distributed, which is a desirable property for the subsequent
stages of the geostatistical study. Secondly, during the structural
analysis of data, theoretical spherical models provided the best fit for
all monthly experimental variograms. The parameters of these
variograms (sill, range and nugget) revealed that the maximum
distance of spatial dependence is between 302 and 790 m, and that the
variable, air ozone concentration, is not evenly distributed over short
distances. Finally, predictive ozone maps were derived for all points
of the experimental study area, by use of geostatistical algorithms
(kriging). High prediction accuracy was obtained in all cases as
cross-validation showed. Useful information for hazard assessment
was also provided when probability maps, based on kriging
interpolation and kriging standard deviation, were produced.
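The spherical model fitted to the monthly variograms has a standard closed form; a sketch with illustrative nugget and sill values, and a range inside the 302-790 m band reported above:

```python
def spherical_variogram(h, nugget, sill, a):
    """gamma(h) = nugget + (sill - nugget) * (1.5*h/a - 0.5*(h/a)**3)
    for 0 < h <= a (the range); gamma(0) = 0; gamma(h) = sill beyond a."""
    if h == 0:
        return 0.0
    if h >= a:
        return sill            # no spatial dependence beyond the range
    r = h / a
    return nugget + (sill - nugget) * (1.5 * r - 0.5 * r ** 3)

# a range of 500 m sits inside the 302-790 m band reported above;
# nugget and sill are illustrative, not the fitted values
for h in (0, 100, 250, 500, 800):
    print(h, round(spherical_variogram(h, nugget=0.1, sill=1.0, a=500), 3))
```

In kriging, these gamma values weight nearby observations, which is why the fitted range directly controls how far a measurement influences the prediction map.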
Abstract: Internet is largely composed of textual contents and a
huge volume of digital contents gets floated over the Internet daily.
The ease of information sharing and reproduction has made it
difficult to preserve authors' copyright. Digital watermarking emerged
as a solution to the problem of copyright protection of plain text after
1993. In this paper, we propose a zero text watermarking algorithm
based on the occurrence frequency of non-vowel ASCII characters and
words for copyright protection of plain text. The embedding
algorithm uses the frequency of non-vowel ASCII characters and
words to generate a specialized author key. The extraction algorithm
uses this key to extract the watermark and hence identify the original
copyright owner. Experimental results illustrate the effectiveness of
the proposed algorithm on text subjected to meaning-preserving
attacks performed by five independent attackers.
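The statistic the key is built from, occurrence frequencies of non-vowel characters, is easy to sketch. The `author_key` construction below is a hypothetical illustration of turning those frequencies into a key, not the paper's actual key-generation algorithm:

```python
from collections import Counter

VOWELS = set("aeiou")

def nonvowel_frequencies(text):
    # occurrence counts of non-vowel letters, the statistic the key rests on
    return Counter(c for c in text.lower() if c.isalpha() and c not in VOWELS)

def author_key(text, top=5):
    # hypothetical key: the most frequent non-vowel letters with their counts
    freq = nonvowel_frequencies(text)
    return tuple(sorted(freq.items(), key=lambda kv: (-kv[1], kv[0]))[:top])

print(author_key("the quick brown fox jumps over the lazy dog"))
```

Because the key is derived from the text rather than inserted into it, nothing is embedded in the document itself, which is what makes the scheme "zero" watermarking.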
Abstract: The question of interethnic and interreligious conflicts
in ex-Yugoslavia receives much attention within the framework of
the international context created after 1991 because of the impact of
these conflicts on the security and the stability of the region of
Balkans and of Europe.
This paper focuses on the rationales leading to the declaration of
independence by Kosovo according to ethnic and religious criteria
and analyzes why these same rationales were not applied in Bosnia
and Herzegovina. The approach undertaken aims at comparatively
examining the cases of Kosovo, and Bosnia and Herzegovina. At the
same time, it aims at understanding the political decision making of
the international community in the case of Kosovo. Specifically, was
this a good political decision for the security and the stability of the
region of Balkans, of Europe, or even for global security and
stability?
This research starts with an overview on the European security
framework post 1991, paying particular attention to Kosovo and
Bosnia and Herzegovina. It then presents the theoretical and
methodological framework and compares the representative cases.
Using a constructivist approach and the comparative methodology, it
arrives at the results of the study. An important conclusion of the paper
is the thesis that this event modifies the principles of international law
and creates dangerous precedents for regional stability in the
Balkans.
Abstract: The objective of this work which is based on the
approach of simultaneous engineering is to contribute to the
development of a CIM tool for the synthesis of functional design
dimensions expressed by average values and tolerance intervals. In
this paper, the dispersions method, known as the Δl method, which
has proved reliable in the simulation of manufacturing dimensions, is
used to develop a methodology for automating the simulation.
This methodology is constructed around three procedures. The first
procedure executes the verification of the functional requirements by
automatically extracting the functional dimension chains in the
mechanical sub-assembly. Then a second procedure performs an
optimization of the dispersions on the basis of unknown variables.
The third procedure uses the optimized values of the dispersions to
compute the optimized average values and tolerances of the
functional dimensions in the chains. A statistical and cost based
approach is integrated in the methodology in order to take account of
the capabilities of the manufacturing processes and to distribute
optimal values among the individual components of the chains.
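The two stacking rules such a methodology combines, worst-case summation of the Δl dispersions along a chain and a common statistical (root-sum-square) alternative that reflects process capability, can be illustrated with assumed values (the three dispersions below are not taken from the paper):

```python
import math

# dispersions (Δl) of the manufacturing dimensions in one functional
# chain; illustrative values in millimetres
dispersions = [0.05, 0.02, 0.03]

# worst-case stacking: the functional tolerance is the arithmetic sum
worst_case_IT = sum(dispersions)

# common statistical (root-sum-square) rule, relevant when process
# capabilities are taken into account: a tighter predicted spread
statistical_IT = math.sqrt(sum(d * d for d in dispersions))

print(round(worst_case_IT, 2), round(statistical_IT, 4))
```

The gap between the two results is the margin a statistical, cost-based distribution can hand back to the individual components of the chain.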
Abstract: In this study, a system of encryption based on chaotic
sequences is described. The system is used for encrypting digital
image data for the purpose of secure image transmission. An image
secure communication scheme based on Logistic map chaotic
sequences with a nonlinear function is proposed in this paper.
Encryption and decryption keys are obtained from a one-dimensional
Logistic map that generates the secret key used as the input of the
nonlinear function. The receiver can recover the information using the
received signal and identical key sequences through the inverse-system
technique. The results of computer simulations indicate that the
transmitted source image can be correctly and reliably recovered
using the proposed scheme, even over a noisy channel. The
performance of the system will be discussed through evaluating the
quality of recovered image with and without channel noise.
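A minimal picture of a Logistic-map keystream cipher is sketched below. The paper's scheme additionally passes the map output through a nonlinear function and addresses channel noise; the parameters here are illustrative assumptions.

```python
def logistic_keystream(x0, r, n, burn_in=100):
    """Key bytes from iterating the Logistic map x <- r*x*(1-x);
    the pair (x0, r) acts as the secret key."""
    x = x0
    for _ in range(burn_in):           # discard the transient
        x = r * x * (1 - x)
    ks = []
    for _ in range(n):
        x = r * x * (1 - x)
        ks.append(int(x * 256) & 0xFF) # quantize the state to one byte
    return bytes(ks)

def crypt(data, x0=0.3141592, r=3.99):
    # XOR stream cipher: the same call encrypts and decrypts
    ks = logistic_keystream(x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

plain = b"image scan line"
cipher = crypt(plain)
assert crypt(cipher) == plain          # exact recovery with the same key
```

Sensitivity to (x0, r) is what makes the chaotic sequence usable as a key: a receiver without the identical parameters cannot regenerate the keystream.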
Abstract: Load forecasting plays a paramount role in the operation and management of power systems. Accurate estimation of future power demand for various lead times facilitates the task of generating power reliably and economically. The forecasting of future loads for a relatively large lead time (months to a few years) is studied here (long-term load forecasting). Among the various techniques used in load forecasting, artificial intelligence techniques provide greater accuracy than conventional techniques. Fuzzy logic, a very robust artificial intelligence technique, is applied in this paper to forecast load on a long-term basis. The paper gives a general algorithm to forecast long-term load. The algorithm extends a short-term load forecasting method to long-term load forecasting and concentrates not only on the forecast values of load but also on the errors incorporated into the forecast. Hence, by correcting the errors in the forecast, forecasts with very high accuracy have been achieved. The algorithm is demonstrated with data collected for the residential sector (LT2(a)-type load: domestic consumers). Load is determined for three consecutive years (from April 2006 to March 2009) in order to demonstrate the efficiency of the algorithm, and forecasts are produced for the next two years (from April 2009 to March 2011).
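Fuzzifying the forecast error is the first step of such an error-correction scheme. The triangular membership functions, error ranges and rule centroids below are illustrative assumptions, not the paper's actual rule base:

```python
def tri(x, a, b, c):
    # triangular membership function with support (a, c) and peak at b
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

err = 3.5   # percentage error of a raw long-term forecast (illustrative)

# fuzzify the error against three assumed linguistic sets
low  = tri(err, 0, 0, 5)     # "low error"
med  = tri(err, 0, 5, 10)    # "medium error"
high = tri(err, 5, 10, 15)   # "high error"

# centroid defuzzification over assumed rule outputs (0, 5 and 10 %)
correction = (low * 0.0 + med * 5.0 + high * 10.0) / (low + med + high)
print(low, med, high, correction)
```

The defuzzified correction is then fed back into the forecast, which is how correcting the errors sharpens the long-term estimate.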