Abstract: Because of the importance of energy, the optimization of power generation systems is necessary. Gas turbine cycles are well suited to fast power generation, but their efficiency is relatively low. To achieve higher efficiencies, several modifications are commonly proposed, such as recovery of heat from the exhaust gases in a regenerator, use of an intercooler in a multistage compressor, and steam injection into the combustion chamber. Even with these components, however, thermodynamic optimization of the gas turbine cycle is still necessary. In this article, multi-objective genetic algorithms are employed for Pareto-based optimization of the Regenerative Intercooled Gas Turbine (RIGT) cycle. In multi-objective optimization, a number of conflicting objective functions are optimized simultaneously. The objective functions considered here are the entropy generation number of the RIGT cycle (Ns), derived using exergy analysis and the Gouy-Stodola theorem, the thermal efficiency, and the net output power of the cycle; these objectives usually conflict with each other. The design variables are thermodynamic parameters: the compressor pressure ratio (Rp), the excess air in combustion (EA), the turbine inlet temperature (TIT), and the inlet air temperature (T0). Single-objective optimization is investigated first, and the Non-dominated Sorting Genetic Algorithm (NSGA-II) is then used for multi-objective optimization. Optimization is performed for two and three objective functions, and the results for the RIGT cycle are compared. To investigate the optimal thermodynamic behavior for two objectives, different sets, each comprising two of the output objectives, are considered individually, and a Pareto front is depicted for each set. The decision variables selected on a Pareto front yield the best possible combinations of the corresponding objective functions: no point on the Pareto front is superior to another point on the front, but every such point is superior to all other feasible points. For the three-objective optimization, the results are given in tables.
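The Pareto-front notion used above (no front point is superior to another front point, but every front point is superior to the remaining points) rests on a simple dominance test. The following is an illustrative Python sketch of that test, not the paper's NSGA-II implementation; the sample objective values are invented.

```python
def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b`.

    All objectives are treated as minimized; a maximized objective
    (e.g. thermal efficiency or net output power) is negated first.
    """
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated points (the Pareto front)."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Toy trade-off: minimize entropy generation Ns vs. maximize efficiency
# (stored as -efficiency so that both objectives are minimized).
designs = [(0.20, -0.38), (0.25, -0.42), (0.30, -0.42), (0.22, -0.40)]
front = pareto_front(designs)  # (0.30, -0.42) is dominated and drops out
```

Each point kept by `pareto_front` represents a best possible compromise: improving one objective from such a point necessarily worsens the other.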
Abstract: For full Quality of Service support in Grid computing, it is preferable that the environment itself predicts the resource requirements of a job using dedicated methods. Accurate prediction enables exact matching of required resources with available resources. After the execution of each job, the resources it used are saved in an active database named "History". First, selected attributes are extracted from the submitted job; then, using a defined similarity algorithm, the most similar previously executed jobs are retrieved from the History, and the resource requirements are predicted using statistical techniques such as linear regression or averaging. The novelty of this work lies in the use of an active database and centralized history maintenance. Implementation and testing of the proposed architecture yield prediction accuracies of 96.68% for CPU usage, 91.29% for memory usage, and 89.80% for bandwidth usage.
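As a rough illustration of the predict-from-history idea described above (match a new job against previously executed jobs, then average the resources the closest matches consumed), here is a minimal Python sketch. The attribute names, weights, and sample history are hypothetical and do not reproduce the paper's actual similarity algorithm.

```python
def similarity(job, past_job, weights):
    """Weighted fraction of matching attributes, in [0, 1].
    Attribute names and weights are illustrative only."""
    total = sum(weights.values())
    score = sum(w for k, w in weights.items() if job.get(k) == past_job.get(k))
    return score / total

def predict_cpu(job, history, weights, k=3):
    """Average the CPU usage of the k most similar jobs in the history."""
    ranked = sorted(history, key=lambda h: similarity(job, h, weights), reverse=True)
    top = ranked[:k]
    return sum(h["cpu"] for h in top) / len(top)

weights = {"user": 2, "app": 3, "args": 1}
history = [
    {"user": "u1", "app": "render", "args": "-q", "cpu": 120.0},
    {"user": "u1", "app": "render", "args": "-f", "cpu": 100.0},
    {"user": "u2", "app": "solve",  "args": "-x", "cpu": 10.0},
]
new_job = {"user": "u1", "app": "render", "args": "-q"}
estimate = predict_cpu(new_job, history, weights, k=2)  # averages the two render jobs -> 110.0
```

A linear regression over the top-k matches, as mentioned in the abstract, would replace the final averaging step.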
Abstract: This study examined the underlying dimensions of brand equity in the chocolate industry. For this purpose, the researchers developed a model to identify which factors are influential in building brand equity. A second purpose was to assess the mediating effect of brand loyalty and brand image between brand attitude, brand personality, and brand association on the one hand and brand equity on the other. The study employed structural equation modeling to investigate the causal relationships between the dimensions of brand equity and brand equity itself. It specifically measured the way in which consumers' perceptions of the dimensions of brand equity affected overall brand equity evaluations. Data were collected from a sample of chocolate consumers in Iran. The results of this empirical study indicate that brand loyalty and brand image are important components of brand equity in this industry. Moreover, the mediating role of brand loyalty and brand image in the formation of brand equity is supported. The principal contribution of the present research is that it provides empirical evidence for the multidimensionality of consumer-based brand equity, supporting Aaker's and Keller's conceptualizations of brand equity. The present research also enriches brand equity building by incorporating brand personality and brand image, as recommended by previous researchers. Moreover, the construction of a brand equity index for the chocolate industry of Iran in particular is novel.
Abstract: The present study aims to determine the effect of ageing on the impact toughness and microstructure of 2024 Al-Cu-Mg alloy. Following a 2 h solutionizing treatment at 450°C and a water quench, the specimens were aged at 200°C for various periods (1 to 18 h). The precipitation stages during ageing were monitored by hardness measurements. For each specimen group, Charpy impact and hardness tests were carried out. During ageing, the impact toughness of the alloy first increased and then, after reaching a maximum, decreased due to the precipitation of intermediate phases, finally reaching its minimum at peak hardness. Correlations between hardness and impact toughness were investigated.
Abstract: The purpose of the present study is to investigate the relationship between knowledge management and manager empowerment with respect to achieving proper performance. This research is descriptive and investigative. The sample includes all male and female high school managers of the first and second regions of Urmia, comprising 98 schools and accordingly 98 managers. The instrument applied was a questionnaire. In summary, there is a statistically significant relationship between knowledge management and the empowerment of managers. Finally, several suggestions are provided.
Abstract: This paper shows that the application of probability-statistical methods is unfounded at the early stage of diagnosing the technical condition of an aviation gas turbine engine (GTE), when the flight information is fuzzy, limited, and uncertain. The efficiency of applying the new Soft Computing technology at these diagnosing stages, using Fuzzy Logic and Neural Network methods, is therefore considered. Fuzzy multiple linear and nonlinear models (fuzzy regression equations), obtained on the basis of statistical fuzzy data, are trained with high accuracy. To construct a more adequate model of GTE technical condition, the dynamics of changes in the skewness and kurtosis coefficients are analyzed. Studies of these changes show that the distributions of GTE operating parameters have a fuzzy character, so consideration of fuzzy skewness and kurtosis coefficients is expedient. Investigation of the dynamics of the basic characteristics of GTE operating parameters leads to the conclusion that Fuzzy Statistical Analysis is necessary for preliminary identification of the engine's technical condition. Studies of the changes in the correlation coefficients also show their fuzzy character; therefore, the results of Fuzzy Correlation Analysis are proposed for model selection. To check model adequacy, the Fuzzy Multiple Correlation Coefficient of the Fuzzy Multiple Regression is considered. When sufficient information is available, a recurrent algorithm for identifying the aviation GTE technical condition (using Hard Computing technology) is proposed, based on measurements of the input and output parameters of the multiple linear and nonlinear generalized models in the presence of measurement noise (a new recursive Least Squares Method (LSM)). The developed GTE condition monitoring system provides stage-by-stage estimation of the engine's technical condition. As an application of the given technique, the temperature condition of a new operating aviation engine was estimated.
Abstract: We present in this paper a new approach to specific JPEG steganalysis and propose studying statistics of the compressed DCT coefficients. Traditionally, steganographic algorithms try to preserve the statistics of the DCT and of the spatial domain, but they cannot preserve both and also control the alteration of the compressed data. We have noticed a deviation of the entropy of the compressed data after a first embedding; this deviation is greater when the image is a cover medium than when the image is a stego image. To observe this deviation, we introduced new statistical features and combined them with the Multiple Embedding Method. This approach is motivated by the avalanche criterion of the JPEG lossless compression step, which makes it possible to design detectors whose detection rates are independent of the payload. Finally, we designed a Fisher discriminant based classifier for the well-known steganographic algorithms Outguess, F5, and Hide and Seek. The experimental results we obtained show the efficiency of our classifier for these algorithms. Moreover, it is also designed to work with low embedding rates (< 10^-5) and, according to the avalanche criterion of the RLE and Huffman compression steps, its efficiency is independent of the quantity of hidden information.
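The entropy deviation mentioned above can be made concrete with a plain Shannon-entropy measure over a byte stream. This is a generic sketch of that kind of statistic, not the authors' exact feature computation on the JPEG-compressed data.

```python
import math
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0..8)."""
    if not data:
        return 0.0
    n = len(data)
    counts = Counter(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# A constant stream carries no information; a stream containing each of
# the 256 byte values exactly once sits at the 8-bit maximum.
low = byte_entropy(b"\x00" * 1024)     # -> 0.0
high = byte_entropy(bytes(range(256))) # -> 8.0
```

The steganalysis idea is to compare such a statistic before and after a trial re-embedding: for a cover image the compressed-data entropy shifts more than for an image that already carries a payload.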
Abstract: Cell formation is the first step in the design of cellular manufacturing systems. In this study, a general-purpose computational scheme employing a hybrid tabu search algorithm as its core is proposed to solve the cell formation problem and its variants. The proposed scheme leaves considerable flexibility to its users: the core solution-searching algorithm embedded in the scheme can easily be changed to any other meta-heuristic algorithm, such as simulated annealing or a genetic algorithm, based on the characteristics of the problems to be solved or the preferences the users might have. In addition, several counters are designed to control the timing of the intensified and diversified solution-searching strategies, which are conducted interactively.
Abstract: In this paper we present an enhanced noise reduction method for robust speech recognition that combines an Adaptive Gain Equalizer with Non-linear Spectral Subtraction. In the Adaptive Gain Equalizer (AGE) method, the input signal is divided into a number of subbands that are individually weighted in the time domain, according to an estimate of the short-time Signal-to-Noise Ratio (SNR) in each subband at every time instant. Instead of focusing on noise suppression, the method focuses on speech enhancement. When the AGE algorithm was analyzed under various noise conditions for speech recognition, it was found to have an obvious failure point at an SNR of -5 dB, with inadequate levels of noise suppression below this point. This work proposes coupling AGE with Non-linear Spectral Subtraction (AGE-NSS) for robust speech recognition. The experimental results show that our AGE-NSS outperforms AGE when the SNR drops below the -5 dB level.
Abstract: Three new algorithms, based on minimizing the autocorrelation of the transmitted symbols and on the SLM approach, are proposed that are computationally less demanding. In the first algorithm, the autocorrelation of the complex data sequence is minimized to a value of 1, which results in a reduction of the PAPR. The second algorithm generates multiple random sequences from the sequence produced by the first algorithm, all with the same autocorrelation value of 1; of these, the sequence with the minimum PAPR is transmitted. The third algorithm extends the second and requires minimal side information to be transmitted: multiple sequences are generated by modifying a fixed number of complex numbers in an OFDM data sequence using only one factor. The multiple sequences represent the same data sequence, and the one giving the minimum PAPR is transmitted. Simulation results for a 256-subcarrier OFDM system show that a significant reduction in PAPR is achieved using the proposed algorithms.
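For reference, the PAPR being minimized is the ratio of peak to mean instantaneous power of the time-domain OFDM signal obtained by an (oversampled) IFFT of the data symbols. The following NumPy sketch illustrates that definition only; it is not one of the proposed algorithms, and the QPSK test data are invented.

```python
import numpy as np

def papr_db(symbols, oversample=4):
    """PAPR (dB) of an OFDM symbol: zero-pad the spectrum in the middle
    (frequency-domain oversampling), IFFT, then take peak/mean power."""
    n = len(symbols)
    pad = np.zeros((oversample - 1) * n, dtype=complex)
    padded = np.concatenate([symbols[: n // 2], pad, symbols[n // 2 :]])
    x = np.fft.ifft(padded)
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

rng = np.random.default_rng(0)
# Random QPSK symbols on 256 subcarriers, matching the simulated system size
qpsk = (rng.choice([1, -1], 256) + 1j * rng.choice([1, -1], 256)) / np.sqrt(2)
p = papr_db(qpsk)  # typically around 10 dB for random data
```

A single active subcarrier yields a constant-envelope tone (0 dB PAPR); random data across many subcarriers can add coherently in time, which is the peak the proposed algorithms suppress.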
Abstract: This paper reviews recent studies, in particular the effects of climate change in the North Tropical Atlantic, by studying the atmospheric conditions that prevailed in 2005: the coral bleaching HotSpot and Hurricane Katrina. To better understand and estimate the impact of the physical phenomenon, i.e. the Thermal Oceanic HotSpot (TOHS), isotopic studies of δ18O and δ13C on marine animals from Guadeloupe (French Caribbean) were carried out. The recorded measurements show Sea Surface Temperatures (SST) of up to 35°C in August, much higher than the 32°C recorded by NOAA satellites. After reviewing the process that led to the creation of Hurricane Katrina, which hit New Orleans on August 29, 2005, it is shown that the climatic conditions in the Caribbean from August to October 2005 influenced Katrina's evolution. This TOHS is a combined effect of various phenomena and represents an additional factor in estimating future climate change.
Abstract: In this paper an optical code-division multiple-access (O-CDMA) packet network is considered, which offers inherent security in access networks. Two types of random access protocols are proposed for packet transmission: in protocol 1, all distinct codes are used, and in protocol 2, distinct codes as well as shifted versions of all these codes are used. The O-CDMA network performance is analyzed for one-dimensional (1-D) optical orthogonal codes (OOCs) and two-dimensional (2-D) wavelength/time single-pulse-per-row (W/T SPR) codes. The main advantage of using 2-D codes instead of 1-D codes is the reduction of errors due to multiple-access interference among different users. A correlation receiver is considered in the analysis. Using an analytical model, we compute and compare the packet-success probability for 1-D and 2-D codes in an O-CDMA network; the analysis shows improved performance with 2-D codes compared to 1-D codes.
Abstract: In this work a new platform for mobile-health systems is presented. The target application of the system is to provide decision support to rescue corps or military medical personnel in combat areas. The software architecture relies on a distributed client-server system that manages a hierarchy of wireless ad-hoc networks in which several different types of client operate, each characterized by different hardware and software requirements. The lower hierarchy levels rely on a network of fully custom devices that store clinical information and patient status; these devices, called MICs (Medical Information Carriers), are designed to form an ad-hoc network operating in the 2.4 GHz ISM band and complying with the IEEE 802.15.4 (ZigBee) standard. Medical personnel may interact with the MICs by means of a PDA (Personal Digital Assistant) or an MDA (Medical Digital Assistant), transmit the information stored in their local databases, and issue service requests to the upper hierarchy levels using the IEEE 802.11 a/b/g (WiFi) standard. The server acts as a repository that stores both medical evacuation forms and associated events (e.g., a teleconsulting request). All the actors participating in the diagnostic or evacuation process may asynchronously access this repository to update its content or generate new events. The designed system aims to optimise the spreading and flow of information among all the system components, with the aim of improving both diagnostic quality and the evacuation process.
Abstract: The measurement of anesthetic depth is necessary in anesthesiology. NN10 is a very simple method among RR interval analysis methods: the NN10 parameter is the number of pairs of successive normal-to-normal (NN) RR intervals that differ by more than 10 ms.
Bispectrum analysis is defined as a 2D FFT. The EEG signal reflects non-linear phenomena that change with brain function. After the two-dimensional bispectrum analysis, the most significant power spectral density peaks appeared abundantly in specific areas of the frequency coordinate in both the awake and anesthetized states. Since many peaks appeared in these specific areas, they were used to create a new index. The measured range of the index is 0-100: the index is 20-50 under anesthesia, while it is 60-90 in the awake state.
In this paper, the relation between the NN10 parameter obtained from the ECG and the bispectrum index obtained from the EEG is observed in order to estimate the depth of anesthesia during anesthesia, and the utility of this estimate of anesthetic depth is then assessed.
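The NN10 count described above can be read, by analogy with the standard NN50/pNN50 heart-rate-variability measures, as the number of successive normal-to-normal interval pairs differing by more than 10 ms. The Python sketch below encodes that interpretation; both the reading of the definition and the sample RR values are assumptions, not the authors' code.

```python
def nn10(nn_intervals_ms):
    """Count successive normal-to-normal (NN) interval pairs whose
    absolute difference exceeds 10 ms (analogous to the NN50 measure)."""
    return sum(
        1
        for a, b in zip(nn_intervals_ms, nn_intervals_ms[1:])
        if abs(b - a) > 10
    )

# RR intervals in milliseconds from a short ECG strip (illustrative values):
# only the 810->835 and 840->805 transitions exceed 10 ms.
rr = [812, 810, 835, 840, 805, 806]
count = nn10(rr)  # -> 2
```

A larger NN10 count reflects greater beat-to-beat variability, which generally decreases as anesthetic depth increases.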
Abstract: Classical Bose-Chaudhuri-Hocquenghem (BCH) codes C that contain their dual codes can be used to construct quantum stabilizer codes; this paper studies the properties of such codes. It has been shown that a BCH code of length n that contains its dual code satisfies a bound on the weight of any non-zero codeword in C, and the converse is also true. One major difficulty in quantum communication and computation is protecting information-carrying quantum states against undesired interactions with the environment. To address this difficulty, many good quantum error-correcting codes have been derived as binary stabilizer codes. We shed more light on the structure of dual-containing BCH codes. These results make it possible to determine the parameters of quantum BCH codes in terms of the weight of non-zero dual codewords.
Abstract: The posteroanterior manipulation technique is usually included in procedures on the lumbar spine to evaluate intervertebral motion according to mechanical resistance. A mechanical device with visual feedback is proposed that allows one to analyze the mobility of the lumbar segments "in vivo", making it easier for the therapist to track treatment evolution. The measuring system uses a load cell and a displacement sensor to estimate spine stiffness. In this work, the device was tested by 2 female therapists applying posteroanterior force techniques to 5 female volunteers at a frequency of approximately 1.2-1.8 Hz. A test-retest procedure was used in 2 periods of the day. With visual feedback, the variation of force and cycle time over 6 cycles of rhythmic application was small. The stiffness values showed good agreement between the test and retest procedures when the same order of maximum force was used.
Abstract: This paper presents a CFD analysis of the flow around
a 30° inclined flat plate of infinite span. Numerical predictions have
been compared to experimental measurements in order to assess the
ability of a finite volume code to determine the aerodynamic forces
acting on a flat plate immersed in a fluid stream of infinite extent.
Several turbulence models and spatial node distributions have
been tested, and the flow field characteristics in the neighborhood of
the flat plate have been numerically investigated, allowing the
development of a preliminary procedure to be used as guidance in
selecting the appropriate grid configuration and the corresponding
turbulence model for the prediction of the flow field over a
two-dimensional inclined plate.
Abstract: The complex structure of lignocellulose leads to great difficulties in converting it into fermentable sugars for ethanol production. The major impediments to hydrolysis are the crystallinity of cellulose and the lignin content. To improve the efficiency of enzymatic hydrolysis, microbial pretreatment of corncob was investigated using two bacterial strains, Bacillus subtilis A 002 and Cellulomonas sp. TISTR 784 (expected to break open the crystalline part of cellulose), and the lignin-degrading fungus Phanerochaete sordida SK7 (expected to remove lignin from lignocellulose). The microbial pretreatment was carried out with each strain under its optimum conditions. The pretreated corncob samples were further hydrolyzed to produce reducing sugar using a low amount of commercial cellulase (25 U·g-1 corncob) from Aspergillus niger. The corncob samples were examined for composition changes by X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FTIR), and scanning electron microscopy (SEM). According to the results, microbial pretreatment with the fungus P. sordida SK7 was the most effective for enhancing enzymatic hydrolysis, giving approximately 40% improvement.
Abstract: Many virtual payment systems are available for conducting micropayments, and it is essential that their protocols satisfy the highest standards of correctness. This paper examines the Netpay protocol [3], provides its formalization as an automata model, and proves two important correctness properties, namely the absence of deadlock and the validity of an e-coin during the execution of the protocol. The paper assumes a cooperative customer and proves that the protocol executes according to its description.
Abstract: The liquid cargo contained in a partly-filled road tank vehicle is prone to dynamic slosh when subjected to external disturbances. Slosh behavior has been identified as a significant factor impairing the safety of liquid cargo transportation. Laboratory experiments were conducted to analyze fluid slosh in partly filled tanks. The experimental results, measured under forced harmonic excitation, reveal the three-dimensional nature of the fluid motion and the coupling between lateral and longitudinal fluid slosh at resonance. Several spectral components are observed in the transient slosh forces, which can be associated with the excitation, resonance, and beat frequencies. The peak slosh forces and moments in the vicinity of resonance are significantly larger than those of the equivalent rigid mass. Because of the coupling between the sloshing fluid and the vehicle body, dynamic fluid-structure interaction is an essential issue in the analysis of tank-vehicle dynamics. A dynamic pitch-plane model of a tridem truck incorporating the fluid slosh dynamics is developed to analyze the fluid-vehicle interaction under straight-line braking maneuvers. The results show that the vehicle responses are strongly associated with the characteristics of the slosh force and moment.