Abstract: A multicriteria linear programming problem with integer variables and a parameterized optimality principle "from lexicographic to Slater" is considered. A situation in which the initial coefficients of the penalty cost functions are not fixed but may be subject to variations is studied. For any efficient solution, appropriate measures of quality are introduced which incorporate information about variations of the penalty cost function coefficients. These measures correspond to the so-called stability and accuracy functions defined earlier for efficient solutions of a generic multicriteria combinatorial optimization problem with Pareto and lexicographic optimality principles. Various properties of such functions are studied, and the maximum norms of perturbations for which an efficient solution preserves the property of being efficient are calculated.
Abstract: While compressing text files is useful, compressing still image files is almost a necessity. A typical image takes up much more storage than a typical text message, and without compression images would be extremely clumsy to store and distribute. The amount of information required to store pictures on modern computers is quite large in relation to the bandwidth commonly available to transmit them over the Internet. Image compression addresses the problem of reducing the amount of data required to represent a digital image. The performance of any image compression method can be evaluated by measuring the root-mean-square error (RMSE) and the peak signal-to-noise ratio (PSNR). The method of image compression analyzed in this paper is based on the lossy JPEG image compression technique, the most popular compression technique for color images. JPEG compression is able to greatly reduce file size with minimal image degradation by throwing away the least "important" information. In standard JPEG, both chroma components are downsampled simultaneously; in this paper we compare the results when compression is done by downsampling a single chroma component. We demonstrate that a higher compression ratio is achieved when the chrominance-blue component is downsampled rather than the chrominance-red component, but that the peak signal-to-noise ratio is higher when chrominance red is downsampled rather than chrominance blue. In particular, we use the image hats.jpg to demonstrate JPEG compression using a low-pass filter, and show that the image is compressed with barely any visual difference under either method.
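The RMSE and PSNR measures used above to evaluate compression quality can be computed as follows (a minimal NumPy sketch; the 8-bit pixel range and the toy arrays are illustrative assumptions, not data from the paper):

```python
import numpy as np

def rmse(original, compressed):
    # Root-mean-square error between two images of equal shape
    diff = original.astype(np.float64) - compressed.astype(np.float64)
    return np.sqrt(np.mean(diff ** 2))

def psnr(original, compressed, max_val=255.0):
    # Peak signal-to-noise ratio in dB, assuming 8-bit images
    e = rmse(original, compressed)
    if e == 0:
        return float("inf")  # identical images
    return 20 * np.log10(max_val / e)

a = np.zeros((4, 4), dtype=np.uint8)
b = np.full((4, 4), 10, dtype=np.uint8)
print(rmse(a, b))            # 10.0
print(round(psnr(a, b), 2))  # 28.13
```

A lower RMSE (equivalently, a higher PSNR) indicates that the compressed image is closer to the original.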
Abstract: The demand for energy is increasing faster than generation, leading to power shortages in all sectors of society, and at peak hours the shortage is even higher. Unless we utilize energy-efficient technology, it is very difficult to minimize the shortage of energy, so energy efficiency programs and energy conservation have an important role. Energy-efficient technologies are cost intensive; hence it is not always possible to implement them in a country like India. In the present study, an educational building with operating hours from 10:00 a.m. to 05:00 p.m. has been selected to quantify the possibility of lighting energy conservation. As the operating hours fall in the daytime, integrating daylight with the artificial lighting system will definitely reduce lighting energy consumption. Moreover, keeping the initial investment low has been given priority, and hence the existing lighting installation was left unaltered. An automatic controller has been designed which operates as a function of the daylight entering through the windows, and the lighting system of the room functions accordingly. The results of integrating daylight were quite satisfactory for visual comfort as well as energy conservation.
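The daylight-linked control logic described above can be sketched as a simple illuminance top-up rule (a hypothetical illustration; the 500 lux target and the linear dimming law are assumptions, not details of the actual controller):

```python
def lamp_fraction(daylight_lux, target_lux=500.0):
    # Fraction of artificial lighting needed to top up daylight
    # to the target illuminance, clamped to [0, 1].
    deficit = target_lux - daylight_lux
    return max(0.0, min(1.0, deficit / target_lux))

print(lamp_fraction(0))    # 1.0 (no daylight: full artificial light)
print(lamp_fraction(250))  # 0.5 (half the target met by daylight)
print(lamp_fraction(800))  # 0.0 (ample daylight: lights off)
```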
Abstract: Explosions may cause intensive damage to buildings and sometimes lead to total or progressive destruction. The pressures induced by explosions are among the most destructive loads a structure may experience. Since designing structures for large explosions may be expensive and impractical, engineers are looking for methods of preventing the destruction that results from explosions. A favorable structural system is one which does not collapse totally due to a local explosion, since such structures sustain less loss compared with ones that bear the full load and then collapse suddenly. Designing and building vital installations so that they are resistant to a direct hit by a bomb or rocket is not practical, economical, or expedient in many cases, because the cost of construction and installation to such specifications is several times the total cost of the related equipment.
Abstract: In this paper the delamination phenomenon in a carbon-epoxy laminated composite material is investigated numerically. The Arcan apparatus and specimen are modeled in the ABAQUS finite element software for different loading conditions and crack geometries. The influence of the variation of crack geometry on the interlaminar fracture stress intensity factor and energy release rate for various mixed-mode ratios and for pure modes I and II was studied. Correction factors for this specimen for different crack length ratios were also calculated. The finite element results indicate that for loading angles close to pure mode-II loading, a high ratio of mode-II to mode-I fracture is dominant, with the opposite trend for loading angles close to pure mode-I loading. This confirms that by varying the loading angle of the Arcan specimen, pure mode-I, pure mode-II, and a wide range of mixed-mode loading conditions can be created and tested. The numerical results also confirm that increasing the mode-II loading contribution leads to an increase of fracture resistance in the CF/PEI composite (i.e., a reduction in the total strain energy release rate), and that increasing the crack length leads to a reduction of interlaminar fracture resistance in the CF/PEI composite (i.e., an increase in the total interlaminar strain energy release rate).
Abstract: This paper proposes a hybrid method for eye localization
in facial images. The novelty is in combining techniques
that utilise colour, edge and illumination cues to improve accuracy.
The method is based on the observation that eye regions have dark colour, a high density of edges, and low illumination compared to other parts of the face. The first step in the method is to extract
connected regions from facial images using colour, edge density and
illumination cues separately. Some of the regions are then removed
by applying rules that are based on the general geometry and shape
of eyes. The remaining connected regions obtained through these
three cues are then combined in a systematic way to enhance the
identification of the candidate regions for the eyes. The geometry
and shape based rules are then applied again to further remove the
false eye regions. The proposed method was tested using images from
the PICS facial images database. The proposed method achieves 93.7% and 87% accuracy for initial blob extraction and final eye detection, respectively.
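The combination of the three cue maps can be illustrated with a simple majority vote over binary masks (a hypothetical sketch; the paper's actual combination rule is more systematic and is not reproduced here):

```python
import numpy as np

def combine_cues(colour_mask, edge_mask, illum_mask, min_votes=2):
    # Each boolean mask flags candidate eye pixels from one cue
    # (colour, edge density, illumination); keep pixels supported
    # by at least min_votes of the three cues.
    votes = (colour_mask.astype(int) + edge_mask.astype(int)
             + illum_mask.astype(int))
    return votes >= min_votes

c = np.array([[True, False], [True, True]])
e = np.array([[True, True], [False, True]])
i = np.array([[False, False], [True, True]])
print(combine_cues(c, e, i))
# [[ True False]
#  [ True  True]]
```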
Abstract: This research aims to study lead pollution in the air of Babylon governorate, which results generally from vehicle exhausts in addition to industrial and human activities. The number of vehicles in Babylon governorate increased significantly after 2003, resulting in increased lead emissions into the air. Lead emissions were measured at seven stations distributed randomly across Babylon governorate. These stations were located in the Industrial (Al-Sena'ay) Quarter, 60 Street (near the Babylon sewer directorate), 40 Street (near the first intersection), Al-Hashmia city, Al-Mahaweel city, and Al-Musayab city, in addition to a station in Sayd Idris village belonging to Abugharaq district (an agricultural station for comparison). The measured concentrations at these stations were compared with the standard limit of the Environmental Protection Agency (EPA) of 2 μg/m3. The results of this study showed that the average lead concentration in Babylon governorate during 2010 was 3.13 μg/m3, which was greater than the standard limit (2 μg/m3). The maximum lead concentration was 6.41 μg/m3, recorded in the Industrial (Al-Sena'ay) Quarter during April, while the minimum concentration was 0.36 μg/m3, recorded at the agricultural station (Abugharaq) during December.
Abstract: A framework is presented for estimating the state of a dynamically varying environment where data are generated from heterogeneous sources possessing only partial knowledge about the environment. It is derived entirely within the Dempster-Shafer and evidence filtering frameworks. The belief about the current state is expressed through belief and plausibility functions. In addition to the Single Input Single Output evidence filter, a Multiple Input Single Output evidence filtering approach is introduced. A variety of applications, such as situational estimation of an emergency environment, can be developed within the framework. A fire propagation scenario is used to validate the proposed framework, and simulation results are presented.
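The belief and plausibility functions mentioned above rest on Dempster's rule of combination, which can be sketched as follows (a minimal illustration with made-up sensor masses for a fire scenario; the paper's filter adds evidence filtering over time on top of this):

```python
from itertools import product

def dempster_combine(m1, m2):
    # Combine two mass functions (dicts keyed by frozenset focal
    # elements) using Dempster's rule, normalizing out conflict.
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}

# Two sensors reporting on whether a cell is on fire
F, S = frozenset({"fire"}), frozenset({"safe"})
theta = F | S  # frame of discernment {fire, safe}
m1 = {F: 0.6, theta: 0.4}
m2 = {F: 0.5, theta: 0.5}
m = dempster_combine(m1, m2)
belief_fire = m[F]  # mass committed exactly to subsets of {fire}
plausibility_fire = sum(v for k, v in m.items() if k & F)
print(round(belief_fire, 2))        # 0.8
print(round(plausibility_fire, 2))  # 1.0
```

Belief lower-bounds and plausibility upper-bounds the support for "fire", which is how the framework expresses uncertainty about the current state.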
Abstract: Recent advances in wireless sensor networks have led to many routing methods designed for energy efficiency in wireless sensor networks. Although many routing methods have been proposed for ubiquitous sensor networks (USNs), a single routing method cannot remain energy-efficient when the environment of the sensor network varies. We focus on controlling network access to the various hosts and the services they offer, rather than on securing them one by one, within a network security model. When ubiquitous sensor networks are deployed in hostile environments, an adversary may compromise some sensor nodes and use them to inject false sensing reports. False reports can lead not only to false alarms but also to the depletion of the limited energy resources of battery-powered networks. The interleaved hop-by-hop authentication scheme detects such false reports through interleaved authentication. This paper presents an LMDD (low-energy method for data delivery) algorithm that provides energy efficiency by dynamically changing the protocols installed at the sensor nodes. The algorithm changes protocols based on the output of a fuzzy logic module, which gives the fitness level of each protocol for the current environment.
Abstract: Matrix metalloproteinase-3 (MMP-3) is a key member of the MMP family and is known to be present in coronary atherosclerotic lesions. Several studies have demonstrated that the MMP-3 5A/6A polymorphism modifies transcriptional activity in an allele-specific manner. We hypothesized that this polymorphism may play a role as a risk factor for the development of coronary stenosis. The aim of our study was to estimate the effect of the MMP-3 (5A/6A) gene polymorphism on interindividual variability in the risk of coronary stenosis in an Iranian population. DNA was extracted from white blood cells, and genotypes were obtained from coronary stenosis cases (n = 95) and controls (n = 100) by PCR (polymerase chain reaction) and restriction fragment length polymorphism techniques. Significant differences between cases and controls were observed for MMP-3 genotype frequencies (χ2 = 199.305, p < 0.001); the 6A allele was seen less frequently in the control group compared to the disease group (85.79% vs. 78%, 6A/6A+5A/6A vs. 5A/5A, p ≤ 0.001). These data imply the involvement of the -1612 5A/6A polymorphism in coronary stenosis, and suggest that the 6A/6A MMP-3 genotype is probably a genetic susceptibility factor for coronary stenosis.
Abstract: The addition of milli- or micro-sized particles to a heat transfer fluid is one of many techniques employed to improve the heat transfer rate. Though this looks simple, the method has practical problems such as high pressure loss, clogging, and erosion of the material of construction. These problems can be overcome by using nanofluids, which are dispersions of nanosized particles in a base fluid. Nanoparticles increase the thermal conductivity of the base fluid many-fold, which in turn increases the heat transfer rate. Nanoparticles also increase the viscosity of the base fluid, resulting in a higher pressure drop for the nanofluid compared to the base fluid. So it is imperative that the Reynolds number (Re) and the volume fraction be optimal for better thermal-hydraulic effectiveness. In this work, heat transfer enhancement using aluminium oxide nanofluids at low and high volume fractions in turbulent pipe flow with constant wall temperature has been studied by computational fluid dynamic modeling of the nanofluid flow, adopting the single-phase approach. Nanofluid up to a volume fraction of 1% is found to be an effective heat transfer enhancement technique. The Nusselt number (Nu) and friction factor predictions for the low volume fractions (i.e., 0.02%, 0.1%, and 0.5%) agree very well with the experimental values of Sundar and Sharma (2010), while predictions for the high volume fraction nanofluids (i.e., 1%, 4%, and 6%) are found to be in reasonable agreement with both experimental and numerical results available in the literature. So the computationally inexpensive single-phase approach can be used for heat transfer and pressure drop prediction of new nanofluids.
Abstract: This paper describes the design of a programmable FSK modulator based on a VCO and its implementation in a 0.35 μm CMOS process. The circuit is used to transmit digital data at a 100 kbps rate in the frequency range of 400-600 MHz. The design and operation of the modulator are discussed briefly. Further, the characteristics of the PLL, the frequency synthesizer, the VCO, and the whole design are elaborated. The variation between the proposed and tested specifications is presented. Finally, the layout of the sub-modules, the pin configuration, the final chip, and the test results are presented.
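The binary FSK principle underlying the modulator can be illustrated in software (a baseband sketch with made-up, scaled-down frequencies and sample rate; the actual design is a VCO/PLL hardware circuit operating at 400-600 MHz):

```python
import numpy as np

def fsk_modulate(bits, f0, f1, bit_rate, fs):
    # Map each bit to a sinusoid segment: f0 for bit 0, f1 for bit 1.
    samples_per_bit = int(fs / bit_rate)
    t = np.arange(samples_per_bit) / fs
    return np.concatenate([
        np.sin(2 * np.pi * (f1 if b else f0) * t) for b in bits
    ])

# Illustrative audio-band frequencies, not the chip's RF band
sig = fsk_modulate([1, 0, 1], f0=400.0, f1=600.0, bit_rate=100, fs=48000)
print(sig.shape)  # (1440,) -> 480 samples per bit, 3 bits
```

In the actual chip, the same bit-to-frequency mapping is done by switching the VCO control input between two programmed frequencies.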
Abstract: In comparison to the original SVM, which involves a quadratic programming task, LS-SVM simplifies the required computation, but unfortunately the sparseness of the standard SVM is lost. Another problem is that LS-SVM is only optimal if the training samples are corrupted by Gaussian noise. In Least Squares SVM (LS-SVM), the nonlinear solution is obtained by first mapping the input vector to a high-dimensional kernel space in a nonlinear fashion, where the solution is then calculated from a set of linear equations. In this paper a geometric view of the kernel space is introduced, which enables us to develop a new formulation to achieve a sparse and robust estimate.
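The linear equation set mentioned above is the LS-SVM dual system. A minimal sketch with an RBF kernel and made-up toy data (not from the paper) illustrates it; note that the solved dual weight vector is generally fully dense, which is exactly the lost sparseness the abstract refers to:

```python
import numpy as np

def lssvm_fit(X, y, gamma=1.0, sigma=1.0):
    # Solve the LS-SVM dual linear system with an RBF kernel:
    # [[0, 1^T], [1, K + I/gamma]] @ [b, alpha] = [0, y]
    n = len(y)
    sq = np.sum(X**2, axis=1)
    K = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / (2 * sigma**2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual weights alpha

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
b, alpha = lssvm_fit(X, y, gamma=10.0)
print(alpha.shape)  # (4,) -- one (typically nonzero) weight per sample
```

The first row of the system enforces the constraint that the alpha values sum to zero.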
Abstract: The use of renewable energy sources is becoming more crucial, and the wider application of renewable energy devices at domestic, commercial, and industrial levels reflects not only stronger awareness but also significantly increased installed capacity. Biomass, principally in the form of wood, has been converted into energy for human use for a long time. Gasification is a process that converts solid carbonaceous fuel into combustible gas by partial combustion. Gasifier models operate under various conditions because the parameters maintained in each model differ. This study used experimental data with three input variables, namely biomass consumption, temperature at the combustion zone, and ash discharge rate, and with gas flow rate as the single output variable. In this paper, response surface methods were applied to identify the gasifier system equation best suited to the experimental data. The results showed that the linear model gave the best results.
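A first-order response surface fit of the kind described can be sketched with ordinary least squares (the numbers below are made-up illustrative values, not the study's experimental data):

```python
import numpy as np

# Hypothetical data: columns are biomass consumption, combustion-zone
# temperature, ash discharge rate; y is gas flow rate (arbitrary units).
X = np.array([[1.0, 700.0, 0.1],
              [1.5, 750.0, 0.2],
              [2.0, 800.0, 0.2],
              [2.5, 820.0, 0.3],
              [3.0, 850.0, 0.4]])
y = np.array([2.1, 3.0, 3.9, 4.8, 5.9])

# First-order response surface: y = b0 + b1*x1 + b2*x2 + b3*x3
A = np.column_stack([np.ones(len(y)), X])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coeffs
r2 = 1 - np.sum((y - pred)**2) / np.sum((y - y.mean())**2)
print(coeffs.shape)  # (4,) -- intercept plus one coefficient per input
print(r2 > 0.9)      # True for this near-linear toy data
```

A higher-order response surface would add squared and interaction terms to the design matrix; the study found the linear form sufficient.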
Abstract: CScheme, a concurrent programming paradigm based on the scheme concept, enables concurrency schemes to be constructed from smaller synchronization units through a GUI-based composer and later reused on other concurrency problems of a similar nature. This paradigm is particularly important in the multi-core environments prevalent nowadays. In this paper, we demonstrate techniques to separate concurrency from functional code using the CScheme paradigm. We then illustrate how the CScheme methodology can be used to solve some traditional concurrency problems, such as the critical section problem and the readers-writers problem, using synchronization schemes such as the Single Threaded Execution Scheme and the Readers Writers Scheme.
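The Single Threaded Execution Scheme mentioned above corresponds to guarding a critical section with a mutual-exclusion lock. A minimal Python sketch of the idea (not CScheme itself, which composes such schemes through a GUI):

```python
import threading

class SingleThreadedExecution:
    # Only one thread at a time may run the guarded critical section,
    # so concurrent increments never interleave mid-update.
    def __init__(self):
        self._lock = threading.Lock()
        self.counter = 0

    def increment(self):
        with self._lock:  # critical section
            self.counter += 1

shared = SingleThreadedExecution()
threads = [
    threading.Thread(target=lambda: [shared.increment() for _ in range(1000)])
    for _ in range(4)
]
for t in threads: t.start()
for t in threads: t.join()
print(shared.counter)  # 4000
```

Separating the lock management (the scheme) from the increment logic (the functional code) is the kind of decoupling CScheme aims to make reusable.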
Abstract: Fischer-Tropsch synthesis is one of the most important catalytic reactions for converting synthesis gas to light and heavy hydrocarbons. One of the main issues is selecting the type of reactor. The slurry bubble reactor is a suitable choice for Fischer-Tropsch synthesis because of its good heat and mass transfer, high catalyst durability, and low maintenance and repair costs. The most common catalysts for Fischer-Tropsch synthesis are iron-based and cobalt-based; the advantage of one over the other depends on which type of hydrocarbons we desire to produce. In this study, Fischer-Tropsch synthesis is modeled with iron and cobalt catalysts in a slurry bubble reactor, considering the mass and momentum balances and the effect of the hydrodynamic relations on reactor behavior. Profiles of reactant conversion and reactant concentration in the gas and liquid phases were determined as functions of residence time in the reactor. The effects of temperature, pressure, liquid velocity, reactor diameter, catalyst diameter, gas-liquid and liquid-solid mass transfer coefficients, and kinetic coefficients on reactant conversion have been studied. With a 5% increase of liquid velocity (iron catalyst), H2 conversion increases by about 6% and CO conversion by about 4%; with an 8% increase of liquid velocity (cobalt catalyst), H2 conversion increases by about 26% and CO conversion by about 4%. With a 20% increase of the gas-liquid mass transfer coefficient, H2 conversion increases by about 12% and CO conversion by about 10% with the iron catalyst, and H2 conversion increases by about 10% and CO conversion by about 6% with the cobalt catalyst. The results show that the process is sensitive to the gas-liquid mass transfer coefficient and that the optimum operating condition occurs at the maximum possible liquid velocity. This velocity must be above the minimum fluidization velocity and below the terminal velocity so that catalyst particles do not leave the fluidized bed.
Abstract: Sparse representation, which can represent high-dimensional data effectively, has been successfully used in computer vision and pattern recognition problems. However, it does not consider the label information of data samples. To overcome this limitation, we develop in this paper a novel dimensionality reduction algorithm, namely discriminatively regularized sparse subspace learning (DR-SSL). The proposed DR-SSL algorithm not only makes use of the sparse representation to model the data, but can also effectively employ the label information to guide the dimensionality reduction procedure. In addition, the presented algorithm can effectively deal with the out-of-sample problem. Experiments on gene-expression data sets show that the proposed algorithm is an effective tool for dimensionality reduction and gene-expression data classification.
Abstract: This paper presents a robust method to detect obstacles in stereo images using a shadow removal technique and color information. Stereo-vision-based obstacle detection is an algorithm that aims to detect obstacles and compute their depth using stereo matching and a disparity map. The proposed method is divided into three phases: the first phase detects obstacles and removes shadows, the second performs matching, and the last computes depth. In the first phase, we propose a robust method for detecting obstacles in stereo images using a shadow removal technique based on color information in HSI space. For matching we use the normalized cross-correlation (NCC) function with a 5 × 5 window: we prepare an empty matching table τ and grow disparity components by drawing a seed s from a seed set S, computed using a Canny edge detector, and adding it to τ. In this way we achieve higher performance than previous works [2,17]. A fast stereo matching algorithm is proposed that visits only a small fraction of the disparity space in order to find a semi-dense disparity map; it works by growing from a small set of correspondence seeds. The obstacles identified in phase one that appear in the disparity map of phase two enter the third phase of depth computation. Finally, experimental results are presented to show the effectiveness of the proposed method.
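The NCC matching score used in the second phase can be computed as follows (a minimal NumPy sketch; the toy patches are illustrative assumptions, not data from the paper):

```python
import numpy as np

def ncc(patch_left, patch_right):
    # Normalized cross-correlation between two equally sized patches,
    # e.g. the 5x5 windows used in the matching step above.
    a = patch_left.astype(np.float64).ravel()
    b = patch_right.astype(np.float64).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a @ a) * (b @ b))
    if denom == 0:
        return 0.0  # flat patches carry no correlation information
    return (a @ b) / denom

p = np.arange(25.0).reshape(5, 5)
print(round(ncc(p, p), 3))          # 1.0 (identical patches)
print(round(ncc(p, 2 * p + 3), 3))  # 1.0 (invariant to gain and offset)
```

The gain-and-offset invariance of NCC is what makes it robust to brightness differences between the left and right camera views.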
Abstract: The study aimed to evaluate the reproductive performance response to short-term oestrus synchronization during the transition period. One hundred and sixty-five indigenous multiparous non-lactating goats were subdivided into the following six treatment groups for oestrus synchronization: NT control group (n = 30); Fe-21d, FGA vaginal sponge for 21 days + eCG on the 19th day; FPe-11d, FGA 11 days + PGF2α and eCG on the 9th day; FPe-10d, FGA 10 days + PGF2α and eCG on the 8th day; FPe-9d, FGA 9 days + PGF2α and eCG on the 7th day; PFe-5d, PGF2α on day 0 + FGA 5 days + eCG on the 5th day. The goats were naturally mated (1 male per 6 females). Fecundity rates (no. births / no. females treated × 100) were statistically higher (P < 0.05) in the short-term FPe-9d (157.9%), FPe-11d (115.4%), FPe-10d (111.1%), and PFe-5d (107.7%) groups compared to the NT control group (66.7%).
Abstract: The primary purpose of this article is to examine the implications of globalization for education. Globalization plays an important role as a process in the economic, political, cultural, and technological dimensions of contemporary human life. Education has its own effects on this process: by educating global citizens with universal human features and characteristics it influences globalization, while also being influenced by the phenomenon. Nowadays, the role of education is not just to develop in students the knowledge and skills necessary for new kinds of jobs. If education is to help students be prepared for the new global society, it has to make them engaged, productive, and critical citizens for the global era, so that they can reflect on their roles as key actors in a dynamic, often uneven matrix of economic and cultural exchanges. If education is to reinforce and raise the national identity and the value system of children and teenagers, it should make them ready for living in the global era of this century. The method used in this research is documentary analysis of the relevant documents. Studies in this field show that globalization influences the processes of production, distribution, and consumption of knowledge. Occurring in the information era, this has not only provided the necessary opportunities for educational exchange worldwide but also offers advantages for developing countries, enabling them to strengthen the educational bases of their societies and take an important step toward their future.