Abstract: The Genetic Zone Routing Protocol (GZRP) is a new hybrid routing
protocol for MANETs that extends ZRP with a Genetic Algorithm (GA). GZRP
applies the GA to the IERP and BRP components of ZRP to provide a limited
set of alternative routes to the destination, in order to balance the
network load and to improve robustness against node/link failure during
the route discovery process. GZRP has previously been compared with ZRP
along several dimensions, such as scalability of packet delivery, with
improved results. This paper presents the results of the effect of load
balancing on GZRP. The results show that GZRP outperforms ZRP while
balancing the load.
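The GA-based selection of a limited set of alternative routes can be illustrated with a small sketch (the candidate routes, link loads, fitness function, and GA parameters below are invented for this example; this is not the GZRP implementation):

```python
import random

random.seed(0)

# Hypothetical candidate routes (node sequences) and per-link loads.
ROUTES = [["S", "A", "D"], ["S", "B", "C", "D"],
          ["S", "E", "D"], ["S", "F", "G", "D"]]
LINK_LOAD = {("S", "A"): 5, ("A", "D"): 9, ("S", "B"): 2, ("B", "C"): 3,
             ("C", "D"): 2, ("S", "E"): 7, ("E", "D"): 6, ("S", "F"): 1,
             ("F", "G"): 4, ("G", "D"): 2}

def bottleneck(route):
    # Load of the most loaded link along the route.
    return max(LINK_LOAD[(a, b)] for a, b in zip(route, route[1:]))

def fitness(mask):
    # Chromosome = bitmask over candidate routes; keep exactly two
    # alternatives (a "limited set") and prefer lightly loaded ones.
    chosen = [bottleneck(r) for r, bit in zip(ROUTES, mask) if bit]
    return 1.0 / sum(chosen) if len(chosen) == 2 else 0.0

def evolve(pop_size=20, gens=40, pmut=0.3):
    pop = [[random.randint(0, 1) for _ in ROUTES] for _ in range(pop_size)]
    for _ in range(gens):
        elite = max(pop, key=fitness)          # elitism: keep the best mask
        nxt = [elite]
        while len(nxt) < pop_size:
            p1 = max(random.sample(pop, 2), key=fitness)   # tournament
            p2 = max(random.sample(pop, 2), key=fitness)
            cut = random.randrange(1, len(ROUTES))         # 1-point crossover
            child = p1[:cut] + p2[cut:]
            if random.random() < pmut:                     # bit-flip mutation
                i = random.randrange(len(child))
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
```

The fittest mask keeps the two routes with the lightest bottleneck links, spreading traffic away from the most loaded path while retaining a fallback for node/link failure.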
Abstract: Sleep spindles are the most distinctive hallmark of stage 2
sleep EEG. Their accurate identification in a polysomnographic signal is
essential for sleep professionals when marking stage 2 sleep. Sleep
spindles are also promising objective indicators for neurodegenerative
disorders. Visual spindle scoring, however, is a tedious task. In this
paper, three different
approaches are used for the automatic detection of sleep spindles:
Short Time Fourier Transform, Wavelet Transform and Wave
Morphology for Spindle Detection. In order to improve the results, a
combination of the three detectors is presented and comparison with
human expert scorers is performed. The best performance is obtained
with a combination of the three algorithms which resulted in a
sensitivity and specificity of 94% when compared to human expert
scorers.
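One of the three detectors, STFT-based detection, can be sketched as follows (a minimal illustration on a synthetic trace; the sampling rate, window length, sigma band, and threshold are assumptions, not the paper's settings):

```python
import numpy as np

FS = 100                    # sampling rate in Hz (assumed)
SIGMA_BAND = (11.0, 16.0)   # spindle ("sigma") frequency band in Hz

def stft_band_power(x, fs=FS, win=1.0, step=0.25, band=SIGMA_BAND):
    """Short-time power of x inside `band`; returns (power, window starts)."""
    n, hop = int(win * fs), int(step * fs)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    sel = (freqs >= band[0]) & (freqs <= band[1])
    starts = np.arange(0, len(x) - n + 1, hop)
    power = np.array([np.sum(np.abs(np.fft.rfft(x[s:s + n]
                      * np.hanning(n)))[sel] ** 2) for s in starts])
    return power, starts

# Synthetic EEG-like trace: noise plus a 13 Hz "spindle" between 4 s and 5 s.
rng = np.random.default_rng(1)
t = np.arange(0, 10, 1.0 / FS)
x = 0.5 * rng.standard_normal(t.size)
burst = (t >= 4) & (t < 5)
x[burst] += np.sin(2 * np.pi * 13 * t[burst])

power, starts = stft_band_power(x)
threshold = 4 * np.median(power)           # hypothetical detection threshold
detected = starts[power > threshold] / FS  # detected window starts, seconds
```

On this synthetic trace, only windows overlapping the 13 Hz burst exceed the threshold; a real detector would add duration and amplitude criteria.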
Abstract: Microtomographic images and thin section (TS)
images were analyzed and compared against some parameters of
geological interest such as porosity and its distribution along the
samples. The results show that microtomography (CT) analysis,
although limited by its resolution, provides useful information about the
distribution of porosity (homogeneous or not) and can also quantify the
connected and non-connected pores, i.e., total porosity. TS analysis has
no limitations concerning resolution, but is limited by the experimental
data available (a few glass sheets per analysis) and can give information
only about the connected pores, i.e., effective porosity. The two methods
have their own virtues and flaws, but when paired they complement one
another, making for a more reliable and complete analysis.
Abstract: We present a method to create special domain
collections from news sites. The method only requires a single
sample article as a seed. No prior corpus statistics are needed and the
method is applicable to multiple languages. We examine various
similarity measures and the creation of document collections for
English and Japanese. The main contributions are as follows. First,
the algorithm can build special domain collections from as little as
one sample document. Second, unlike other algorithms, it does not
require a second “general” corpus to compute statistics. Third, in our
testing the algorithm outperformed others in creating collections
made up of highly relevant articles.
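The single-seed setting can be sketched with a plain term-frequency cosine similarity, which needs no corpus statistics (the similarity measure, threshold, and documents here are illustrative assumptions, not the paper's exact method):

```python
import math
from collections import Counter

def tf_cosine(a_tokens, b_tokens):
    # Cosine similarity between raw term-frequency vectors; no corpus
    # statistics (e.g. IDF from a second "general" corpus) are required.
    a, b = Counter(a_tokens), Counter(b_tokens)
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def build_collection(seed, candidates, threshold=0.3):
    # Keep the candidate articles similar enough to the single seed.
    seed_toks = seed.lower().split()
    return [d for d in candidates
            if tf_cosine(seed_toks, d.lower().split()) >= threshold]

seed = "the league match ended with a late goal as the home team won the cup"
candidates = [
    "the away team scored an early goal to win the league match",
    "quarterly earnings rose sharply on strong chip demand",
]
collection = build_collection(seed, candidates)
```

The on-topic sports article shares enough vocabulary with the seed to pass the threshold, while the finance article scores zero and is dropped.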
Abstract: A scaffold is necessary for tooth regeneration because of its three-dimensional geometry. For restoration of a defect, the scaffold must be prepared in the shape of the defect. Sponges made from polyvinyl alcohol with formalin cross-linking (PVF sponges) have been used as scaffolds for bone formation in vivo. To induce osteogenesis within the sponge, growing rat bone marrow cells (rBMCs) among the fiber structures of the sponge was considered. Storage of rBMCs among the fibers of a sponge coated with dextran (10 kDa) was attempted. After seeding rBMCs onto a PVF sponge immersed in dextran solution at a concentration of 2 g/dl, osteogenesis was recognized in the subcutaneously implanted PVF sponge scaffold in vivo. The level of osteocalcin was 25.28±5.71 ng/scaffold and that of Ca was 129.20±19.69 µg/scaffold. These values were significantly higher than those in sponges without dextran coating (p
Abstract: Recently, Genetic Algorithms (GA) and the Differential
Evolution (DE) algorithm have attracted considerable attention among
modern heuristic optimization techniques. Since the two approaches are
supposed to find a solution to a given objective function but employ
different strategies and computational effort, it is appropriate to
compare their performance. This paper
presents the application and performance comparison of DE and GA
optimization techniques, for flexible ac transmission system
(FACTS)-based controller design. The design objective is to enhance
the power system stability. The design problem of the FACTS-based
controller is formulated as an optimization problem, and both the DE
and GA optimization techniques are employed to search for optimal
controller parameters. The performance of both optimization
techniques has been compared. Further, the optimized controllers are
tested on a weakly connected power system subjected to different
disturbances, and their performance is compared with the
conventional power system stabilizer (CPSS). The eigenvalue
analysis and non-linear simulation results are presented and
compared to show the effectiveness of both the techniques in
designing a FACTS-based controller, to enhance power system
stability.
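A minimal sketch of the DE algorithm used in such controller tuning (the canonical DE/rand/1/bin scheme on a toy quadratic objective; the actual FACTS controller objective and parameter bounds are not reproduced here):

```python
import random

def differential_evolution(obj, bounds, pop_size=20, F=0.8, CR=0.9, gens=100):
    """Minimal DE/rand/1/bin minimiser; `bounds` is a list of (lo, hi)."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation: combine three distinct vectors other than pop[i].
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = random.randrange(dim)   # guarantee one mutated component
            trial = [a[k] + F * (b[k] - c[k])
                     if (random.random() < CR or k == j_rand) else pop[i][k]
                     for k in range(dim)]    # binomial crossover
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            if obj(trial) <= obj(pop[i]):    # greedy selection
                pop[i] = trial
    return min(pop, key=obj)

# Toy stand-in for a controller-tuning objective: minimum at (1, -2, 0.5).
random.seed(0)
obj = lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2 + (x[2] - 0.5) ** 2
best = differential_evolution(obj, [(-5, 5)] * 3)
```

The greedy trial-versus-target selection is what distinguishes DE from a plain GA: a trial vector replaces its parent only if it is at least as good.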
Abstract: After the accounting scandals and the financial crisis, regulators have stressed the need for more financial experts on boards. Several studies conducted in countries with developed capital markets report positive effects of board financial competencies. As each country offers a different context and specific institutional factors, this paper addresses the subject in the context of Romania. The Romanian capital market offers an interesting research field because of the heterogeneity of listed firms. After analyzing board members' education based on public information posted on listed companies' websites and in their annual reports, we found a positive association between the proportion of board members holding a postgraduate degree in a financial field and market-based performance measured by Tobin's Q. We also found that the proportion of board members holding degrees in financial fields is higher in bigger firms and in firms with more concentrated ownership.
Abstract: The H.264/AVC standard uses intra prediction with 9
directional modes for 4x4 and 8x8 luma blocks and 4 directional modes
for 16x16 macroblocks and 8x8 chroma blocks, respectively. This means
that, for a macroblock, 736 different RDO calculations have to be
performed before the best RDO mode is determined. With this multiple
intra-mode prediction, intra coding in H.264/AVC offers considerably
higher coding efficiency than other compression standards, but its
computational complexity increases significantly. This paper presents a
fast intra prediction algorithm for H.264/AVC intra prediction based on
homogeneity information. In this study, a gradient prediction method is
used to predict homogeneous areas and a quadratic prediction function is
used to predict non-homogeneous areas. Based on the correlation between
homogeneity and block size, smaller blocks are predicted by both
gradient and quadratic prediction, while bigger blocks are predicted by
gradient prediction alone. Experimental results show that the proposed
method reduces complexity by up to 76.07% while maintaining similar
PSNR quality, with about a 1.94% bit-rate increase on average.
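The homogeneity test that drives such a mode decision can be sketched as a gradient-magnitude check (a simplified stand-in; the paper's actual gradient prediction method and threshold are not reproduced, and the threshold below is a hypothetical tuning parameter):

```python
import numpy as np

def is_homogeneous(block, threshold=8.0):
    # Classify a luma block by its mean gradient magnitude: a smooth
    # (homogeneous) area has small gradients, a textured one does not.
    gy, gx = np.gradient(block.astype(float))
    return float(np.mean(np.hypot(gx, gy))) < threshold

flat = np.full((8, 8), 128)                    # smooth area
rng = np.random.default_rng(0)
textured = rng.integers(0, 256, size=(8, 8))   # busy area
```

A fast encoder can then skip the expensive full RDO search for blocks classified as homogeneous and evaluate only the cheap prediction path.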
Abstract: In this paper, an analysis is presented, which
demonstrates the effect pre-logic factoring could have on an
automated combinational logic synthesis process succeeding it. The
impact of pre-logic factoring for some arbitrary combinational
circuits synthesized within an FPGA-based logic design environment
has been analyzed previously. This paper explores a similar effect,
but with the non-regenerative logic synthesized using elements of a
commercial standard cell library. On an overall basis, the results
obtained pertaining to the analysis on a variety of MCNC/IWLS
combinational logic benchmark circuits indicate that pre-logic
factoring has the potential to facilitate simultaneous power, delay and
area optimized synthesis solutions in many cases.
Abstract: Segmentation is an important step in medical image
analysis and classification for radiological evaluation or
computer-aided diagnosis. Computer-aided diagnosis (CAD) of lung CT
generally first segments the area of interest (the lung) and then
analyzes the obtained area separately for nodule detection in order to
diagnose the disease. For a normal lung, segmentation can be performed
by making use of the excellent contrast between air and the surrounding
tissues. However, this approach fails when the lung is affected by
high-density pathology. Dense pathologies are present in approximately
a fifth of clinical scans, and for computer analysis such as detection
and quantification of abnormal areas it is vital that the entire lung
region of the image is preserved and that no part present in the
original image is eradicated. In this paper we propose a lung
segmentation technique that accurately segments the lung parenchyma
from lung CT scan images. The algorithm was tested on 25 datasets of
different patients received from Akron University, USA and Aga Khan
Medical University, Karachi, Pakistan.
Abstract: A suspension bridge is the most suitable type of structure for a long-span bridge due to its rational use of structural materials. Increased deformability, caused by the appearance of elastic and kinematic displacements, is the major disadvantage of suspension bridges. The problem of increased kinematic displacements under the action of non-symmetrical load can be solved by prestressing. A prestressed suspension bridge with a span of 200 m was considered as the object of investigation. A cable truss with a cross web was considered as the main load-carrying structure of the prestressed suspension bridge. The cable truss was optimized over 47 variable factors using a genetic algorithm and the FEM program ANSYS. It was found that the maximum total displacements are reduced by up to 29.9% by using the cable truss with the rational characteristics instead of a single cable in the case of the worst-placed load.
Abstract: The tray/multi-tray distillation process is a topic that
has been investigated in great detail over the last decade by many
teams, such as Jubran et al. [1], Adhikari et al. [2], Mowla et al. [3],
Shatat et al. [4] and Fath [5], to name a few. A significant amount of
work and effort has been spent on modeling and/or simulation of
specific distillation hardware designs. In this work, we have focused
our efforts on investigating and gathering experimental data on
several engineering and design variables to quantify their influence
on the yield of the multi-tray distillation process. Our goal is to
generate experimental performance data that bridge some existing gaps
in the design, engineering, optimization and theoretical modeling
aspects of the multi-tray distillation process.
Abstract: In this paper, a two-factor scheme is proposed to
generate cryptographic keys directly from biometric data, which,
unlike passwords, are strongly bound to the user. The hash value of the
reference iris code is used as the cryptographic key, and its length
depends only on the hash function, being independent of any other
parameter. The entropy of such keys is 94 bits, which is much higher
than any other comparable system. The most important and distinct
feature of this scheme is that it regenerates the reference iris code by
providing a genuine iris sample and the correct user password. Since
iris codes obtained from two images of the same eye are not exactly
the same, error correcting codes (Hadamard code and Reed-Solomon
code) are used to deal with the variability. The scheme proposed here
can be used to provide keys for a cryptographic system and/or for
user authentication. The performance of this system is evaluated on
two publicly available databases for iris biometrics, namely the CBS
and ICE databases. The operating point of the system (values of False
Acceptance Rate (FAR) and False Rejection Rate (FRR)) can be set
by properly selecting the error correction capacity (ts) of the Reed-
Solomon codes, e.g., on the ICE database, at ts = 15, FAR is 0.096%
and FRR is 0.76%.
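The key-binding idea can be sketched as a fuzzy commitment (here a simple repetition code stands in for the Hadamard/Reed-Solomon cascade, the password factor is omitted, and all sizes are illustrative, not the paper's parameters):

```python
import hashlib
import secrets

REP = 7  # repetition factor; a stand-in for the Hadamard/RS error correction

def lock(key_bits, iris_bits):
    # Bind the key to the iris code: XOR the encoded key with the
    # reference code. The stored value reveals neither on its own.
    code = [b for b in key_bits for _ in range(REP)]   # repetition encode
    return [c ^ i for c, i in zip(code, iris_bits)]

def unlock(locked, iris_bits):
    # Recover the key from a fresh (noisy) iris sample; majority voting
    # inside each repetition group corrects scattered bit errors.
    code = [l ^ i for l, i in zip(locked, iris_bits)]
    return [int(sum(code[i * REP:(i + 1) * REP]) > REP // 2)
            for i in range(len(code) // REP)]

def key_of(bits):
    # The hash of the recovered bits serves as the cryptographic key,
    # so key length depends only on the hash function (here SHA-256).
    return hashlib.sha256(bytes(bits)).hexdigest()

key = [secrets.randbelow(2) for _ in range(16)]
iris_ref = [secrets.randbelow(2) for _ in range(16 * REP)]
locked = lock(key, iris_ref)

noisy = iris_ref[:]
noisy[3] ^= 1; noisy[40] ^= 1; noisy[90] ^= 1   # a few acquisition errors
```

Because the three injected errors fall into different repetition groups, majority decoding recovers the exact key bits, and hashing them reproduces the same cryptographic key.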
Abstract: This paper proposes a new algebraic scheme to design a PID controller for higher-order linear time-invariant continuous systems. Modified PSO (MPSO) based model order formulation techniques have been applied to obtain an effective formulated second-order system. A controller is tuned to meet the desired performance specifications using the pole-zero cancellation method. The proposed PID controller is attached to both the higher-order system and the formulated second-order system. The closed-loop response is observed for the stabilization process and compared with a general PSO based formulated second-order system. The proposed method is illustrated through a numerical example from the literature.
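The pole-zero cancellation step can be sketched for a generic second-order plant G(s) = K / (s^2 + a*s + b) (a textbook illustration with hypothetical numbers, not the paper's MPSO-formulated system):

```python
def pid_by_pole_zero_cancellation(K, a, b, kc=2.0):
    """PID gains that cancel the poles of G(s) = K / (s^2 + a*s + b).

    With C(s) = Kd*s + Kp + Ki/s, the PID numerator Kd*s^2 + Kp*s + Ki
    is chosen proportional to the plant denominator, so the open loop
    reduces to kc*K/s and the unity-feedback closed loop becomes a
    first-order lag with time constant 1/(kc*K). `kc` sets the desired
    closed-loop bandwidth."""
    return {"Kp": kc * a, "Ki": kc * b, "Kd": kc}

# Hypothetical reduced second-order plant: G(s) = 1.5 / (s^2 + 3s + 2).
gains = pid_by_pole_zero_cancellation(K=1.5, a=3.0, b=2.0, kc=2.0)
```

Matching the controller zeros to the plant poles is what makes the tuning algebraic: no search is needed once the second-order model is available.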
Abstract: A co-generation system in an automobile can improve the
thermal efficiency of the vehicle to some degree. The waste heat from the
engine exhaust and coolant is still an attractive energy source, as it
amounts to around 60% of the total energy converted from fuel. To
maximize the effectiveness of the heat exchangers for recovering the
waste heat, it is vital to select the most suitable working fluid for the
system, not to mention that it is important to find the optimum design
for the heat exchangers. The design of the heat exchanger is out of the
scope of this study; rather, the main focus has been on the right
selection of the working fluid for the co-generation system. A simulation
study was carried out to find the most suitable working fluid that allows
the system to achieve the optimum efficiency in terms of the heat
recovery rate and thermal efficiency.
Abstract: In April 2009, a new variant of Influenza A virus
subtype H1N1 emerged in Mexico and spread all over the world. Influenza
A has three subtypes circulating in humans (H1N1, H1N2 and H3N2), while
types B and C tend to be associated with local or regional epidemics.
Preliminary genetic characterization identified the new viruses as swine
influenza A (H1N1) viruses. Nucleotide sequence analysis shows that the
haemagglutinin (HA) gene and the majority of the other gene segments are
similar to those of swine influenza viruses, and that the genes coding
for the neuraminidase (NA) and matrix (M) proteins are likewise similar
to the corresponding genes of swine influenza. Sequence similarity
between the 2009 A (H1N1) virus and its nearest relatives indicates that
its gene segments have been circulating undetected for an extended
period. Maximum Likelihood (MCL) analysis of the nucleic acid sequences
with DNA empirical base frequencies reveals the phylogenetic
relationships amongst the HA genes of H1N1 viruses deposited in GenBank,
which show high nucleotide sequence homology.
In this paper we used 16 HA nucleotide sequences from NCBI to compute
sequence similarity relationships of swine influenza A viruses. Using
the MCL method, the result is 28%; 36.64% for the optimal tree with the
sum of branch lengths; 35.62% for interior-branch phylogeny with a
Neighbor-Joining tree; 1.85 for the overall transition/transversion
ratio; and 8.28% for the overall mean distance.
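The overall transition/transversion ratio and overall mean distance can be computed, in simplified form, as follows (toy aligned sequences; the uncorrected p-distance here stands in for the model-based distances that phylogenetics tools typically report):

```python
PURINES, PYRIMIDINES = {"A", "G"}, {"C", "T"}

def ti_tv_and_distance(seqs):
    """Overall transition/transversion ratio and mean p-distance over
    all pairs of equal-length aligned nucleotide sequences."""
    ti = tv = diff = total = 0
    for i in range(len(seqs)):
        for j in range(i + 1, len(seqs)):
            for x, y in zip(seqs[i], seqs[j]):
                total += 1
                if x == y:
                    continue
                diff += 1
                # Transition: A<->G or C<->T; anything else is a transversion.
                same_class = ({x, y} <= PURINES) or ({x, y} <= PYRIMIDINES)
                ti += same_class
                tv += not same_class
    return ti / tv if tv else float("inf"), diff / total

# Three short, hypothetical aligned sequences for illustration.
ratio, mean_dist = ti_tv_and_distance(["ACGTACGT", "GCGTACTT", "ACGAACGT"])
```

For these toy sequences there are 2 transitions and 4 transversions across the three pairs (ratio 0.5), and 6 mismatches out of 24 compared sites (mean distance 0.25).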
Abstract: Modern applications realized on FPGAs exhibit high connectivity demands. Throughout this paper we study the routing constraints of Virtex devices and propose a systematic methodology for designing a novel general-purpose interconnection network targeting reconfigurable architectures. This network consists of multiple segment wires and SB patterns, appropriately selected and assigned across the device. The goal of our proposed methodology is to maximize the hardware utilization of the fabricated routing resources. The derived interconnection scheme is integrated on a Virtex-style FPGA. This device is characterized both by its high performance and by its low energy requirements. Consequently, the design criterion that guided our architectural selections was the minimal Energy×Delay Product (EDP). The methodology is fully supported by three new software tools, which belong to the MEANDER Design Framework. Using a typical set of MCNC benchmarks, an extensive comparison study in terms of several critical parameters proves the effectiveness of the derived interconnection network. More specifically, we achieve an average Energy×Delay Product reduction of 63%, a performance increase of 26%, a reduction in leakage power of 21%, and a reduction in total energy consumption of 11%, at the expense of a 20% increase in channel width.
Abstract: In this paper our aim is to explore the construction of schoolgirl femininities, drawing on the results of an ethnographic study conducted in a high school in Ankara, Turkey. In this case study, which tries to explore the complexities of gender discourses, we were initially motivated by the questions put forward by critical and feminist literature on education, which emphasizes the necessarily conflicting and partial nature of both reproduction and resistance and the importance of gendered power relations in the school context. Drawing on this paradigm, our research tries to address a more specific question: how are multiple schoolgirl femininities constructed within the context of gendered school culture, and especially in relation to hegemonic masculinity? Our study reveals that the general framework of multiple femininities is engendered by a tension between two inter-related positions. The first is the different strategies of accommodation and resistance to the gender-related problems of education. The second is the school experience of girls, which is conditioned by their differential position vis-à-vis the masculine resistance culture that is dominant in the school.
Abstract: Impinging jets are used in various industrial areas as a cooling and drying technique. The current research is concerned with means of improving the heat transfer for configurations with a minimum distance of the nozzle to the impingement surface. The impingement heat transfer is described using numerical methods over a wide range of parameters for an array of planar jets. These parameters include the jet flow speed, the width of the nozzle, the distance of the nozzle, the angle of the jet flow, and the velocity and geometry of the impingement surface. Normal pressure and shear stress are computed as additional parameters. Using dimensionless characteristic numbers, the parameters and the results are correlated to obtain generalized equations. The results demonstrate the effect of the investigated parameters on the flow.
Abstract: Automatic reading of handwritten cheques is a computationally
complex process, and it plays an important role in financial
risk management. Machine vision and learning provide a viable
solution to this problem. Research effort has mostly been focused
on recognizing diverse pitches of cheques and demand drafts with an
identical outline. However, most of these methods employ
template matching to localize the pitches, and such schemes could
potentially fail when applied to the different types of outline
maintained by banks. In this paper, the so-called outline problem is
resolved by a cheque information tree (CIT), which generalizes the
localizing method to extract active-region-of-entities. In addition, a
weight-based density plot (WBDP) is used to isolate text entities and
read complete pitches. Recognition is based on texture features using
neural classifiers. The legal amount is subsequently recognized by both
texture and perceptual features. A post-processing phase is invoked
to detect incorrect readings by a Type-2 grammar using a Turing
machine. The performance of the proposed system was evaluated
using cheques and demand drafts of 22 different banks. The test data
consist of a collection of 1540 leaves obtained from 10 different
account holders from each bank. Results show that this approach
can easily be deployed without significant design amendments.