Abstract: Evaluation of contact pressure and of surface and
subsurface contact stresses is essential to understanding the functional
response of surface coatings; the contact behavior depends mainly
on surface roughness, material properties, layer thickness and the
manner of loading. Contact parameter evaluation of real rough
surface contacts mostly relies on statistical single-asperity contact
approaches. In this work, a three-dimensional layered-solid rough
surface in contact with a rigid flat is modeled and analyzed using
the finite element method. The rough surface of the layered solid is
generated by an FFT approach and exported to the ANSYS finite
element package, where bottom-up solid modeling is employed to
create a deformable solid model with a layered rough surface on top.
The discretization and contact analysis are carried out in the same
ANSYS package.
The elastic, elastoplastic and plastic deformations are continuous in
the present finite element method, unlike in many other contact models.
The Young's modulus to yield strength ratio of the layer is varied in the
present work to observe its effect on the contact parameters while keeping
the surface roughness and substrate material properties constant.
The contacting asperities attain elastic, elastoplastic and plastic states
continuously, and the asperity-interaction phenomenon is inherently
included. The resulting contact parameters show that neighboring-asperity
interaction and the Young's modulus to yield strength ratio
of the layer influence the bulk deformation and consequently affect the
interface strength.
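As an aside on the FFT surface-generation step, one common approach filters Gaussian white noise in the frequency domain and rescales it to a target RMS roughness. The NumPy sketch below uses hypothetical grid, correlation-length and roughness parameters, not the paper's exact spectrum:

```python
import numpy as np

def fft_rough_surface(n=128, dx=1.0, corr_len=8.0, rq=1.0, seed=0):
    """Generate an isotropic Gaussian rough surface by FFT filtering
    of white noise (illustrative parameters, not the paper's)."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((n, n))
    # Spatial frequencies of the square grid
    f = np.fft.fftfreq(n, d=dx)
    fx, fy = np.meshgrid(f, f)
    # Gaussian low-pass filter sets the lateral correlation length
    H = np.exp(-(np.pi * corr_len) ** 2 * (fx ** 2 + fy ** 2) / 2.0)
    z = np.real(np.fft.ifft2(np.fft.fft2(noise) * H))
    # Rescale to zero mean and the target RMS roughness Rq
    z = (z - z.mean()) / z.std() * rq
    return z

surface = fft_rough_surface()
```

The resulting height map can then be exported as keypoints for bottom-up solid modeling in a finite element package.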
Abstract: Natural gas is the most popular fossil fuel of the
current era and of the future as well. Natural gas exists in underground
reservoirs, so it may contain many non-hydrocarbon components,
for instance hydrogen sulfide, nitrogen and water vapor. These
impurities are undesirable compounds that cause several technical
problems, for example corrosion and environmental pollution.
Therefore, they should be reduced or removed from the natural
gas stream. The Khurmala dome is located in the southwest of the
Erbil-Kurdistan region. The Kurdistan regional government has paid
great attention to this dome to provide fuel for the Kurdistan region.
However, the Khurmala associated natural gas is currently flared at
the field.
Moreover, there is now a plan to recover and trade this gas,
using it either as feedstock for a power station or selling it on the
global market. However, laboratory analysis has shown that the
Khurmala sour gas contains large quantities of H2S (about 5.3%) and
CO2 (about 4.4%). Indeed, the Khurmala gas sweetening process was
simulated in a previous study using Aspen HYSYS. However, the
Khurmala sweet gas still contains some water, about 23
ppm in the sweet gas stream, and this amount should be removed or
reduced. Indeed, water content in natural gas causes several technical
problems such as hydrates and corrosion. Therefore, this study aims
to simulate the prospective Khurmala gas dehydration process by
using the Aspen HYSYS V7.3 program. The simulation succeeded in
reducing the water content to less than 0.1 ppm. In addition, the
simulation work also achieved process optimization using several
desiccant types, for example TEG and DEG, and studied the
relationship between the absorbent type and its circulation rate and
the hydrocarbon losses from the glycol regenerator tower.
Abstract: A design of communication area for infrared
electronic-toll-collection systems to provide an extended
communication interval in the vehicle traveling direction and
regular boundary between contiguous traffic lanes is proposed.
By utilizing two typical low-cost commercial infrared LEDs with
different half-intensity angles Φ1/2 = 22◦ and 10◦, the radiation
pattern of the emitter is designed to properly adjust the spatial
distribution of the signal power. The aforementioned purpose
can be achieved with an LED array in a three-piece structure
with appropriate mounting angles. With this emitter, the influence
of the mounting parameters, including the mounting height and
mounting angles of the on-board unit and road-side unit, on the
system performance in terms of the received signal strength and
communication area is investigated. The results reveal that, for
the emitter proposed in this paper, the ideal "long-and-narrow"
characteristic of the communication area is affected very little by
these mounting parameters. An optimum mounting configuration is
also suggested.
Abstract: This paper provides a basic overview of simulation optimization. The procedure for its practical use is demonstrated on a real example in the Witness simulator. Simulation optimization is presented as a good tool for solving many real-world problems, especially in production systems. The authors also describe their own experiences and mention the strengths and weaknesses of simulation optimization.
Abstract: This paper describes a prototype aircraft that can fly
slowly, safely and transmit wireless video for tasks like reconnaissance,
surveillance and target acquisition. The aircraft is designed to
fly in close quarters like forests, buildings, caves and tunnels, which
are often spacious but where GPS reception is poor. It is envisioned
that a small, safe and slow-flying vehicle can assist in performing dull,
dangerous and dirty tasks like disaster mitigation, search-and-rescue
and structural damage assessment.
Abstract: In this paper, we provide complete end-to-end delay analyses, including the relay nodes, for instant messages. The Message Session Relay Protocol (MSRP) is used to provide congestion control for large messages in the Instant Messaging (IM) service. Large messages are broken into several chunks. According to the IETF specification of the MSRP relay extensions, these chunks may traverse at most two relay nodes before reaching the destination. We discuss current solutions for sending large instant messages and introduce a proposal to reduce message flows in the IM service. For scalability, we model the relay nodes as stateless and non-blocking, with input arriving at a constant bit rate. We provide a new scheduling policy that schedules chunks according to their previous node's delivery time stamp tags, and we validate and analyze this policy. The performance analysis with the model introduced in this paper is simple and straightforward, and it leads to reduced message flows in the IM service.
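The chunk-scheduling policy described above, serving chunks in order of the previous node's delivery time stamp tag, can be sketched as a priority queue at a single relay. The `Chunk` fields below are illustrative assumptions, not the MSRP wire format:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Chunk:
    """One chunk of a large message; ordering compares only the
    time stamp tagged by the previous node (prev_ts)."""
    prev_ts: float
    seq: int = field(compare=False)
    payload: bytes = field(compare=False, default=b"")

def schedule(chunks):
    """Serve chunks earliest-previous-delivery-first, returning the
    sequence numbers in the order they would be forwarded."""
    heap = list(chunks)
    heapq.heapify(heap)
    order = []
    while heap:
        order.append(heapq.heappop(heap).seq)
    return order
```

For example, chunks tagged 3.0, 1.0 and 2.0 by the upstream node would be forwarded in the order of their tags, regardless of arrival order.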
Abstract: This article describes the aspects of the formation of
the national idea and national identity through the prism of gender
control and its contradistinction to the obsolete, Soviet component.
The role of females in ethnic and national projects is considered from
the point of view of Dr. Nira Yuval-Davis: as biological reproducers
of the members of ethnic communities; as reproducers of the borders
of ethnic/national groups; as central participants in the ideological
reproduction of the community and transmitters of its culture; as symbols
in the ideology, reproduction and transformation of ethnic/national
categories; and as participants in national, economic, political and
military struggles. A society of the transitional type draws on
symbolic resources when forming the gender component of the
national project. Gender patterns act as cultural codes,
performing an important ideological function in the formation of the
national female image: the discussion of the hijab, for instance, is
not just a discussion about control over the female body; it is a
discussion about the metaphor of social order.
Abstract: The purpose of this study was to investigate effects of
modality and redundancy principles on music theory learning among
pupils of different anxiety levels. The lesson of music theory was
developed in three different modes, audio and image (AI), text with
image (TI) and audio with image and text (AIT). The independent
variables were the three modes of courseware. The moderator
variable was the anxiety level, while the dependent variable was the
post-test score. The study sample consisted of 405 third-grade pupils.
Descriptive and inferential statistics were conducted to analyze the
collected data. Analysis of covariance (ANCOVA) and post hoc tests
were carried out to examine the main effects as well as the
interaction effects of the independent variables on the dependent
variable. The findings of this study showed that medium anxiety
pupils performed significantly better than low and high anxiety
pupils in all the three treatment modes. The AI mode was found to
help pupils with high anxiety significantly more than the TI and AIT
modes.
Abstract: Market-based models are frequently used for resource
allocation on the computational grid. However, as the size of
the grid grows, it becomes difficult for the customer to negotiate
directly with all the providers. Middle agents are introduced to
mediate between the providers and customers and facilitate the
resource allocation process. The most frequently deployed middle
agents are the matchmakers and the brokers. The matchmaking agent
finds possible candidate providers who can satisfy the requirements
of the consumers, after which the customer directly negotiates with
the candidates. The broker agents mediate the negotiation with
the providers in real time.
In this paper we present a new type of middle agent, the marketmaker.
Its operation is based on two parallel processes: through
the investment process the marketmaker acquires resources and
resource reservations in large quantities, while through the resale
process it sells them to the customers. The operation of the marketmaker
rests on the fact that, through its global view of the grid, it can
perform a more efficient resource allocation than is possible in
one-to-one negotiations between the customers and providers.
We present the operation and algorithms governing the operation
of the marketmaker agent, contrasting it with the matchmaker and
broker agents. Through a series of simulations in the task oriented
domain we compare the operation of the three agents types. We find
that the use of marketmaker agent leads to a better performance in the
allocation of large tasks and a significant reduction of the messaging
overhead.
Abstract: In-vitro mouse co-culture of E14 embryonic stem cells
(ESCs) and OP9 stromal cells can recapitulate the earliest stages of
haematopoietic development, not accessible in human embryos,
supporting both haemogenic precursors and their primitive
haematopoietic progeny. 1α, 25-Dihydroxy-vitamin D3 (VD3) has
been demonstrated to be a powerful differentiation inducer for a wide
variety of neoplastic cells, and could enhance early differentiation of
ESCs into blood cells in E14/OP9 co-culture. This study aims to
ascertain whether VD3 is key in promoting differentiation and
suppressing proliferation, by separately investigating the effects of
VD3 on the proliferation phase of the E14 cell line and on stromal
OP9 cells. The results showed that VD3 inhibited the proliferation of
the cells in a dose-dependent manner, quantitatively by decreased cell
number, and qualitatively by alkaline-phosphatase staining that
revealed significant differences between VD3-treated and untreated
cells, characterised by decreased enzyme expression (colourless
cells). Propidium-iodide cell-cycle analyses showed no significant
percentage change in VD3-treated E14 and OP9 cells within their G-
and S-phases, compared to the untreated controls, despite the
increased percentage of the G-phase compared to the S-phase in a
dose-dependent manner. These results with E14 and OP9 cells indicate that
adequate VD3 concentration enhances cellular differentiation and
inhibits proliferation. The results also suggest that if E14 and OP9
cells were co-cultured and VD3-treated, there would be further
enhanced differentiation of ESCs into blood cells.
Abstract: Space exploration is a highly visible endeavour of
humankind to seek profound answers to questions about the origins
of our solar system, whether life exists beyond Earth, and how we
could live on other worlds. Different platforms have been utilized in
planetary exploration missions, such as orbiters, landers, rovers, and
penetrators.
Having low mass, good mechanical contact with the surface, the
ability to acquire high-quality scientific subsurface data, and the
ability to be deployed in areas that may not be suitable for landers or
rovers, penetrators provide an alternative and complementary solution
that makes possible the scientific exploration of hard-to-access sites
(icy areas, gully sites, highlands, etc.).
The Canadian Space Agency (CSA) has made space exploration
one of the pillars of its space program and established the ExCo
program to prepare Canada for future international planetary exploration.
ExCo sets surface mobility as its focus and priority, and invests
mainly in the development of rovers because of Canada's niche space
robotics technology. Meanwhile, CSA is also investigating how
micro-penetrators can help Canada to fulfill its scientific objectives
for planetary exploration.
This paper presents a review of the micro-penetrator technologies,
past missions, and lessons learned. It gives a detailed analysis of the
technical challenges of micro-penetrators, such as high impact
survivability, high-precision guidance, navigation and control, thermal
protection, and communications. Then, a Canadian perspective on
a possible micro-penetrator mission is given, including Canadian
scientific objectives and priorities, potential instruments, and flight
opportunities.
Abstract: This paper presents a method for determining the
uniaxial tensile properties such as Young's modulus, yield strength
and the flow behaviour of a material in a virtually non-destructive
manner. To achieve this, a new dumb-bell shaped miniature
specimen has been designed. This helps in avoiding the removal of
large size material samples from the in-service component for the
evaluation of current material properties. The proposed miniature
specimen has an advantage in finite element modelling with respect
to computational time and memory space. Test fixtures have been
developed to enable the tension tests on the miniature specimen in a
testing machine. The studies have been conducted in a chromium
(H11) steel and an aluminum alloy (AR66). The output from the
miniature test viz. load-elongation diagram is obtained and the finite
element simulation of the test is carried out using a 2D plane stress
analysis. The results are compared with the experimental results. It is
observed that the results from the finite element simulation
agree well with the miniature test results. The approach appears
to have the potential to predict the mechanical properties of
materials, which could be used in remaining-life estimation of
various in-service structures.
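As a minimal illustration of extracting one tensile property from a load-elongation record, the initial linear slope gives Young's modulus. This is the generic textbook relation E = (F/A)/(ΔL/L0), not the paper's finite element procedure, and the numbers in the usage note are hypothetical:

```python
def youngs_modulus(load_n, elong_mm, area_mm2, gauge_len_mm):
    """Estimate Young's modulus (in MPa, since N/mm^2 = MPa) from a
    point on the initial linear part of a load-elongation curve."""
    stress = load_n / area_mm2        # engineering stress, MPa
    strain = elong_mm / gauge_len_mm  # engineering strain, dimensionless
    return stress / strain
```

For instance, a 1000 N load producing 0.05 mm elongation on a 10 mm^2, 100 mm gauge-length specimen corresponds to a modulus of about 200 GPa, typical of steels.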
Abstract: This paper describes new computer vision algorithms
that have been developed to track moving objects as part of a
long-term study into the design of (semi-)autonomous vehicles. We
present the results of a study to exploit variable kernels for tracking in
video sequences. The basis of our work is the mean shift
object-tracking algorithm; for a moving target, it is usual to define a
rectangular target window in an initial frame, and then process the data
within that window to separate the tracked object from the background
by the mean shift segmentation algorithm. Rather than use the
standard Epanechnikov kernel, we have used a kernel weighted by the
Chamfer distance transform to improve the accuracy of target
representation and localization, minimising the distance between the
two distributions in RGB color space using the Bhattacharyya
coefficient. Experimental results show the improved tracking
capability and versatility of the algorithm in comparison with results
using the standard kernel. These algorithms are incorporated as part of
a robot test-bed architecture which has been used to demonstrate their
effectiveness.
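The histogram-comparison step of such a tracker, matching the candidate window's color distribution to the target model via the Bhattacharyya coefficient, can be sketched as follows. The Epanechnikov baseline kernel is shown; the paper's variant would substitute Chamfer distance-transform weights:

```python
import numpy as np

def epanechnikov_weights(h, w):
    """Radially symmetric Epanechnikov kernel over an h-by-w target
    window: heaviest at the center, zero toward the corners."""
    y, x = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r2 = ((y - cy) / (h / 2.0)) ** 2 + ((x - cx) / (w / 2.0)) ** 2
    return np.maximum(1.0 - r2, 0.0)

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two normalized histograms:
    1.0 for identical distributions, 0.0 for disjoint support."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(np.sqrt(p * q)))
```

In mean shift tracking, pixel contributions to the RGB histogram are scaled by the kernel weight at their position, and the tracker moves the window to maximize the Bhattacharyya coefficient against the target model.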
Abstract: Model-based approaches have been applied successfully
to a wide range of tasks such as specification, simulation, testing, and
diagnosis. But one bottleneck often prevents the introduction of these
ideas: Manual modeling is a non-trivial, time-consuming task.
Automatically deriving models by observing and analyzing running
systems is one possible way to address this bottleneck. To
derive a model automatically, some a priori knowledge about the
model structure, i.e. about the system, must exist. Such a model
formalism would be used as follows: (i) by observing the network
traffic, a model of the long-term system behavior could be generated
automatically; (ii) test vectors could be generated from the model;
(iii) while the system is running, the model could be used to diagnose
abnormal system behavior.
The main contribution of this paper is the introduction of a model
formalism called 'probabilistic regression automaton' suitable for the
tasks mentioned above.
Abstract: Bioinformatics and computational biology involve
the use of techniques including applied mathematics,
informatics, statistics, computer science, artificial intelligence,
chemistry, and biochemistry to solve biological problems
usually on the molecular level. Research in computational
biology often overlaps with systems biology. Major research
efforts in the field include sequence alignment, gene finding,
genome assembly, protein structure alignment, protein structure
prediction, prediction of gene expression and protein-protein
interactions, and the modeling of evolution. Various
global rearrangements of permutations, such as reversals and
transpositions, have recently become of interest because of their
applications in computational molecular biology. A reversal is
an operation that reverses the order of a substring of a permutation.
A transposition is an operation that swaps two adjacent
substrings of a permutation. The problem of determining the
smallest number of reversals required to transform a given
permutation into the identity permutation is called sorting by
reversals. Similar problems can be defined for transpositions
and other global rearrangements. In this work we perform a
study about some genome rearrangement primitives. We show
how a genome is modelled by a permutation, introduce some
of the existing primitives and the lower and upper bounds
on them. We then provide a comparison of the introduced
primitives.
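The reversal and transposition primitives, together with the breakpoint count that underlies classic lower bounds for sorting by reversals, can be sketched directly on permutations represented as Python lists. This is a generic illustration, not this paper's specific bounds:

```python
def reversal(perm, i, j):
    """Reverse the substring perm[i..j] (inclusive)."""
    return perm[:i] + perm[i:j + 1][::-1] + perm[j + 1:]

def transposition(perm, i, j, k):
    """Swap the adjacent substrings perm[i..j-1] and perm[j..k-1]."""
    return perm[:i] + perm[j:k] + perm[i:j] + perm[k:]

def breakpoints(perm):
    """Count adjacent pairs that are not consecutive integers, after
    framing the permutation with 0 and n+1; a classic ingredient of
    lower bounds for sorting by reversals."""
    ext = [0] + list(perm) + [len(perm) + 1]
    return sum(1 for a, b in zip(ext, ext[1:]) if abs(a - b) != 1)
```

Sorting by reversals asks for the shortest sequence of `reversal` operations taking a permutation to the identity; since one reversal removes at most two breakpoints, `breakpoints(perm) / 2` rounded up is a lower bound on that distance.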
Abstract: In this study, the contact problem of a layered composite which consists of two materials with different elastic constants and heights resting on two rigid flat supports with sharp edges is considered. The effect of gravity is neglected. While friction between the layers is taken into account, it is assumed that there is no friction between the supports and the layered composite so that only compressive tractions can be transmitted across the interface. The layered composite is subjected to a uniform clamping pressure over a finite portion of its top surface. The problem is reduced to a singular integral equation in which the contact pressure is the unknown function. The singular integral equation is evaluated numerically and the results for various dimensionless quantities are presented in graphical forms.
Abstract: A generalized digital modulation identification algorithm for an adaptive demodulator has been developed and is presented in this paper. The algorithm is verified using the wavelet transform and histogram computation to identify QPSK and QAM along with GMSK and M-ary FSK modulations. It has been found that the histogram peaks simplify the identification procedure. The simulated results show that correct modulation identification is possible down to a lower bound of 5 dB and 12 dB for GMSK and QPSK respectively. When the SNR is above 5 dB, the throughput of the proposed algorithm is more than 97.8%. The receiver operating characteristic (ROC) has been computed to measure the performance of the proposed algorithm, and the analysis shows that the probability of detection (Pd) drops rapidly when the SNR is 5 dB and the probability of false alarm (Pf) is smaller than 0.3. The performance of the proposed algorithm has been compared with existing methods and found to identify all digital modulation schemes at low SNR.
Abstract: An integrated Artificial Neural Network-Particle Swarm Optimization (PSO) approach is presented for analyzing global electricity consumption. To this end, the following steps are carried out. STEP 1: PSO is applied to determine the world's oil, natural gas, coal and primary energy demand equations based on socio-economic indicators. The world's population, gross domestic product (GDP), oil trade movement and natural gas trade movement are used as socio-economic indicators in this study. For each socio-economic indicator, a feed-forward back-propagation artificial neural network is trained and projected over a future time domain. STEP 2: global electricity consumption is projected based on the oil, natural gas, coal and primary energy consumption using PSO, and is forecast up to the year 2040.
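The PSO step can be illustrated with a generic minimal implementation: a standard global-best particle swarm minimizing an arbitrary objective, as would be done when fitting demand-equation coefficients. The inertia and acceleration constants below are common defaults, not the paper's settings:

```python
import numpy as np

def pso(f, dim, n=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Global-best PSO minimizing objective f over R^dim.
    Returns the best position found and its objective value."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n, dim))     # particle positions
    v = np.zeros((n, dim))                   # particle velocities
    pbest = x.copy()                         # personal bests
    pbest_f = np.array([f(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()       # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n, dim))
        # Inertia plus cognitive and social attraction terms
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, float(pbest_f.min())
```

In the study's setting, `f` would be the squared error between a candidate demand equation's predictions and the historical consumption data.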
Abstract: Ethanol has been known for a long time, being
perhaps the oldest product obtained through traditional biotechnology
fermentation. Agricultural waste as a fermentation substrate is widely
discussed as an alternative to edible food and a use for
organic material. Pineapple peel, a by-product of the pineapple
processing industry, is a highly promising substrate. Bio-ethanol
production from pineapple (Ananas comosus) peel extract was carried
out under controlled fermentation without any treatment. Saccharomyces
ellipsoideus was used as the inoculum in this process, as it is
naturally found on the pineapple skin. In this study, the capability of
Response Surface Methodology (RSM) for optimization of ethanol
production from pineapple peel extract using Saccharomyces
ellipsoideus in batch fermentation was investigated. The effects of
five test variables in defined ranges, namely inoculum concentration
6-14% (v/v), pH (4.0-6.0), sugar concentration (14-22°Brix),
temperature (24-32°C) and incubation time (30-54 hrs), on the
ethanol production were evaluated. Data obtained from the experiments
were analyzed with the RSM of MINITAB Software (Version 15),
whereby an optimum ethanol concentration of 8.637% (v/v) was
determined. The optimum conditions were 14% (v/v) inoculum
concentration, pH 6, 22°Brix, 26°C and 30 hours of incubation. A
significant regression model at the 5% level, with a correlation
value of 99.96%, was also obtained.
Abstract: In this paper we examine the use of global texture analysis based approaches for the purpose of Persian font recognition in machine-printed document images. Most existing methods for font recognition make use of local typographical features and connected component analysis; however, derivation of such features is not an easy task. Gabor filters are appropriate tools for texture analysis and are motivated by the human visual system. Here we consider document images as textures and use Gabor filter responses for identifying the fonts. The method is content-independent and involves no local feature analysis. Two different classifiers, Weighted Euclidean Distance (WED) and SVM, are used for classification. Experiments on seven different typefaces and four font styles show an average accuracy of 85% with the WED classifier and 82% with the SVM classifier over all typefaces.
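A Weighted Euclidean Distance classifier over Gabor-feature vectors is commonly implemented by scaling each feature difference by the per-class feature variance. The sketch below uses that common formulation as an assumption, since the paper's exact weighting is not specified here, and the class statistics in the usage note are hypothetical:

```python
import numpy as np

def wed(x, mean, var, eps=1e-8):
    """Weighted Euclidean distance from feature vector x to a class
    described by its per-feature mean and variance (a common WED
    formulation; the paper's exact weights are an assumption here)."""
    x, mean, var = map(np.asarray, (x, mean, var))
    return float(np.sqrt(np.sum((x - mean) ** 2 / (var + eps))))

def classify(x, class_stats):
    """Assign x (e.g. a Gabor-response feature vector) to the font
    class whose (mean, var) statistics give the smallest WED."""
    return min(class_stats, key=lambda c: wed(x, *class_stats[c]))
```

For example, with two hypothetical font classes whose feature means are (0, 0) and (10, 10), the vector (1, 1) is assigned to the first class.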