Abstract: This paper presents a novel method for remaining
useful life prediction using the Elliptical Basis Function (EBF)
network and a Markov chain. The EBF structure is trained by a
modified Expectation-Maximization (EM) algorithm in order to take
into account the missing covariate set. No explicit extrapolation is
needed for the internal covariates, while a Markov chain is constructed
to represent the evolution of the external covariates in the study. The
estimated external and the unknown internal covariates constitute an
incomplete covariate set, which is then analyzed by the EBF network
to provide survival information for the asset. The case study shows
that the method slightly underestimates the remaining useful life of an
asset, which is a desirable property for early maintenance decisions
and resource planning.
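The abstract does not specify the Markov chain, so as a minimal sketch, assuming a hypothetical three-state external covariate (e.g. low/medium/high operating load) with an illustrative transition matrix, the covariate distribution can be propagated over time as:

```python
import numpy as np

# Hypothetical 3-state Markov chain for one external covariate
# (e.g. low / medium / high operating load); the transition
# probabilities below are illustrative, not from the paper.
P = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.80, 0.10],
              [0.05, 0.15, 0.80]])

def propagate(p0, P, steps):
    """Distribution over covariate states after `steps` transitions."""
    p = np.asarray(p0, dtype=float)
    for _ in range(steps):
        p = p @ P
    return p

# Starting from the "low" state, the distribution after 5 steps:
p5 = propagate([1.0, 0.0, 0.0], P, 5)
```

Each propagated distribution would then feed the EBF network as the estimated external part of the incomplete covariate set.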
Abstract: Controlled modification of nanotip sharpness is of
paramount importance for developing novel materials and
functional devices at nanometer resolution. Herein, we present a
reliable and unique strategy of laser-irradiation-enhanced
physicochemical etching to manufacture super-sharp tungsten tips
with reproducible shape and dimensions as well as high yield
(~80%). The morphological evolution of the tungsten tips and the
laser-tip interaction mechanisms at different fluences under 532 nm
laser irradiation were systematically investigated and discussed
using field emission scanning electron microscopy (SEM) and a
physical optics statistics method. This work paves the way for
exploring more accessible metallic tip applications with tunable
apex diameter and aspect ratio and, furthermore, suggests a potential
sharpening enhancement technique for other materials used in a
variety of nanoscale devices.
Abstract: A linear and weakly nonlinear analysis of shallow wake
flows is presented in this paper. The evolution of the most
unstable linear mode is described by the complex Ginzburg-Landau
equation (CGLE). The coefficients of the CGLE are calculated
numerically from the solution of the corresponding linear stability
problem for a one-parameter family of shallow wake flows. It is
shown that the coefficients of the CGLE are relatively insensitive to
variation of the base flow profile.
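For reference, one common normalization of the complex Ginzburg-Landau equation governing the amplitude $A$ of the most unstable mode is the following (the paper's own scaling may differ):

```latex
\frac{\partial A}{\partial t}
  = A
  + (1 + i c_1)\,\frac{\partial^2 A}{\partial x^2}
  - (1 + i c_3)\,\lvert A \rvert^2 A
```

Here $c_1$ and $c_3$ are the real coefficients computed numerically from the linear stability problem; the paper's finding is that they vary only weakly across the one-parameter family of base flow profiles.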
Abstract: As the new industrial revolution proceeds, advances in
nanotechnology have been followed with interest throughout the
world, including in Turkey. The media play an important role in
conveying these advances to the public, raising public awareness,
and shaping attitudes toward nanotechnology. Beyond representing
how a subject is treated, media frames determine how the public
thinks about that subject. In the literature, studies focusing on
different countries have identified distinct frames for nanoscience
and nanotechnology, such as progress, regulation, conflict, and risk.
How, then, is nanotechnology news framed, and in which news
categories, in Turkey, as one of the developing countries? This study
aims to examine, via the framing analysis developed in agenda-setting
research, the variables in Turkish media coverage of nanotechnology
that affect public attitudes, such as category, frame, story tone, and
source. The analysis uses data from 2005 to 2009 obtained from the
five national newspapers with the widest circulation in Turkey. The
study characterizes the tone of Turkish media coverage of
nanotechnology, the frames in which nanotechnological advances are
reported as news, and the sectoral, legal, economic, and social
pictures these frames present to the public in Turkey.
Abstract: A phylogenetic tree is a graphical representation of the
evolutionary relationships among three or more genes or organisms.
Such trees show the relatedness of data sets, the divergence times of
species or genes, and the nature of their common ancestors. The
quality of a phylogenetic tree is judged by the parsimony criterion,
and various approaches have been proposed for constructing the most
parsimonious trees. This paper is concerned with calculating and
optimizing the number of state changes needed, a task addressed by
small parsimony algorithms. It proposes an enhanced small parsimony
algorithm that gives a better score, based on the number of
evolutionary changes needed to produce the observed sequence
changes in the tree, and that also reconstructs the ancestors of the
given input.
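The paper's enhanced algorithm is not spelled out in the abstract; as a point of reference, the classical small-parsimony pass of Fitch's algorithm for a single character on a rooted binary tree can be sketched as follows (the nested-tuple tree encoding is an assumption for illustration):

```python
def fitch(tree):
    """One-character Fitch small-parsimony pass on a rooted binary
    tree given as nested tuples whose leaves are observed states,
    e.g. (('A', 'C'), ('A', 'A')).
    Returns (candidate state set at this node, substitution count)."""
    if isinstance(tree, str):          # leaf: its observed state, no cost
        return {tree}, 0
    left_states, left_cost = fitch(tree[0])
    right_states, right_cost = fitch(tree[1])
    common = left_states & right_states
    if common:                         # children agree: no extra change
        return common, left_cost + right_cost
    # children disagree: take the union and count one substitution
    return left_states | right_states, left_cost + right_cost + 1
```

For example, `fitch((('A', 'C'), ('A', 'A')))` returns `({'A'}, 1)`: one state change suffices, and 'A' is a most parsimonious root state.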
Abstract: The subcellular organelles called oil bodies (OBs) are lipid-filled quasi-spherical droplets produced from the endoplasmic reticulum (ER) and then released into the cytoplasm during seed development. It is believed that an OB grows by coalescence with other OBs and that its stability depends on the composition of oleosins, the major proteins inserted in the hemi-membrane that covers OBs. In this study, we measured the OB-volume distribution in different genotypes of A. thaliana after 7, 8, 9, 10 and 11 days of seed development. In order to test this hypothesis of OB dynamics, we developed a simple mathematical model using non-linear differential equations inspired by the theory of coagulation. The model describes the evolution of the OB-volume distribution during the first steps of seed development by taking into consideration the production of OBs, the increase of the triacylglycerol volume to be stored, and the growth of OBs by coalescence. Fitted parameter values show an increase in the OB production and coalescence rates in A. thaliana oleosin mutants compared to the wild type.
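The abstract does not write out the model; for orientation, the classical Smoluchowski coagulation equation with a source term, to which such coalescence models are related, reads as follows, where $n(v,t)$ is the number density of OBs of volume $v$, $K$ the coalescence kernel, and $s(v,t)$ a production term (the paper's actual equations may differ):

```latex
\frac{\partial n(v,t)}{\partial t}
  = \frac{1}{2}\int_0^{v} K(v-u,\,u)\, n(v-u,t)\, n(u,t)\,\mathrm{d}u
  - n(v,t)\int_0^{\infty} K(v,u)\, n(u,t)\,\mathrm{d}u
  + s(v,t)
```

The first term creates OBs of volume $v$ by coalescence of two smaller ones, the second removes OBs of volume $v$ that coalesce with any other, and $s$ models ER production.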
Abstract: Complexity, as a theoretical background, has made it
easier to understand and explain the features and dynamic behavior
of various complex systems. As the common theoretical background
has confirmed, borrowing design terminology from the
natural sciences has helped to control and understand urban
complexity. Phenomena like self-organization, evolution and
adaptation are appropriate to describe the formerly inaccessible
characteristics of the complex environment in unpredictable bottom-up
systems. Increased computing capacity has been a key element in
capturing the chaotic nature of these systems.
A paradigm shift in urban planning and architectural design has
forced us to give up the illusion of total control in urban
environment, and consequently to seek for novel methods for
steering the development. New methods using dynamic modeling
have offered a real option for more thorough understanding of
complexity and urban processes. At best new approaches may renew
the design processes so that we get a better grip on the complex
world via more flexible processes, support urban environmental
diversity and respond to our needs beyond basic welfare by liberating
ourselves from the standardized minimalism.
A complex system and its features are as such beyond human
ethics. Self-organization and evolution are neither good nor bad; their
mechanisms are by nature devoid of reason. They are common in
urban dynamics, in natural and human processes alike. They are
features of a complex system, and they cannot be prevented. Yet their
dynamics can be studied and supported.
The paradigm of complexity and the new design approaches have been
criticized for a lack of humanity and morality, but the ethical
implications of scientific or computational design processes have not
been much discussed. It is important to distinguish the (unexciting)
ethics of the theory and tools from the ethics of computer aided
processes based on ethical decisions. Urban planning and architecture
cannot be based on the survival of the fittest; however, the natural
dynamics of the system cannot be impeded on the grounds of being
“non-human”.
In this paper the ethical challenges of using dynamic models are
contemplated in light of a few examples of new architecture,
dynamic urban models, and the literature. It is suggested that ethical
challenges in computational design processes could be reframed
under the concepts of responsibility and transparency.
Abstract: Perspectives on food security in the 21st century point
to a shortage of food, so that production faces a vital problem. A food
security strategy is a long-term method of assessing the food required.
Meanwhile, the nanotechnology revolution is changing the face of the
world. Nanotechnology offers characteristics that can reduce
environmental problems and may give small farmers better access to
food. This article examines the impact of the production and adoption
of nanocrops on food security. The population studied consists of
researchers at the agricultural research center of Esfahan province.
The results of the study show a relationship between food security
and the use, conversion, distribution, and production of nanocrops,
operative human resources, operative circumstances, and constraints
on the usage of nanocrops. Multivariate regression analysis using the
enter model shows that operative circumstances, the use and
production of nanocrops, and constraints on their usage had a positive
impact on food security, determining, in four steps, 20 percent of its
variance.
Abstract: This paper presents a mathematical model and a
methodology to analyze the losses in transmission expansion
planning (TEP) under uncertainty in demand. The methodology is
based on discrete particle swarm optimization (DPSO). DPSO is a
useful and powerful stochastic evolutionary algorithm for solving
large-scale, discrete, and nonlinear optimization problems like TEP.
The effectiveness of the proposed idea is tested on an actual
transmission network of the Azerbaijan regional electric company,
Iran. The simulation results show that considering losses in the
transmission expansion planning of a network, even one with low
load growth, reduces operational costs considerably and lets the
network deliver electric power to load centers more reliably.
Abstract: A saturated liquid is heated to boiling in a parallelepipedic boiler. The heat is supplied to the liquid through the horizontal bottom of the boiler, the other walls being adiabatic. During boiling, the liquid evaporates through its free surface and deforms it. This surface subdivides the boiler into two regions occupied, on either side, by the boiled liquid (broth) and by the vapor that surmounts it: the broth occupies the lower region and its vapor the upper region. A two-fluid model is used to describe the dynamics of the broth, its vapor, and their interface. In this model, the broth is treated as a monophasic fluid (homogeneous model) and forms with its vapor a diphasic pseudo-fluid (two-fluid model). Furthermore, the interface is treated as a mixture zone characterized by a superficial void fraction denoted α*. The aim of this article is to describe the dynamics of the interface between the boiled fluid and its vapor within a boiler. Solving the problem allowed us to show the evolution of the broth and of the liquid level.
Abstract: The reduction of Single Input Single Output (SISO) discrete systems to a lower-order model, using a conventional and an evolutionary technique, is presented in this paper. The conventional technique combines the advantages of the Modified Cauer Form (MCF) and differentiation. In this method the original discrete system is first converted into an equivalent continuous system by applying the bilinear transformation. The denominator of the equivalent continuous system and its reciprocal are differentiated successively, and the reduced denominator of the desired order is obtained by combining the differentiated polynomials. The numerator is obtained by matching the quotients of the MCF. The reduced continuous system is converted back into a discrete system using the inverse bilinear transformation. In the evolutionary technique, Particle Swarm Optimization (PSO) is employed to reduce the higher-order model. The PSO method is based on minimizing the Integral Squared Error (ISE) between the transient responses of the original higher-order model and the reduced-order model for a unit step input. Both methods are illustrated through a numerical example.
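As a sketch of the evolutionary step, a bare-bones PSO loop of the kind described can be written as follows; the objective here is a toy quadratic standing in for the ISE between step responses, and all parameter values (inertia `w`, acceleration constants `c1`, `c2`, swarm size) are illustrative assumptions, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def ise(params):
    """Placeholder for the Integral Squared Error between the step
    responses of the full and reduced models; here a simple quadratic
    with its minimum at (1.0, 0.5), for illustration only."""
    a, b = params
    return (a - 1.0) ** 2 + (b - 0.5) ** 2

def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Bare-bones particle swarm: track personal bests and a global
    best, and move each particle toward both."""
    x = rng.uniform(-2.0, 2.0, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest = x.copy()
    pval = np.array([f(p) for p in x])
    gbest = pbest[pval.argmin()].copy()
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        val = np.array([f(p) for p in x])
        better = val < pval
        pbest[better] = x[better]
        pval[better] = val[better]
        gbest = pbest[pval.argmin()].copy()
    return gbest, float(pval.min())

best, best_val = pso(ise)
```

In the paper's setting, `ise(params)` would instead simulate the unit-step responses of the original and reduced models and accumulate the squared difference.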
Abstract: With the movement of power systems toward restructuring, along with factors such as environmental pollution, the problems of transmission expansion, and advances in the construction technology of small generation units, it is expected that small units such as wind turbines, fuel cells, and photovoltaics, which mostly connect to distribution networks, will play an essential role in the electric power industry. With the increasing use of small generation units, the management of distribution networks should be reviewed. The goal of this paper is to present a new method for the optimal management of active and reactive power in distribution networks, with regard to the costs of the various types of dispersed generation, of capacitors, and of the electric energy obtained from the network. In other words, the method endeavors to select the optimal sources of active and reactive power generation and the controlling equipment, such as dispersed generation units, capacitors, under-load tap-changer transformers, and substations, so that, first, the associated costs are minimized and, second, the technical and physical constraints are respected. Because the optimal management of distribution networks is an optimization problem with continuous and discrete variables, a new evolutionary method based on the Ant Colony Algorithm is applied. Simulation results of the method, tested on two cases containing 23 and 34 buses, are presented in later sections.
Abstract: In this paper we present an adaptive method for image
compression that is based on the complexity level of the image. The
basic compressor/de-compressor structure of this method is a
multilayer perceptron artificial neural network. In the adaptive
approach, different back-propagation artificial neural networks are
used as compressor and de-compressor; this is done by dividing the
image into blocks, computing the complexity of each block, and then
selecting one network for each block according to its complexity
value. Three complexity measures, called Entropy, Activity, and
Pattern-based, are used to determine the level of complexity of
image blocks, and their ability to estimate complexity is evaluated
and compared. In training and evaluation, each image block is
assigned to a network based on its complexity value. Best-SNR is an
alternative way of selecting the compressor network for image blocks
in the evaluation phase: it chooses the trained network that yields the
best SNR when compressing the input image block. In our
evaluations, the best results are obtained when overlapping of blocks
is allowed and the compressor network is chosen by Best-SNR. In
this case, the results demonstrate the superiority of this method over
previous similar works and the JPEG standard coding.
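For the Entropy measure, a common formulation (which we can only assume resembles the paper's) is the Shannon entropy of the block's grey-level histogram:

```python
import numpy as np

def block_entropy(block, levels=256):
    """Shannon entropy (in bits) of the grey-level histogram of an
    image block; a common stand-in for an Entropy complexity measure."""
    hist = np.bincount(block.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()          # empirical grey-level probabilities
    p = p[p > 0]                   # 0 * log(0) is taken as 0
    return float(-(p * np.log2(p)).sum())

flat = np.zeros((8, 8), dtype=np.uint8)             # uniform block
busy = np.arange(64, dtype=np.uint8).reshape(8, 8)  # 64 distinct levels
```

A flat block scores 0 bits while a block with 64 distinct grey levels scores 6 bits, so busier blocks can be routed to a network trained on high-complexity content.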
Abstract: The prediction of the aerodynamic characteristics and the
shape optimization of an airfoil under the ground effect have been
carried out by integrating computational fluid dynamics with a
multi-objective Pareto-based genetic algorithm. The main flow
characteristics around the airfoil of a WIG craft are the lift force, the
lift-to-drag ratio, and the static height stability (H.S). However, they
exhibit a strong trade-off, so it is not easy to satisfy the design
requirements simultaneously. This difficulty can be resolved by
optimal design. The three characteristics mentioned above are chosen
as the objective functions, and the NACA0015 airfoil is taken as the
baseline model in the present study. The airfoil profile is
constructed from Bezier curves with fourteen control points, and
these control points are adopted as the design variables. For
multi-objective optimization problems, the optimal solutions are not
unique but form a set of non-dominated optima, called Pareto
frontiers or Pareto sets. As a result of the optimization, forty
non-dominated Pareto optima were obtained after thirty generations.
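The Bezier parameterization of the profile can be sketched with de Casteljau's algorithm; the four control points below are purely illustrative, not the paper's fourteen NACA0015 points:

```python
import numpy as np

def bezier(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] using
    de Casteljau's algorithm; control_points is an (n, 2) array."""
    pts = np.asarray(control_points, dtype=float)
    while len(pts) > 1:
        # repeatedly interpolate between consecutive control points
        pts = (1 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

# Illustrative 4-point profile (not the paper's 14 design variables).
cps = [(0.0, 0.0), (0.3, 0.1), (0.7, 0.1), (1.0, 0.0)]
```

Moving the interior control points reshapes the curve while the endpoints stay fixed, which is what makes the control points convenient GA design variables.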
Abstract: A transient finite element model has been developed
to study the heat transfer and fluid flow during spot Gas Tungsten
Arc Welding (GTAW) on stainless steel. The temperature, fluid
velocity, and electromagnetic fields are computed inside the cathode,
the arc plasma, and the anode using a unified MHD formulation. The
developed model is then used to study the influence of different
helium-argon gas mixtures on both the energy transferred to the
workpiece and the time evolution of the weld pool dimensions. It is
found that the addition of helium to argon increases the heat flux
density on the weld axis by a factor that can reach 6.5. This induces
an increase in the weld pool depth by a factor of 3. It is also found
that the addition of only 10% argon to helium considerably
decreases the weld pool depth, because the electrical conductivity of
the mixture increases significantly when argon is added to helium.
Abstract: This paper explores the university course timetabling
problem. There are several characteristics that make scheduling and
timetabling problems particularly difficult to solve: they have huge
search spaces, they are often highly constrained, they require
sophisticated solution representation schemes, and they usually
require very time-consuming fitness evaluation routines. Thus
standard evolutionary algorithms lack the efficiency to deal with
them. In this paper we propose a memetic algorithm that
incorporates problem-specific knowledge so that most of the
chromosomes generated decode into feasible solutions. Generating
a vast number of feasible chromosomes allows the search process
to progress in a time-efficient manner. Experimental results exhibit
the advantages of the developed Hybrid Genetic Algorithm over the
standard Genetic Algorithm.
Abstract: Over the past decade, mobile communication has
experienced a revolution that will ultimately change the way we
communicate. All these technologies share a common denominator:
the exploitation of computer information systems. Their operation,
however, can be tedious because of problems with heterogeneous
data sources. To overcome these problems, we propose a technique
that adds an extra layer interfacing management or supervision
applications with the different data sources. This layer is
materialized by the implementation of a mediator between the
different host applications and the information systems, frequently
organized in hierarchical and relational form, such that the
heterogeneity is completely transparent to the VoIP platform.
Abstract: In two studies we tested the hypothesis that the
appropriate linguistic formulation of a deontic rule – i.e. the
formulation which clarifies the monadic nature of deontic operators
– should produce more correct responses than the conditional
formulation in Wason selection task. We tested this assumption by
presenting a prescription rule and a prohibition rule in conditional
vs. proper deontic formulation. We contrasted this hypothesis with
two other hypotheses derived from social contract theory and
relevance theory. According to the first theory, a deontic rule
expressed in terms of costs and benefits should elicit a cheater-
detection module, sensitive to mental-state attributions and thus able
to discriminate intentional rule violations from accidental rule
violations. We tested this prediction by distinguishing the two types
of violations. According to relevance theory, performance in the
selection task should improve by increasing cognitive effect and
decreasing cognitive effort. We tested this prediction by focusing the
experimental instructions on the rule vs. the action covered by the
rule. In study 1, in which 480 undergraduates participated, we
tested these predictions through a 2 x 2 x 2 x 2 (type of the rule x
rule formulation x type of violation x experimental instructions)
between-subjects design. In study 2 – carried out by means of a 2 x
2 (rule formulation x type of violation) between-subjects design -
we retested the rule-formulation hypothesis vs. the cheater-detection
hypothesis through a new version of the selection task in
which intentional vs. accidental rule violations were better
discriminated. 240 undergraduates participated in this study.
Results corroborate our hypothesis and challenge the contrasting
assumptions. However, they show that the conditional formulation
of deontic rules produces lower performance than reported in the
literature.
Abstract: Nowadays, the earth is confronted with the serious
problem of air pollution. This problem began with the industrial
revolution and has accelerated in recent years, leading the earth
toward ecological and environmental disaster. One of its results is
global warming and the related increase in global temperature. The
most important factors in air pollution, especially in urban
environments, are automobiles and residential buildings, which are
the biggest consumers of fossil energy; if residential buildings, as a
large share of the consumers of such energy, reduce their
consumption rate, air pollution will decrease. Since metropolises are
the main centers of air pollution in the world, the assessment and
analysis of efficient strategies for decreasing air pollution in such
cities can lead to desirable and suitable results and can solve the
problem, at least at the critical level. Tabriz is one of the most
important metropolises in northwestern Iran, with about two million
inhabitants. Because of its situation in a cold, dry climate, it has a
high rate of fossil energy consumption, which causes air pollution in
its urban environment. These two factors, being both a metropolis
and in a cold, dry climate, lead this article to analyze the climatic
design strategies of the old districts of the city and to apply them to
the new districts of the future. These strategies can be used in this
city and in similar cities, paving the way to reduce energy
consumption and the related air pollution for the benefit of the whole
world.
Abstract: Due to the non-intuitive nature of quantum algorithms,
it is difficult for a classically trained person to construct new ones
efficiently. So, rather than designing new algorithms manually,
genetic algorithms (GAs) have lately been applied for this purpose.
A GA is a technique for automatically solving a problem using the
principles of Darwinian evolution. Here, a GA has been implemented
to explore the possibility of evolving an n-qubit circuit, when the
circuit matrix is provided, using a set of single-, two- and three-qubit
gates. Using a variable-length population and a stochastic universal
selection procedure, a number of possible solution circuits, with
different numbers of gates, can be obtained for the same input matrix
during different runs of the GA. The algorithm has also been
successfully used to obtain two- and three-qubit Boolean circuits
built from quantum gates. The results demonstrate the effectiveness
of the GA procedure even when the search spaces are large.
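As one plausible building block of such a GA (an assumption, since the abstract does not state the fitness function), the matrix of a candidate gate sequence and a phase-insensitive overlap fitness against the target unitary can be computed as:

```python
import numpy as np

# Standard gate matrices on 2 qubits (CNOT) and 1 qubit (H, X).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]])                 # Pauli-X
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]])

def circuit_matrix(gates):
    """Matrix of a 2-qubit circuit: each element of `gates` is a
    4x4 unitary, applied left to right."""
    U = np.eye(4)
    for g in gates:
        U = g @ U
    return U

def fitness(U, target):
    """Normalized overlap |Tr(target^dagger U)| / dim: equals 1 exactly
    when U matches the target up to a global phase. One plausible GA
    fitness, not necessarily the paper's."""
    d = target.shape[0]
    return abs(np.trace(target.conj().T @ U)) / d

# Example candidate: the Bell-pair preparation circuit (H on qubit 0,
# then CNOT), encoded as a gate sequence.
U = circuit_matrix([np.kron(H, I), CNOT])
```

A GA over variable-length gate sequences would select, recombine, and mutate sequences so as to drive this fitness toward 1 for the provided circuit matrix.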