Abstract: Porous silicon (PS), formed by anodization of a p+-type silicon substrate, consists of a network organized in a pseudo-columnar structure with multiple side ramifications. The structural micro-topology can be interpreted as the fraction of the interconnected solid phase contributing to thermal transport. The reduction in the silicon dimensions of each nanocrystallite during oxidation induces a reduction in thermal conductivity. Integration of thermal sensors in silicon microsystems requires effective insulation of the sensing element; indeed, the low thermal conductivity of PS makes it very promising for the fabrication of integrated thermal microsystems. In this work we are interested in measuring the thermal conductivity of PS (at the surface and in depth) by micro-Raman spectroscopy. The thermal conductivity is studied as a function of the anodization parameters (initial doping and current density). We also determine the porosity of the samples by spectroellipsometry.
Abstract: In this paper a new concept named Intuitionistic Fuzzy Multiset is introduced. The basic operations on Intuitionistic Fuzzy Multisets, such as union, intersection, addition, and multiplication, are discussed. An application of Intuitionistic Fuzzy Multisets to a medical diagnosis problem using a distance function is discussed in detail.
Abstract: Cryo-electron microscopy (CEM) in combination with single particle analysis (SPA) is a widely used technique for elucidating structural details of macromolecular assemblies at close-to-atomic resolutions. However, development of automated software for SPA processing remains vital, since thousands to millions of individual particle images need to be processed. Here, we present our workflow for automated particle picking. Our approach integrates peak-shape analysis into the classical correlation approach, together with an iterative classification scheme to separate macromolecules from background. This particle selection workflow furthermore provides a robust means for SPA with little user interaction. The performance of the presented tools is assessed on simulated and experimental data.
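A minimal sketch of the correlation stage of particle picking, assuming plain normalized cross-correlation against a template (the peak-shape analysis and iterative classification steps are not reproduced here):

```python
import numpy as np

def ncc_map(image, template):
    """Normalized cross-correlation of a small template over an image
    (direct sliding-window evaluation; fine for a sketch)."""
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t ** 2).sum())
    h, w = image.shape
    out = np.full((h - th + 1, w - tw + 1), -1.0)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            win = image[i:i + th, j:j + tw]
            wz = win - win.mean()
            denom = np.sqrt((wz ** 2).sum()) * tnorm
            if denom > 0:
                out[i, j] = (wz * t).sum() / denom
    return out

rng = np.random.default_rng(0)
template = rng.normal(size=(5, 5))
image = rng.normal(scale=0.1, size=(32, 32))
image[10:15, 20:25] += template          # plant one "particle"
m = ncc_map(image, template)
peak = np.unravel_index(np.argmax(m), m.shape)
```

Real picking additionally thresholds the correlation map, suppresses overlapping peaks, and (as in the abstract) inspects peak shape to reject false positives.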
Abstract: The paper presents a numerical investigation of rapid gas decompression in pure nitrogen, carried out using one-dimensional (1D) and three-dimensional (3D) mathematical models of transient compressible non-isothermal fluid flow in pipes. A 1D transient mathematical model of compressible thermal multi-component fluid mixture flow in pipes is presented. The set of mass, momentum and enthalpy conservation equations for the gas phase is solved in the model. Thermo-physical properties of the multi-component gas mixture are calculated from an equation of state (EOS); the Soave-Redlich-Kwong (SRK-EOS) model is chosen. This model is validated against the experimental data [1] and shows good agreement with the measurements. A 3D transient mathematical model of compressible thermal single-component gas flow in pipes, built using the ANSYS Fluent CFD code, is also presented. The set of unsteady Reynolds-averaged conservation equations for the gas phase is solved, with the thermo-physical properties of the single-component gas calculated from a real-gas EOS. The simplest case of gas decompression in pure nitrogen is simulated using both the 1D and 3D models, and the 1D and 3D numerical results show good agreement with each other. The numerical investigation shows that a 3D CFD model is very helpful for validating 1D simulation results when experimental data are absent or limited.
Abstract: The paper presents a one-dimensional transient mathematical model of compressible non-isothermal multi-component fluid mixture flow in a pipe. The set of mass, momentum and enthalpy conservation equations for the gas phase is solved in the model. Thermo-physical properties of the multi-component gas mixture are calculated from an equation of state (EOS); the Soave-Redlich-Kwong (SRK-EOS) model is chosen. Gas mixture viscosity is calculated on the basis of the Lee-Gonzalez-Eakin (LGE) correlation. A numerical analysis of the rapid gas decompression process in rich and base natural gases is performed using the proposed mathematical model. The model is successfully validated against the experimental data [1], shows very good agreement with them over a wide range of pressures, and predicts the decompression in rich and base gas mixtures much better than the analytical and mathematical models available in the open literature.
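A minimal sketch of the SRK-EOS step that both decompression abstracts rely on, computing the compressibility factor of nitrogen from the cubic in Z; the critical constants are textbook values, and the code is an illustrative assumption, not the authors' implementation:

```python
import numpy as np

R = 8.314  # J/(mol K)

def srk_z(T, P, Tc, Pc, omega):
    """Compressibility factor from the SRK EOS.  Solves
    Z^3 - Z^2 + (A - B - B^2) Z - A B = 0 and returns the largest
    real root (the gas-phase root)."""
    m = 0.480 + 1.574 * omega - 0.176 * omega ** 2
    alpha = (1 + m * (1 - np.sqrt(T / Tc))) ** 2
    a = 0.42748 * R ** 2 * Tc ** 2 / Pc * alpha
    b = 0.08664 * R * Tc / Pc
    A = a * P / (R * T) ** 2
    B = b * P / (R * T)
    roots = np.roots([1.0, -1.0, A - B - B ** 2, -A * B])
    real = roots[np.abs(roots.imag) < 1e-9].real
    return real.max()

# Nitrogen: Tc = 126.2 K, Pc = 3.396 MPa, acentric factor 0.0372
Z = srk_z(300.0, 10e6, 126.2, 3.396e6, 0.0372)
rho = 10e6 / (Z * R * 300.0)    # molar density, mol/m^3
```

In a decompression solver this evaluation sits inside the time loop, supplying density and (via derivatives of the EOS) the speed of sound at each cell.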
Abstract: This paper considers a scheduling problem in a flexible flow shop environment with the aim of minimizing two criteria: makespan and cumulative tardiness of jobs. Since the proposed problem is known in the literature to be NP-hard, we develop a meta-heuristic to solve it. We take the general structure of the Genetic Algorithm (GA) and develop a new version of it based on Data Envelopment Analysis (DEA). The two objective functions are treated as two inputs for each Decision Making Unit (DMU). We focus on the efficiency scores of the DMUs and the efficient-frontier concept of the DEA technique. After introducing the method, we define two scenarios based on two types of mutation operator. We also provide an experimental design with computational results to show the performance of the algorithm. The results show that the algorithm runs in a reasonable time.
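The DEA scoring idea above (the two objective values as inputs, a unit output per DMU) can be sketched with the standard CCR multiplier linear program; the solver call and the sample data are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs):
    """Input-oriented CCR efficiency (multiplier form) for DMUs that all
    produce one unit of a single output and consume the given inputs."""
    inputs = np.asarray(inputs, dtype=float)
    n, m = inputs.shape
    scores = []
    for o in range(n):
        # variables: [u, v_1..v_m]; maximize u  ->  minimize -u
        c = np.zeros(m + 1)
        c[0] = -1.0
        # feasibility: u - v . x_j <= 0 for every DMU j
        A_ub = np.hstack([np.ones((n, 1)), -inputs])
        b_ub = np.zeros(n)
        # normalization: v . x_o = 1
        A_eq = np.hstack([[0.0], inputs[o]]).reshape(1, -1)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (m + 1))
        scores.append(res.x[0])
    return scores

# Each DMU: (makespan, cumulative tardiness) of one GA individual
scores = ccr_efficiency([[2, 4], [4, 2], [4, 4]])
```

Individuals with score 1 lie on the efficient frontier; a DEA-guided GA can use these scores as fitness.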
Abstract: In this study, the hydrogen transport phenomenon was
numerically evaluated by using hydrogen-enhanced localized
plasticity (HELP) mechanisms. Two dominant governing equations,
namely, the hydrogen transport model and the elasto-plastic model,
were introduced. In addition, implicit formulations of the
governing equations were implemented in ABAQUS UMAT
user-defined subroutines. The simulation results were compared to
published results to validate the proposed method.
Abstract: In recent years, variants of Newton's method with cubic convergence have become popular iterative methods for approximating the roots of non-linear equations. These methods enjoy cubic convergence at simple roots and do not require the evaluation of second-order derivatives. In this paper, we present a new Newton-type method based on the contraharmonic mean that is cubically convergent. Numerical examples show that the new method can compete with the classical Newton's method.
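One common formulation of such a method replaces f'(x_n) by the contraharmonic mean C(a, b) = (a^2 + b^2)/(a + b) of the derivatives at x_n and at the classical Newton predictor; this sketch assumes that formulation, which may differ in detail from the paper's:

```python
def newton_contraharmonic(f, df, x0, tol=1e-12, max_iter=50):
    """Newton variant replacing f'(x_n) by the contraharmonic mean of
    f'(x_n) and f'(y_n), where y_n is the ordinary Newton step.
    The family is cubically convergent at simple roots."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        a = df(x)
        y = x - fx / a                 # classical Newton predictor
        b = df(y)
        # x - f(x) / C(a, b)  with  C(a, b) = (a^2 + b^2) / (a + b)
        x = x - fx * (a + b) / (a * a + b * b)
    return x

root = newton_contraharmonic(lambda x: x ** 3 - 2, lambda x: 3 * x ** 2, 1.0)
```

Only two derivative evaluations per step are needed, so no second-order derivatives appear, matching the abstract's claim.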
Abstract: The recurrent neural network (RNN) is an efficient tool for modeling production control processes as well as services. In this paper, an RNN is combined with a regression model, and the combination is used to check whether the data obtained by the model, compared with the actual data, are valid for a variables control chart. A maintenance process in a workshop of Esfahan Oil Refining Co. (EORC) is used to illustrate the models. First, a regression model is fitted to predict the response time of the process from the selected factors; then the error between the actual and predicted response times is used as the output of the RNN, with the same factors as inputs. Finally, the predictions of the combined model are examined against test values in statistical process control to determine whether the forecasting accuracy is acceptable. In the RNN training process, design of experiments is employed to optimize the network.
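The control-chart check on prediction errors can be sketched as follows; the RNN stage is omitted and replaced by plain least squares, and the factors and data are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(60, 2))        # two hypothetical process factors
y = 5.0 + 1.2 * X[:, 0] - 0.7 * X[:, 1] + rng.normal(0, 0.5, 60)

# Regression step: least-squares fit of response time on the factors
A = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
residuals = y - A @ beta

# Individuals control chart on the prediction errors:
# limits = mean +/- 2.66 * average moving range (standard I-chart constant)
mr_bar = np.mean(np.abs(np.diff(residuals)))
center = residuals.mean()
ucl, lcl = center + 2.66 * mr_bar, center - 2.66 * mr_bar
```

In the paper's setup the residuals would instead be modeled by the RNN, and the chart applied to the combined model's forecast errors on test data.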
Abstract: The balancing numbers are natural numbers n satisfying the Diophantine equation 1 + 2 + 3 + · · · + (n − 1) = (n + 1) + (n + 2) + · · · + (n + r); r is the balancer corresponding to the balancing number n. The nth balancing number is denoted by Bn, and the sequence {Bn} satisfies the recurrence relation Bn+1 = 6Bn − Bn−1. The balancing numbers possess some curious properties, some like those of the Fibonacci numbers and some others more interesting. This paper is a study of the recurrent sequence {xn} satisfying the recurrence relation xn+1 = Axn − Bxn−1 and possessing some curious properties like the balancing numbers.
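The definitions above can be checked directly; a known equivalent criterion, used below, is that n is a balancing number exactly when 8n² + 1 is a perfect square:

```python
from math import isqrt

def balancing_numbers(count):
    """First `count` balancing numbers via B(n+1) = 6 B(n) - B(n-1),
    with B(1) = 1 and B(2) = 6.  Replacing 6 and 1 by A and B gives the
    generalized sequence x(n+1) = A x(n) - B x(n-1) studied in the paper."""
    seq = [1, 6]
    while len(seq) < count:
        seq.append(6 * seq[-1] - seq[-2])
    return seq[:count]

def balancer(n):
    """Return the balancer r with 1 + ... + (n-1) = (n+1) + ... + (n+r),
    or None if n is not balancing (criterion: 8 n^2 + 1 a perfect square)."""
    s = isqrt(8 * n * n + 1)
    if s * s != 8 * n * n + 1:
        return None
    return (s - 2 * n - 1) // 2

first = balancing_numbers(5)    # [1, 6, 35, 204, 1189]
```

For example, n = 6 has balancer r = 2, since 1 + 2 + 3 + 4 + 5 = 15 = 7 + 8.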
Abstract: In this paper we propose a new traffic simulation
package, TDMSim, which supports both macroscopic and
microscopic simulation on free-flowing and regulated traffic systems.
Both simulators are based on travel demands, which specify the
numbers of vehicles departing from origins to arrive at different
destinations. The microscopic simulator implements the car-following
model given the pre-defined routes of the vehicles, but also
supports the rerouting of vehicles. We also propose a macroscopic
simulator which is built in integration with the microscopic simulator
to allow the simulation to be scaled for larger networks without
sacrificing the precision achievable through the microscopic
simulator. The macroscopic simulator also enables the reuse of
previous simulation results when simulating traffic on the same
networks at a later time. Validations have been conducted to show the
correctness of both simulators.
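The abstract does not name the car-following law used in TDMSim; as a stand-in, here is a sketch of one step of microscopic simulation with the well-known Intelligent Driver Model (IDM) for a single follower:

```python
import math

def idm_accel(v, gap, dv, v0=30.0, T=1.5, a=1.0, b=2.0, s0=2.0):
    """Intelligent Driver Model acceleration; v is the follower's speed,
    gap the bumper-to-bumper distance, dv the approach rate (v - v_leader).
    Parameter values here are typical, illustrative choices."""
    s_star = s0 + max(0.0, v * T + v * dv / (2 * math.sqrt(a * b)))
    return a * (1 - (v / v0) ** 4 - (s_star / gap) ** 2)

# One leader at constant 25 m/s, one follower starting 50 m behind at 10 m/s
dt, x_f, v_f, x_l, v_l = 0.2, 0.0, 10.0, 50.0, 25.0
for _ in range(600):                       # 120 s of simulated time
    acc = idm_accel(v_f, x_l - x_f, v_f - v_l)
    v_f = max(0.0, v_f + acc * dt)
    x_f += v_f * dt
    x_l += v_l * dt
```

A macroscopic layer, as in the abstract, would instead advance aggregate densities and flows per link, handing vehicles to this per-vehicle update only where microscopic detail is needed.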
Abstract: A variational method is used to obtain the growth rate of a transverse long-wavelength perturbation applied to the soliton solution of a nonlinear Schrödinger equation with a three-half order potential. We demonstrate numerically that this unstable perturbed soliton will eventually transform into a cylindrical soliton.
Abstract: The double exponential model (DEM), or Laplace distribution, is used in various disciplines. However, there are issues related to the construction of confidence intervals (CIs) when using the distribution. In this paper, the properties of the DEM are considered with the intention of constructing CIs based on simulated data. Pivotal equations for the model are analyzed in comparison with the pivotal equations for the normal distribution, and the results obtained from the simulated data are presented.
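A sketch of the simulation-based CI construction, assuming a pivot built from the sample median and the mean absolute deviation (the paper's exact pivotal equations are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)

def laplace_ci(sample, n_sim=20000, alpha=0.05):
    """Monte Carlo CI for the Laplace location parameter, based on the
    pivot (median - mu) / b_hat, with b_hat the mean absolute deviation
    about the median.  The pivot's distribution is free of mu and b, so
    it can be simulated under the standard Laplace(0, 1)."""
    n = len(sample)
    med = np.median(sample)
    b_hat = np.mean(np.abs(sample - med))
    sims = rng.laplace(0.0, 1.0, size=(n_sim, n))
    meds = np.median(sims, axis=1)
    bs = np.mean(np.abs(sims - meds[:, None]), axis=1)
    lo, hi = np.quantile(meds / bs, [alpha / 2, 1 - alpha / 2])
    return med - hi * b_hat, med - lo * b_hat

sample = rng.laplace(10.0, 2.0, size=50)
lower, upper = laplace_ci(sample)
```

The normal-theory analogue would use the sample mean and a t pivot; the contrast between the two pivots is what the abstract's comparison concerns.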
Abstract: Some quality control tools use non-metric subjective information coming from experts, who qualify the intensity of relations existing inside processes without quantifying them. In this paper we develop a quality control analytic tool that measures the impact, or strength, of the relationship between process operations and product characteristics. The tool includes two models: a qualitative model, allowing relationships to be described and analyzed, and a formal quantitative model, by means of which relationship quantification is achieved. In the first, concepts from graph theory are applied to identify those process elements that can be sources of variation, that is, those quality characteristics or operations that have some sort of precedence over the others and that should become control items. The most dependent elements can also be identified, that is, those elements receiving the effects of the elements identified as variation sources. If controls are focused on those dependent elements, control efficiency is compromised by the fact that we are controlling effects, not causes. The second model adapts the multivariate statistical technique of covariance structure analysis, which allowed us to quantify the relationships. The computer package LISREL was used to obtain statistics and to validate the model.
Abstract: Creating 3D environments, including characters and cities, is a significantly time-consuming process due to the large amount of work involved in designing and modelling. There have been a number of attempts to automatically generate 3D objects employing shape grammars. However, it is still too early to apply the mechanism to real problems such as real-time computer games. The purpose of this research is to introduce a time-efficient and cost-effective method to automatically generate various 3D objects for real-time 3D games. The Shape grammar-based real-time City Generation (RCG) model is a conceptual model for generating 3D environments in real time and can be applied to 3D games or animations. The RCG system can generate even a large city by applying fundamental principles of shape grammars to building elements at various levels of detail in real time.
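The core shape-grammar mechanism, rewriting symbols by production rules until only terminal shapes remain, can be sketched as follows; the symbols and rules are toy assumptions, not the RCG model's actual rule set:

```python
import random

# A toy grammar: each non-terminal symbol rewrites to one of its productions.
RULES = {
    "CITY":     [["BLOCK", "ROAD", "BLOCK"]],
    "BLOCK":    [["BUILDING", "BUILDING"], ["PARK"]],
    "BUILDING": [["FLOOR"], ["FLOOR", "FLOOR"], ["FLOOR", "FLOOR", "FLOOR"]],
}

def derive(symbol, rng):
    """Recursively expand a symbol until only terminal shapes remain."""
    if symbol not in RULES:
        return [symbol]                    # terminal shape
    production = rng.choice(RULES[symbol])
    out = []
    for s in production:
        out.extend(derive(s, rng))
    return out

rng = random.Random(7)
city = derive("CITY", rng)
```

A real-time system would attach geometry to each terminal and restrict rule depth by the level of detail required at the current camera distance, as the abstract describes.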
Abstract: Statistical distributions are used to model the nature of various types of data sets. Although these distributions are mostly uni-modal, it is quite common to see multiple modes in the observed distribution of the underlying variables, which makes precise modeling unrealistic. The lack of smoothness in observed data is not necessarily due to randomness; it may also be due to non-randomness, resulting in zigzag curves, oscillations, humps, etc. The present paper argues that trigonometric functions, which have not been used in the probability functions of distributions so far, have the potential to take care of this if incorporated in the distribution appropriately. A simple distribution (named the Sinoform Distribution), involving trigonometric functions, is illustrated in the paper with a data set. The paper demonstrates that trigonometric functions have the characteristics to make statistical distributions exotic: it is possible to have multiple modes, oscillations and zigzag curves in the density, which could be suitable for explaining the underlying nature of a given data set.
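The abstract does not give the Sinoform density explicitly; the following sketch uses a hypothetical sine-modulated Gaussian form only to illustrate how a trigonometric factor produces multiple modes:

```python
import numpy as np

# Hypothetical density f(x) proportional to (1 + sin(kx)) * exp(-x^2 / 2),
# normalized numerically; k is an assumed modulation frequency.
k = 3.0
x = np.linspace(-6, 6, 4001)
dx = x[1] - x[0]
g = (1 + np.sin(k * x)) * np.exp(-x ** 2 / 2)
f = g / (g.sum() * dx)                      # numeric normalization

# Count local maxima: the sine factor creates several modes in the density
interior = (f[1:-1] > f[:-2]) & (f[1:-1] > f[2:])
n_modes = int(interior.sum())
```

Larger k yields more, narrower modes under the same envelope, which is the kind of oscillating, multi-modal density the abstract argues for.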
Abstract: The discrete choice model is the most widely used methodology for studying travelers' mode choice and demand. However, calibrating a discrete choice model requires extensive questionnaire surveys. In this study, an aggregate model is proposed instead. Historical data on passenger volumes for high-speed rail and domestic civil aviation are employed to calibrate and validate the model. Different models are compared in order to propose the best one. The results show that systems of equations forecast better than a single equation does, and that models with an external variable (oil price) are better than models based on a closed-system assumption.
Abstract: The fuel cost of a motor vehicle operating on its regular route is an important part of its operating cost, so the importance of fuel saving is increasing day by day. One of the parameters that improves fuel saving is the regulation of driving characteristics. The number and duration of stops are increased by heavy traffic load, and it is possible to improve fuel saving by regulating both traffic flow and driving characteristics. Research shows that regulating traffic flow decreases fuel consumption, but this alone is not enough without also regulating driving characteristics. This study analyses the fuel consumption of two trips of a city bus operating on its regular route and determines the effect of traffic density and driving characteristics on fuel consumption. Finally, it offers some suggestions on regulating driving characteristics to improve fuel saving. Fuel saving is determined from the results obtained with a simulation program. When the experimental and simulation results are compared, it is found that fuel savings of up to 40 percent were achieved.
Abstract: In this paper we present a non-local modification of the kinetic Smoluchowski equation for binary aggregation, applied to dispersed media having memory. Our supposition is that the intensity of cluster evolution is a function of the product of the concentrations of the lowest-order clusters at different moments in time. The new form of the kinetic equation for aggregation is derived on the basis of the transfer-kernel approach. This approach makes it possible to consider the influence of the relaxation-time hierarchy on the kinetics of the aggregation process in media with memory.
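For reference, the classical (memoryless) Smoluchowski coagulation equation that the paper modifies reads

```latex
\frac{\partial n_k(t)}{\partial t}
  \;=\; \frac{1}{2}\sum_{i+j=k} K_{ij}\, n_i(t)\, n_j(t)
  \;-\; n_k(t)\sum_{j\ge 1} K_{kj}\, n_j(t),
```

where n_k is the concentration of k-clusters and K_ij the aggregation kernel. Schematically, the non-local modification described above replaces the instantaneous product n_i(t) n_j(t) by a memory integral of the form ∫∫ M_ij(t − t₁, t − t₂) n_i(t₁) n_j(t₂) dt₁ dt₂; the notation M_ij for the transfer kernels is ours, not the paper's.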
Abstract: This paper considers a multi-criteria cell formation problem in a Cellular Manufacturing System (CMS). The two proposed objective functions are the simultaneous minimization of the number of voids and of the number of exceptional elements in cells. According to the literature this problem is NP-hard, and therefore we cannot find the optimal solution by an exact method. In this paper we develop two ant algorithms, Ant Colony Optimization (ACO) and Max-Min Ant System (MMAS), based on Data Envelopment Analysis (DEA). Both try to find efficient solutions based on the efficiency concept of DEA. Each artificial ant is considered a Decision Making Unit (DMU). For each DMU we consider two inputs, the values of the objective functions, and one output, set to one for all of them. To evaluate the performance of the proposed methods, we provide an experimental design with empirical problems of three sizes: small, medium and large. We define three criteria that show which algorithm has the best performance.
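The MMAS ingredient named above can be sketched by its characteristic pheromone update: evaporation everywhere, deposit only on the best solution, then clamping to [tau_min, tau_max]; the machine-cell edges and parameter values are illustrative assumptions:

```python
def mmas_update(tau, best_edges, best_quality, rho=0.1,
                tau_min=0.01, tau_max=5.0):
    """Max-Min Ant System pheromone update: evaporate on every edge,
    deposit only on the best solution's edges, clamp to [tau_min, tau_max]."""
    for edge in tau:
        tau[edge] *= (1 - rho)
    for edge in best_edges:
        tau[edge] += best_quality
    for edge in tau:
        tau[edge] = min(tau_max, max(tau_min, tau[edge]))
    return tau

# Hypothetical machine-to-cell assignment edges
tau = {("m1", "c1"): 1.0, ("m1", "c2"): 1.0, ("m2", "c1"): 1.0}
tau = mmas_update(tau, best_edges=[("m1", "c1")], best_quality=0.5)
```

In the DEA-based variant, "best" would be the ants whose (voids, exceptional elements) input pairs score as efficient DMUs, rather than a single scalar-best ant.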