Abstract: Lattice Monte Carlo methods are an excellent
choice for the simulation of non-linear thermal diffusion
problems. In this paper, and for the first time, Lattice Monte
Carlo analysis is performed on thermal diffusion combined
with convective heat transfer. Laminar flow of water modeled
as an incompressible fluid inside a copper pipe with a constant
surface temperature is considered. For the simulation of
thermal conduction, the temperature dependence of the
thermal conductivity of the water is accounted for. Using the
novel Lattice Monte Carlo approach, temperature distributions
and energy fluxes are obtained.
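The random-walk idea behind Lattice Monte Carlo can be illustrated with a minimal sketch: walkers released from an interior lattice node of a 1-D rod hop between neighbouring nodes until absorbed at a boundary, and the average boundary temperature they collect estimates the steady-state temperature at the release node. This toy version assumes constant conductivity (equal hop probabilities) and omits convection and temperature-dependent properties, unlike the full method in the paper; all numbers are illustrative.

```python
import random

def lmc_temperature(node, n_nodes, t_left, t_right, walkers=20_000, seed=1):
    """Estimate the steady-state temperature at an interior node of a 1-D rod
    with fixed end temperatures by releasing random walkers that are absorbed
    at the boundary nodes (a minimal Lattice Monte Carlo sketch)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(walkers):
        pos = node
        while 0 < pos < n_nodes - 1:
            pos += rng.choice((-1, 1))  # equal hop probabilities: constant conductivity
        total += t_left if pos == 0 else t_right
    return total / walkers

# Midpoint of an 11-node rod held at 100 and 0 degrees: expect ~50
t_mid = lmc_temperature(5, 11, 100.0, 0.0)
```

For constant conductivity the walker-absorption probabilities reproduce the linear steady-state profile; temperature-dependent conductivity would bias the hop probabilities instead.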
Abstract: In this paper we propose a method for modeling the
correlation between the signals received by two or more antennas
operating in a multipath environment. Considering the maximum
excess delay in the channel being modeled, an elliptical region
surrounding both transmitter and receiver antennas is produced. A
number of scatterers are randomly distributed in this region and
scatter the incoming waves. The amplitude and phase of incoming
waves are computed and used to obtain statistical properties of the
received signals. This model has the distinct advantage of being applicable to any configuration of antennas. Furthermore, the joint PDF (probability density function) of the received wave amplitudes for any pair of antennas can be calculated and used to produce statistical parameters of the received signals.
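A minimal Monte Carlo sketch of the geometry described above: scatterers are dropped uniformly inside the ellipse defined by the maximum excess delay, each contributes a single-bounce ray whose amplitude and phase follow from its path length, and the envelope correlation between two receive antennas is estimated over many scatterer realizations. The carrier frequency, delay spread, geometry, and antenna spacing are all assumed values, and single scattering with free-space amplitude decay is a simplifying assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
wavelength = 0.125                     # assumed 2.4 GHz carrier
k = 2 * np.pi / wavelength
tx, rx = np.array([0.0, 0.0]), np.array([100.0, 0.0])

# Delay ellipse: total path length bounded by the maximum excess delay
tau_max = 0.5e-6                       # assumed maximum excess delay (s)
path_max = np.linalg.norm(rx - tx) + 3e8 * tau_max

def draw_scatterers(n):
    """Rejection-sample n scatterers uniformly inside the delay ellipse."""
    pts = []
    while len(pts) < n:
        p = rng.uniform([-80.0, -120.0], [180.0, 120.0])
        if np.linalg.norm(p - tx) + np.linalg.norm(p - rx) <= path_max:
            pts.append(p)
    return np.array(pts)

def received_field(antenna, scat):
    """Sum of single-bounce contributions at one antenna."""
    d = (np.linalg.norm(scat - tx, axis=1)
         + np.linalg.norm(scat - antenna, axis=1))
    return np.sum(np.exp(-1j * k * d) / d)

# Envelope correlation between two receive antennas half a wavelength apart
offset = np.array([0.0, wavelength / 2])
trials = [draw_scatterers(50) for _ in range(400)]
e1 = np.array([received_field(rx, s) for s in trials])
e2 = np.array([received_field(rx + offset, s) for s in trials])
rho = np.corrcoef(np.abs(e1), np.abs(e2))[0, 1]
```

Sweeping the antenna offset then traces correlation versus spacing for the chosen scatterer geometry.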
Abstract: In this paper, we present a novel approach to accurately detect text regions, including shop names, in signboard images with complex backgrounds for mobile system applications. The proposed method combines text detection based on edge profiles with region segmentation based on the fuzzy c-means method. In the first step, the Canny edge operator is applied to extract all possible object edges. Then, edge profile analysis in the vertical and horizontal directions is performed on these edge pixels to detect potential text regions containing the shop name in a signboard. The edge profile and the geometrical characteristics of each object contour are carefully examined to construct candidate text regions and to separate the main text region from the background. Finally, the fuzzy c-means algorithm is applied to segment and binarize the detected text region. Experimental results show that the proposed method is robust to different character sizes and colors and provides reliable text binarization.
Abstract: This paper develops c-Charts based on a Zero-Inflated Poisson (ZIP) process approximated by a geometric distribution with parameter p. The estimated p that fits the ZIP distribution is used to calculate the mean, median, and variance of the geometric distribution, from which c-Charts are constructed by three different methods. For the cg-Chart, the control limits are constructed from the mean and variance of the geometric distribution. For the cmg-Chart, the mean is used to construct the control limits. For the cme-Chart, the control limits are developed from the median and variance of the geometric distribution. The performance of the charts is assessed by the Average Run Length and the Average Coverage Probability. We find that, for an in-control process, the cg-Chart is superior at low levels of the mean for all levels of the proportion of zeros. For an out-of-control process, the cmg-Chart and cme-Chart are the best for means of 2, 3, and 4 at all parameter levels.
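Under the geometric approximation, the mean, variance, and median of the count have closed forms, so the three chart constructions can be sketched directly. A minimal sketch in Python; the three-sigma-style limit formulas below are illustrative assumptions, not the paper's exact definitions.

```python
import math

def geometric_c_chart(p, scheme="cg"):
    """Control limits for a c-Chart when the ZIP count is approximated by a
    geometric(p) distribution on {0, 1, 2, ...}.
    scheme 'cg'  : center and spread from the geometric mean and variance
    scheme 'cmg' : mean-based limits (spread from the mean, Poisson-style)
    scheme 'cme' : center from the median, spread from the variance
    (These limit formulas are illustrative, not the paper's definitions.)"""
    mean = (1 - p) / p
    var = (1 - p) / p ** 2
    median = math.ceil(math.log(0.5) / math.log(1 - p)) - 1
    if scheme == "cg":
        center, spread = mean, math.sqrt(var)
    elif scheme == "cmg":
        center, spread = mean, math.sqrt(mean)
    elif scheme == "cme":
        center, spread = median, math.sqrt(var)
    else:
        raise ValueError(f"unknown scheme: {scheme}")
    # Counts cannot be negative, so the lower limit is clipped at zero
    return max(0.0, center - 3 * spread), center, center + 3 * spread

lcl, cl, ucl = geometric_c_chart(0.5, "cg")
```

The median uses the geometric CDF F(m) = 1 - (1 - p)^(m + 1), giving the smallest integer m with F(m) >= 0.5.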
Abstract: In social network analysis, the mean nodal degree and the density of a graph can be considered measures of the activity of all the actors in the network; this is an important property of a graph, useful for making comparisons among networks. Since subjects in a family or organization are exposed to common environmental factors, it is of prime interest to study the association between their responses. Therefore, we study the distribution of the mean nodal degree and the density of the graph under correlated binary units. The cross-product ratio is used to capture the intra-unit association among subjects. A computer program and an application are given to show the benefits of the method.
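The two graph summaries and the association measure named above have direct formulas; a small sketch for an undirected simple graph (plain Python, hypothetical data):

```python
def graph_activity(n_nodes, edges):
    """Mean nodal degree and density of an undirected simple graph,
    given as a node count and a list of (u, v) edges."""
    degree = {v: 0 for v in range(n_nodes)}
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    mean_degree = sum(degree.values()) / n_nodes          # = 2E / n
    density = len(edges) / (n_nodes * (n_nodes - 1) / 2)  # = 2E / (n(n-1))
    return mean_degree, density

def cross_product_ratio(n11, n10, n01, n00):
    """Cross-product (odds) ratio of a 2x2 table of paired binary responses;
    values above 1 indicate positive intra-unit association."""
    return (n11 * n00) / (n10 * n01)

# Triangle on 3 nodes: every possible tie is present
md, dens = graph_activity(3, [(0, 1), (1, 2), (0, 2)])
```

Density is the fraction of possible ties realized, so it equals 1 for the complete triangle.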
Abstract: Scaffolds play a key role in tissue engineering and can be produced in many different ways depending on the application and the materials used. Most researchers have used an experimental trial-and-error approach to new biomaterials, but computer simulation applied to tissue engineering can offer a more exhaustive way to test and screen biomaterials. This paper develops a model of scaffolds using Computational Fluid Dynamics that shows the value of computer simulation in determining the influence of the geometrical scaffold parameters (porosity, pore size, and shape) on the permeability of scaffolds, the velocity magnitude, the pressure drop, and the level and distribution of shear stress, and hence in the proper design of the scaffold geometry. This motivates more advanced studies in which the dynamic conditions of a microfluid passing through the scaffold are characterized for tissue engineering applications and for the differentiation of tissues within scaffolds.
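The permeability reported by such simulations is typically recovered from the computed pressure drop via Darcy's law, k = Q·mu·L / (A·dp). A minimal sketch; the flow rate, fluid, and scaffold dimensions below are hypothetical, not values from the paper.

```python
def darcy_permeability(flow_rate, viscosity, length, area, pressure_drop):
    """Scaffold permeability from a simulated perfusion run via Darcy's law:
    k = Q * mu * L / (A * dp). SI units assumed throughout."""
    return flow_rate * viscosity * length / (area * pressure_drop)

# Hypothetical CFD readout: 1e-8 m^3/s of water (mu = 1e-3 Pa.s) through a
# 2 mm thick scaffold of 1 cm^2 cross-section at a 50 Pa pressure drop
k = darcy_permeability(1e-8, 1e-3, 2e-3, 1e-4, 50.0)
```

Repeating the computation across porosity and pore-size variants gives the permeability trends the simulations are meant to expose.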
Abstract: In this paper, a Dynamic Economic Dispatch (DED) model is developed for a system consisting of both thermal generators and wind turbines. The inclusion of a significant amount of wind energy into power systems has introduced additional constraints on DED to accommodate the intermittent nature of the output. The probability of the stochastic wind power, based on the Weibull probability density function, is included in the model as a constraint (a here-and-now approach). The Environmental Protection Agency's hourly emission target, which caps the maximum emission during the day, is used as a constraint to reduce atmospheric pollution. A 69-bus test system with a non-smooth cost function is used to illustrate the effectiveness of the proposed model compared with a static economic dispatch model that includes the wind power.
Abstract: This paper presents a new and efficient approach for capacitor placement in radial distribution systems that determines the optimal locations and sizes of capacitors with the objective of improving the voltage profile and reducing power loss. The solution methodology has two parts: in part one, loss sensitivity factors are used to select candidate locations for capacitor placement, and in part two, a new algorithm employing the Plant Growth Simulation Algorithm (PGSA) is used to estimate the optimal size of the capacitors at the optimal buses determined in part one. The main advantage of the proposed method is that it does not require any external control parameters. Another advantage is that it handles the objective function and the constraints separately, avoiding the need to determine barrier factors. The proposed method is applied to 9-, 34-, and 85-bus radial distribution systems, and the solutions obtained are compared with those of other methods. The proposed method outperforms the other methods in the quality of its solutions.
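A commonly used form of the loss sensitivity factor for screening candidate buses is dPloss/dQeff = 2·Qeff·R / V², where R is the resistance of the branch feeding the bus, Qeff the reactive power flowing through it, and V the bus voltage; the paper's exact expression may differ, and the bus data below are hypothetical.

```python
def loss_sensitivity_factor(q_eff_kvar, r_ohm, v_kv):
    """Loss sensitivity of a branch with respect to the reactive power
    flowing to its downstream bus: dPloss/dQeff = 2 * Qeff * R / V^2.
    Buses with the largest factors are capacitor-placement candidates."""
    q_var = q_eff_kvar * 1e3
    v_volt = v_kv * 1e3
    return 2.0 * q_var * r_ohm / v_volt ** 2   # W per var

# Rank hypothetical buses (Qeff in kvar, R in ohm, V in kV) by sensitivity
buses = {"bus4": (300.0, 0.5, 11.0), "bus7": (150.0, 0.3, 11.0)}
ranked = sorted(buses, key=lambda b: loss_sensitivity_factor(*buses[b]),
                reverse=True)
```

The top-ranked buses form the reduced search space handed to the sizing stage.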
Abstract: A multiphase harmonic load flow algorithm based on the backward/forward sweep is developed to examine the effects of various factors on the neutral-to-earth voltage (NEV), including unsymmetrical system configuration, load unbalance, and harmonic injection. The proposed algorithm combines fundamental-frequency and harmonic-frequency power flows. The algorithm and the associated models are tested on the IEEE 13-bus system. The magnitude of the NEV is investigated under various conditions of the number of grounding rods per feeder length, the grounding-rod resistance, and the grounding resistance of the infeeding source. Additionally, the harmonic injection of nonlinear loads is considered and its influence on the NEV under different conditions is shown.
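The fundamental-frequency core of a backward/forward sweep can be sketched for a single-phase radial feeder: a backward sweep accumulates branch currents from the feeder end, a forward sweep updates bus voltages from the source, and the two alternate until convergence. This is a simplified single-phase, fundamental-only sketch of the sweep idea, not the paper's multiphase harmonic algorithm; impedances and loads are illustrative per-unit values.

```python
import numpy as np

def backward_forward_sweep(z, s_load, v_source=1.0 + 0j, tol=1e-9, max_iter=100):
    """Backward/forward sweep on a single-phase radial feeder.
    z[k]: series impedance of the branch feeding bus k+1 (p.u.)
    s_load[k]: complex power demand at bus k+1 (p.u.)"""
    n = len(z)
    v = np.full(n, v_source, dtype=complex)
    for _ in range(max_iter):
        # Backward sweep: branch k carries the load currents of all
        # downstream buses (I = (S / V)*)
        i_load = np.conj(s_load / v)
        i_branch = np.cumsum(i_load[::-1])[::-1]
        # Forward sweep: update bus voltages from the source outward
        v_new = np.empty(n, dtype=complex)
        upstream = v_source
        for k in range(n):
            upstream = upstream - z[k] * i_branch[k]
            v_new[k] = upstream
        if np.max(np.abs(v_new - v)) < tol:
            return v_new
        v = v_new
    return v

z = np.array([0.01 + 0.02j, 0.015 + 0.03j, 0.02 + 0.04j])
s = np.array([0.30 + 0.10j, 0.20 + 0.08j, 0.25 + 0.10j])
v = backward_forward_sweep(z, s)
```

The multiphase harmonic version repeats this sweep per phase and per harmonic order with frequency-dependent impedances.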
Abstract: This paper shows how two-phase flows can be managed by exploiting unsteady effects. In the first case, we consider conditions under which fragmentation of the interface between the two components leads to intensified mixing. The problem is solved when the temporal and linear scales are small enough for a developed mixing layer to appear. We show that conditions exist for an unsteady flow velocity at the channel surface that lead to the creation and fragmentation of vortices at Reynolds numbers of order unity. We also show that the Reynolds number is not a similarity criterion for this type of flow, but a criterion can be introduced that depends on both the Reynolds number and the vortex-splitting frequency. A notable feature of this situation is that the streamlines behave stably, while the interface between the components exhibits all the properties of an unstable flow. In the second problem, we consider the behavior of solid impurities in an extensive system of channels, simulating an unsteady periodic flow that models breathing. The behavior of the particles along their trajectories is examined. It is shown that, depending on their mass and diameter, the particles can collect in a caustic on the channel walls, stop at a certain place, or fly back. The frequency distribution of the particle velocity is also of interest. It turns out that, by choosing the behavior of the velocity field of the carrier gas, one can affect the trajectories of individual particles, including forcing them to fly back.
Abstract: The binder drainage test is widely used to set an upper limit on the design binder content of porous asphalt. However, the presence of a high amount of fine particles in the drained binder may affect the accuracy of the test result. This paper presents a study to
characterize the composition and particle size distribution of fine
particles accumulated in the drained binder. Fine aggregates and filler
in the drained binder were extracted using a suitable solvent. Then, wet and dry sieve analyses were carried out to identify the actual composition of the extracted fine aggregates and filler. The results show that almost half of the drained binder consisted of fine aggregates, which significantly affects the accuracy of the design binder content of the porous asphalt mix. This simple finding highlights the importance of taking the presence of fine aggregates into account in the calculation of drained binder.
Abstract: Understanding how airborne pathogens are
transported through hospital wards is essential for determining the
infection risk to patients and healthcare workers. This study utilizes
Computational Fluid Dynamics (CFD) simulations to explore
possible pathogen transport within a six-bed partitioned Nightingale-style hospital ward.
Grid independence of a ward model was addressed using the Grid Convergence Index (GCI) from solutions obtained using three fully structured grids. Pathogens were simulated using source terms in
conjunction with a scalar transport equation and a RANS turbulence
model. Errors were found to be less than 4% in the calculated air velocities, but an average of 13% was seen in the scalar field.
A parametric study of variations in the pathogen release point showed that the pathogen distribution is strongly influenced by the local velocity field and the degree of air mixing present.
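The GCI calculation referred to above can be sketched from solutions on three systematically refined grids: the observed order of convergence comes from the ratio of successive solution changes, and the fine-grid GCI scales the relative error by a safety factor. A minimal sketch following Roache's formulation, with illustrative numbers:

```python
import math

def grid_convergence_index(f_fine, f_med, f_coarse, r, fs=1.25):
    """Roache's Grid Convergence Index from solutions on three grids
    with a constant refinement ratio r (monotonic convergence assumed)."""
    # Observed order of convergence from the successive solution changes
    p = math.log((f_coarse - f_med) / (f_med - f_fine)) / math.log(r)
    e21 = abs((f_med - f_fine) / f_fine)       # fine-grid relative error
    gci_fine = fs * e21 / (r ** p - 1.0)       # fine-grid uncertainty band
    return p, gci_fine

# Example: a monotonically converging quantity with grid spacing halved
p, gci = grid_convergence_index(1.00, 1.10, 1.30, r=2.0)
```

The resulting GCI is an error band on the fine-grid solution, which is how grid independence of the ward model would be reported.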
Abstract: To achieve accurate and precise results from finite element analysis (FEA) of bones, it is important to represent the load and boundary conditions as faithfully as possible to the human body, including the bone properties, the type and force of the muscles, the contact forces of the joints, and the locations of the muscle attachments. In this study, the von Mises stress and the total deformation were compared between Case 1, which uses the actual anatomical form of the muscle attachment to the femur, and Case 2, which uses a simplified representation of the attachment location, with the same muscle force applied in both cases. An inverse-dynamics musculoskeletal model was simulated using data from an actual walking experiment to improve the accuracy of the muscle forces, which are the input values of the FEA. The FEA using the muscle forces calculated through this simulation showed that the maximum von Mises stress and the maximum total deformation in Case 2 were underestimated by 8.42% and 6.29%, respectively, compared with Case 1. The torsion energy and the bending moment at each location of the femur arise from the stress components. Because of the geometrical and morphological features of the femur, a long bone over which the stress distribution spreads widely, as in Case 1, a greater von Mises stress and total deformation are expected from the sum of the stress components. Accurate results can be achieved only when the muscle forces, the attachment locations, and the attachment forms in the FEA of bones match the actual anatomical conditions under the various moving conditions of the human body.
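The von Mises equivalent stress that FEA packages report is assembled from the six Cauchy stress components; a minimal sketch of that standard formula, with two textbook sanity checks:

```python
import math

def von_mises(sx, sy, sz, txy, tyz, tzx):
    """Von Mises equivalent stress from the six Cauchy stress components:
    sqrt(0.5 * [(sx-sy)^2 + (sy-sz)^2 + (sz-sx)^2] + 3 * (txy^2 + tyz^2 + tzx^2))."""
    return math.sqrt(0.5 * ((sx - sy) ** 2 + (sy - sz) ** 2 + (sz - sx) ** 2)
                     + 3.0 * (txy ** 2 + tyz ** 2 + tzx ** 2))

# Sanity checks: uniaxial tension and pure shear
uniaxial = von_mises(100.0, 0, 0, 0, 0, 0)   # equals the applied stress
shear = von_mises(0, 0, 0, 100.0, 0, 0)      # equals sqrt(3) * tau
```

Summing contributions of the individual stress components in this way is what makes the widely spread stress distribution of Case 1 yield the larger equivalent stress.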
Abstract: Most of the losses in a power system arise in the distribution sector, which has therefore always received attention. One of the important factors contributing to increased losses in the distribution system is the presence of reactive power flows. The most common way to compensate reactive power in the system is to use shunt capacitors. In addition to reducing the losses, the advantages of capacitor placement include the release of network capacity at peak load and the improvement of the voltage profile. The key point in capacitor placement is the optimal location and sizing of the capacitors so as to maximize these advantages.

In this paper, a new technique is offered for the placement and sizing of fixed capacitors in a radial distribution network on the basis of a Genetic Algorithm (GA). Most existing optimization methods for capacitor placement reduce the losses and improve the voltage profile simultaneously, but the capacitor cost and the load changes have not been considered as influences on the objective function. In this article, a holistic approach is taken to the optimal solution of this problem, one that includes all the parameters of the distribution network: the price, the phase voltages, and the load changes. A vast search over all possible solutions is therefore required, and so we use a Genetic Algorithm (GA) as a powerful method for such an optimal search.
Abstract: The plastic forming of sheet metal takes an important place in metal forming. The traditional tool design techniques for sheet forming operations used in industry are experimental and expensive. Predicting the forming results and determining the punch force, the blank-holder forces, and the thickness distribution of the sheet metal will decrease the production cost and time of the material to be formed. In this paper, a multi-stage deep drawing simulation of an industrial part is presented using the finite element method. The entire production sequence, including additional operations such as intermediate annealing and springback, was simulated with the ABAQUS software under axisymmetric conditions. Simulation results such as the sheet thickness distribution, the punch force, and the residual stresses were extracted at each stage, and the sheet thickness distribution was compared with experimental results. The comparison showed that the FE model is in close agreement with the experiments.
Abstract: Cryptography provides a secure means of transmitting information over an insecure channel. It authenticates messages based on a key, not on the user, and it requires a lengthy key to encrypt and decrypt the sent and received messages. These keys, however, can be guessed or cracked. Moreover, maintaining and sharing lengthy random keys for enciphering and deciphering is a critical problem in cryptographic systems. A new approach is described for generating a cryptographic key from a person's iris pattern. In the biometric field, a template created by a biometric algorithm can only be authenticated by the same person. Among biometric templates, iris features can efficiently distinguish individuals and produce fewer false positives in a large population. The distribution of iris codes exhibits little intra-class variability, which aids the cryptosystem in confidently decrypting messages with an exact matching of the iris pattern. In the proposed approach, the iris features are extracted using multi-resolution wavelets, producing a 135-bit iris code for each subject that is used for encrypting and decrypting the messages. Autocorrelators are used to recall the original messages from the partially corrupted data produced by the decryption process. The approach aims to resolve the repudiation and key management problems. Results were analyzed for both a conventional iris cryptography system (CIC) and a non-repudiation iris cryptography system (NRIC), showing that the new approach provides considerably high authentication in the enciphering and deciphering processes.
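Iris-code comparison of the kind described is commonly done by normalized Hamming distance between bit strings; a minimal sketch using the paper's 135-bit code length. The noise model, the number of flipped bits, and the decision by distance comparison are illustrative assumptions, and the paper's autocorrelator-based recall is not modeled here.

```python
import numpy as np

rng = np.random.default_rng(3)
CODE_BITS = 135                        # code length used in the paper

enrolled = rng.integers(0, 2, CODE_BITS)

# Same eye re-imaged: a few bits flip due to noise (8 assumed here)
probe_same = enrolled.copy()
probe_same[rng.choice(CODE_BITS, size=8, replace=False)] ^= 1

# A different person's iris: statistically independent bits
probe_other = rng.integers(0, 2, CODE_BITS)

def hamming_distance(a, b):
    """Fraction of disagreeing bits between two iris codes."""
    return float(np.mean(a != b))

hd_same = hamming_distance(enrolled, probe_same)    # small: same eye
hd_other = hamming_distance(enrolled, probe_other)  # near 0.5: impostor
```

The gap between the genuine and impostor distances is what the low intra-class variability of iris codes buys the cryptosystem.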
Abstract: Rainfall data at fine resolution, and knowledge of its characteristics, play a major role in the efficient design and operation of agricultural, telecommunication, runoff and erosion control, and water quality control systems. This paper studies the statistical distribution of hourly rainfall depth at 12 representative stations spread across Peninsular Malaysia. Hourly rainfall data covering periods of 10 to 22 years were collected and their statistical characteristics estimated. Three probability distributions, namely the Generalized Pareto, Exponential, and Gamma distributions, were proposed to model the hourly rainfall depth, and three goodness-of-fit tests, namely the Kolmogorov-Smirnov, Anderson-Darling, and Chi-squared tests, were used to evaluate their fit. The results indicate that the east coast of the Peninsula receives a higher depth of rainfall than the west coast, although the rainfall frequency is found to be irregular. The goodness-of-fit tests show that all three models fit the rainfall data at the 1% level of significance; however, the Generalized Pareto fits better than the Exponential and Gamma distributions and is therefore recommended as the best fit.
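The fit-and-test workflow can be sketched with standard tools: fit each candidate distribution by maximum likelihood and compare Kolmogorov-Smirnov statistics. The rainfall series below is synthetic stand-in data (gamma-distributed, so gamma should win here), not the Malaysian records, and only the KS test of the paper's three is shown.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic stand-in for observed hourly rainfall depths (mm)
depth = rng.gamma(shape=0.8, scale=5.0, size=2000)

candidates = {"genpareto": stats.genpareto, "expon": stats.expon,
              "gamma": stats.gamma}

ks_stat = {}
for name, dist in candidates.items():
    params = dist.fit(depth, floc=0.0)     # depths start at zero
    ks_stat[name] = stats.kstest(depth, name, args=params).statistic

best = min(ks_stat, key=ks_stat.get)       # smallest KS statistic fits best
```

In practice the same loop is run per station, and the Anderson-Darling and Chi-squared tests are applied alongside the KS test.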
Abstract: In this paper, we first introduce the stable distribution, stable processes, and their characteristics. The α-stable distribution family has received great interest in the last decade due to its success in modeling data that are too impulsive to be accommodated by the Gaussian distribution. In the second part, we survey major applications of the α-stable distribution in telecommunications and computer science, such as network delays and signal processing, and in financial markets. Finally, we focus on using the stable distribution to estimate measures of risk in stock markets, and we present simulated data produced with statistical software.
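Symmetric α-stable variates can be generated with the Chambers-Mallows-Stuck transform, which makes the heavy tails that motivate the family easy to see next to Gaussian samples. A minimal sketch (β = 0, unit scale; the α value is illustrative):

```python
import numpy as np

def stable_rvs(alpha, size, rng):
    """Chambers-Mallows-Stuck sampler for symmetric alpha-stable variates
    (beta = 0, unit scale)."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)
    w = rng.exponential(1.0, size)
    if alpha == 1.0:
        return np.tan(u)                       # Cauchy special case
    return (np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
            * (np.cos((1.0 - alpha) * u) / w) ** ((1.0 - alpha) / alpha))

rng = np.random.default_rng(7)
x = stable_rvs(1.5, 100_000, rng)              # impulsive, heavy-tailed
g = rng.standard_normal(100_000)               # Gaussian benchmark
tail_stable = np.mean(np.abs(x) > 10.0)        # far exceedances do occur
tail_gauss = np.mean(np.abs(g) > 10.0)         # essentially never
```

The power-law tails (P(|X| > x) ~ x^(-α) for α < 2) are why quantile-based risk measures on such samples far exceed their Gaussian counterparts.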
Abstract: Distributed generation (DG) has received increasing attention in recent years. The impact of DG on various aspects of distribution system operation, such as reliability and energy loss, depends highly on the DG location in the distribution feeder. Optimal DG placement is an important subject which has not yet been fully explored.

This paper presents an optimization method to determine the optimal DG placement, based on a cost/worth analysis approach. The method considers technical and economic factors such as energy loss, load-point reliability indices, and DG costs, and, in particular, the portability of DG. The proposed method is applied to a test system, and the impacts of different parameters, such as the load growth rate and the load forecast uncertainty (LFU), on the optimum DG location are studied.
Abstract: LSP routing is among the prominent issues in MPLS network traffic engineering. The objective of this routing is to increase the number of accepted requests while guaranteeing quality of service (QoS). The requested bandwidth is the most important QoS criterion considered in the literature, and various heuristic algorithms have been presented in that regard. Many of these algorithms keep flows away from the bottlenecks of the network in order to perform load balancing, which impedes optimum operation of the network. Here, a new routing algorithm, MIRAD, is proposed: with only limited information about the network topology and the links' residual bandwidth, and no knowledge of prospective requests, it provides every request with maximum bandwidth as well as minimum end-to-end delay via a uniform load distribution across the network. Simulation results show that the proposed algorithm is more efficient than similar algorithms.
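A routing rule in this spirit, maximizing the bottleneck residual bandwidth and then minimizing end-to-end delay, can be sketched with two Dijkstra passes. This is a generic widest-shortest-path sketch under that assumed objective, not the MIRAD algorithm itself; the example topology is hypothetical.

```python
import heapq

def widest_shortest_path(graph, src, dst):
    """graph[u] = [(v, residual_bw, delay), ...] (directed links).
    Pass 1: maximize the minimum residual bandwidth from src to dst.
    Pass 2: minimize total delay over links meeting that bandwidth."""
    # Pass 1: bottleneck Dijkstra (widest path)
    width = {src: float("inf")}
    heap = [(-float("inf"), src)]
    while heap:
        w, u = heapq.heappop(heap)
        w = -w
        if w < width.get(u, 0):
            continue                      # stale heap entry
        for v, bw, _ in graph.get(u, []):
            cand = min(w, bw)
            if cand > width.get(v, 0):
                width[v] = cand
                heapq.heappush(heap, (-cand, v))
    bottleneck = width.get(dst, 0)
    # Pass 2: shortest delay over the pruned subgraph
    dist, prev, heap = {src: 0.0}, {}, [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, bw, delay in graph.get(u, []):
            if bw < bottleneck:
                continue                  # link too narrow for this request
            nd = d + delay
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1], bottleneck, dist[dst]

graph = {
    "A": [("B", 10.0, 1.0), ("C", 5.0, 0.5)],
    "B": [("D", 10.0, 1.0)],
    "C": [("D", 5.0, 0.5)],
}
path, bw, delay = widest_shortest_path(graph, "A", "D")
```

Here the faster A-C-D path is rejected because its 5-unit bottleneck would starve the request, illustrating the bandwidth-first, delay-second ordering.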