Abstract: In the present research, steam cracking of two types of
feedstock, i.e., naphtha and ethane, is simulated for the Pyrocrack 1-1
and Pyrocrack 2-2 coil configurations, considering two key parameters,
coil outlet temperature (COT) and coil capacity, using a radical-based
kinetic model. The computer model is validated against industrial data
obtained from the Amirkabir Petrochemical Complex. The results are in
good agreement with performance data for naphtha cracking over a
wide range of severity (0.4-0.7) and for ethane cracking at various
conversions (50-70%). It was found that the Pyrocrack 2-2 coil type is
an appropriate choice for steam cracking of ethane, giving a reasonable
ethylene yield at a much lower tube wall temperature, while the
Pyrocrack 1-1 coil type is a proper selection for liquid feedstocks such
as naphtha: it can crack liquid feedstocks at optimal ethylene yield
without exceeding the maximum allowable tube temperature.
Abstract: The present paper considers the steady free
convection boundary layer flow of a viscoelastic fluid with constant
temperature in the presence of heat generation. The boundary layer
equations are one order higher than those for a Newtonian (viscous)
fluid, and the adherence boundary conditions are insufficient to
determine the solution of these equations completely. The governing
boundary layer equations are first transformed into non-dimensional
form using a suitable dimensionless group. Computations are
performed numerically using the Keller-box method, augmented with
an extra boundary condition at infinity, and the results are displayed
graphically to illustrate the influence of the viscoelastic parameter K,
the heat generation parameter γ, and the Prandtl number Pr on the
velocity and temperature profiles. Results for the surface shear stress,
in terms of the local skin friction, and the surface rate of heat transfer,
in terms of the local Nusselt number, are obtained for a selection of
values of the heat generation parameter γ (= 0.0, 0.2, 0.5, 0.8, 1.0) and
presented in both tabular and graphical formats. In the absence of
internal heat generation in the fluid domain (γ = 0.0), the present
numerical results show excellent agreement with previous publications.
Abstract: Lurking behavior is common in information-seeking oriented communities. Converting lurkers into contributors can help virtual communities obtain competitive advantages. Based on the ecological cognition framework, this study proposes a model to examine the antecedents of lurking behavior in information-seeking oriented virtual communities. This study argues that the desires for emotional support, information support, performance-approach, performance-avoidance, mastery-approach, mastery-avoidance, ability trust, benevolence trust, and integrity trust affect lurking behavior. This study offers an approach to understanding the determinants of lurking behavior in online contexts.
Abstract: This paper studies the use of nonparametric
models for Gross National Product data in Turkey and the Stanford
heart transplant data. Two nonparametric techniques, smoothing
spline and kernel regression, are discussed. The main goal is to
compare the techniques used for prediction with the nonparametric
regression models. According to the results of the numerical studies,
it is concluded that the smoothing spline regression estimators
outperform those of the kernel regression.
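The Nadaraya-Watson estimator underlying kernel regression, one of the two techniques compared above, can be sketched in a few lines of NumPy. The synthetic data and bandwidth below are purely illustrative and are not the GNP or heart-transplant data used in the study:

```python
import numpy as np

def kernel_regression(x_train, y_train, x_eval, bandwidth):
    """Nadaraya-Watson estimator with a Gaussian kernel."""
    # Pairwise scaled distances between evaluation and training points
    d = (x_eval[:, None] - x_train[None, :]) / bandwidth
    w = np.exp(-0.5 * d**2)               # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)  # locally weighted average

# Illustrative synthetic data: noisy sine curve
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + rng.normal(0, 0.2, size=x.size)

y_hat = kernel_regression(x, y, x, bandwidth=0.3)
mse = np.mean((y_hat - np.sin(x)) ** 2)  # error vs. the true curve
```

The bandwidth plays the same bias-variance role that the smoothing parameter plays for the spline, which is what makes the two estimators directly comparable.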
Abstract: The histogram plays an important statistical role in digital
image processing. However, the existing quantum image models
cannot support this kind of statistical processing because
different gray scales are not distinguishable. In this paper, a novel
quantum image representation model is first proposed, in which
pixels with different gray scales can be distinguished and operated on
simultaneously. Based on the new model, a fast quantum algorithm for
constructing the histogram of a quantum image is designed. Performance
comparison reveals that the new quantum algorithm achieves an
approximately quadratic speedup over its classical counterpart. The
proposed quantum model and algorithm are significant for
future research in quantum image processing.
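For reference, the classical counterpart against which the quadratic speedup is measured is the ordinary O(N) gray-level histogram, sketched minimally below (the image is random illustrative data, not one from the paper):

```python
import numpy as np

def gray_histogram(image, levels=256):
    """Classical O(N) gray-level histogram: one counter
    increment per pixel, one bin per gray scale."""
    return np.bincount(image.ravel(), minlength=levels)

# Illustrative 8-bit test image
rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
hist = gray_histogram(img)
```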
Abstract: In this study, we used a two-stage process with
potassium hydroxide (KOH) to transform waste biomass (rice straw)
into activated carbon and then evaluated the capacity of this
adsorbent for removing carbofuran from aqueous solution. The
activated carbon removed carbofuran quickly and effectively because
of its high surface area. The native and carbofuran-loaded adsorbents
were characterized by elemental analysis. Different adsorption
parameters, such as the initial carbofuran concentration, contact time,
temperature, and pH, were studied using a batch system. This study
demonstrates that rice straw-derived activated carbon can be very
effective in the adsorption of carbofuran from bodies of water.
Abstract: In this paper, the application of a neuro-fuzzy system to the equalization of channel distortion is considered. The structure and operation algorithm of the neuro-fuzzy equalizer are described. Using a neuro-fuzzy equalizer in digital signal transmission reduces the parameter training time and the complexity of the network. A simulation of the neuro-fuzzy equalizer is performed. The obtained results confirm the efficiency of applying neuro-fuzzy technology to channel equalization.
Abstract: Electrical resistivity is a fundamental parameter of metals and other electrical conductors. Since resistivity is a function of temperature, a temperature-dependent theoretical model is needed in order to completely understand the behavior of metals. A model based on physics principles has recently been developed to obtain an equation that relates electrical resistivity to temperature. This equation depends on a parameter associated with the electron travel time before scattering, and on a parameter that relates the energy of the atoms to their separation distance. Analysis of the energy parameter reveals that the equation is optimized if the proportionality term in the equation is not constant but varies over the temperature range. Additional analysis reveals that the theoretical equation can be used to determine the mean free path of conduction electrons, the number of defects in the atomic lattice, and the ‘equivalent’ charge associated with the metallic bonding of the atoms. This analysis validates the theoretical model and provides insight into the behavior of metals whose performance is affected by temperature (e.g., integrated circuits and temperature sensors).
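As a point of comparison only, the familiar textbook linear resistivity-temperature model can be sketched as follows. This is the standard first-order baseline, not the paper's equation; the scattering-time and energy parameters of the new model are not reproduced here, and the copper constants are ordinary handbook values:

```python
def resistivity(T, rho0, alpha, T0=293.15):
    """Textbook linear model: rho(T) = rho0 * (1 + alpha * (T - T0)).
    Baseline only; NOT the temperature-dependent equation of the paper."""
    return rho0 * (1.0 + alpha * (T - T0))

# Copper near room temperature (handbook values)
rho_cu_20C = 1.68e-8   # ohm*m at 293.15 K
alpha_cu = 3.9e-3      # temperature coefficient, 1/K
rho_100C = resistivity(373.15, rho_cu_20C, alpha_cu)
```

The linear model fails well below and above the range it was fitted to, which is exactly the regime where a physics-based model with a varying proportionality term becomes useful.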
Abstract: We propose a multi-agent based utilitarian approach
to model and understand information flows in social networks that
lead to Pareto optimal informational exchanges. We model the
individual expected utility function of the agents to reflect the net
value of information received. We show how this model, adapted
from a theorem by Karl Borch dealing with an actuarial Risk
Exchange concept in the Insurance industry, can be used for social
network analysis. We develop a utilitarian framework that allows us
to interpret Pareto optimal exchanges of value as potential
information flows, while achieving a maximization of a sum of
expected utilities of information of the group of agents. We examine
some interesting conditions on the utility function under which the
flows are optimal. We illustrate the promise of this new approach to
attach economic value to information in networks with a synthetic
example.
Abstract: This study extends research on the relationship
between marketing strategy and market segmentation by
investigating market segments in the cement industry.
Competitive strength and the rivals' distance from the factory were
used as business environment factors. Three segments (positive,
neutral/indifferent, and zero zones) were identified as strategic
segments. For each segment, a marketing strategy (aggressive,
defensive, or decline) was developed. This study employed data from
the cement industry to fulfill two objectives: first, to give a framework
for the segmentation of the cement industry, and second, to develop
marketing strategies for varying competitive strength. Fifty-six
questionnaires containing closed- and open-ended questions were
collected and analyzed. The results supported the theory that segments
tend to become more aggressive than defensive as competitive strength
increases. It is concluded that high-strength segments follow total
market coverage, concentric diversification, and frontal attacks on their
competitors. With decreased competitive strength, businesses tend to
follow a multi-market strategy, product modification/improvement, and
flank attacks on direct competitors. Segments with weak competitive
strength followed a focus strategy and a decline strategy.
Abstract: Along with the forward supply chain, organizations need
to consider the impact of reverse logistics due to its economic
advantages, social awareness, and strict legislation. In this paper, we
develop a system dynamics framework for a closed-loop supply
chain with fuzzy demand and fuzzy collection rate by incorporating
product exchange policy in forward channel and various recovery
options in reverse channel. The uncertainty issues associated with
acquisition and collection of used product have been quantified using
possibility measures. In the simulation study, we analyze order
variation at both retailer and distributor level and compare bullwhip
effects of different logistics participants over time between the
traditional forward supply chain and the closed-loop supply chain.
Our results suggest that the integration of reverse logistics can reduce
order variation and bullwhip effect of a closed-loop system. Finally,
sensitivity analysis is performed to examine the impact of various
parameters on recovery process and bullwhip effect.
Abstract: The Peng-Robinson (PR) cubic equation of state (EoS) is extended to polymers by using a single set of energy parameters (A1, A2, A3) and a co-volume parameter (b) per polymer, fitted to experimental volume data. Excellent results for the volumetric behavior of the 11 polymers at pressures up to 2000 bar are obtained. The EoS is applied to the correlation and prediction of Henry's constants in polymer solutions comprising three polymers and many nonpolar and polar solvents, including supercritical gases. The correlation achieved with two adjustable parameters is satisfactory compared with the experimental data. As a result, the present work provides a simple and useful model for the prediction of Henry's constant in polymer-containing systems, including those containing polar, nonpolar, and supercritical fluids.
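The standard Peng-Robinson EoS that the paper extends can be sketched by solving its cubic for the compressibility factor. The polymer-specific fitted energy parameters (A1, A2, A3) are not reproduced here; the alpha function and constants below are the original PR correlation, and the fluid data are ordinary handbook values for methane:

```python
import numpy as np

R = 8.314  # J/(mol*K)

def pr_z_factor(T, P, Tc, Pc, omega):
    """Vapor-like compressibility factor from the standard
    Peng-Robinson EoS (NOT the paper's polymer extension)."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc))) ** 2
    a = 0.45724 * R**2 * Tc**2 / Pc * alpha   # energy parameter
    b = 0.07780 * R * Tc / Pc                 # co-volume parameter
    A = a * P / (R * T) ** 2
    B = b * P / (R * T)
    # Cubic in Z: Z^3 - (1-B)Z^2 + (A - 3B^2 - 2B)Z - (AB - B^2 - B^3) = 0
    coeffs = [1.0, -(1.0 - B), A - 3.0 * B**2 - 2.0 * B,
              -(A * B - B**2 - B**3)]
    roots = np.roots(coeffs)
    real = roots[np.abs(roots.imag) < 1e-10].real
    return real.max()  # largest real root = vapor phase

# Methane at 300 K and 10 bar (Tc = 190.6 K, Pc = 45.99 bar, omega = 0.011)
Z = pr_z_factor(300.0, 10e5, 190.6, 45.99e5, 0.011)
```

At these near-ideal conditions Z lies just below 1; the polymer extension in the paper keeps this cubic structure while refitting the temperature dependence of the energy term.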
Abstract: When programming in languages such as C, Java, etc.,
it is difficult to reconstruct the programmer's ideas from the
program code alone. This occurs mainly because many of the
programmer's ideas behind the implementation are not recorded in the
code during implementation. For example, physical aspects of
computation such as spatial structures, activities, and the meaning of
variables are not required as instructions to the computer and are often
excluded. This makes future reconstruction of the original ideas
difficult. AIDA, a multimedia programming language based on the
cyberFilm model, addresses these problems by allowing developers to
describe the ideas behind programs
using advanced annotation methods as a natural extension to
programming. In this paper, a development environment that
implements the AIDA language is presented with a focus on the
annotation methods. In particular, an actual scientific numerical
computation code is created and the effects of the annotation methods
are analyzed.
Abstract: Context awareness is a capability whereby mobile
computing devices can sense their physical environment and adapt
their behavior accordingly. The term context-awareness, in
ubiquitous computing, was introduced by Schilit in 1994 and has
become one of the most exciting concepts in early 21st-century
computing, fueled by recent developments in pervasive computing
(i.e. mobile and ubiquitous computing). These include computing
devices worn by users, embedded devices, smart appliances, sensors
surrounding users and a variety of wireless networking technologies.
Context-aware applications use context information to adapt
interfaces, tailor the set of application-relevant data, increase the
precision of information retrieval, discover services, make the user
interaction implicit, or build smart environments. For example, a
context-aware mobile phone can detect that the user is currently in a
meeting room and reject unimportant calls. One of the major
challenges in providing users with context-aware services lies in
continuously monitoring their contexts through numerous sensors
connected to the context-aware system over wireless
communication. A number of sensor-based context-aware frameworks
have been proposed, but many of them neglect the
fact that monitoring with sensors imposes heavy workloads on
ubiquitous devices with limited computing power and battery life. In
this paper, we present CALEEF, a lightweight and energy-efficient
context-aware framework for resource-limited ubiquitous devices.
Abstract: Localization is one of the critical issues in the field of
robot navigation. With an accurate estimate of the robot pose, robots will be capable of navigating in the environment autonomously and efficiently. In this paper, a hybrid Distributed Vision System (DVS)
for robot localization is presented. The approach integrates
odometry data from the robot and images captured by overhead
cameras installed in the environment to reduce the likelihood of
localization failure due to illumination effects, accumulated encoder
errors, and low-quality range data. An odometry-based motion model is applied to predict robot poses, and the robot images captured by the
overhead cameras are then used to update the pose estimates with an HSV histogram-based measurement model. Experimental results show that the
presented approach can localize robots in a global world coordinate system with localization errors within 100 mm.
Abstract: A new stochastic algorithm called Probabilistic Global Search Johor (PGSJ) has recently been established for the global optimization of nonconvex real-valued problems on finite-dimensional Euclidean space. In this paper, we present a convergence guarantee for this algorithm in the probabilistic sense, without imposing any additional conditions. We then use this algorithm together with the control
parameterization technique to solve constrained optimal control problems. Numerical simulations are included to illustrate the efficiency and effectiveness of the PGSJ algorithm in the solution of control problems.
Abstract: The purpose of this study is to revisit the concept of
rape as represented by professionals in the literature as well as its
perception (beliefs and attitudes) in the population at large and to
propose methodological improvements to its measurement tool. Rape
is a serious crime threatening its victim's physical and mental health
and integrity, and as such is legally prosecuted in all modern
societies. The problem is not in accepting or rejecting rape as a
criminal act, but rather in the vagueness of its interpretations and
"justifications" maintained in the mentality of modern societies,
known in the literature as the phenomenon of the "rape myth". The
rape myth can be studied from different perspectives: criminology,
sociology, ethics, medicine, and psychology. Its investigation requires
rigorous scientific objectivity, free of passion (victims of rape are at
risk of emotional bias), free of activism (social activists, even if
well-intentioned, are also biased), and free of any pre-emptive
assumptions or prejudices. To apply a rigorous scientific procedure,
we need a solid, valid, and reliable measurement. Rape is a form of
heterosexual or homosexual aggression, violently forcing the victim
to give in to the sexual activity of the aggressor against her/his will.
Human beings always try to "understand" or find a reason justifying
their acts. The psychological literature provides multiple clinical and
experimental examples of this; consider the famous studies by
Milgram on the level of electroshock delivered by the "teacher" to the
"learner" when "scientifically justified", or the studies on the behavior
of "prisoners" and "guards", among many other experiments and field
observations. Sigmund Freud described the phenomenon of
unconscious justification and called it rationalization. The multiple
justifications, rationalizations, and repeated opinions about sexual
behavior contribute to a myth maintained in society. What kind of
"rationale" do our societies apply to "understand" non-consensual
sexual behavior? There are many; to mention a few:
• Sex is a ludic activity for both participants; therefore,
even if not consented to, it should bring pleasure to both.
• Everybody wants sex, but only men are allowed to manifest
it openly while women have to pretend the opposite, thus men have
to initiate sexual behavior and women would follow.
• A person who strongly needs sex is free to manifest it and
struggle to get it; a person who doesn't want it must not reveal
her/his sexual attraction and must avoid risky situations; otherwise
she/he is perceived as a promiscuous seducer.
• A person who doesn't fight against the sexual initiator
unconsciously accepts the rape (does this explain why homosexual
rapes are reported less frequently than rapes against women?).
• Women who are raped deserve it because their wardrobe is
very revealing and seductive and they "willingly" go to highly risky
places (alleys, dark roads, etc.).
• Men need to ventilate their sexual energy and if they are
deprived of a partner their urge to have sex is difficult to control.
• Men are supposed to initiate and insist even by force to have
sex (their testosterone makes them both sexual and aggressive).
The paper overviews numerous cultural beliefs about masculine
versus feminine behavior and their impact on the “rape myth".
Abstract: Knowledge of the nature of loading is very
important in order to account for the overall behavior, such as
vibration, shock, and fatigue. Fatigue accounts for roughly 90% of
failures, and fatigue loadings can be very complex. In this paper, a
study of a double through-crack at a hole in a plate subjected to
fatigue loading is presented. Various loading modes are studied under
the same applied load. The fatigue life is given, and the effect of the
stress ratio is highlighted. This work is conducted on aluminum alloy
2024 T351, which is widely used in aerospace and aeronautical
applications. The constant-amplitude fatigue crack growth behavior is
studied using the AFGROW code with the Forman model. The
fatigue crack growth rate and fatigue life for the different loading
modes are compared while varying other geometrical parameters,
such as the thickness and the dimensions of the notch hole.
Abstract: In this paper, periodically forced operation of a wastewater treatment process is studied with the aim of improving process performance. A previously developed dynamic model of the process is used to conduct the performance analysis. The static version of the model is first used to determine the optimal productivity conditions for the process. Then, the feed flow rate, expressed as the dilution rate D, is made a sinusoidal function of time. A nonlinear model predictive control algorithm is used to regulate the amplitude and period of the sinusoidal function. The parameters of the cyclic feed function are determined that yield higher productivity than the optimal productivity under steady-state conditions. The improvement in productivity is found to be marginal, while the substrate conversion is satisfactory compared with both the optimal condition and the steady-state condition corresponding to the average value of the periodic function. Successful results were also obtained in the presence of modeling errors and external disturbances.
Abstract: Many measures have been proposed for machine
translation evaluation (MTE), while little research has been done on
the performance of the MTE methods themselves. This paper is an
effort toward MTE performance analysis. A general framework is
proposed for describing the MTE measure and the test suite,
covering whether the automatic measure is consistent with human
evaluation, whether results from different measures or test suites are
consistent, whether the content of the test suite is suitable for
performance evaluation, the degree of difficulty of the test suite and
its influence on the MTE, the relationship between the significance of
MTE results and the size of the test suite, etc. To clarify the
framework, several experimental results are analyzed, relating human
evaluation, BLEU evaluation, and typological MTE. A visualization
method is introduced for better presentation of the results. The study
aims to aid test suite construction and method selection in MTE
practice.
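A minimal sketch of single-reference sentence BLEU, one of the measures analyzed above, following the standard formula (uniform n-gram weights plus brevity penalty); this is illustrative only and not the paper's exact evaluation setup:

```python
import math
from collections import Counter

def bleu(hypothesis, reference, max_n=4):
    """Single-reference sentence BLEU: geometric mean of clipped
    n-gram precisions times a brevity penalty. No smoothing."""
    hyp, ref = hypothesis.split(), reference.split()
    log_prec = 0.0
    for n in range(1, max_n + 1):
        hyp_ngrams = Counter(tuple(hyp[i:i+n]) for i in range(len(hyp) - n + 1))
        ref_ngrams = Counter(tuple(ref[i:i+n]) for i in range(len(ref) - n + 1))
        # Clipped matches: each hypothesis n-gram counts at most as
        # often as it appears in the reference
        match = sum(min(c, ref_ngrams[g]) for g, c in hyp_ngrams.items())
        total = max(sum(hyp_ngrams.values()), 1)
        if match == 0:
            return 0.0  # any zero precision zeroes the geometric mean
        log_prec += math.log(match / total) / max_n
    # Brevity penalty for hypotheses shorter than the reference
    bp = 1.0 if len(hyp) > len(ref) else math.exp(1 - len(ref) / max(len(hyp), 1))
    return bp * math.exp(log_prec)
```

A perfect match scores 1.0, and a hypothesis sharing no n-grams with the reference scores 0.0, which is why consistency with human judgments, as discussed above, has to be checked empirically in between these extremes.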