Abstract: Perishable goods constitute a large portion of retailer inventory and lose value over time due to deterioration and/or obsolescence. Retailers dealing with such goods must consider the short shelf life and the dependency of sales on displayed inventory when determining the optimal procurement policy. Many retailers follow the practice of using two bins: the primary bin sells fresh items at the list price, and the secondary bin sells, at a discounted price, unsold items transferred from the primary bin once they reach a certain age. In this paper, mathematical models are developed for the primary and secondary bins that maximize profit, with the order quantities, the review period and the selling price at the secondary bin as decision variables. The demand rates in the two bins are assumed to be deterministic and dependent on displayed inventory level, price and age, but independent of each other. The validity of the model is shown by solving an example, and a sensitivity analysis of the model is also reported.
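To make the structure of such a model concrete, a hypothetical single-bin sketch (all symbols here are assumptions for illustration, not the paper's actual notation) couples a stock- and price-dependent demand rate with a deterioration term and an average-profit objective:

```latex
% Illustrative sketch only; \alpha,\beta,\gamma,\theta,c,K,h are assumed parameters
\frac{dI}{dt} = -D\bigl(I(t),p\bigr) - \theta\, I(t),
\qquad
D(I,p) = (\alpha + \beta I)\, e^{-\gamma p},
\qquad
\Pi(Q,T,p) = \frac{1}{T}\Bigl[\, p \int_0^T D\bigl(I(t),p\bigr)\,dt - cQ - K - h \int_0^T I(t)\,dt \Bigr].
```

Here $I(t)$ is the displayed inventory, $\theta$ the deterioration rate, $c$ the unit cost, $K$ the fixed ordering cost and $h$ the holding cost; the actual two-bin model additionally has an age-triggered transfer and a second demand rate for the discounted bin.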
Abstract: A numerical study of the influence of electroosmotic flow on analyte preconcentration by isotachophoresis (ITP) is made. We consider the case in which the double-layer-induced electroosmotic flow (EOF) counterbalances the electrophoretic velocity, so that stationary ITP stacked zones result. We solve the Navier-Stokes equations coupled with the Nernst-Planck equations to determine the local convective velocity and the preconcentration dynamics of ions. Our numerical algorithm is based on a finite volume method along with a second-order upwind scheme. The present numerical algorithm accurately captures the sharp boundaries of step changes (plateau mode) and zones of steep gradients (peak mode). The convection of ions due to EOF reduces the resolution of the ITP transition zones and produces dispersion in the analyte zones. The role of the electrokinetic parameters that induce dispersion is analyzed. A one-dimensional model for the area-averaged concentrations, based on a Taylor-Aris-type effective diffusivity, is found to be in good agreement with the computed solutions.
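For context, the classical Taylor-Aris result for dispersion in a circular capillary of radius $a$ illustrates the form of such an area-averaged effective diffusivity (the numerical coefficient depends on the velocity profile, so the value for the EOF-perturbed flow considered in the paper may differ):

```latex
D_{\mathrm{eff}} = D\left(1 + \frac{\mathrm{Pe}^2}{48}\right),
\qquad
\mathrm{Pe} = \frac{\bar{u}\,a}{D},
```

where $\bar{u}$ is the mean axial velocity and $D$ the molecular diffusivity of the ion.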
Abstract: This paper presents a complete discrete statistical framework based on a novel vector quantization (VQ) front-end process. This new VQ approach performs an optimal distribution of VQ codebook components over HMM states. This technique, which we call distributed vector quantization (DVQ) of hidden Markov models, succeeds in unifying the acoustic micro-structure and the phonetic macro-structure when the HMM parameters are estimated. The DVQ technique is implemented through two variants. The first variant uses the K-means algorithm (K-means-DVQ) to optimize the VQ, while the second exploits the classification behavior of neural networks (NN-DVQ) for the same purpose. The proposed variants are compared with an HMM-based baseline system in experiments on the recognition of specific Arabic consonants. The results show that the distributed vector quantization technique increases the performance of the discrete HMM system.
Abstract: Facial expression analysis is rapidly becoming an area of intense interest in the computer science and human-computer interaction design communities. The most expressive way humans display emotions is through facial expressions. In this paper we present a method to analyze facial expressions from images by applying the Gabor wavelet transform (GWT) and the Discrete Cosine Transform (DCT) to face images. A Radial Basis Function (RBF) network is used to classify the facial expressions. As a second stage, the images are preprocessed to enhance edge details, and non-uniform down-sampling is applied to reduce the computational complexity and processing time. Our method works reliably even with faces carrying strong expressions.
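As a minimal sketch of the Gabor feature-extraction step (window size, wavelength and orientations below are illustrative choices, not the paper's settings):

```python
import numpy as np

def gabor_kernel(size=15, sigma=3.0, theta=0.0, lam=6.0, psi=0.0):
    """Real part of a Gabor filter: a Gaussian-windowed sinusoid.
    All parameter values here are illustrative, not the paper's."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)      # rotate coordinates by theta
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * xr / lam + psi)
    return envelope * carrier

# A small bank of orientations; features are obtained by convolving the
# face image with each kernel and pooling the responses.
bank = [gabor_kernel(theta=t) for t in np.linspace(0, np.pi, 4, endpoint=False)]
```

In a full pipeline, the DCT would then compact the filtered responses before they are fed to the RBF classifier.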
Abstract: The effects of different parameters on the hydrodynamics of trickle bed reactors are discussed for Newtonian and non-Newtonian foaming systems. The parameters varied are liquid velocity, gas flow velocity and surface tension. The range of gas velocity is particularly large, thanks to the use of dense gas to simulate very high pressure conditions. This data bank has been used to compare the prediction accuracy of different trendlines and transition points from the literature. More than 240 experimental points for the trickle flow (GCF) and foaming pulsing flow (PF/FPF) regimes were obtained for the present study. Hydrodynamic characteristics involving dynamic liquid saturation are significantly influenced by gas and liquid flow rates. For the 15 and 30 ppm air-aqueous surfactant solutions, dynamic liquid saturation decreases considerably with higher liquid and gas flow rates in the high interaction regime. With a decrease in surface tension, i.e. for the 45 and 60 ppm air-aqueous surfactant systems, the effect was more pronounced: dynamic liquid saturation decreased very sharply during regime transition, even at low liquid and gas flow rates.
Abstract: In Southeast Asia, during the dry season (August to October), forest fires in Indonesia emit pollutants into the atmosphere. Over two years during this period, a total of 67 samples of 2.5 μm particulate matter were collected and analyzed for total mass and elemental composition with ICP-MS after microwave digestion. A study of 60 elements measured during these periods suggests that the concentrations of most elements, even those usually related to a crustal source, are extremely high and unpredictable during the haze period. By contrast, trace element concentrations in the non-haze months are more stable and cover a lower range. Other unexpected events and their effects on the findings are discussed.
Abstract: Economic dispatch (ED) is considered one of the key functions in electric power system operation, helping to build effective generation management plans. The practical ED problem has a non-smooth cost function with nonlinear constraints, which makes it difficult to solve effectively. This paper presents a novel heuristic and efficient optimization approach based on the new Bat Algorithm (BA) to solve the practical non-smooth economic dispatch problem. The proposed algorithm easily handles different constraints. In addition, two newly introduced modifications are developed to improve the diversity of the bat population while simultaneously increasing the convergence speed. The simulation results obtained by the proposed algorithms are compared with results obtained using other recently developed methods available in the literature.
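For readers unfamiliar with BA, a minimal sketch of the basic algorithm (without the paper's two modifications; the parameters and the sphere test function are illustrative assumptions) is:

```python
import numpy as np

def bat_algorithm(cost, dim=3, n_bats=20, iters=300, lb=-10.0, ub=10.0,
                  fmin=0.0, fmax=2.0, alpha=0.9, gamma=0.9, seed=0):
    """Minimal sketch of the basic Bat Algorithm (Yang, 2010) for a
    box-constrained minimization problem. All parameters are illustrative."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, (n_bats, dim))        # bat positions
    v = np.zeros((n_bats, dim))                   # bat velocities
    A = np.ones(n_bats)                           # loudness (acceptance prob.)
    r0 = rng.uniform(0.0, 1.0, n_bats)            # maximum pulse rates
    fit = np.array([cost(b) for b in x])
    best = x[fit.argmin()].copy()
    best_val = fit.min()
    for t in range(1, iters + 1):
        r = r0 * (1.0 - np.exp(-gamma * t))       # pulse rate grows over time
        for i in range(n_bats):
            f = fmin + (fmax - fmin) * rng.random()       # random frequency
            v[i] += (x[i] - best) * f
            cand = np.clip(x[i] + v[i], lb, ub)
            if rng.random() > r[i]:               # local random walk near best
                cand = np.clip(best + 0.1 * A.mean() * rng.standard_normal(dim),
                               lb, ub)
            fc = cost(cand)
            if fc <= fit[i] and rng.random() < A[i]:      # accept and get quieter
                x[i], fit[i] = cand, fc
                A[i] *= alpha
            if fc < best_val:                     # track global best
                best, best_val = cand.copy(), fc
    return best, best_val

best, val = bat_algorithm(lambda z: float(np.sum(z**2)))  # sphere test function
```

A practical ED solver would replace the sphere function with the fuel-cost function and add penalty or repair terms for the power-balance and generator-limit constraints.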
Abstract: The initial values of reference vectors have a significant influence on recognition accuracy in LVQ. Several existing techniques, such as SOM and k-means, are used for setting the initial values of reference vectors, each of which has provided some positive results. However, those results are not sufficient to improve recognition accuracy. This study proposes an ACO-based method for initializing reference vectors, with the aim of achieving recognition accuracy higher than that obtained with conventional methods. Moreover, we demonstrate the effectiveness of the proposed method by applying it to the wine data and English vowel data and comparing its results with those of conventional methods.
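A minimal sketch of the LVQ1 training that such an initialization feeds into (the ACO step itself is not shown; the initial prototypes are assumed to be given by whatever initialization scheme is used):

```python
import numpy as np

def lvq1_train(X, y, prototypes, proto_labels, lr=0.1, epochs=30,
               decay=0.95, seed=0):
    """Plain LVQ1: move the winning prototype toward a same-class sample
    and away from a different-class sample. The initial `prototypes` are
    exactly what an initialization scheme such as the proposed ACO method
    would supply (the ACO step itself is not shown)."""
    rng = np.random.default_rng(seed)
    W = np.asarray(prototypes, dtype=float).copy()
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            j = np.argmin(np.linalg.norm(W - X[i], axis=1))   # winner
            sign = 1.0 if proto_labels[j] == y[i] else -1.0
            W[j] += sign * lr * (X[i] - W[j])
        lr *= decay                                           # anneal the step
    return W

def lvq1_predict(X, W, proto_labels):
    d = np.linalg.norm(W[None, :, :] - X[:, None, :], axis=2)
    return np.asarray(proto_labels)[np.argmin(d, axis=1)]
```

Because the update only nudges the nearest prototype, final accuracy depends strongly on where the prototypes start, which is the motivation for better initialization.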
Abstract: Sputter deposition processes, especially for sputtering
from metal targets, are well investigated. For practical reasons, i.e.
for industrial processes, energetic considerations for sputter
deposition are useful in order to optimize the sputtering process. In
particular, for substrates at floating potential, the energetic conditions during film growth must enable sufficiently dense metal films of good quality. The influence of ion energies, energy
density and momentum transfer is thus examined both for sputtering
at the target as well as during film growth. Different regimes
dominated by ion energy, energy density and momentum transfer
were identified by using different plasma sources and by varying
power input, pressure and bias voltage.
Abstract: The finite element method was applied to model damage development in the femoral neck during a sideways fall. Femoral failure was simulated using the maximum principal strain criterion.
The evolution of damage was consistent with previous studies. It was
initiated by compressive failure at the junction of the superior aspect
of the femoral neck and the greater trochanter. It was followed by
tensile failure that occurred at the inferior aspect of the femoral neck
before a complete transcervical fracture was observed. The estimated
failure line was less than 50° from the horizontal plane (Pauwels type
II).
Abstract: Recent advancements in sensor technologies and
Wireless Body Area Networks (WBANs) have led to the
development of cost-effective healthcare devices which can be used
to monitor and analyse a person's physiological parameters from remote locations. These advancements provide a unique opportunity to overcome current healthcare challenges: low-quality service provisioning, lack of easy access to a variety of services, high service costs, and the growing elderly population experienced
globally. This paper reports on a prototype implementation of an
architecture that seamlessly integrates Wireless Body Area Network
(WBAN) with Web services (WS) to proactively collect
physiological data of remote patients to recommend diagnostic
services. Technologies based upon WBAN and WS can provide
ubiquitous accessibility to a variety of services by allowing
distributed healthcare resources to be massively reused to provide
cost-effective services without individuals physically moving to the
locations of those resources. In addition, these technologies can
reduce costs of healthcare services by allowing individuals to access
services to support their healthcare. The prototype uses WBAN body sensors implemented on Arduino Fio platforms, worn by the patient, and an Android smartphone as a personal server. The
physiological data are collected and uploaded through GPRS/internet
to the Medical Health Server (MHS) to be analysed. The prototype
monitors the activities, location and physiological parameters such as
SpO2 and Heart Rate of the elderly and patients in rehabilitation.
Medical practitioners would have real time access to the uploaded
information through a web application.
Abstract: PAX6, a transcription factor, is essential for the morphogenesis of the eyes, brain, pituitary and pancreatic islets. In rodents, the loss of Pax6 function leads to central nervous system defects, anophthalmia, and nasal hypoplasia. Haplo-insufficiency of Pax6 causes microphthalmia, aggression and other behavioral abnormalities. It is also required in brain patterning and neuronal plasticity. In humans, heterozygous mutation of PAX6 causes loss of the iris [aniridia], mental retardation and glucose intolerance. The 3' deletion in PAX6 leads to autism and aniridia. The phenotypes are variable in penetrance and expressivity. However, the mechanism of function and the interactions of PAX6 with other proteins during development and in the associated diseases are not clear. We intend to explore the interactors of PAX6 to elucidate the biology of PAX6 function in the tissues where it is expressed and also in the central regulatory pathway. This report describes in silico approaches to explore the interacting proteins of PAX6. The models show several proteins possibly interacting with PAX6, such as MITF, SIX3, SOX2, SOX3, IPO13, TRIM, and OGT. Since PAX6 is a critical transcriptional regulator and a master control gene of eye and brain development, it might interact with other proteins involved in morphogenesis [TGIF, TGF, Ras etc.]. It is also presumed that matricellular proteins [SPARC, thrombospondin-1 and osteonectin etc.] are likely to interact during the transport and processing of PAX6 and lie somewhere in its cascade. Proteins involved in cell survival and cell proliferation also cannot be ignored.
Abstract: Natural disasters, including earthquakes, kill many people around the world every year. The rescue actions that start after an earthquake, abbreviated LAST, comprise locating, access, stabilization and transportation. In the present article, we study the process of gaining local access to the injured and transporting them to health care centers. Given the heavy traffic load after an earthquake, the destruction of connecting roads and bridges, and the heavy debris in alleys and streets, which endanger the lives of the injured and of the people buried under the debris, accelerating rescue actions and facilitating access are obviously of great importance. Tehran, the capital of Iran, is among the most crowded cities in the world and is the center of extensive economic, political, cultural and social activities. Tehran has a population of about 9.5 million, swelled by immigration from the surrounding cities. Furthermore, considering that Tehran is located on two large and important faults, an earthquake of magnitude 6 in this city could lead to the greatest catastrophe in human history. The present study is a review, and a major part of the information required for it was obtained from libraries. Rescue vehicles around the world, including rescue helicopters, ambulances, fire-fighting vehicles and rescue boats, together with their applied technology, as well as robots specifically designed for rescue and their advantages and disadvantages, have been investigated. The studies show a significant relationship between the rescue team's arrival time at the incident zone and the number of people saved: if the duration of burial under debris is 30 minutes, the probability of survival is 99.3%; after one day it is 81%, after two days 19%, and after five days 7.4%. The existing transport systems all have some defects.
If these defects were removed, more people could be saved each hour and preparedness against natural disasters would increase. In this study, a transport system has been designed for the rescue team and the injured, which can carry the rescue team to the incident zone and the injured to health care centers. In addition, this system is able both to fly and to move on the ground, so that destroyed roads and heavy traffic cannot prevent the rescue team from arriving early at the incident zone. The system also has the equipment required for debris removal, optimal transport of the injured, and first aid.
Abstract: We theoretically demonstrate modulation of light polarization by a crossed rectangular hole array with asymmetric arm lengths. Two waveguide modes can modulate the x- and y-polarized incident waves independently. A specific structure is proposed to convert a left-handed incident wave into a right-handed outgoing wave in transmission.
Abstract: This paper presents a novel algorithm for stereo correspondence with the rank transform. In this algorithm we use a genetic algorithm to obtain an accurate disparity map. Genetic algorithms are efficient search methods based on the principles of population genetics, i.e. mating, chromosome crossover, gene mutation, and natural selection. Finally, morphological operations are employed to remove errors and discontinuities.
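A minimal sketch of the rank transform itself (the window size is an illustrative choice; the GA-based matching stage is not shown):

```python
import numpy as np

def rank_transform(img, win=5):
    """Rank transform: replace each pixel by the count of pixels in its
    surrounding window that are darker than it. This makes matching robust
    to radiometric differences between the two cameras. The window size is
    an illustrative choice."""
    img = np.asarray(img, dtype=float)
    h = win // 2
    out = np.zeros(img.shape)
    padded = np.pad(img, h, mode='edge')
    for dy in range(-h, h + 1):
        for dx in range(-h, h + 1):
            if dy == 0 and dx == 0:
                continue                         # skip the center pixel itself
            shifted = padded[h + dy: h + dy + img.shape[0],
                             h + dx: h + dx + img.shape[1]]
            out += shifted < img                 # neighbor darker than center
    return out
```

Matching costs between the left and right images are then computed on the transformed values rather than on raw intensities.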
Abstract: In this paper we study the effect of scene changes on an image sequence coding system using the Embedded Zerotree Wavelet (EZW). The scene change considered here is the full-motion change that may occur in a sequence. A special image sequence is generated in which scene changes occur randomly. Two scenarios are considered. In the first scenario, the system must provide the best possible reconstruction quality by managing the bit rate (BR) when a scene change occurs. In the second scenario, the system must keep the bit rate as constant as possible by managing the reconstruction quality. The first scenario is motivated by the availability of a wide-band transmission channel, where an increase in the bit rate is possible to keep the reconstruction quality above a given threshold. The second scenario concerns narrow-band transmission channels, where an increase in the bit rate is not possible; in this case, applications for which the reconstruction quality is not a constraint may be considered. The simulations are performed with a five-scale wavelet decomposition using the biorthogonal 9/7-tap filter bank. The entropy coding is performed using a specifically defined binary codebook and the EZW algorithm. Experimental results are presented and compared to LEAD H263 EVAL. It is shown that if the reconstruction quality is the constraint, the system increases the bit rate to obtain the required quality. In the case where the bit rate must be constant, the system is unable to provide the required quality when a scene change occurs; however, the system is able to improve the quality once the scene change has passed.
Abstract: The MATCH project [1] entails the development of an automatic diagnosis system that aims to support the treatment of colon cancer by discovering mutations that occur in tumour suppressor genes (TSGs) and contribute to the development of cancerous tumours. The system is based on a) colon cancer clinical data and b) biological information derived by data mining techniques from genomic and proteomic sources. The core mining module will consist of popular, well-tested hybrid feature extraction methods and new combined algorithms designed especially for the project. Elements of rough sets, evolutionary computing, cluster analysis, self-organizing maps and association rules will be used to discover the associations between genes and their influence on tumours [2]-[11].
The methods used to process the data have to address its high complexity, potential inconsistency and missing values. They must integrate all the useful information necessary to solve the expert's question. For this purpose, the system has to learn from the data, or allow a domain specialist to interactively specify, the part of the knowledge structure it needs to answer a given query. The program should also take into account the importance/rank of the particular parts of the data it analyses, and adjust the algorithms used accordingly.
Abstract: The major part of light weight timber constructions
consists of insulation. Mineral wool is the most commonly used
insulation due to its cost efficiency and easy handling. The fiber
orientation and porosity of this insulation material enable flow-through; its air flow resistance is low. If leakage occurs in the
insulated bay section, the convective flow may cause energy losses
and infiltration of the exterior wall with moisture and particles. In particular, infiltrated moisture may lead to thermal bridges and to the growth of health-endangering mould and mildew. In order to prevent
this problem, different numerical calculation models have been
developed. All models developed so far leave room for refinement. Implementing the flow-through properties of mineral wool insulation may help to improve the existing models.
Assuming that the real pressure difference between interior and
exterior surface is larger than the prescribed pressure difference in the
standard test procedure for mineral wool ISO 9053 / EN 29053,
measurements were performed using the measurement setup for
research on convective moisture transfer, “MSRCMT”. These measurements show that structural inhomogeneities of mineral wool affect the permeability only at higher pressure differences, as applied in the MSRCMT. Additional microscopic investigations show that the location of a leak within the construction has a crucial influence on the air flow-through and the infiltration rate. The results clearly indicate that the empirical values for the acoustic resistance of mineral wool should not be used for the calculation of convective transfer mechanisms.
Abstract: One of the important topics in information security today is authentication. There are several alternatives to text-based authentication, among them the Graphical Password (GP), also called Graphical User Authentication (GUA). These methods stem from the fact that humans recognize and remember images better than alphanumeric text. This paper focuses on the security aspects of GP algorithms and on the work most researchers have done to define these security features and attributes. The goal of this study is to develop a fuzzy decision model that allows the automatic selection of available GP algorithms, taking into consideration the subjective judgments of the decision makers, more than 50 postgraduate students of computer science. The proposed approach is based on the Fuzzy Analytic Hierarchy Process (FAHP), which determines the criteria weights as a linear formula.
Abstract: The literature reveals that many investors rely on technical trading rules when making investment decisions. If stock markets are efficient, one cannot achieve superior results by using these trading rules. However, if market inefficiencies are present, profitable opportunities may arise. The aim of this study is to investigate the effectiveness of technical trading rules in 34 emerging stock markets. The performance of the rules is evaluated using White's Reality Check and Hansen's Superior Predictive Ability test, along with an adjustment for transaction costs. These tests evaluate whether the best model performs better than a buy-and-hold benchmark, and they address data snooping problems, which is essential to obtain unbiased outcomes. Based on our results we conclude that technical trading rules are not able to outperform a naïve buy-and-hold benchmark on a consistent basis. However, we do find significant trading rule profits in 4 of the 34 investigated markets. We also present evidence that technical analysis is more profitable in crisis situations, although this result is relatively weak.
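As an illustration of the kind of rule under test, a simple moving-average crossover with proportional transaction costs can be sketched as follows (window lengths and the cost level are assumptions; the study's actual rule universe and bootstrap-based tests are not shown):

```python
import numpy as np

def ma_crossover_returns(prices, short=5, long_=20, cost=0.001):
    """Per-period returns of a simple moving-average crossover rule:
    long when the short MA is above the long MA, flat otherwise, with a
    proportional cost charged on every position change. The window
    lengths and cost level are illustrative assumptions."""
    prices = np.asarray(prices, dtype=float)
    ma_s = np.convolve(prices, np.ones(short) / short, mode='valid')
    ma_l = np.convolve(prices, np.ones(long_) / long_, mode='valid')
    ma_s = ma_s[len(ma_s) - len(ma_l):]        # align both MA series
    pos = (ma_s > ma_l).astype(float)          # 1 = long, 0 = flat
    rets = np.diff(prices[long_ - 1:]) / prices[long_ - 1:-1]
    strat = pos[:-1] * rets                    # trade on yesterday's signal
    turnover = np.abs(np.diff(np.concatenate([[0.0], pos[:-1]])))
    return strat - cost * turnover

rule_rets = ma_crossover_returns(np.linspace(100.0, 200.0, 100))
```

Data-snooping-robust tests such as White's Reality Check then compare the best of many such rule-return series against the buy-and-hold benchmark.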