Abstract: In this paper, a heuristic method for simultaneous
rescue-robot path-planning and mission scheduling is introduced,
based on project management techniques, multi-criteria decision-
making and artificial potential field path-planning. Groups of
injured people are trapped in a disaster situation. These people are
categorized into several groups based on the severity of their
condition. A rescue robot, whose ultimate objective is to reach the
injured groups and provide them with preliminary aid through a path
of minimum risk, has to perform certain tasks on its way towards the
targets before the arrival of the rescue team. A decision value is
assigned to each target based on the overall degree of satisfaction of
the criteria, the duties of the robot toward the target, and the
importance of rescuing each target given its category and the number
of injured people. The resulting decision value defines the strength of
the attractive potential field of each target. Dangerous environmental
parameters are defined as obstacles whose risk determines the
strength of the repulsive potential field of each obstacle. Moreover,
negative and positive energies are assigned to the targets and
obstacles, which vary with respect to the factors involved.
The simulation results show that the paths generated for two case
studies with certain differences in environmental conditions and
other risk factors differ considerably.
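The attractive/repulsive potential-field navigation described above can be sketched as follows. This is a minimal generic artificial-potential-field implementation, not the paper's method: the gains (k_att, k_rep), the influence radius rho0, and the toy start/goal/obstacle coordinates are all illustrative assumptions.

```python
import numpy as np

def apf_path(start, goal, obstacle, k_att=1.0, k_rep=0.1, rho0=1.0,
             step=0.05, iters=600):
    """Gradient descent on a combined attractive/repulsive potential."""
    p = np.asarray(start, dtype=float)
    goal = np.asarray(goal, dtype=float)
    obstacle = np.asarray(obstacle, dtype=float)
    path = [p.copy()]
    for _ in range(iters):
        # Attractive gradient: pulls the robot toward the target.
        grad = k_att * (p - goal)
        # Repulsive gradient: active only inside the influence radius rho0.
        diff = p - obstacle
        rho = np.linalg.norm(diff)
        if 1e-9 < rho < rho0:
            grad += -k_rep * (1.0 / rho - 1.0 / rho0) / rho**2 * (diff / rho)
        p = p - step * grad
        path.append(p.copy())
    return np.array(path)

# Toy scenario: the obstacle sits near the straight line to the goal.
path = apf_path(start=(0.0, 0.0), goal=(5.0, 5.0), obstacle=(2.5, 1.5))
```

In the paper's scheme, k_att would be derived from the target's decision value and k_rep from the obstacle's risk; here they are fixed constants.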
Abstract: This paper addresses the stabilization issues for a class of uncertain switched neutral systems with nonlinear perturbations. Based on new classes of piecewise Lyapunov functionals, the stability assumption on all the main operators or on the convex combination of coefficient matrices is avoided, and a new switching rule is introduced to stabilize the neutral systems. The switching rule is designed from the solution of the so-called Lyapunov-Metzler linear matrix inequalities. Finally, three simulation examples are given to demonstrate the significant improvements over the existing results.
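For context, the Lyapunov-Metzler conditions mentioned above have a standard form in the switched-systems literature (for the simpler switched linear case; the paper's conditions for neutral systems are more involved). A sketch, with all symbols as commonly defined rather than taken from the paper:

```latex
% Switched linear system \dot{x} = A_{\sigma(t)} x with modes i = 1,\dots,N.
% Find P_i \succ 0 and a Metzler matrix \Pi = [\pi_{ji}]
% (\pi_{ji} \ge 0 for j \ne i, \sum_j \pi_{ji} = 0) such that
A_i^{\top} P_i + P_i A_i + \sum_{j=1}^{N} \pi_{ji} P_j \prec 0,
\qquad i = 1, \dots, N.
% The stabilizing switching rule is then
\sigma(t) = \arg\min_{i} \; x(t)^{\top} P_i x(t).
```

Feasibility of these inequalities yields a piecewise-quadratic Lyapunov function, which is the role the piecewise Lyapunov functionals play in the paper.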
Abstract: This paper presents the experimental results of investigating the effects of adding various types and proportions of fibre on the mechanical strength and permeability characteristics of recycled aggregate concrete (RAC), which was produced with treated coarse recycled concrete aggregate (RCA). Two types of synthetic fibres (i.e., barchip and polypropylene fibre) with various volume fractions, calculated by the weight of the cement, were added to the RAC. The hardened RAC properties such as compressive strength, flexural strength, ultrasonic pulse velocity, water absorption and total porosity at the curing ages of 7 and 28 days were evaluated and compared with the properties of the control specimens. Results indicate that the treated coarse RCA enhances the mechanical strength and permeability properties of RAC and that adding barchip fibre further optimises the results. Adding 1.2% barchip fibre has the best effect on the mechanical strength performance of the RAC.
Abstract: The gamma radiation in samples of a variety of
natural tiling rocks (granites) produced in and imported into Iran
for use in the building industry was measured, employing high-
resolution gamma-ray spectroscopy. The rock samples were
pulverized, sealed in 0.5-litre plastic Marinelli beakers, and
measured in the laboratory with an accumulation time of between
50000 and 80000 seconds each. From the measured gamma-ray
spectra, activity concentrations were determined for 232Th (ranging
from 6.5 to 172.2 Bq kg-1), 238U (from 7.5 to 178.1 Bq kg-1),
226Ra (from 3.8 to 94.2 Bq kg-1) and 40K (from 556.9 to
1539.2 Bq kg-1). Of the 29 samples measured in this study,
"Nehbndan (Berjand)" appears to present the highest concentration
of 232Th, "Big Red Flower (China)" of 238U, "Khoram dareh" of
226Ra, and "Peranshahr" of 40K.
Abstract: This paper discusses the use of explorative data
mining tools that allow the educator to explore new relationships
between reported learning experiences and actual activities,
even if there are multiple dimensions with a large number
of measured items. The underlying technology is based on
the so-called Compendium Platform for Reproducible Computing
(http://www.freestatistics.org), which was built on top of the
computational R framework (http://www.wessa.net).
Abstract: Lutein is a dietary oxycarotenoid which is found
to reduce the risks of Age-related Macular Degeneration
(AMD). Supercritical fluid extraction of lutein esters from
marigold petals was carried out and was found to be much
more effective than conventional solvent extraction. The
saponification of pre-concentrated lutein esters to produce free
lutein was studied which showed a composition of about 88%
total carotenoids (UV-VIS spectrophotometry) and 90.7%
lutein (HPLC). The lipase-catalyzed hydrolysis of lutein esters
in conventional medium was investigated. The optimal
temperature, pH, enzyme concentration and water activity
were found to be 50°C, 7, 15% and 0.33, respectively, and the
activity loss of the lipase was about 25% after eight re-uses at
50°C over 12 days. However, the lipase-catalyzed hydrolysis of
lutein esters in conventional media resulted in poor
conversions (16.4%).
Abstract: Basel III (or the Third Basel Accord) is a global
regulatory standard on bank capital adequacy, stress testing and
market liquidity risk agreed upon by the members of the Basel
Committee on Banking Supervision in 2010-2011, and scheduled to
be introduced from 2013 until 2018. Basel III is a comprehensive set
of reform measures. These measures aim to: (1) improve the banking
sector's ability to absorb shocks arising from financial and economic
stress, whatever the source; (2) improve risk management and
governance; and (3) strengthen banks' transparency and disclosures.
Similarly, the reforms target: (1) bank-level, or micro-prudential,
regulation, which will help raise the resilience of individual banking
institutions in periods of stress; and (2) macro-prudential regulation
of system-wide risks that can build up across the banking sector, as
well as the pro-cyclical amplification of these risks over time. These
two approaches to supervision are complementary, as greater
resilience at the individual bank level reduces the risk of system-
wide shocks. Regarding the macroeconomic impact of Basel III, the
OECD estimates that the medium-term impact of Basel III
implementation on GDP growth is in the range of -0.05 to -0.15
percent per year. Economic output is mainly affected by an increase
in bank lending spreads, as banks pass a rise in bank funding costs,
due to higher capital requirements, on to their customers.
Consequently, the estimated effects on GDP growth assume no
active response from monetary policy. The impact of Basel III on
economic output could be offset by a reduction (or delayed increase)
in monetary policy rates of about 30 to 80 basis points. The aim of
this paper is to create a framework based on the recent regulations in
order to prevent financial crises. Thus, the measures developed to
overcome the global financial crisis will contribute to preventing
financial crises that may occur in future periods. In the first part of
the paper, the effects of the global crisis on the banking system are
examined together with the concept of financial regulation. In the
second part, financial regulations, and Basel III in particular, are
analyzed. The last section explores the possible macroeconomic
impacts of Basel III.
Abstract: In this paper, the two-dimensional staggered grid
interface pressure (SGIP) model has been generalized and presented
in three-dimensional form. For this purpose, various models of the
surface tension force for interfacial flows have been investigated and
compared with each other. The VOF method has been used for
tracking the interface. To show the ability of the SGIP model for
three-dimensional flows in comparison with other models, pressure
contours, maximum spurious velocities, the norms of the spurious
flow velocities and the pressure jump errors for a motionless drop of
liquid and a bubble of gas are calculated using the different models.
It is shown that the SGIP model, in comparison with the CSF, CSS
and PCIL models, produces the smallest maximum and norm of the
spurious velocities. Additionally, the new model produces more
accurate results in calculating the pressure jump across the interface,
generated by the surface tension force, for a motionless drop of
liquid and a bubble of gas.
Abstract: India is currently the second most populous nation in
the world with over 1.2 billion people, growing annually at the rate of
1.5%. It is experiencing a surge in energy demand, which is expected
to grow three- to four-fold over the next 25 years. Most of the energy
requirements are currently satisfied by the import of fossil fuels –
coal, petroleum-based products and natural gas. Biofuels can satisfy
these energy needs in an environmentally benign and cost effective
manner while reducing dependence on import of fossil fuels, thus
providing National Energy Security. Among various forms of
bioenergy, bioethanol is one of the major options for India because of
availability of feed stock crops.
This paper presents an overview of bioethanol production and
technology, and of the steps taken by the Indian government to
facilitate and bring about optimal development and utilization of
indigenous biomass feedstocks for the production of this biofuel.
Abstract: This paper presents the use of a newly created network
structure known as a Self-Delaying Dynamic Network (SDN) to
create a high resolution image from a set of time stepped input
frames. These SDNs are non-recurrent temporal neural networks
which can process time sampled data. SDNs can store input data
for a lifecycle and feature dynamic logic based connections between
layers. Several low resolution images and one high resolution image
of a scene were presented to the SDN during training by a Genetic
Algorithm. The SDN was trained to process the input frames in order
to recreate the high resolution image. The trained SDN was then used
to enhance a number of unseen noisy image sets. The quality of high
resolution images produced by the SDN is compared to that of high
resolution images generated using Bi-Cubic interpolation. The
images produced by the SDN are superior in several ways to those
produced using Bi-Cubic interpolation.
Abstract: Money laundering has been described by many as the lifeblood of crime and is a major threat to the economic and social well-being of societies. It has been recognized that the banking system has long been the central element of money laundering. This is in part due to the complexity and confidentiality of the banking system itself. It is generally accepted that effective anti-money laundering (AML) measures adopted by banks will make it tougher for criminals to get their "dirty money" into the financial system. In fact, for law enforcement agencies, banks are considered to be an important source of valuable information for the detection of money laundering. However, from the banks' perspective, the main reason for their existence is to make as much profit as possible. Hence their cultural and commercial interests are totally distinct from those of the law enforcement authorities. Undoubtedly, AML laws create a major dilemma for banks as they produce a significant shift in the way banks interact with their customers. Furthermore, the implementation of the laws not only creates significant compliance problems for banks, but also has the potential to adversely affect the operations of banks. As such, it is legitimate to ask whether these laws are effective in preventing money launderers from using banks, or whether they simply put an unreasonable burden on banks and their customers. This paper attempts to address these issues and analyze them against the background of the Malaysian AML laws. It must be said that effective coordination between the AML regulator and the banking industry is vital to minimize the problems faced by the banks and thereby to ensure effective implementation of the laws in combating money laundering.
Abstract: A series of microarray experiments produces observations
of differential expression for thousands of genes across multiple
conditions.
Principal component analysis (PCA) has been widely used in
multivariate data analysis to reduce the dimensionality of the data in
order to simplify subsequent analysis and allow for summarization of
the data in a parsimonious manner. PCA, which can be implemented
via a singular value decomposition (SVD), is useful for the analysis
of microarray data.
For the application of PCA using SVD we use the DNA microarray
data for the small round blue cell tumors (SRBCT) of childhood
by Khan et al. (2001). To decide the number of components which
account for a sufficient amount of information, we draw a scree plot.
The biplot, a graphical display associated with PCA, reveals
important features that exhibit the relationships between variables
and also the relationships of variables with observations.
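The PCA-via-SVD computation described above can be sketched in a few lines of NumPy. This is a generic illustration on synthetic data, not the Khan et al. SRBCT dataset; the matrix shape and noise level are illustrative assumptions.

```python
import numpy as np

def pca_svd(X):
    """PCA via SVD: rows are observations (samples), columns are variables (genes)."""
    Xc = X - X.mean(axis=0)              # center each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U * s                       # principal component scores
    var_ratio = s**2 / np.sum(s**2)      # variance proportions (for a scree plot)
    return scores, Vt, var_ratio

rng = np.random.default_rng(0)
# Synthetic "expression" matrix: 20 samples x 50 genes with one dominant
# rank-1 signal plus small noise, so the first component should dominate.
X = (rng.normal(size=(20, 1)) @ rng.normal(size=(1, 50))
     + 0.1 * rng.normal(size=(20, 50)))
scores, components, var_ratio = pca_svd(X)
```

Plotting var_ratio against the component index gives the scree plot used in the paper to choose the number of components; the scores and loadings (rows of Vt) together form the biplot.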
Abstract: We present a novel scheme to evaluate sinusoidal functions with low complexity and high precision using cubic spline interpolation. To this end, two different approaches are proposed to find the interpolating polynomial of sin(x) within the range [-π, π]. The first one deals with only a single data point while the other deals with two, to keep the realization cost as low as possible. An approximation error optimization technique for cubic spline interpolation is introduced next and is shown to increase the interpolator accuracy without increasing the complexity of the associated hardware. The architectures for the proposed approaches are also developed; they exhibit flexibility of implementation with low power requirements.
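As a point of reference for the spline-based evaluation above, the following sketch evaluates sin(x) on [-π, π] with a generic piecewise-cubic Hermite spline (knot values sin(x_k), knot slopes cos(x_k)). This is not the paper's optimized single-point or two-point scheme; the knot count is an illustrative assumption.

```python
import numpy as np

def sin_spline(t, n_knots=17):
    """Evaluate sin(t) on [-pi, pi] with a piecewise-cubic Hermite spline."""
    x = np.linspace(-np.pi, np.pi, n_knots)
    h = x[1] - x[0]
    t = np.asarray(t, dtype=float)
    # Segment index for each query point, clipped to the valid range.
    i = np.clip(((t - x[0]) / h).astype(int), 0, n_knots - 2)
    s = (t - x[i]) / h                        # local coordinate in [0, 1]
    y0, y1 = np.sin(x[i]), np.sin(x[i + 1])   # knot values
    m0, m1 = np.cos(x[i]), np.cos(x[i + 1])   # knot slopes (exact derivative)
    # Cubic Hermite basis functions.
    h00 = 2 * s**3 - 3 * s**2 + 1
    h10 = s**3 - 2 * s**2 + s
    h01 = -2 * s**3 + 3 * s**2
    h11 = s**3 - s**2
    return h00 * y0 + h10 * h * m0 + h01 * y1 + h11 * h * m1

t = np.linspace(-np.pi, np.pi, 1001)
err = np.max(np.abs(sin_spline(t) - np.sin(t)))
```

With 17 knots the standard Hermite error bound h^4 max|sin''''|/384 gives a worst-case error below 1e-3, which illustrates why cubic splines reach high precision with few stored coefficients.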
Abstract: In this empirical research, how marketing managers evaluate their firms' performance and decide to make innovations is examined. They use several standards, namely the past performance of the firm, the target performance of the firm, competitor performance, and the average performance of the industry, to compare and evaluate the firms' performance. It is hypothesized that marketing managers and owners of the firm compare the firm's current performance with these four standards at the same time to decide when to make an innovation relating to any aspect of the firm, whether management style or products. The relationships between the comparison of the firm's performance with these standards and innovation are examined in the same regression model. The results of the regression analysis are discussed and some recommendations are made for future studies and practitioners.
Abstract: The objectives of this research are to produce
prototype coconut-oil-based solvent offset printing inks and to
analyze the basic quality of printing work produced with these inks,
by means of using coconut oil to produce varnish and using that
varnish to produce black offset printing inks. Qualities such as the
CIELAB value, density value, and dot gain value of printing work
from the coconut-oil-based solvent offset printing inks, printed on
gloss-coated woodfree paper weighing 130 grams, were then
analyzed. The results for the coconut-oil-based solvent offset
printing inks indicated that the suitable varnish formulation uses
51% coconut oil, 36% phenolic resin, and 14% solvent oil, while the
results for producing the black offset ink showed that the suitable
printing-ink formula uses varnish mixed with 20% coconut oil. For
the analysis of the printing work of the coconut-oil-based solvent
offset printing inks on paper, the results were as follows: the
CIELAB value of the black offset printing ink is L* = 31.90,
a* = 0.27, and b* = 1.86; the density value is 1.27; and the dot gain
value was high in the mid-tone area of the image.
Abstract: With the development of the Internet and database application techniques, the demand for the many databases on the Internet to permit remote query and access by authorized users has become common, and the problem of how to protect the copyright of relational databases arises. This paper first briefly introduces the cloud model, including cloud generators and similar clouds. Then, drawing on the properties of the cloud, a method of protecting relational database copyright with a cloud watermark is proposed, following the idea of digital watermarking and the properties of relational databases. The corresponding watermark algorithms, namely the cloud watermark embedding algorithm and the detection algorithm, are also proposed. Some experiments are then run and the results are analyzed to validate the correctness and feasibility of the watermark scheme. In the end, the prospects of watermarking relational databases and its research directions are discussed.
Abstract: An immunomodulator bioproduct is prepared in a
batch bioprocess with a modified bacterium Pseudomonas
aeruginosa. The bioprocess is performed in a 100 L Bioengineering
bioreactor with 42 L of cultivation medium made of peptone, meat
extract and sodium chloride. The optimal bioprocess parameters
were determined: temperature 37 °C, agitation speed 300 rpm,
aeration rate 40 L/min, pressure 0.5 bar, Dow Corning Antifoam M
at a maximum of 4% of the medium volume, and duration 6 hours.
This kind of bioprocess is considered difficult to control because its
dynamic behavior is highly nonlinear and time varying. The aim of
the paper is to present (by comparison) different models based on
experimental data.
The analysis criteria were the modeling error and the convergence
rate. The estimated values and the modeling analysis were obtained
using Table Curve 2D.
The preliminary conclusions indicate Andrews's model, with a
maximum specific growth rate of the bacterium in the range of
0.8 h-1.
Abstract: In this paper, a mathematical model of the human
immunodeficiency virus (HIV) is utilized and an optimization
problem is proposed, with the final goal of implementing an optimal
900-day structured treatment interruption (STI) protocol. Two types
of drugs commonly used in highly active antiretroviral therapy
(HAART), reverse transcriptase inhibitors (RTI) and protease
inhibitors (PI), are considered. In order to solve the proposed
optimization problem, an adaptive memetic algorithm with
population management (AMAPM) is proposed. The AMAPM uses
a distance measure to control the diversity of the population in
genotype space, thus preventing stagnation and premature
convergence. Moreover, the AMAPM uses a diversity parameter in
phenotype space to dynamically set the population size and the
number of crossovers during the search process. Three crossover
operators diversify the population simultaneously. The progress of
the crossover operators is used to set the number of applications of
each crossover per generation. In order to escape local optima and
introduce new search directions toward the global optimum, two
local searchers assist the evolutionary process. In contrast to
traditional memetic algorithms, the activation of these local
searchers is not random and depends on the diversity parameters in
both genotype space and phenotype space. The capability of
AMAPM in finding optimal solutions is demonstrated by
comparison with three popular metaheuristics.
Abstract: The most severe damage to a turbine rotor is its
distortion. The rotor straightening process must lead, in the first
stage, to the removal of stresses from the material by annealing and
then to the straightening of the plastic distortion, without leaving
any residual stress, by hot spotting. The straightening method does
not produce stress accumulations, and the heating technique,
developed specifically for solid forged rotors and disks, makes it
possible to avoid local overheating and structural changes in the
material. The process also leaves no stresses in the shaft material.
An experimental study of hot spotting is carried out on a large
turbine rotor, and some of the most important effective parameters
that must be considered in the annealing and hot spotting processes
are investigated in this paper.
Abstract: Developing a stable early warning system (EWS)
model that is capable of giving accurate predictions is a challenging
task. This paper introduces the k-nearest neighbour (k-NN) method,
which has never been applied to predicting currency crises before,
with the aim of increasing prediction accuracy. The performance of
the proposed k-NN depends on the choice of the distance used; in
our analysis, we consider the Euclidean and the Manhattan
distances. For comparison, we employ three other methods: logistic
regression analysis (logit), back-propagation neural networks (NN)
and sequential minimal optimization (SMO). The analysis, using
datasets from 8 countries with 13 macro-economic indicators for
each country, shows that the proposed k-NN method with k = 4 and
the Manhattan distance performs better than the other methods.
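The k-NN classifier with Manhattan distance described above can be sketched from scratch in a few lines. The toy two-indicator data below is illustrative only, not the paper's 8-country, 13-indicator dataset.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=4):
    """k-NN classification with the Manhattan (L1) distance and majority vote."""
    d = np.sum(np.abs(X_train - x), axis=1)   # Manhattan distance to each sample
    nearest = np.argsort(d)[:k]               # indices of the k closest samples
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]         # majority label among neighbours

# Toy "no crisis" (0) vs "crisis" (1) data in a 2-indicator space.
X_train = np.array([[0.1, 0.2], [0.2, 0.1], [0.0, 0.3], [0.3, 0.0],
                    [5.0, 5.1], [5.2, 4.9], [4.8, 5.0], [5.1, 5.2]])
y_train = np.array([0, 0, 0, 0, 1, 1, 1, 1])
label = knn_predict(X_train, y_train, np.array([4.5, 5.0]), k=4)
```

With k = 4, as in the paper's best configuration, the query near the "crisis" cluster is classified by the majority vote of its four L1-nearest training points.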