Abstract: Implied volatility is an important factor in financial
decision-making, particularly in option pricing valuation, and the
pricing biases of Leland option pricing models are known to be related
to the implied volatility structure of options. This study therefore
examines the implied adjusted volatility smile patterns and term
structures of S&P/ASX 200 index options using different Leland option
pricing models. The examination of implied adjusted volatility smiles
and term structures in the Australian index options market covers the
global financial crisis of mid-2007. The implied adjusted volatility
was found to escalate to approximately triple its pre-crisis rate.
Abstract: Crucial information barely visible to the human eye is
often embedded in a series of low resolution images taken of the
same scene. Super resolution reconstruction is the process of
combining several low resolution images into a single higher
resolution image. The ideal algorithm should be fast, and should add
sharpness and detail both at edges and within regions, without adding
artifacts. In this paper we propose a blind super resolution
reconstruction technique for linearly degraded images. The proposed
algorithm is divided into three parts: image registration,
wavelet-based fusion, and image restoration. Three low resolution
images are considered, which may be sub-pixel shifted, rotated,
blurred or noisy; the sub-pixel shifted images are registered using an
affine transformation model, a wavelet-based fusion is performed, and
the noise is removed using soft thresholding. The proposed technique
reduces blocking artifacts, smooths edges, and is able to restore high
frequency detail in an image. The technique is efficient and
computationally fast, with a clear prospect of real time
implementation.
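The soft thresholding used in the denoising step above can be sketched as follows (a minimal illustration of the standard shrinkage rule applied to wavelet detail coefficients, not the paper's full pipeline):

```python
import numpy as np

def soft_threshold(coeffs, t):
    """Shrink wavelet detail coefficients toward zero by threshold t.

    Coefficients with magnitude below t are set to zero; the rest are
    moved toward zero by t, which suppresses noise while keeping large
    (signal-bearing) coefficients.
    """
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)
```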
Abstract: This study investigated the sampling and analysis of
water quality in the water reservoirs and water towers installed in
two kinds of residential buildings and in school facilities. Water
quality data were collected for correlation analysis with the
frequency of sanitization of the water reservoir, by questioning
building managers about the inspection charts recorded for the
reservoir equipment. A statistical software package (SPSS) was applied
to the data of the two groups (cleaning frequency and water quality)
for regression analysis to determine the optimal cleaning frequency of
sanitization. The correlation coefficient (R) in this paper represents
the degree of correlation, with values of R ranging from +1 to -1.
After investigating three categories of drinking water users, this
study found that the frequency of sanitization of the water reservoir
significantly influenced the quality of the drinking water. A higher
frequency of sanitization (more than four times per year) implied a
higher quality of drinking water. The results indicated that water
reservoirs and water towers should be sanitized at least twice
annually to achieve safe drinking water.
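The correlation coefficient R described above is the standard Pearson coefficient; its computation can be sketched as follows (an illustrative formula only, not the study's SPSS workflow or data):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two samples.

    R = cov(x, y) / (std(x) * std(y)), ranging from +1 (perfect positive
    correlation) to -1 (perfect negative correlation).
    """
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))
```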
Abstract: This paper presents an advance in monitoring and
process control of surface roughness in CNC machines for the turning
and milling processes. An integration of the in-process monitoring
and process control of the surface roughness is proposed and
developed during the machining process by using the cutting force
ratio. The author's previously developed surface roughness models for
the turning and milling processes are adopted to predict the
in-process surface roughness; these models consist of the cutting
speed, the feed rate, the tool nose radius, the depth of cut, the rake angle, and
the cutting force ratio. The cutting force ratios obtained from the
turning and the milling are utilized to estimate the in-process surface
roughness. The dynamometers are installed on the tool turret of the
CNC turning machine and on the table of the 5-axis machining center to monitor
the cutting forces. The in-process control of the surface roughness
has been developed and proposed to control the predicted surface
roughness. It has been proved by the cutting tests that the proposed
integration system of the in-process monitoring and the process
control can be used to check the surface roughness during the cutting
by utilizing the cutting force ratio.
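The abstract lists the model's variables but not its functional form. As a purely hypothetical illustration (the function name, power-law form, and all coefficients below are assumptions, not the author's model), a multiplicative roughness model over those variables might look like:

```python
import math

def predicted_roughness(v, f, r, d, gamma, Fr, coef):
    """Hypothetical power-law surface roughness model.

    Ra = C * v^a * f^b * r^c * d^e * exp(g*gamma) * Fr^h, where v is
    cutting speed, f feed rate, r tool nose radius, d depth of cut,
    gamma rake angle, and Fr the cutting force ratio. The actual model
    form and coefficients are not given in the abstract.
    """
    C, a, b, c, e, g, h = coef
    return C * v**a * f**b * r**c * d**e * math.exp(g * gamma) * Fr**h
```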
Abstract: In this paper, we propose novel algorithmic models
based on information fusion and feature transformation in cross-modal
subspace for different types of residue features extracted from
several intra-frame and inter-frame pixel sub-blocks in video
sequences for detecting digital video tampering or forgery. An
evaluation of proposed residue features – the noise residue features
and the quantization features, their transformation in cross-modal
subspace, and their multimodal fusion, for an emulated copy-move
tamper scenario shows a significant improvement in tamper detection
accuracy compared to single-mode features without transformation
in cross-modal subspace.
Abstract: The P-Bigram method is a string comparison method
based on an internal two-character (bigram) similarity measure. The
edit distance between two strings is the minimal number of elementary
editing operations required to transform one string into the other;
the elementary editing operations are deletion, insertion, and
substitution of characters. In this paper, we apply the P-Bigram
method to solve the similarity problem for DNA sequences. The method
provides an efficient algorithm that locates the minimum number of
editing operations between two strings. We have implemented the
algorithm, developed the P-Bigram edit distance, related the edit
distance to string similarity, and implemented the computation using
dynamic programming. The performance of the proposed approach is
evaluated using the number of edits and percentage similarity
measures.
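The edit distance recursion solved by dynamic programming can be sketched as follows (the standard character-level formulation; the P-Bigram variant's bigram-specific scoring is not detailed in the abstract):

```python
def edit_distance(a, b):
    """Classic edit (Levenshtein) distance via dynamic programming.

    dp[i][j] is the minimal number of deletions, insertions, and
    substitutions needed to turn a[:i] into b[:j].
    """
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # delete all of a[:i]
    for j in range(n + 1):
        dp[0][j] = j  # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost)  # substitution/match
    return dp[m][n]
```

A percentage similarity can then be derived as `1 - d / max(len(a), len(b))`, matching the two evaluation measures named above.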
Abstract: In this paper we propose a family of algorithms based
on 3rd and 4th order cumulants for blind single-input single-output
(SISO) Non-Minimum Phase (NMP) Finite Impulse Response (FIR)
channel estimation driven by a non-Gaussian signal. The input signal
represents the signal used in 10GBASE-T (or IEEE 802.3an-2006)
as a Tomlinson-Harashima Precoded (THP) version of random
Pulse-Amplitude Modulation with 16 discrete levels (PAM-16). The
proposed algorithms are tested using three non-minimum phase
channels for different Signal-to-Noise Ratios (SNR) and for different
input data lengths. Numerical simulation results are presented to
illustrate the performance of the proposed algorithms.
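The 3rd and 4th order sample cumulants on which such blind estimation algorithms are built can be estimated for a zero-mean signal as in this minimal sketch (illustrative only; the paper's specific channel estimators are not reproduced here):

```python
import numpy as np

def cumulants_3_4(x):
    """Sample 3rd and 4th order cumulants of a (zero-mean) signal.

    For zero-mean x: c3 = E[x^3] and c4 = E[x^4] - 3*(E[x^2])^2.
    Both vanish for Gaussian signals, which is why higher-order
    cumulants carry the phase information needed for NMP channels.
    """
    x = np.asarray(x, float)
    x = x - x.mean()  # enforce the zero-mean assumption
    m2 = np.mean(x**2)
    c3 = np.mean(x**3)
    c4 = np.mean(x**4) - 3.0 * m2**2
    return c3, c4
```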
Abstract: System MEMORI automatically detects and recognizes rotated and/or rescaled versions of the objects of a database within digital color images with cluttered background. This task is accomplished by means of a region grouping algorithm guided by heuristic rules, whose parameters concern some geometrical properties and the recognition score of the database objects. This paper focuses on the strategies implemented in MEMORI for the estimation of the heuristic rule parameters. This estimation, being automatic, makes the system a highly user-friendly tool.
Abstract: This work considered the thermodynamic feasibility
of scrubbing volatile organic compounds into biodiesel in view of
designing a gas treatment process with this absorbent. A detailed
vapour–liquid equilibrium investigation was performed using the
original UNIFAC group contribution method. The four biodiesels
studied in this work are methyl oleate, methyl palmitate, methyl
linolenate and ethyl stearate. The original UNIFAC procedure was
used to estimate the infinite dilution activity coefficients of 13
selected volatile organic compounds in the biodiesels. The
calculations were done at a VOC mole fraction of 9.213×10⁻⁸. Ethyl
stearate gave the most favourable phase equilibrium. A close
agreement was found between the infinite dilution activity coefficient
of toluene found in this work and those reported in the literature.
Thermodynamic models can thus efficiently be used to calculate a vast
amount of phase equilibrium behaviour from a limited number of
experimental data.
Abstract: Based on a non-linear single track model describing the
vehicle dynamics, an optimal path planning strategy is developed. Real
time optimization is used to generate reference control values that
lead the vehicle along a calculated lane which is optimal for
different objectives such as energy consumption, run time, safety or
comfort characteristics. A strict mathematical formulation of
autonomous driving allows decisions to be taken in undefined
situations such as lane changes or obstacle avoidance. Based on the
position of the vehicle, the lane situation and the obstacle position,
the optimization problem is reformulated in real-time to avoid the
obstacle and any collision.
Abstract: There are three distinct stages in the evolution of
economic thought, namely:
1. in the first stage, the major concern was to accelerate
economic growth with increased availability of material
goods, especially in developing economies with very low
living standards, since faster economic growth was seen
as the route to poverty eradication.
2. in the second stage, economists made distinction between
growth and development. Development was seen as going
beyond economic growth, and bringing certain changes in
the structure of the economy with more equitable
distribution of the benefits of growth, with growth
becoming automatic and sustained.
3. the third stage is now reached. Our concern is now with
"sustainable development", that is, development not only
for the present but also for the future.
Thus the focus has changed from "sustained growth" to "sustainable
development". Sustainable development brings to the fore the long
term relationship between ecology and economic development.
Since the creation of UNEP in 1972, it has worked for
development without destruction, that is, for environmentally sound
and sustainable development. It was realised that the environment
cannot be viewed in a vacuum: it is not separate from development, nor
is it in competition with it. UNEP called for the integration of the
environment with development, whereby ecological factors enter
development planning, socio-economic policies, cost-benefit analysis,
trade, technology transfer, waste management, education and other
specific areas.
Industrialisation has contributed to the growth of the economies
of several countries. It has improved the standards of living of their
people and provided benefits to society. In the process it has also
created great environmental problems like climate change, forest
destruction and denudation, soil erosion and desertification.
On the other hand, industry has provided jobs and improved the
prospects of wealth for industrialists. Working class communities have
had to simply put up with high levels of pollution in order to keep
their jobs and safeguard their incomes.
There are many roots of the environmental problem; they may lie
in the political, economic, cultural and technological conditions of
modern society. Experts concede that industrial growth lies
somewhere close to the heart of the matter. Therefore, the objective
of this paper is not to document all roots of an environmental crisis
but rather to discuss the effects of industrial growth and
development.
We have come to the conclusion that although public intervention
is often unnecessary to ensure that perfectly competitive markets
function in society's best interests, such intervention is necessary
when firms or consumers pollute.
Abstract: Most Decision Support Systems (DSS) constructed for
waste management (WM) are not widely marketed and lack practical
application. This is due to the number of variables and the
complexity of the mathematical models which include the
assumptions and constraints required in decision making. The
approach taken by many researchers in DSS modelling is to isolate a
few key factors that have a significant influence on the DSS. This
segmented approach does not provide a thorough understanding of
the complex relationships of the many elements involved. The
various elements in constructing the DSS must be integrated and
optimized in order to produce a viable model that is marketable and
has practical application. The DSS model used in assisting decision
makers should be integrated with GIS, able to give robust predictions
despite the inherent uncertainties of waste generation and the
plethora of waste characteristics, and to give an optimal allocation
of the waste stream to recycling, incineration, landfill and
composting.
Abstract: The correct design of a regulator structure requires complete prediction of the ultimate dimensions of the scour hole profile formed downstream of the solid apron. Scour downstream of a regulator is studied either on solid aprons, by means of the velocity distribution, or on movable beds, by studying the topography of the scour hole formed downstream. In this paper, a new technique was developed to study the scour hole downstream of regulators on movable beds. The study was divided into two categories: the first finds the sum of the length of the rigid apron behind the gates and the length of the scour hole formed downstream of it, while the second finds the minimum length of rigid apron behind the gates that prevents erosion downstream of it. The study covers free and submerged hydraulic jump conditions in both symmetrical and asymmetrical under-gated operation. From the comparison between the studied categories, we found that the minimum length of rigid apron needed to prevent scour (Ls) is greater than the sum of the length of the rigid apron and that of the scour hole formed behind it (L+Xs). On the other hand, the scour hole dimensions in the case of a submerged hydraulic jump are always greater than for a free jump, and the scour hole dimensions in asymmetrical operation are greater than in symmetrical operation.
Abstract: In this paper, we discuss the paradigm shift in bank
capital from the "gone concern" to the "going concern" mindset. We
then propose a methodology for pricing a product of this shift called
Contingent Capital Notes ("CoCos"). The Merton Model can determine a
price for credit risk by treating the firm's equity value as a call
option on its assets. Our pricing methodology for CoCos also uses the
credit spread implied by the Merton Model in a subsequent derivative
form created by John Hull et al. Here, a market implied asset
volatility is calculated from observed market CDS spreads. This
implied asset volatility is then used to estimate the probability of
triggering a predetermined "contingency event" given the
distance-to-trigger (DTT). The paper then investigates the effect of
varying DTTs and recovery assumptions on the CoCo yield. We conclude
with an investment rationale.
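The Merton equity-as-call valuation mentioned above can be sketched with the standard Black-Scholes call formula (an illustration of the textbook model only, not the paper's CoCo pricing methodology; all parameter values in the usage are assumptions):

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def merton_equity(V, D, r, sigma, T):
    """Merton model: equity as a European call on firm asset value V,
    with strike equal to the face value of debt D maturing at T.

    E = V*N(d1) - D*exp(-rT)*N(d2), with
    d1 = (ln(V/D) + (r + sigma^2/2)T) / (sigma*sqrt(T)), d2 = d1 - sigma*sqrt(T).
    N(d2) is then the risk-neutral probability the firm stays solvent.
    """
    d1 = (log(V / D) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return V * norm_cdf(d1) - D * exp(-r * T) * norm_cdf(d2)
```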
Abstract: The application of high frequency signal injection methods as speed and position observers in PMSM drives has been a research focus. At present, the precision of this method is nearly as good as that of a ten-bit encoder, but some questions remain in estimating the position polarity. Based on high frequency signal injection, this paper presents a method to compensate the position polarity for a permanent magnet synchronous motor (PMSM). Experiments were performed to test the effectiveness of the proposed algorithm, and the results show good performance.
Abstract: In a nuclear reactor, a Loss of Coolant Accident (LOCA)
covers a wide range of postulated damage or rupture of pipes in the
heat transport piping system. In the case of a LOCA with or without
failure of the emergency core cooling system in a Pressurised Heavy
Water Reactor, the Pressure Tube (PT) temperature could rise
significantly due to fuel heat-up and a gross mismatch between heat
generation and heat removal in the affected channel. The extent and
nature of the deformation is important from the reactor safety point
of view. Experimental set-ups have been designed and fabricated to
simulate ballooning (radial deformation) of the PT for 220 MWe
IPHWRs. Experiments have been conducted with voided PTs, first by
covering the Calandria Tube (CT) with ceramic fibers and then by
submerging the CT in water. In both experiments, it is observed that
ballooning initiates at a temperature of around 665 °C and complete
contact between the PT and the CT occurs at approximately 700 °C. The
strain rate is found to be 0.116% per second. The structural integrity
of the PT is retained (no breach) in all the experiments. The PT
heat-up is found to be arrested after contact between the PT and the
CT, establishing that the moderator acts as an efficient heat sink for
IPHWRs.
Abstract: Vision based solutions in intelligent vehicle applications often need large amounts of memory to handle the video stream and image processing, which increases the complexity of the hardware and software. In this paper, we present an FPGA implementation of a vision based lane departure warning system. For each video frame, the line gradients are estimated and the lane marks are found. By analysing the positions of the lane marks, departure of the vehicle is detected in time. This idea has been implemented on a Xilinx Spartan6 FPGA. The lane departure warning system uses 39% of the device's logic resources and none of its memory. The average availability is 92.5%. The frame rate is more than 30 frames per second (fps).
Abstract: Reservoirs with high pressures and temperatures
(HPHT), which were considered atypical in the past, are now frequent
targets for exploration. For downhole oilfield drilling tools
and components, the temperature and pressure affect the mechanical
strength. To address this issue, a finite element analysis (FEA) for
206.84 MPa (30 ksi) pressure and 165°C has been performed on the
pressure housing of the measurement-while-drilling/logging-while-drilling
(MWD/LWD) density tool.
The density tool is an MWD/LWD sensor that measures the density
of the formation. One of the components of the density tool is the
pressure housing that is positioned in the tool. The FEA results are
compared with the experimental test performed on the pressure
housing of the density tool. The results show a close match between
the numerical results and the experimental test. This FEA model can
be used for extreme HPHT and ultra HPHT analyses, and/or optimal
design changes.
Abstract: Many existing studies use Markov decision
processes (MDPs) to model optimal route choice in stochastic,
time-varying networks. However, transforming large volumes of
variable traffic data into optimal route decisions is
computationally challenging when MDPs are employed in real
transportation networks. In this paper we model finite horizon
MDPs using directed hypergraphs. It is shown that the problem of
route choice in stochastic, time-varying networks can be
formulated as a minimum cost hyperpath problem, which can be
solved in linear time. We finally demonstrate the significant
computational advantages of the introduced methods.
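The finite horizon MDP recursion underlying route choice can be sketched by standard backward induction (an illustrative baseline only; the paper's contribution is the equivalent, more efficient minimum cost hyperpath formulation):

```python
def backward_induction(T, states, actions, P, c):
    """Solve a finite-horizon MDP by backward induction.

    P[t][s][a] is a list of (next_state, prob) pairs, c[t][s][a] the
    expected stage cost (e.g. travel time) of taking action a in state s
    at time t. Returns the cost-to-go V and a policy pi minimizing
    expected total cost over T stages.
    """
    V = [{s: 0.0 for s in states} for _ in range(T + 1)]  # V[T] = 0 terminal
    pi = [{} for _ in range(T)]
    for t in range(T - 1, -1, -1):
        for s in states:
            best_a, best_v = None, float("inf")
            for a in actions(t, s):
                # Bellman backup: stage cost plus expected future cost.
                v = c[t][s][a] + sum(p * V[t + 1][s2] for s2, p in P[t][s][a])
                if v < best_v:
                    best_a, best_v = a, v
            V[t][s], pi[t][s] = best_v, best_a
    return V, pi
```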
Abstract: This paper presents a very simple and efficient
algorithm for codebook search, which reduces a great deal of
computation as compared to the full codebook search. The algorithm
is based on a sorting and centroid technique for the search. The
results table shows the effectiveness of the proposed algorithm in
terms of computational complexity. In this paper we also introduce a
new performance parameter, the average fractional change in pixel
value, which we feel gives a better understanding of the closeness of
two images since it is related to perception. This new performance
parameter takes into consideration the average fractional change in
each pixel value.
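One plausible formalisation of the average fractional change in pixel value is the mean relative pixel error below (the paper's exact formula is not given in the abstract, so this definition is an assumption):

```python
import numpy as np

def avg_fractional_change(original, reconstructed, eps=1e-9):
    """Average fractional change in pixel value between an original
    image and its reconstruction: the mean over pixels of
    |orig - recon| / (orig + eps), with eps guarding against zeros.
    """
    o = np.asarray(original, float)
    r = np.asarray(reconstructed, float)
    return float(np.mean(np.abs(o - r) / (o + eps)))
```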