Abstract: Detecting changes in multiple images of the same
scene has recently seen increased interest due to many contemporary applications, including smart security systems, smart homes, remote sensing, surveillance, medical diagnosis, weather forecasting, speed and distance measurement, post-disaster forensics, and more. These applications differ in the scale, nature, and
speed of change. This paper presents an application of image
processing techniques to implement a real-time change detection
system. Change is identified by comparing the RGB representation of
two consecutive frames captured in real-time. The detection threshold
can be controlled to account for various luminance levels. The
comparison result is passed through a filter before decision making to
reduce false positives, especially at lower luminance conditions. The
system is implemented as a MATLAB Graphical User Interface (GUI) with several controls to manage its operation and performance.
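The frame-comparison pipeline described above (RGB differencing, a tunable threshold, and a filtering stage to suppress false positives) can be sketched as follows; the threshold and filter size are illustrative placeholders, not the values used in the MATLAB GUI.

```python
import numpy as np

def detect_change(frame_a, frame_b, threshold=30):
    """Flag change between two RGB frames via per-pixel differencing.

    A 3x3 majority (median) filter on the thresholded difference map
    suppresses isolated noisy pixels (false positives) before the
    decision is made. Threshold value is an illustrative placeholder.
    """
    diff = np.abs(frame_a.astype(int) - frame_b.astype(int)).sum(axis=2)
    mask = (diff > threshold).astype(np.uint8)
    h, w = mask.shape
    pad = np.pad(mask, 1)
    filtered = np.zeros_like(mask)
    for i in range(h):
        for j in range(w):
            # keep a pixel only if a majority of its 3x3 neighbourhood changed
            filtered[i, j] = 1 if pad[i:i + 3, j:j + 3].sum() >= 5 else 0
    return bool(filtered.any()), filtered
```

Raising the threshold makes the detector less sensitive, which is useful at low luminance where sensor noise inflates the per-pixel differences.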
Abstract: To reduce cost, digital cameras use a single image sensor to capture color images. The Color Filter Array (CFA) in digital cameras permits only one of the three primary colors (red, green, or blue) to be sensed at each pixel, and the two missing components are interpolated through a method named demosaicking. In typical applications, the captured data is interpolated into a full-color image and then compressed. Color interpolation before compression leads to data redundancy. This
paper proposes a new Vector Quantization (VQ) technique to
construct a VQ codebook with Differential Evolution (DE)
Algorithm. The new technique is compared to the conventional Linde-Buzo-Gray (LBG) method.
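As background, the conventional LBG baseline mentioned above can be sketched as follows; this is a generic splitting implementation, not the paper's DE-based construction, and it assumes the requested codebook size is a power of two.

```python
import numpy as np

def lbg_codebook(vectors, size, iters=20, eps=1e-3):
    """Train a VQ codebook with the Linde-Buzo-Gray (LBG) method.

    Starts from the global centroid and repeatedly splits every
    codeword, refining with Lloyd iterations (nearest-neighbour
    assignment + centroid update) until `size` codewords exist.
    `size` is assumed to be a power of two.
    """
    codebook = vectors.mean(axis=0, keepdims=True)
    while len(codebook) < size:
        # perturb each codeword in two opposite directions
        codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])
        for _ in range(iters):
            dists = ((vectors[:, None, :] - codebook[None]) ** 2).sum(axis=2)
            labels = dists.argmin(axis=1)
            for k in range(len(codebook)):
                members = vectors[labels == k]
                if len(members):
                    codebook[k] = members.mean(axis=0)
    return codebook
```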
Abstract: Spectrum handover is a significant topic in cognitive radio networks for assuring efficient data transmission in cognitive radio users' communications. This paper presents a comparison between three spectrum handover models: VIKOR, SAW, and MEW. Four evaluation metrics are used: the accumulative averages of failed handovers, handovers performed, transmission bandwidth, and transmission delay. Unlike related work, the performance of the three
spectrum handover models was validated with captured data of
spectrum occupancy in experiments performed at the GSM frequency
band (824 MHz - 849 MHz). These data represent the actual behavior
of the licensed users in this wireless frequency band. The results of the comparison show that the VIKOR Algorithm provides a 15.8% performance improvement over the SAW Algorithm and is 12.1% better than the MEW Algorithm.
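For illustration, the SAW model (the simplest of the three) reduces to a weighted sum over normalized decision criteria; the criteria names and weights below are hypothetical, not the metrics used in the experiments.

```python
def saw_rank(channels, weights):
    """Rank candidate channels with Simple Additive Weighting (SAW).

    Each channel is a dict of normalized criteria in [0, 1] where
    larger is better; the SAW score is the weighted sum, and the
    handover target is the highest-scoring channel. Criteria names
    and weights are hypothetical.
    """
    def score(ch):
        return sum(weights[c] * ch[c] for c in weights)
    return sorted(channels, key=score, reverse=True)
```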
Abstract: This paper develops a multiple channel assignment model that takes advantage of spectrum opportunities in cognitive radio networks in the most efficient way. The developed scheme allows making several assignments of available, frequency-adjacent channels for transmissions that require a larger bandwidth, under a fairness environment. The hybrid assignment model is made up of two algorithms: one that ranks and selects available frequency channels, and another in charge of establishing Max-Min Fairness so as not to restrict the spectrum opportunities of the other secondary users who also wish to transmit. Measurements were made of average bandwidth and average delay, as well as fairness, for several channel assignments. The results were evaluated with experimental spectrum occupancy data captured from the GSM frequency band. The developed model shows evidence of improved use of spectrum opportunities and a wider average transmission bandwidth for each secondary user, while maintaining fairness criteria in channel assignment.
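The Max-Min Fairness step can be sketched as the classical progressive-filling allocation; this is a generic illustration, not the paper's exact algorithm.

```python
def max_min_fair(capacity, demands):
    """Allocate channel capacity among secondary users with Max-Min Fairness.

    Users with small demands are fully satisfied; the remaining capacity
    is shared equally among the unsatisfied users, so no user can gain
    without reducing the share of a user that already has less.
    """
    alloc = {u: 0.0 for u in demands}
    remaining = dict(demands)
    cap = capacity
    while remaining and cap > 0:
        share = cap / len(remaining)
        satisfied = {u: d for u, d in remaining.items() if d <= share}
        if not satisfied:
            # nobody's demand fits in an equal share: split cap equally
            for u in remaining:
                alloc[u] += share
            return alloc
        for u, d in satisfied.items():
            alloc[u] += d
            cap -= d
            del remaining[u]
    return alloc
```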
Abstract: In implant-assisted magnetic drug targeting (IA-MDT), magnetic implants are placed strategically at the target site to greatly and locally increase the magnetic force on magnetic drug carrier particles (MDCPs) and to help attract and retain the MDCPs
at the targeted region. In the present work, we develop a
mathematical model to study the capturing of magnetic nanoparticles
flowing within a fluid in an implant assisted cylindrical channel
under magnetic field. A coil of ferromagnetic SS-430 has been
implanted inside the cylindrical channel to enhance the capturing of
magnetic nanoparticles under magnetic field. The dominant magnetic
and drag forces, which significantly affect the capturing of
nanoparticles, are incorporated in the model. It is observed through
model results that capture efficiency increases as the magnetic field is increased from 0.1 to 0.5 T. Capture efficiency rises with the magnetic field because a stronger field increases the attractive magnetization force responsible for capturing the magnetic particles, resulting in the capture of a larger number of particles.
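The two dominant forces in such a model are commonly written as the magnetization force and the Stokes drag; a back-of-envelope sketch with illustrative parameter values (not those of the paper) follows.

```python
import math

def forces_on_particle(radius, grad_B2, fluid_velocity,
                       chi=1.0, mu0=4e-7 * math.pi, viscosity=1e-3):
    """Compare the two dominant forces on a magnetic nanoparticle.

    Magnetization force on a particle of volume V in a field gradient:
        F_m = V * chi / (2 * mu0) * grad(B^2)
    Stokes drag on a sphere moving relative to the fluid:
        F_d = 6 * pi * eta * r * v
    All parameter values here are illustrative, not from the paper.
    """
    volume = 4.0 / 3.0 * math.pi * radius ** 3
    f_mag = volume * chi / (2 * mu0) * grad_B2
    f_drag = 6 * math.pi * viscosity * radius * fluid_velocity
    return f_mag, f_drag
```

Since the magnetization force scales with the gradient of B², raising the applied field from 0.1 T to 0.5 T increases it sharply, which is consistent with the reported rise in capture efficiency.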
Abstract: Nowadays, illegal logging causes many harmful effects, including flash floods, avalanches, and global warming. The purpose of this study was to help maintain the earth's ecosystem by protecting Malaysia's valuable rainforest, utilizing a new technology that provides real-time alerts and enables the authorities to respond faster to these illegal activities. The methodology of this research covers the design stages that were conducted, the system model and system architecture of the prototype, and the hardware and software mainly used: a microcontroller and sensors with an integrated GSM and GPS system. This prototype was deployed at the Royal Belum forest in December 2014 for phase 1 and April 2015 for phase 2, at 21 pinpoint locations. The findings of this research were the real-time capture of data such as temperature, humidity, gas, fire, and rain detection, which indicate the current natural state and habitat of the forest. In addition, the device's current location can be determined via GPS and transmitted by SMS over the GSM system. All of its readings were sent in real-time for further analysis. The data, when compared to meteorological department records, showed that the precision of this device was about 95%, and these findings proved that the system is acceptable and suitable for use in the field.
Abstract: In this talk, we introduce a newly developed quantile
function model that can be used for estimating conditional
distributions of financial returns and for obtaining multi-step ahead
out-of-sample predictive distributions of financial returns. Since we
forecast the whole conditional distributions, any predictive quantity
of interest about the future financial returns can be obtained simply
as a by-product of the method. We also show an application of the model to the daily closing prices of the Dow Jones Industrial Average (DJIA) series over the period from 2 January 2004 to 8 October 2010.
We obtained the predictive distributions up to 15 days ahead for
the DJIA returns, which were further compared with the actually
observed returns and those predicted from an AR-GARCH model.
The results show that the new model can capture the main features
of financial returns and provide a better fitted model together with
improved mean forecasts compared with conventional methods. We hope this talk will help the audience see that this new model has the potential to be very useful in practice.
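One common way to evaluate such predictive distributions against observed returns is empirical interval coverage; the sketch below is a generic evaluation helper, not part of the proposed model.

```python
def interval_coverage(predicted_intervals, observed):
    """Evaluate predictive distributions by empirical interval coverage.

    predicted_intervals: one (low, high) predictive-quantile pair per
    forecast date; observed: the realized returns. Returns the fraction
    of observations falling inside their interval, which should match
    the nominal level for a well-calibrated model. Generic helper, not
    part of the paper's quantile function model.
    """
    hits = sum(lo <= y <= hi
               for (lo, hi), y in zip(predicted_intervals, observed))
    return hits / len(observed)
```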
Abstract: The need to merge software artifacts seems inherent
to modern software development. Distribution of development over
several teams and breaking tasks into smaller, more manageable pieces are effective means of dealing with this kind of complexity. In each case, the separately developed artifacts need to be assembled as efficiently as possible into a consistent whole in which the parts still function as described. In addition, the earlier changes are introduced into the life cycle, the easier they are for designers to manage.
Interaction-based specifications such as UML sequence diagrams
have been found effective in this regard. As a result, sequence
diagrams can be used not only for capturing system behaviors but
also for merging changes in order to create a new version. The
objective of this paper is to suggest a new approach to deal with the
problem of software merging at the level of sequence diagrams by
using the concept of dependence analysis that captures, formally, all
mapping, and differences between elements of sequence diagrams
and serves as a key concept to create a new version of sequence
diagram.
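As an illustration of capturing the mapping and differences between two versions, an LCS-style alignment over message sequences can serve as a simplified stand-in for the dependence analysis described above; the message names are hypothetical.

```python
def diff_messages(base, revised):
    """Compute the mapping and differences between two message sequences.

    A longest-common-subsequence alignment marks messages as kept,
    removed, or added -- a simplified stand-in for the dependence
    analysis between sequence-diagram versions.
    """
    m, n = len(base), len(revised)
    lcs = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m - 1, -1, -1):
        for j in range(n - 1, -1, -1):
            lcs[i][j] = (lcs[i + 1][j + 1] + 1 if base[i] == revised[j]
                         else max(lcs[i + 1][j], lcs[i][j + 1]))
    i = j = 0
    ops = []
    while i < m and j < n:
        if base[i] == revised[j]:
            ops.append(("keep", base[i])); i += 1; j += 1
        elif lcs[i + 1][j] >= lcs[i][j + 1]:
            ops.append(("remove", base[i])); i += 1
        else:
            ops.append(("add", revised[j])); j += 1
    ops += [("remove", x) for x in base[i:]]
    ops += [("add", x) for x in revised[j:]]
    return ops
```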
Abstract: Current transformers (CTs) are an integral part of the power system because they provide a proportional, safe amount of current for protection and measurement applications. However, when the power system experiences an abnormal situation leading to a huge current flow, this huge current is proportionally injected into the protection and metering circuit. Since protection and metering equipment is designed to withstand only a certain amount of current
with respect to time, these high currents pose a risk to man and
equipment. Therefore, during such instances, the CT saturation
characteristics have a huge influence on the safety of both man and
equipment and on the reliability of the protection and metering
system. This paper shows the effect of burden on the Accuracy Limiting Factor (ALF) / Instrument Security Factor (ISF) of current transformers and the change in the saturation characteristics of the CTs. The response of the
CT to varying levels of overcurrent at different connected burden will
be captured using the data acquisition software LabVIEW. Analysis is done on the real-time data gathered using LabVIEW. Variation of
current transformer saturation characteristics with changes in burden
will be discussed.
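The link between burden and saturation can be illustrated with the usual knee-point relation: at the accuracy limit, the CT must drive ALF times rated secondary current through its winding resistance plus the connected burden. This textbook-style formula is illustrative, not derived from the paper's measurements.

```python
def ct_knee_voltage(alf, rated_secondary_current, r_ct, r_burden):
    """Estimate the secondary voltage a CT must develop at its
    accuracy limit before saturating:

        V_k = ALF * I_rated * (R_ct + R_burden)

    A larger connected burden raises the required knee-point voltage,
    so the CT saturates at a lower overcurrent level. Illustrative
    textbook relation, not taken from the paper's measurements.
    """
    return alf * rated_secondary_current * (r_ct + r_burden)
```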
Abstract: Laban Movement Analysis (LMA), developed in the
dance community over the past seventy years, is an effective method
for observing, describing, notating, and interpreting human
movement to enhance communication and expression in everyday
and professional life. Many applications that use motion capture data could be significantly leveraged if the Laban qualities were recognized automatically. This paper presents an automated
recognition method of Laban qualities from motion capture skeletal
recordings and it is demonstrated on the output of Microsoft’s Kinect
V2 sensor.
Abstract: The purpose of this project is to propose a quick and
environmentally friendly alternative to measure the quality of oils
used in the food industry. There is evidence that the repeated and indiscriminate use of oils in food processing causes physicochemical changes with the formation of potentially toxic compounds that can
affect the health of consumers and cause organoleptic changes. In
order to assess the quality of oils, non-destructive optical techniques
such as Interferometry offer a rapid alternative to the use of reagents,
using only the interaction of light on the oil. Through this project, we
used interferograms of samples of oil placed under different heating
conditions to establish the changes in their quality. These
interferograms were obtained by means of a Mach-Zehnder
Interferometer using a beam of light from a HeNe laser of 10mW at
632.8 nm. Each interferogram was captured and analyzed, and its full width at half-maximum (FWHM) was measured using the Amcap and ImageJ software. The FWHM measurements were organized into three groups. It was observed that the average obtained from each of the FWHMs of group A shows a behavior that is almost linear; therefore, it is probable that the exposure time is not relevant when the oil is kept at constant temperature. Group B exhibits a slight exponential trend when the temperature rises between 373 K and 393 K. Results of the Student's t-test show, at the 95% confidence level (α = 0.05), the existence of variation in the molecular composition of both samples.
Furthermore, we found a correlation between the Iodine Indexes
(Physicochemical Analysis) and the Interferograms (Optical
Analysis) of group C. Based on these results, this project highlights the importance of the quality of the oils used in the food industry and shows how Interferometry can be a useful tool for this purpose.
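The FWHM measurement itself can be sketched as follows for a sampled intensity profile; this is a generic half-maximum crossing routine, not the exact Amcap/ImageJ procedure.

```python
import numpy as np

def fwhm(x, y):
    """Measure the full width at half maximum (FWHM) of an intensity profile.

    Finds the two points where the profile crosses half its peak value,
    using linear interpolation between samples. Generic routine, not
    the exact ImageJ procedure.
    """
    y = np.asarray(y, dtype=float)
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    left, right = above[0], above[-1]

    def cross(i0, i1):
        # linear interpolation of the exact half-maximum crossing
        return x[i0] + (half - y[i0]) * (x[i1] - x[i0]) / (y[i1] - y[i0])

    x_left = cross(left - 1, left) if left > 0 else x[0]
    x_right = cross(right + 1, right) if right < len(y) - 1 else x[-1]
    return x_right - x_left
```

For a Gaussian profile the result should match the analytic value 2√(2 ln 2) σ ≈ 2.355 σ.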
Abstract: In this paper, we provide a literature survey on the artificial stock market (ASM). The paper begins by exploring the complexity of the stock market and the need for ASMs. The ASM
aims to investigate the link between individual behaviors (micro
level) and financial market dynamics (macro level). The variety of
patterns at the macro level is a function of the AFM complexity. The
financial market system is a complex system where the relationship
between the micro and macro level cannot be captured analytically.
Computational approaches, such as simulation, are expected to help in understanding this connection. Agent-based simulation is a simulation technique commonly used to build AFMs. The paper proceeds by
discussing the components of the ASM. We consider the roles
of behavioral finance (BF) alongside the traditionally risk-averse
assumption in the construction of agents' attributes. The influence of social networks on the development of agent interactions is also addressed. Network topologies such as small-world, distance-based, and scale-free networks may be utilized to outline economic collaborations. In addition, the primary methods for developing agents' learning and adaptive abilities are summarized. These include approaches such as Genetic Algorithms, Genetic Programming, Artificial Neural Networks, and Reinforcement Learning. Furthermore, the most common statistical properties (the stylized facts) of stocks that are used for the calibration and validation of ASMs are discussed. We also review the major related previous studies and categorize the approaches utilized in them. Finally, research directions and potential research questions are discussed. The research directions of ASM may focus on the macro level, by analyzing market dynamics, or on the micro level, by investigating the wealth distributions of the agents.
Abstract: In this paper, an autonomous hovering control method for a multicopter using only a Web camera is proposed. Recently, various control methods for the autonomous flight of multicopters have been proposed. However, the previously proposed methods often use a motion capture system (e.g., OptiTrack) or a laser range finder to measure the position and posture of the multicopter. To achieve autonomous flight control of a multicopter with simple equipment, we propose an autonomous flight control method using an AR marker and a Web camera. The AR marker allows measuring the position of the multicopter in three-dimensional Cartesian coordinates; this position is then mapped to the aileron, elevator, and accelerator throttle operations. A simple PID control method is applied to each operation, and the controller gains are adjusted. Experimental results are given to show the effectiveness of our proposed method. Moreover, another simple operation method for autonomous multicopter flight control is also proposed.
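The per-axis PID loop described above can be sketched as follows; the gains are placeholders, not the values tuned in the experiments.

```python
class PID:
    """Discrete PID controller, one instance per control axis.

    Each axis (aileron, elevator, throttle) receives a position error
    from the AR-marker measurement and outputs a stick command. The
    gains here are placeholders, not the values tuned in the paper.
    """
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```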
Abstract: The literature on language teaching and second
language acquisition has been largely driven by monolingual
ideology with a common assumption that a second language (L2) is
best taught and learned in the L2 only. The current study challenges
this assumption by reporting learners' positive perceptions of tertiary
level teachers' code switching practices in Vietnam. The findings of
this study contribute to our understanding of code switching practices
in language classrooms from a learners' perspective.
Data were collected through focus group interviews from student participants who were working towards a Bachelor's degree in English within the English for Business Communication stream.
The literature has documented that this method of interviewing has a
number of distinct advantages over individual student interviews. For
instance, group interactions generated by focus groups create a more
natural environment than that of an individual interview because they
include a range of communicative processes in which each individual
may influence or be influenced by others - as they are in their real
life. The process of interaction provides the opportunity to obtain the
meanings and answers to a problem that are "socially constructed
rather than individually created" leading to the capture of real-life
data. The distinct feature of group interaction offered by this
technique makes it a powerful means of obtaining deeper and richer
data than those from individual interviews. The data generated
through this study were analysed using a constant comparative
approach. Overall, the students expressed positive views of this
practice indicating that it is a useful teaching strategy. Teacher code
switching was seen as a learning resource and a source supporting
language output. This practice was perceived to promote student
comprehension and to aid the learning of content and target language
knowledge. This practice was also believed to scaffold the students'
language production in different contexts. However, the students
indicated their preference for teacher code switching to be
constrained, as extensive use was believed to negatively impact on
their L2 learning and trigger cognitive reliance on the L1 for L2
learning. The students also perceived that when the L1 was used to a
great extent, their ability to develop as autonomous learners was
negatively impacted.
This study found that teacher code switching was supported in
certain contexts by learners, thus suggesting that there is a need for
the widespread assumption about the monolingual teaching approach
to be re-considered.
Abstract: Most of the oil palm plantations have been threatened
by Basal Stem Rot (BSR) disease which causes serious economic
impact. This study was conducted to identify the healthy and BSRinfected
oil palm tree using thirteen color indices. Multispectral and
thermal camera was used to capture 216 images of the leaves taken
from frond number 1, 9 and 17. Indices of normalized difference
vegetation index (NDVI), red (R), green (G), blue (B), near infrared
(NIR), green – blue (GB), green/blue (G/B), green – red (GR),
green/red (G/R), hue (H), saturation (S), intensity (I) and thermal
index (T) were used. From this study, it can be concluded that the G index taken from frond number 9 is the best index for differentiating between healthy and BSR-infected oil palm trees. It not only gave a high correlation coefficient (R = -0.962) but also a high value of separation between healthy and BSR-infected trees. Furthermore, the power and S models developed using the G index gave the highest R² value, which is 0.985.
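Several of the listed indices are simple band arithmetic; a minimal sketch (band values assumed to be reflectances) follows.

```python
import numpy as np

def vegetation_indices(nir, red, green, blue):
    """Compute a few of the color indices used to screen oil palm leaves.

    Bands are float arrays of reflectance values from the multispectral
    camera; a small epsilon avoids division by zero.
    """
    eps = 1e-9
    return {
        "NDVI": (nir - red) / (nir + red + eps),
        "GB": green - blue,
        "G/B": green / (blue + eps),
        "GR": green - red,
        "G/R": green / (red + eps),
    }
```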
Abstract: The sea waves carry thousands of GWs of power
globally. Although there are a number of different approaches to
harness offshore energy, they are likely to be expensive, practically
challenging, and vulnerable to storms. Therefore, this paper considers
using the near shore waves for generating mechanical and electrical
power. It introduces two new approaches, the wave manipulation and
using a variable duct turbine, for intercepting very wide wave fronts
and coping with the fluctuations of the wave height and the sea level,
respectively. The first approach effectively allows capturing much
more energy yet with a much narrower turbine rotor. The second
approach allows using a rotor with a smaller radius that captures the energy of higher wave fronts at higher sea levels while preventing the rotor from submerging totally. To illustrate the effectiveness of the first
approach, the paper contains a description and the simulation results
of a scale model of a wave manipulator. Then, it includes the results
of testing a physical model of the manipulator and a single duct, axial
flow turbine in a wave flume in the laboratory. The paper also
includes comparisons of theoretical predictions, simulation results,
and wave flume tests with respect to the incident energy, loss in wave
manipulation, minimal loss, brake torque, and the angular velocity.
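For context, the energy flux available per metre of wave front is commonly estimated with the deep-water linear-theory formula below; this is standard background, not a formula from the paper.

```python
import math

def wave_power_per_metre(height, period, rho=1025.0, g=9.81):
    """Deep-water wave energy flux per metre of wave front (W/m).

    Standard linear wave theory estimate,
        P = rho * g**2 * H**2 * T / (32 * pi),
    where H is wave height and T is wave period. Illustrative
    background, not a formula taken from the paper.
    """
    return rho * g ** 2 * height ** 2 * period / (32 * math.pi)
```

A 2 m, 8 s swell carries roughly 31 kW per metre of wave front, which is why intercepting very wide fronts (the first approach above) matters so much.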
Abstract: Due to today’s globalization as well as outsourcing
practices of the companies, the Supply Chain (SC) performances
have become more dependent on the efficient movement of material
among places that are geographically dispersed, where there is more
chance for disruptions. One such disruption is the quality and
delivery uncertainties of outsourcing. These uncertainties could lead to unsafe products and, as a number of recent examples show, companies may end up recalling their products.
As a result of these problems, there is a need to develop a
methodology for selecting suppliers globally in view of risks
associated with low quality and late delivery. Accordingly, we
developed a two-stage stochastic model that captures the risks
associated with uncertainty in quality and delivery as well as a
solution procedure for the model. The stochastic model developed
simultaneously optimizes supplier selection and purchase quantities
under price discounts over a time horizon. In particular, our target is
the study of global organizations with multiple sites and multiple
overseas suppliers, where the pricing is offered in suppliers’ local
currencies. Our proposed methodology is applied to a case study for a
US automotive company having two assembly plants and four
potential global suppliers to illustrate how the proposed model works
in practice.
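The intuition behind the two-stage model (penalizing expected quality and delivery failures alongside price) can be sketched in a toy single-period form; all fields and numbers below are hypothetical, and the real model additionally handles price discounts, local currencies, and multiple sites and periods.

```python
def expected_unit_cost(s):
    """Expected effective unit cost of a supplier under quality and
    delivery risk: base price plus probability-weighted penalties.
    A toy stand-in for the paper's two-stage stochastic model;
    all fields are hypothetical.
    """
    return (s["price"]
            + s["defect_prob"] * s["defect_penalty"]
            + s["late_prob"] * s["late_penalty"])

def select_supplier(suppliers):
    """Pick the supplier minimizing expected effective unit cost."""
    return min(suppliers, key=expected_unit_cost)
```

A nominally cheaper supplier with high defect or lateness probabilities can lose to a pricier but reliable one once the expected penalties are priced in.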
Abstract: The future development of science is increasingly seen in interdisciplinary areas such as biomedical engineering. Self-assembled structures, similar to stem cell niches, would inhibit the fast division process and subsequently capture stem cells from the blood flow. By means of surface topography, stiffness, and microstructure, progenitor cells should be differentiated towards the formation of an endothelial cell monolayer, which will effectively inhibit activation of the coagulation cascade. The idea of the material
surface development met the interest of the clinical institutions,
which support the development of science in this area and are waiting
for scientific solutions that could contribute to the development of
heart assist systems. This would improve the efficiency of the
treatment of patients with myocardial failure, supported with artificial
heart assist systems. Innovative materials would enable, as a post-project activity, the redesign and construction of ventricular heart assist devices.
Abstract: The boiling process is characterized by the rapid formation of vapour bubbles at the solid–liquid interface (nucleate
boiling) with pre-existing vapour or gas pockets. Computational fluid
dynamics (CFD) is an important tool to study bubble dynamics. In
the present study, CFD simulation has been carried out to determine
the bubble detachment diameter and its terminal velocity. Volume of
fluid (VOF) method is used to model the bubble and its surroundings by solving a single set of momentum equations and tracking the volume fraction of each of the fluids throughout the domain. In the simulation, a bubble is generated by allowing water vapour to enter a cylinder filled with liquid water through an inlet at the bottom. After the bubble is fully formed, it detaches from the surface and rises, accelerating due to the net balance between the buoyancy force and viscous drag. Finally, when these forces exactly balance each other, the bubble attains a constant terminal velocity. The
bubble detachment diameter and the terminal velocity of the bubble
are captured by the monitor function provided in FLUENT. The
detachment diameter and the terminal velocity obtained are compared
with established results based on the shape of the bubble. Good agreement is obtained between the simulation results, the analytical equations, and the established results.
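As a sanity check on the terminal velocity, the force balance between buoyancy and viscous drag has a closed form in the Stokes regime; the sketch below treats the bubble as a rigid sphere and is illustrative only, not the FLUENT VOF result.

```python
def terminal_velocity_stokes(diameter, rho_liquid, rho_vapour,
                             viscosity, g=9.81):
    """Terminal rise velocity of a small bubble from the force balance
    buoyancy = viscous drag, in the Stokes regime (Re << 1), treating
    the bubble as a rigid sphere:

        (rho_l - rho_v) * g * V = 3 * pi * mu * d * U
        =>  U = (rho_l - rho_v) * g * d**2 / (18 * mu)

    Illustrative back-of-envelope check, not the FLUENT VOF result.
    """
    return (rho_liquid - rho_vapour) * g * diameter ** 2 / (18 * viscosity)
```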
Abstract: Designing cost-efficient, secure network protocols for
Wireless Sensor Networks (WSNs) is a challenging problem because
sensors are resource-limited wireless devices. Security services such
as authentication and improved pairwise key establishment are
critical to highly efficient networks of sensor nodes. For sensor nodes to communicate securely and efficiently with each other, the use of cryptographic techniques is necessary. In this paper, we propose two key predistribution schemes that enable a mobile sink to establish a secure data-communication link, on the fly, with any sensor node.
The intermediate nodes along the path to the sink are able to verify
the authenticity and integrity of the incoming packets using a
predicted value of the key generated by the sender’s essential power.
The proposed schemes are based on pairwise keys with the mobile sink. Our analytical results clearly show that our schemes perform better in terms of network resilience to node capture than existing schemes when used in wireless sensor networks with mobile sinks.
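The hop-by-hop authenticity and integrity check can be illustrated with a MAC computed under a pre-distributed pairwise key; this sketch uses HMAC-SHA256 as a generic stand-in for the paper's key-generation scheme.

```python
import hashlib
import hmac

def make_packet(pairwise_key, payload):
    """Attach a MAC so intermediate nodes holding the pre-distributed
    pairwise key can verify authenticity and integrity of the packet.
    HMAC-SHA256 is a generic stand-in for the paper's scheme.
    """
    tag = hmac.new(pairwise_key, payload, hashlib.sha256).digest()
    return payload, tag

def verify_packet(pairwise_key, payload, tag):
    """Recompute the MAC and compare in constant time."""
    expected = hmac.new(pairwise_key, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)
```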