Abstract: Indonesia has experienced annual forest fires that have
rapidly destroyed and degraded its forests. Fires in the peat swamp
forests of Riau Province have worsened these problems, as this
ecosystem is the most prone to fire and its fires are the most
difficult to extinguish. Despite various efforts to curb deforestation
and forest degradation, severe forest fires still occur. To find an
effective solution, the basic causes of the problem must be
identified. It is therefore critical to have an in-depth
understanding of the underlying causal factors that have
contributed to deforestation and forest degradation as a whole, in
order to reduce their rates. An assessment of the drivers of deforestation and forest
degradation was carried out in order to design and implement
measures that could slow these destructive processes. Research was
conducted in Giam Siak Kecil–Bukit Batu Biosphere Reserve
(GSKBB BR), in the Riau Province of Sumatera, Indonesia. A
biosphere reserve was selected as the study site because such reserves
aim to reconcile conservation with sustainable development. A
biosphere reserve should promote a range of local human activities,
together with development values that are spatially and economically
in line with the area's conservation values, through the use of a
zoning system. Moreover, GSKBB BR is an area with vast peatlands
that experiences forest fires annually. Various factors were
analysed to assess the drivers of deforestation and forest degradation
in GSKBB BR; data were collected from focus group discussions
with stakeholders, key informant interviews with key stakeholders,
field observation and a literature review. Landsat satellite imagery was used to map forest-cover changes
for various periods. Analysis of Landsat images taken during the
period 2010-2014 revealed that, within the non-protected area of the
core zone, there was a trend towards decreasing peat swamp forest
areas, increasing land clearance, and increasing areas of community
oil-palm and rubber plantations. Fire was used for land clearing, and most
of the forest fires occurred in the most populous area (the transition
area). The study found a relationship between the deforested/
degraded areas and certain distance variables, i.e. distance from
roads, villages and the borders between the core area and the buffer
zone. The further the distance from the core area of the reserve, the
higher was the degree of deforestation and forest degradation. Research findings suggested that agricultural expansion may be
the direct cause of deforestation and forest degradation in the reserve,
whereas socio-economic factors were the underlying driver of forest
cover changes; these factors comprised a combination of socio-cultural,
infrastructural, technological, institutional (policy and governance), demographic (population pressure) and economic
(market demand) considerations. These findings indicated that local
factors/problems were the critical causes of deforestation and
degradation in GSKBB BR. This research therefore concluded that
reductions in deforestation and forest degradation in GSKBB BR
could be achieved through ‘local actor’-tailored approaches such as
community empowerment.
Abstract: File sharing in networks is generally achieved using
Peer-to-Peer (P2P) applications. Structured P2P approaches are
widely used in ad hoc networks due to their distributed and scalable
nature. Efficient mechanisms are required to handle the huge
amount of data distributed to all peers. The intrinsic characteristics
of a P2P system make content distribution easier than in a
client-server architecture. All the nodes in a P2P network act as both
client and server; thus, distributing data takes less time than
in the client-server method. Chord is a resource-routing protocol
in which nodes and data items are structured into a one-
dimensional ring. The structured lookup algorithm of Chord is
advantageous for distributed P2P networking applications. However,
while the structured approach improves lookup performance in a high-
bandwidth wired network, it can contribute to unnecessary overhead
in overlay networks, leading to degradation of network performance.
In this paper, the performance of the existing Chord protocol on a
Wireless Mesh Network (WMN) with static and dynamic nodes is
investigated.
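As a point of reference, the sketch below illustrates Chord-style key lookup on a one-dimensional identifier ring. It is a minimal sketch under stated assumptions: the ring size, node identifiers, and the linear successor walk (which real Chord shortcuts to O(log N) hops with finger tables) are illustrative, not the paper's implementation.

```python
# Minimal sketch of Chord-style lookup on a one-dimensional ring.
import hashlib

M = 8                 # bits in the identifier space (assumed)
RING = 2 ** M         # ring size

def chord_id(key: str) -> int:
    """Hash a key onto the identifier ring."""
    return int(hashlib.sha1(key.encode()).hexdigest(), 16) % RING

def in_interval(x, a, b):
    """True if x lies in the half-open ring interval (a, b]."""
    if a < b:
        return a < x <= b
    return x > a or x <= b        # interval wraps around zero

class Node:
    def __init__(self, node_id):
        self.id = node_id
        self.successor = self     # filled in once the ring is built

    def find_successor(self, key_id):
        """Walk the ring until the node responsible for key_id is found.
        Real Chord uses finger tables to make this O(log N) hops."""
        node = self
        while not in_interval(key_id, node.id, node.successor.id):
            node = node.successor
        return node.successor

# Build a toy ring of four nodes and look up a data item.
nodes = [Node(i) for i in (10, 60, 120, 200)]
for a, b in zip(nodes, nodes[1:] + nodes[:1]):
    a.successor = b

key = chord_id("file.txt")
print(f"key {key} is stored on node {nodes[0].find_successor(key).id}")
```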
Abstract: The purpose of this study is the discrimination of 28
postmenopausal women with osteoporotic femoral fractures from an age-
matched control group of 28 women using texture analysis based on
fractals. Two pre-processing approaches are applied to radiographic
images; these techniques are compared to inform the choice of the
pre-processing method. Furthermore, the values of the fractal
dimension are compared to those of the fractal signature in terms of
the classification of the two populations. In a second analysis, the
BMD measured at the proximal femur was compared to the fractal
analysis; the latter, which is a non-invasive technique, allowed
better discrimination. The results confirm that the fractal analysis of
texture on calcaneus radiographs is able to discriminate osteoporotic
patients with femoral fracture from controls. This discrimination was
more efficient than that obtained by BMD alone. It was also
present in comparing subgroups with overlapping values of BMD.
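The paper's exact fractal-signature estimator is not detailed in the abstract; as a hedged illustration, the classic box-counting estimate of a fractal dimension on a binarized texture looks like the following (the box sizes, threshold and random test image are assumptions):

```python
# Box-counting estimate of the fractal dimension of a binary texture.
import numpy as np

def box_count(img: np.ndarray, size: int) -> int:
    """Count boxes of side `size` containing at least one foreground pixel."""
    h, w = img.shape
    count = 0
    for i in range(0, h, size):
        for j in range(0, w, size):
            if img[i:i+size, j:j+size].any():
                count += 1
    return count

def fractal_dimension(img: np.ndarray) -> float:
    """Slope of log N(s) versus log(1/s) over a range of box sizes."""
    sizes = [2, 4, 8, 16, 32]
    counts = [box_count(img, s) for s in sizes]
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Toy usage: a random binary "texture" stands in for a thresholded radiograph.
rng = np.random.default_rng(0)
texture = rng.random((256, 256)) > 0.5
print(f"estimated fractal dimension: {fractal_dimension(texture):.2f}")
```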
Abstract: In this paper, we provide a literature survey on the
artificial stock market (ASM). The paper begins by exploring the
complexity of the stock market and the need for ASMs. An ASM
aims to investigate the link between individual behaviors (micro
level) and financial market dynamics (macro level). The variety of
patterns at the macro level is a function of the ASM's complexity. The
financial market system is a complex system where the relationship
between the micro and macro level cannot be captured analytically.
Computational approaches, such as simulation, are expected to
comprehend this connection. Agent-based simulation is a simulation
technique commonly used to build ASMs. The paper proceeds by
discussing the components of the ASM. We consider the roles
of behavioral finance (BF) alongside the traditional risk-aversion
assumption in the construction of agents' attributes. Also, the
influence of social networks on the development of agent interactions is
addressed. Network topologies such as small-world, distance-based,
and scale-free networks may be utilized to outline economic
collaborations. In addition, the primary methods for developing
agents' learning and adaptive abilities are summarized.
These include approaches such as Genetic Algorithms, Genetic
Programming, Artificial Neural Networks and Reinforcement Learning.
Furthermore, the most common statistical properties (the stylized facts)
of stocks that are used for the calibration and validation of ASMs are
discussed. We also review the major related previous studies and
categorize the approaches they utilize. Finally, research directions
and potential research questions
are discussed. The research directions of ASM may focus on the macro
level by analyzing the market dynamic or on the micro level by
investigating the wealth distributions of the agents.
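To make the micro-to-macro link concrete, here is a deliberately toy agent-based market sketch: agents place random orders at the micro level and the price responds to aggregate excess demand at the macro level. The trading rule, impact coefficient and horizon are illustrative assumptions, not any surveyed model.

```python
# Toy agent-based stock market: random micro-level orders move the
# macro-level price through excess demand.
import random

N_AGENTS, N_STEPS, IMPACT = 100, 250, 0.01   # assumed parameters
price = 100.0
prices = [price]

for _ in range(N_STEPS):
    # micro level: each agent independently buys (+1), sells (-1) or holds (0)
    orders = sum(random.choice((-1, 0, 1)) for _ in range(N_AGENTS))
    # macro level: price reacts to aggregate excess demand
    price *= 1.0 + IMPACT * orders / N_AGENTS
    prices.append(price)

returns = [(b - a) / a for a, b in zip(prices, prices[1:])]
rms = (sum(r * r for r in returns) / len(returns)) ** 0.5
print(f"final price: {prices[-1]:.2f}, RMS return: {rms:.4f}")
```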
Abstract: Experiential marketing is one of the marketing
approaches that offer an exceptional framework for integrating elements
of experience and entertainment into a product or service. Experiential
marketing is defined as a memorable experience that goes deeply into
the customer’s mind. Besides that, customer satisfaction is defined as
an emotional response to the experiences provided by and associated
with particular products or services purchased. Thus, experiential
marketing activities can affect the level of customer satisfaction and
loyalty. In this context, the research aims to explore the relationship
among experiential marketing, customer satisfaction and customer
loyalty among the cosmetic products customers in Konya. The partial
least squares (PLS) method is used to analyze the survey data.
Findings of the present study revealed that experiential marketing is
a significant predictor of, and has a significantly positive effect
on, both customer satisfaction and customer loyalty.
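Survey studies of this kind typically estimate PLS path models (PLS-SEM); as a hedged, simplified stand-in, the sketch below uses scikit-learn's PLSRegression to relate synthetic Likert-style predictor items to a satisfaction score. The respondent count, item counts and effect size are purely illustrative assumptions.

```python
# Simplified PLS sketch on synthetic survey data (not PLS-SEM proper).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(42)
n = 200                                          # hypothetical respondents
# X: four experiential-marketing items (1-5 Likert), y: satisfaction score
X = rng.integers(1, 6, size=(n, 4)).astype(float)
y = 0.6 * X.mean(axis=1) + rng.normal(0, 0.3, n)  # assumed positive effect

pls = PLSRegression(n_components=2)
pls.fit(X, y)
pred = pls.predict(X).ravel()
corr = np.corrcoef(pred, y)[0, 1]
print(f"correlation between predicted and observed satisfaction: {corr:.3f}")
```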
Abstract: The quantitative study of cell mechanics is of
paramount interest, since it governs the behaviour of living cells
in response to the myriad of extracellular and intracellular
mechanical stimuli. Novel experimental techniques together with
robust computational approaches have given rise to new theories and
models, which describe cell mechanics as a combination of
biomechanical and biochemical processes. This review paper
encapsulates the existing continuum-based computational approaches
that have been developed for interpreting the mechanical responses of
living cells under different loading and boundary conditions. The
salient features and drawbacks of each model are discussed from both
structural and biological points of view. This discussion can
contribute to the development of even more precise and realistic
computational models of cell mechanics based on continuum
approaches or on their combination with microstructural approaches,
which in turn may provide a better understanding of
mechanotransduction in living cells.
Abstract: This paper seeks to clarify key terms
such as home schooling and home education as well as the legalities
attached to such terms. It will reflect on the recent proposed changes
to terminology in NSW, Australia. The various pedagogical
approaches to home education will be explored including their
prominence in the Australian context. There is a strong focus on
literature from Australia. The historical background of home
education in Australia will be explained as well as the difference
between distance education and home education. The future of home
education in Australia will be discussed.
Abstract: The lifetime of a wireless sensor network can be
effectively increased by using scheduling operations. Once the
sensors are randomly deployed, the task at hand is to find the largest
number of disjoint sets of sensors such that every sensor set provides
complete coverage of the target area. At any instant, only one of these
disjoint sets is switched on, while all others are switched off. This
paper proposes a heuristic search method to find the maximum
number of disjoint sets that completely cover the region. A
population of randomly initialized members is made to explore the
solution space. A set of heuristics has been applied to guide the
members to a possible solution in their neighborhood. The heuristics
accelerate the convergence of the algorithm. The best solution explored
by the population is recorded and is continuously updated. The
proposed algorithm has been tested for applications which require
sensing of multiple target points, referred to as point coverage
applications. Results show that the proposed algorithm outperforms
the existing algorithms. It always finds the optimum solution, and
does so with fewer fitness function evaluations than the
existing approaches.
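For illustration of the underlying problem (not the paper's population-based heuristic search), a simple greedy baseline that carves disjoint full covers out of a sensor pool might look like this; the coverage map and the greedy richest-first ordering are assumptions:

```python
# Greedy baseline for the maximum disjoint set covers problem.
def disjoint_covers(coverage: dict[int, set[int]], targets: set[int]) -> list[set[int]]:
    """coverage maps sensor id -> set of target ids it senses."""
    remaining = set(coverage)
    covers = []
    while True:
        uncovered, chosen = set(targets), set()
        # pick richest sensors first until every target is covered
        for s in sorted(remaining, key=lambda s: -len(coverage[s] & targets)):
            if coverage[s] & uncovered:
                chosen.add(s)
                uncovered -= coverage[s]
            if not uncovered:
                break
        if uncovered:            # remaining sensors cannot form another cover
            return covers
        covers.append(chosen)
        remaining -= chosen      # keep the covers disjoint

# Toy instance: 6 sensors, 3 target points.
cov = {1: {0, 1}, 2: {2}, 3: {0, 2}, 4: {1}, 5: {0, 1, 2}, 6: {1, 2}}
print(disjoint_covers(cov, {0, 1, 2}))   # two disjoint covers exist here
```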
Abstract: This paper presents a state-of-the-art survey of the
operations research models developed for internal audit planning.
Two alternative approaches have been followed in the literature for
audit planning: (1) identifying the optimal audit frequency; and (2)
determining the optimal audit resource allocation. The first approach
identifies the elapsed time between two successive audits, which can
be presented as the optimal number of audits in a given planning
horizon, or the optimal number of transactions after which an audit
should be performed. It also includes the optimal audit schedule. The
second approach determines the optimal allocation of audit resources
among all auditable units in the firm. In our review, we discuss both
the deterministic and probabilistic models developed for audit
planning. In addition, game theory models are reviewed to find the
optimal auditing strategy based on the interactions between the
auditors and the clients.
Abstract: Background subtraction and temporal difference are
often used for moving object detection in video. Both approaches are
computationally simple and easy to deploy in real-time image
processing. However, while background subtraction is highly
sensitive to dynamic background and illumination changes, the
temporal difference approach is poor at extracting relevant pixels of
the moving object and at detecting the stopped or slowly moving
objects in the scene. In this paper, we propose a simple moving object
detection scheme based on adaptive background subtraction and
temporal difference exploiting dynamic background updates. The
proposed technique consists of histogram equalization, a linear
combination of background and temporal difference, followed by the
novel frame-based and pixel-based background updating techniques.
Finally, morphological operations are applied to the output images.
Experimental results show that the proposed algorithm can solve the
drawbacks of both background subtraction and temporal difference
methods and can provide better performance than either method alone.
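A hedged sketch of the general pipeline the abstract outlines (histogram equalization, a linear combination of background and temporal differences, a background update, and morphological clean-up) is given below with OpenCV. The blend weight, threshold and running-average update are generic assumptions; the paper's novel frame-based and pixel-based update rules are not reproduced here.

```python
# Adaptive background subtraction blended with temporal difference.
import cv2
import numpy as np

ALPHA, W_BG, THRESH = 0.02, 0.6, 30     # assumed update rate, blend weight, threshold
cap = cv2.VideoCapture("input.mp4")     # hypothetical input file
ok, frame = cap.read()
if not ok:
    raise SystemExit("could not read the first frame")
prev = cv2.equalizeHist(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
background = prev.astype(np.float32)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.equalizeHist(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    bg_diff = cv2.absdiff(gray, cv2.convertScaleAbs(background))
    tmp_diff = cv2.absdiff(gray, prev)                 # temporal difference
    combined = cv2.addWeighted(bg_diff, W_BG, tmp_diff, 1.0 - W_BG, 0)
    _, mask = cv2.threshold(combined, THRESH, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # clean-up
    cv2.accumulateWeighted(gray, background, ALPHA)    # background update
    prev = gray
cap.release()
```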
Abstract: Waste Load Allocation (WLA) strategies usually
intend to find economic policies for water resource management.
Water quality trading (WQT) is an approach that uses a discharge
permit market to reduce total environmental protection costs. This
primarily requires assigning discharge limits known as total
maximum daily loads (TMDLs). These are determined by monitoring
organizations with respect to the receiving water quality and
remediation capabilities. The purpose of this study is to compare two
approaches of TMDL assignment for WQT policy in small catchment
area of the Haraz River, in the north of Iran. At first, TMDLs are
assigned uniformly across all point sources to keep the concentrations
of BOD and dissolved oxygen (DO) at the standard level at the checkpoint
(terminus point). This was simulated and controlled by the Qual2kw
software. In the second scenario, TMDLs are assigned
using a multi-objective particle swarm optimization (MOPSO) method
in which the environmental violations in the river basin and the total treatment
costs are minimized simultaneously. In both scenarios, the equity
index and the WLA based on trading discharge permits (TDP) are
calculated. The comparative results showed that using economically
optimized TMDLs (2nd scenario) yields slightly more cost savings
than the uniform TMDL approach (1st scenario). The former annually
costs about 1 M$ while the latter costs 1.15 M$. WQT can decrease
these annual costs to 0.9 and 1.1 M$, respectively. In other words,
these approaches may save about 45% and 35%, respectively, in comparison
with a command-and-control policy. This means that using a multi-objective
decision support system (DSS) may find a more economical WLA;
however, its advantage is not necessarily significant compared with
uniform TMDLs. This may be due to the similar impact factors of
dischargers in small catchments. Conversely, using uniform TMDLs
for WQT brings more equity, so that stakeholders feel less
envious of the difference between TMDL and WQT allocations. In
addition, for this case, determining TMDLs uniformly would be
much easier to monitor. Consequently, a uniform TMDL for the TDP
market is recommended as a sustainable approach. However,
economical TMDLs can be used for larger watersheds.
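As a back-of-envelope consistency check (an illustrative reading of the figures above, not a number reported by the study), the command-and-control baseline \(B\) implied by each savings percentage can be recovered from cost \(= (1 - \text{savings}) \cdot B\):

\[
B \approx \frac{0.9\ \mathrm{M\$}}{1 - 0.45} \approx 1.64\ \mathrm{M\$},
\qquad
B \approx \frac{1.1\ \mathrm{M\$}}{1 - 0.35} \approx 1.69\ \mathrm{M\$}.
\]

Both scenarios imply a baseline of roughly 1.6 to 1.7 M$ per year, so the two quoted savings figures are mutually consistent.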
Abstract: The sea waves carry thousands of GWs of power
globally. Although there are a number of different approaches to
harness offshore energy, they are likely to be expensive, practically
challenging, and vulnerable to storms. Therefore, this paper considers
using the near shore waves for generating mechanical and electrical
power. It introduces two new approaches, wave manipulation and
a variable duct turbine, for intercepting very wide wave fronts
and coping with the fluctuations of the wave height and the sea level,
respectively. The first approach effectively allows capturing much
more energy yet with a much narrower turbine rotor. The second
approach allows using a rotor with a smaller radius that captures the
energy of higher wave fronts at higher sea levels while preventing it
from totally submerging. To illustrate the effectiveness of the first
approach, the paper contains a description and the simulation results
of a scale model of a wave manipulator. Then, it includes the results
of testing a physical model of the manipulator and a single duct, axial
flow turbine in a wave flume in the laboratory. The paper also
includes comparisons of theoretical predictions, simulation results,
and wave flume tests with respect to the incident energy, loss in wave
manipulation, minimal loss, brake torque, and the angular velocity.
Abstract: In recent years a new method of combination
treatment for cancer has been developed and studied that has led to
significant advancements in the field of cancer therapy. Hyperthermia
is a traditional therapy in which a medically approved level of
heat, created with the help of an alternating (AC) magnetic field,
results in the destruction of cancer cells. This paper
gives details regarding the production of the spherical nanocomposite
PVA/γ-Fe2O3 in order to be used for medical purposes such as tumor
treatment by hyperthermia. To reach a suitable and evenly distributed
temperature, a nanocomposite with core-shell morphology and
spherical form, 100 to 200 nanometers in size, was created using
phase-separation emulsion, in which γ-Fe2O3 magnetic nanoparticles
with an average particle size of 20 nanometers and with
different percentages of 0.2, 0.4, 0.5 and 0.6 were covered by
polyvinyl alcohol. The main concern in hyperthermia and heat
treatment is achieving a desirable specific absorption rate (SAR), and
one of the most critical factors in SAR is particle size. In this project,
every attempt has been made to reach a minimal size and, consequently,
a maximum SAR. The morphological analysis of the spherical
structure of the nanocomposite PVA/γ-Fe2O3 was achieved by SEM
analyses and the study of the chemical bonds created was made
possible by FTIR analysis. To investigate the particle size
distribution of the magnetic nanocomposite, a DLS experiment was
conducted. Moreover, to determine the magnetic behavior of the γ-
Fe2O3 particle and the nanocomposite PVA/γ-Fe2O3 in different
concentrations, a VSM test was conducted. To sum up, creating
magnetic nanocomposites with a spherical morphology suitable for
drug loading opens the door to new approaches in developing
nanocomposites that simultaneously provide efficient heating and a
controlled release of drugs inside a magnetic field, characteristics
that could significantly improve the recovery process in patients.
Abstract: The construction and reconstruction of settlements and
individual municipalities, environmental management, the deployment
of productive forces, and the building of transport and technical
infrastructure require a large expenditure of material and human
resources. That is why economic aspects come to the foreground in
most decisions in these areas and are often decisive. What is more
serious, however, is that the economic aspects of settlement creation
and functioning remain, as a whole, unexamined; one can speak only of
a set of individual techniques and methods, traditional indicators,
and experiments with new approaches. This is true both at the level
of the national economy and in urban design itself. Few specific
economic patterns shaping settlements have so far been identified,
and it is even less possible to speak of controlling them. Moreover,
practical assessments of the economics of specific solutions often
use ill-suited indicators; in addition, economy is usually identified
with the lowest acquisition cost or high-intensity land use, with
little regard for functional efficiency and for the much higher, and
little studied, operating and maintenance costs.
Abstract: This paper reviews the model-based qualitative and
quantitative Operations Management research in the context of
Construction Supply Chain Management (CSCM). The construction
industry has traditionally been blamed for low productivity, cost and
time overruns, waste, high fragmentation and adversarial
relationships. The construction industry has been slower than other
industries to employ the Supply Chain Management (SCM) concept
and develop models that support the decision-making and planning.
However, over the last decade there has been a distinct shift from a
project-based to a supply-based approach to construction management. CSCM
has emerged as a promising new management tool for construction
operations and improves the performance of construction projects in
terms of cost, time and quality. Modeling the Construction Supply
Chain (CSC) offers the means to reap the benefits of SCM, make
informed decisions and gain competitive advantage. Different
modeling approaches and methodologies have been applied in the
multi-disciplinary and heterogeneous research field of CSCM. The
literature review reveals that a considerable percentage of the CSC
modeling research accommodates conceptual or process models
which present general management frameworks and do not relate to
acknowledged soft Operations Research methods. We particularly
focus on the model-based quantitative research and categorize the
CSCM models depending on their scope, objectives, modeling
approach, solution methods and software used. Although over the last
few years there has clearly been an increase in research papers on
quantitative CSC models, we identify that the relevant literature is
very fragmented with limited applications of simulation,
mathematical programming and simulation-based optimization. Most
applications are project-specific or study only parts of the supply
system. Thus, some complex interdependencies within construction
are neglected and the implementation of the integrated supply chain
management is hindered. We conclude this paper by giving future
research directions and emphasizing the need to develop optimization
models for integrated CSCM. We stress that CSC modeling needs a
multi-dimensional, system-wide and long-term perspective. Finally,
prior applications of SCM to other industries have to be taken into
account in order to model CSCs, but not without translating the
generic concepts to the context of construction industry.
Abstract: Ontology validation is an important part of web
application development, where knowledge integration and
ontological reasoning play a fundamental role. It aims to ensure the
consistency and correctness of ontological knowledge and to
guarantee that ontological reasoning is carried out in a meaningful
way. Existing approaches to ontology validation address more or less
specific validation issues, but the overall process of validating web
ontologies has not been formally established yet. As the size and the
number of web ontologies continue to grow, more web application
developers will rely on the existing repository of ontologies rather
than develop ontologies from scratch. If an application utilizes
multiple independently created ontologies, their consistency must be
validated and, where necessary, adjusted to ensure proper interoperability
between them. This paper presents a validation technique intended to
test the consistency of independent ontologies utilized by a common
application.
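As one hedged illustration of this kind of cross-ontology consistency check (not the paper's technique), the owlready2 library can load independently created ontologies into one world and run the HermiT reasoner over them; the ontology IRIs below are placeholders, and a Java runtime is required for the reasoner:

```python
# Joint consistency check of two independently created ontologies.
from owlready2 import (get_ontology, default_world, sync_reasoner,
                       OwlReadyInconsistentOntologyError)

# Load two hypothetical ontologies into the same world.
onto_a = get_ontology("http://example.org/ontology_a.owl").load()
onto_b = get_ontology("http://example.org/ontology_b.owl").load()

try:
    sync_reasoner()   # runs HermiT over everything loaded (needs Java)
except OwlReadyInconsistentOntologyError:
    print("the combined ontologies are logically inconsistent")
else:
    # Unsatisfiable classes (equivalent to owl:Nothing) signal local conflicts.
    bad = list(default_world.inconsistent_classes())
    print("unsatisfiable classes:", bad if bad else "none")
```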
Abstract: Surfing is an increasingly popular sport whose performance
evaluation is often qualitative. This work aims at using a smartphone
to collect and analyze GPS and inertial sensor data in order to
obtain quantitative metrics of surfing performance. Two approaches
are compared for the detection of wave rides, computing the number of
waves ridden in a surfing session, the starting time of each wave and
its duration. The first approach is based on computing the velocity
from the Global Positioning System (GPS) signal and finding the
velocity thresholds that allow identifying the start and end of each
wave ride. The second approach adds information from the Inertial
Measurement Unit (IMU) of the smartphone to the velocity thresholds
obtained from the GPS unit to determine the start and end of each
wave ride. The two methods were evaluated using GPS and IMU data from
two surfing sessions and validated against similar metrics extracted
from video data collected from the beach. The second method,
combining GPS and IMU data, was found to be more accurate in
determining the number of waves, their start times and durations.
This paper shows that it is feasible to use smartphones for the
quantification of performance metrics during surfing. In particular,
the waves ridden and their durations can be accurately determined
using the smartphone GPS and IMU.
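A minimal sketch of the first, GPS-only approach follows: speed is computed from successive fixes with the haversine distance and wave rides are segmented with velocity thresholds. The thresholds and minimum ride duration are assumed values, not the calibrated parameters of the study.

```python
# GPS-only wave-ride detection via velocity thresholds.
import math

V_START, V_END, MIN_RIDE_S = 2.5, 1.0, 3.0   # m/s, m/s, seconds (assumptions)

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def detect_rides(fixes):
    """fixes: list of (t_seconds, lat, lon). Returns (start, duration) pairs."""
    rides, start = [], None
    for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:]):
        v = haversine_m(la0, lo0, la1, lo1) / max(t1 - t0, 1e-6)
        if start is None and v >= V_START:
            start = t0                      # speed crossed the start threshold
        elif start is not None and v < V_END:
            if t1 - start >= MIN_RIDE_S:    # discard spurious short bursts
                rides.append((start, t1 - start))
            start = None
    return rides
```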
Abstract: To explore how the brain may recognise objects in its
general, accurate and energy-efficient manner, this paper proposes the
use of a neuromorphic hardware system formed from a Dynamic
Vision Sensor (DVS) silicon retina in concert with the SpiNNaker
real-time Spiking Neural Network (SNN) simulator. As a first step
in the exploration of this platform, a recognition system for dynamic
hand postures is developed, enabling the study of the methods used
in the visual pathways of the brain. Inspired by the behaviours of
the primary visual cortex, Convolutional Neural Networks (CNNs)
are modelled using both linear perceptrons and spiking Leaky
Integrate-and-Fire (LIF) neurons.
In this study’s largest configuration using these approaches, a
network of 74,210 neurons and 15,216,512 synapses is created and
operated in real-time using 290 SpiNNaker processor cores in parallel
and with 93.0% accuracy. A smaller network using only 1/10th of the
resources is also created, again operating in real-time, and it is able
to recognise the postures with an accuracy of around 86.4%, only
6.6% lower than the much larger system. The recognition rate of the
smaller network developed on this neuromorphic system is sufficient
for a successful hand posture recognition system, and demonstrates
a much improved cost to performance trade-off in its approach.
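For reference, the Leaky Integrate-and-Fire neuron the networks are built from can be sketched in a few lines of plain Python; the membrane parameters below are generic textbook values, not the study's SpiNNaker configuration.

```python
# Single Leaky Integrate-and-Fire (LIF) neuron with a constant input current.
DT = 1.0          # time step (ms)
TAU_M = 20.0      # membrane time constant (ms)
V_REST, V_RESET, V_THRESH = -65.0, -70.0, -50.0   # mV
R_M = 10.0        # membrane resistance (MOhm)

def lif_run(input_current, n_steps):
    """Simulate one LIF neuron driven by a constant current (nA)."""
    v, spikes = V_REST, []
    for step in range(n_steps):
        # leaky integration: tau_m * dv/dt = -(v - V_rest) + R_m * I
        v += DT * (-(v - V_REST) + R_M * input_current) / TAU_M
        if v >= V_THRESH:              # threshold crossing emits a spike
            spikes.append(step * DT)
            v = V_RESET                # membrane potential resets after a spike
    return spikes

print("spike times (ms):", lif_run(2.0, 100))
```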
Abstract: We present a gas-liquid microfluidic system as a
reactor to obtain magnetite nanoparticles with an excellent degree of
control regarding their crystalline phase, shape and size. Several
types of microflow approaches were selected to prevent nanomaterial
aggregation and to promote a homogeneous size distribution. The
selected reactor consists of a mixer stage aided by ultrasound waves
and a reaction stage using a N2-liquid segmented flow to prevent
magnetite oxidation to non-magnetic phases. A milli-fluidic reactor
was developed to increase the production rate; a magnetite
throughput close to 450 mg/h was obtained in a continuous fashion.
Abstract: This paper investigates simple implicit force control
algorithms realizable with industrial robots. Many approaches
already published are difficult to implement in commercial robot
controllers, because access to the robot joint torques is necessary
or the complete dynamic model of the manipulator is required. In
the past, we have already dealt with explicit force control of a position-
controlled robot. Well-known schemes of implicit force control are
stiffness control, damping control and impedance control. With such
algorithms, the contact force cannot be set directly; rather, it is
the result of the controller impedance, the environment impedance and
the commanded robot motion/position. The relationships of these
properties are worked out in this paper in detail for the chosen
implicit approaches. They have been adapted to be implementable
on a position controlled robot. The behaviors of stiffness control
and damping control are verified by practical experiments. For this
purpose, a suitable test bed was configured. Using the full mechanical
impedance within the controller structure is not practical when
the robot is in physical contact with the environment. This
fact is verified by simulation.
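For reference, the standard textbook impedance relation behind these schemes (not necessarily the exact control law of the paper) is

\[
F = M_d(\ddot{x}_d - \ddot{x}) + D_d(\dot{x}_d - \dot{x}) + K_d(x_d - x),
\]

where \(x_d\) and \(x\) are the desired and actual end-effector positions and \(M_d\), \(D_d\), \(K_d\) are the desired inertia, damping and stiffness. Stiffness control keeps only the \(K_d\) term and damping control only the \(D_d\) term, which is why the contact force is not commanded directly but emerges from the controller and environment impedances together with the commanded motion.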