Abstract: People are becoming increasingly mobile, both in terms of devices and associated applications, and the services these devices offer are growing broader and more complex. Even though today's handheld devices have considerable computing power, their contexts of use differ: connection availability, high latency of wireless networks, battery life, screen size, on-screen or hardware keyboards, and so on. Consequently, the development of mobile applications and their associated mobile Web services, if any, should follow a concise methodology so that they provide a high Quality of Service. The aim of this paper is to highlight and discuss the main issues to consider when developing mobile applications and mobile Web services, and then to propose a framework that leads developers through different steps and modules toward the development of efficient and secure mobile applications. First, the challenges in developing such applications are elicited and discussed in depth. Second, a development framework is presented, with modules addressing each of these challenges. Third, the paper presents an example of a mobile application, Eivom Cinema Guide, which benefits from following our development framework.
Abstract: In this paper, we propose a low-cost optimized solution for the movement of a three-arm manipulator using a Genetic Algorithm (GA) and the Analytic Hierarchy Process (AHP). A scheme is given for optimizing the movement of the robotic arm with the help of a Genetic Algorithm so that the minimum-energy-consumption criterion can be achieved. Unlike direct kinematics, inverse kinematics yields two solutions, out of which the best-fit solution is selected with the help of the Genetic Algorithm and kept in the search space for future use. Tasks such as inverse kinematics, fitness-value evaluation, and binary encoding are simulated and tested. Three factors, namely movement, friction and least settling time (or minimum vibration), are used for finding the fitness function and fitness values; further factors could also be considered.
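The GA selection step described above can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it assumes a hypothetical planar three-link arm (link lengths, target position, penalty weight and GA settings are all invented for the example), and its fitness combines the reach error with a joint-movement term as a stand-in for the energy criterion; the friction and settling-time factors are omitted.

```python
import math
import random

L = [1.0, 0.8, 0.5]      # hypothetical link lengths
TARGET = (1.2, 0.9)      # desired end-effector position
REST = [0.0, 0.0, 0.0]   # rest posture; movement is measured from here

def forward(thetas):
    """Forward kinematics of a planar three-link arm."""
    x = y = a = 0.0
    for l, t in zip(L, thetas):
        a += t
        x += l * math.cos(a)
        y += l * math.sin(a)
    return x, y

def fitness(thetas):
    """Lower is better: reach the target while moving the joints as
    little as possible (a stand-in for the energy criterion)."""
    x, y = forward(thetas)
    reach_err = math.hypot(x - TARGET[0], y - TARGET[1])
    movement = sum(abs(t - r) for t, r in zip(thetas, REST))
    return reach_err + 0.05 * movement

def evolve(pop_size=60, gens=200, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-math.pi, math.pi) for _ in range(3)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]       # elitist truncation
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, 3)         # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:            # Gaussian mutation
                i = rng.randrange(3)
                child[i] += rng.gauss(0.0, 0.1)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
```

The paper's binary encoding is replaced here by real-valued genes purely for brevity.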
Abstract: This paper presents the design and prototype
implementation of a new home automation system that uses WiFi
technology as the network infrastructure connecting its parts. The
proposed system consists of two main components. The first is the
server (web server), which is the system core that manages,
controls, and monitors users' homes; users and the system
administrator can manage and control the system locally (LAN) or
remotely (Internet). The second is the hardware interface module,
which provides the appropriate interface to the sensors and
actuators of the home automation system. Unlike most home
automation systems available on the market, the proposed system is
scalable: one server can manage many hardware interface modules as
long as they are within WiFi network coverage. The system supports
a wide range of home automation devices, such as power management
and security components. The proposed system is better than
commercially available home automation systems from the scalability
and flexibility points of view.
Abstract: The rapid urbanization of cities has a bane in the form
of road accidents, which cause extensive damage to life and limb. A
number of location-based factors are enablers of road accidents in
a city, and the speed of travel of vehicles is non-uniform among
locations within it. In this study, the perception of vehicle users
regarding the degree of variation in speed of travel at chosen
locations in the city is captured on a 10-point rating scale. The
average rating is used to cluster locations using fuzzy c-means
clustering and classify them as low, moderate and high
speed-of-travel locations. The high speed-of-travel locations can
be identified proactively to ensure that accidents do not occur due
to the speeding of vehicles at such locations. The advantage of
fuzzy c-means clustering is that a location may be part of more
than one cluster to a varying degree, which gives a better picture
of the location with respect to the characteristic (speed of
travel) being studied.
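The clustering step described above can be illustrated with a small sketch. This is not the study's implementation: the one-dimensional fuzzy c-means below is a textbook formulation with fuzzifier m = 2, and the average ratings are hypothetical values invented for the example.

```python
import random

def fuzzy_c_means(xs, c=3, m=2.0, iters=100, seed=0):
    """1-D fuzzy c-means. Returns (centers, memberships), where
    memberships[k][i] is the degree to which point i belongs to
    cluster k (degrees for one point sum to 1)."""
    rng = random.Random(seed)
    centers = rng.sample(xs, c)
    u = [[0.0] * len(xs) for _ in range(c)]
    for _ in range(iters):
        # Update memberships from the current centers.
        for i, x in enumerate(xs):
            d = [abs(x - ck) or 1e-9 for ck in centers]
            for k in range(c):
                u[k][i] = 1.0 / sum((d[k] / dj) ** (2 / (m - 1))
                                    for dj in d)
        # Update centers as membership-weighted means.
        for k in range(c):
            w = [u[k][i] ** m for i in range(len(xs))]
            centers[k] = sum(wi * x for wi, x in zip(w, xs)) / sum(w)
    return centers, u

# Hypothetical average speed-variation ratings (10-point scale),
# one per location.
ratings = [2.1, 2.4, 3.0, 5.2, 5.5, 6.0, 8.7, 9.1, 9.4]
centers, u = fuzzy_c_means(ratings)
```

The three converged centers can be read as the low, moderate and high speed-of-travel prototypes, and a location's membership row shows the degree to which it belongs to each class.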
Abstract: This study analyzed environmental health risks and
people's perceptions of risks related to waste management in poor
settlements of Abidjan, to develop integrated solutions for health and
well-being improvement. The trans-disciplinary approach used relied
on remote sensing, a geographic information system (GIS),
qualitative and quantitative methods such as interviews and a
household survey (n=1800). Mitigating strategies were then
developed using an integrated participatory stakeholder workshop.
Waste management deficiencies resulting in lack of drainage and
uncontrolled solid and liquid waste disposal in the poor settlements
led to severe environmental health risks. Health problems were
caused by direct handling of waste, as well as through broader
exposure of the population. People in poor settlements had little
awareness of health risks related to waste management in their
community and a general lack of knowledge pertaining to sanitation
systems. This unfortunate combination was the key determinant of
health risks and vulnerability. For example, an increased
prevalence of malaria (47.1%) and diarrhoea (19.2%) was observed
in the rainy season when compared to the dry season (32.3% and
14.3%). Concerted and adapted solutions that suited all the
stakeholders concerned were developed in a participatory workshop
to allow for improvement of health and well-being.
Abstract: In this paper, a multivariable predictive PID controller
is implemented on a multi-input multi-output control problem, the
quadruple tank system, and compared with a simple multiloop PI
controller. One of the salient features of this system is an
adjustable transmission zero, which can be tuned to operate in both
minimum and non-minimum phase configurations through the flow
distribution to the upper and lower tanks. Stability and
performance analysis has also been carried out for this highly
interactive two-input two-output system, in both minimum and
non-minimum phase. Simulations of the control system revealed that
better performance is obtained with the predictive PID design.
Abstract: Complexity, as a theoretical background has made it
easier to understand and explain the features and dynamic behavior
of various complex systems. As the common theoretical background
has confirmed, borrowing the terminology for design from the
natural sciences has helped to control and understand urban
complexity. Phenomena like self-organization, evolution and
adaptation are appropriate to describe the formerly inaccessible
characteristics of the complex environment in unpredictable
bottom-up systems. Increased computing capacity has been a key
element in
capturing the chaotic nature of these systems.
A paradigm shift in urban planning and architectural design has
forced us to give up the illusion of total control in urban
environment, and consequently to seek for novel methods for
steering the development. New methods using dynamic modeling
have offered a real option for more thorough understanding of
complexity and urban processes. At best new approaches may renew
the design processes so that we get a better grip on the complex
world via more flexible processes, support urban environmental
diversity and respond to our needs beyond basic welfare by liberating
ourselves from the standardized minimalism.
A complex system and its features are as such beyond human
ethics. Self-organization or evolution is neither good nor bad. Their
mechanisms are by nature devoid of reason. They are common in
urban dynamics, in both natural processes and human activity. They are features
of a complex system, and they cannot be prevented. Yet their
dynamics can be studied and supported.
The paradigm of complexity and new design approaches have been
criticized for a lack of humanity and morality, but the ethical
implications of scientific or computational design processes have not
been much discussed. It is important to distinguish the (unexciting)
ethics of the theory and tools from the ethics of computer-aided
processes based on ethical decisions. Urban planning and architecture
cannot be based on the survival of the fittest; however, the natural
dynamics of the system cannot be impeded on the grounds of being
“non-human”.
In this paper the ethical challenges of using the dynamic models
are contemplated in light of a few examples of new architecture and
dynamic urban models and literature. It is suggested that ethical
challenges in computational design processes could be reframed
under the concepts of responsibility and transparency.
Abstract: Nowadays, the challenge in hydraulic turbine design is
the multi-objective design of the turbine runner to reach higher
efficiency. The hydraulic performance of a turbine depends strongly
on the shape of the runner blades. The present paper focuses on the
application of a multi-objective optimization algorithm to the
design of a small Francis turbine runner. The optimization exercise
focuses on efficiency improvement at the best efficiency operating
point (BEP) of the GAMM Francis turbine. A global optimization
method based on artificial neural networks (ANN) and genetic
algorithms (GA), coupled with a 3D Navier-Stokes flow solver, has
been used to improve the performance of an initial Francis runner
geometry. The results show the effectiveness of the optimization
algorithm: the final geometry has better efficiency than the
initial geometry. The goal was to optimize the geometry of the
blades of the GAMM turbine runner to maximize total efficiency by
changing the design parameters of the camber line in at least five
sections of a blade. The efficiency of the optimized geometry is
improved from 90.7% to 92.5%. Finally, the design parameters and
the way they were selected are considered and discussed.
Abstract: Laboratory activities have produced benefits in student
learning. With the current drive of new technology resources and an
evolving era of education methods, the renewal of learning and
teaching in laboratory methods is in progress, for both learners
and educators. To enhance learning outcomes in laboratory work,
particularly in engineering practice and testing, hands-on learning
by instruction alone may not be sufficient. This paper describes
and compares the techniques and implementation of traditional
(expository) and open-ended (problem-based) laboratories for two
consecutive cohorts studying an environmental laboratory course in
a civil engineering program. The findings and effects of the
transition from traditional to problem-based laboratories were
investigated in terms of course assessment, student feedback
surveys, course outcome learning measurement and student
performance grades. The results showed that students demonstrated
better performance in their grades and a 12% increase in the course
outcome (CO) in the problem-based open-ended laboratory style
compared with the traditional method, although students responded
less favorably in their feedback.
Abstract: Due to the increasing and varying risks that economic units face, derivative instruments have gained substantial importance, and trading volumes of derivatives have reached very significant levels. In parallel with these high trading volumes, researchers have developed many different models, some parametric, some nonparametric. In this study, the aim is to analyse the success of artificial neural networks in pricing options using S&P 100 index options data. Previous studies generally cover data on European-type call options; this study includes not only European call options but also American call and put options and European put options. Three data sets are used to train three different ANN models. One includes only data directly observable from the economic environment, i.e. strike price, spot price, interest rate, maturity, and type of contract. The others include an extra input that is not observable data but a parameter, namely volatility. With these detailed data, the performance of ANN along the put/call, American/European, and moneyness dimensions is analyzed, and whether including volatility as an input to the neural network improves prediction performance is examined. The most striking results revealed by the study are that ANN performs better when pricing call options than put options, and that the use of the volatility parameter as an input does not improve the performance.
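The kind of network the study trains can be illustrated with a minimal sketch. This is not the study's model: it is a one-hidden-layer network trained by stochastic gradient descent on synthetic data, with a toy discounted-intrinsic-value target standing in for observed option prices; the inputs, layer size and learning rate are invented for the example.

```python
import math
import random

rng = random.Random(0)

# Hypothetical training set: inputs are (moneyness, maturity, rate),
# the target is a toy price (discounted intrinsic value), not market data.
def sample():
    s_k = rng.uniform(0.8, 1.2)   # spot / strike
    t = rng.uniform(0.1, 1.0)     # maturity in years
    r = rng.uniform(0.01, 0.05)   # risk-free rate
    price = max(s_k - 1.0, 0.0) * math.exp(-r * t)
    return [s_k, t, r], price

data = [sample() for _ in range(200)]

H = 8                              # hidden units
W1 = [[rng.gauss(0, 0.5) for _ in range(3)] for _ in range(H)]
b1 = [0.0] * H
W2 = [rng.gauss(0, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    """tanh hidden layer, linear output."""
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    return sum(w * hi for w, hi in zip(W2, h)) + b2, h

def epoch(lr=0.05):
    """One pass of stochastic gradient descent; returns mean squared error."""
    global b2
    total = 0.0
    for x, y in data:
        yhat, h = forward(x)
        err = yhat - y
        total += err * err
        for j in range(H):
            grad_h = err * W2[j] * (1 - h[j] ** 2)  # backprop through tanh
            W2[j] -= lr * err * h[j]
            for i in range(3):
                W1[j][i] -= lr * grad_h * x[i]
            b1[j] -= lr * grad_h
        b2 -= lr * err
    return total / len(data)

first = epoch()
for _ in range(50):
    last = epoch()
```

Adding the volatility input examined in the study would simply mean a fourth component in each input vector and weight row.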
Abstract: The paper reviews the relationship between spatial
and transportation planning in the Southern African Development
Community (SADC) region of Sub-Saharan Africa. It argues that
most urbanisation in the region has occurred subsequent to
the 1950s and, accordingly, urban development has been
profoundly and negatively affected by the (misguided) spatial and
institutional tenets of modernism. It demonstrates how a
considerable amount of the poor performance of these settlements
can be directly attributed to this. Two factors in particular about the
planning systems are emphasized: the way in which programmatic
land-use planning lies at the heart of both spatial and transportation
planning; and the way in which transportation and spatial planning
have been separated into independent processes. In the final
section, the paper identifies ways of improving the planning
system. Firstly, it identifies the performance qualities which
Southern African settlements should be seeking to achieve.
Secondly, it focuses on two necessary arenas of change: the need to
replace programmatic land-use planning practices with
structural-spatial approaches; and it makes a case for making urban corridors
a spatial focus of integrated planning, as a way of beginning the
restructuring and intensification of settlements which are currently
characterised by sprawl, fragmentation and separation.
Abstract: One of the criteria in production scheduling is the
makespan; minimizing this criterion leads to more efficient use of
resources, especially machinery and manpower. By assigning a budget
to some of the operations, the operation time of these activities
is reduced, which affects the total completion time of all the
operations (the makespan). In this paper this issue is studied for
parallel flow shops. First, we convert the parallel flow shop to a
network model; then, using a linear programming approach, we
identify which activities (operations) should absorb the
predetermined and limited budget in order to minimize the makespan
(the completion time of the network). Minimizing the total
completion time of all the activities in the network is equivalent
to minimizing the makespan in production scheduling.
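The budget-absorption idea can be illustrated with a small sketch. The paper formulates the problem as a linear program; as a simplified stand-in, the following greedy routine repeatedly spends budget on the cheapest still-compressible operation of the flow line that currently determines the makespan. All operation times, compression costs and the budget are hypothetical.

```python
# Hypothetical operations on two parallel flow lines; each tuple is
# (normal_time, minimum_time, cost_to_shorten_by_one_time_unit).
lines = [
    [(6, 3, 2.0), (4, 2, 1.0)],   # flow line A
    [(5, 4, 1.5), (7, 3, 3.0)],   # flow line B
]
budget = 6.0

durations = [[op[0] for op in line] for line in lines]

def makespan():
    """Completion time of the slowest flow line."""
    return max(sum(line) for line in durations)

while budget > 0:
    # Find the line that currently determines the makespan ...
    crit = max(range(len(lines)), key=lambda i: sum(durations[i]))
    # ... and its cheapest operation that can still be compressed
    # within the remaining budget.
    options = [(lines[crit][j][2], j) for j in range(len(lines[crit]))
               if durations[crit][j] > lines[crit][j][1]
               and lines[crit][j][2] <= budget]
    if not options:
        break
    cost, j = min(options)
    durations[crit][j] -= 1       # shorten by one time unit
    budget -= cost
```

On these numbers the makespan drops from 12 to 10 before the budget runs out; the LP in the paper would decide all compressions jointly rather than one unit at a time.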
Abstract: Information is increasing in volume; companies are so overloaded with information that they may lose track of how to obtain the information they need, and it is time consuming to scan through each lengthy document. A shorter version of the document containing only the gist is more favourable for most information seekers. Therefore, in this paper, we implement a text summarization system to produce summaries that contain the gist of oil and gas news articles. The summarization is intended to provide important information that oil and gas companies can use to monitor their competitors' behaviour and to support them in formulating business strategies. The system integrates a statistical approach with three underlying concepts: keyword occurrences, the title of the news article and the location of the sentence. The generated summaries were compared with human-generated summaries from an oil and gas company, and precision and recall ratios are used to evaluate their accuracy. Based on the experimental results, the system is able to produce an effective summary, with an average recall of 83% at a compression rate of 25%.
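The three scoring concepts can be sketched as follows. This is a minimal illustration, not the paper's system: the stop-word list, score weights and sentence-splitting rule are invented for the example.

```python
import re
from collections import Counter

STOP = {"the", "a", "an", "of", "in", "to", "and", "is", "for", "on",
        "with"}

def summarize(title, text, ratio=0.25):
    """Score sentences by keyword frequency, overlap with the title,
    and position, then keep the top `ratio` of them in original order."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text)
                 if s.strip()]
    words = [w for w in re.findall(r"[a-z]+", text.lower())
             if w not in STOP]
    freq = Counter(words)
    title_words = set(re.findall(r"[a-z]+", title.lower())) - STOP

    def score(i, s):
        toks = [w for w in re.findall(r"[a-z]+", s.lower())
                if w not in STOP]
        if not toks:
            return 0.0
        kw = sum(freq[w] for w in toks) / len(toks)       # keyword occurrences
        ti = len(title_words & set(toks)) / (len(title_words) or 1)
        loc = 1.0 / (i + 1)                               # earlier is better
        return kw + 2.0 * ti + loc                        # illustrative weights

    k = max(1, round(len(sentences) * ratio))
    top = sorted(range(len(sentences)),
                 key=lambda i: score(i, sentences[i]), reverse=True)[:k]
    return " ".join(sentences[i] for i in sorted(top))
```

The default `ratio=0.25` mirrors the 25% compression rate reported in the abstract.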
Abstract: Wireless sensor networks (WSN) are currently receiving
significant attention due to their unlimited potential. These
networks are used for various applications, such as habitat
monitoring, automation, agriculture, and security. Efficient
node-energy utilization is one of the important performance factors
in wireless sensor networks because sensor nodes operate with
limited battery power. In this paper, we propose the MiSense
hierarchical cluster-based routing algorithm (MiCRA) to extend the
lifetime of sensor networks and to maintain balanced energy
consumption across nodes. MiCRA is an extension of the HEED
algorithm with two levels of cluster heads. The performance of the
proposed protocol has been examined and evaluated through a
simulation study. The simulation results clearly show that MiCRA
has better performance in terms of lifetime than HEED. Indeed, our
proposed MiCRA protocol can effectively extend the network lifetime
without other critical overheads or performance degradation. It has
been noted that there is about 35% energy saving for MiCRA during
the clustering process and 65% energy saving during the routing
process compared to the HEED algorithm.
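The energy-proportional cluster-head election that HEED-style protocols (and, by extension, a two-level scheme like MiCRA) build on can be sketched as follows. This is a minimal illustration, not the MiCRA implementation: the node energies, the head-probability constant and the second-level election probability are hypothetical, and inter-cluster communication is omitted entirely.

```python
import random

rng = random.Random(42)

# Hypothetical nodes: (id, residual_energy).
E_MAX = 1.0
nodes = [(i, rng.uniform(0.2, E_MAX)) for i in range(100)]
C_PROB = 0.1   # target fraction of first-level cluster heads

def elect(candidates, c_prob):
    """HEED-style probabilistic election: a node volunteers as cluster
    head with probability c_prob * E_residual / E_max, so nodes with
    more remaining energy are more likely to become heads."""
    return [(nid, e) for nid, e in candidates
            if rng.random() < c_prob * e / E_MAX]

# Two-level hierarchy: elect first-level heads among all nodes, then
# second-level heads among the first-level ones (with fallbacks so
# the sketch never ends up head-less).
level1 = elect(nodes, C_PROB) or [max(nodes, key=lambda n: n[1])]
level2 = elect(level1, 0.5) or level1[:1]
```

Repeating the election each round spreads the head role (and its energy cost) across the network, which is the mechanism behind the balanced consumption claimed above.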
Abstract: Developing countries are facing a problem of slums, and there appears to be no foolproof solution to eradicate them. Of the three approaches to slum development for improving quality of life, the in-situ upgradation approach is found to be the best, while the relocation approach has proved to be a failure. The factors responsible for the failure of relocation projects need to be assessed, which is the basic aim of this paper. These factors are loss of livelihood, lack of security of tenure and inefficiency of the government; they are traced and mapped from examples of Western and Indian cities. The National Habitat and Resettlement Policy emphasized the relationship between shelter and workplace. The SRA has identified 55 slums for relocation due to reservation of land uses, security of tenure and the non-notified status of slums. Policy guidelines are suggested for successful relocation projects. Keywords: Livelihood, Relocation, Slums, Urban poor.
Abstract: High speed networks provide realtime variable bit rate
service with diversified traffic flow characteristics and quality
requirements. The variable bit rate traffic has stringent delay and
packet loss requirements. The burstiness of the correlated traffic
makes dynamic buffer management highly desirable to satisfy the
Quality of Service (QoS) requirements. This paper presents an
algorithm for optimizing an adaptive buffer allocation scheme based
on the loss of consecutive packets in the data stream and the
buffer occupancy level. The buffer is designed to allow the input
traffic to be partitioned into different priority classes, and it
controls the threshold dynamically based on the input traffic
behavior. The algorithm allows an input packet to enter the buffer
only if the occupancy level is less than the threshold value for
that packet's priority. The threshold is
dynamically varied at runtime based on packet loss behavior. The
simulation is run for two priority classes of the input traffic –
realtime and non-realtime classes. The simulation results show that
Adaptive Partial Buffer Sharing (ADPBS) has better performance
than Static Partial Buffer Sharing (SPBS) and First In First Out
(FIFO) queue under the same traffic conditions.
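The threshold-based admission rule can be sketched as follows. This is a simplified illustration, not the paper's ADPBS algorithm: the capacity, the initial threshold and the consecutive-drop rule for adapting the threshold are invented for the example.

```python
from collections import deque

class AdaptiveBuffer:
    """Partial buffer sharing with a per-priority admission threshold
    that adapts to consecutive-loss behaviour (a simplified sketch)."""

    def __init__(self, capacity=10, threshold=6):
        self.capacity = capacity
        self.threshold = threshold   # occupancy limit for low priority
        self.queue = deque()
        self.consecutive_drops = 0

    def offer(self, packet, high_priority):
        """Admit the packet if occupancy is below its class limit."""
        limit = self.capacity if high_priority else self.threshold
        if len(self.queue) < limit:
            self.queue.append(packet)
            self.consecutive_drops = 0
            return True
        self.consecutive_drops += 1
        # Adapt: after a burst of drops, give low-priority traffic
        # more room (illustrative rule, not the paper's).
        if self.consecutive_drops >= 3 and self.threshold < self.capacity - 1:
            self.threshold += 1
            self.consecutive_drops = 0
        return False

    def serve(self):
        """FIFO departure."""
        return self.queue.popleft() if self.queue else None
```

High-priority packets may use the full capacity, while low-priority packets are admitted only up to the moving threshold; this is the partial-sharing behaviour being compared with SPBS and FIFO above.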
Abstract: Due to the ever-growing number of publications about
protein-protein interactions, information extraction from text is
increasingly recognized as one of the crucial technologies in
bioinformatics. This paper presents a protein interaction
extraction system for biomedical abstracts using the Link Grammar
Parser (PIELG). PIELG uses the linkage given by the Link Grammar
Parser to start a case-based analysis of the contents of various
syntactic roles, as well as their linguistically significant and
meaningful combinations. The system uses phrasal-prepositional verb
patterns to overcome problems with preposition combinations. The
recall and precision are 74.4% and 62.65%, respectively.
Experimental comparisons with two other state-of-the-art extraction
systems indicate that the PIELG system achieves better performance.
For further evaluation, the system is augmented with a graphical
package (Cytoscape) for extracting protein interaction information
from sequence databases. The results show that the performance is
remarkably promising.
Abstract: In this paper a new approach to face recognition is
presented that achieves double dimension reduction, making the
system computationally efficient, with better recognition results,
and outperforming the common DCT technique of face recognition. In
pattern recognition techniques, the discriminative information of
an image increases with resolution up to a certain extent;
consequently, face recognition results change with face image
resolution and are optimal at a certain resolution level. In the
proposed model of face recognition, an image decimation algorithm
is first applied to the face image for dimension reduction to the
resolution level that provides the best recognition results. Due to
its computational speed and feature extraction potential, the
Discrete Cosine Transform (DCT) is then applied to the decimated
face image. A subset of DCT coefficients from low to mid
frequencies that represents the face adequately and provides the
best recognition results is retained. A tradeoff between the
decimation factor, the number of DCT coefficients retained and the
recognition rate with minimum computation is obtained.
Preprocessing of the image is carried out to increase its
robustness against variations in pose and illumination level. This
new model has been tested on different databases, including the
ORL, Yale and EME color databases.
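The decimation-plus-DCT feature extraction can be sketched as follows. This is a minimal illustration, not the paper's system: the block-averaging decimation, the naive DCT-II and the low-to-mid frequency selection rule (u + v <= 2) are simplifications chosen for the example, and the 8x8 gradient "image" is hypothetical.

```python
import math

def decimate(img, factor):
    """Reduce resolution by averaging factor x factor pixel blocks."""
    h, w = len(img), len(img[0])
    return [[sum(img[y + dy][x + dx] for dy in range(factor)
                 for dx in range(factor)) / factor ** 2
             for x in range(0, w - factor + 1, factor)]
            for y in range(0, h - factor + 1, factor)]

def dct2(block):
    """Naive orthonormal 2-D DCT-II of a square block."""
    n = len(block)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = sum(block[y][x]
                    * math.cos((2 * y + 1) * u * math.pi / (2 * n))
                    * math.cos((2 * x + 1) * v * math.pi / (2 * n))
                    for y in range(n) for x in range(n))
            cu = math.sqrt(1 / n) if u == 0 else math.sqrt(2 / n)
            cv = math.sqrt(1 / n) if v == 0 else math.sqrt(2 / n)
            out[u][v] = cu * cv * s
    return out

def feature_vector(img, factor=2, keep=6):
    """Decimate, transform, and keep low-to-mid frequency coefficients."""
    coeffs = dct2(decimate(img, factor))
    flat = [coeffs[u][v] for u in range(len(coeffs))
            for v in range(len(coeffs)) if u + v <= 2]
    return flat[:keep]

# Hypothetical 8x8 "face" image with a simple gradient.
img = [[(x + y) % 16 for x in range(8)] for y in range(8)]
features = feature_vector(img)
```

In a full system these feature vectors would feed a distance-based classifier, and the decimation factor and number of retained coefficients would be tuned exactly as the tradeoff above describes.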
Abstract: If price and quantity are the fundamental building
blocks of any theory of market interactions, the importance of trading
volume in understanding the behavior of financial markets is clear.
However, while many economic models of financial markets have
been developed to explain the behavior of prices (predictability,
variability, and information content), far less attention has been
devoted to explaining the behavior of trading volume. In this article,
we hope to expand our understanding of trading volume by
developing a new measure of herding behavior based on the
cross-sectional dispersion of volume betas. We apply our measure to
the Toronto Stock Exchange using monthly data from January 2000 to
December 2002. Our findings show that the herd phenomenon consists
of three essential components: stationary herding, intentional
herding and feedback herding.
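The measure described above can be sketched as follows. This is a minimal illustration, not the article's estimator: the volume betas are computed by simple OLS against a market volume series, and the herding measure is their cross-sectional standard deviation; all volume series are synthetic.

```python
import math
import random

rng = random.Random(7)

def beta(x, y):
    """OLS slope of y on x (the volume beta of one stock against the
    market volume series)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

def herding_measure(market, stocks):
    """Cross-sectional dispersion (standard deviation) of the stocks'
    volume betas; unusually low dispersion would suggest volumes
    moving together, i.e. herding."""
    betas = [beta(market, s) for s in stocks]
    mb = sum(betas) / len(betas)
    return math.sqrt(sum((b - mb) ** 2 for b in betas) / len(betas))

# Synthetic monthly trading volumes: 36 months, 20 stocks whose
# volumes load on the market series with different true betas.
market = [10 + 0.5 * math.sin(t / 3) + rng.gauss(0, 0.1)
          for t in range(36)]
stocks = [[b * m + rng.gauss(0, 0.2) for m in market]
          for b in (rng.uniform(0.5, 1.5) for _ in range(20))]

dispersion = herding_measure(market, stocks)
```

On real data the measure would be tracked over rolling windows so that the stationary, intentional and feedback components could be separated, as the article does.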
Abstract: In this paper, we propose an algorithm to compute initial
cluster centers for K-means clustering. The data in a cell are
partitioned using a cutting plane that divides the cell into two
smaller cells. The plane is perpendicular to the data axis with the
highest variance and is positioned to reduce the sum of squared
errors of the two cells as much as possible, while at the same time
keeping the two cells as far apart as possible. Cells are
partitioned one at a time until the number of cells equals the
predefined number of clusters, K. The centers of the K cells become
the initial cluster centers for K-means. The experimental results
suggest that the proposed algorithm is effective, converging to
better clustering results than those of the random initialization
method. The research also indicates that the proposed algorithm
greatly improves the likelihood of every cluster containing some
data.
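The cell-splitting procedure can be sketched as follows. This is a simplified illustration of the idea, not the paper's algorithm: the cut is placed at the cell mean along the highest-variance axis rather than at the SSE-optimal position, and the cell with the largest SSE is split first.

```python
def mean(cell):
    """Centroid of a list of equal-length points."""
    d = len(cell[0])
    return [sum(p[i] for p in cell) / len(cell) for i in range(d)]

def sse(cell):
    """Sum of squared distances to the cell centroid."""
    c = mean(cell)
    return sum(sum((p[i] - c[i]) ** 2 for i in range(len(c)))
               for p in cell)

def split(cell):
    """Cut the cell with a plane perpendicular to its highest-variance
    axis, placed at the mean along that axis (a simplification of the
    paper's SSE-optimal cut position)."""
    c = mean(cell)
    variances = [sum((p[i] - c[i]) ** 2 for p in cell)
                 for i in range(len(c))]
    axis = max(range(len(c)), key=lambda i: variances[i])
    left = [p for p in cell if p[axis] <= c[axis]]
    right = [p for p in cell if p[axis] > c[axis]]
    return left, right

def initial_centers(points, k):
    """Split cells one at a time until there are k; return centroids."""
    cells = [list(points)]
    while len(cells) < k:
        worst = max(cells, key=sse)       # split the most spread-out cell
        cells.remove(worst)
        left, right = split(worst)
        if not left or not right:         # degenerate cut: stop splitting
            cells.append(worst)
            break
        cells.extend([left, right])
    return [mean(c) for c in cells]

# Two well-separated 2-D blobs (hypothetical data).
pts = [(0, 0), (0, 1), (1, 0), (1, 1),
       (10, 10), (10, 11), (11, 10), (11, 11)]
centers = initial_centers(pts, 2)
```

The returned centroids would then seed a standard K-means run in place of random initialization.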