Abstract: Mobile cloud computing (MCC) integrated with wireless sensor networks (WSNs) has attracted increasing attention from researchers because it combines the data-gathering ability of sensors with the data-processing capacity of the cloud. This approach overcomes the limited storage capacity and computational ability of sensor nodes, and the stored data are delivered to mobile users on request. Most integrated sensor-cloud schemes, however, overlook two observations: 1) mobile users request specific data from the cloud based on their present location; and 2) power consumption is critical, since most sensors are equipped with non-rechargeable batteries and are typically deployed in hazardous and remote areas. This paper addresses these observations and introduces a collaborative location-based sleep scheduling (CLSS) scheme. The awake or asleep status of each sensor node is devised dynamically by schedulers, purely on the basis of the mobile users' current locations; in this manner, a large amount of energy consumption at the WSN is avoided. CLSS comprises two methods: CLSS1 provides lower energy consumption, while CLSS2 provides scalability and robustness of the integrated WSN.
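As a rough illustration of the location-based scheduling idea, the sketch below wakes only the sensor nodes within a fixed radius of a mobile user's current position and puts the rest to sleep. The Node class, the schedule function and the AWAKE_RADIUS value are hypothetical placeholders, not details taken from the CLSS paper.

# Minimal sketch of location-based sleep scheduling in the spirit of CLSS.
# All names (Node, schedule, AWAKE_RADIUS) are illustrative assumptions.
import math

AWAKE_RADIUS = 50.0  # metres; nodes near a requesting user stay awake (assumed value)

class Node:
    def __init__(self, node_id, x, y):
        self.node_id = node_id
        self.x, self.y = x, y
        self.awake = False

def schedule(nodes, user_locations):
    """Wake only the nodes within AWAKE_RADIUS of any mobile user's
    current location; put every other node to sleep to save energy."""
    for n in nodes:
        n.awake = any(math.hypot(n.x - ux, n.y - uy) <= AWAKE_RADIUS
                      for ux, uy in user_locations)
    return nodes

nodes = [Node(i, x, y) for i, (x, y) in enumerate([(10, 10), (200, 200), (40, 30)])]
for n in schedule(nodes, [(20, 20)]):
    print(n.node_id, "awake" if n.awake else "asleep")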
Abstract: Digital technologies offer possibilities to effectively convert historical archives into instruments of knowledge that can guide the interpretation of historical phenomena. Digital conversion and management of those documents make it possible to merge other sources into a unique and coherent model, permitting the intersection of different data and opening new interpretations and understandings. Urban history uses, among other sources, inventories that register human activities in a specific space (e.g. cadastres, censuses, etc.). The geographic localisation of that information on cartographic supports allows the comprehension and visualisation of specific relationships between different historical realities, registering both the urban space and the people living there. These links, which merge data and documentation of different natures through a new organisation of the information, can suggest new interpretations of other related events. For all these kinds of analysis, GIS platforms today represent the most appropriate answer. The design of the related databases is the key to realising an ad-hoc instrument that facilitates the analysis and intersection of data of different origins. Moreover, GIS has become the digital platform where other kinds of data visualisation can be added. This research deals with the industrial development of Turin at the beginning of the 20th century. A census of factories carried out just prior to WWI provides the opportunity to test the potential of GIS platforms for analysing urban landscape modifications during the first industrial development of the town. The inventory includes data about location, activities, and people. GIS is shaped in a creative way, linking different sources and digital systems with the aim of creating a new type of platform conceived as an interface integrating different kinds of data visualisation. The data processing links this information to the urban space and also visualises the growth of the city at that time. The sources related to the urban landscape development of that period are of different natures. The need to build, enlarge, modify and join buildings to support fast-developing industrial activities is recorded in official permissions delivered by the municipality and now stored in the Historical Archive of the Municipality of Turin. Those documents, comprising reports and drawings, contain numerous data on the buildings themselves, including the block where the plot is located, the district, and the people involved, such as the owner, the investor, and the engineer or architect who designed the industrial building. All these collected data make it possible, first, to rebuild the process of change of the urban landscape using GIS and 3D modelling technologies, thanks to the drawings (2D plans, sections and elevations) that show the previous and the planned situation. Furthermore, they give access to information for different queries of the linked dataset that can be useful for different kinds of research, such as economic, biographical, architectural, or demographic studies. By superimposing a layer of the present city, the past meets the present-day industrial heritage, and people meet urban history.
Abstract: Stiffness sensing is an important issue in medical diagnostics, robotic surgery, and the safe handling and grasping of objects in production lines. Surgery requires detecting and characterising lumps embedded in soft tissue, and safely removing and handling the detected lumps. Likewise, in industry, grasping and handling an object without damage, in places inaccessible to a human operator, is very important. In this paper, a method for object handling is presented. It is based on an intelligent gripper that detects the object's stiffness and then sets a programmable grasping force to move the object. The main components of the system include sensors (for measuring force and displacement), electrical parts (electrical and electronic circuits, tactile data processing and the force control system), mechanical parts (the gripper mechanism and its driving system) and the display unit. The system uses a rotary potentiometer for measuring gripper displacement. A microcontroller, using feedback from the load cell mounted on the finger of the gripper, calculates the stiffness and then commands the gripper motor to apply a certain force on the object. Experimental results on samples of different stiffness show that the gripper works successfully. The gripper can be used in haptic interfaces or robotic systems for object handling.
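A minimal sketch of the stiffness-sensing step follows: stiffness is estimated as the slope of force versus displacement, and a grasp force is then derived from it. The least-squares fit, the safety factor and the sample values are illustrative assumptions, not the paper's actual firmware or calibration.

# Illustrative sketch: estimate stiffness from load-cell force and
# potentiometer displacement, then pick a grasp force from it.
# Thresholds and the grasp-force rule are assumed, not the paper's values.
def estimate_stiffness(force_samples, displacement_samples):
    """Least-squares slope of force vs. displacement (N/mm)."""
    n = len(force_samples)
    mean_f = sum(force_samples) / n
    mean_x = sum(displacement_samples) / n
    num = sum((x - mean_x) * (f - mean_f)
              for x, f in zip(displacement_samples, force_samples))
    den = sum((x - mean_x) ** 2 for x in displacement_samples)
    return num / den

def grasp_force(stiffness, safety_factor=0.5, max_force=10.0):
    """Softer objects get a smaller commanded force (hypothetical rule)."""
    return min(max_force, safety_factor * stiffness)

k = estimate_stiffness([0.2, 0.5, 0.9, 1.3], [0.5, 1.0, 1.5, 2.0])
print(f"stiffness ~ {k:.2f} N/mm, commanded force {grasp_force(k):.2f} N")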
Abstract: Cloud computing is the outcome of the rapid growth of the Internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is a major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of resource utilization and customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation and task migration. Parameters used to analyze the performance of a load balancing approach are response time, cost, data processing time and throughput. This paper demonstrates a two-level load balancer that combines the join-idle-queue and join-shortest-queue approaches. The authors used the CloudAnalyst simulator to test the proposed two-level load balancer. The results are analyzed and compared with existing algorithms, and the proposed approach is observed to be one step ahead of existing techniques.
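The two-level idea can be sketched as follows: the dispatcher first consults a list of idle servers (join-idle-queue) and falls back to the least-loaded server (join-shortest-queue) when none is idle. The class structure is a simplified stand-in for the paper's design; the CloudAnalyst simulation environment is not reproduced.

# Minimal sketch of a two-level dispatcher: prefer an idle server (JIQ);
# if none is idle, fall back to the shortest queue (JSQ).
from collections import deque

class Dispatcher:
    def __init__(self, n_servers):
        self.queues = [deque() for _ in range(n_servers)]
        self.idle = deque(range(n_servers))  # level 1: I-queue of idle server ids

    def assign(self, task):
        if self.idle:                        # JIQ: an idle server exists
            sid = self.idle.popleft()
        else:                                # JSQ fallback: shortest queue
            sid = min(range(len(self.queues)), key=lambda i: len(self.queues[i]))
        self.queues[sid].append(task)
        return sid

    def complete(self, sid):
        self.queues[sid].popleft()
        if not self.queues[sid]:
            self.idle.append(sid)            # server becomes idle again

d = Dispatcher(3)
print([d.assign(t) for t in ["t1", "t2", "t3", "t4"]])  # fourth task uses JSQ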
Abstract: This paper considers the diagnosis of people's driving skills under real driving conditions. In that sense, this research presents an approach that uses GPS signals, which have a direct correlation with driving maneuvers. In addition, a novel expert-driving-criteria approximation using fuzzy logic is presented, which seeks to analyze GPS signals in order to issue an intelligent driving diagnosis.
On this basis, the first section of this work presents the intelligent driving diagnosis system approach in terms of its characteristic properties, explaining in detail significant considerations about how an expert-driving-criteria approximation must be developed. The next section explains the implementation of the developed system based on the proposed fuzzy logic approach. Here, a proposed set of rules is presented, corresponding to a quantitative abstraction of some traffic laws and safe-driving techniques, seeking to approximate expert driving criteria.
Experimental testing has been performed under real driving conditions. The results show that the intelligent driving diagnosis system qualifies a driver's performance quantitatively with a high degree of reliability.
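A toy version of the fuzzy-rule idea is sketched below: a GPS-derived speed sample is graded against hypothetical "safe" and "aggressive" fuzzy sets and defuzzified into a driving score. The membership shapes, speed limits and scoring rule are invented for illustration and are not the paper's rule base.

# Toy sketch of grading a GPS-derived speed with fuzzy sets.
def tri(x, a, b, c):
    """Triangular membership function."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def diagnose(speed_kmh, limit_kmh=60.0):
    safe = tri(speed_kmh, 0, limit_kmh * 0.8, limit_kmh)
    aggressive = tri(speed_kmh, limit_kmh * 0.9, limit_kmh * 1.3, limit_kmh * 2)
    # Defuzzify to a 0-100 driving score (weighted average of rule outputs).
    total = safe + aggressive
    return 100.0 * safe / total if total else 0.0

for v in (40, 58, 75):
    print(v, "km/h -> score", round(diagnose(v), 1))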
Abstract: The Metropolitan Region of Sao Paulo (MRSP) has suffered from serious water scarcity. Consequently, the most convenient solution has been building wells to extract groundwater from local aquifers. However, this requires constant vigilance to prevent over-extraction and future events that can pose a serious threat to the population, such as subsidence. Radar imaging techniques (InSAR) allow continuous investigation of such phenomena. The analysis in the present study covers 23 SAR images dated from October 2007 to March 2011, obtained by the ALOS-1 spacecraft. Data processing was carried out with the GMTSAR software, using the InSAR technique to create pairs of interferograms of ground displacement over different time spans. First results show a correlation between the location of 102 wells registered in 2009 and signals of ground displacement equal to or lower than -90 millimeters (mm) in the region. The longest time-span interferogram obtained covers October 2007 to March 2010. From that interferogram, it was possible to detect the average velocity of displacement in millimeters per year (mm/y) and the areas in which strong signals have persisted in the MRSP. Four specific areas with subsidence signals of 28 mm/y to 40 mm/y were chosen to investigate the phenomenon: Guarulhos (Sao Paulo International Airport), the Greater Sao Paulo, Itaquera and Sao Caetano do Sul. The coverage area of the signals was between 0.6 km and 1.65 km in length. All areas are located above a sedimentary type of aquifer. Itaquera and Sao Caetano do Sul showed signals varying from 28 mm/y to 32 mm/y. On the other hand, the places most likely to be suffering from stronger subsidence are those in the Greater Sao Paulo and in Guarulhos, right beside the International Airport of Sao Paulo. The rate of displacement observed in both regions goes from 35 mm/y to 40 mm/y. Previous investigations of water use at the International Airport highlight the risks of the excessive water extraction that was being done through 9 deep wells. Therefore, subsidence events are likely to occur and to cause serious damage in the area. This study reveals a situation that has not been given proper importance in the city, considering its social and economic consequences. Since data were only available until 2011, the question that remains is whether the situation still persists. The results nevertheless reaffirm a scenario of risk at the International Airport of Sao Paulo that needs further investigation.
Abstract: The operation of nuclear power plants involves continuous monitoring of the environment in their area. This monitoring is performed using a complex data acquisition system, which collects status information about the system itself and the values of many important physical variables, e.g. temperature, humidity, dose rate, etc. This paper describes a proposal for, and the optimization of, the communication that takes place in a teledosimetric system between the central control server, responsible for data processing and storage, and the decentralized measuring stations that measure the physical variables. The ongoing communication was analyzed, and the system architecture and communication were optimized accordingly.
Abstract: The availability of powerful eye-safe laser sources and recent advancements in electro-optical and mechanical beam-steering components have allowed laser-based Light Detection and Ranging (LIDAR) to become a promising technology for obstacle warning and avoidance in a variety of manned and unmanned aircraft applications. LIDAR's outstanding angular resolution and accuracy are coupled with good detection performance over a wide range of incidence angles and weather conditions, providing an ideal obstacle avoidance solution that is especially attractive for low-level flying platforms such as helicopters and small-to-medium size Unmanned Aircraft (UA). The Laser Obstacle Avoidance Marconi (LOAM) system is one such system, jointly developed and tested by SELEX-ES and the Italian Air Force Research and Flight Test Centre. The system was originally conceived for military rotorcraft platforms; in this paper, we briefly review the previous work and discuss in more detail some of the key development activities required to integrate LOAM on UA platforms. The main hardware and software design features of this LOAM variant are presented, including a brief description of the system interfaces and sensor characteristics, together with the system performance models and data processing algorithms for obstacle detection, classification and avoidance. In particular, the paper focuses on the algorithm proposed for optimal avoidance trajectory generation in UA applications.
Abstract: Information in the nervous system is coded as firing patterns of electrical signals called action potentials or spikes, so an essential step in the analysis of neural mechanisms is the detection of action potentials embedded in neural data. Several methods have been proposed in the literature for this purpose. In this paper, a novel method based on empirical mode decomposition (EMD) is developed. EMD is a decomposition method that extracts oscillations in different frequency ranges from a waveform. The method is adaptive, and no a priori knowledge about the data or parameter tuning is needed. Results for simulated data indicate that the proposed method is comparable with wavelet-based methods for spike detection. For neural signals with a signal-to-noise ratio near 3, the proposed method is capable of accurately detecting more than 95% of action potentials.
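A hedged sketch of the EMD-based detection pipeline is given below: the trace is decomposed into intrinsic mode functions (IMFs), the high-frequency IMFs where spikes concentrate are summed, and a robust threshold is applied. It assumes the third-party PyEMD package; the number of IMFs kept and the 4-sigma threshold are illustrative choices, not the paper's exact settings.

# Sketch of EMD-based spike detection (assumes PyEMD: pip install EMD-signal).
import numpy as np
from PyEMD import EMD

def detect_spikes(signal, fs, n_hf_imfs=2, k=4.0):
    imfs = EMD().emd(signal)                # adaptive decomposition, no tuning
    hf = imfs[:n_hf_imfs].sum(axis=0)       # high-frequency content
    sigma = np.median(np.abs(hf)) / 0.6745  # robust noise estimate
    above = np.abs(hf) > k * sigma
    return np.flatnonzero(above[1:] & ~above[:-1]) / fs  # spike onset times (s)

rng = np.random.default_rng(0)
x = rng.normal(0, 1, 5000)
x[[500, 2500, 4000]] += 12.0                # three artificial spikes
print(detect_spikes(x, fs=1000.0))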
Abstract: The research was carried out on fresh wild cranberries and cranberry cultivars grown in Latvia. The aim of the study was to evaluate the effect of pre-treatment method and drying conditions on the composition of volatile compounds in cranberries. The berry pre-treatment methods were perforation, halving and steam-blanching. Before drying in a cabinet drier, the berries were pre-treated using all three methods; before drying in a microwave vacuum drier, steam-blanching and halving were used. Volatile compounds in cranberries were analysed by GC-MS of extracts obtained by SPME. In the present research, 21 different volatile compounds were detected in fresh cranberries: 15 in the cultivar 'Steven', 13 in 'Bergman' and 'Early black', 11 in 'Ben Lear' and 'Pilgrim', and 14 in wild cranberries. In dried cranberries, 20 volatile compounds were detected. Mathematical data processing supports the conclusion that cranberry cultivar, pre-treatment method and drying conditions significantly influence the volatile compounds in berries and the formation of new volatile compounds.
Abstract: The Artificial Immune System (AIS) is a relatively young paradigm for intelligent computation. The inspiration for AIS is derived from the natural Immune System (IS). Classically, it is believed that the IS strives to discriminate between self and non-self, and most existing AIS research is based on this approach. Danger Theory (DT) disputes this view and proposes that the IS fights against danger-producing elements and tolerates others. As computational researchers, we are not concerned with the arguments among immunologists, but try to extract from them novel abstractions for intelligent computation. This paper follows the DT inspiration for intelligent data processing, an approach that may open a new avenue in intelligent processing. The data used are system call data, which are potentially significant in intrusion detection applications.
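The contrast with self/non-self discrimination can be sketched conceptually: an anomalous system-call sequence is tolerated unless it co-occurs with a danger signal such as abnormal resource usage. The sequences, signals and thresholds below are invented placeholders, not the paper's detector.

# Conceptual sketch of Danger Theory gating on system-call data.
KNOWN_SEQUENCES = {("open", "read", "close"), ("open", "write", "close")}

def is_anomalous(window):
    return tuple(window) not in KNOWN_SEQUENCES

def danger_signal(cpu_load, failed_calls):
    return cpu_load > 0.9 or failed_calls > 5   # assumed danger indicators

def classify(window, cpu_load, failed_calls):
    # Unlike pure self/non-self discrimination, tolerate anomalies
    # unless they are accompanied by a danger signal.
    if is_anomalous(window) and danger_signal(cpu_load, failed_calls):
        return "attack"
    return "tolerated"

print(classify(["open", "write", "exec"], cpu_load=0.95, failed_calls=8))
print(classify(["open", "write", "exec"], cpu_load=0.30, failed_calls=0))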
Abstract: Data gathering is an essential operation in wireless sensor network applications, so energy-efficient techniques are required to increase the lifetime of the network. Clustering is likewise an effective technique to improve the energy efficiency and network lifetime of wireless sensor networks. In this paper, an energy-efficient cluster formation protocol is proposed with the objective of achieving low energy dissipation and latency without sacrificing application-specific quality. The objective is achieved by applying randomized, adaptive, self-configuring cluster formation and localized control for data transfers. It involves application-specific data processing, such as data aggregation or compression. The cluster formation algorithm allows each node to make independent decisions, so as to generate good clusters in the end. Simulation results show that the proposed protocol requires minimum energy and latency for cluster formation, thereby reducing the overhead of the protocol.
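One round of randomized, self-configuring cluster-head election can be sketched as below, using the standard rotating-threshold formula from the LEACH literature as a stand-in for the paper's algorithm; the head fraction P and the demo parameters are assumptions.

# Sketch of randomized cluster-head election (LEACH-style threshold).
import random

P = 0.05  # desired fraction of cluster heads per round (assumed)

def elect_cluster_heads(node_ids, round_no, was_head_recently):
    """Each node decides independently: it becomes a head if a random draw
    falls below a rotating threshold, skipping recent heads."""
    heads = []
    threshold = P / (1 - P * (round_no % int(1 / P)))
    for nid in node_ids:
        if nid not in was_head_recently and random.random() < threshold:
            heads.append(nid)
    return heads

print(elect_cluster_heads(range(100), round_no=3, was_head_recently=set()))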
Abstract: Manufacturing industries face a crucial change, as products and processes are required to be easily and efficiently reconfigurable and reusable. In order to stay competitive and flexible, the situation also demands globally distributed enterprises, which requires the implementation of efficient communication strategies. A prototype system called the "Broadcaster" has been developed, under the assumption that the control environment description has been engineered using the component-based system paradigm. This prototype distributes information to a number of globally distributed partners by adopting a circular-based data processing mechanism. The work highlighted in this paper includes the implementation of this mechanism in the domain of the manufacturing industry. The proposed solution enables real-time remote propagation of machine information to a number of distributed supply chain client resources, such as an HMI, VRML-based 3D views and remote client instances, regardless of their distribution and/or their mechanisms. This approach is presented together with a set of evaluation results. The authors' main concentration surrounds the reliability and performance metrics of the adopted approach. Performance is evaluated in terms of the response times taken to process the data in this domain, compared with an alternative data processing implementation such as a linear queue mechanism. Based on the evaluation results obtained, the authors justify the benefits achieved by the proposed implementation and highlight further research work to be carried out.
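The circular mechanism, as opposed to a linear queue, can be illustrated with a fixed-size ring buffer that overwrites the oldest machine-status record so that clients always read recent data without unbounded queue growth. The record fields and capacity are assumptions, not the Broadcaster's actual format.

# Illustrative ring buffer for circular-based data processing.
class RingBuffer:
    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.head = 0          # next write position
        self.count = 0

    def publish(self, record):
        self.buf[self.head] = record
        self.head = (self.head + 1) % len(self.buf)  # wrap around
        self.count = min(self.count + 1, len(self.buf))

    def latest(self, n):
        """Return up to n most recent records, newest first."""
        n = min(n, self.count)
        return [self.buf[(self.head - 1 - i) % len(self.buf)] for i in range(n)]

rb = RingBuffer(4)
for i in range(6):
    rb.publish({"seq": i, "status": "running"})
print(rb.latest(3))   # seq 5, 4, 3 -- the oldest entries were overwritten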
Abstract: Data mining is the extraction of knowledge from the large sets of data generated by various data processing activities. Frequent pattern mining is a very important task in data mining. Previous approaches to generating frequent itemsets generally adopt candidate generation and pruning techniques to achieve the desired objective. This paper shows how different approaches achieve the objective of frequent pattern mining, along with the complexities required to perform the job. The paper also examines a hardware approach based on cache coherence to improve the efficiency of the process. Data mining is helpful in building support systems for management, bioinformatics, biotechnology, medical science, statistics, mathematics, banking, networking and other computer-related applications. This paper proposes the use of both the upward and downward closure properties for the extraction of frequent itemsets, which reduces the total number of scans required for the generation of candidate sets.
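How the downward-closure property prunes candidates can be shown with a minimal Apriori-style sketch: a k-itemset is kept as a candidate only if all of its (k-1)-subsets were frequent. The dataset and min_support are illustrative; the paper's combined upward/downward scheme is not reproduced here.

# Minimal Apriori sketch with downward-closure pruning.
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    items = {frozenset([i]) for t in transactions for i in t}
    freq, k_sets = [], {s for s in items
                        if sum(s <= t for t in transactions) >= min_support}
    while k_sets:
        freq.extend(k_sets)
        # Candidate generation, then downward-closure pruning.
        candidates = {a | b for a in k_sets for b in k_sets
                      if len(a | b) == len(a) + 1}
        candidates = {c for c in candidates
                      if all(frozenset(s) in k_sets
                             for s in combinations(c, len(c) - 1))}
        k_sets = {c for c in candidates
                  if sum(c <= t for t in transactions) >= min_support}
    return freq

data = [frozenset("abc"), frozenset("abd"), frozenset("ab"), frozenset("cd")]
print([set(s) for s in frequent_itemsets(data, min_support=2)])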
Abstract: This paper presents an alternative approach that uses an artificial neural network to simulate the flood level dynamics in a river basin. The algorithm was developed in a decision support system environment to enable users to process the data. The decision support system is found to be useful due to its interactive nature, flexible approach and evolving graphical features, and it can be adopted for any similar situation to predict the flood level. The main data processing includes gauging station selection, input generation, lead-time selection/generation, and the length of prediction. The program enables users to process the flood level data, to train/test the model using various inputs and to visualize the results. The program code consists of a set of files, which can be modified to match other purposes. The results indicate that the decision support system applied to the flood level reaches encouraging results for the river basin under examination. The comparison of the model predictions with the observed data was satisfactory; the model is able to forecast the flood level up to 5 hours in advance with reasonable prediction accuracy. Finally, the program may also serve as a tool for real-time flood monitoring and process control.
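The lead-time setup can be sketched as building lagged gauge readings as inputs and training a small feedforward network to predict the level several hours ahead. The sketch uses scikit-learn's MLPRegressor as a stand-in for the paper's ANN; the lags, layer size and synthetic series are assumptions.

# Hedged sketch of lead-time flood level prediction with a small ANN.
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_dataset(levels, n_lags=4, lead=5):
    X, y = [], []
    for t in range(n_lags, len(levels) - lead):
        X.append(levels[t - n_lags:t])        # recent gauge readings
        y.append(levels[t + lead])            # level `lead` hours ahead
    return np.array(X), np.array(y)

hours = np.arange(500)
levels = 2.0 + np.sin(hours / 24.0) \
         + 0.05 * np.random.default_rng(1).normal(size=500)  # toy river stage
X, y = make_dataset(levels)
split = int(0.8 * len(X))
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
print("test R^2:", round(model.score(X[split:], y[split:]), 3))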
Abstract: The development of distributed systems has been driven by the need to accommodate an increasing degree of flexibility, adaptability, and autonomy. Mobile Agent technology is emerging as an alternative for building a smart generation of highly distributed systems. In this work, we investigate the performance of agent-based technologies for information retrieval. We present a comparative performance evaluation model of Mobile Agents versus Remote Method Invocation by means of an analytical approach. We demonstrate the effectiveness of mobile agents for dynamic code deployment and remote data processing in reducing total latency while producing minimum network traffic. We argue that exploiting agent-based technologies significantly enhances the performance of distributed systems in the domain of information retrieval.
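A back-of-the-envelope version of such an analytical comparison: n request/response rounds over RMI versus shipping agent code once, processing at the data source, and returning one result. All sizes, bandwidth and round-trip values are assumptions, not the paper's model parameters.

# Toy latency comparison: RMI round trips vs. one agent migration.
def rmi_latency(n_rounds, req_bytes, resp_bytes, bandwidth, rtt):
    return n_rounds * (rtt + (req_bytes + resp_bytes) / bandwidth)

def agent_latency(code_bytes, result_bytes, bandwidth, rtt, local_time):
    # One migration, local processing at the data source, one result back.
    return 2 * rtt + (code_bytes + result_bytes) / bandwidth + local_time

bw, rtt = 1e6, 0.05          # 1 MB/s link, 50 ms round trip (assumed)
print("RMI  :", rmi_latency(100, 1e3, 1e4, bw, rtt), "s")
print("Agent:", agent_latency(5e4, 1e4, bw, rtt, local_time=0.2), "s")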
Abstract: In order to provide a high level of expertise, the computer-aided design of mechanical systems involves specific activities focused on processing two types of information: knowledge and data. Expert rule-based knowledge generally processes qualitative information and involves searching for proper solutions and combining them into a synthetic variant. Data processing is based on computational models and is supposed to be inter-related with the reasoning in the knowledge processing. In this paper, an Intelligent Integrated System is proposed with the objective of choosing the adequate material. The software is developed in Prolog – Flex software and takes into account various constraints that appear in the accurate operation of gears.
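The rule-based selection idea can be illustrated as follows (the paper's system is written in Prolog/Flex; this Python version is only a conceptual stand-in, and the materials, rules and constraint values are invented placeholders).

# Toy rule-based material selection under a gear constraint.
MATERIALS = [
    {"name": "case-hardened steel", "max_contact_stress": 1500, "cost": 3},
    {"name": "through-hardened steel", "max_contact_stress": 1100, "cost": 2},
    {"name": "polyamide", "max_contact_stress": 60, "cost": 1},
]

def choose_material(required_contact_stress, quiet_running=False):
    """Filter materials by the gear's contact-stress constraint, then pick
    the cheapest admissible one (preferring polymer if quietness is asked)."""
    ok = [m for m in MATERIALS
          if m["max_contact_stress"] >= required_contact_stress]
    if quiet_running:
        polymers = [m for m in ok if m["name"] == "polyamide"]
        if polymers:
            return polymers[0]["name"]
    return min(ok, key=lambda m: m["cost"])["name"] if ok else None

print(choose_material(900))          # -> through-hardened steel
print(choose_material(50, True))     # -> polyamide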
Abstract: ANN-ARIMA, which combines the autoregressive integrated moving average (ARIMA) model and the artificial neural network (ANN) model, is a valuable tool for modeling and forecasting nonlinear time series, yet the over-fitting problem is more likely to occur in neural network models. This paper provides a hybrid methodology, called BS-RBFAR, that combines a radial basis function (RBF) neural network and an autoregression (AR) model based on the binomial smoothing (BS) technique, which is efficient in data processing. The method is examined using the Canadian lynx data. Empirical results indicate that the over-fitting problem can be eased using the RBF neural network based on binomial smoothing (called BS-RBF), and that the hybrid BS-RBFAR model can be an effective way to improve the forecasting accuracy achieved by BS-RBF used separately.
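The binomial smoothing pre-processing step can be sketched as convolving the series with normalized binomial coefficients before fitting the RBF/AR models; the kernel order and the toy series below are assumptions, not the paper's settings.

# Sketch of binomial smoothing (BS) of a time series.
import numpy as np

def binomial_smooth(x, order=2):
    """Smooth with a binomial kernel, e.g. order 2 -> [1, 2, 1] / 4."""
    kernel = np.array([1.0])
    for _ in range(order):
        kernel = np.convolve(kernel, [0.5, 0.5])
    return np.convolve(x, kernel, mode="same")

x = np.array([2.0, 8.0, 3.0, 9.0, 4.0, 10.0])
print(binomial_smooth(x))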
Abstract: This study focuses on bureau management technologies and information systems in developing countries. Developing countries use such systems to facilitate executive and organizational functions through the utilization of bureau management technologies and to provide the executive staff with the necessary information.
The concepts of data and information differ from each other in developing countries, and thus the concepts of data processing and information processing are different. Symbols represent ideas, objects, figures, letters and numbers. A data processing system is an integrated system that processes data related to the internal and external environment of the organization in order to make decisions, create plans and develop strategies; it goes without saying that this system is composed of both human beings and machines. Information is obtained through the acquisition and processing of data, whereas data are raw communicative messages. Within this framework, data processing amounts to producing plausible information out of raw data.
Organizations in developing countries need to obtain information relevant to them, because rapid changes in the organizational arena require rapid access to accurate information. The most significant role of the directors and managers who work in the organizational arena is to make decisions. Making a correct decision is possible only when directors and managers are equipped with sound ideas and appropriate information. Therefore, the acquisition, organization and distribution of information gain significance. Today's organizations make use of computer-assisted "Management Information Systems" in order to obtain and distribute information.
A Decision Support System, which is closely related to practice, is an information system that facilitates the director's task of making decisions. A Decision Support System integrates human intelligence, information technology and software in order to solve complex problems. With the support of computer technology and software systems, a Decision Support System produces information relevant to the decision to be made by the director and provides the executive staff with supportive ideas about the decision.
Artificial Intelligence programs that transfer the studies and experiences of people to the computer are called expert systems. An expert system stores expert information in a limited area and can solve problems by deriving rational consequences.
Bureau management technologies and information systems in developing countries create a kind of information society and information economy that give those countries their places in the global socio-economic structure and enable them to play a reasonable and fruitful role; it is therefore of crucial importance to make use of information and management technologies in order to work together with innovative and enterprising individuals, and it is also significant to create "scientific policies" based on information and technology in the fields of economy, politics, law and culture.
Abstract: Recently, grid computing has attracted wide attention in science, industry, and business, fields that require a vast amount of computing. Grid computing provides an environment in which many nodes (i.e., many computers) are connected to each other through a local/global network and made available to many users. In this environment, to achieve data processing among nodes for any application, each node performs mutual authentication using certificates published by the Certificate Authority (CA for short). However, if a failure or fault occurs in the CA, no new certificates can be published, and as a result a new node cannot participate in the grid environment. In this paper, an off-the-shelf scheme for dependable grid systems using virtualization techniques is proposed and its implementation is verified. The proposed approach uses virtualization techniques to restart an application, e.g., the CA, if it has failed, so the system can tolerate a failure or fault occurring in the CA. Since the proposed scheme is easily implemented at the application level, its implementation cost for the system builder is low compared with other methods. Simulation results show that the CA in the system can recover from its failure or fault.
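The application-level restart idea can be sketched as a watchdog loop that health-checks the CA process and relaunches it when it exits (in the full design the restart would target a fresh virtual machine or container). The command name and check interval are hypothetical placeholders, not the paper's implementation.

# Minimal watchdog sketch: restart the CA application when it fails.
import subprocess, time

CA_COMMAND = ["python", "ca_server.py"]   # hypothetical CA start command
CHECK_INTERVAL = 5                        # seconds between health checks

def run_with_watchdog():
    proc = subprocess.Popen(CA_COMMAND)
    while True:
        time.sleep(CHECK_INTERVAL)
        if proc.poll() is not None:       # CA exited: failure or fault
            print("CA failed with code", proc.returncode, "- restarting")
            proc = subprocess.Popen(CA_COMMAND)

if __name__ == "__main__":
    run_with_watchdog()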