Abstract: This paper describes an effective solution to the task
of remote monitoring of super-extended objects (oil and gas
pipelines, railways, national frontiers). The suggested solution is
based on the principle of simultaneous monitoring of the
seismoacoustic and optical/infrared physical fields. This principle
is not new, but in contrast to known solutions the suggested
approach allows super-extended objects to be monitored at very
limited operational cost. So-called C-OTDR (Coherent Optical Time
Domain Reflectometer) systems are used to monitor the seismoacoustic
field, and far-CCTV systems are used to monitor the optical/infrared
field. Joint processing of the data provided by both systems allows
target activities appearing in the vicinity of the monitored objects
to be detected and classified effectively. The results of practical
usage have shown the high effectiveness of the suggested approach.
Abstract: The aim of this research is to design a collaborative
framework that integrates risk analysis activities into the geospatial
database design (GDD) process. Risk analysis is rarely undertaken
iteratively as part of present GDD methods, in conformance with
requirements engineering (RE) guidelines and risk standards.
Accordingly, when risk analysis is performed during GDD, some
foreseeable risks may be overlooked and fail to reach the output
specifications, especially when user intentions are not systematically
collected. This may lead to ill-defined requirements and ultimately to
higher risks of geospatial data misuse. The adopted approach consists
of 1) reviewing the risk analysis process within the scope of RE and
GDD, 2) analyzing the challenges of risk analysis within the context
of GDD, and 3) presenting the components of a risk-based
collaborative framework that improves the collection of the
intended/forbidden usages of the data and helps geo-IT experts to
discover implicit requirements and risks.
Abstract: Nowadays, developing countries, in order to progress in
science and technology and to narrow the technological gap with
developed countries, are increasing their capacities and their
technology transfer from developed countries. To remain competitive, industry is
continually searching for new methods to evolve their products.
Business model is one of the latest buzzwords in the Internet and
electronic business world. To be successful, organizations must look
into the needs and wants of their customers. This research attempts to
identify a specific feature of the company with a strong competitive
advantage by analyzing the cause of Customer satisfaction. Due to
the rapid development of knowledge and information technology,
business environments have become much more complicated.
Information technology can help a firm aiming to gain a competitive
advantage. This study explores the role and effect of information
and communication technology (ICT) in business models and customer
satisfaction in firms, as well as the relationship between ICT and
outsourcing strategies.
Abstract: In this paper we investigated a number of the Internet
congestion control algorithms that have been developed in the last few
years. We found that many of these algorithms were designed to treat
Internet traffic merely as a train of consecutive packets. A few other
algorithms were specifically tailored to handle the Internet
congestion caused by media traffic that carries audiovisual content;
this latter set of algorithms is considered to be aware of the nature
of the media content. In this context we briefly explained a number of
congestion control algorithms and categorized them into the two
following categories: i) media congestion control algorithms and
ii) common congestion control algorithms. We recommend the usage of
the media congestion control algorithms, because they are media
content-aware, over the common type of algorithms that manipulate such
traffic blindly. We showed that the spread of such media content-aware
algorithms over the Internet will lead to better congestion control in
the coming years, due to the observed emergence of the era of digital
convergence, in which media traffic will form the majority of Internet
traffic.
Abstract: IMCS is an Integrated Monitoring and Control System for
thermal power plants. The system consists mainly of two parts:
controllers and the OIS (Operator Interface System). The two parts are
connected by Ethernet-based communication. The controller side of the
communication is managed by the CNet module, and the OIS side is
managed by the OIS data server. The CNet module sends controller data
to the data server and receives command data from it. To minimize or
balance the load on the data server, the module buffers the data
created by the controller at every cycle and sends the buffered data
to the data server on request. For multiple data servers, the module
manages a connection to each data server and responds to each server's
requests. The CNet module is included in each controller of a
redundant system; when controller fail-over happens, the module can
provide controller data to the data server without loss. This paper
presents three main features of the CNet module that carry out these
functions: separation of the get task, usage of a ring buffer, and
monitoring of communication status.
Abstract: Perth will run out of available sustainable natural
water resources by 2015 if nothing is done to slow usage rates,
according to a Western Australian study [1]. Alternative water
technology options need to be considered for the long-term
guaranteed supply of water for agricultural, commercial, domestic
and industrial purposes. Seawater is an alternative source of water
for human consumption, because it can be desalinated and supplied in
large quantities at very high quality.
While seawater desalination is a promising option, the technology
requires a large amount of energy, which is typically generated from
fossil fuels. The combustion of fossil fuels emits greenhouse gases
(GHG) and is implicated in climate change. In addition to
environmental emissions from electricity generation for desalination,
greenhouse gases are emitted in the production of chemicals and
membranes for water treatment. Since Australia is a signatory to the
Kyoto Protocol, it is important to quantify greenhouse gas emissions
from desalinated water production.
A life cycle assessment (LCA) has been carried out to determine
the greenhouse gas emissions from the production of 1 gigalitre (GL)
of water from the new plant. In this LCA, a new desalination plant to
be installed in Bunbury, Western Australia, known as the Southern
Seawater Desalination Plant (SSDP), was taken as a
case study. The system boundary of the LCA mainly consists of three
stages: seawater extraction, treatment and delivery. The analysis
found that the equivalent of 3,890 tonnes of CO2 could be emitted
from the production of 1 GL of desalinated water. The LCA has also
identified that the reverse osmosis process would cause the most
significant greenhouse emissions as a result of the electricity used,
if this electricity is generated from fossil fuels.
Abstract: Wastes such as grated coconut meat, spent tea leaves and used sugarcane have had negative impacts on the environment. The vermicomposting method is used to manage these wastes in a more sustainable way. The worms used in the vermicomposting are Eisenia foetida and Eudrilus eugeniae. This research shows that the vermicompost of these wastes produces an electrical voltage and is able to light up a light-emitting diode (LED) device. Based on the experiment, the use of replicated, double compartments of the component produces double the voltage. In conclusion, this harmless and low-cost vermicompost technology can act as a dry cell, reducing the usage of hazardous chemicals that can contaminate the environment.
Abstract: In this research, the diffusion of innovation regarding
smartphone usage is analysed through a consumer behaviour theory.
This research aims to determine whether a pattern surrounding the
diffusion of innovation exists. As a methodology, an empirical study
of the switch from a conventional cell phone to a smartphone was
performed. Specifically, a questionnaire survey was completed by
general consumers, and the situational and behavioural characteristics
of switching from a cell phone to a smartphone were analysed. In
conclusion, we found that the speed of the diffusion of innovation, the
consumer behaviour characteristics, and the utilities of the product
vary according to the stage of the product life cycle.
Abstract: The paper gives pilot results of a project oriented toward
the use of data mining techniques for knowledge discovery from
production systems, with the discovered knowledge applied in the
management of these systems. Simulation models of manufacturing
systems have been developed to obtain the necessary data about
production. The authors have developed a way of storing the data
obtained from the simulation models in a data warehouse. A data
mining model has been created using specific methods and selected
techniques for defined problems of production system management. The
new knowledge has been applied to the production management system
and tested on simulation models of the production system. An
important benefit of the project has been the proposal of a new
methodology, focused on data mining from the databases that store
operational data about the production process.
Abstract: The Available Bit Rate (ABR) service is a lower-priority
service well suited to the transmission of data. In wireline ATM
networks, an ABR source continuously receives feedback from switches
about increases or decreases in bandwidth according to changing
network conditions, and a minimum bandwidth is guaranteed. In wireless
networks, guaranteeing the minimum bandwidth is a challenging task, as
the source is mobile and travels from one cell to another.
Re-establishing virtual circuits end to end each time causes delay in
transmission. We propose a mechanism to provide more available
bandwidth to the ABR source by reusing part of the old virtual
channels and establishing new ones. We want the ABR source to transmit
data continuously (non-stop) in order to avoid the delay; in the worst
case, at least the minimum bandwidth is to be allocated. To keep the
data flowing continuously, a handoff ABR call is given priority over a
new ABR call.
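The idea of reusing part of an old virtual channel after handoff can be illustrated with a minimal sketch: keep the common prefix of the old and new end-to-end switch paths and establish only the differing tail. The path representation and names below are assumptions for illustration, not the paper's actual signalling procedure:

```python
def reuse_path(old_path, new_path):
    """Return (kept, added): the common prefix of the old and new
    end-to-end switch paths (the reusable part of the old virtual
    channel) and the tail that still has to be established."""
    i = 0
    while (i < len(old_path) and i < len(new_path)
           and old_path[i] == new_path[i]):
        i += 1
    return new_path[:i], new_path[i:]

# hypothetical switch paths before and after a handoff from cellA to cellB
old = ["src", "s1", "s2", "s3", "cellA"]
new = ["src", "s1", "s2", "s4", "cellB"]
kept, added = reuse_path(old, new)
print(kept, added)  # -> ['src', 's1', 's2'] ['s4', 'cellB']
```

Only the two hops past the crossover switch need fresh setup, which is where the saving in re-establishment delay would come from.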
Abstract: It is well recognized that one feature of a successful
company is its ability to align its business goals with its
information and communication technology platform. Enterprise
Resource Planning (ERP) systems contribute to better performance by
integrating various business functions and providing support for
information flows. However, the complexity of these technological
systems is known to prevent business users from exploiting ERP
systems efficiently. This paper aims to investigate the role of
training in improving the usage of ERP systems. To this end, we
designed a survey instrument and administered it to employees of a
Norwegian multinational global provider of technology solutions.
Based on the analysis of the collected data, we have delineated a
training model that could be of high relevance for both researchers
and practitioners as a step towards a better understanding of ERP
system implementation.
Abstract: Knowledge development in companies relies on
knowledge-intensive business processes, which are characterized by
a high complexity in their execution, weak structuring,
communication-oriented tasks and high decision autonomy, and often the need for creativity and innovation. A foundation of knowledge development is provided, which is based on a new conception of
knowledge and knowledge dynamics. This conception consists of a three-dimensional model of knowledge with types, kinds and qualities. Built on this knowledge conception, knowledge dynamics is
modeled with the help of general knowledge conversions between
knowledge assets. Here knowledge dynamics is understood to cover
all of acquisition, conversion, transfer, development and usage of
knowledge. Through this conception we gain a sound basis for
knowledge management and development in an enterprise. Especially
the type dimension of knowledge, which categorizes it according to
its internality and externality with respect to the human being, is crucial for enterprise knowledge management and development,
because knowledge should be made available by converting it to
more external types.
Built on this conception, a modeling approach for knowledge-intensive
business processes is introduced, covering human-driven, e-driven and
task-driven processes. As an example of this approach, a model of the
creative activity for the renewal planning of a product is given.
Abstract: The article considers the basic features of students'
information culture and its wide use in implementing new information
technologies in the educational process, which determines the search
for ways to relate the content, aims and objectives of
interdisciplinary study. In this regard, the article raises questions
about students' information culture and presents information about
the aims and objectives of forming an information culture among
students. Forming a professional interest in relevant information
offers an opportunity to support professional activities through the
effective use of interactive methods and innovative technologies in
the learning process. The results of the experiment prove the
effectiveness of forming the information culture of students in a
system of higher education based on credit technology. The main
purpose of this paper is a comprehensive review of students'
information culture.
Abstract: Texture classification is a popular and appealing technique
in the field of texture analysis. Textures, i.e. repeated patterns,
have different frequency components along different orientations. Our
work is based on texture classification, which finds applications in
various fields such as medical image classification, computer vision,
remote sensing, agriculture, and the textile industry. Weed control
has a major
effect on agriculture. A large amount of herbicide has been used for
controlling weeds in agriculture fields, lawns, golf courses, sport
fields, etc. Random spraying of herbicides does not meet the exact
requirements of the field: certain areas in a field have more weed
patches than estimated. So, we need a visual system that can
discriminate weeds from the field image which will reduce or even
eliminate the amount of herbicide used. This would allow farmers to
not use any herbicides or only apply them where they are needed. A
machine vision precision automated weed control system could
reduce the usage of chemicals in crop fields. In this paper, an
intelligent system for an automatic weeding strategy, Multi-Resolution
Combined Statistical and Spatial Frequency, is used to discriminate
weeds from crops and to classify them as narrow, little or broad
weeds.
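As an illustration of the kind of statistical texture feature such vision systems rely on, the sketch below computes a gray-level co-occurrence matrix (GLCM) and its contrast statistic for two tiny images. This is a generic, hedged example: the paper's actual Multi-Resolution Combined Statistical and Spatial Frequency method is more elaborate, and the images here are invented.

```python
def glcm(img, dx, dy, levels):
    """Gray-level co-occurrence matrix for a single pixel offset (dx, dy)."""
    h, w = len(img), len(img[0])
    m = [[0] * levels for _ in range(levels)]
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                m[img[y][x]][img[ny][nx]] += 1
    return m

def contrast(m):
    """Contrast statistic: high when neighbouring pixels differ often."""
    total = sum(sum(row) for row in m)
    return sum(m[i][j] * (i - j) ** 2
               for i in range(len(m)) for j in range(len(m))) / total

smooth = [[0, 0, 1, 1]] * 4               # two uniform vertical bands
noisy = [[0, 1, 0, 1], [1, 0, 1, 0]] * 2  # checkerboard
# the fine-grained checkerboard scores a much higher contrast
print(contrast(glcm(smooth, 1, 0, 2)), contrast(glcm(noisy, 1, 0, 2)))
```

Features of this kind, computed per image region, are what lets a classifier separate the coarse texture of broad-leaved weeds from the finer texture of crops or narrow weeds.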
Abstract: Influence Diagrams (IDs) are a kind of probabilistic belief network for graphical modeling. The usage of IDs can improve communication among field experts, modelers, and decision makers by showing the issue frame under discussion from a high-level point of view. This paper enhances the Time-Sliced Influence Diagrams (TSIDs, also called Dynamic IDs) formalism from a Discrete Event Systems Modeling and Simulation (DES M&S) perspective, for Exploring Analysis (EA) modeling. The enhancements enable a modeler to specify the occurrence times of endogenous events dynamically, with stochastic sampling as the model runs, and to describe the inter-influences among them with variable nodes in a dynamic situation that the existing TSIDs fail to capture. The new class of model is named Dynamic-Stochastic Influence Diagrams (DSIDs). The paper includes a description of the modeling formalism and the hierarchical simulators implementing its simulation algorithm, and shows a case study to illustrate its enhancements.
Abstract: This paper studies the dependability of component-based
applications, especially embedded ones, from the diagnosis
point of view. The principle of the diagnosis technique is to
implement inter-component tests in order to detect and locate the
faulty components without redundancy. The proposed approach for
diagnosing faulty components consists of two main aspects. The first
one concerns the execution of the inter-component tests which
requires integrating test functionality within a component. This is the
subject of this paper. The second one is the diagnosis process itself
which consists of the analysis of inter-component test results to
determine the fault-state of the whole system. The advantages of this
diagnosis method compared with classical redundancy-based
fault-tolerant techniques are application autonomy,
cost-effectiveness and better usage of system resources. These
advantages are very important for many systems, and especially for
embedded ones.
Abstract: This paper provides an introduction into the evolution
of information and communication technology and illustrates its
usage in the work domain. The paper is sub-divided into two parts.
The first part gives an overview of the different phases of
information processing in the work domain. It starts by charting the
past and present usage of computers in work environments and shows
current technological trends, which are likely to influence future
business applications. The second part starts by briefly describing
how the usage of computers changed business processes in the past,
and presents first Ambient Intelligence applications based on
identification and localization information, which are already used in
the production and retail sector. Based on current systems and
prototype applications, the paper gives an outlook on how Ambient
Intelligence technologies could change business processes in the
future.
Abstract: A new approach based on the consideration that electroencephalogram (EEG) signals are chaotic signals was presented for automated diagnosis of electroencephalographic changes. This consideration was tested successfully using nonlinear dynamics tools, such as the computation of Lyapunov exponents. This paper presented the usage of statistics over the set of Lyapunov exponents in order to reduce the dimensionality of the extracted feature vectors. Since classification is more accurate when the pattern is simplified through representation by important features, feature extraction and selection play an important role in classifying systems such as neural networks. Multilayer perceptron neural network (MLPNN) architectures were formulated and used as the basis for the detection of electroencephalographic changes. Three types of EEG signals (EEG signals recorded from healthy volunteers with eyes open, epilepsy patients in the epileptogenic zone during a seizure-free interval, and epilepsy patients during epileptic seizures) were classified. The selected Lyapunov exponents of the EEG signals were used as inputs of the MLPNN trained with the Levenberg-Marquardt algorithm. The classification results confirmed that the proposed MLPNN has potential in detecting the electroencephalographic changes.
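The dimensionality-reduction step, i.e. statistics computed over the set of Lyapunov exponents, can be sketched as follows (the particular statistics and the exponent values below are illustrative assumptions, not the paper's exact choices):

```python
import statistics

def reduce_exponents(exponents):
    """Collapse a variable-length set of Lyapunov exponents into a short,
    fixed-size feature vector; the four statistics here are illustrative."""
    return [
        max(exponents),                 # largest exponent (chaoticity)
        min(exponents),                 # smallest exponent
        statistics.mean(exponents),     # central tendency
        statistics.stdev(exponents),    # spread across the spectrum
    ]

# hypothetical exponents estimated from one EEG segment
features = reduce_exponents([0.42, 0.11, -0.05, -0.31, -0.62])
print(features)
```

Whatever statistics are chosen, the point is that every EEG segment maps to a vector of the same small length, which is what a fixed-input classifier such as an MLPNN requires.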
Abstract: In recent years the number of applications of multi-robot
systems (MRS) has been growing in various areas. Their design is,
however, often difficult in practice: algorithms are proposed on a
theoretical basis and do not consider the errors and noise of real
conditions, so they are not usable in a real environment. These
errors are also clearly visible in the task of target localization,
in which robots try to find and estimate the position of a target
with their sensors. Localization of a target is possible with a
single robot, but, as examined here, target finding and localization
by a group of mobile robots can estimate the target position more
accurately and faster. The accuracy of the target position estimate
is achieved by the cooperation of the MRS and particle filtering. The
advantage of using the MRS with particle filtering was tested on the
task of fixed-target localization by a group of mobile robots.
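A minimal one-dimensional particle filter for fixed-target localization from several robots' range measurements might look like the following; the robot positions, noise levels and particle counts are illustrative assumptions, not the paper's configuration:

```python
import math
import random

def gauss_pdf(x, mu, sigma):
    """Gaussian likelihood of observing x given mean mu and std sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def particle_filter_1d(robot_positions, measured_ranges, sigma=0.5,
                       n_particles=2000, n_iters=20):
    """Estimate a fixed target position on a line from noisy range
    measurements taken by several cooperating robots."""
    particles = [random.uniform(0.0, 20.0) for _ in range(n_particles)]
    for _ in range(n_iters):
        # weight each particle by how well it explains every robot's range
        weights = [
            math.prod(gauss_pdf(abs(p - rp), rng, sigma)
                      for rp, rng in zip(robot_positions, measured_ranges))
            for p in particles
        ]
        # resample in proportion to the weights, then jitter to keep diversity
        particles = random.choices(particles, weights=weights, k=n_particles)
        particles = [p + random.gauss(0.0, 0.05) for p in particles]
    return sum(particles) / n_particles

random.seed(0)
robots = [0.0, 10.0, 15.0]   # known robot positions (hypothetical)
ranges = [7.1, 2.9, 8.05]    # noisy ranges to a target near 7.0
print(round(particle_filter_1d(robots, ranges), 1))
```

The example shows why a group helps: a single range measurement leaves the target position ambiguous on a line, while the combined likelihoods from three robots concentrate the particles at the one position consistent with all measurements.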
Abstract: Frequent patterns are patterns, such as sets of features or items, that appear in data frequently. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a dataset. Most of the proposed frequent pattern mining algorithms have been implemented with imperative programming languages such as C, C++, and Java. The imperative paradigm is significantly inefficient when the itemset is large and the frequent patterns are long. We suggest a high-level declarative style of programming using a functional language. Our supposition is that the problem of frequent pattern discovery can be efficiently and concisely implemented via a functional paradigm, since pattern matching is a fundamental feature supported by most functional languages. Our frequent pattern mining implementation in the Haskell language confirms our hypothesis about the conciseness of the program. Performance studies on speed and memory usage support our intuition about the efficiency of functional languages.
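The level-wise frequent itemset search at the heart of such miners can be sketched as below. The paper's implementation is in Haskell; this illustrative version uses Python in a functional style, and the example transactions and minimum-support threshold are assumptions:

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Level-wise (Apriori-style) search: an itemset is frequent if it is
    contained in at least `min_support` transactions; candidates of size
    k+1 are built as unions of frequent k-itemsets."""
    transactions = [frozenset(t) for t in transactions]
    candidates = list({frozenset([i]) for t in transactions for i in t})
    result, k = {}, 1
    while candidates:
        # count support of every candidate in one pass over the data
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        frequent = {c: n for c, n in counts.items() if n >= min_support}
        result.update(frequent)
        k += 1
        candidates = list({a | b for a, b in combinations(list(frequent), 2)
                           if len(a | b) == k})
    return result

tx = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}, {"a", "b", "c"}]
fi = frequent_itemsets(tx, min_support=3)
print(sorted("".join(sorted(s)) for s in fi))  # -> ['a', 'ab', 'ac', 'b', 'bc', 'c']
```

Note how the search prunes: {a, b, c} is generated as a candidate but discarded because it appears in only two transactions, so no larger supersets are ever examined. A Haskell version expresses the same levels even more directly through pattern matching and lazy list comprehensions.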