Abstract: Modern building automation must deal with very
different types of demands, depending on a building's use and the
people acting in it. To meet the requirements of situation awareness
in modern building automation, scenario recognition becomes more
and more important in order to detect sequences of events and to react
to them properly. We present two concepts of scenario recognition
and their implementation, one based on predefined templates and the
other applying an unsupervised learning algorithm using statistical
methods. Implemented applications will be described and their advantages
and disadvantages will be outlined.
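As an illustration of the template-based concept, a predefined scenario can be checked against an observed event stream as an in-order subsequence match. This is a minimal sketch under assumed event names, not the paper's implementation:

```python
# Template-based scenario recognition (sketch): a scenario template matches
# if its events occur in order within the observed event sequence.
def matches(template, events):
    it = iter(events)
    # `e in it` consumes the iterator, enforcing in-order matching.
    return all(e in it for e in template)

observed = ["door_open", "light_on", "motion", "light_off", "door_close"]
print(matches(["door_open", "motion", "door_close"], observed))  # True
print(matches(["motion", "door_open"], observed))                # False
```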
Abstract: Support Vector Machine (SVM) is a statistical
learning tool that was initially developed by Vapnik in 1979 and later
extended into the more general framework of structural risk
minimization (SRM). SVM is playing an increasing role in
detection problems in various engineering fields, notably in
statistical signal processing, pattern recognition, image analysis, and
communication systems. In this paper, SVM was applied to the
detection of SAR (synthetic aperture radar) images in the presence of
partially developed speckle noise. The simulation was done for
single-look and multi-look speckle models to give a complete overview
of, and insight into, the newly proposed SVM-based detector. The
structure of the SVM was derived and applied to real SAR images
and its performance in terms of the mean square error (MSE) metric
was calculated. We showed that the SVM-detected SAR images have
a very low MSE and are of good quality. The quality of the
processed speckled images improved for the multi-look model.
Furthermore, the contrast of the SVM detected images was higher
than that of the original non-noisy images, indicating that the SVM
approach increased the distance between the pixel reflectivity levels
(the detection hypotheses) in the original images.
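The detector described above is not reproduced here, but the general idea of SVM-based pixel detection under multiplicative (speckle-like) noise, evaluated with an MSE metric, can be sketched as follows. The data, noise model, and parameters are illustrative assumptions:

```python
# Hypothetical sketch of SVM-based detection of pixel reflectivity levels
# under multiplicative (speckle-like) noise, scored with MSE.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Two reflectivity levels (the detection hypotheses), corrupted by
# multiplicative gamma-distributed noise with unit mean.
levels = np.array([1.0, 4.0])
truth = rng.integers(0, 2, size=400)                       # hypothesis per pixel
observed = levels[truth] * rng.gamma(4.0, 0.25, size=400)  # speckled intensities

clf = SVC(kernel="rbf").fit(observed[:300, None], truth[:300])
pred = clf.predict(observed[300:, None])

# MSE between detected and true reflectivity levels on held-out pixels.
mse = float(np.mean((levels[pred] - levels[truth[300:]]) ** 2))
print(mse)
```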
Abstract: Use of the Internet and the World-Wide-Web
(WWW) has become widespread in recent years and mobile agent
technology has proliferated at an equally rapid rate. In this scenario,
load balancing becomes important for P2P systems. Moreover, P2P
systems can be highly heterogeneous, i.e., they may consist of peers
that range from old desktops to powerful servers connected to the
Internet through high-bandwidth lines. Various load balancing
policies have been proposed; a primitive one relies on the Message
Passing Interface (MPI). Its wide availability and portability make it
an attractive choice; however, the communication requirements are
sometimes inefficient when implementing the primitives provided by
MPI. In this scenario we use the concept of mobile agents, because
the mobile agent (MA) based approach offers high flexibility,
efficiency, low network traffic, and low communication latency, and
is highly asynchronous. In this study we present a decentralized
load balancing scheme using mobile agent technology in which,
when a node is overloaded, tasks migrate to less utilized nodes so
as to share the workload. The decision of which nodes receive the
migrating tasks is made in real time by defining certain load
balancing policies. These policies are executed on PMADE (a
Platform for Mobile Agent Distribution and Execution) in a
decentralized manner using JuxtaNet, and various load balancing
metrics are discussed.
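One simple load balancing policy of the kind described, in which an overloaded node selects the least-utilized known peer as a migration target, might look as follows. The thresholds and peer model are illustrative assumptions, not PMADE's actual policy:

```python
# Hypothetical threshold-based, decentralized migration policy: an overloaded
# node picks the least-utilized known peer as the migration target.
def pick_target(local_load, peers, high=0.8, low=0.5):
    """peers: dict of peer name -> utilization in [0, 1].
    Returns the peer to migrate a task to, or None if no migration happens."""
    if local_load < high or not peers:
        return None                       # this node is not overloaded
    name, load = min(peers.items(), key=lambda kv: kv[1])
    return name if load < low else None   # migrate only to an underloaded peer

print(pick_target(0.9, {"p1": 0.7, "p2": 0.3}))  # p2
print(pick_target(0.4, {"p1": 0.2}))             # None
```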
Abstract: Various solar energy technologies exist and they have
different application techniques in the generation of electrical power.
The widespread use of photovoltaic (PV) modules in such
technologies has been limited by relatively high costs and low
efficiencies. The efficiency of PV panels decreases as the operating
temperature increases, due to the combined effect of solar intensity and
ambient temperature. In this work, Computational Fluid Dynamics
(CFD) was used to model the heat transfer from a standard PV panel
and thus determine the rate of dissipation of heat. To accurately
model the specific climatic conditions of the United Arab Emirates
(UAE), a case study of a newly built green building in Dubai was
used. A finned heat pipe arrangement is proposed and analyzed to
determine the improved heat dissipation and thus improved
performance efficiency of the PV panel. A prototype of the
arrangement was built for experimental testing to validate the CFD
modeling and provide a proof of concept.
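The drop in PV efficiency with operating temperature that motivates the study is commonly captured by a standard first-order linear model. This sketch uses typical crystalline-silicon values, not measured UAE data:

```python
# First-order PV efficiency model (a standard textbook relation, not from
# the paper): eta(T) = eta_ref * (1 - beta * (T - T_ref)).
# beta ~ 0.004 per deg C is a typical crystalline-silicon coefficient.
def pv_efficiency(t_cell, eta_ref=0.15, beta=0.004, t_ref=25.0):
    """Electrical efficiency at cell temperature t_cell (deg C)."""
    return eta_ref * (1.0 - beta * (t_cell - t_ref))

# A panel at 65 deg C, plausible for UAE conditions, vs. the 25 deg C rating:
print(round(pv_efficiency(65.0), 3))  # 0.126
```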
Abstract: Tandem mass spectrometry (MS/MS) is the engine
driving high-throughput protein identification. Protein mixtures possibly
representing thousands of proteins from multiple species are
treated with proteolytic enzymes, cutting the proteins into smaller
peptides that are then analyzed generating MS/MS spectra. The
task of determining the identity of the peptide from its spectrum
is currently the weak point in the process. Current approaches to de
novo sequencing are able to compute candidate peptides efficiently.
The problem lies in the limitations of current scoring functions. In this
paper we introduce the concept of proteome signature. By examining
proteins and compiling proteome signatures (amino acid usage) it is
possible to characterize likely combinations of amino acids and better
distinguish between candidate peptides. Our results strongly support
the hypothesis that a scoring function that considers amino acid usage
patterns is better able to distinguish between candidate peptides. This
in turn leads to higher accuracy in peptide prediction.
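The proteome-signature idea, scoring candidate peptides by how well their amino acid composition matches proteome-wide usage frequencies, can be sketched as a log-likelihood score. The frequencies below are toy values, not a real proteome signature:

```python
# Hypothetical sketch: rank candidate peptides by how well their amino acid
# composition matches a proteome signature (per-residue usage frequencies).
import math

def score(peptide, signature):
    """Sum of log usage frequencies; higher means a more plausible peptide."""
    return sum(math.log(signature[aa]) for aa in peptide)

# Toy signature (assumed frequencies, not real proteome data).
sig = {"A": 0.4, "L": 0.3, "W": 0.05, "G": 0.25}
candidates = ["ALG", "WWA"]
best = max(candidates, key=lambda p: score(p, sig))
print(best)  # ALG: rare tryptophan (W) makes WWA less plausible
```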
Abstract: This research presents a system for post processing of
data that takes mined flat rules as input and discovers crisp as well as
fuzzy hierarchical structures using a Learning Classifier System
approach. A Learning Classifier System (LCS) is a machine
learning technique that combines evolutionary computing,
reinforcement learning, supervised or unsupervised learning, and
heuristics to produce adaptive systems. An LCS learns by interacting
with an environment from which it receives feedback in the form of
numerical reward. Learning is achieved by trying to maximize the
amount of reward received. A crisp description of a concept usually
cannot represent human knowledge completely and practically. In the
proposed Learning Classifier System, the initial population is constructed
as a random collection of HPR-trees (related production rules), and
crisp/fuzzy hierarchies are evolved. A fuzzy subsumption relation is
suggested for the proposed system and based on Subsumption Matrix
(SM), a suitable fitness function is proposed. Suitable genetic
operators are proposed for the chosen chromosome representation
method. To implement reinforcement, a suitable reward and
punishment scheme is also proposed. Experimental results are
presented to demonstrate the performance of the proposed system.
Abstract: In manufacturing industries, advances in measurement technology have increased the number of monitored variables, bringing the importance of multivariate control to the fore. Statistical process control (SPC) provides some of the most widely used multivariate control charts. Nevertheless, the applicability of SPC is restricted because it assumes that the data follow a specific distribution. Unfortunately, process data are often composed of a mixture of several processes and are hard to model with a single distribution. As an alternative to conventional SPC, nonparametric control charts have therefore come into the picture; their strength is that no parameter estimation is required. The control chart based on Support Vector Data Description (SVDD) is one such nonparametric chart, with the advantage of a flexible control boundary. However, the basic SVDD formulation overlooks an important characteristic of the data: its density distribution. We therefore propose DW-SVDD (Density Weighted SVDD) to address this weakness of conventional SVDD. DW-SVDD considers the density of the data by introducing the notion of a density weight. We extend the proposed SVDD into a control chart, and a simulation study on data from various distributions is conducted to demonstrate the improvement in performance.
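The density-weight idea can be sketched with a simple k-nearest-neighbor density estimate; the paper's exact weighting scheme and the SVDD optimization itself are not reproduced here:

```python
# Hypothetical sketch of the density-weight idea: weight each observation by
# its local density, estimated from k-nearest-neighbor distances.
import numpy as np

def density_weights(X, k=3):
    """Return weights in (0, 1] proportional to estimated local density."""
    X = np.asarray(X, dtype=float)
    d = np.abs(X[:, None, :] - X[None, :, :]).sum(axis=2)  # pairwise L1 dists
    knn = np.sort(d, axis=1)[:, 1:k + 1].mean(axis=1)      # mean k-NN distance
    dens = 1.0 / (knn + 1e-12)                             # inverse distance
    return dens / dens.max()                               # normalize to (0, 1]

X = [[0.0], [0.1], [0.2], [5.0]]  # one clear outlier in sparse territory
w = density_weights(X, k=2)
print(w.argmin())  # 3: the outlier receives the smallest weight
```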
Abstract: The changing economic climate has made global
manufacturing a growing reality over the last decade, forcing
companies from east and west and all over the world to
collaborate beyond geographic boundaries in the design,
manufacture, and assembly of products. The ISO10303 and
ISO14649 Standards (STEP and STEP-NC) have been
developed to introduce interoperability into manufacturing
enterprises so as to meet the challenge of responding to
production on demand. This paper describes and illustrates a
STEP compliant CAD/CAPP/CAM System for the manufacture
of rotational parts on CNC turning centers. The information
models to support the proposed system together with the data
models defined in the ISO14649 standard used to create the NC
programs are also described. A structured view of a STEP
compliant CAD/CAPP/CAM system framework supporting the
next generation of intelligent CNC controllers for turn/mill
component manufacture is provided. Finally, a proposed
computational environment for a STEP-NC compliant system
for turning operations (SCSTO) is described. SCSTO is the
experimental part of the research supported by the specification
of information models and constructed using a structured
methodology and object-oriented methods. SCSTO was
developed to generate a Part 21 file based on machining
features to support the interactive generation of process plans
utilizing feature extraction. A case study component has been
developed to prove the concept for using the milling and turning
parts of ISO14649 to provide a turn-mill CAD/CAPP/CAM
environment.
Abstract: This paper examines the role of telecommunications in the sustainable development of urban, rural and remote communities in the Northern Territory of Australia through the theoretical lens of Social Capital. Social Capital is a relatively new construct that is rapidly gaining interest among policy makers, politicians and researchers as a means to both describe and understand social and economic development. Increasingly, the concept of Social Capital, as opposed to traditional economic indicators, is seen as a more accurate measure of well-being. Whilst the essence of Social Capital is quality social relations, the concept intersects with telecommunications and Information Communications Technology (ICT) in a number of ways. The potential of ICT to disseminate information quickly, to reach vast numbers of people simultaneously and to include the previously excluded is immense. However, the exact nature of the relationship is not clearly defined. This paper examines the nexus between social relations of mutual benefit, telecommunications access and sustainable development. A mixed methodological approach was used to test the null hypothesis that no relationship exists between Social Capital and access to telecommunications services and facilities. Four communities, comprising two urban communities, a rural community and a remote Indigenous community in the Northern Territory of Australia, are the focus of this research.
Abstract: The aim of this paper is to introduce and study a new concept of strong double χ2 (M, A, Δ) sequences of fuzzy numbers and to examine some properties of the resulting sequence spaces of fuzzy numbers.
Abstract: This paper is an exploration of the conceptual
confusion between E-learning and M-learning particularly in Africa.
Section I provides a background to the development of E-learning
and M-learning. Section II focuses on the conceptual analysis as it
applies to Africa. With an investigative and expansive mindset, the
paper responds to the profound question of the suitability of these
concepts in a particular era in Africa. The aim of this
paper is therefore to shed light on which concept best suits the unique
situation of Africa in the era of cloud computing.
Abstract: Sustainable development is a concept that originated
with the Brundtland Commission, whose 1987 report popularized the
term. Although the concept was born out of environmental concerns,
it rapidly penetrated all areas, turning into a dominant view of
planning. Concern for future generations, especially when talking
about heritage, has a long history. Each approach, with all of its
characteristics, illustrates differences in planning; hence planning
always reflects the dominant idea of its age. This paper studies
sustainable development in planning for historical cities, with the
aim of finding ways to deal with heritage in planning for historical
cities in Iran. Through this, it will be illustrated how the tensions
between the sustainability concept and heritage can be resolved in planning.
Consequently, the paper emphasizes:
Sustainable development in city planning
Trends regarding heritage
Challenges due to planning for historical cities in Iran
For the first two issues, a documentary method covering the
sustainable development and heritage literature is used. As the
next step, focusing on Iranian historical cities requires considering
the urban planning and management structure and identifying the main
challenges related to heritage, so an analysis of the challenges
regarding heritage is carried out. As a result, it is illustrated that the
key issue in such planning is active conservation, which improves and
uses the potential of heritage while guaranteeing its continued
conservation. An examination of the planning system in Iran makes it
obvious that some reforms are needed in this system and in its way of
relating to heritage. The main weakness in planning for historical
cities in Iran is the lack of independent city management. Without
this factor, achieving active conservation, as the main factor of
sustainable development, would not be possible.
Abstract: This paper presents three new methodologies for the
basic operations, which aim at finding new ways of computing union
(maximum) and intersection (minimum) membership values by
taking into effect the entire membership values in a fuzzy set. The
new methodologies are conceptually simple and easy from the
application point of view and are illustrated with a variety of
problems such as the Cartesian product of two fuzzy sets, max-min
composition of two fuzzy sets in different product spaces, and an
application of an inverted pendulum to determine the impact of the
new methodologies. The results clearly indicate a difference based on
the nature of the fuzzy sets under consideration and hence will be
highly useful in quite a few applications where different values have
significant impact on the behavior of the system.
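For reference, the conventional max/min operations that the proposed methodologies aim to generalize can be sketched as follows (the new methodologies themselves are not reproduced here):

```python
# Baseline fuzzy set operations over membership values in [0, 1]:
# union takes the pointwise maximum, intersection the pointwise minimum.
def fuzzy_union(a, b):
    return {x: max(a.get(x, 0.0), b.get(x, 0.0)) for x in set(a) | set(b)}

def fuzzy_intersection(a, b):
    return {x: min(a.get(x, 0.0), b.get(x, 0.0)) for x in set(a) | set(b)}

A = {"x1": 0.2, "x2": 0.8}
B = {"x1": 0.5, "x2": 0.4}
print(sorted(fuzzy_union(A, B).items()))         # [('x1', 0.5), ('x2', 0.8)]
print(sorted(fuzzy_intersection(A, B).items()))  # [('x1', 0.2), ('x2', 0.4)]
```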
Abstract: Existing proceeding-models for the development of mechatronic systems provide a largely parallel action in the detailed development. This parallel approach is to take place also largely independent of one another in the various disciplines involved. An approach for a new proceeding-model provides a further development of existing models to use for the development of Adaptronic Systems. This approach is based on an intermediate integration and an abstract modeling of the adaptronic system. Based on this system-model a simulation of the global system behavior, due to external and internal factors or Forces is developed. For the intermediate integration a special data management system is used. According to the presented approach this data management system has a number of functions that are not part of the "normal" PDM functionality. Therefore a concept for a new data management system for the development of Adaptive system is presented in this paper. This concept divides the functions into six layers. In the first layer a system model is created, which divides the adaptronic system based on its components and the various technical disciplines. Moreover, the parameters and properties of the system are modeled and linked together with the requirements and the system model. The modeled parameters and properties result in a network which is analyzed in the second layer. From this analysis necessary adjustments to individual components for specific manipulation of the system behavior can be determined. The third layer contains an automatic abstract simulation of the system behavior. This simulation is a precursor for network analysis and serves as a filter. By the network analysis and simulation changes to system components are examined and necessary adjustments to other components are calculated. 
The other layers of the concept treat the automatic calculation of system reliability, the "normal" PDM-functionality and the integration of discipline-specific data into the system model. A prototypical implementation of an appropriate data management with the addition of an automatic system development is being implemented using the data management system ENOVIA SmarTeam V5 and the simulation system MATLAB.
Abstract: Wide applicability of concurrent programming
practices in developing various software applications leads to
different concurrency errors amongst which data race is the most
important. Java provides extensive support for concurrent
programming through its various concurrency packages. Aspect-oriented
programming (AOP) is a modern programming paradigm that
facilitates runtime interception of events of interest and can be
used effectively to handle concurrency problems. AspectJ, an
aspect-oriented extension to Java, facilitates the application of
AOP concepts for data race detection. Volatile variables are
usually considered thread safe, but they can become the possible
candidates of data races if non-atomic operations are performed
concurrently upon them. Various data race detection algorithms have
been proposed in the past but this issue of volatility and atomicity is
still unaddressed. The aim of this research is to propose
conditions for data race detection in Java programs at volatile
fields, taking into account the atomicity support in the Java
concurrency packages and making use of pointcuts. Two simple test
programs demonstrate the results of the research. The results are
verified on two different Java Development Kits (JDKs) for comparison.
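The core hazard, a non-atomic read-modify-write on a shared field, can be illustrated deterministically by interleaving by hand the steps two threads would take. This is a conceptual sketch, not the AspectJ-based detector itself:

```python
# Deterministic illustration of why a non-atomic read-modify-write on a
# shared, "volatile-like" field races: the two threads' steps are
# interleaved by hand to produce a lost update.
class Counter:
    def __init__(self):
        self.value = 0  # visible to all threads, but updates are not atomic

c = Counter()
t1_read = c.value      # thread 1 reads 0
t2_read = c.value      # thread 2 reads 0, before thread 1 writes back
c.value = t1_read + 1  # thread 1 writes 1
c.value = t2_read + 1  # thread 2 overwrites with 1: one increment is lost
print(c.value)         # 1, not the expected 2
```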
Abstract: Governments around the world are expending
considerable time and resources framing strategies and policies to
deliver energy security. The term 'energy security' has quietly
slipped into the energy lexicon without any meaningful discourse
about its meaning or assumptions. An examination of explicit and
inferred definitions finds that the concept is inherently slippery
because it is polysemic in nature having multiple dimensions and
taking on different specificities depending on the country (or
continent), timeframe or energy source to which it is applied. But
what does this mean for policymakers? Can traditional policy
approaches be used to address the problem of energy security or does
its polysemic qualities mean that it should be treated as a 'wicked'
problem? To answer this question, the paper assesses energy security
against nine commonly cited characteristics of wicked policy
problems and finds strong evidence of 'wickedness'.
Abstract: Medical Decision Support Systems (MDSSs) are sophisticated, intelligent systems that can perform inference under missing information and uncertainty. In such systems, various soft computing methods, such as Bayesian networks, rough sets, artificial neural networks, fuzzy logic, inductive logic programming and genetic algorithms, as well as hybrid methods formed from combinations of these, are used to model the uncertainty. In this study, symptom-disease relationships are presented in a framework modeled with formal concept analysis, with diseases as objects and symptoms as attributes. After a concept lattice is formed, Bayes' theorem can be used to determine the relationships between attributes and objects. A discernibility relation, which forms the basis of rough sets, can be applied to the attribute data sets in order to reduce attributes and decrease the computational complexity.
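The Bayes-theorem step mentioned above, relating an attribute (symptom) to an object (disease), can be sketched with illustrative probabilities (not taken from the paper):

```python
# Bayes' theorem for a symptom-disease relationship (illustrative numbers):
# P(disease | symptom) from the prior and the conditional probabilities.
def posterior(p_d, p_s_given_d, p_s_given_not_d):
    """p_d: prior P(disease); the others: P(symptom | disease) and
    P(symptom | no disease). Returns P(disease | symptom)."""
    p_s = p_s_given_d * p_d + p_s_given_not_d * (1.0 - p_d)  # total probability
    return p_s_given_d * p_d / p_s

# Rare disease (1%), sensitive symptom (90%), common false positive (10%):
print(round(posterior(0.01, 0.9, 0.1), 4))  # 0.0833
```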
Abstract: The six sigma method is a project-driven management approach to improving an organization's products, services, and processes by continually reducing defects. Understanding the key features, obstacles, and shortcomings of the six sigma method allows organizations to better support their strategic directions and their increasing needs for coaching, mentoring, and training. It also provides opportunities to better implement six sigma projects. The purpose of this paper is to survey the six sigma process and its impact on organizational productivity. Key concepts and the problem-solving process of six sigma are studied, along with important topics such as DMAIC, applied programmes for six sigma and productivity, and other advantages of six sigma. The paper closes with the research conclusions, which indicate a direct and positive relation between six sigma and productivity.
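A standard six sigma metric (not specific to this paper) that such surveys rely on is defects per million opportunities (DPMO):

```python
# Defects per million opportunities (DPMO), a standard six sigma metric.
def dpmo(defects, units, opportunities_per_unit):
    """Observed defects scaled to one million defect opportunities."""
    return defects * 1_000_000 / (units * opportunities_per_unit)

# 17 defects found in 1000 units, each with 5 defect opportunities:
print(dpmo(17, 1000, 5))  # 3400.0
```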
Abstract: Arbitrarily shaped video objects are an important
concept in modern video coding methods. The techniques presently
used are not based on image elements but rather video objects having
an arbitrary shape. In this paper, spatial shape error concealment
techniques to be used for object-based images in error-prone
environments are proposed. We consider a geometric shape
representation consisting of the object boundary, which can be
extracted from the α-plane. Three different approaches are used to
replace a missing boundary segment: Bézier interpolation, Bézier
approximation and NURBS approximation. Experimental results on
object shape with different concealment difficulty demonstrate the
performance of the proposed methods. Comparisons among the
proposed methods are also presented.
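The Bézier-interpolation approach to concealing a missing boundary segment can be sketched as follows; the endpoints and control points below are illustrative, whereas the paper derives them from the intact boundary extracted from the α-plane:

```python
# Hypothetical sketch: conceal a missing boundary segment with a cubic
# Bezier curve between the endpoints of the intact boundary.
def bezier_point(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)

# Known boundary endpoints of the lost segment, plus two assumed
# tangent-based control points.
p0, p3 = (0.0, 0.0), (3.0, 0.0)
p1, p2 = (1.0, 2.0), (2.0, 2.0)

segment = [bezier_point(p0, p1, p2, p3, i / 10) for i in range(11)]
print(segment[0], segment[-1])  # the known endpoints are interpolated exactly
```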
Abstract: Web applications have become very complex and
crucial, especially when combined with areas such as CRM
(Customer Relationship Management) and BPR (Business Process
Reengineering). The scientific community has therefore focused
attention on Web application design, development, analysis, and
testing by studying and proposing methodologies and tools. This paper
proposes an approach to automatic multi-dimensional concern
mining for Web Applications, based on concept analysis, impact
analysis, and token-based concern identification. This approach lets
the user analyse and traverse Web software relevant to a particular
concern (concept, goal, purpose, etc.) via multi-dimensional
separation of concerns, to document, understand and test Web
applications. This technique was developed in the context of WAAT
(Web Applications Analysis and Testing) project. A semi-automatic
tool to support this technique is currently under development.
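The token-based concern identification step can be sketched as a simple token-frequency ranking over source files. The file names and tokens are illustrative assumptions, not output of the WAAT tool:

```python
# Hypothetical sketch of token-based concern identification: rank Web
# application source files by occurrences of tokens tied to a concern.
def concern_relevance(files, tokens):
    """files: dict of file name -> source text.
    Returns (name, hit count) pairs sorted by descending relevance."""
    scores = {
        name: sum(text.lower().count(t.lower()) for t in tokens)
        for name, text in files.items()
    }
    return sorted(scores.items(), key=lambda kv: -kv[1])

files = {
    "cart.php": "function addToCart() { cart.total += price; }",
    "login.php": "function login(user) { session.start(); }",
}
ranked = concern_relevance(files, ["cart", "price"])
print(ranked[0][0])  # cart.php
```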