Abstract: A new fuzzy neural network technology for identifying the parameters of mathematical models of geofields is proposed and tested. The effectiveness of this soft computing technology is demonstrated, especially in the early stages of modeling, when information is uncertain and limited.
Abstract: Elastic boundary eigensolution problems are converted into boundary integral equations by potential theory. The kernels of the boundary integral equations contain both logarithmic and Hilbert singularities simultaneously. We present mechanical quadrature methods for solving eigensolutions of the boundary integral equations by dealing with both kinds of singularities at the same time. The methods possess high accuracy O(h^3) and low computational complexity. Their convergence and stability are proved based on Anselone's collectively compact theory. Based on the asymptotic error expansion with odd powers, we can greatly improve the accuracy of the approximation, and also derive an a posteriori error estimate which can be used for constructing self-adaptive algorithms. The efficiency of the algorithms is illustrated by numerical examples.
Abstract: This paper proposes a new form of cloud computing for individual computer users to share applications in distributed communities, called community-based personal cloud computing (CPCC). The paper also presents a prototype design and implementation of CPCC. The users of CPCC are able to share their computing applications with other users of the community. Any member of the community is able to execute remote applications shared by other members. The remote applications behave in the same way as their local counterparts, allowing the user to enter input and receive output, as well as providing access to the local data of the user. CPCC provides a peer-to-peer (P2P) environment where each peer provides applications which can be used by the other peers connected to CPCC.
Abstract: Electronic products that achieve high levels of integrated communications, computing and entertainment, with multimedia features in small, stylish and robust new form factors, are winning in the marketplace. Because of the high costs an industry may incur, and because high yield is directly proportional to high profits, IC (Integrated Circuit) manufacturers struggle to maximize yield; yet today's customers demand miniaturization, low costs, high performance and excellent reliability, making yield maximization a never-ending pursuit of an enhanced assembly process. With factors such as minimum tolerances and tighter parameter variations, a systematic approach is needed in order to predict the assembly process. To evaluate the quality of upcoming circuits, yield models are used which not only predict manufacturing costs but also provide vital information that eases the process of correction when yields fall below expectations. For an IC manufacturer to obtain higher assembly yields, all factors, such as boards, placement, components, the material from which the components are made, and processes, must be taken into consideration. Effective placement yield depends heavily on machine accuracy and on the vision system, which needs the ability to recognize the features on the board and component in order to place the device accurately on the pads and bumps of the PCB. There are currently two methods for accurate positioning: using the edge of the package and using solder ball locations, also called footprints. The only assumption that a yield model makes is that all boards and devices are completely functional. This paper focuses on the Monte Carlo method, a class of computational algorithms that depends on repeated random sampling in order to compute results. This method is utilized to simulate the placement and assembly processes within a production line.
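The Monte Carlo placement-yield idea can be sketched as follows. The Gaussian offset model, the tolerance value, and the function name are illustrative assumptions for this sketch, not the paper's actual process model:

```python
import random

def estimate_placement_yield(n_boards, sigma_x, sigma_y, tolerance, seed=0):
    """Monte Carlo yield estimate: a placement succeeds when the random
    (x, y) machine offset stays within the allowed radial tolerance."""
    rng = random.Random(seed)
    good = 0
    for _ in range(n_boards):
        dx = rng.gauss(0.0, sigma_x)   # simulated x placement error
        dy = rng.gauss(0.0, sigma_y)   # simulated y placement error
        if (dx * dx + dy * dy) ** 0.5 <= tolerance:
            good += 1
    return good / n_boards

# With a tolerance of 3 sigma in each axis, the estimated yield is high
y = estimate_placement_yield(100_000, sigma_x=0.01, sigma_y=0.01, tolerance=0.03)
```

Repeated random sampling of the placement error is exactly the "repeated random samplings" the abstract describes; a production model would draw from measured machine-accuracy distributions instead of an assumed Gaussian.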
Abstract: Recognizing behavioral patterns of financial markets is essential for traders. The Japanese candlestick chart is a common tool for visualizing and analyzing such patterns in an economic time series. Since the world was introduced to Japanese candlestick charting, traders have seen how combining this tool with intelligent technical approaches creates a powerful formula for savvy investors. This paper proposes a generalization of the Grassberger-Procaccia box counting method, based on computing the correlation dimension of Japanese candlesticks instead of the commonly used 'close' points. The results of applying this method to several foreign exchange rates vs. the IRR (Iranian Rial) satisfactorily show a lower chaotic dimension for the Japanese candlestick series than the regular Grassberger-Procaccia method applied merely to the close points of the same candles. This means there is some valuable information inside the candlesticks.
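The core Grassberger-Procaccia estimate can be sketched as below: the correlation sum C(r) counts the fraction of point pairs closer than r, and the dimension is the slope of log C(r) against log r. Feeding in (open, high, low, close) tuples instead of scalar close values corresponds to the generalization the abstract describes; the radii and the brute-force pair loop are illustrative assumptions:

```python
import math

def correlation_dimension(points, radii):
    """Grassberger-Procaccia sketch: slope of log C(r) vs log r, where
    C(r) is the fraction of point pairs with distance below r."""
    n = len(points)
    logs = []
    for r in radii:
        count = 0
        for i in range(n):
            for j in range(i + 1, n):
                if math.dist(points[i], points[j]) < r:
                    count += 1
        c = 2.0 * count / (n * (n - 1))
        if c > 0:
            logs.append((math.log(r), math.log(c)))
    # least-squares slope of log C(r) against log r
    mx = sum(x for x, _ in logs) / len(logs)
    my = sum(y for _, y in logs) / len(logs)
    num = sum((x - mx) * (y - my) for x, y in logs)
    den = sum((x - mx) ** 2 for x, _ in logs)
    return num / den
```

For points spread along a line the estimate comes out close to 1, as expected; for candlestick data each point would be a 4-tuple per candle.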
Abstract: Modern building automation needs to deal with very
different types of demands, depending on the use of a building and the
persons acting in it. To meet the requirements of situation awareness
in modern building automation, scenario recognition becomes more
and more important in order to detect sequences of events and to react
to them properly. We present two concepts of scenario recognition
and their implementation, one based on predefined templates and the
other applying an unsupervised learning algorithm using statistical
methods. Implemented applications will be described and their advantages
and disadvantages will be outlined.
Abstract: Detecting and recognizing the human body and extracting its measures (width and length) from images is a major issue in object detection and an important field in image, signal and vision computing in recent years. Finding people in images and extracting their features is a particularly important object recognition problem, because people can exhibit high variability in appearance. This variability may be due to the configuration of a person (e.g., standing vs. sitting vs. jogging), the pose (e.g., frontal vs. lateral view), clothing, and variations in illumination. In this study, the human body is first recognized in an image, and then its measures are extracted from the image.
Abstract: Online communities are an example of socially aware, self-organising, complex adaptive computing systems.
The multi-agent systems (MAS) paradigm coordinated by
self-organisation mechanisms has been used as an effective
way for the simulation and modeling of such systems. In this
paper, we propose a model for simulating an online health
community using a situated multi-agent system approach,
governed by the co-evolution of the social and spatial
organisations of the agents.
Abstract: This research presents a system for post processing of
data that takes mined flat rules as input and discovers crisp as well as
fuzzy hierarchical structures using Learning Classifier System
approach. A Learning Classifier System (LCS) is a machine learning technique that combines evolutionary computing, reinforcement learning, supervised or unsupervised learning, and heuristics to produce adaptive systems. An LCS learns by interacting
with an environment from which it receives feedback in the form of
numerical reward. Learning is achieved by trying to maximize the
amount of reward received. A crisp description of a concept usually cannot represent human knowledge completely and practically. In the proposed Learning Classifier System, the initial population is constructed
as a random collection of HPR–trees (related production rules) and
crisp / fuzzy hierarchies are evolved. A fuzzy subsumption relation is
suggested for the proposed system and based on Subsumption Matrix
(SM), a suitable fitness function is proposed. Suitable genetic
operators are proposed for the chosen chromosome representation
method. To implement reinforcement, a suitable reward and punishment scheme is also proposed. Experimental results are
presented to demonstrate the performance of the proposed system.
Abstract: In this paper, a real-time trajectory generation algorithm for computing 2-D optimal paths for autonomous aerial vehicles is discussed. A dynamic programming approach is adopted to compute the k-best paths by minimizing a cost function. Collision detection is implemented to detect intersection of the paths with obstacles. Our contribution is a novel approach to the problem of trajectory generation that is computationally efficient and offers considerable gains over existing techniques.
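A minimal sketch of the dynamic-programming cost recursion, under strong simplifying assumptions: a 2-D cost grid, right/down moves only, and obstacles marked as None (the paper computes the k-best paths for aerial vehicles; this shows only the single-best recursion):

```python
import math

def best_path_cost(grid):
    """DP pass over a 2-D cost grid: None cells are obstacles (collision);
    moves go right or down; returns the minimum accumulated cost from the
    top-left cell to the bottom-right cell."""
    rows, cols = len(grid), len(grid[0])
    INF = math.inf
    cost = [[INF] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            if grid[i][j] is None:      # path would intersect an obstacle
                continue
            if i == 0 and j == 0:
                cost[i][j] = grid[i][j]
                continue
            best_prev = min(cost[i - 1][j] if i > 0 else INF,
                            cost[i][j - 1] if j > 0 else INF)
            cost[i][j] = grid[i][j] + best_prev
    return cost[rows - 1][cols - 1]

grid = [[1, 1,    5],
        [1, None, 1],
        [1, 1,    1]]
# the obstacle at (1, 1) forces the optimal path around it
result = best_path_cost(grid)
```

Extending this to k-best paths would mean keeping the k smallest accumulated costs per cell instead of a single minimum.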
Abstract: Due to the dynamic nature of the Cloud, continuous monitoring of QoS requirements is necessary to manage the Cloud computing environment. The process of QoS monitoring and SLA violation detection consists of: collecting low- and high-level information pertinent to the service, analyzing the collected information, and taking corrective actions when SLA violations are detected. In this paper, we detail the architecture and the implementation of the first step of this process. More specifically, we propose an event-based approach to obtain run-time information about services developed as BPEL processes. By catching particular events (i.e., the low-level information), our approach recognizes the run-time execution path of a monitored service and uses the BPEL execution patterns to compute the QoS of the composite service (i.e., the high-level information).
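The idea of computing composite QoS from execution patterns can be sketched as a recursive aggregation: response times add up along a sequence and the maximum is taken over parallel (flow) branches. The dictionary event model and pattern names here are illustrative assumptions, not the paper's actual event format:

```python
def response_time(node):
    """Aggregate a composite response time from BPEL-style patterns:
    'invoke' contributes its measured duration, 'sequence' sums its
    children, 'flow' (parallel branches) takes the slowest child."""
    kind = node["pattern"]
    if kind == "invoke":
        return node["duration"]
    children = [response_time(c) for c in node["children"]]
    return sum(children) if kind == "sequence" else max(children)

proc = {"pattern": "sequence", "children": [
    {"pattern": "invoke", "duration": 2.0},
    {"pattern": "flow", "children": [
        {"pattern": "invoke", "duration": 3.0},
        {"pattern": "invoke", "duration": 5.0}]}]}
total = response_time(proc)
```

The durations would come from the caught low-level events; the aggregation rule per pattern is what turns them into the high-level QoS value.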
Abstract: This paper is an exploration of the conceptual
confusion between E-learning and M-learning particularly in Africa.
Section I provides a background to the development of E-learning
and M-learning. Section II focuses on the conceptual analysis as it
applies to Africa. It is with an investigative and expansive mind that this paper responds to the profound question of the suitability of these concepts in a particular era in Africa. The aim of this paper is therefore to shed light on which concept best suits the unique situation of Africa in the era of cloud computing.
Abstract: This paper presents three new methodologies for the
basic operations, which aim at finding new ways of computing union
(maximum) and intersection (minimum) membership values by
taking into account all of the membership values in a fuzzy set. The
new methodologies are conceptually simple and easy from the
application point of view and are illustrated with a variety of
problems such as the Cartesian product of two fuzzy sets, the max-min
composition of two fuzzy sets in different product spaces and an
application of an inverted pendulum to determine the impact of the
new methodologies. The results clearly indicate a difference based on
the nature of the fuzzy sets under consideration and hence will be
highly useful in quite a few applications where different values have
significant impact on the behavior of the system.
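As a baseline for the operations the abstract proposes to replace, the standard max-min composition of two fuzzy relations can be sketched as follows (the example relations are arbitrary illustrative values):

```python
def max_min_composition(R, S):
    """Standard max-min composition of fuzzy relations R (on X x Y) and
    S (on Y x Z): (R o S)(x, z) = max over y of min(R(x, y), S(y, z))."""
    rows, inner, cols = len(R), len(S), len(S[0])
    return [[max(min(R[i][k], S[k][j]) for k in range(inner))
             for j in range(cols)] for i in range(rows)]

R = [[0.2, 0.8],
     [0.6, 0.4]]
S = [[0.5, 0.9],
     [0.7, 0.3]]
T = max_min_composition(R, S)
```

The paper's new methodologies would replace the pointwise max and min here with operators that consult the entire membership function.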
Abstract: This paper uses p-tolerance with the lowest posterior loss, a quadratic loss function, the average length criterion, the average coverage criterion, and the worst outcome criterion for computing the sample size needed to estimate a proportion in the Binomial probability function
with Beta prior distribution. The proposed methodology is examined,
and its effectiveness is shown.
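The worst outcome criterion can be sketched with a simple proxy: under a Beta(a, b) prior, observing x successes in n trials gives a Beta(a + x, b + n - x) posterior, and we search for the smallest n whose worst-case posterior standard deviation meets a target. Using the posterior standard deviation in place of the paper's loss-based tolerance criteria is an illustrative assumption:

```python
def posterior_sd(a, b):
    """Standard deviation of a Beta(a, b) distribution."""
    m = a + b
    return (a * b / (m * m * (m + 1))) ** 0.5

def sample_size_worst_outcome(a, b, target_sd, n_max=10_000):
    """Smallest n such that, for every possible outcome x in 0..n, the
    Beta(a + x, b + n - x) posterior sd stays at or below target_sd
    (a worst outcome criterion with posterior sd as the loss proxy)."""
    for n in range(1, n_max + 1):
        worst = max(posterior_sd(a + x, b + n - x) for x in range(n + 1))
        if worst <= target_sd:
            return n
    return None

# Uniform Beta(1, 1) prior: the worst case sits near x = n / 2
n_needed = sample_size_worst_outcome(1, 1, 0.05)
```

The average coverage and average length criteria would instead average the chosen posterior quantity over the prior predictive distribution of x rather than taking the worst case.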
Abstract: With the development of communications and web-based technologies in recent years, e-Learning has become very important for everyone and is seen as one of the most dynamic teaching methods.
Grid computing is a pattern for increasing the computing power and storage capacity of a system, and is based on hardware and software resources in a network with a common purpose. In this article we study grid architecture and describe its different layers, analyzing the grid layered architecture. We then introduce a new architecture suitable for e-Learning, which is based on the grid network and which we therefore call the Grid Learning Architecture. The various sections and layers of the suggested architecture are analyzed, especially the grid middleware layer, which plays a key role. This layer is the heart of the grid learning architecture; in fact, without this layer, e-Learning based on grid architecture would not be feasible.
Abstract: Medical Decision Support Systems (MDSSs) are sophisticated, intelligent systems that can provide inference despite a lack of information and under uncertainty. In such systems, various soft computing methods are used to model the uncertainty, such as Bayesian networks, rough sets, artificial neural networks, fuzzy logic, inductive logic programming and genetic algorithms, as well as hybrid methods formed by combining several of these. In this study, symptom-disease relationships are presented in a framework modeled with formal concept analysis, with diseases as objects and symptoms as attributes. After a concept lattice is formed, Bayes' theorem can be used to determine the relationships between attributes and objects. A discernibility relation, which forms the basis of rough sets, can be applied to the attribute data sets in order to reduce the attributes and decrease the complexity of computation.
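The discernibility-based attribute reduction mentioned above can be sketched as follows. The toy decision table and the exhaustive subset search are illustrative assumptions; practical rough-set reducers work from a discernibility matrix with heuristics rather than brute force:

```python
from itertools import combinations

def is_reduct(table, decision, attrs):
    """An attribute subset preserves discernibility when any two objects
    with different decisions still differ on at least one kept attribute."""
    for i in range(len(table)):
        for j in range(i + 1, len(table)):
            if decision[i] != decision[j]:
                if all(table[i][a] == table[j][a] for a in attrs):
                    return False
    return True

def minimal_reducts(table, decision):
    """Enumerate the smallest attribute subsets preserving discernibility."""
    n_attrs = len(table[0])
    for k in range(1, n_attrs + 1):
        found = [set(c) for c in combinations(range(n_attrs), k)
                 if is_reduct(table, decision, c)]
        if found:
            return found
    return []

# Toy symptom table: rows are patients, columns are binary symptoms,
# decision is the diagnosis; symptom 0 alone discerns the two diagnoses
table = [[1, 0, 1], [1, 1, 0], [0, 1, 1], [0, 0, 0]]
decision = [1, 1, 0, 0]
reducts = minimal_reducts(table, decision)
```

Dropping the attributes outside a reduct is what reduces the complexity of the subsequent computation on the concept lattice.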
Abstract: It is widely acknowledged that there is a shortage of software developers, not only in South Africa, but also worldwide. Despite reports on a gap between industry needs and software education, the gap has mostly been explored in quantitative studies. This paper reports on the qualitative data of a mixed method study of the perceptions of professional software developers regarding what topics they learned from their formal education and the importance of these topics to their actual work. The analysis suggests that there is a gap between industry’s needs and software development education and the following recommendations are made: 1) Real-life projects must be included in students’ education; 2) Soft skills and business skills must be included in curricula; 3) Universities must keep the curriculum up to date; 4) Software development education must be made accessible to a diverse range of students.
Abstract: In recent decades, to supply the varied and different demands of clients, many manufacturers have tended to use the mixed-model assembly line (MMAL) in their production lines, since this policy makes it possible to assemble various and different models of equivalent goods on the same line with the MTO approach. In this article, we determine the sequence of the MMAL line, applying the kitting approach and planning rest times for general workers in order to reduce waste, increase worker effectiveness, and apply part of the lean production approach. This multi-objective sequencing problem is solved at small sizes with GAMS 22.2 and the PSO meta-heuristic on 10 test problems; comparing their results, we conclude that they are very similar. We then determine the important factors in computing the cost, whose improvement reduces that cost. Since this problem is NP-hard at large sizes, we use the particle swarm optimization (PSO) meta-heuristic to solve it. At large sizes, we define some test problems to survey its performance and determine the important factors in calculating the cost, so that by changing or improving them, production at minimum cost becomes possible.
Abstract: This paper proposes a novel game theoretical
technique to address the problem of data object replication in large-scale distributed computing systems. The proposed technique draws
inspiration from computational economic theory and employs the
extended Vickrey auction. Specifically, players in a non-cooperative
environment compete for server-side scarce memory space to
replicate data objects so as to minimize the total network object
transfer cost, while maintaining object concurrency. Optimization of
such a cost in turn leads to load balancing, fault-tolerance and
reduced user access time. The method is experimentally evaluated
against four well-known techniques from the literature: branch and
bound, greedy, bin-packing and genetic algorithms. The experimental
results reveal that the proposed approach outperforms the four
techniques in both the execution time and solution quality.
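The second-price rule at the heart of a Vickrey auction can be sketched as below; the bid dictionary and the "transfer-cost saving" interpretation are illustrative assumptions, and the paper's extended auction over replica placement with concurrency constraints is considerably richer:

```python
def vickrey_winner(bids):
    """Sealed-bid second-price (Vickrey) auction: the server bidding the
    highest saving wins the scarce memory slot but pays only the
    second-highest bid, making truthful bidding a dominant strategy."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0
    return winner, price

# Hypothetical bids: each server's claimed network-transfer-cost saving
# from hosting a replica of the object in its local memory
bids = {"server_a": 12.0, "server_b": 9.5, "server_c": 7.0}
outcome = vickrey_winner(bids)
```

It is this truthfulness property that lets non-cooperative players compete for replica slots while the mechanism still drives down the total object transfer cost.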
Abstract: The captured gel electrophoresis image represents the output of a DNA computing algorithm. Before this image is captured, DNA computing involves parallel overlap assembly (POA) and polymerase chain reaction (PCR), which are the core of this computing algorithm. However, the design of the DNA
oligonucleotides to represent a problem is quite complicated and is
prone to errors. In order to reduce these errors during the design stage
before the actual in-vitro experiment is carried out, simulation software capable of simulating the POA and PCR processes is developed. The capability of this simulation software is unlimited: problems of any size and complexity can be simulated, thus saving
cost due to possible errors during the design process. Information
regarding the DNA sequence during the computing process as well as
the computing output can be extracted at the same time using the
simulation software.
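A toy sketch of the overlap-assembly step such software simulates: fragments whose 3' end matches another fragment's 5' end are joined. The greedy single-strand model and the minimum overlap length are illustrative assumptions, far simpler than an actual POA/PCR simulation with complementary strands and thermal cycles:

```python
def overlap(a, b, min_len=3):
    """Length of the longest suffix of a that is a prefix of b,
    at least min_len bases long; 0 if none."""
    for k in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:k]):
            return k
    return 0

def parallel_overlap_assembly(oligos, min_len=3):
    """Greedy POA-style pass: repeatedly extend the growing strand by any
    unused oligo whose prefix overlaps the strand's current 3' end."""
    strand = oligos[0]
    used = {0}
    extended = True
    while extended:
        extended = False
        for i, o in enumerate(oligos):
            if i in used:
                continue
            k = overlap(strand, o, min_len)
            if k:
                strand += o[k:]    # append only the non-overlapping tail
                used.add(i)
                extended = True
    return strand

# Toy oligos designed with 3-base overlaps
fragments = ["ATGCGT", "CGTTAA", "TAAGGC"]
assembled = parallel_overlap_assembly(fragments)
```

Tracing which fragment extends the strand at each step is the kind of intermediate-sequence information the abstract says can be extracted during the simulated computation.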