Abstract: Many-core GPUs provide high computational throughput and
substantial memory bandwidth; however, optimizing irregular applications
such as SpMV on GPUs remains a difficult but worthwhile task. In this
paper, we propose a novel method to improve the performance of
SpMV on GPUs. A new storage format called HYB-R is proposed to
exploit the GPU architecture more efficiently. While creating the
HYB-R format, the COO portion of the matrix is recursively partitioned
into an ELL portion and a COO portion, so that as many non-zeros
as possible end up in ELL format. Since the partitioning strategy is
critical to the HYB-R kernel, we also tune the partitioning parameters
for higher performance. Experimental results show that our method
outperforms the fastest kernel (HYB) in NVIDIA's SpMV library,
with speedups of up to 17%.
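The abstract does not give the partitioning algorithm itself; purely as an illustration, here is a minimal Python sketch of the recursive ELL/COO peeling idea behind HYB-R. The function names, the fixed per-slice width `k`, and the recursion depth are our assumptions, not the paper's actual parameters:

```python
def split_ell_coo(rows, k):
    """Split per-row entry lists: the first k entries of each row form
    an ELL slice (a real GPU layout would pad short rows); the rest
    become a COO remainder of (row, entry) pairs."""
    ell = [r[:k] for r in rows]
    coo = [(i, e) for i, r in enumerate(rows) for e in r[k:]]
    return ell, coo

def hybr_partition(rows, k, depth):
    """Recursively peel ELL slices off the COO remainder, so that as
    many non-zeros as possible live in the regular ELL layout."""
    slices, coo = [], []
    for _ in range(depth):
        ell, coo = split_ell_coo(rows, k)
        slices.append(ell)
        if not coo:
            break
        # regroup the remainder by row for the next recursion level
        regrouped = [[] for _ in rows]
        for i, e in coo:
            regrouped[i].append(e)
        rows = regrouped
    return slices, coo
```

Each level stores up to `k` entries per row in a dense ELL slice; longer rows spill into a COO remainder that the next level peels again, which matches the abstract's goal of maximizing the non-zeros kept in ELL format.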
Abstract: EPA (Ethernet for Plant Automation) resolves the non-determinism of standard Ethernet and achieves real-time communication by means of a micro-segment topology and a deterministic scheduling mechanism. This paper studies the real-time performance of EPA periodic data transmission from theoretical and experimental perspectives. By analyzing information transmission characteristics and the EPA deterministic scheduling mechanism, five indicators that specify the real-time performance of EPA periodic data transmission are presented and investigated: delivery time, time synchronization accuracy, data-sending time offset accuracy, utilization percentage of the configured timeslice, and non-RTE bandwidth. On this basis, the test principles and test methods for these indicators are studied, and formulas for the real-time performance of an EPA system are derived. Furthermore, an experimental platform is developed to test the indicators of EPA periodic data transmission in a micro-segment. Based on the analysis and the experiment, methods to improve the real-time performance of EPA periodic data transmission are proposed, including optimizing the network structure, studying a self-adaptive timeslice adjustment method, and providing data-sending time offset accuracy for configuration.
Abstract: The main challenges of data-centric, open-source
projects are the large number of developers and changes to the core
framework. The Model-View-Controller (MVC) design pattern has significantly
improved the development and adjustment of complex projects.
Entity Framework, as the Model layer in an MVC architecture, has
simplified communication with the database. How often are these new
technologies used, and do they have the potential for designing
a more efficient Enterprise Resource Planning (ERP) system
better suited to accountants?
Abstract: Recently, many web services that provide public transport information have been developed and released, optimized for mobile devices such as smartphones. We are also developing a better path-planning system for route buses and trains called "Bus-Net" [1]. However, these systems only provide paths and related information before the user starts moving. We therefore propose context-aware navigation to change the way public transport users are supported. When traveling somewhere via several kinds of public transport, users must know how to use each of them. In addition, public transport is a dynamic system, and each mode has different characteristics, so real-time information is needed. We therefore propose a system that provides support based on the user's state, with a variety of ways to help public transport users in each state, such as turn-by-turn navigation. Context-aware navigation should reduce the anxiety of using public transport.
Abstract: The goal of data mining algorithms is to discover
useful information embedded in large databases. One of the most
important data mining problems is discovery of frequently occurring
patterns in sequential data. In a multidimensional sequence each
event depends on more than one dimension. The search space is quite
large and the serial algorithms are not scalable for very large
datasets. To address this, it is necessary to study scalable parallel
implementations of sequence mining algorithms.
In this paper, we present a model for multidimensional sequences
and describe a parallel algorithm based on data parallelism.
Simulation experiments show good load balancing and acceptable,
scalable speedup across different numbers of processors and problem
sizes, demonstrating that our approach works efficiently in a real
parallel computing environment.
Abstract: Economic factors are driving the rise of
infrastructures that provide software and computing facilities as a
service, known as cloud services or cloud computing. Cloud services
can provide efficiencies for application providers, both by limiting
up-front capital expenses, and by reducing the cost of ownership over
time. Such services are made available in a data center, using shared
commodity hardware for computation and storage. There is a varied
set of cloud services available today, including application services
(salesforce.com), storage services (Amazon S3), compute services
(Google App Engine, Amazon EC2) and data services (Amazon
SimpleDB, Microsoft SQL Server Data Services, Google's Datastore).
These services represent a variety of reformulations of data
management architectures, and more are on the horizon.
Abstract: This paper presents a new approach to intelligent agent communication based on an ontology for an agent community. The DARPA Agent Markup Language (DAML) is used to build the community ontology. The paper extends the agent management specification of the Foundation for Intelligent Physical Agents (FIPA) to develop an agent role called the community facilitator (CF), which manages the community directory and the community ontology. The CF helps build the agent community, so that a precise description of agent services in the community can be achieved, facilitating agent communication. Furthermore, through ontology updates, agents with different ontologies are capable of communicating with each other. An example of an advanced traveler information system is included to illustrate the practicality of this approach.
Abstract: Data Structures and Algorithms is a module in most
Computer Science or Information Technology curricula. It is one of
the modules most students identify as difficult. This paper
demonstrates how programming a solution to Sudoku can make
abstract concepts more concrete. The paper relates the concepts of a
typical Data Structures and Algorithms module to a step-by-step
solution to Sudoku that is human-oriented rather than
computer-oriented.
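As an illustration of how such a module's concepts map onto Sudoku, here is a sketch of one human-style technique ("naked singles") expressed with sets; the function names and the set-based formulation are ours, not necessarily the paper's:

```python
def candidates(grid, r, c):
    """Candidate digits for an empty cell, computed by set difference
    over the cell's row, column, and 3x3 box (a typical
    data-structures exercise with sets)."""
    used = set(grid[r]) | {grid[i][c] for i in range(9)}
    br, bc = 3 * (r // 3), 3 * (c // 3)
    used |= {grid[i][j] for i in range(br, br + 3) for j in range(bc, bc + 3)}
    return set(range(1, 10)) - used

def fill_naked_singles(grid):
    """One human-style pass, repeated to a fixed point: fill every
    empty cell (marked 0) that has exactly one candidate digit."""
    changed = True
    while changed:
        changed = False
        for r in range(9):
            for c in range(9):
                if grid[r][c] == 0:
                    cand = candidates(grid, r, c)
                    if len(cand) == 1:
                        grid[r][c] = cand.pop()
                        changed = True
    return grid
```

A full solver would add further human techniques (hidden singles, pairs) and eventually backtracking, each mapping onto a classic data structure or algorithm.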
Abstract: In an era of intense competition, understanding and satisfying
customers' requirements are critical tasks for a company
seeking to make a profit. Customer relationship management (CRM)
has thus become an important business issue. With the help of
data mining techniques, managers can explore and analyze
large quantities of data to discover meaningful patterns and
rules. Among these methods, the well-known association rule is the most
commonly used. This paper builds on the Apriori algorithm and
combines genetic algorithms with a data mining method to discover fuzzy
classification rules. The mined results can be applied in CRM to
help decision makers make correct business decisions for marketing
strategies.
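The genetic/fuzzy layer is not detailed in the abstract; as context, the Apriori baseline it builds on (level-wise candidate generation followed by support-based pruning) could be sketched in Python as follows. The function names and the representation of transactions as sets are our assumptions:

```python
def apriori(transactions, min_support):
    """Classical Apriori: mine all itemsets whose support count
    (number of transactions containing them) meets min_support."""
    items = {frozenset([i]) for t in transactions for i in t}
    # level 1: frequent single items
    level = {s for s in items
             if sum(s <= t for t in transactions) >= min_support}
    frequent, k = [], 1
    while level:
        frequent.extend(level)
        k += 1
        # candidate generation: join frequent (k-1)-itemsets
        cands = {a | b for a in level for b in level if len(a | b) == k}
        # pruning: keep only candidates with sufficient support
        level = {c for c in cands
                 if sum(c <= t for t in transactions) >= min_support}
    return frequent
```

The paper's contribution replaces the crisp rule extraction on top of this with genetic-algorithm search for fuzzy classification rules.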
Abstract: Several biometric methods based on eigenfaces are
presented for face detection, recognition, identification, and
verification. The aim of this research is to manage the critical
processing stages of face analysis (accuracy, speed, security, and
monitoring) while allowing flexible search and editing of a secure,
authorized database. In this paper we implement several techniques,
such as eigenface vector reduction using texture and shape vectors
to reduce complexity, while density matching scores with Face
Boundary Fixation (FBF) extract the most likely characteristics of
the media content. We examine the development and performance
efficiency of the database by applying our algorithms in both the
recognition and detection phases. Our results show encouraging
gains in accuracy and security compared with a number of previous
approaches.
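The abstract does not specify how the eigenfaces are computed; as a generic illustration only, the standard eigenface decomposition (PCA via SVD of mean-centered images) can be sketched as follows — the function names and array shapes are assumptions, not the paper's method:

```python
import numpy as np

def eigenfaces(images, k):
    """Top-k eigenfaces of a stack of flattened face images
    (shape: n_samples x n_pixels), via SVD of the centered data.
    Rows of vt are the principal directions ("eigenfaces")."""
    mean = images.mean(axis=0)
    centered = images - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:k]

def project(image, mean, faces):
    """Reduce one face to k coefficients in eigenface space,
    the vector-reduction step used before matching."""
    return faces @ (image - mean)
```

Matching then compares these k-dimensional coefficient vectors instead of full images, which is the "vector reduction for complexity removal" the abstract refers to.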
Abstract: The importance of hints in an intelligent tutoring system is well understood; the problems related to delivering them, however, are numerous. In this paper we propose that the delivery of hints be based on their usefulness. By this we mean that a hint is regarded as useful to a student if the student succeeds in solving a problem after the hint was suggested to her/him. Methods from the theory of partial orderings are then applied, facilitating an automated process of offering individualized advice on how to proceed in order to solve a particular problem.
Abstract: Instead of representing only individual cognition, population cognition is represented using artificial neural networks while maintaining individuality. This population network trains continuously, simulating adaptation. An implementation of two coexisting populations is compared with the Lotka-Volterra model of predator-prey interaction. Applications include multi-agent systems such as artificial life and computer games.
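For reference, the Lotka-Volterra model against which the coexisting populations are compared can be integrated with a simple forward-Euler step; the coefficients and step size below are illustrative only:

```python
def lotka_volterra(prey, pred, alpha, beta, delta, gamma, dt, steps):
    """Forward-Euler integration of the Lotka-Volterra
    predator-prey equations:
        dx/dt = alpha*x - beta*x*y   (prey)
        dy/dt = delta*x*y - gamma*y  (predator)
    Returns the trajectory as a list of (prey, pred) pairs."""
    traj = [(prey, pred)]
    for _ in range(steps):
        dx = (alpha * prey - beta * prey * pred) * dt
        dy = (delta * prey * pred - gamma * pred) * dt
        prey, pred = prey + dx, pred + dy
        traj.append((prey, pred))
    return traj
```

The coexistence equilibrium sits at prey = gamma/delta and pred = alpha/beta; away from it the two populations oscillate, which is the qualitative behavior the neural-network populations are compared against.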
Abstract: The decisions made by admission control algorithms are
based on the availability of network resources viz. bandwidth, energy,
memory buffers, etc., without degrading the Quality-of-Service (QoS)
requirements of already admitted applications. In this paper, we
present an energy-aware admission control (EAAC) scheme which
provides admission control for flows in an ad hoc network based
on the knowledge of the present and future residual energy of the
intermediate nodes along the routing path. The aim of EAAC is to
quantify the energy that the new flow will consume so that it can
be decided whether the future residual energy of the nodes along
the routing path can satisfy the energy requirement. In other words,
this energy-aware routing admits a new flow only if no node along the
routing path runs out of energy during the transmission
of packets. The future residual energy of a node is predicted using
a Multi-layer Neural Network (MNN) model. Simulation results
show that the proposed scheme increases the network lifetime. The
performance of the MNN model is also presented.
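The admission rule above — admit only if no node on the path is predicted to exhaust its energy — reduces to a simple per-node check; the names below are illustrative, and the predicted residual energies would come from the MNN model described in the abstract:

```python
def admit_flow(predicted_residual, required_energy):
    """EAAC admission rule (sketch): admit the new flow only if every
    intermediate node's predicted future residual energy covers the
    energy the flow will consume at that node."""
    return all(e_pred >= e_req
               for e_pred, e_req in zip(predicted_residual, required_energy))
```

The interesting part of the scheme is therefore the prediction: a per-node MNN forecasts `predicted_residual` from the node's energy history before the rule is applied.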
Abstract: Searching for a tertiary substructure that geometrically
matches the 3D pattern of the binding site of a well-studied protein provides a way to predict protein function. In our previous work,
a web server has been built to predict protein-ligand binding sites
based on automatically extracted templates. However, a drawback of such templates is that they are prone to producing many
false-positive matches. In this study, we present a sequence-order constraint to reduce the false-positive matches produced when automatically
extracted templates are used to predict protein-ligand binding sites. The binding-site predictor comprises (i) an automatically constructed template library and (ii) a local structure alignment algorithm for
querying the library. The sequence-order constraint is employed to
identify inconsistencies between the local regions of the query protein and the templates. Experimental results reveal that the sequence-order constraint greatly reduces false-positive matches and is effective for template-based binding-site prediction.
Abstract: The Work Breakdown Structure (WBS) is one of the
most vital planning processes of project management, since it
is considered the foundation of other processes such as
scheduling, controlling, and assigning responsibilities. In fact,
the WBS, or activity list, is the heart of a project, and the
omission of a single task can lead to an irrecoverable result.
Several tools exist for generating a project WBS. One of the most
powerful is mind mapping, which is the basis of this article.
A mind map is a method for thinking together and helps a project
manager stimulate the minds of project team members to
generate the project WBS. Here we generate the WBS of a
sample building-construction project with the aid of mind
mapping and an artificial intelligence (AI) programming
language. Since the mind-map structure cannot represent data in a
computer-processable way, we convert it to a semantic network that
the computer can use, and then extract the final WBS from the
semantic network using the Prolog programming language. This
method results in a comprehensive WBS and decreases the
probability of omitting project tasks.
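The paper performs the extraction in Prolog; purely for illustration, the same idea — walking a semantic network of task decompositions to emit a WBS outline — can be sketched in Python (the dictionary encoding of the network and the sample task names are our assumptions):

```python
def extract_wbs(network, root, indent=0):
    """Depth-first walk of a semantic network (task -> subtasks)
    producing a WBS outline as a list of indented lines."""
    lines = [" " * indent + root]
    for child in network.get(root, []):
        lines.extend(extract_wbs(network, child, indent + 2))
    return lines
```

In Prolog the same traversal would be a recursive predicate over `subtask/2` facts derived from the mind map.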
Abstract: A hybrid knowledge model is suggested as an underlying
framework for product development management. It can support
hybrid features such as ontologies and rules. Effective collaboration in
a product development environment depends on sharing and reasoning
about product information as well as engineering knowledge. Many studies
have considered product information and engineering knowledge;
however, most previous research has focused either on building
ontologies of product information or on rule-based systems of engineering
knowledge. This paper shows that an F-logic-based knowledge model can
support these desirable features in a hybrid way.
Abstract: This paper presents an application of wireless sensor networks for campus monitoring. With the help of a PIR sensor, a temperature sensor, and a humidity sensor, effective utilization of energy resources has been implemented in one of the rooms of Sharda University, Greater Noida, India. A RISC microcontroller is used to analyze the sensor outputs and to provide proper control using the ZigBee protocol. This wireless sensor module offers a substantial power-saving method for any campus.
Abstract: Wireless sensor networks (WSNs) consist of many sensor nodes placed in unattended environments, such as military sites, in order to collect important information. It is very important to implement a secure protocol that prevents the forwarding of forged data and the modification of aggregated data, while keeping the delay and the communication, computation, and storage overheads low. This paper presents a new protocol for concealed data aggregation (CDA). In this protocol, the network is divided into virtual cells, and the nodes within each cell produce a shared key for sending and receiving concealed data among themselves. Because data aggregation in each cell is local and a secure authentication mechanism is implemented, the data aggregation delay is very low, and malicious nodes cannot inject false data into the network. To evaluate the performance of the proposed protocol, we present computational models that demonstrate its performance and low overhead.
Abstract: Design is the primary phase in the realization of
a computer system. Several tools have been used to help developers
describe their software. These tools have been very successful in the
relational database domain, since they can generate the SQL script
modeling a database from an Entity/Association model. However,
as the computing field has evolved, relational databases have shown
their limits, and the object-relational model has come into increasing
use. Current design tools support neither all the new concepts
introduced by this model nor the syntax of the SQL3 language. In
this paper we propose a tool, called "NAVIGTOOLS", that assists in
the design and implementation of object-relational databases and
allows the user to generate the script modeling a database
in the SQL3 language. The tool is based on the Entity/Association
and navigational models for modeling object-relational databases.
Abstract: In many data mining applications, it is a priori known
that the target function should satisfy certain constraints imposed
by, for example, economic theory or a human decision maker. In this
paper we consider partially monotone prediction problems, where the
target variable depends monotonically on some of the input variables
but not on all. We propose a novel method to construct prediction
models in which monotone dependencies with respect to some of
the input variables are preserved by construction. Our
method belongs to the class of mixture models. The basic idea is to
convolute monotone neural networks with weight (kernel) functions
to make predictions. By using simulation and real case studies,
we demonstrate the application of our method. To obtain a sound
assessment of the performance of our approach, we use standard
neural networks with weight decay and partially monotone linear
models as benchmark methods for comparison. The results show that
our approach outperforms partially monotone linear models in terms
of accuracy. Furthermore, the incorporation of partial monotonicity
constraints not only leads to models that are in accordance with the
decision maker's expertise, but also reduces considerably the model
variance in comparison to standard neural networks with weight
decay.
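The construction described above — convex kernel weights over the non-monotone inputs gating monotone subnetworks — can be sketched as follows. The positive-weight trick (`exp` of raw weights) is one standard way to enforce monotonicity and is our assumption, not necessarily the paper's exact parameterization:

```python
import numpy as np

def monotone_mlp(x_mono, W1, b1, w2, b2):
    """One-hidden-layer network that is non-decreasing in every
    component of x_mono: exp(.) keeps all weights positive, and
    tanh is non-decreasing, so the composition is monotone."""
    h = np.tanh(x_mono @ np.exp(W1) + b1)
    return h @ np.exp(w2) + b2

def partially_monotone(x_mono, x_free, centers, widths, params):
    """Mixture of monotone subnetworks: Gaussian kernel weights over
    the non-monotone inputs x_free gate the subnetwork outputs. A
    convex combination of monotone functions, with weights that do
    not depend on x_mono, stays monotone in x_mono."""
    d2 = ((x_free[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    k = np.exp(-d2 / widths)
    k = k / k.sum(axis=1, keepdims=True)
    outs = np.stack([monotone_mlp(x_mono, *p) for p in params], axis=1)
    return (k * outs).sum(axis=1)
```

Because the kernel weights are non-negative and sum to one, increasing any monotone input can never decrease the output, regardless of the trained parameters — the "preserved by construction" property the abstract claims.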