An Energy-Aware Dispatch Scheme for WSNs

One of the key research issues in wireless sensor networks (WSNs) is how to deploy sensors efficiently to cover an area. In this paper, we present a Fishnet-Based Dispatch Scheme (FiBDS) with energy-aware mobility and an interest-based sensing angle. We propose two algorithms: a centralized FiBDS algorithm and a distributed FiBDS algorithm. The centralized algorithm is designed specifically for non-time-critical applications, commonly known as non-real-time applications, while the distributed algorithm is designed specifically for time-critical applications, commonly known as real-time applications. The proposed dispatch scheme works in a phase-selection manner: in each phase a specific constraint is dealt with according to its specified priority before moving on to the next phase, and at the end of each phase only the nodes best suited for that phase are retained. Simulation results are presented to verify the effectiveness of both algorithms.
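
As a rough illustration of the phase-selection idea, the hypothetical Python sketch below filters a candidate set phase by phase, keeping only the best-scoring nodes after each phase; the scoring functions, keep fraction and sensor attributes are illustrative assumptions, not the paper's actual criteria.

```python
# Hypothetical sketch of phase-selection dispatch: each phase scores
# the surviving candidates against one constraint (in priority order)
# and only the best-suited nodes move on to the next phase.
def phase_select(candidates, phases, keep_fraction=0.5):
    """Filter candidates phase by phase, keeping the top scorers."""
    for score in phases:                       # phases ordered by priority
        candidates = sorted(candidates, key=score, reverse=True)
        keep = max(1, int(len(candidates) * keep_fraction))
        candidates = candidates[:keep]
    return candidates

# Example: score by residual energy first, then by shortest move.
sensors = [{"id": 1, "energy": 0.9, "dist": 4.0},
           {"id": 2, "energy": 0.4, "dist": 1.0},
           {"id": 3, "energy": 0.8, "dist": 2.0}]
phases = [lambda s: s["energy"],   # phase 1: energy awareness
          lambda s: -s["dist"]]    # phase 2: minimal movement
print(phase_select(sensors, phases))
```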

Visual-Graphical Methods for Exploring Longitudinal Data

Longitudinal data typically exhibit changes over time, nonlinear growth patterns, between-subjects variability, and within-subject errors showing heteroscedasticity and dependence. Exploring such data is more complicated than exploring cross-sectional data. The purpose of this paper is to organize and integrate various visual-graphical techniques for exploring longitudinal data. By applying the proposed methods, investigators can answer research questions that include characterizing or describing growth patterns at both the group and individual level, identifying the time points where important changes occur, identifying unusual subjects, selecting suitable statistical models, and suggesting possible within-error variance structures.
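
As one example of the kind of visual-graphical technique the paper organizes, the sketch below draws a "spaghetti plot" of simulated individual trajectories with the group mean overlaid; the simulated growth model and error structure are purely illustrative.

```python
# Spaghetti plot: individual trajectories (grey) plus the group mean
# (black), a common first look at longitudinal data. Simulated data.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
time = np.arange(0, 6)                    # measurement occasions
n_subjects = 20
# nonlinear growth, between-subject variability, heteroscedastic error
curves = np.array([10 + rng.normal(0, 2) + 3 * np.sqrt(time)
                   + rng.normal(0, 0.5 + 0.2 * time)
                   for _ in range(n_subjects)])

for y in curves:                          # individual level
    plt.plot(time, y, color="grey", alpha=0.4)
plt.plot(time, curves.mean(axis=0), "k-", lw=2, label="group mean")
plt.xlabel("time"); plt.ylabel("response"); plt.legend()
plt.show()
```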

A Formal Suite of Object Relational Database Metrics

Object-Relational Databases (ORDBs) are more complex in nature than traditional relational databases because they combine object-oriented concepts with the relational features of conventional databases. The design of an ORDB demands an efficient, high-quality schema that accounts for structural, functional and componential traits. This internal quality of the schema is assured by metrics that measure the relevant attributes. This is extended to substantiate the understandability, usability and reliability of the schema, thus assuring its external quality. This work institutes a formalization of ORDB metrics: metric definition, evaluation methodology and calibration of the metrics. Three ORDB schemas were used to conduct the evaluation and formalization of the metrics. The metrics are calibrated using content- and criteria-related validity based on their measurability, consistency and reliability. Nominal and summative scales are derived from the evaluated metric values and standardized. Future work pertaining to ORDB metrics forms the concluding note.

Computer Verification in Cryptography

In this paper we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of cryptographic algorithms are of great interest. Besides some basic lemmata, we explore an implementation of a complex function used in cryptography. More precisely, we describe formal properties of this implementation that we prove by computer. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities), given in the formal language of the proof system Isabelle/HOL. Moreover, we give a computer proof of Bayes' formula. We also describe an application of the presented formalized probability distributions to cryptography. Furthermore, this paper shows that computer proofs of complex cryptographic functions are possible by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards computer verification of cryptographic primitives and form a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research, provided the corresponding basic mathematical knowledge is available in a database.
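
The Miller-Rabin test named above is a standard algorithm; for orientation, here is an unverified textbook implementation in Python (quite unlike the formally verified Isabelle/HOL development the paper describes).

```python
# Textbook Miller-Rabin: probabilistic primality test.
# "False" means definitely composite; "True" means probably prime.
import random

def miller_rabin(n, rounds=20):
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):      # small-prime pre-check
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:                   # write n - 1 = d * 2**s, d odd
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:                           # a witnesses compositeness
            return False
    return True

print(miller_rabin(2**61 - 1))          # True: a Mersenne prime
```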

Combining Variable Ordering Heuristics for Improving Search Algorithms Performance

Variable ordering heuristics are used in constraint satisfaction algorithms. The characteristics of various variable ordering heuristics are complementary, so we attempt to combine the advantages of several heuristics to improve the performance of search algorithms for solving constraint satisfaction problems. This paper considers combinations based on products and quotients, and then a new form of combination based on weighted sums of ratings from a set of base heuristics, some of which result in definite improvements in performance.
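
A minimal sketch of what a weighted-sum combination might look like, assuming each base heuristic returns a numeric rating that is normalized before the weighted sum is taken; the toy heuristics and weights below are illustrative, not the paper's.

```python
# Weighted-sum combination of base heuristics: every unassigned
# variable is rated by each heuristic, ratings are normalized to
# [0, 1], and the variable with the best combined score goes next.
def combined_order(variables, heuristics, weights):
    def normalize(scores):
        lo, hi = min(scores.values()), max(scores.values())
        span = (hi - lo) or 1.0
        return {v: (s - lo) / span for v, s in scores.items()}

    ratings = [normalize({v: h(v) for v in variables}) for h in heuristics]
    return max(variables,
               key=lambda v: sum(w * r[v] for w, r in zip(weights, ratings)))

# Toy base heuristics: smallest domain first, highest degree first.
doms = {"x": 2, "y": 5, "z": 3}
deg = {"x": 1, "y": 4, "z": 3}
h_dom = lambda v: -doms[v]              # prefer small domains
h_deg = lambda v: deg[v]                # prefer high degree
print(combined_order(list(doms), [h_dom, h_deg], weights=[0.6, 0.4]))
```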

Application of Computational Intelligence for Sensor Fault Detection and Isolation

The novel idea of this research is the application of a new fault detection and isolation (FDI) technique for the supervision of sensor networks in transportation systems. In measurement systems, it is necessary to detect all types of faults and failures based on a predefined algorithm. Recent advances in artificial neural networks (ANNs) have led to their use for FDI purposes. In this paper, the application of new probabilistic neural network features for data approximation and data classification is considered for plausibility checks in temperature measurement. For this purpose, a two-phase FDI mechanism is used for residual generation and evaluation.
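
A minimal sketch of the two-phase idea, with a moving median standing in for the paper's probabilistic-neural-network approximator: phase 1 generates a residual between the measurement and a prediction, phase 2 evaluates it against a threshold. Window size, threshold and readings are assumed values.

```python
# Two-phase FDI on a temperature stream: residual generation
# (measurement minus a moving-median prediction) and residual
# evaluation (threshold check).
def detect_faults(measurements, window=3, threshold=2.0):
    faults = []
    for i in range(window, len(measurements)):
        recent = sorted(measurements[i - window:i])
        predicted = recent[window // 2]           # phase 1: prediction
        residual = measurements[i] - predicted    # residual generation
        if abs(residual) > threshold:             # phase 2: evaluation
            faults.append((i, round(residual, 2)))
    return faults

readings = [20.1, 20.3, 20.2, 20.4, 27.9, 20.5, 20.3]  # spike at index 4
print(detect_faults(readings))                   # [(4, 7.6)]
```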

Mixtures of Monotone Networks for Prediction

In many data mining applications, it is known a priori that the target function should satisfy certain constraints imposed by, for example, economic theory or a human decision maker. In this paper we consider partially monotone prediction problems, where the target variable depends monotonically on some of the input variables but not on all. We propose a novel method to construct prediction models in which monotone dependencies with respect to some of the input variables are preserved by construction. Our method belongs to the class of mixture models. The basic idea is to convolute monotone neural networks with weight (kernel) functions to make predictions. By using simulation and real case studies, we demonstrate the application of our method. To obtain a sound assessment of the performance of our approach, we use standard neural networks with weight decay and partially monotone linear models as benchmark methods for comparison. The results show that our approach outperforms partially monotone linear models in terms of accuracy. Furthermore, the incorporation of partial monotonicity constraints not only leads to models that are in accordance with the decision maker's expertise, but also considerably reduces the model variance in comparison to standard neural networks with weight decay.
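
To illustrate how monotonicity can be guaranteed by construction, the sketch below builds a tiny one-input network whose weights are constrained to be non-negative, which (together with the increasing tanh activation) makes the output non-decreasing in the input; the architecture and sizes are illustrative, not the paper's exact mixture components.

```python
# Monotone-by-construction network: non-negative weights composed
# with an increasing activation yield a non-decreasing output.
import numpy as np

rng = np.random.default_rng(1)
W1 = np.abs(rng.normal(size=(4, 1)))   # non-negative hidden weights
b1 = rng.normal(size=4)
W2 = np.abs(rng.normal(size=4))        # non-negative output weights

def monotone_net(x):
    h = np.tanh(W1 @ np.atleast_1d(x) + b1)   # tanh is increasing
    return W2 @ h

xs = np.linspace(-2, 2, 5)
ys = [monotone_net(x) for x in xs]
assert all(a <= b for a, b in zip(ys, ys[1:]))  # non-decreasing in x
print(np.round(ys, 3))
```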

A New Approach for the Modeling and Implementation of Object-Relational Databases

Design is the primordial part of realizing a computer system. Several tools have been developed to help designers describe their software. These tools have enjoyed great success in the relational database domain, since they can generate an SQL script modeling the database from an Entity/Association model. However, with the evolution of the computing domain, relational databases have shown their limits and the object-relational model has come into ever wider use. Current design tools do not support all the new concepts introduced by this model or the syntax of the SQL3 language. In this paper we propose a tool, called "NAVIGTOOLS", that assists in the design and implementation of object-relational databases and allows the user to generate a script modeling the database in the SQL3 language. The tool is based on the Entity/Association and navigational models for modeling object-relational databases.

A Novel Approach for Campus Monitoring

This paper presents an application of wireless sensor networks to campus monitoring. With the help of a PIR sensor, a temperature sensor and a humidity sensor, effective utilization of energy resources has been implemented in one of the rooms of Sharda University, Greater Noida, India. A RISC microcontroller is used to analyze the sensor outputs and to provide proper control using the ZigBee protocol. This wireless sensor module provides a tremendous power-saving method for any campus.

The Hybrid Knowledge Model for Product Development Management

A hybrid knowledge model is suggested as an underlying framework for product development management. It can support hybrid features such as ontologies and rules. Effective collaboration in a product development environment depends on sharing and reasoning over product information as well as engineering knowledge. Many studies have considered product information and engineering knowledge; however, most previous research has focused either on building ontologies of product information or on rule-based systems of engineering knowledge. This paper shows that an F-logic-based knowledge model can support these desirable features in a hybrid way.

An Intelligent System Framework for Generating the Activity List of a Project Using a WBS Mind Map and Semantic Network

The Work Breakdown Structure (WBS) is one of the most vital planning processes in project management, since it is considered the foundation of other processes such as scheduling, controlling and assigning responsibilities. In fact, the WBS, or activity list, is the heart of a project, and the omission of a single task can lead to irrecoverable results. Several tools exist for generating a project WBS. One of the most powerful is mind mapping, which is the basis of this article. Mind mapping is a method for thinking together and helps a project manager stimulate the minds of project team members to generate the project WBS. Here we generate the WBS of a sample building-construction project with the aid of mind maps and an artificial intelligence (AI) programming language. Since the mind map structure cannot represent data in a computer-processable way, we convert it to a semantic network that can be used by the computer, and then extract the final WBS from the semantic network using the Prolog programming language. This method results in a comprehensive WBS and decreases the probability of omitting project tasks.
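
The paper extracts the WBS from the semantic network with Prolog; the sketch below shows the same traversal idea in Python, over a hypothetical part_of relation for a building-construction project.

```python
# Extracting an activity list from a semantic network: walk the
# part_of edges depth-first from the project root and print an
# indented WBS. The edges are illustrative examples only.
part_of = [
    ("foundation", "building"),
    ("excavation", "foundation"),
    ("concrete pouring", "foundation"),
    ("structure", "building"),
    ("framing", "structure"),
]

def activity_list(node, depth=0):
    """Print node, then recurse into everything that is part_of it."""
    print("  " * depth + node)
    for child, parent in part_of:
        if parent == node:
            activity_list(child, depth + 1)

activity_list("building")
```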

Introducing Sequence-Order Constraint into Prediction of Protein Binding Sites with Automatically Extracted Templates

Searching for a tertiary substructure that geometrically matches the 3D pattern of the binding site of a well-studied protein provides a way to predict protein function. In our previous work, a web server was built to predict protein-ligand binding sites based on automatically extracted templates. However, a drawback of such templates is that the web server was prone to producing many false positive matches. In this study, we present a sequence-order constraint to reduce the false positive matches that arise when automatically extracted templates are used to predict protein-ligand binding sites. The binding site predictor comprises i) an automatically constructed template library and ii) a local structure alignment algorithm for querying the library. The sequence-order constraint is employed to identify inconsistencies between local regions of the query protein and the templates. Experimental results reveal that the sequence-order constraint can largely reduce the false positive matches and is effective for template-based binding site prediction.
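
A minimal sketch of what such a sequence-order check might look like, assuming a structural match is given as residue-index pairs: after sorting by query index, the template indices must also be strictly increasing. The residue numbers are illustrative.

```python
# Sequence-order constraint on a structural match: the paired
# residues must appear in the same sequence order on both sides.
def satisfies_sequence_order(pairs):
    """pairs: list of (query_residue_index, template_residue_index)."""
    pairs = sorted(pairs)                       # sort by query index
    template = [t for _, t in pairs]
    return all(a < b for a, b in zip(template, template[1:]))

print(satisfies_sequence_order([(12, 5), (30, 17), (44, 23)]))   # True
print(satisfies_sequence_order([(12, 17), (30, 5), (44, 23)]))   # False
```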

EAAC: Energy-Aware Admission Control Scheme for Ad Hoc Networks

The decisions made by admission control algorithms are based on the availability of network resources, viz. bandwidth, energy, memory buffers, etc., without degrading the Quality-of-Service (QoS) requirements of the applications already admitted. In this paper, we present an energy-aware admission control (EAAC) scheme which provides admission control for flows in an ad hoc network based on knowledge of the present and future residual energy of the intermediate nodes along the routing path. The aim of EAAC is to quantify the energy that a new flow will consume, so that it can be decided whether the future residual energy of the nodes along the routing path can satisfy the energy requirement. In other words, this energy-aware routing admits a new flow only if no node on the routing path runs out of energy during the transmission of packets. The future residual energy of a node is predicted using a Multi-layer Neural Network (MNN) model. Simulation results show that the proposed scheme increases the network lifetime. The performance of the MNN model is also presented.
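
A hedged sketch of the admission decision, with a simple linear extrapolation standing in for the paper's MNN predictor: the flow is admitted only if every node's predicted future residual energy covers the flow's energy demand. The histories, horizon and energy figures are illustrative.

```python
# Energy-aware admission check: predict each node's residual energy
# at the flow's end and admit only if all nodes can sustain it.
def predict_residual(history, horizon):
    """Linear stand-in for the MNN predictor (illustrative only)."""
    drain = (history[0] - history[-1]) / (len(history) - 1)  # per step
    return history[-1] - drain * horizon

def admit_flow(path_histories, flow_energy, horizon):
    return all(predict_residual(h, horizon) >= flow_energy
               for h in path_histories)

# Energy histories (J) of three intermediate nodes on the path.
histories = [[95, 92, 90], [80, 70, 61], [99, 98, 97]]
print(admit_flow(histories, flow_energy=30.0, horizon=10))  # False:
# the second node drains too fast to carry the flow to completion.
```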

Representing Collective Unconsciousness Using Neural Networks

Instead of representing only individual cognition, population cognition is represented using artificial neural networks whilst maintaining individuality. This population network trains continuously, simulating adaptation. An implementation of two coexisting populations is compared to the Lotka-Volterra model of predator-prey interaction. Applications include multi-agent systems such as artificial life or computer games.
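
For reference, a minimal Euler-integration sketch of the Lotka-Volterra predator-prey model that the populations are compared against; the parameter values are illustrative.

```python
# Lotka-Volterra predator-prey dynamics, integrated with Euler steps:
# prey grows at rate alpha and is eaten at rate beta; predators grow
# from prey at rate delta and die at rate gamma.
def lotka_volterra(prey, pred, steps=1000, dt=0.01,
                   alpha=1.0, beta=0.1, delta=0.075, gamma=1.5):
    trajectory = [(prey, pred)]
    for _ in range(steps):
        d_prey = alpha * prey - beta * prey * pred
        d_pred = delta * prey * pred - gamma * pred
        prey += d_prey * dt
        pred += d_pred * dt
        trajectory.append((prey, pred))
    return trajectory

traj = lotka_volterra(prey=10.0, pred=5.0)
print([(round(x, 1), round(y, 1)) for x, y in traj[::250]])
```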

Automated Ranking of Hints

The importance of hints in an intelligent tutoring system is well understood. However, there are quite a few problems related to their delivery. In this paper we propose that the delivery of hints be based on their usefulness. By this we mean that a hint is regarded as useful to a student if the student succeeded in solving a problem after the hint was suggested to her/him. Methods from the theory of partial orderings are then applied, facilitating an automated process of offering individualized advice on how to proceed in order to solve a particular problem.
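
A small sketch of the usefulness criterion and the partial order it induces, under the assumption that usefulness is recorded per student: hint A dominates hint B if A was at least as useful for every student and strictly more useful for at least one. The data is illustrative.

```python
# Partial order on hints by usefulness: a hint is useful to a student
# if the problem was solved after the hint was shown.
outcomes = {                   # hint -> {student: solved afterwards?}
    "hint_a": {"s1": True,  "s2": True,  "s3": False},
    "hint_b": {"s1": True,  "s2": False, "s3": False},
}

def dominates(a, b):
    """True if hint a is Pareto-better than hint b across students."""
    ge = all(outcomes[a][s] >= outcomes[b][s] for s in outcomes[a])
    gt = any(outcomes[a][s] > outcomes[b][s] for s in outcomes[a])
    return ge and gt

print(dominates("hint_a", "hint_b"))   # True: rank hint_a higher
```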

Research on Fuzzy Classification Rules Applied to CRM

In an era of great competition, understanding and satisfying customers' requirements are critical tasks for a company that wants to make a profit. Customer relationship management (CRM) has thus become an important business issue. With the help of data mining techniques, a manager can explore and analyze large quantities of data to discover meaningful patterns and rules. Among these methods, the well-known association rule is the most commonly used. This paper builds on the Apriori algorithm and uses genetic algorithms combined with a data mining method to discover fuzzy classification rules. The mined results can be applied in CRM to help decision makers make correct business decisions about marketing strategies.
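
As a rough illustration of the fuzzy-rule side, the sketch below fuzzifies an attribute with a triangular membership function and computes the fuzzy support and confidence of one candidate rule; the membership parameters are exactly the kind of thing a genetic algorithm could evolve, and all names and numbers are assumptions.

```python
# Fuzzy classification rule, Apriori-style: fuzzify the antecedent
# attribute, then compute fuzzy support and confidence of the rule.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Rule: IF spending is HIGH THEN customer is LOYAL.
data = [(850, 1), (920, 1), (300, 0), (780, 1), (200, 0)]  # (spend, loyal)
high = lambda s: tri(s, 500, 1000, 1500)   # GA could tune (a, b, c)

support = sum(high(s) for s, y in data if y == 1) / len(data)
confidence = (sum(high(s) for s, y in data if y == 1)
              / max(sum(high(s) for s, _ in data), 1e-9))
print(f"fuzzy support={support:.2f}, confidence={confidence:.2f}")
```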

Making Data Structures and Algorithms more Understandable by Programming Sudoku the Human Way

Data Structures and Algorithms is a module in most Computer Science or Information Technology curricula, and one of the modules most students identify as difficult. This paper demonstrates how programming a solution for Sudoku can make abstract concepts more concrete. The paper relates the concepts of a typical Data Structures and Algorithms module to a step-by-step solution for Sudoku in a human-oriented, as opposed to computer-oriented, style.
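
As a taste of the human-style approach, here is a sketch of one classic human technique, the "naked single": if a cell has exactly one remaining candidate, place it. The grid representation is an assumption made for illustration.

```python
# Human-style Sudoku step: compute each empty cell's candidates from
# its row, column and 3x3 box, and fill cells with a single candidate.
def candidates(grid, r, c):
    used = {grid[r, j] for j in range(9)} | {grid[i, c] for i in range(9)}
    br, bc = 3 * (r // 3), 3 * (c // 3)
    used |= {grid[i, j] for i in range(br, br + 3) for j in range(bc, bc + 3)}
    return {d for d in range(1, 10)} - used

def apply_naked_singles(grid):
    changed = True
    while changed:                        # repeat until no progress
        changed = False
        for r in range(9):
            for c in range(9):
                if grid[r, c] == 0:
                    cands = candidates(grid, r, c)
                    if len(cands) == 1:
                        grid[r, c] = cands.pop()
                        changed = True
    return grid

puzzle = ("530070000600195000098000060"
          "800060003400803001700020006"
          "060000280000419005000080079")
grid = {(i // 9, i % 9): int(ch) for i, ch in enumerate(puzzle)}
apply_naked_singles(grid)
print(sum(1 for v in grid.values() if v == 0), "cells still empty")
```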

Cloud Computing Databases: Latest Trends and Architectural Concepts

Economic factors are leading to the rise of infrastructures that provide software and computing facilities as a service, known as cloud services or cloud computing. Cloud services can provide efficiencies for application providers, both by limiting up-front capital expenses and by reducing the cost of ownership over time. Such services are made available in a data center, using shared commodity hardware for computation and storage. A varied set of cloud services is available today, including application services (salesforce.com), storage services (Amazon S3), compute services (Google App Engine, Amazon EC2) and data services (Amazon SimpleDB, Microsoft SQL Server Data Services, Google's Datastore). These services represent a variety of reworkings of data management architectures, and more are on the horizon.

Application and Limitation of Parallel Modeling in Multidimensional Sequential Patterns

The goal of data mining algorithms is to discover useful information embedded in large databases. One of the most important data mining problems is the discovery of frequently occurring patterns in sequential data. In a multidimensional sequence, each event depends on more than one dimension. The search space is quite large, and serial algorithms do not scale to very large datasets. To address this, it is necessary to study scalable parallel implementations of sequence mining algorithms. In this paper, we present a model for multidimensional sequences and describe a parallel algorithm based on data parallelism. Simulation experiments show good load balancing and scalable, acceptable speedup across different numbers of processors and problem sizes, and demonstrate that our approach can work efficiently in a real parallel computing environment.
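
A minimal sketch of the data-parallel strategy, assuming support counting of candidate subsequence patterns: the sequence database is partitioned across worker processes, each counts local support, and the partial counts are merged. The candidates and data are illustrative.

```python
# Data parallelism for sequence mining: partition the database,
# count candidate-pattern support locally per worker, merge counts.
from collections import Counter
from multiprocessing import Pool

CANDIDATES = [("a", "b"), ("b", "c")]

def contains(seq, pattern):
    """True if pattern occurs in seq as a (possibly gapped) subsequence."""
    it = iter(seq)
    return all(item in it for item in pattern)

def local_counts(partition):
    return Counter({p: sum(contains(s, p) for s in partition)
                    for p in CANDIDATES})

if __name__ == "__main__":
    db = [["a", "b", "c"], ["a", "c"], ["b", "c"], ["a", "b"]]
    parts = [db[:2], db[2:]]                 # one partition per worker
    with Pool(2) as pool:
        total = sum(pool.map(local_counts, parts), Counter())
    print(total)                             # global support counts
```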

Software Technology Behind Computer Accounting

The main problems of data-centric and open-source projects are the large number of developers and changes to the core framework. The Model-View-Controller (MVC) design pattern has significantly improved the development and adjustment of complex projects. The Entity Framework, as the Model layer in an MVC architecture, has simplified communication with the database. How often are these new technologies used, and do they have the potential to yield a more efficient Enterprise Resource Planning (ERP) system that is better suited to accountants?