Resource-Constrained Mobile Agent Framework for Ambient Intelligence

In this paper, we introduce a mobile agent framework with proactive load balancing for ambient intelligence (AmI) environments. One of the main obstacles in AmI is scalability: the openness of an AmI environment imposes dynamic resource requirements on agencies. To mitigate this scalability problem, our framework includes a load balancing module that proactively analyzes the resource consumption of network bandwidth and of the preferred agencies in order to suggest the optimal communication method to its user. The framework formulates an AmI environment as consisting of three main components: (1) mobile devices, (2) hosts or agencies, and (3) a directory service center (DSC). A preliminary implementation was carried out in NetLogo, and the experimental results show that the proposed approach improves system performance by minimizing network utilization, thereby providing users with responsive services.
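
The decision logic such a load balancing module might implement can be illustrated with a simple byte-count trade-off between remote messaging and agent migration. The sketch below is only an assumption about the kind of cost comparison involved; the function names and the cost model are illustrative and not taken from the framework itself.

```python
# Illustrative sketch (not the framework's implementation): compare the
# network cost of remote messaging against migrating the agent to the
# preferred agency, and suggest the cheaper communication method.

def suggest_communication_method(agent_code_size, result_size,
                                 num_interactions, message_size):
    """All sizes in bytes; the cost model simply counts transferred bytes."""
    remote_messaging_cost = num_interactions * 2 * message_size  # request + reply per interaction
    migration_cost = agent_code_size + result_size               # move the agent, ship the result back

    if migration_cost < remote_messaging_cost:
        return "migrate agent to preferred agency"
    return "use remote messaging from the mobile device"

if __name__ == "__main__":
    # A chatty task (many small interactions) favors migration.
    print(suggest_communication_method(agent_code_size=20_000, result_size=2_000,
                                       num_interactions=500, message_size=256))
```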

A Tabu Search Heuristic for Scratch-Pad Memory Management

Reducing the energy consumption of embedded systems requires careful memory management. It has been shown that Scratch-Pad Memories (SPMs) are small, low-cost, energy-efficient memories managed directly at the software level. In this paper, the focus is on heuristic methods for SPM management. A method is efficient if the number of accesses served by the SPM is as large as possible and if all available space (i.e. bits) is used. A Tabu Search (TS) approach for memory management is proposed which is, to the best of our knowledge, an original alternative to the best known existing heuristic (BEH). Experiments performed on benchmarks show that the Tabu Search method is as efficient as BEH (in terms of energy consumption), but BEH requires a sorting step that can be computationally expensive for large amounts of data. TS is easy to implement and, unlike BEH, needs no sorting, so the corresponding sorting time is saved. In addition, in a dynamic setting where the maximum capacity of the SPM is not known in advance, the TS heuristic will perform better than BEH.
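
To make the optimization problem concrete, SPM allocation can be viewed as a knapsack-like selection of data blocks that maximizes the accesses served by the SPM without exceeding its capacity. The sketch below is a minimal, generic Tabu Search over this formulation, assuming a flip-one-block neighborhood and a fixed tabu tenure; it is not the paper's implementation, nor the BEH heuristic.

```python
# Hedged sketch of a Tabu Search for SPM allocation modeled as a knapsack:
# choose which data blocks go into the SPM so that the accesses served by
# the SPM are maximized without exceeding its capacity. Moves, tenure and
# the instance data are illustrative assumptions.
import random

def tabu_search_spm(sizes, accesses, capacity, iterations=200, tenure=7):
    n = len(sizes)
    current = [False] * n                    # is block i placed in the SPM?
    best, best_value = list(current), 0
    tabu_until = [0] * n                     # iteration until which flipping block i is tabu

    def value(sol):
        if sum(s for s, chosen in zip(sizes, sol) if chosen) > capacity:
            return -1                        # infeasible placement
        return sum(a for a, chosen in zip(accesses, sol) if chosen)

    for it in range(1, iterations + 1):
        candidates = []
        for i in range(n):                   # neighborhood: flip one block in or out
            neighbor = list(current)
            neighbor[i] = not neighbor[i]
            v = value(neighbor)
            aspiration = v > best_value      # tabu override when the move beats the best
            if v >= 0 and (tabu_until[i] < it or aspiration):
                candidates.append((v, i, neighbor))
        if not candidates:
            continue
        v, i, current = max(candidates)      # best admissible neighbor
        tabu_until[i] = it + tenure          # forbid flipping block i back for a while
        if v > best_value:
            best, best_value = list(current), v
    return best, best_value

if __name__ == "__main__":
    rng = random.Random(1)
    sizes = [rng.randint(1, 64) for _ in range(30)]        # block sizes (bytes)
    accesses = [rng.randint(1, 1000) for _ in range(30)]   # access counts per block
    placement, served = tabu_search_spm(sizes, accesses, capacity=256)
    print("accesses served from the SPM:", served)
```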

Pharmaceutical Microencapsulation Technology for Development of Controlled Release Drug Delivery Systems

This article describes the development of a controlled release system for the NSAID drug diclofenac sodium, employing different ratios of ethyl cellulose. Diclofenac sodium and ethyl cellulose in different proportions were processed by microencapsulation based on a phase separation technique to formulate microcapsules. The prepared microcapsules were then compressed into tablets to obtain controlled release oral formulations. In-vitro evaluation was performed by a dissolution test of each preparation, conducted in 900 ml of phosphate buffer solution (pH 7.2) maintained at 37 ± 0.5 °C and stirred at 50 rpm. Samples were collected at predetermined time intervals (0, 0.5, 1.0, 1.5, 2, 3, 4, 6, 8, 10, 12, 16, 20 and 24 h), and the drug concentration in the collected samples was determined by UV spectrophotometry at 276 nm. The physical characteristics of the diclofenac sodium microcapsules were within the accepted range: they were off-white, free flowing and spherical in shape. The release profile of diclofenac sodium from the microcapsules was found to be directly proportional to the proportion of ethyl cellulose and the coat thickness. The in-vitro release pattern showed that with drug:polymer ratios of 1:1 and 1:2, the percentage of drug released in the first hour was 16.91% and 11.52%, respectively, compared with only 6.87% for the 1:3 ratio within the same time. The release pattern followed the Higuchi model. Tablet formulation F2 of the present study was found comparable in release profile to the marketed brand Phlogin-SR, and the microcapsules showed extended release beyond 24 h. Further, a good correlation was found between drug release and the proportion of ethyl cellulose in the microcapsules. Microencapsulation based on coacervation was found to be a good technique for controlling the release of diclofenac sodium in controlled release formulations.
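
For reference, the Higuchi model mentioned above relates the cumulative amount of drug released to the square root of time; the following is its standard literature form, not an equation reproduced from this article.

```latex
% Standard form of the Higuchi release model (general literature form,
% not taken from this article): cumulative release is proportional to
% the square root of time.
\[
  \frac{M_t}{M_\infty} = k_H \, t^{1/2},
\]
% where M_t / M_infinity is the fraction of drug released at time t and
% k_H is the Higuchi release constant; a linear fit of the released
% fraction against sqrt(t) indicates diffusion-controlled (Higuchi) release.
```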

Delay-dependent Stability Analysis for Uncertain Switched Neutral System

This paper considers robust exponential stability for a class of uncertain switched neutral systems whose delays switch according to the switching rule. The system under consideration includes both stable and unstable subsystems. The uncertainties considered in this paper are norm bounded and possibly time varying. Based on a multiple Lyapunov functional approach and the dwell-time technique, a time-dependent switching rule is designed that depends on the so-called average dwell time of the stable subsystems as well as on the ratio of the total activation times of the stable and unstable subsystems. It is shown that by suitably controlling the switching between the stable and unstable modes, robust stabilization of the switched uncertain neutral system can be achieved. Two simulation examples are given to demonstrate the effectiveness of the proposed method.

Modeling and Identification of Hammerstein System by using Triangular Basis Functions

This paper deals with the modeling and parameter identification of nonlinear systems described by a Hammerstein model having piecewise nonlinear characteristics, such as a dead-zone nonlinearity. The simultaneous use of an easy decomposition technique and triangular basis functions leads to a particular form of the Hammerstein model. Approximating the static nonlinear block with triangular basis functions leads to a linear regressor model, so that least squares techniques can be used for parameter estimation. The Singular Value Decomposition (SVD) technique is then applied to separate the coupled parameters. The proposed approach has been tested efficiently on academic simulation examples.
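
A minimal sketch of this identification chain is given below, under the following assumptions: the linear block is FIR, the static nonlinearity is a dead zone expanded on hat (triangular) basis functions, the coupled parameters theta_ij = g_i * c_j are estimated by least squares, and a rank-1 SVD factorization separates g and c (up to the usual scaling ambiguity). The example data and dimensions are illustrative only, not those of the paper.

```python
# Hedged sketch of Hammerstein identification with triangular basis functions:
# least squares on a bilinear regressor, then SVD to separate coupled parameters.
import numpy as np

def triangular_basis(u, knots):
    """Evaluate hat (triangular) functions centered at each knot."""
    phi = np.zeros((len(u), len(knots)))
    for j, kj in enumerate(knots):
        left = knots[j - 1] if j > 0 else kj - (knots[1] - knots[0])
        right = knots[j + 1] if j < len(knots) - 1 else kj + (knots[-1] - knots[-2])
        phi[:, j] = np.clip(np.minimum((u - left) / (kj - left),
                                       (right - u) / (right - kj)), 0.0, None)
    return phi

rng = np.random.default_rng(0)
N, m = 2000, 4                                    # samples, assumed FIR length
knots = np.linspace(-1.5, 1.5, 9)                 # basis knots spanning the input range
u = rng.uniform(-1.5, 1.5, N)

# Illustrative "true" system: dead-zone nonlinearity followed by an FIR block.
dead_zone = np.where(np.abs(u) < 0.5, 0.0, u - 0.5 * np.sign(u))
g_true = np.array([1.0, 0.6, 0.3, 0.1])
y = np.convolve(dead_zone, g_true)[:N] + 0.01 * rng.standard_normal(N)

# Bilinear regressor: y(k) = sum_i sum_j g_i c_j T_j(u(k-i)).
phi = triangular_basis(u, knots)
rows = [np.concatenate([phi[k - i] for i in range(m)]) for k in range(m - 1, N)]
theta, *_ = np.linalg.lstsq(np.array(rows), y[m - 1:], rcond=None)

# Separate the coupled parameters with a rank-1 SVD factorization.
Theta = theta.reshape(m, len(knots))              # Theta[i, j] ~ g_i * c_j
U, s, Vt = np.linalg.svd(Theta)
g_hat = U[:, 0] * np.sqrt(s[0])
c_hat = Vt[0] * np.sqrt(s[0])
g_hat, c_hat = g_hat * np.sign(g_hat[0]), c_hat * np.sign(g_hat[0])  # fix the sign ambiguity

print("estimated FIR block (normalized):", np.round(g_hat / g_hat[0], 3))
print("nonlinearity coefficients on the knots:", np.round(c_hat * g_hat[0], 3))
```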

Understanding and Measuring Trust Evolution Effectiveness in Peer-to-Peer Computing Systems

In any trust model, the two information sources that a peer relies on to predict the trustworthiness of another peer are direct experience and reputation. These two vital components evolve over time. Trust evolution is an important issue, where the objective is to observe a sequence of past values of a trust parameter and determine the future estimates. Unfortunately, trust evolution algorithms have received little attention, and the algorithms proposed in the literature do not comply with the conditions and the nature of trust. This paper contributes to this important problem in the following ways: (a) it presents an algorithm that manages and models trust evolution in a P2P environment, (b) it devises new mechanisms for effectively maintaining trust values based on the conditions that influence trust evolution, and (c) it introduces a new methodology for incorporating trust-nurture incentives into the trust evolution algorithm. Simulation experiments are carried out to evaluate our trust evolution algorithm.
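
Although the paper's algorithm is not reproduced here, the flavor of a trust evolution update can be sketched as an exponentially weighted rule that combines direct experience with reputation, ages old evidence, and bounds trust to [0, 1]; all parameter names and values below are assumptions.

```python
# Hedged sketch (not the paper's algorithm) of a trust evolution update.

def evolve_trust(old_trust, outcome, reputation,
                 alpha=0.3, weight_direct=0.7, decay=0.98):
    """old_trust and reputation lie in [0, 1]; outcome is 1 for a good transaction, 0 otherwise."""
    evidence = weight_direct * outcome + (1 - weight_direct) * reputation
    aged = decay * old_trust                      # older evidence slowly loses weight
    new_trust = (1 - alpha) * aged + alpha * evidence
    return min(1.0, max(0.0, new_trust))

if __name__ == "__main__":
    trust = 0.5
    for outcome in [1, 1, 1, 0, 1, 1]:            # a mostly cooperative peer
        trust = evolve_trust(trust, outcome, reputation=0.8)
        print(round(trust, 3))
```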

Ontology of Collaborative Supply Chain for Quality Management

In the highly competitive and rapidly changing global marketplace, independent organizations and enterprises often come together and form a temporary alignment, a virtual enterprise, in a supply chain to better provide products or services. As firms adopt the systems approach implicit in supply chain management, they must manage quality through both internal process control and external control of supplier quality and customer requirements. How to incorporate the quality management of upstream and downstream supply chain partners into their own quality management systems has recently received a great deal of attention from both academia and practice. This paper investigates the collaborative features and the entity relationships in a supply chain, and presents an ontology of the collaborative supply chain based on aligning a service-oriented framework with service-dominant logic. This perspective facilitates the separation of material flow management from manufacturing capability management, which provides a foundation for coordinating and integrating the business processes that measure, analyze, and continually improve the quality of products, services, and processes. Further, this approach characterizes the different interests of supply chain partners, providing an innovative way to analyze the collaborative features of the supply chain. Furthermore, this ontology is the foundation for developing a quality management system that internalizes the quality management of upstream and downstream supply chain partners and manages quality in the supply chain systematically.

Towards an Enhanced Stochastic Simulation Model for Risk Analysis in Highway Construction

Over the years, there has been a growing trend towards quality-based specifications in highway construction. In many Quality Control/Quality Assurance (QC/QA) specifications, the contractor is primarily responsible for quality control of the process, whereas the highway agency is responsible for acceptance testing of the product. A cooperative investigation was conducted in Illinois over several years to develop a prototype End-Result Specification (ERS) for asphalt pavement construction. The final characteristics of the product are stipulated in the ERS, and the contractor is given considerable freedom in achieving those characteristics. The risk for the contractor or agency depends on how the acceptance limits and processes are specified. Stochastic simulation models are very useful in estimating and analyzing payment risk in ERS systems, and they form an integral part of Illinois' prototype ERS system. This paper describes the development of an innovative methodology to estimate the variability components in in-situ density, air voids, and asphalt content data from ERS projects. The information gained from this is crucial for simulating ERS projects to estimate and analyze the payment risks associated with asphalt pavement construction. However, these methods require at least two parties to conduct tests on all the split samples obtained according to the sampling scheme prescribed in the present ERS implemented in Illinois.

Optimization Using Simulation of the Vehicle Routing Problem

A key element of many distribution systems is the routing and scheduling of vehicles servicing a set of customers. A wide variety of exact and approximate algorithms have been proposed for solving the vehicle routing problem (VRP). Exact algorithms can only solve relatively small instances of the VRP, which is classified as NP-hard. Several approximate algorithms have proven successful in finding a feasible, though not necessarily optimal, solution. Although different parts of the problem are stochastic in nature, limited work relevant to the application of discrete event system simulation has addressed the problem. Presented here is optimization using simulation of the VRP: a simplified problem has been developed in the ExtendSim™ simulation environment, and the ExtendSim™ evolutionary optimizer is used to minimize the total transportation cost of the problem. Results obtained from the model are very satisfactory. Further complexities of the problem are proposed for consideration in the future.
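
The idea of optimization via simulation can be sketched independently of ExtendSim™: a route's cost is estimated by Monte Carlo simulation of stochastic travel times, and a simple random-swap search (standing in for the evolutionary optimizer) looks for a cheaper visiting order. Everything below, including the instance data, is an illustrative assumption rather than the paper's model.

```python
# Hedged sketch of optimization using simulation for a simplified VRP.
import math, random

random.seed(0)
DEPOT = (0.0, 0.0)
CUSTOMERS = [(random.uniform(-10, 10), random.uniform(-10, 10)) for _ in range(8)]

def simulated_cost(order, replications=200):
    """Expected cost of serving customers in `order`; travel time = distance * random factor."""
    total = 0.0
    for _ in range(replications):
        pos, cost = DEPOT, 0.0
        for idx in list(order) + [None]:               # return to the depot at the end
            nxt = DEPOT if idx is None else CUSTOMERS[idx]
            cost += math.dist(pos, nxt) * random.lognormvariate(0.0, 0.2)
            pos = nxt
        total += cost
    return total / replications

def random_swap_search(iterations=300):
    best = list(range(len(CUSTOMERS)))
    best_cost = simulated_cost(best)
    for _ in range(iterations):
        cand = best[:]
        i, j = random.sample(range(len(cand)), 2)      # swap two customers in the tour
        cand[i], cand[j] = cand[j], cand[i]
        cost = simulated_cost(cand)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost

order, cost = random_swap_search()
print("best visiting order:", order, "expected cost:", round(cost, 2))
```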

A Novel Method to Evaluate Line Loadability for Distribution Systems with Realistic Loads

This paper presents a simple method for estimating the additional load, as a multiple of the existing load, that may be drawn before reaching the point of maximum line loadability of a radial distribution system (RDS) with different realistic load models at different substation voltages. The proposed method involves a simple line loadability index (LLI) that measures the proximity of a line's present operating state to its maximum loadability. The LLI can be used to assess voltage instability and the line loading margin. The proposed method is also compared with the existing maximum loadability index method [10]. The simulation results show that the LLI can identify not only the weakest line/branch causing system instability but also the system voltage collapse point, when the index is near one. This feature enables us to set an index threshold to monitor and predict system stability on-line so that proper action can be taken to prevent the system from collapsing. To demonstrate the validity of the proposed algorithm, computer simulations are carried out on 2-bus and 69-bus RDSs.
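
While the paper's LLI itself is not reproduced here, the underlying loadability notion can be illustrated on a single line: the receiving-end voltage of a line with impedance R + jX feeding a load P + jQ satisfies a quadratic in V², and the maximum loadability is the largest load multiple for which a real solution still exists. The sketch below uses this standard two-bus relation with illustrative per-unit values only.

```python
# Hedged sketch (not the paper's LLI): for a line with impedance R + jX
# supplying a load P + jQ from sending-end voltage Vs, the receiving-end
# voltage magnitude V satisfies
#   V^4 + (2(P*R + Q*X) - Vs^2) V^2 + (P^2 + Q^2)(R^2 + X^2) = 0.
import math

def receiving_voltage(Vs, R, X, P, Q):
    b = 2 * (P * R + Q * X) - Vs ** 2
    c = (P ** 2 + Q ** 2) * (R ** 2 + X ** 2)
    disc = b * b - 4 * c
    if disc < 0:
        return None                                  # no real solution: load not supportable
    v2 = (-b + math.sqrt(disc)) / 2                  # higher (stable) voltage solution
    return math.sqrt(v2) if v2 > 0 else None

def loading_margin(Vs, R, X, P, Q, step=0.01):
    """Largest multiple of the base load that still admits an operating point."""
    lam = 1.0
    while receiving_voltage(Vs, R, X, lam * P, lam * Q) is not None:
        lam += step
    return lam - step

if __name__ == "__main__":
    # Illustrative per-unit values.
    Vs, R, X, P, Q = 1.0, 0.05, 0.10, 1.0, 0.6
    print("V at base load:", round(receiving_voltage(Vs, R, X, P, Q), 4))
    print("maximum load multiple before collapse:", round(loading_margin(Vs, R, X, P, Q), 2))
```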

Optimum Cascaded Design for Speech Enhancement Using Kalman Filter

Speech enhancement is the process of eliminating noise and increasing the quality of a speech signal that is contaminated with noise and other kinds of distortion. This paper develops an optimum cascaded system for speech enhancement. This aim is attained without diminishing any relevant speech information and without much computational or time complexity. The LMS algorithm, spectral subtraction, and Kalman filtering are deployed as the main de-noising algorithms in this work. Since each of these algorithms suffers from its own shortcomings, this work designs cascaded systems in different combinations and evaluates the cascades by qualitative (listening) and quantitative (SNR) tests.
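
As an illustration of one stage such a cascade might contain, the sketch below implements plain magnitude spectral subtraction with the noise spectrum estimated from a leading noise-only segment; the frame sizes, flooring factor, and test signal are assumptions, and the paper's actual cascade configurations are not reproduced here.

```python
# Hedged sketch of a single spectral subtraction stage with overlap-add.
import numpy as np

def spectral_subtraction(noisy, noise_frames=10, frame=256, hop=128, floor=0.02):
    window = np.hanning(frame)
    # Estimate the noise magnitude spectrum from the first few frames.
    noise_mag = np.mean([np.abs(np.fft.rfft(noisy[i*hop:i*hop+frame] * window))
                         for i in range(noise_frames)], axis=0)
    out = np.zeros(len(noisy))
    for start in range(0, len(noisy) - frame, hop):
        spec = np.fft.rfft(noisy[start:start+frame] * window)
        mag = np.maximum(np.abs(spec) - noise_mag, floor * noise_mag)  # subtract, then floor
        clean = np.fft.irfft(mag * np.exp(1j * np.angle(spec)), frame)
        out[start:start+frame] += clean * window                       # overlap-add
    return out

if __name__ == "__main__":
    fs = 8000
    t = np.arange(fs) / fs
    speech_like = np.sin(2 * np.pi * 440 * t) * (t > 0.4)   # silent lead-in, then a tone
    noisy = speech_like + 0.3 * np.random.default_rng(0).standard_normal(len(t))
    enhanced = spectral_subtraction(noisy)
    print("noisy RMS:", np.std(noisy).round(3), "enhanced RMS:", np.std(enhanced).round(3))
```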

A New Model for Question Answering Systems

Most Question Answering (QA) systems are composed of three main modules: question processing, document processing, and answer processing. The question processing module plays an important role in QA systems: if this module does not work properly, it creates problems for the other modules. Moreover, answer processing is an emerging topic in Question Answering, where systems are often required to rank and validate candidate answers. These techniques, which aim at finding short and precise answers, are often based on semantic classification. This paper discusses a new model for question answering that improves two main modules, question processing and answer processing. Two components form the basis of question processing: question classification, which specifies the types of question and answer, and reformulation, which converts the user's question into a form understandable by the QA system in a specific domain. The answer processing module consists of candidate answer filtering and candidate answer ordering components, and it also has a validation section for interacting with the user; this makes the module better suited to finding the exact answer. In this paper we describe the question and answer processing modules and model, implement, and evaluate the system. The system was implemented in two versions. Results show that 'Version No. 1' gave correct answers to 70% of the questions (30 correct answers out of 50 asked questions) and 'Version No. 2' gave correct answers to 94% of the questions (47 correct answers out of 50 asked questions).
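
The question classification component can be illustrated with a simple pattern-based classifier that maps a question to an expected answer type; the categories and patterns below are illustrative assumptions, not the taxonomy used in the paper.

```python
# Hedged sketch of pattern-based question classification.
import re

RULES = [
    (r"^(who|whose|whom)\b", "PERSON"),
    (r"^(where)\b", "LOCATION"),
    (r"^(when)\b|\b(what year|what date)\b", "DATE"),
    (r"^(how many|how much)\b", "NUMBER"),
    (r"^(why)\b", "REASON"),
    (r"^(how)\b", "MANNER"),
]

def classify_question(question):
    q = question.strip().lower()
    for pattern, answer_type in RULES:         # first matching rule wins
        if re.search(pattern, q):
            return answer_type
    return "DEFINITION"                        # fall-back category

for q in ["Who invented the telephone?",
          "When was the first QA system built?",
          "How many modules does a typical QA system have?"]:
    print(q, "->", classify_question(q))
```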

Fractal Shapes Description with Parametric L-systems and Turtle Algebra

In this paper, we propose a new method to describe fractal shapes using parametric L-systems. First, we introduce scaling factors into the production rules of the parametric L-system grammars. We then analyze these grammars with scaling factors using turtle algebra to show the mathematical relation between L-systems and iterated function systems (IFS). We demonstrate that, for specific values of the scaling factors, we recover the exact relationship established by Prusinkiewicz and Hammel between L-systems and IFS.
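
A parametric L-system with an explicit scaling factor in its production rule, together with a turtle interpretation, can be sketched as follows. The Koch-curve grammar used here is a standard textbook example rather than one of the grammars analyzed in the paper.

```python
# Hedged illustration: parametric L-system F(s) -> F(s/3) + F(s/3) - - F(s/3) + F(s/3)
# (Koch curve, 60-degree turns) interpreted with a simple turtle.
import math

def rewrite(modules, iterations, scale=1/3):
    """A module is ('F', length) or a bare turning symbol '+' / '-'."""
    for _ in range(iterations):
        out = []
        for m in modules:
            if isinstance(m, tuple) and m[0] == 'F':
                s = m[1] * scale                    # scaling factor inside the production
                out += [('F', s), '+', ('F', s), '-', '-', ('F', s), '+', ('F', s)]
            else:
                out.append(m)
        modules = out
    return modules

def turtle_points(modules, angle_deg=60.0):
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for m in modules:
        if isinstance(m, tuple) and m[0] == 'F':
            x += m[1] * math.cos(math.radians(heading))
            y += m[1] * math.sin(math.radians(heading))
            points.append((x, y))
        elif m == '+':
            heading += angle_deg
        elif m == '-':
            heading -= angle_deg
    return points

curve = turtle_points(rewrite([('F', 1.0)], iterations=3))
print(len(curve), "points; endpoint:", tuple(round(c, 4) for c in curve[-1]))
```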

Technological Environment - International Marketing Strategy Relationship

International trade involves both large and small firms engaged in business overseas. Possible drivers that force companies to enter international markets include increasing competition in the domestic market, maturing domestic markets, and limited domestic market opportunities. Technology is an important factor in shaping international marketing strategy as well as a driving force towards a more global marketplace, especially communication technology, which includes telephones, the internet, computer systems and e-mail. There are three main marketing strategy choices, namely the standardization approach, the adaptation approach and the middle-of-the-road approach, that companies implement in overseas markets. The decision depends on the situations and factors facing the companies in international markets. In this paper, the contingency concept is adopted: no single strategy can be effective in all contexts, and the effect of strategy on performance depends on specific situational variables. Strategic fit is employed to investigate export marketing strategy adaptation under certain environmental conditions, which in turn can lead to superior performance.

Knowledge Based Wear Particle Analysis

The paper describes a knowledge based system for the analysis of microscopic wear particles. Wear particles contained in lubricating oil carry important information concerning machine condition, in particular the state of wear. Experts (tribologists) in the field extract this information to monitor the operation of the machine and ensure safety, efficiency, quality, productivity, and economy of operation. This procedure is not always objective and it can also be expensive. The aim is to classify these particles according to their morphological attributes of size, shape, edge detail, thickness ratio, color, and texture, and, by using this classification, to predict wear failure modes in engines and other machinery. The attribute knowledge links human expertise to the devised Knowledge Based Wear Particle Analysis System (KBWPAS). The system provides an automated and systematic approach to wear particle identification that is linked directly to the wear processes and modes occurring in machinery. This brings consistency to wear judgment and prediction, which leads to standardization and less dependence on tribologists.
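
The way morphological attributes might drive rule-based classification can be sketched with a few illustrative rules; the thresholds and wear classes below are assumptions and do not reproduce the knowledge base of the KBWPAS.

```python
# Hedged sketch of rule-based wear particle classification from morphological attributes.

def classify_particle(size_um, aspect_ratio, thickness_ratio, edge_detail, texture):
    """edge_detail in {'smooth', 'rough', 'serrated'}; texture in {'smooth', 'striated', 'pitted'}."""
    if aspect_ratio > 5 and edge_detail == "serrated":
        return "cutting wear"
    if thickness_ratio < 0.1 and texture == "smooth":
        return "rubbing wear (platelet)"
    if size_um > 50 and texture == "striated":
        return "severe sliding wear"
    if texture == "pitted" and edge_detail == "rough":
        return "fatigue wear (spall)"
    return "unclassified - refer to tribologist"

print(classify_particle(size_um=80, aspect_ratio=1.5, thickness_ratio=0.3,
                        edge_detail="rough", texture="striated"))
```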

Avoiding Catastrophic Forgetting by a Dual-Network Memory Model Using a Chaotic Neural Network

In neural networks, when new patterns are learned by a network, the new information radically interferes with previously stored patterns. This drawback is called catastrophic forgetting or catastrophic interference. In this paper, we propose a biologically inspired neural network model which overcomes this problem. The proposed model consists of two distinct networks: one is a Hopfield-type chaotic associative memory and the other is a multilayer neural network. We consider that these networks correspond to the hippocampus and the neocortex of the brain, respectively. Incoming information is first stored in the hippocampal network with a fast learning algorithm. The stored information is then recalled through the chaotic behavior of the neurons in the hippocampal network. Finally, it is consolidated in the neocortical network by using pseudopatterns. Computer simulation results show that the proposed model is much better at avoiding catastrophic forgetting than conventional models.
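
The consolidation mechanism can be sketched in simplified form: a Hopfield-type "hippocampal" memory stores new patterns with one-shot Hebbian learning, and pseudopatterns (attractors reached from random inputs) are generated from it for rehearsal by a slower "neocortical" learner. In the sketch below, the chaotic neuron dynamics of the paper are replaced by plain sign-threshold recall purely for illustration.

```python
# Hedged sketch of pseudopattern generation from a Hopfield-type associative memory.
import numpy as np

rng = np.random.default_rng(0)
N = 64                                             # neurons / pattern length

def store(patterns):
    """Hebbian weight matrix for a set of +/-1 patterns."""
    W = sum(np.outer(p, p) for p in patterns) / len(patterns)
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=20):
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1
    return state

patterns = [rng.choice([-1, 1], N) for _ in range(3)]        # "new" information
W_hippocampus = store(patterns)

# Pseudopatterns: start from random states and let the memory settle.
pseudopatterns = [recall(W_hippocampus, rng.choice([-1, 1], N)) for _ in range(10)]

# A pseudopattern typically lands on (or near) a stored pattern, so rehearsing
# it transfers hippocampal content to the slower neocortical learner.
overlaps = [max(abs(p @ q) / N for q in patterns) for p in pseudopatterns]
print("mean overlap of pseudopatterns with stored patterns:", round(float(np.mean(overlaps)), 3))
```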

An Efficient Key Management Scheme for Secure SCADA Communication

A SCADA (Supervisory Control And Data Acquisition) system is an industrial control and monitoring system for national infrastructures. In the past, SCADA systems were used in closed environments without consideration of security functionality. As communication technology develops, SCADA systems are increasingly connected to open networks, and their security has therefore become an issue. Key management for SCADA systems has also been studied. However, existing key management schemes for SCADA systems, such as SKE (key establishment for SCADA systems) and SKMA (key management scheme for SCADA systems), cannot support broadcast communication. To solve this problem, Choi et al. proposed an advanced key management architecture for secure SCADA communication. However, their scheme requires a large computational cost for multicast communication. In this paper, we propose an enhanced scheme that improves the computational cost of multicast communication while taking into account the number of keys to be stored in a low-power communication device (RTU).

Optimal Power Allocation for the Proposed Asymmetric Turbo Code for 3G Systems

In [7], we proposed a new class of asymmetric turbo encoders for 3G systems that performs well in both the "waterfall" and "error floor" regions. In this paper, a modified (optimal) power allocation scheme for the different bits of this new class of asymmetric turbo encoders is investigated to enhance performance. Simulation results and a performance bound for the proposed asymmetric turbo code with the modified Unequal Power Allocation (UPA) scheme, for frame length N = 400 and code rate r = 1/3 with a Log-MAP decoder over an Additive White Gaussian Noise (AWGN) channel, are obtained and compared with the system with typical UPA and without UPA. The performance tests are extended over the AWGN channel for different frame sizes to verify the feasibility of implementing the modified UPA scheme for the proposed asymmetric turbo code. From the performance results, it is observed that the proposed asymmetric turbo code with the modified UPA performs better than the systems without UPA and with typical UPA, providing a coding gain of 0.4 to 0.52 dB.

The First Integral Approach in Stability Problem of Large Scale Nonlinear Dynamical Systems

In analyzing large scale nonlinear dynamical systems, it is often desirable to treat the overall system as a collection of interconnected subsystems. Solution properties of the large scale system are then deduced from the solution properties of the individual subsystems and the nature of the interconnections. In this paper a new approach is proposed for the stability analysis of large scale systems, based upon the concept of vector Lyapunov functions and decomposition methods. The present results make use of graph-theoretic decomposition techniques in which the overall system is partitioned into a hierarchy of strongly connected components. We then show that, under very reasonable assumptions, the overall system is stable once the strongly connected subsystems are stable. Finally, an example is given to illustrate the proposed constructive methodology.
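
The graph-theoretic decomposition step can be sketched as follows: the interconnection structure is represented as a directed graph and Tarjan's algorithm partitions it into strongly connected components, each of which is then treated as a subsystem. The example interconnection graph below is illustrative only.

```python
# Hedged sketch: partition the interconnection graph of a large scale system
# into strongly connected components with Tarjan's algorithm.

def tarjan_scc(graph):
    index, lowlink, on_stack = {}, {}, set()
    stack, components, counter = [], [], [0]

    def strongconnect(v):
        index[v] = lowlink[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, []):
            if w not in index:
                strongconnect(w)
                lowlink[v] = min(lowlink[v], lowlink[w])
            elif w in on_stack:
                lowlink[v] = min(lowlink[v], index[w])
        if lowlink[v] == index[v]:                 # v is the root of a component
            comp = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            components.append(comp)

    for v in graph:
        if v not in index:
            strongconnect(v)
    return components

# Edge i -> j means subsystem j is directly influenced by subsystem i.
interconnections = {1: [2], 2: [3], 3: [1, 4], 4: [5], 5: [6], 6: [4]}
print(tarjan_scc(interconnections))                # prints [[6, 5, 4], [3, 2, 1]]
```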