On Mobile Checkpointing using Index and Time Together

Checkpointing is one of the most commonly used techniques for providing fault tolerance in distributed systems, so that the system can continue to operate even if one or more components have failed. However, mobile computing systems are constrained by low bandwidth, mobility, lack of stable storage, frequent disconnections, and limited battery life. Hence, checkpointing protocols that require fewer synchronization messages and fewer checkpoints are preferred in mobile environments. There are two different, though not orthogonal, approaches to checkpointing mobile computing systems, namely time-based and index-based. Our protocol is a fusion of these two approaches, though not the first of its kind. In the present exposition, an index-based checkpointing protocol has been developed that uses time to indirectly coordinate the creation of consistent global checkpoints for mobile computing systems. The proposed algorithm is non-blocking, adaptive, and does not use any control messages. Compared to other contemporary checkpointing algorithms, it is computationally more efficient because it takes fewer checkpoints and does not need to compute dependency relationships. A brief account of important and relevant work in both fields, time-based and index-based, has also been included in the presentation.
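
As a purely illustrative sketch (not the authors' protocol), the fragment below shows how time can indirectly coordinate index-based checkpoints: each process takes a checkpoint when its own timer expires and tags it with an increasing index, and the current index is piggybacked on ordinary application messages so that a lagging receiver checkpoints before delivery, without any extra control messages. All class and method names are hypothetical.

```python
import time

class Process:
    """Hypothetical sketch of time-driven, index-based checkpointing."""

    def __init__(self, pid, interval):
        self.pid = pid
        self.interval = interval                    # local checkpoint period (s)
        self.next_deadline = time.time() + interval
        self.index = 0                              # index of latest checkpoint
        self.state = {}

    def maybe_checkpoint(self):
        # Timer expiry: take a basic checkpoint and advance the index.
        if time.time() >= self.next_deadline:
            self.take_checkpoint(self.index + 1)
            self.next_deadline += self.interval

    def take_checkpoint(self, new_index):
        self.index = new_index
        # A real protocol would save self.state to stable storage here.
        print(f"P{self.pid}: checkpoint, index {self.index}")

    def send(self, payload):
        self.maybe_checkpoint()
        return {"index": self.index, "payload": payload}   # piggybacked index

    def receive(self, msg):
        # If the sender's index is ahead, checkpoint first so the delivered
        # message cannot become an orphan w.r.t. the sender's checkpoint.
        if msg["index"] > self.index:
            self.take_checkpoint(msg["index"])
        self.state["last"] = msg["payload"]
```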

Manifold Analysis by Topologically Constrained Isometric Embedding

We present a new algorithm for nonlinear dimensionality reduction that consistently uses global information, and that enables understanding the intrinsic geometry of non-convex manifolds. Compared to methods that consider only local information, our method appears to be more robust to noise. Unlike most methods that incorporate global information, the proposed approach automatically handles non-convexity of the data manifold. We demonstrate the performance of our algorithm and compare it to state-of-the-art methods on synthetic as well as real data.

Performance Analysis of a Flexible Manufacturing Line Operated Under Surplus-based Production Control

In this paper we present our results on the performance analysis of a multi-product manufacturing line. We study the influence of external perturbations, intermediate buffer content, and the number of manufacturing stages on the production tracking error of each machine in a multi-product line operated under a surplus-based production control policy. Starting with the analysis of a single machine with multiple production stages (one for each product type), we provide bounds on the production error of each stage. Then, we extend our analysis to a line of multi-stage machines, where, similarly, bounds on the production tracking error for each product type, as well as on the buffer content, are obtained. The performance of the closed-loop flow line model is illustrated in numerical simulations.
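
The following toy simulation, with entirely hypothetical rates and not the authors' model, illustrates what a surplus-based policy and its tracking error look like for a single machine with two product types: the surplus of each type is its cumulative production minus its cumulative demand, and the machine always works on the type whose surplus is lowest.

```python
# Illustrative sketch only: surplus-based control of a single machine
# producing two product types at constant demand rates. The surplus
# (production tracking error) of each type is driven toward zero by always
# serving the neediest type.
demand = [0.4, 0.5]        # hypothetical demand rates (parts per time unit)
capacity = 1.0             # machine production rate when working on a type
dt = 0.01
surplus = [0.0, 0.0]
worst = []                 # most negative surplus observed over time

t = 0.0
while t < 100.0:
    i = min(range(len(surplus)), key=lambda k: surplus[k])   # neediest type
    for k in range(len(surplus)):
        rate = capacity if k == i else 0.0
        surplus[k] += (rate - demand[k]) * dt
    worst.append(min(surplus))
    t += dt

print("final surpluses:", [round(s, 3) for s in surplus])
print("largest backlog observed:", round(min(worst), 3))
```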

Structural Cost of Optimized Reinforced Concrete Isolated Footing

This paper presents an analytical model to estimate the cost of an optimized design of a reinforced concrete isolated footing based on structural safety. Flexural and optimization formulas for square and rectangular footings are derived based on the ACI building code, material cost, and optimization. The optimization constraints consist of upper and lower limits on depth and area of steel. Footing depth and area of reinforcing steel are minimized to yield the optimal footing dimensions. The optimized material costs of concrete, reinforcing steel, and formwork for the designed sections are computed. A total cost factor (TCF) and other cost factors are developed to generalize and simplify the calculation of footing material cost. Numerical examples are presented to illustrate the model's capability of estimating the material cost of the footing for a desired axial load.
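
As a schematic illustration of the kind of calculation the model performs (not the paper's formulas), the sketch below minimizes a footing material cost of the general form concrete + reinforcing steel + formwork over the depth and steel area, subject to upper and lower limits and a deliberately simplified flexural-strength requirement. All unit costs, dimensions, strengths, and limits are hypothetical placeholders.

```python
# Schematic sketch only; all numbers below are hypothetical placeholders and
# the flexural check is simplified, not the ACI code provisions.
from scipy.optimize import minimize

B, L = 2.0, 2.5            # assumed footing plan dimensions (m)
c_conc = 100.0             # concrete cost per m^3 (placeholder)
c_steel = 1.2              # steel cost per kg (placeholder)
c_form = 25.0              # formwork cost per m^2 (placeholder)
rho_steel = 7850.0         # steel density (kg/m^3)
fy = 420e3                 # steel yield strength (kPa)
Mu = 350.0                 # assumed factored design moment (kN*m)

def material_cost(x):
    d, As = x              # depth (m) and steel area (m^2)
    concrete = c_conc * B * L * d
    steel = c_steel * rho_steel * As * L        # bars running along L
    formwork = c_form * 2.0 * (B + L) * d       # side forms
    return concrete + steel + formwork

def strength_margin(x):
    d, As = x
    # Simplified flexural requirement: design capacity must exceed Mu.
    return 0.9 * As * fy * 0.9 * d - Mu

bounds = [(0.3, 1.0), (5e-4, 5e-3)]             # limits on depth, steel area
res = minimize(material_cost, x0=[0.6, 2e-3], bounds=bounds, method="SLSQP",
               constraints=[{"type": "ineq", "fun": strength_margin}])
print("depth (m), steel area (m^2):", res.x, " cost:", round(res.fun, 2))
```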

A Message Passing Implementation of a New Parallel Arrangement Algorithm

This paper describes a new parallel sorting algorithm, based on Odd-Even Mergesort, called division and concurrent mixes. The main idea of the algorithm is to have each processor sort a part of the vector with a sequential algorithm, and then to make the processors work in pairs to merge two of these sorted sections into a larger one, also sorted; after several iterations, the vector is completely sorted. The paper describes the implementation of the new algorithm in a message passing environment (MPI). It also compares the experimental results obtained with the sequential quicksort algorithm and with parallel implementations (also on MPI) of quicksort and bitonic sort. The comparison has been carried out on an 8-processor cluster under GNU/Linux, each node running on a single PC processor.
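
A minimal single-process sketch of the division-and-concurrent-mixes idea is given below: the vector is split into p parts, each part is sorted with a sequential algorithm, and pairs of sorted sections are then merged into larger sorted ones until a single sorted vector remains. In the actual implementation each part resides on its own MPI process; here the merge rounds are merely simulated sequentially.

```python
# Single-process sketch of the idea described above (the real algorithm runs
# each section on its own MPI process and merges in parallel rounds).
from heapq import merge
import random

def division_and_mixes(vector, p=8):
    n = len(vector)
    size = -(-n // p)                                   # ceil division
    sections = [sorted(vector[i:i + size])              # local sequential sort
                for i in range(0, n, size)]
    while len(sections) > 1:                            # ~log2(p) merge rounds
        sections = [list(merge(sections[i], sections[i + 1]))
                    if i + 1 < len(sections) else sections[i]
                    for i in range(0, len(sections), 2)]
    return sections[0]

data = [random.randint(0, 999) for _ in range(1000)]
assert division_and_mixes(data) == sorted(data)
```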

Cooperative Multi-Agent Soccer Robot Team

This paper introduces our first efforts in developing a new team for the RoboCup Middle Size Competition. Our robots employ an omnidirectional mobile base with an omnidirectional vision system and a fuzzy control algorithm for navigation. The control architecture of the MRL middle-size robots is a three-layered architecture comprising Planning, Sequencing, and Executing layers. It also uses a Blackboard system to achieve coordination among agents. Moreover, the architecture is designed to have minimal dependency on the low-level structure and a uniform protocol for interacting with the real robot.
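
A minimal sketch of blackboard-style coordination (not the MRL team's actual code) is shown below: the layers and the vision module exchange information only by posting and reading entries on a shared, thread-safe blackboard. All keys and values are hypothetical.

```python
import threading

class Blackboard:
    """Tiny shared store through which agents and layers coordinate."""

    def __init__(self):
        self._data = {}
        self._lock = threading.Lock()

    def post(self, key, value):
        with self._lock:
            self._data[key] = value

    def read(self, key, default=None):
        with self._lock:
            return self._data.get(key, default)

bb = Blackboard()
bb.post("ball_position", (3.2, -1.5))          # written by the vision module
bb.post("role/robot1", "attacker")             # written by the planning layer

# The sequencing layer of robot1 reads the shared state and decides a task.
if bb.read("role/robot1") == "attacker" and bb.read("ball_position"):
    bb.post("task/robot1", ("go_to", bb.read("ball_position")))
print(bb.read("task/robot1"))
```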

Optimal Capacitor Allocation for Loss Reduction in Distribution System Using Fuzzy and Plant Growth Simulation Algorithm

This paper presents a new and efficient approach for capacitor placement in radial distribution systems that determines the optimal locations and sizes of capacitors with the objective of improving the voltage profile and reducing power loss. The solution methodology has two parts: in part one, loss sensitivity factors are used to select the candidate locations for capacitor placement, and in part two, a new algorithm that employs the Plant Growth Simulation Algorithm (PGSA) is used to estimate the optimal size of the capacitors at the optimal buses determined in part one. The main advantage of the proposed method is that it does not require any external control parameters. The other advantage is that it handles the objective function and the constraints separately, avoiding the trouble of determining barrier factors. The proposed method is applied to 9-bus and 34-bus radial distribution systems. The solutions obtained by the proposed method are compared with those of other methods, and the proposed method outperforms them in terms of solution quality.
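
As an illustration of the part-one selection step, the sketch below ranks buses by the loss sensitivity factor commonly used in the capacitor-placement literature, dP_loss/dQ_eff = 2·Q_eff·R / V², applied to a few hypothetical branches in per unit; the paper's exact formulation and its 9- and 34-bus data may differ.

```python
# Hypothetical branch data in per unit: (receiving bus, branch resistance R,
# effective reactive power Q_eff flowing beyond the bus, bus voltage V).
branches = [
    (2, 0.012, 0.45, 0.98),
    (3, 0.025, 0.30, 0.96),
    (4, 0.040, 0.52, 0.94),
]

def loss_sensitivity(R, Q_eff, V):
    # Commonly used loss sensitivity factor dP_loss/dQ_eff.
    return 2.0 * Q_eff * R / V**2

ranked = sorted(branches, reverse=True,
                key=lambda b: loss_sensitivity(b[1], b[2], b[3]))
candidates = [bus for bus, *_ in ranked]
print("candidate buses for capacitor placement:", candidates)
```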

Object-Oriented Programming Strategies in C# for Power Conscious System

Low power consumption is a major constraint for battery-powered systems such as notebook computers or PDAs. In the past, specialists usually designed both specifically optimized hardware and code to address this concern. That approach worked for quite a long time; in the present era, however, there is another significant constraint: time to market. To satisfy the power constraint while launching products within shorter production periods, object-oriented programming (OOP) has stepped into this field. Although it is well known that OOP carries considerably more overhead than assembly and procedural languages, the development trend still heads toward this new world, which conflicts with the goal of low power consumption. Most prior power-related software research reported that OOP consumes substantial resources; however, since industry had to accept it for business reasons, no paper until now has addressed how to choose the best OOP practice within this power-limited boundary. This article is a first attempt to specify and propose optimized strategies for writing OOP software in an energy-constrained environment, based on quantitative measured results. The language chosen for the study is C# on the .NET Framework 2.0, one of the popular OOP development environments. The recommendations obtained from this research provide a roadmap that can help developers write code that balances time to market against battery time.

Environmental Performance Assessment Model as a Sustainability Decision Tool for Small and Middle Sized Enterprises

This paper deals with environmental metrics and assessment systems devoted to Small and Medium-Sized Enterprises (SMEs). The authors present a proposed assessment model that is able to discover the current environmental strengths and weaknesses of an SME. The suggested model also has the ambition to become a Sustainability Decision Tool. The model is able to identify the "best environmental decision" in the company and to quantify how this decision contributed to the overall environmental improvement. The authors understand environmental improvements as environmental innovations (product, process, and organizational). The suggested model is based on its own concept; however, the authors also utilize already existing environmental assessment tools.

Citizenship Norms and the Participation of Young Adults in a Democracy

This paper explores the changing trend in citizenship norms among young citizens from various ethnic groups in Malaysia and the extent to which it influences the participation of young citizens in political and civic issues. Embedded in democratic constitutions are the rights and freedoms that accompany citizenship, and these rights and freedoms include participation. Participation in democracies should go beyond voting; it should include taking part in the governance process. The political process is not at risk even though politics does not work as it did in the past. A national sample of 1697 respondents between the ages of 21 and 40 years was interviewed in January 2011. The findings show that respondents embrace the engaged-citizenship norm more than they do the traditional duty-citizen norm. Among the ethnic groups, the Chinese show lower means on both citizenship norms compared with the other ethnic groups, namely the Malays and the Indians. The duty-citizen norm correlates more strongly with political participation than with civic participation. Conversely, the engaged-citizen norm correlates more strongly with civic participation than with political participation.

Double Flux Orientation Control for a Doubly Fed Induction Machine

Doubly fed induction machines (DFIMs) are used mainly for wind energy conversion in MW-scale power plants. This paper presents a new field-oriented control strategy based on the principle of a double flux orientation, in which the stator and rotor fluxes are oriented at the same time. The orthogonality thereby created between the two oriented fluxes, which must be strictly maintained, leads to a linear and decoupled control with optimal torque. The simulation results obtained show the feasibility and effectiveness of the suggested method.
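
The role of the orthogonality can be seen from the general torque expression below, where k is a machine-dependent constant and γ is the angle between the two flux vectors; the paper's exact notation and constant may differ. Holding γ = π/2 makes the torque linear in the flux magnitudes and maximal for given flux levels.

```latex
T_{em} = k\,\lVert \vec{\varphi}_s \times \vec{\varphi}_r \rVert
       = k\,\varphi_s\,\varphi_r\,\sin\gamma ,
\qquad
T_{em}\big|_{\gamma=\pi/2} = k\,\varphi_s\,\varphi_r \quad (\text{maximum}).
```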

Quantifying the Sustainable Building Criteria Based on Case Studies from Malaysia

In order to encourage the construction of green homes (GH) in Malaysia, a simple and attainable framework for designing and building GHs is needed. This can be achieved by aligning GH principles against Cole's 'Sustainable Building Criteria' (SBC). This set of considerations was used to categorize the GH features of three case studies from Malaysia. Although the categorization of building features is useful for exploring the sustainability inclinations of each house, the overall impact of the building features in each of the five SBCs is unknown. Therefore, this paper explores the possibility of quantifying the impact of the building features categorized in SBC1 – "Buildings will have to adapt to the new environment and restore damaged ecology while mitigating resource use" – based on existing GH assessment tools and methods and other literature. This process, as reported in this paper, could lead to a new dimension in green home rating and assessment methods.

Finite Element Study on Corono-Radicular Restored Premolars

Restoration of endodontically treated teeth is a common problem in dentistry, related to the fractures occurring in such teeth and to the concentration of forces; little information has been available on how variations of the basic preparation guidelines affect stress distribution. To date, there is still no agreement in the literature about which material or technique can optimally restore endodontically treated teeth. The aim of the present study was to evaluate the influence of the core height and restoration materials on a corono-radicular restored upper first premolar. The first step of the study was to build 3D models in order to analyze the teeth, the dowel-and-core restorations, and the overlying full ceramic crowns. The FEM model was obtained by importing the solid model into the ANSYS finite element analysis software. An occlusal load of 100 N was applied, and the stresses occurring in the restorations and tooth structures were calculated. The numerical simulations provide a biomechanical explanation for the stress distribution in prosthetically restored teeth. Within the limitations of the present study, it was found that the core height has no important influence on the stress generated in corono-radicular restored premolars. It can also be concluded that the cervical regions of the teeth and restorations were subjected to the highest stress concentrations.

Fast Lines at Theme Parks

Waiting times and queues are a daily problem for theme parks. Fast lines or priority queues appear as a solution for a specific segment of customers, that is, tourists who are willing to pay to avoid waiting. This paper analyzes the fast line system and explores the factors that affect the decision to purchase a fast line pass. A greater understanding of these factors may help companies to design appropriate products and services. This conceptual paper was based on a literature review in marketing and consumer behavior. Additional research was identified in related disciplines such as leisure studies, psychology, and sociology. A conceptual framework of the factors influencing the decision to purchase a fast line pass is presented.

Computer Proven Correctness of the Rabin Public-Key Scheme

We describe a formal specification and verification of the Rabin public-key scheme in the formal proof system Isabelle/HOL. The idea is to use the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. The analysis presented uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness as well as security properties is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. This yields the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Consequently, we obtain reliable proofs with a minimal error rate, augmenting the database used. This provides a formal basis for further computer proof constructions in this area.
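
For readers unfamiliar with the scheme itself, the sketch below implements the textbook Rabin operations (squaring modulo n = pq for encryption, square roots via the Chinese Remainder Theorem for decryption, with p ≡ q ≡ 3 (mod 4)); it is only an illustration, unrelated to the Isabelle/HOL formalization, and the toy primes are far too small for real use.

```python
# Textbook Rabin scheme with toy parameters (illustration only).
p, q = 499, 547          # toy primes, both congruent to 3 mod 4
n = p * q                # public key

def encrypt(m, n):
    return (m * m) % n

def decrypt(c, p, q):
    n = p * q
    # Square roots modulo each prime (valid because p, q = 3 mod 4).
    mp = pow(c, (p + 1) // 4, p)
    mq = pow(c, (q + 1) // 4, q)
    # Combine the roots with the Chinese Remainder Theorem.
    yp, yq = pow(p, -1, q), pow(q, -1, p)
    r = (yp * p * mq + yq * q * mp) % n
    s = (yp * p * mq - yq * q * mp) % n
    return {r, n - r, s, n - s}       # the four candidate plaintexts

m = 1234
roots = decrypt(encrypt(m, n), p, q)
assert m in roots
print(sorted(roots))
```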

Arterial Stiffness Detection Depending on Neural Network Classification of the Multi-Input Parameters

Diagnosis and detection of arterial stiffness is very important, as it gives an indication of the associated increased risk of cardiovascular disease. To provide a cheap and easy general screening technique that helps avoid future cardiovascular complications due to rising arterial stiffness, an algorithm based on the photoplethysmogram is proposed. The photoplethysmograph signals are processed in MATLAB: the signal is filtered, baseline wander is removed, peaks and valleys are detected, and the signals are normalized. The area under the catacrotic phase of the photoplethysmogram pulse curve is calculated using the trapezoidal rule and is then used, in cooperation with other parameters such as age, height, and blood pressure, in a neural network for arterial stiffness detection. The neural network achieved a sensitivity of 80%, an accuracy of 85%, and a specificity of 90% on the patient data. It is concluded that a neural network can detect arterial stiffness from risk factor parameters.
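
The catacrotic-area step can be illustrated with the short numpy sketch below (the paper's own processing is done in MATLAB, and the pulse here is synthetic): the systolic peak of one PPG pulse is located and the area from the peak to the end of the pulse is integrated with the trapezoidal rule.

```python
# Illustration of the catacrotic-area step; the pulse is synthetic, and a
# real signal would first be filtered, baseline-corrected, and normalized.
import numpy as np

fs = 100.0                                    # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)                 # one pulse, 1 s long
pulse = np.exp(-((t - 0.25) / 0.12) ** 2)     # synthetic PPG-like pulse

peak = int(np.argmax(pulse))                  # systolic peak index
# Catacrotic (descending) phase: from the systolic peak to the pulse end.
catacrotic_area = np.trapz(pulse[peak:], dx=1 / fs)
print("area under the catacrotic phase:", round(float(catacrotic_area), 4))
# This area, together with age, height and blood pressure, would form the
# input vector of the neural-network classifier.
```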

Laser Transmission through Vegetative Material

Dynamic speckle, or biospeckle, is an interference phenomenon generated by the reflection of coherent light from an active surface, or even from a particulate or living body surface. This phenomenon gave scientific support to a method named biospeckle, which has been employed to study seed viability, biological activity, tissue senescence, tissue water content, fruit bruising, etc. Since the method is not invasive and yields numerical values, it can be considered for possible automation in several processes, including selection and sorting. Based on these preliminary considerations, this research work set out to study the interaction of a laser beam with vegetative samples by measuring the incident light intensity and the transmitted light intensity through vegetative slabs of varying thickness. Tests were carried out on fifteen slices of apple tissue divided into thickness groups of 4 mm, 5 mm, 18 mm, and 22 mm. A 10 mW diode laser beam of 632 nm wavelength and a Samsung digital camera were employed to carry out the tests. The outgoing images were analyzed by comparing the gray gradient of a fixed image column of each image to obtain a scale of laser penetration into the tissue, according to the slice thickness.
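
The column-wise analysis can be pictured with the brief sketch below, in which the file name and column choice are placeholders: a fixed column is taken from the grayscale image of the transmitted light and its gray-level gradient is computed as an indicator of how deeply the beam penetrates the slice.

```python
# Sketch of the image-column analysis; "apple_slice_4mm.png" and the column
# index are placeholders, not the experiment's actual files or settings.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("apple_slice_4mm.png").convert("L"), dtype=float)

col = img.shape[1] // 2                 # fixed column through the beam centre
profile = img[:, col]                   # gray levels along that column
gradient = np.gradient(profile)         # gray-level gradient along the column

# A steeper drop of the profile (larger |gradient|) over a shorter distance
# indicates less light transmitted through the thicker slice.
print("max gray level:", profile.max(),
      "steepest change:", np.abs(gradient).max())
```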

Fuzzy Wavelet Packet based Feature Extraction Method for Multifunction Myoelectric Control

The myoelectric signal (MES) is one of the biosignals utilized to help humans control equipment. Recent approaches to MES classification for controlling prosthetic devices using pattern recognition techniques revealed two problems: first, the classification performance of the system starts to degrade as the number of motion classes to be classified increases; second, the additional, more complicated methods used to solve the first problem increase the computational cost of a multifunction myoelectric control system. In an effort to solve these problems and to achieve a feasible design for real-time implementation with high overall accuracy, this paper presents a new method for feature extraction in MES recognition systems. The method works by extracting features using the Wavelet Packet Transform (WPT) applied to the MES from multiple channels, and then employs the fuzzy c-means (FCM) algorithm to generate a measure that judges the suitability of features for classification. Finally, Principal Component Analysis (PCA) is utilized to reduce the size of the data before computing the classification accuracy with a multilayer perceptron neural network. The proposed system produces powerful classification results (99% accuracy) while using only a small portion of the original feature set.
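
A rough sketch of the feature-extraction chain is given below; it uses a simple Fisher-style separability score in place of the paper's fuzzy c-means based measure, random data in place of real MES recordings, and hypothetical dimensions throughout.

```python
# Sketch of the WPT-features -> feature selection -> PCA chain (stand-in
# score instead of FCM; random data instead of real MES windows).
import numpy as np
import pywt
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 40, 2, 256
X = rng.standard_normal((n_trials, n_channels, n_samples))   # fake MES windows
y = rng.integers(0, 4, n_trials)                              # 4 motion classes

def wpt_features(window, wavelet="db4", level=3):
    feats = []
    for ch in window:                                         # per channel
        wp = pywt.WaveletPacket(data=ch, wavelet=wavelet, maxlevel=level)
        # log energy of each terminal node of the packet tree
        feats += [np.log(np.sum(node.data ** 2) + 1e-12)
                  for node in wp.get_level(level, order="natural")]
    return np.array(feats)

F = np.array([wpt_features(w) for w in X])                    # trials x features

# Keep the features whose class means are most separated (stand-in score).
means = np.array([F[y == c].mean(axis=0) for c in np.unique(y)])
score = means.var(axis=0) / (F.var(axis=0) + 1e-12)
F_sel = F[:, np.argsort(score)[-8:]]

F_red = PCA(n_components=4).fit_transform(F_sel)              # dimensionality cut
print("reduced feature matrix:", F_red.shape)                 # fed to an MLP
```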

Biodiesel Production over nano-MgO Supported on Titania

Nano-MgO was successfully deposited on titania using a deposition-precipitation method. The catalyst produced was characterised using FTIR, XRD, BET and XRF, and its activity was tested on the transesterification of soybean oil to biodiesel. The catalyst activity improved when the reaction temperature was increased from 150 to 225 °C. It was also observed that increasing the reaction time beyond 1 h had no significant benefit for conversion. The stability of the MgO fixed on the TiO2 was investigated using XRF and ICP-OES. It was observed that the MgO loss during the reaction was between 0.5 and 2.3 percent and that there was no correlation between the reaction temperature and the MgO loss.

Application of MADM in Identifying the Transmission Rate of Dengue fever: A Case Study of Shah Alam, Malaysia

Identifying parameters in an epidemic model is one of the important aspects of modeling. In this paper, we suggest a method to identify the transmission rate by using the multistage Adomian decomposition method. As a case study, we use the data of the reported dengue fever cases in the city of Shah Alam, Malaysia. The result obtained fairly represents the actual situation. In the SIR model, this method serves as an alternative for parameter identification and enables us to carry out the necessary analysis over a smaller interval.
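
For reference, the standard SIR system in which the transmission rate β is the parameter being identified is shown below; the model actually used in the paper may differ in detail, for example in how the population size is normalised.

```latex
\frac{dS}{dt} = -\beta S I, \qquad
\frac{dI}{dt} = \beta S I - \gamma I, \qquad
\frac{dR}{dt} = \gamma I .
```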