Performance Assessment of Computational Grid on Weather Indices from HOAPS Data

Long-term rainfall analysis and prediction is a challenging task, especially in the modern world where the impact of global warming is creating complications in environmental issues. These data-intensive tasks require high performance computational modeling for accurate prediction. This research paper describes a prototype designed and developed in a grid environment using a number of coupled software infrastructural building blocks. This grid enabled system provides the demanded computational power, efficiency, resources, a user-friendly interface, secured job submission and high throughput. The results obtained using sequential execution and grid enabled execution show that computational performance improved by 36% to 75% over a decade of climate parameters. The large variation in performance can be attributed to the varying degree of computational resources available for job execution. Grid computing enables the dynamic runtime selection, sharing and aggregation of distributed and autonomous resources, which plays an important role not only in business but also in scientific implications and social surroundings. This research paper attempts to explore the grid enabled computing capabilities on weather indices from HOAPS data for climate impact modeling and change detection.
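
For orientation, the reported 36% to 75% improvement figures translate into conventional speedup factors as sketched below; the runtimes used are hypothetical placeholders, not the paper's measurements.

```python
# Illustrative only: relates a percentage runtime improvement to the
# equivalent speedup factor. The timing values below are hypothetical.

def improvement_pct(t_seq, t_grid):
    """Percent reduction in runtime of grid vs. sequential execution."""
    return 100.0 * (t_seq - t_grid) / t_seq

def speedup(t_seq, t_grid):
    """Classical speedup factor: sequential time over grid time."""
    return t_seq / t_grid

for t_seq, t_grid in [(100.0, 64.0), (100.0, 25.0)]:  # hypothetical runtimes
    print(f"improvement = {improvement_pct(t_seq, t_grid):.0f}%, "
          f"speedup = {speedup(t_seq, t_grid):.2f}x")
# improvement = 36%, speedup = 1.56x
# improvement = 75%, speedup = 4.00x
```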

Enzymatic Saccharification of Dilute Alkaline Pre-treated Microalgal (Tetraselmis suecica) Biomass for Biobutanol Production

Enzymatic saccharification of biomass for reducing sugar production is one of the crucial processes in biofuel production through biochemical conversion. In this study, enzymatic saccharification of dilute potassium hydroxide (KOH) pre-treated Tetraselmis suecica biomass was carried out using a cellulase enzyme obtained from Trichoderma longibrachiatum. Initially, the pre-treatment conditions were optimised by varying the alkali reagent concentration, reaction retention time, and temperature. The pre-treated T. suecica biomass was also characterized using Fourier transform infrared spectroscopy and scanning electron microscopy. These analyses revealed that functional groups such as acetyl and hydroxyl groups, as well as the structure and surface of the T. suecica biomass, were changed by the pre-treatment, which favours the enzymatic saccharification process. A comparison of enzymatic saccharification of untreated and pre-treated microalgal biomass indicated that a higher level of reducing sugar can be obtained from pre-treated T. suecica. Enzymatic saccharification of pre-treated T. suecica biomass was optimised by varying temperature, pH, and the enzyme-to-solid ratio ([E]/[S]). The highest conversion of carbohydrate into reducing sugar, 95%, amounting to a reducing sugar yield of 20 wt% from pre-treated T. suecica, was obtained at a temperature of 40°C, pH 4.5 and an [E]/[S] of 0.1 after 72 h of incubation. The hydrolysate obtained from enzymatic saccharification of pre-treated T. suecica biomass was further fermented into biobutanol using Clostridium saccharoperbutyliticum as the biocatalyst. The results from this study demonstrate a positive prospect for the application of dilute alkaline pre-treatment to enhance enzymatic saccharification and biobutanol production from microalgal biomass.
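
As a back-of-the-envelope check (assuming, on our part, that the yield is expressed per unit of dry biomass and that the water gained on hydrolysis is neglected), the reported conversion and yield figures are mutually consistent with a carbohydrate content of roughly 21 wt% in the pre-treated biomass:

```latex
\text{sugar yield} \approx X_{\text{conv}} \times w_{\text{carb}}
\quad\Longrightarrow\quad
w_{\text{carb}} \approx \frac{0.20}{0.95} \approx 0.21 \;(\approx 21\ \text{wt\%})
```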

n-Butanol as an Extractant for Lactic Acid Recovery

Extraction of lactic acid from aqueous solution using n-butanol as an extractant was studied. The effects of mixing time, pH of the aqueous solution, initial lactic acid concentration, and volume ratio between the organic and the aqueous phase were investigated. The distribution coefficient and the degree of lactic acid extraction were found to increase when the pH of the aqueous solution was decreased. The pH effect was substantially pronounced when the pH of the aqueous solution was below 1. Initial lactic acid concentration and the organic-to-aqueous volume ratio appeared to have a positive effect on the distribution coefficient and the degree of extraction. Because n-butanol is partially miscible with water, incorporation of the aqueous solution into the organic phase was observed in extractions with large organic-to-aqueous volume ratios.
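
A minimal sketch of the two quantities studied, assuming the usual definitions (D as the equilibrium concentration ratio between phases, and the degree of extraction as the fraction of acid transferred to the organic phase); the concentration and volume values are hypothetical:

```python
def distribution_coefficient(c_org, c_aq):
    """D = C_org / C_aq: equilibrium lactic acid concentration ratio."""
    return c_org / c_aq

def degree_of_extraction(D, v_org, v_aq):
    """E = D / (D + V_aq/V_org): fraction of acid moved to the organic phase."""
    return D / (D + v_aq / v_org)

D = distribution_coefficient(c_org=0.30, c_aq=0.45)   # mol/L, hypothetical
print(f"D = {D:.2f}")
print(f"E = {degree_of_extraction(D, v_org=2.0, v_aq=1.0):.1%}")
```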

Mean Square Exponential Synchronization of Stochastic Neutral Type Chaotic Neural Networks with Mixed Delay

This paper studies the mean square exponential synchronization problem for a class of stochastic neutral type chaotic neural networks with mixed delay. On the basis of Lyapunov stability theory, some sufficient conditions ensuring the mean square exponential synchronization of two identical chaotic neural networks are obtained by using stochastic analysis and inequality techniques. These conditions are expressed in the form of linear matrix inequalities (LMIs), whose feasibility can be easily checked using the Matlab LMI Toolbox. The feedback controller used in this paper is more general than those used in the previous literature. A simulation example is presented to demonstrate the effectiveness of the derived results.
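
The paper checks LMI feasibility with the Matlab LMI Toolbox; the sketch below shows the analogous workflow in Python with cvxpy. The matrix A and the Lyapunov-type condition A'P + PA < 0, P > 0 are illustrative stand-ins, not the paper's actual synchronization LMIs:

```python
import numpy as np
import cvxpy as cp

A = np.array([[-2.0, 1.0],
              [0.5, -3.0]])          # hypothetical error-system matrix
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),                     # P positive definite
               A.T @ P + P @ A << -eps * np.eye(n)]      # Lyapunov LMI

prob = cp.Problem(cp.Minimize(0), constraints)           # pure feasibility
prob.solve()
print("LMI feasible:", prob.status == cp.OPTIMAL)
```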

A Formal Implementation of Database Security

This paper investigates the implementation of a security mechanism in an object oriented database system. Formal methods play an essential role in computer security due to their powerful expressiveness and concise syntax and semantics. In this paper, both issues of specification and implementation in a database security environment are considered, and database security is achieved through the development of an efficient implementation of the specification without compromising its originality and expressiveness.

Compromise Ratio Method for Decision Making under Fuzzy Environment using Fuzzy Distance Measure

The aim of this paper is to adopt a compromise ratio (CR) methodology for the fuzzy multi-attribute single-expert decision making problem. In this paper, the rating of each alternative has been described by linguistic terms, which can be expressed as triangular fuzzy numbers. The compromise ratio method for fuzzy multi-attribute single-expert decision making has been considered here by taking a ranking index based on the concept that the chosen alternative should be as close as possible to the ideal solution and as far away as possible from the negative-ideal solution simultaneously. From a logical point of view, the distance between two triangular fuzzy numbers is also a fuzzy number, not a crisp value. Therefore a fuzzy distance measure, which is itself a fuzzy number, has been used here to calculate the difference between two triangular fuzzy numbers. In this paper, with the help of this fuzzy distance measure, it has been shown that the compromise ratio is a fuzzy number, which makes it easier for the decision maker to reach a decision. The computation principle and the procedure of the compromise ratio method have been described in detail in this paper. A comparative analysis of the compromise ratio method previously proposed [1] and the newly adopted method has been illustrated with two numerical examples.
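
The paper's specific fuzzy distance measure is defined in its text; as an illustration of the underlying idea (a distance between triangular fuzzy numbers that is itself a triangular fuzzy number rather than a crisp value), here is one simple interval-based construction. A TFN is represented as (a1, a2, a3) with a1 <= a2 <= a3:

```python
def fuzzy_distance(A, B):
    """Distance between TFNs A and B returned as a TFN (illustrative
    definition, not the paper's): modal value from the modal points,
    spread from the extreme interval differences at the alpha = 0 level."""
    a1, a2, a3 = A
    b1, b2, b3 = B
    mid = abs(a2 - b2)
    extremes = [abs(a1 - b3), abs(a3 - b1)]
    return (min(mid, *extremes), mid, max(extremes))

# Linguistic ratings "good" and "fair" as hypothetical TFNs:
print(fuzzy_distance((5, 7, 9), (3, 5, 7)))   # -> (2, 2, 6), a fuzzy number
```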

Simulated Annealing Algorithm for Data Aggregation Trees in Wireless Sensor Networks and Comparison with Genetic Algorithm

In ad hoc networks, the main issue in protocol design is quality of service; in wireless sensor networks, by contrast, the main constraint is the limited energy of the sensors. In fact, protocols that minimize the power consumption of sensors receive the most attention in wireless sensor networks. One approach to reducing energy consumption in wireless sensor networks is to reduce the number of packets transmitted in the network. Data aggregation, which combines related data and prevents the transmission of redundant packets, can be effective in reducing the number of transmitted packets. Since processing information consumes less power than transmitting it, data aggregation is of great importance and is therefore used in many protocols [5]. One data aggregation technique is the use of a data aggregation tree; however, finding an optimal data aggregation tree for collecting data in networks with one sink is an NP-hard problem. In the data aggregation technique, related information packets are combined in intermediate nodes into one packet, so the number of packets transmitted in the network is reduced, less energy is consumed, and the longevity of the network ultimately improves. Heuristic methods are used to tackle this NP-hard problem; one such optimization method is Simulated Annealing. In this article, we propose a new method for building the data aggregation tree in wireless sensor networks using the Simulated Annealing algorithm, and we evaluate its efficiency against a Genetic Algorithm.
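
A condensed sketch of the Simulated Annealing idea for building a data aggregation tree rooted at the sink. The node coordinates, the energy model (squared-distance link cost), and all annealing parameters are hypothetical; the paper's actual cost model and schedule may differ:

```python
import math
import random

random.seed(1)
N = 20
nodes = [(random.random(), random.random()) for _ in range(N)]  # node 0 = sink

def cost(parent):
    """Total transmission energy: sum of squared parent-link distances."""
    return sum((nodes[i][0] - nodes[parent[i]][0]) ** 2 +
               (nodes[i][1] - nodes[parent[i]][1]) ** 2
               for i in range(1, N))

def creates_cycle(parent, child, new_parent):
    """Reject a reassignment that would disconnect child from the sink."""
    p = new_parent
    while p != 0:
        if p == child:
            return True
        p = parent[p]
    return False

parent = [0] * N                      # initial tree: star centred on the sink
T, alpha = 1.0, 0.995                 # initial temperature and cooling rate
best = cur = cost(parent)
for _ in range(20000):
    i = random.randrange(1, N)        # neighbour move: re-attach one node
    q = random.randrange(0, N)
    if q == i or q == parent[i] or creates_cycle(parent, i, q):
        continue
    old = parent[i]
    parent[i] = q
    new = cost(parent)
    if new < cur or random.random() < math.exp((cur - new) / T):
        cur = new                     # accept (always if better, else by chance)
        best = min(best, cur)
    else:
        parent[i] = old               # revert the rejected move
    T *= alpha
print(f"best aggregation-tree cost found: {best:.3f}")
```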

Evaluation of Evolution Strategy, Genetic Algorithm and their Hybrid on Evolving Simulated Car Racing Controllers

Researchers have been applying artificial/computational intelligence (AI/CI) methods to computer games. In this research field, further research is required to compare AI/CI methods with respect to each game application. In this paper we present our experimental results on the comparison of three evolutionary algorithms – evolution strategy (ES), genetic algorithm (GA), and their hybrid – applied to evolving controller agents for the CIG 2007 Simulated Car Racing competition. Our experimental results show that premature convergence of solutions was observed in the case of ES, and GA outperformed ES in the last half of the generations. Moreover, a hybrid which uses GA first and ES next evolved the best solution among all solutions generated. This result shows the ability of GA to globally search promising areas in the early stage and the ability of ES to locally search the focused area (fine-tuning solutions).
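
A minimal sketch of the GA-first, ES-next hybrid on a toy fitness function. The controller parameters, fitness, and evaluation budget in the paper are specific to the CIG 2007 Simulated Car Racing task; everything below is an illustrative stand-in:

```python
import random

random.seed(0)
DIM, POP, GENS = 10, 20, 50

def fitness(x):                      # toy stand-in for driving performance
    return -sum(v * v for v in x)    # maximise (optimum at the origin)

def ga_step(pop):
    """One GA generation: tournament selection, crossover, mutation."""
    def tournament():
        return max(random.sample(pop, 3), key=fitness)
    nxt = []
    for _ in range(POP):
        a, b = tournament(), tournament()
        child = [random.choice(g) for g in zip(a, b)]       # uniform crossover
        if random.random() < 0.2:
            child[random.randrange(DIM)] += random.gauss(0, 0.5)  # mutation
        nxt.append(child)
    return nxt

def es_step(pop, sigma):
    """One (mu+lambda)-style ES generation with Gaussian perturbation."""
    offspring = [[v + random.gauss(0, sigma) for v in p] for p in pop]
    return sorted(pop + offspring, key=fitness, reverse=True)[:POP]

pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]
for g in range(GENS):                # hybrid: GA explores, then ES fine-tunes
    pop = ga_step(pop) if g < GENS // 2 else es_step(pop, sigma=0.1)
print("best fitness:", round(fitness(max(pop, key=fitness)), 4))
```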

Design and Implementation of Secure Electronic Payment System (Client)

A secure electronic payment system is presented in this paper. This electronic payment system is intended to be secure for clients such as customers and shop owners. The security architecture of the system is built around the RC5 encryption/decryption algorithm. This eliminates the fraud that occurs today with stolen credit card numbers. The symmetric key cryptosystem RC5 can protect conventional transaction data such as account numbers, amounts and other information. This process is carried out electronically using an RC5 encryption/decryption program written in Microsoft Visual Basic 6.0. There is no danger of any data sent within the system being intercepted and replaced. The alternative is to use the existing network and to encrypt all data transmissions. The system with encryption is acceptably secure, but the level of encryption has to be stepped up as computing power increases. To secure the system, the communication between modules is encrypted using the symmetric key cryptosystem RC5. The system uses a simple user name, password, user ID, user type and cipher authentication mechanism for identification when the user first enters the system; this is the most common method of authentication in computer systems.
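
The original implementation is in Visual Basic 6.0; as a sketch of the cipher itself, here is a from-scratch Python version of RC5-32/12/16 block encryption (word size 32 bits, 12 rounds, 16-byte key). It is illustrative only and omits padding, modes of operation, and key management; the key shown is a hypothetical example:

```python
W, R, B = 32, 12, 16                 # word size (bits), rounds, key bytes
MASK = (1 << W) - 1
P32, Q32 = 0xB7E15163, 0x9E3779B9    # RC5 magic constants for w = 32

def rotl(x, y):
    """Rotate the w-bit word x left by y (mod w) positions."""
    y %= W
    return ((x << y) | (x >> (W - y))) & MASK

def key_schedule(key):
    """Expand the secret key into the round-subkey table S."""
    u, c, t = W // 8, B // (W // 8), 2 * (R + 1)
    L = [int.from_bytes(key[i:i + u], "little") for i in range(0, B, u)]
    S = [(P32 + i * Q32) & MASK for i in range(t)]
    a = b = i = j = 0
    for _ in range(3 * max(t, c)):   # mix the secret key into S
        a = S[i] = rotl((S[i] + a + b) & MASK, 3)
        b = L[j] = rotl((L[j] + a + b) & MASK, a + b)
        i, j = (i + 1) % t, (j + 1) % c
    return S

def encrypt_block(S, a, b):
    """Encrypt one 64-bit block given as two 32-bit words (a, b)."""
    a = (a + S[0]) & MASK
    b = (b + S[1]) & MASK
    for r in range(1, R + 1):
        a = (rotl(a ^ b, b) + S[2 * r]) & MASK
        b = (rotl(b ^ a, a) + S[2 * r + 1]) & MASK
    return a, b

S = key_schedule(b"0123456789abcdef")          # hypothetical 16-byte key
print([hex(w) for w in encrypt_block(S, 0x11223344, 0x55667788)])
```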

Confirming the Identity of the Individual Using Remote Assessment in E-learning

One major issue that is regularly cited as a block to the widespread use of online assessments in eLearning is that of the authentication of the student and the level of confidence that an assessor can have that the assessment was actually completed by that student. Currently, this issue is either ignored, in which case confidence in the assessment and any ensuing qualification is damaged, or else assessments are conducted at central, controlled locations at specified times, losing the benefits of the distributed nature of the learning programme. Particularly as we move towards constructivist models of learning, with intentions towards achieving heutagogic learning environments, the benefits of a properly managed online assessment system are clear. Here we discuss some of the approaches that could be adopted to address these issues, looking at the use of existing security and biometric techniques, combined with some novel behavioural elements. These approaches offer the opportunity to validate the student on accessing an assessment, on submission, and also during the actual production of the assessment. These techniques are currently under development in the DECADE project, and future work will evaluate and report their use.

Evaluation of Guaiacol and Syringol Emission upon Wood Pyrolysis for some Fast Growing Species

Wood pyrolysis of Casuarina glauca, Casuarina cunninghamiana, Eucalyptus camaldulensis and Eucalyptus microtheca was carried out at 450°C at a heating rate of 2.5°C/min in a flowing N2 atmosphere. The Eucalyptus genus wood gave higher values of specific gravity, ash, total extractives, lignin, N2-liquid trap distillate (NLTD) and water trap distillate (WSP) than the Casuarina genus. The GHC of the NLTD was higher for the Casuarina genus than for the Eucalyptus genus, with the highest value for Casuarina cunninghamiana. Guaiacol, 4-ethyl-2-methoxyphenol and syringol were observed in the NLTD of all four wood species, reflecting their parent hardwood lignin origin. Eucalyptus camaldulensis wood had the highest lignin content (28.89%) and was pyrolyzed to the highest contents of phenolics (73.01%), guaiacol (11.2%) and syringol (32.28%) in the methylene chloride fraction (MCF) of the NLTD. Accordingly, recovery of syringol and guaiacol from Eucalyptus camaldulensis may become economically attractive.

An Agent Based Dynamic Resource Scheduling Model with FCFS-Job Grouping Strategy in Grid Computing

Grid computing is a group of clusters connected over high-speed networks that involves coordinating and sharing computational power, data storage and network resources operating across dynamic and geographically dispersed locations. Resource management and job scheduling are critical tasks in grid computing. Resource selection becomes challenging due to the heterogeneity and dynamic availability of resources. Job scheduling is an NP-complete problem, and different heuristics may be used to reach an optimal or near-optimal solution. This paper proposes a model for resource and job scheduling in a dynamic grid environment. The main focus is to maximize resource utilization and minimize the processing time of jobs. The grid resource selection strategy is based on a Max Heap Tree (MHT), which is well suited to large scale applications; the root node of the MHT is selected for job submission. The job grouping concept is used to maximize resource utilization in the scheduling of jobs in grid computing. The proposed resource selection model and job grouping concept are used to enhance the scalability, robustness, efficiency and load balancing ability of the grid.
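
A minimal sketch of the two ideas in the model: selecting the most capable resource from the root of a max-heap (Python's heapq is a min-heap, so capacities are negated), and grouping small jobs in FCFS order until they fill that resource's capacity. The resource capacities (MIPS) and job lengths (MI) are hypothetical:

```python
import heapq

resources = [("R1", 500), ("R2", 1200), ("R3", 800)]   # (name, MIPS)
heap = [(-mips, name) for name, mips in resources]
heapq.heapify(heap)                                    # root = fastest resource

neg_mips, name = heap[0]                               # MHT root for submission
capacity = -neg_mips                                   # job-group size limit

jobs = [150, 300, 90, 700, 400, 220]                   # job lengths in MI
group, length = [], 0
for job in jobs:                                       # FCFS job grouping
    if length + job > capacity and group:
        print(f"submit group {group} (total {length} MI) to {name}")
        group, length = [], 0
    group.append(job)
    length += job
if group:
    print(f"submit group {group} (total {length} MI) to {name}")
```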

Identify Features and Parameters to Devise an Accurate Intrusion Detection System Using Artificial Neural Network

The aim of this article is to explain how features of attacks can be extracted from packets. It also explains how vectors can be built and then applied to the input of any analysis stage. For the analysis, the work deploys a feedforward backpropagation neural network to act as a misuse intrusion detection system. It uses ten types of attacks as examples for training and testing the neural network. It explains how the packets are analyzed to extract features. The work shows how selecting the right features, building correct vectors, and correctly identifying the training methods and the number of nodes in the hidden layer of a neural network affect the accuracy of the system. In addition, the work shows how to obtain the values of optimal weights and use them to initialize the artificial neural network.
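
A minimal sketch of the analysis stage: feature vectors extracted from packets feed a feedforward network trained with backpropagation, and the hidden-layer size is varied. The synthetic vectors, the two hidden-layer sizes tried, and the ten-class label set are hypothetical stand-ins for the paper's attack types:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((1000, 12))                    # 12 per-packet features (fake)
y = rng.integers(0, 10, size=1000)            # 10 attack classes (fake)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for hidden in (16, 32):                       # hidden-node count matters
    clf = MLPClassifier(hidden_layer_sizes=(hidden,), max_iter=500,
                        random_state=0).fit(X_tr, y_tr)
    print(f"{hidden} hidden nodes: accuracy = {clf.score(X_te, y_te):.2f}")
```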

Research on Hybrid Neural Network in Intrusion Detection System

This paper presents an intrusion detection system based on a hybrid neural network model combining RBF and Elman networks. It is used for both anomaly detection and misuse detection. The model has a memory function and can detect discrete and related aggressive behaviour effectively. The RBF network is a real-time pattern classifier, while the Elman network provides the memory of former events. The intrusion detection system based on this hybrid model is evaluated on the DARPA data set, using ROC curves to display the test results intuitively. The experiments show that this hybrid-model intrusion detection system can effectively improve the detection rate and reduce the rates of false alarms and missed detections.
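
A short sketch of the ROC evaluation step mentioned above, applied to scores from any detector (here random placeholders) against ground-truth labels; the paper's own scores come from the RBF/Elman hybrid:

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(1)
y_true = rng.integers(0, 2, size=500)              # 1 = attack, 0 = normal
scores = y_true * 0.4 + rng.random(500) * 0.8      # hypothetical detector scores

fpr, tpr, _ = roc_curve(y_true, scores)            # false/true positive rates
print(f"area under the ROC curve: {auc(fpr, tpr):.2f}")
```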

Vortex Wake Formation and Its Effects on Thrust and Propulsive Efficiency of an Oscillating Airfoil

Flows over a harmonically oscillating NACA 0012 airfoil are simulated here using a two-dimensional, unsteady, incompressible Navier-Stokes solver. Both pure-plunging and combined pitching-plunging oscillations are considered at a Reynolds number of 5000. Special attention is paid to the vortex shedding and interaction mechanism of the motions. For all the simulations presented here, the reduced frequency (k) is fixed at a value of 2.5 and the plunging amplitude (h) is selected to be in the range of 0.2-0.5. The simulation results show that the interaction mechanism between the leading and trailing edge vortices has a decisive effect on the values of the resulting thrust and propulsive efficiency.
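
For reference, one common way to write the prescribed kinematics and the reduced frequency is sketched below; these normalisations are assumptions on our part, since the paper may define them differently (e.g. k = ωc/2U∞):

```latex
y(t) = h\,c\,\sin(\omega t), \qquad
\theta(t) = \theta_0 \sin(\omega t + \phi), \qquad
k = \frac{\omega c}{U_\infty} = 2.5
```

Here c is the chord length, U∞ the freestream speed, h = 0.2-0.5 the plunge amplitude in chords, and φ the phase between pitching and plunging.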

On Methodologies for Analysing Sickness Absence Data: An Insight into a New Method

Sickness absence represents a major economic and social issue. Analysis of sick leave data is a recurrent challenge to analysts because of the complexity of the data structure, which is often time dependent, highly skewed and clumped at zero. Ignoring these features when making statistical inference is likely to be inefficient and misguided, and traditional approaches do not address these problems. In this study, we discuss modelling methodologies in terms of statistical techniques for addressing the difficulties of sick leave data. We also introduce and demonstrate a new method by performing a longitudinal assessment of long-term absenteeism, using as a working example a large registration dataset from the Helsinki Health Study of municipal employees in Finland during the period 1990-1999. We present a comparative study on model selection and a critical analysis of the temporal trends and the occurrence and degree of long-term sickness absences among municipal employees. The strengths of this working example include the large sample size over a long follow-up period, providing strong evidence in support of the new model. Our main goal is to propose a way to select an appropriate model, to introduce a new methodology for analysing sickness absence data, and to demonstrate the model's applicability to complicated longitudinal data.
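
As a hedged illustration of one standard way to respect the "clumped at zero" feature (not the paper's new method, which is described in its text), the sketch below fits a zero-inflated Poisson model to simulated annual sick-leave counts; the data and the single age covariate are placeholders:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(0)
n = 2000
age = rng.uniform(25, 60, n)
never_absent = rng.random(n) < 0.4                 # structural zeros
days = np.where(never_absent, 0,
                rng.poisson(np.exp(0.02 * age - 0.2)))  # count component

X = sm.add_constant(age)
model = ZeroInflatedPoisson(days, X, exog_infl=sm.add_constant(age))
res = model.fit(maxiter=200, disp=False)
print(res.summary().tables[1])
```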

A Hybrid Approach for Quantification of Novelty in Rule Discovery

Rule discovery is an important technique for mining knowledge from large databases. The use of objective measures for discovering interesting rules leads to another data mining problem, although of reduced complexity. Data mining researchers have studied subjective measures of interestingness to reduce the volume of discovered rules and ultimately improve the overall efficiency of the KDD process. In this paper we study the novelty of discovered rules as a subjective measure of interestingness. We propose a hybrid approach that uses objective and subjective measures to quantify the novelty of discovered rules in terms of their deviations from known rules. We analyze the types of deviation that can arise between two rules and categorize the discovered rules according to a user-specified threshold. We implement the proposed framework and experiment with some public datasets. The experimental results are quite promising.
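
A small sketch of the deviation idea: compare a discovered rule with each known rule on its antecedent and consequent item sets, then bin it by a user-specified threshold. The scoring scheme below is illustrative, not the paper's exact formulation:

```python
def jaccard(a, b):
    """Similarity of two item sets (1.0 when both are empty)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def deviation(discovered, known):
    """1 - average similarity of antecedent and consequent parts."""
    ant, cons = discovered
    k_ant, k_cons = known
    return 1.0 - (jaccard(ant, k_ant) + jaccard(cons, k_cons)) / 2

known_rules = [({"bread", "butter"}, {"milk"})]
rule = ({"bread"}, {"milk"})
d = min(deviation(rule, k) for k in known_rules)   # nearest known rule
THRESHOLD = 0.3                                    # user-specified
print("novel" if d > THRESHOLD else "known/conforming", f"(deviation={d:.2f})")
```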

A Study of the Damages to Historical Monuments due to Climatic Factors and Air Pollution and Offering Solutions

Historical monuments, as architectural heritage, are considered economically and culturally one of the key assets of modern communities. Cultural heritage represents a country's national identity and pride and maintains and enriches that country's culture. Therefore, conservation of the monuments left by our ancestors requires everybody's serious and unremitting effort. Conservation, renewal, restoration, and technical study of cultural and historical artefacts have a special status among the various forms of art and science in the present century, and this is due to two reasons: firstly, the progress of humankind in this century has created a factor called environmental pollution, which not only has caused new destructive processes in cultural and historical monuments but also has accelerated the previous destructive processes several times over; and secondly, the rapid advance of various sciences, especially chemistry, has led to the contribution of new methods and materials to this significant issue.

Software Reliability Prediction Model Analysis

Software reliability prediction offers a great opportunity to measure the software failure rate at any point throughout system testing. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software; we focus on a software reliability model in this article, assuming that there is time redundancy, the value of which (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the given mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be regarded as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) for the transmission of an instruction sequence consisting of a random number of basic blocks. We consider the system software unreliable, with the time between adjacent failures following an exponential distribution.
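
A Monte Carlo sketch of the model described above: an instruction sequence of a random number of basic blocks is transmitted, failures arrive with exponentially distributed inter-failure times, and a failed block is retransmitted (time redundancy). All rates and durations are hypothetical, and the sketch simplifies by charging a full block time for each failed attempt:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t_block = 0.05, 1.0          # failure rate; per-block transmission time
mean_blocks = 8                   # mean of the (geometric) block count

def sequence_time():
    """Total time to transmit one instruction sequence, retransmitting any
    basic block whose transmission is interrupted by a failure."""
    total = 0.0
    for _ in range(rng.geometric(1 / mean_blocks)):   # random block count
        while rng.exponential(1 / lam) < t_block:     # failure mid-block:
            total += t_block                          # time lost, retransmit
        total += t_block                              # successful transmission
    return total

samples = np.array([sequence_time() for _ in range(20000)])
# Empirical quantiles of the transmission-time distribution function:
for q in (0.5, 0.9, 0.99):
    print(f"F^-1({q}) ~= {np.quantile(samples, q):.2f} time units")
```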

Reconstitute Information about Discontinued Water Quality Variables in the Nile Delta Monitoring Network Using Two Record Extension Techniques

The world economic crises and budget constraints have caused authorities, especially those in developing countries, to rationalize water quality monitoring activities. Rationalization consists of reducing the number of monitoring sites, the number of samples, and/or the number of water quality variables measured. The reduction in water quality variables is usually based on correlation: if two variables exhibit high correlation, it is an indication that some of the information produced may be redundant, so one variable can be discontinued while the other continues to be measured. Later, the ordinary least squares (OLS) regression technique is employed to reconstitute information about the discontinued variable, using the continuously measured one as an explanatory variable. In this paper, two record extension techniques are employed to reconstitute information about discontinued water quality variables: OLS and the Line of Organic Correlation (LOC). An empirical experiment is conducted using water quality records from the Nile Delta water quality monitoring network in Egypt. The record extension techniques are compared for their ability to predict different statistical parameters of the discontinued variables. Results show that OLS is better at estimating individual water quality records; however, the results indicate an underestimation of the variance in the extended records. The LOC technique is superior in preserving the characteristics of the entire distribution and avoids underestimation of the variance. It is concluded from this study that OLS can be used for the substitution of missing values, while LOC is preferable for inferring statements about the probability distribution.
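
A compact sketch contrasting the two record extension techniques on synthetic data (the Nile Delta records themselves are not reproduced here): OLS shrinks the predicted variance by the factor r², while LOC, with slope sign(r)·s_y/s_x, preserves it:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(10, 2, 300)                       # continued variable
y = 1.5 * x + rng.normal(0, 1.5, 300)            # discontinued variable

r = np.corrcoef(x, y)[0, 1]
b_ols = r * y.std() / x.std()                    # OLS slope
b_loc = np.sign(r) * y.std() / x.std()           # LOC slope

for name, b in (("OLS", b_ols), ("LOC", b_loc)):
    y_hat = y.mean() + b * (x - x.mean())        # line through the means
    print(f"{name}: slope={b:.3f}, var(y_hat)={y_hat.var():.2f} "
          f"(var(y)={y.var():.2f})")
```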