Stochastic Simulation of Reaction-Diffusion Systems

Reaction-diffusion systems are mathematical models that describe how the concentration of one or more substances distributed in space changes under the influence of local chemical reactions, in which the substances are converted into each other, and of diffusion, which causes the substances to spread out in space. The classical representation of a reaction-diffusion system is given by semi-linear parabolic partial differential equations, whose general form is ∂X(x, t)/∂t = DΔX(x, t) + R(X), where X(x, t) is the state vector, D is the matrix of diffusion coefficients, Δ is the Laplace operator and R(X) accounts for the local reactions. If the solutes move in a homogeneous system in thermal equilibrium, the diffusion coefficients are constants that depend neither on the local concentrations of solvent and solutes nor on the local temperature of the medium. In this paper a new stochastic reaction-diffusion model is presented in which the diffusion coefficients are functions of the local concentration, viscosity and frictional forces of solvent and solute. Such a model provides a more realistic description of molecular kinetics in non-homogeneous and highly structured media such as the intra- and inter-cellular spaces. The movement of a molecule A from a region i to a region j of the space is described as a first-order reaction Ai → Aj, where the rate constant k depends on the diffusion coefficient. Representing diffusional motion as a chemical reaction allows a reaction-diffusion system to be treated as a pure reaction system and simulated with Gillespie-inspired stochastic simulation algorithms. The stochastic time evolution of the system is given by the occurrence of diffusion events and chemical reaction events. At each time step an event (reaction or diffusion) is selected from a probability distribution of waiting times determined by the specific rates of the reaction and diffusion events. Redi is the software tool developed to implement this model of reaction-diffusion kinetics and dynamics. It is free software that can be downloaded from http://www.cosbi.eu. To demonstrate the validity of the new reaction-diffusion model, simulation results obtained with Redi for chaperone-assisted protein folding in the cytoplasm are reported. This case study is attracting renewed attention from the scientific community owing to current interest in protein aggregation as a potential cause of neurodegenerative diseases.
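
The event-selection step described above follows the familiar Gillespie direct-method pattern. The following is a minimal Python sketch of that pattern, not the Redi implementation: it treats two compartments, a conversion reaction A → B in each, and diffusion of A between compartments as first-order events; the rate constants and species names are hypothetical placeholders.

```python
import math
import random

# Minimal Gillespie-style sketch of a two-compartment reaction-diffusion system.
# Rate constants are hypothetical; Redi's concentration-, viscosity- and
# friction-dependent diffusion coefficients are not modeled here.
state = {"A1": 100, "B1": 0, "A2": 0, "B2": 0}
k_react = 0.1   # A -> B inside each compartment
k_diff = 0.05   # A(i) -> A(j), diffusion treated as a first-order reaction

def propensities(s):
    return [
        ("react1", k_react * s["A1"], {"A1": -1, "B1": +1}),
        ("react2", k_react * s["A2"], {"A2": -1, "B2": +1}),
        ("diff12", k_diff * s["A1"], {"A1": -1, "A2": +1}),
        ("diff21", k_diff * s["A2"], {"A2": -1, "A1": +1}),
    ]

t, t_end = 0.0, 50.0
while t < t_end:
    events = propensities(state)
    a_total = sum(a for _, a, _ in events)
    if a_total == 0:
        break
    # Waiting time to the next event (reaction or diffusion) is exponential.
    t += -math.log(1.0 - random.random()) / a_total
    # Pick one event with probability proportional to its propensity.
    r = random.random() * a_total
    for _, a, change in events:
        if r < a:
            for species, delta in change.items():
                state[species] += delta
            break
        r -= a

print(t, state)
```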

Applying Eyring's Accelerated Life Testing Model to "Times to Breakdown" of Insulating Fluid: A Combined Approach of Accelerated and Sequential Life Testing

In this paper, the purpose of the test is to assess whether or not the accelerated model proposed by Eyring is able to translate results for the shape and scale parameters of an underlying Weibull model, obtained under two accelerated use conditions, to the expected values of these parameters under the normal use condition. The product being analyzed is a new type of insulating fluid, and the accelerating factor is the voltage stress applied to the fluid at two different levels (30 kV and 40 kV). The normal operating voltage is 25 kV. In this case, it was also possible to test the insulating fluid at the normal operating voltage. The results for the two Weibull parameters obtained under the normal use condition and those translated from the accelerated use conditions to the normal condition are compared with each other to assess the accuracy of the Eyring model when the only accelerating factor is the voltage stress.
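
As an illustration of the translation step only (not the sequential test procedure of the paper), the sketch below assumes one common single-stress Eyring life-stress form, L(V) = (A/V)·exp(B/V), and the usual assumption that stress rescales the Weibull scale parameter while leaving the shape parameter unchanged; the parameter values and the estimate at 40 kV are hypothetical.

```python
import math

# Hypothetical Eyring parameters for the life-stress relationship
# L(V) = (A / V) * exp(B / V); values are illustrative only.
A, B = 5000.0, 60.0

def eyring_life(v_kv):
    """Characteristic life predicted by the single-stress Eyring model."""
    return (A / v_kv) * math.exp(B / v_kv)

def translate_scale(eta_stress, v_stress, v_use):
    """Translate a Weibull scale parameter estimated at an accelerated
    voltage to the normal-use voltage via the Eyring acceleration factor.
    The shape parameter is assumed unchanged by the stress level."""
    af = eyring_life(v_use) / eyring_life(v_stress)
    return eta_stress * af

# Example: scale parameter estimated at 40 kV, translated to the 25 kV use condition.
eta_40 = 10.0   # hypothetical estimate (e.g., minutes to breakdown)
print(translate_scale(eta_40, v_stress=40.0, v_use=25.0))
```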

Supercritical Fluid Extraction of Lutein Esters from Marigold Flowers and their Hydrolysis by Improved Saponification and Enzyme Biocatalysis

Lutein is a dietary oxycarotenoid that has been found to reduce the risk of Age-related Macular Degeneration (AMD). Supercritical fluid extraction of lutein esters from marigold petals was carried out and was found to be much more effective than conventional solvent extraction. The saponification of pre-concentrated lutein esters to produce free lutein was studied; the product contained about 88% total carotenoids (UV-VIS spectrophotometry) and 90.7% lutein (HPLC). The lipase-catalyzed hydrolysis of lutein esters in conventional medium was also investigated. The optimal temperature, pH, enzyme concentration and water activity were found to be 50°C, 7, 15% and 0.33, respectively, and the activity loss of the lipase was about 25% after eight reuses at 50°C over 12 days. However, the lipase-catalyzed hydrolysis of lutein esters in conventional media resulted in poor conversion (16.4%).

An Algorithm for Autonomous Aerial Navigation using MATLAB® Mapping Toolbox

In the present era of aviation technology, autonomous navigation and control have emerged as a prime area of active research. Owing to the tremendous developments in the field, autonomous controls have led today's engineers to claim that the future of aerospace vehicles is unmanned. Development of guidance and navigation algorithms for an unmanned aerial vehicle (UAV) is an extremely challenging task, which requires efforts to meet strict, and at times conflicting, goals of guidance and control. In this paper, aircraft altitude and heading controllers and an efficient algorithm for self-governing navigation using the MATLAB® Mapping Toolbox are presented; the algorithm also enables loitering of a fixed-wing UAV over a specified area. For this purpose, a nonlinear mathematical model of a UAV is used. The nonlinear model is linearized around a stable trim point and decoupled for controller design. The linear controllers are tested on the nonlinear aircraft model, and the navigation algorithm is subsequently developed for autonomous flight of the UAV. Results are presented for the trajectory controllers and waypoint-based navigation. Our investigation reveals that the MATLAB® Mapping Toolbox can be exploited to deliver an efficient algorithm for autonomous aerial navigation of a UAV.
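
The waypoint-based navigation logic can be illustrated independently of MATLAB and of the authors' controllers. The Python sketch below is a generic illustration, not the paper's algorithm: it computes a heading command toward the active waypoint from the great-circle bearing and switches waypoints inside an acceptance radius; the waypoint list and the radius are hypothetical.

```python
import math

# Generic waypoint-following sketch (illustrative only, not the paper's algorithm).
# Positions are (latitude, longitude) in degrees; distances use a spherical Earth.
R_EARTH = 6371000.0          # mean Earth radius, m
ACCEPT_RADIUS = 200.0        # hypothetical waypoint acceptance radius, m

def bearing_deg(p1, p2):
    """Initial great-circle bearing from p1 to p2, in degrees [0, 360)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def distance_m(p1, p2):
    """Great-circle distance between p1 and p2, in metres (haversine)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = math.sin((lat2 - lat1) / 2) ** 2 + \
        math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2
    return 2 * R_EARTH * math.asin(math.sqrt(a))

def heading_command(position, waypoints, index):
    """Return (heading command in degrees, index of the active waypoint)."""
    if distance_m(position, waypoints[index]) < ACCEPT_RADIUS:
        index = min(index + 1, len(waypoints) - 1)   # switch to the next waypoint
    return bearing_deg(position, waypoints[index]), index

# Hypothetical usage: current UAV position and a small waypoint list.
wps = [(33.60, 73.10), (33.65, 73.15), (33.70, 73.10)]
print(heading_command((33.59, 73.09), wps, 0))
```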

Turkic-Indian Lexical Parallels in the Framework of the Nostratic Macrofamily of Languages

Since ancient times, Turkic languages have been in contact with numerous representatives of different language families. The article discusses Turkic-Indian language contact, shows the promise and necessity of this line of research for Turkic linguistics, and presents Turkic-Indian lexical parallels in the framework of the Nostratic macrofamily of languages. The research is based on lexical parallels (LP) between Turkic languages (which belong to the Altaic family) and Indian languages (including Dravidian and Indo-Aryan languages).

Bioethanol - A Viable Answer to India's Surging Energy Needs

India is currently the second most populous nation in the world, with over 1.2 billion people and an annual population growth rate of 1.5%. It is experiencing a surge in energy demand, which is expected to grow three to four times over the next 25 years. Most of the energy requirements are currently satisfied by the import of fossil fuels: coal, petroleum-based products and natural gas. Biofuels can satisfy these energy needs in an environmentally benign and cost-effective manner while reducing dependence on imported fossil fuels, thus providing national energy security. Among the various forms of bioenergy, bioethanol is one of the major options for India because of the availability of feedstock crops. This paper presents an overview of bioethanol production and technology and of the steps taken by the Indian government to facilitate and bring about optimal development and utilization of indigenous biomass feedstocks for production of this biofuel.

Using Keystroke Dynamics for Personal Security Systems

This paper presents an approach to biometric authentication through keystroke dynamics, which aims to identify a person by his or her habitual typing rhythm on a conventional keyboard. Seven experiments were carried out, varying the number of prototypes, the threshold, the features, and the choice of timing measures in the feature vector. The results show that keystroke dynamics is a simple and efficient means of personal authentication, with the best result obtained using 90% of the features: 4.44% FRR and 0% FAR.
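
As a hedged illustration of the general technique (not the authors' exact feature set, distance measure, or threshold), the sketch below builds a timing feature vector from key press/release events, averages several enrollment samples into a prototype, and accepts a login attempt if its deviation from the prototype falls below a threshold; all timing values and parameters are hypothetical.

```python
# Illustrative keystroke-dynamics verification sketch (not the paper's exact method).
# Features: per-key dwell times (release - press) and flight times between keys.

def features(events):
    """events: list of (key, press_time, release_time) in seconds, in typing order."""
    dwell = [rel - pre for _, pre, rel in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwell + flight

def prototype(samples):
    """Element-wise mean of several enrollment feature vectors."""
    return [sum(vals) / len(vals) for vals in zip(*samples)]

def accept(attempt, proto, threshold=0.08):
    """Accept if the mean absolute deviation from the prototype is below a (hypothetical) threshold."""
    mad = sum(abs(a - p) for a, p in zip(attempt, proto)) / len(proto)
    return mad < threshold

# Hypothetical enrollment (three typings of the same password) and one login attempt.
enroll = [
    features([("p", 0.00, 0.09), ("a", 0.15, 0.23), ("s", 0.31, 0.40)]),
    features([("p", 0.00, 0.10), ("a", 0.16, 0.24), ("s", 0.33, 0.41)]),
    features([("p", 0.00, 0.08), ("a", 0.14, 0.22), ("s", 0.30, 0.39)]),
]
proto = prototype(enroll)
attempt = features([("p", 0.00, 0.09), ("a", 0.15, 0.24), ("s", 0.32, 0.40)])
print(accept(attempt, proto))
```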

Contingent Pay and Experience with Its Utilization by Companies in One of the Czech Republic's Regions

Besides basic wages or salary, employee benefits and intangible remuneration, one part of an employee's total reward is so-called contingent (variable) pay. Contingent pay is linked to the performance, contribution, competency or skills of individual employees, to team or company-wide performance, or to a combination of these. Remuneration based on length of employment is sometimes also counted as contingent pay, even though the financial reward is connected not to performance or skills but to the length of continuous employment, either in one working position or at one level of the remuneration scale. The main aim of this article is to define contingent pay on the basis of the available literature, to describe its individual forms, their advantages and disadvantages, and the possibilities for their use in practice, and also to report on the extent and level of utilization of contingent pay by companies in one of the Czech Republic's regions, together with their practical experience with this type of remuneration.

Identification of Individual Objects in the Intelligent Assembly Cell

This contribution presents a complete design for the identification of individual objects in the workplace of an intelligent assembly cell. The intelligent assembly cell is situated at the Institute of Manufacturing Systems and Applied Mechanics and is used for pneumatic actuator assembly. The pneumatic actuator components are the pneumatic roller, cover, piston and spring. Two alternatives for object identification in the workplace of the industrial robot are designed for the assembly task. The alternatives are evaluated and the more suitable one, a 2D code reader, is selected. The design of individual object identification builds on knowledge of intelligent manufacturing systems. Intelligent assembly and manufacturing systems, as systems of a new generation, are gradually being introduced into mechanical production; they remove human operations from the production process and also shorten production times.

Entropy Based Spatial Design: A Genetic Algorithm Approach (Case Study)

We study the spatial design of experiments, in which the goal is to select a most informative subset of prespecified size from a set of correlated random variables. The problem arises in many applied domains, such as meteorology, environmental statistics, and statistical geology. In these applications, observations can be collected at different locations and possibly at different times. In spatial design, when the design region and the set of interest are discrete, the covariance matrix completely describes the objective function, and our goal is to choose a feasible design that minimizes the resulting uncertainty. The problem is recast as that of maximizing the determinant of the covariance matrix of the chosen subset, which is NP-hard. When such designs are used in computer experiments, the design space is in many cases very large and the exact optimal solution cannot be computed. Heuristic optimization methods can discover efficient experiment designs in situations where traditional designs cannot be applied, exchange methods are ineffective, and an exact solution is not possible. We developed a genetic algorithm (GA) to take advantage of its exploratory power, and we demonstrate the successful application of this method in a large design space. We consider a real design-of-experiment case in which the design space is very large and solve it with the proposed GA.
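
A minimal sketch of the underlying idea is given below, under the usual assumptions that a covariance matrix over the candidate locations is known and that subset quality is measured by the log-determinant of its covariance submatrix. The covariance model, the GA operators and the parameters are generic illustrative choices, not necessarily those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical covariance over n candidate locations (exponential decay with distance).
n, k = 40, 6                                   # candidates, subset size
pts = rng.uniform(0, 10, size=(n, 2))
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
cov = np.exp(-d / 3.0)

def fitness(subset):
    """Entropy-style criterion: log-determinant of the covariance submatrix."""
    sub = cov[np.ix_(subset, subset)]
    return np.linalg.slogdet(sub)[1]

def mutate(subset):
    """Swap one chosen location for an unchosen one."""
    child = subset.copy()
    out = rng.integers(k)
    candidates = [i for i in range(n) if i not in child]
    child[out] = rng.choice(candidates)
    return child

# Simple (mu + lambda)-style GA with mutation only; operators are generic choices.
pop = [rng.choice(n, size=k, replace=False).tolist() for _ in range(20)]
for gen in range(200):
    children = [mutate(p) for p in pop]
    pop = sorted(pop + children, key=fitness, reverse=True)[:20]

best = pop[0]
print(sorted(best), fitness(best))
```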

Performance Modeling for Web-based J2EE and .NET Applications

When architecting an application, key non-functional requirements such as performance, scalability, availability and security, which influence the architecture of the system, are sometimes not adequately addressed. Performance of the application may not be considered until there is a concern. There are several problems with this reactive approach: if the system does not meet its performance objectives, the application is unlikely to be accepted by the stakeholders. This paper suggests an approach to performance modeling for web-based J2EE and .NET applications that addresses performance issues early in the development life cycle. It also includes a performance modeling case study, with proof-of-concept (PoC) and implementation details for the .NET and J2EE platforms.
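
To make the idea of early performance modeling concrete, the sketch below applies textbook operational laws (the utilization law, an M/M/1 residence-time approximation, and Little's law) to hypothetical workload numbers. It is a generic illustration of how capacity can be estimated before load testing, not the modeling approach or the case-study figures of the paper; all resource names and values are assumptions.

```python
# Generic analytic performance sketch (illustrative only; not the paper's model or data).
# Each resource of a web application tier is approximated as an M/M/1 queue.

arrival_rate = 50.0          # hypothetical requests/second at peak
service_demands = {          # hypothetical seconds of service per request
    "web_cpu": 0.008,
    "app_cpu": 0.012,
    "db_disk": 0.015,
}

for resource, demand in service_demands.items():
    utilization = arrival_rate * demand            # utilization law: U = X * D
    if utilization >= 1.0:
        print(f"{resource}: saturated (U = {utilization:.2f}), will not meet objectives")
        continue
    residence = demand / (1.0 - utilization)       # M/M/1 residence time: R = D / (1 - U)
    queue_len = arrival_rate * residence           # Little's law: N = X * R
    print(f"{resource}: U = {utilization:.2f}, R = {residence * 1000:.1f} ms, N = {queue_len:.1f}")

# End-to-end response time estimate: sum of per-resource residence times.
total = sum(d / (1.0 - arrival_rate * d) for d in service_demands.values()
            if arrival_rate * d < 1.0)
print(f"estimated response time ~ {total * 1000:.1f} ms at {arrival_rate:.0f} req/s")
```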

Three-Dimensional Simulation of Free Electron Laser with Prebunching and Efficiency Enhancement

Three-dimensional simulation of harmonic up-generation in a free electron laser amplifier operating simultaneously at the fundamental and a harmonic with a cold, relativistic electron beam is presented in the steady-state regime, where the slippage of the electromagnetic wave with respect to the electron beam is ignored. By using the slowly varying envelope approximation and applying the source-dependent expansion to the wave equations, the electromagnetic fields are represented in terms of Hermite-Gaussian modes, which are well suited to the planar wiggler configuration. The electron dynamics is described by the fully three-dimensional Lorentz force equation in the presence of the realistic planar magnetostatic wiggler and the electromagnetic fields. A set of coupled nonlinear first-order differential equations is derived and solved numerically. The fundamental and third-harmonic radiation of the beam is considered. In addition to a uniform beam, a prebunched electron beam has also been studied; for this purpose, the effect of a sinusoidal distribution of electron entry times on the evolution of the radiation is compared with that of a uniform distribution. It is shown that prebunching reduces the saturation length substantially. For efficiency enhancement, the wiggler amplitude is set to decrease linearly once the third-harmonic radiation saturates. The optimum starting point and slope of the wiggler-amplitude taper are found by successive runs of the code.
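
The numerical pattern described above, coupled first-order ODEs for the particle phases, energies and field amplitude integrated along the undulator, can be illustrated with the standard one-dimensional high-gain FEL equations in universal scaling. The sketch below uses that 1D model only as a stand-in for the paper's three-dimensional Hermite-Gaussian formulation; particle numbers, seed amplitude and step size are arbitrary illustrative choices.

```python
import numpy as np

# 1D high-gain FEL equations in universal scaling, used here only as a stand-in
# for the paper's 3D Hermite-Gaussian model:
#   dtheta_j/dz = p_j
#   dp_j/dz     = -(A * exp(i*theta_j) + conj(A) * exp(-i*theta_j))
#   dA/dz       = <exp(-i*theta_j)>
n_e = 512                                      # macro-particles
rng = np.random.default_rng(1)

def derivs(state):
    theta, p, a = state[:n_e], state[n_e:2 * n_e], state[-1]
    dtheta = p
    dp = -(a * np.exp(1j * theta) + np.conj(a) * np.exp(-1j * theta))
    da = np.mean(np.exp(-1j * theta))
    return np.concatenate([dtheta, dp, [da]])

def rk4_step(state, dz):
    k1 = derivs(state)
    k2 = derivs(state + 0.5 * dz * k1)
    k3 = derivs(state + 0.5 * dz * k2)
    k4 = derivs(state + dz * k3)
    return state + dz * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

# Initial conditions: nearly uniform phases (a prebunched beam would modulate
# these sinusoidally), cold beam (p = 0), small seed field.
theta0 = np.linspace(0, 2 * np.pi, n_e, endpoint=False) + 1e-4 * rng.standard_normal(n_e)
state = np.concatenate([theta0, np.zeros(n_e), [1e-4 + 0j]]).astype(complex)

dz, power = 0.05, []
for _ in range(400):
    state = rk4_step(state, dz)
    power.append(abs(state[-1]) ** 2)

print("saturated |A|^2 ~", max(power))
```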

GPS/INS Integration Application in Flight Management System

A flight management system (FMS) is a specialized computer system that automates a wide variety of in-flight tasks, reducing the workload on the flight crew to the point that modern aircraft no longer carry flight engineers or navigators. The primary function of the FMS is in-flight management of the flight plan, using various sensors (such as GPS and INS, often backed up by radio navigation) to determine the aircraft's position. From the cockpit, the FMS is normally controlled through a Control Display Unit (CDU), which incorporates a small screen and keyboard or a touch screen. This paper investigates the performance of GPS/INS integration techniques in which the data fusion is done by Kalman filtering. This includes the importance of sensor calibration as well as the alignment of the strapdown inertial navigation system. The limitations of inertial navigation systems are investigated in order to understand why the INS is sometimes integrated with other navigation aids rather than operated in standalone mode. Finally, both the loosely coupled and tightly coupled configurations are analyzed for several types of situations and operational conditions.
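
The data-fusion step can be illustrated with a heavily simplified loosely coupled scheme: a one-dimensional constant-velocity Kalman filter that propagates an INS-style position/velocity estimate and corrects it with occasional GPS position fixes. This is a generic textbook sketch under hypothetical noise settings and update rates, not the paper's filter design.

```python
import numpy as np

rng = np.random.default_rng(2)

# 1D loosely coupled GPS/INS-style fusion sketch (illustrative only).
# State x = [position, velocity]; INS-like prediction, GPS position updates.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity transition
Q = np.diag([0.01, 0.05])                   # hypothetical process noise (INS drift)
H = np.array([[1.0, 0.0]])                  # GPS measures position only
R = np.array([[4.0]])                       # hypothetical GPS noise variance (m^2)

x = np.array([0.0, 10.0])                   # initial estimate
P = np.eye(2)

true_pos, true_vel = 0.0, 10.0
for step in range(1, 201):
    true_pos += true_vel * dt
    # Prediction (a full INS mechanization would replace F @ x here).
    x = F @ x
    P = F @ P @ F.T + Q
    # GPS fix arrives at 1 Hz (every 10th step).
    if step % 10 == 0:
        z = np.array([true_pos + rng.normal(0.0, 2.0)])
        y = z - H @ x                        # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P

print("estimated position/velocity:", x, "true position:", true_pos)
```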

Analysis of Classifications of Unsolicited Bulk Emails

In recent times, the problem of Unsolicited Bulk Email (UBE), commonly known as spam email, has grown at a tremendous rate. We present a survey-based analysis of the classifications of UBE proposed in various research works. There are many research instances of classification between spam and non-spam emails, but very few are available for the classification of spam emails per se. This paper does not assert that one UBE classification is better than the others, nor does it propose a new classification; rather, it highlights the lack of agreement on the number and definition of the categories proposed by different researchers. The paper also elaborates on factors such as the intent of the spammer, the content of the UBE, and the ambiguity among the categories proposed in related research on UBE classification.

Population Trend of Canola Aphid, Lipaphis erysimi (Kalt.) (Homoptera: Aphididae) and Its Associated Natural Enemies in Different Brassica Lines along with the Effect of Gamma Radiation on Their Population

Studies to determine the population trend of Lipaphis erysimi (Kalt.) and its associated natural enemies in different Brassica lines, along with the effect of gamma radiation on their populations, were conducted at the Agricultural Research Farm, Malakandher, Khyber Pakhtunkhwa Agricultural University Peshawar, during spring 2006. Three different Brassica lines, F6B3, F6B6 and F6B7, were used and replicated four times in a Randomized Complete Block Design. The data revealed that aphid infestation started in all three varieties during the last week of February 2006 (first observation). The peak population of 4.39 aphids per leaf was recorded during the 2nd week of March, and the lowest population of 1.02 aphids per leaf was recorded during the 5th week of March. The ladybird beetle (Coccinella septempunctata) and the syrphid fly (Syrphus balteatus) first appeared on 24th February, with mean numbers of 0.40 ladybird beetles per leaf and 0.87 syrphid flies per leaf, respectively. At the time when the aphid population started to increase, the peak populations of C. septempunctata (0.70 ladybird beetles per leaf) and S. balteatus (1.04 syrphid flies per leaf) were recorded in the 2nd week of March. Chrysoperla carnea appeared in the 1st week of March, and its peak population of 1.46 C. carnea per leaf was recorded during the 3rd week of March. Among the Brassica lines, F6B7 showed comparatively more resistance than F6B3 and F6B6. F6B3 showed the least resistance against L. erysimi and was found to be the most susceptible cultivar. F6B7 was also found superior in terms of natural enemies: the maximum numbers of all natural enemies were recorded on this variety, followed by F6B6, while the lowest numbers were recorded on F6B3. No significant effect of gamma radiation was recorded on the populations of aphids or natural enemies or on the varieties.

Optimization of Transmitter Aperture by Genetic Algorithm in Optical Satellite

To establish optical communication between any two satellites, the transmitter satellite must track the beacon of the receiver satellite and point the information optical beam in its direction. Optical tracking and pointing systems for free space suffer during tracking from high-amplitude vibration caused by background radiation from celestial objects such as the Sun, Moon, Earth, and stars in the tracking field of view, or by mechanical impacts from satellite internal and external sources. The vibrations of beam pointing increase the bit error rate and jam communication between the two satellites. One way to overcome this problem is the use of very small transmitter beam divergence angles; the drawback of too narrow a divergence angle is that the transmitter beam may sometimes miss the receiver satellite because of pointing vibrations. In this paper we propose the use of a genetic algorithm to optimize the BER as a function of the transmitter optics aperture.

A New Approach for Recoverable Timestamp Ordering Schedule

A new approach to the timestamp ordering problem in serializable schedules is presented. Since the number of database users is increasing rapidly, accuracy and high throughput are main topics in the database area. Strict 2PL does not allow all possible serializable schedules and therefore does not yield high throughput. The main advantages of the approach are the ability to enforce recoverable execution of transactions and the high achievable performance of concurrent execution in central databases. Compared to Strict 2PL, the general structure of the algorithm is simple and deadlock-free, and it allows all possible serializable schedules to be executed, which results in high throughput. Various examples covering different orders of database operations are discussed.
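
For context, the basic timestamp-ordering rules that the proposed approach builds on can be sketched as follows. This is the textbook protocol, which guarantees serializability but not recoverability; the paper's recoverability-enforcing algorithm is not reproduced here.

```python
# Textbook basic timestamp-ordering rules (not the paper's recoverable variant).
class Item:
    def __init__(self):
        self.read_ts = 0    # largest timestamp that has read this item
        self.write_ts = 0   # largest timestamp that has written this item

def read(item, ts):
    """Return True if the read is allowed, False if the transaction must abort."""
    if ts < item.write_ts:           # a younger transaction already wrote the item
        return False
    item.read_ts = max(item.read_ts, ts)
    return True

def write(item, ts):
    """Return True if the write is allowed, False if the transaction must abort."""
    if ts < item.read_ts or ts < item.write_ts:
        return False                 # a younger transaction already read or wrote it
    item.write_ts = ts
    return True

# Example: T1 (ts=1) reads X, T2 (ts=2) writes X, then T1 tries to write X and must abort.
x = Item()
print(read(x, 1), write(x, 2), write(x, 1))   # True True False
```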

A Study of Dynamic Clustering Method to Extend the Lifetime of Wireless Sensor Network

In recent years, research on wireless sensor networks has increased steadily, and many studies have focused on reducing the energy consumption of sensor nodes to extend their lifetimes. In this paper, the issue of energy consumption is investigated and two adaptive mechanisms are proposed to extend the network lifetime. This study uses a high-energy-first scheme to determine the cluster heads for data transmission; thus, energy consumption in each cluster is balanced and the network lifetime can be extended. In addition, this study uses cluster merging and dynamic routing mechanisms to further reduce energy consumption during data transmission. The simulation results show that the proposed method can effectively extend the lifetime of a wireless sensor network and is suitable for different base station locations.
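
The high-energy-first idea can be illustrated with a short sketch: in each round, the node with the highest residual energy in every cluster becomes the cluster head, so the energy drain is rotated across nodes. The cluster assignments and energy figures below are hypothetical placeholders, not the paper's protocol parameters.

```python
# Illustrative high-energy-first cluster-head rotation (parameters are hypothetical).
from collections import defaultdict

# node_id -> (cluster_id, residual energy in joules)
nodes = {
    1: ("A", 1.8), 2: ("A", 2.3), 3: ("A", 0.9),
    4: ("B", 1.1), 5: ("B", 1.6), 6: ("B", 1.5),
}

def elect_heads(nodes):
    """Pick the node with the highest residual energy in each cluster."""
    clusters = defaultdict(list)
    for node_id, (cluster_id, energy) in nodes.items():
        clusters[cluster_id].append((energy, node_id))
    return {cluster_id: max(members)[1] for cluster_id, members in clusters.items()}

print(elect_heads(nodes))   # e.g. {'A': 2, 'B': 5}
```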

Role of Director's Philosophical Approach in Cinematographic Expression

The original idea for a feature film may come from a writer, a director or a producer. The director is the person responsible for the creative aspects, both interpretive and technical, of a motion picture production. The director may be shown discussing the project with his or her co-writers, members of the production staff, and the producer, or selecting locales and constructing sets. All these activities provide, of course, ways of externalizing the director's ideas about the film. A director sometimes pushes both the film image and the techniques of narration to new artistic limits, but the director's main responsibility is to lead the spectator to an original point of view through his or her philosophical approach. The director tries to find an artistic angle in every scene, turns the screenplay into an effective story, and sets the film on a spiritual and philosophical base.

An Intelligent System for Phish Detection Using Dynamic Analysis and Template Matching

Phishing, or the stealing of sensitive information on the web, has dealt a major blow to Internet security in recent times. Most of the existing anti-phishing solutions fail to handle the fuzziness involved in phish detection, thus leading to a large number of false positives. This fuzziness is attributed to the use of the highly flexible and, at the same time, highly ambiguous HTML language. We introduce a new perspective on phishing that tries to systematically determine whether a given page is phished or not, using the corresponding original page as the basis of comparison. It analyzes the layout of the pages under consideration to determine the percentage distortion between them, indicative of any form of malicious alteration. The system design represents an intelligent system employing dynamic assessment, which accurately identifies brand-new phishing attacks and proves effective in reducing the number of false positives. This framework could potentially be used as a knowledge base for educating Internet users about phishing.
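
As a hedged illustration of the layout-comparison idea (not the system's actual algorithm), the sketch below represents each page as a set of element bounding boxes and reports a percentage of layout distortion as the mean relative displacement and size change of matched elements, with a full penalty for unmatched ones; the element names and coordinates are hypothetical.

```python
# Illustrative layout-distortion measure between two pages (not the paper's algorithm).
# Each page is a dict: element id -> (x, y, width, height) of its rendered bounding box.

original = {"logo": (10, 10, 120, 40), "login_form": (200, 150, 300, 180), "footer": (0, 700, 800, 60)}
suspect  = {"logo": (12, 10, 120, 40), "login_form": (205, 155, 310, 185), "footer": (0, 690, 800, 60)}

def distortion_percent(page_a, page_b):
    """Mean relative difference over matched elements, plus full penalty for missing ones."""
    keys_a, keys_b = set(page_a), set(page_b)
    scores = [1.0] * len(keys_a ^ keys_b)          # unmatched elements count as 100% distorted
    for key in keys_a & keys_b:
        a, b = page_a[key], page_b[key]
        diffs = [abs(ai - bi) / max(abs(ai), abs(bi), 1) for ai, bi in zip(a, b)]
        scores.append(sum(diffs) / len(diffs))
    return 100.0 * sum(scores) / len(scores)

score = distortion_percent(original, suspect)
print(f"layout distortion ~ {score:.1f}%")        # a high score suggests malicious alteration
```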