Visualisation Techniques Connecting VRML and GENESIS Environments

We created a tool that combines the powerful GENESIS (GEneral NEural SImulation System) simulation language with up-to-date visualisation and internet techniques. Our solution rests on converting the simulation output from GENESIS into a data structure suitable for WWW browsers and VRML (Virtual Reality Modelling Language) viewers. Selected GENESIS simulations are exported once into VRML code and stored in our neurovisualisation portal (webserver). There, the loaded models, demonstrating mainly the spread of electrical signals (action potentials, postsynaptic potentials) along the neuronal membrane (axon, dendritic tree, neuron), can be displayed in the client's VRML viewer without interacting with the original GENESIS environment. This enables the visualisation of basic neurophysiological phenomena designed for the GENESIS simulator on an independent OS (operating system).
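As a minimal sketch of the export step (the membrane-potential samples and colour mapping below are hypothetical; a real converter would read Vm values from a GENESIS output file), a voltage trace can be turned into a VRML97 colour animation:

# Map a membrane-potential trace (mV) onto a VRML97 colour animation for
# one cylindrical compartment. The Vm samples passed in are hypothetical.
def vm_to_vrml(vm, vmin=-70.0, vmax=30.0):
    n = len(vm)
    norm = [min(1.0, max(0.0, (v - vmin) / (vmax - vmin))) for v in vm]
    keys = ", ".join(f"{i / (n - 1):.3f}" for i in range(n))
    # Rest potential renders blue, depolarisation shifts towards red.
    colors = ", ".join(f"{x:.2f} 0 {1 - x:.2f}" for x in norm)
    return f"""#VRML V2.0 utf8
Shape {{
  appearance Appearance {{
    material DEF CompMat Material {{ diffuseColor 0 0 1 }}
  }}
  geometry Cylinder {{ radius 0.5 height 4 }}
}}
DEF Clock TimeSensor {{ cycleInterval 2 loop TRUE }}
DEF VmColor ColorInterpolator {{
  key [ {keys} ]
  keyValue [ {colors} ]
}}
ROUTE Clock.fraction_changed TO VmColor.set_fraction
ROUTE VmColor.value_changed TO CompMat.set_diffuseColor"""

print(vm_to_vrml([-70.0, -55.0, 30.0, -75.0, -70.0]))

Since the VRML file is generated once and served statically, the client-side viewer needs no connection back to GENESIS, which is what makes the visualisation OS-independent.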

Baseline Performance of Notebook Computer under Various Environmental and Usage Conditions for Prognostics

A study was conducted to formally characterize notebook computer performance under various environmental and usage conditions. Software was developed to collect data from the operating system of the computer. An experiment was conducted to evaluate the performance parameters' variations, trends, and correlations, as well as the extreme values they can attain under various usage and environmental conditions. An automated software script was written to simulate user activity. The variability of each performance parameter was addressed by establishing empirical relationships between performance parameters. These equations were presented as baseline estimates for the performance parameters, which can be used to detect system deviations from normal operation and for prognostic assessment. The effect of environmental factors, including different power sources, ambient temperatures, humidity, and usage, on the performance parameters of notebooks was studied.
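As an illustrative sketch of such a baseline (the parameter names, data, and 3-sigma threshold below are hypothetical, not the study's fitted models), an empirical relationship between two monitored parameters can be fit and used to flag deviations:

import numpy as np

# Hypothetical training data: CPU temperature (deg C) vs. fan speed (RPM)
# collected during known-healthy operation.
cpu_temp = np.array([45.0, 52.0, 58.0, 63.0, 70.0, 76.0])
fan_rpm = np.array([2100, 2500, 2900, 3300, 3800, 4200])

# Baseline: least-squares linear fit fan_rpm ~ a*cpu_temp + b.
a, b = np.polyfit(cpu_temp, fan_rpm, 1)
residuals = fan_rpm - (a * cpu_temp + b)
threshold = 3 * residuals.std()          # hypothetical 3-sigma healthy band

def deviates(temp, rpm):
    """Flag an observation whose residual leaves the healthy band."""
    return abs(rpm - (a * temp + b)) > threshold

print(deviates(60.0, 3050))   # consistent with baseline -> expected False
print(deviates(60.0, 1200))   # fan far too slow for this temp -> expected True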

Hand Gesture Recognition using Blob Detection for Immersive Projection Display System

We developed a vision interface framework for an immersive projection system, CAVE, in the virtual reality research field, using hand gesture recognition with computer vision techniques. A background image was subtracted from the current image frame of a webcam, and we converted the color space of the image into HSV space. We then masked skin regions using a skin color range threshold and applied a noise reduction operation. We made blobs from the image, and gestures were recognized using these blobs. Using our hand gesture recognition, we could implement an effective interface for the CAVE without bothersome devices.
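A minimal sketch of this pipeline using OpenCV (the HSV skin range, thresholds, and kernel size are assumptions; the paper's exact values are not given):

import cv2
import numpy as np

# Requires a webcam; the OpenCV 4.x findContours signature is assumed.
cap = cv2.VideoCapture(0)
_, background = cap.read()                 # reference background frame
_, frame = cap.read()                      # current frame

# 1. Background subtraction.
diff = cv2.absdiff(frame, background)
moving = cv2.threshold(cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY),
                       30, 255, cv2.THRESH_BINARY)[1]

# 2. HSV conversion and skin-color range mask (range values assumed).
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
skin = cv2.inRange(hsv, np.array([0, 40, 60]), np.array([25, 180, 255]))
mask = cv2.bitwise_and(skin, moving)

# 3. Noise reduction by morphological opening.
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

# 4. Blob extraction: large connected components become candidate hands.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
blobs = [c for c in contours if cv2.contourArea(c) > 1000]
# Gesture classification would then use blob count, position, and shape.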

A Study of Dose Distribution and Image Quality under an Automatic Tube Current Modulation (ATCM) System for a Toshiba Aquilion 64 CT Scanner Using a New Design of Phantom

Automatic tube current modulation (ATCM) systems are available from all CT manufacturers and are used for the majority of patients. Understanding how the systems work and their influence on patient dose and image quality is important for CT users in order to gain the most effective use of the systems. In the present study, a new phantom was used to evaluate dose distribution and image quality under ATCM operation for the Toshiba Aquilion 64 CT scanner, using different ATCM options and a fixed mAs technique. A routine chest, abdomen and pelvis (CAP) protocol was selected for study, and Gafchromic film was used to measure entrance surface dose (ESD), peripheral dose and central axis dose in the phantom. The results show the dose reductions achievable with various ATCM options in relation to the target noise. The dose and image noise distributions were more uniform when the ATCM system was used than with the fixed mAs technique. The lower limit set for the tube current affects the modulation, especially for the lower dose options: this limit prevented the tube current from being reduced further, so the lower dose ATCM setting resembled a fixed mAs technique. Selection of a lower tube current limit is likely to reduce doses for smaller patients in scans of the chest and neck regions.

Assessing the Problems of Pumping Stations: A Case Study of Boneh Basht Pumping Station

Establishing pumping stations is one of the most common ways of drawing water from rivers. There are many issues involved in the design and operation of pumping stations, the most important of which is sedimentation. Another significant issue which must be taken into consideration in designing pumping stations is the operation method and the technical matters related to it. Safety and convenience of operation must always be considered by the designer. Major factors in deciding on the type of design for the station are the geographical conditions, the location of the station, and the availability of experts for maintenance and operation of the station. The dimensions of the station must allow free movement for checking and operating the pumps after installation of the pumps and the plumbing system.

Nonlinear Control of a Continuous Bioreactor Based on Cell Population Model

Saccharomyces cerevisiae (baker's yeast) can exhibit sustained oscillations during operation in a continuous bioreactor, which adversely affects its stability and productivity. Because of the heterogeneous nature of cell populations, cell population balance models can be used to capture the dynamic behavior of such cultures. In this paper an unstructured, segregated model based on the population balance equation (PBE) is used. For simulation, the 4th-order Runge-Kutta method is used for the time dimension, and three methods, finite difference, orthogonal collocation on finite elements, and the Galerkin finite element method, are used for discretization of the cell mass domain. The results indicate that orthogonal collocation on finite elements not only predicts the oscillating behavior of the cell culture but also needs much less computation time, so this method is preferred over the others. In the next step two controllers, a globally linearizing controller (GLC) and a conventional proportional-integral (PI) controller, are designed to control the total cell mass per unit volume, and the performances of these controllers are compared through simulation. The results show that although the PI controller has a simpler structure, the GLC gives better performance.
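A minimal method-of-lines sketch of the time integration (the discretized right-hand side below is a generic placeholder, not the paper's PBE, whose growth and division kernels are not given in the abstract):

import numpy as np

def rk4_step(f, t, y, h):
    """One classical 4th-order Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Placeholder RHS: after discretizing the cell-mass domain (e.g. by
# orthogonal collocation on finite elements), the PBE reduces to a
# coupled ODE system dW/dt = A @ W at the collocation points.
n = 40
A = -np.eye(n) + 0.5 * np.eye(n, k=1)    # hypothetical discretization matrix
f = lambda t, W: A @ W
W = np.ones(n)                           # initial number-density values
for step in range(1000):
    W = rk4_step(f, 0.0, W, 1e-2)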

Closely Parametrical Model for an Electrical Arc Furnace

To maximise furnace production it is necessary to optimise furnace control, with the objectives of achieving maximum power input into the melting process and minimum network distortion and power-off time, without compromising quality and safety. This can be achieved on the one hand by appropriate electrode control and on the other hand by a minimum of AC transformer switching. The electrical arc is a stochastic process, which is the principal cause of power quality problems, including voltage dips, harmonic distortion, unbalanced loads and flicker. It is therefore difficult to build an appropriate model of an Electrical Arc Furnace (EAF). The factors that affect EAF operation are the melting or refining materials, the melting stage, the electrode position (arc length), the electrode arm control and the short-circuit power of the feeder. Arc voltage, current and power are therefore defined as nonlinear functions of the arc length. In this article we propose our own empirical function of the EAF and a model for the main stages of the melting process, based on measurements in the steel factory.
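The abstract does not reproduce the proposed empirical function itself; as an illustration of the kind of nonlinearity involved, one classical empirical arc characteristic from the EAF literature ties the arc voltage threshold to the arc length L:

V_{at} = A + B\,L, \qquad v_{arc} = \mathrm{sign}(i)\left( V_{at} + \frac{C}{D + |i|} \right)

where A sums the anode and cathode voltage drops, B is the arc-column voltage gradient per unit length, and C, D are empirical constants (often given different values on the rising and falling parts of the current waveform to reproduce the hysteretic v-i loop).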

Development of High Performance Clarification System for FBR Dissolver Liquor

A high performance clarification system for advanced aqueous reprocessing of FBR spent fuel is discussed. Dissolver residue causes trouble in the plant operation of reprocessing. In this study, a new clarification system based on a hybrid of centrifugation and filtration was proposed to achieve high separation ability for all components of the insoluble sludge. Clarification tests with simulated solid species were carried out to evaluate the clarification performance using a small-scale test apparatus consisting of a centrifuge and a filter unit. The effect of solid species density on the collection efficiency was mainly evaluated in the centrifugal clarification test. In the filtration test using a ceramic filter with a pore size of 0.2 μm, on the other hand, permeability and filtration rate were evaluated in addition to the filtration efficiency. As a result, the collection efficiency of solid species in the new clarification system was estimated as nearly 100%. In conclusion, high clarification performance for dissolver liquor can be achieved by the hybrid centrifuge and filtration system.

The Autoregressive Analysis for Wind Turbine Signal Postprocessing

Modern simulation solutions in the wind turbine industry have achieved a high degree of complexity and detail in their results. Limitations appear when model results must be validated against measurements. For model validation it is of special interest to identify mode frequencies and to differentiate them from the different excitations. A wind turbine is a complex device, and measurements of any part of the assembly show a lot of noise. Input excitations are difficult or even impossible to measure due to the stochastic nature of the environment. Traditional techniques for frequency analysis or feature extraction are widely used to analyze wind turbine sensor signals, but have several limitations, especially for non-stationary signals (events). A new technique based on autoregressive analysis is introduced here for this specific application, and a comparison and examples related to different events in wind turbine operation are presented.
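As a minimal sketch of the idea (the AR order, sampling rate, and synthetic two-mode signal below are assumptions for illustration), mode frequencies can be read off the poles of a fitted autoregressive model:

import numpy as np

fs = 100.0                                   # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
# Synthetic "sensor" signal: two modes buried in noise.
x = (np.sin(2 * np.pi * 3.0 * t) + 0.5 * np.sin(2 * np.pi * 11.0 * t)
     + 0.3 * np.random.randn(t.size))

p = 8                                        # AR model order, assumed
# Least-squares AR fit: x[n] = sum_k a_k * x[n-k] + e[n].
X = np.column_stack([x[p - k - 1:-k - 1] for k in range(p)])
a = np.linalg.lstsq(X, x[p:], rcond=None)[0]

# Mode frequencies come from the angles of the AR characteristic roots.
roots = np.roots(np.concatenate(([1.0], -a)))
freqs = np.angle(roots) * fs / (2 * np.pi)
print(sorted(f for f in freqs if f > 0))     # expect poles near 3 and 11 Hz

Unlike a plain FFT, the fitted AR poles also carry damping information, which is one reason parametric analysis is attractive for short, non-stationary event records.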

Efficient Lossless Compression of Weather Radar Data

Data compression is used operationally to reduce bandwidth and storage requirements. An efficient method for lossless weather radar data compression is presented. The proposed method takes the characteristics of the data into account and applies optimal linear prediction to the PPI images in the weather radar data. Because consecutive PPI images are strongly correlated, a dramatic reduction in source entropy is achieved by the prediction algorithm. Standard lossless compression methods are then used to compress the prediction residuals. Experimental results show that, for weather radar data, the method proposed in this paper outperforms the other methods.
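A minimal sketch of such a predict-then-encode pipeline, substituting simple previous-scan prediction and zlib for the paper's optimal linear predictor and entropy coder (both unspecified in the abstract):

import zlib
import numpy as np

rng = np.random.default_rng(0)
# Two hypothetical consecutive PPI scans; the second differs only slightly.
ppi_prev = rng.integers(0, 64, size=(360, 500), dtype=np.int16)
ppi_next = ppi_prev + rng.integers(-1, 2, size=ppi_prev.shape, dtype=np.int16)

# Predict the next image from the current one, then encode the residual.
residual = ppi_next - ppi_prev             # low-entropy residual
raw = zlib.compress(ppi_next.tobytes(), 9)
res = zlib.compress(residual.tobytes(), 9)
print(len(raw), len(res))                  # residual stream is far smaller

# Lossless reconstruction: decode the residual, add the prediction back.
dec = np.frombuffer(zlib.decompress(res), dtype=np.int16).reshape(360, 500)
assert np.array_equal(ppi_prev + dec, ppi_next)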

Effect of Using Stone Cutting Waste on the Compression Strength and Slump Characteristics of Concrete

The aim of this work is to study the possible use of stone cutting sludge waste in concrete production, which would reduce both the environmental impact and the production cost. Slurry sludge, obtained from the Samara factory in Jordan, was used as a source of water in concrete production. Physico-chemical and mineralogical characterization of the sludge was carried out to identify the major components and to compare it with the typical sand used to produce concrete. Sample analysis showed that 96% of the slurry sludge volume is water, so it should be considered an important source of water. Results indicated that the use of slurry sludge as a water source in concrete production has an insignificant effect on compression strength, while it has a sharp effect on the slump values. Using slurry sludge at 25% of the total water content produced successful concrete samples in both slump and compression tests. To clarify the slurry sludge, a settling process can be used to remove the suspended solids; a settling period of 30 min achieved 99% removal efficiency. The clarified water is suitable for use in concrete mixes, which reduces water consumption, conserves water resources, increases profit, reduces operation cost and protects the environment. Additionally, the dry sludge could be used in the mix design instead of fine materials with sizes < 160 μm. This application could conserve natural materials and solve the environmental and economic problems caused by sludge accumulation.

Grid Coordination with Marketmaker Agents

Market based models are frequently used for resource allocation on the computational grid. However, as the size of the grid grows, it becomes difficult for the customer to negotiate directly with all the providers. Middle agents are introduced to mediate between providers and customers and to facilitate the resource allocation process. The most frequently deployed middle agents are matchmakers and brokers. The matchmaking agent finds candidate providers who can satisfy the requirements of the consumer, after which the customer negotiates directly with the candidates. Broker agents mediate the negotiation with the providers in real time. In this paper we present a new type of middle agent, the marketmaker. Its operation is based on two parallel processes: through the investment process the marketmaker acquires resources and resource reservations in large quantities, while through the resale process it sells them to the customers. The marketmaker relies on the fact that, through its global view of the grid, it can perform a more efficient resource allocation than is possible in one-to-one negotiations between customers and providers. We present the operation and algorithms governing the marketmaker agent, contrasting it with the matchmaker and broker agents. Through a series of simulations in the task oriented domain we compare the operation of the three agent types. We find that the marketmaker agent leads to better performance in the allocation of large tasks and a significant reduction of the messaging overhead.
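A schematic sketch of the two parallel processes (all class and method names, prices, and the markup are hypothetical; the abstract does not give the actual algorithms):

# Hypothetical skeleton of the marketmaker's investment and resale processes.
class Marketmaker:
    def __init__(self):
        self.inventory = []          # acquired resource reservations

    def invest(self, offers, budget):
        """Buy reservations in bulk, cheapest offers first."""
        for offer in sorted(offers, key=lambda o: o["price"]):
            if budget < offer["price"]:
                break
            budget -= offer["price"]
            self.inventory.append(offer)

    def resell(self, request):
        """Serve a customer from inventory at a marked-up price."""
        for offer in list(self.inventory):
            if offer["capacity"] >= request["capacity"]:
                self.inventory.remove(offer)
                return {"resource": offer, "price": offer["price"] * 1.1}
        return None                  # no suitable reservation in stock

mm = Marketmaker()
mm.invest([{"price": 4, "capacity": 8}, {"price": 6, "capacity": 16}], 10)
print(mm.resell({"capacity": 8}))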

Sorting Primitives and Genome Rearrangement in Bioinformatics: A Unified Perspective

Bioinformatics and computational biology involve the use of techniques from applied mathematics, informatics, statistics, computer science, artificial intelligence, chemistry, and biochemistry to solve biological problems, usually on the molecular level. Research in computational biology often overlaps with systems biology. Major research efforts in the field include sequence alignment, gene finding, genome assembly, protein structure alignment, protein structure prediction, prediction of gene expression and protein-protein interactions, and the modeling of evolution. Various global rearrangements of permutations, such as reversals and transpositions, have recently become of interest because of their applications in computational molecular biology. A reversal is an operation that reverses the order of a substring of a permutation. A transposition is an operation that swaps two adjacent substrings of a permutation. The problem of determining the smallest number of reversals required to transform a given permutation into the identity permutation is called sorting by reversals. Similar problems can be defined for transpositions and other global rearrangements. In this work we study several genome rearrangement primitives. We show how a genome is modelled by a permutation, introduce some of the existing primitives together with the lower and upper bounds on them, and then provide a comparison of the introduced primitives.
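For concreteness, the two primitives can be sketched on a permutation as follows (0-based indices; the example permutation is arbitrary):

def reversal(perm, i, j):
    """Reverse the substring perm[i..j] (inclusive)."""
    return perm[:i] + perm[i:j + 1][::-1] + perm[j + 1:]

def transposition(perm, i, j, k):
    """Swap the adjacent substrings perm[i..j-1] and perm[j..k-1]."""
    return perm[:i] + perm[j:k] + perm[i:j] + perm[k:]

p = [3, 1, 4, 2]
print(reversal(p, 1, 2))          # [3, 4, 1, 2]
print(transposition(p, 0, 2, 4))  # [4, 2, 3, 1]

Sorting by reversals then asks for the shortest sequence of reversal calls that turns p into [1, 2, 3, 4], and analogous questions are posed for transpositions and the other primitives surveyed.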

An Experimental Multi-Agent Robot System for Operating in Hazardous Environments

In this paper, a multi-agent robot system is presented. The system consists of four robots. The developed robots are able to automatically enter and patrol a harmful environment, such as a building infected with a virus or a factory with leaking hazardous gas. Furthermore, every robot is able to perform obstacle avoidance and search for victims. Several operation modes are designed: remote control, obstacle avoidance, automatic searching, and so on.

Computational Investigation of Air-Gas Venturi Mixer for Powered Bi-Fuel Diesel Engine

In a bi-fuel diesel engine, the carburetor plays a vital role in switching from fuel-gas to petrol mode operation and vice versa. The carburetor is the most important part of the fuel system of such an engine, and these engines carry variable-venturi mixer carburetors. The basic operation of the carburetor depends mainly on the restriction barrel called the venturi. When air flows through the venturi, its speed increases and its pressure decreases. The main challenge is designing a mixing device that mixes the supplied gas with the incoming air at an optimum ratio. To address this, the way fuel gas and air flow in the mixer has to be analyzed; here the Computational Fluid Dynamics (CFD) approach is applied to the design of the prototype mixer. The present work aims at further understanding the air and fuel flow structure by performing CFD studies using a software code. In this study, several mixers were designed to mix air and gas under the conditions mentioned above, and the optimum mixer was selected using computational fluid dynamics. The results indicated that the mixer with 12 holes produces a more homogeneous mixture than the 8-hole and 6-hole mixers. The results also showed that if the inlet convergence is smoother than the outlet divergence, the mixture becomes more homogeneous; the reason is the increased turbulence in the outlet divergence.
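The venturi effect invoked here follows from continuity and Bernoulli's equation for incompressible, frictionless flow (a simplification of the real compressible carburetor flow):

A_1 v_1 = A_2 v_2, \qquad p_1 + \tfrac{1}{2}\rho v_1^2 = p_2 + \tfrac{1}{2}\rho v_2^2
\quad\Rightarrow\quad p_1 - p_2 = \tfrac{1}{2}\rho v_1^2 \left( \frac{A_1^2}{A_2^2} - 1 \right)

so with a throat narrower than the inlet (A_2 < A_1) the pressure at the throat drops below the inlet pressure, which is what draws the fuel gas into the air stream through the mixer holes.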

Application of the Balanced Scorecard into the Formulation of the Firm Strategy

In the contemporary global and dynamically developing environment, strategic planning is fundamental. It is a complicated but important process for continually maintaining competitive advantage. The aim of the paper is the formulation of strategic goals for the needs of small enterprises. The Balanced Scorecard is used as a balanced system of indicators for clarifying and translating vision into particular goals, and within its particular perspectives the focus is on strategic goals. Subsequently, the IDINMOSU concept of competitiveness, which connects to the Balanced Scorecard, is discussed.

Memristor: The Missing Circuit Element and its Application

The memristor is known as the fourth fundamental passive circuit element. When current flows in one direction through the device, its electrical resistance increases, and when current flows in the opposite direction, the resistance decreases. When the current is stopped, the component retains the last resistance it had, and when the flow of charge starts again, the resistance of the circuit will be what it was when it was last active. It thus behaves as a nonlinear resistor with memory. Recently memristors have generated wide research interest and have found many applications. In this paper we survey the various applications of memristors, which include non-volatile memory, nanoelectronic memories, computer logic, neuromorphic computer architectures, low-power remote sensing applications, crossbar latches as transistor replacements, analog computation, and switches.
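Formally, Chua's memristor relates the charge q and the flux linkage \varphi; for a charge-controlled device,

\varphi = f(q), \qquad v(t) = \frac{d\varphi}{dt} = \frac{df}{dq}\,\frac{dq}{dt} = M(q)\, i(t)

where the memristance M(q) = d\varphi/dq acts as a resistance that depends on the entire history of the current through the device, which is exactly the memory behaviour described above.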

Degeneracy of MIS under the Conditions of Instability: A Mathematical Formulation

It has always been observed that the effectiveness of MIS as a support tool for management decisions degenerates after some time of implementation, despite the substantial investments made. This is true for organizations at the initial stages of MIS implementation, whether manual or computerized. A survey of a sample of middle to top managers in business and government institutions was conducted. A large ratio of respondents indicated that the MIS has lost its impact on day-to-day operations, and the response lag time sometimes even expands indefinitely. The data indicate an infant mortality phenomenon of the bathtub model. Reasons may be the monotonous nature of MIS delivery, irrelevance, irreverence, timeliness, and lack of adequate detail. All those reasons combine to create a degree of degeneracy. We investigate the phenomenon of MIS degeneracy that inflicts MIS systems and renders them ineffective, and model it as a bathtub model. A degeneracy index is developed to identify the status of the MIS system and possible remedies to prevent the onset of total collapse of the system to the point of being useless.
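As an illustration of the bathtub shape invoked here (a generic reliability construction, not the paper's degeneracy index), the hazard rate can be written as the sum of a decreasing infant-mortality term, a constant useful-life term, and an increasing wear-out term, e.g. with Weibull components:

h(t) = \alpha\,\beta_1 t^{\beta_1 - 1} + \lambda + \gamma\,\beta_2 t^{\beta_2 - 1}, \qquad \beta_1 < 1 < \beta_2

so the early spike in MIS failures reported by the survey corresponds to the first term dominating shortly after implementation.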

Fuzzy Mathematical Morphology Approach in Image Processing

Morphological operators transform the original image into another image through interaction with another image of a certain shape and size, known as the structuring element. Mathematical morphology provides a systematic approach to analyzing the geometric characteristics of signals or images, and has been applied widely to many applications such as edge detection, object segmentation, noise suppression and so on. Fuzzy mathematical morphology aims to extend the binary morphological operators to grey-level images. In order to define the basic morphological operations such as fuzzy erosion, dilation, opening and closing, a general method based upon fuzzy implication and inclusion grade operators is introduced. The fuzzy morphological operations extend the ordinary morphological operations by using fuzzy sets, where the union operation is replaced by a maximum operation and the intersection operation is replaced by a minimum operation. This work consists of two articles. In the first, fuzzy set theory, fuzzy mathematical morphology, which is based on fuzzy logic and fuzzy set theory, and fuzzy morphological operations and their properties are studied in detail. In the second part, the application of fuzziness in mathematical morphology to practical work such as image processing is discussed with illustrative problems.
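A minimal sketch of the max/min formulation for grey-level images (flat 3x3 structuring element; pixel values are treated as membership grades in [0, 1]; a full fuzzy-implication-based definition would generalise this):

import numpy as np

def fuzzy_dilate(img, se=3):
    """Grey-level dilation: max over the structuring-element window."""
    p = np.pad(img, se // 2, mode="edge")
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = p[i:i + se, j:j + se].max()
    return out

def fuzzy_erode(img, se=3):
    """Grey-level erosion: min over the structuring-element window."""
    p = np.pad(img, se // 2, mode="edge")
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = p[i:i + se, j:j + se].min()
    return out

img = np.random.default_rng(1).random((8, 8))  # membership grades in [0, 1]
opened = fuzzy_dilate(fuzzy_erode(img))        # fuzzy opening
closed = fuzzy_erode(fuzzy_dilate(img))        # fuzzy closing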

Optimal Control Strategy for High Performance EV Interior Permanent Magnet Synchronous Motor

The controllable electrical loss, which consists of the copper loss and the iron loss, can be minimized by optimal control of the armature current vector. A control algorithm for the current vector minimizing the electrical loss is proposed, in which the optimal current vector is decided according to the operating speed and the load conditions. The proposed control algorithm is applied to an experimental PM motor drive system, and this paper presents a modern approach to speed control for a permanent magnet synchronous motor (PMSM) applied to an Electric Vehicle using nonlinear control. The regulation algorithms are based on the feedback linearization technique. The direct component of the current is controlled to be zero, which ensures maximum torque operation, and near-unity power factor operation is also achieved. Moreover, among the features of EV electric propulsion, energy efficiency is a basic characteristic that is influenced by vehicle dynamics and system architecture. For this reason, the EV dynamics are taken into account.
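For reference, the standard d-q relations behind this trade-off (textbook machine equations in the usual notation, not specific to the paper's drive) are

P_{cu} = \tfrac{3}{2} R_a \left( i_d^2 + i_q^2 \right), \qquad
T_e = \tfrac{3}{2} p \left[ \lambda_m i_q + (L_d - L_q)\, i_d i_q \right]

With i_d = 0 the torque reduces to T_e = (3/2) p \lambda_m i_q, which maximises torque per ampere for a surface-mounted machine; for an interior PM machine the reluctance term (L_d - L_q) i_d i_q means the loss-minimising current vector shifts with speed and load, which is why the optimal vector is computed online from the operating conditions.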