A Comparison of Software Analysis and Design Methods for Real Time Systems

This paper examines and compares several of the most common real-time methods, namely CORE, YSM, MASCOT, JSD, DARTS, RTSAD, ADARTS, CODARTS, HOOD, HRT-HOOD, ROOM, UML and UML-RT. The methods are compared using attributes such as i) usability, ii) compositionality and iii) the availability of proper real-time notations. Finally, some comparison results are given and discussed.

Radiation Dose Distribution for Workers in South Korean Nuclear Power Plants

A total of 33,680 nuclear power plant (NPP) workers were monitored and recorded from 1990 to 2007. According to the record, the average individual radiation dose decreased continually from 3.20 mSv/man in 1990 to 1.12 mSv/man at the end of 2007. After the International Commission on Radiological Protection (ICRP) 60 recommendation was generalized in South Korea, no NPP worker received a dose above 20 mSv, and the number of relatively highly exposed workers has been decreasing continuously. The age distribution of radiation workers in NPPs was composed mainly of 20-30-year-olds (83%) for 1990~1994 and 30-40-year-olds (75%) for 2003~2007. The difference in individual average dose by age was not significant. Most (77%) of NPP radiation exposures from 1990 to 2007 occurred during the refueling period. With regard to exposure type, the majority of exposures were external, representing 95% of the total, while internal exposures represented only 5%. The external effective dose was affected mainly by gamma radiation exposure, with an insignificant amount of neutron exposure. As for the internal effective dose, tritium (3H) in the pressurized heavy water reactor (PHWR) was the biggest cause of exposure.

Grid Coordination with Marketmaker Agents

Market-based models are frequently used for resource allocation on the computational grid. However, as the size of the grid grows, it becomes difficult for the customer to negotiate directly with all the providers. Middle agents are introduced to mediate between the providers and customers and facilitate the resource allocation process. The most frequently deployed middle agents are matchmakers and brokers. The matchmaking agent finds candidate providers who can satisfy the requirements of the consumers, after which the customer negotiates directly with the candidates. Broker agents mediate the negotiation with the providers in real time. In this paper we present a new type of middle agent, the marketmaker. Its operation is based on two parallel processes: through the investment process the marketmaker acquires resources and resource reservations in large quantities, while through the resale process it sells them to the customers. The operation of the marketmaker relies on the fact that, through its global view of the grid, it can perform a more efficient resource allocation than is possible in one-to-one negotiations between customers and providers. We present the operation and algorithms governing the marketmaker agent, contrasting it with the matchmaker and broker agents. Through a series of simulations in the task-oriented domain we compare the operation of the three agent types. We find that the use of the marketmaker agent leads to better performance in the allocation of large tasks and a significant reduction in messaging overhead.

Design and Bandwidth Allocation of Embedded ATM Networks using Genetic Algorithm

In this paper, a genetic algorithm (GA) is proposed for the design of an optimization algorithm to achieve bandwidth allocation in ATM networks. In Broadband ISDN, ATM is a high-bandwidth, fast packet switching and multiplexing technique. Using ATM, the network can be flexibly reconfigured and the bandwidth reassigned to meet the requirements of all types of services. By dynamically routing the traffic and adjusting the bandwidth assignment, the average packet delay of the whole network can be reduced to a minimum. The M/M/1 model can be used to analyze the performance.
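As an illustration of the delay objective such a GA could minimize, the following is a minimal sketch assuming the standard model of independent M/M/1 queues per link (the Kleinrock approximation); the function name and the example flows, capacities and arrival rate are illustrative values, not taken from the paper.

```python
# Sketch: network-average packet delay under the M/M/1-per-link assumption.
# T = (1/gamma) * sum_i lambda_i / (C_i - lambda_i), with lambda_i the flow on link i and
# C_i its allocated bandwidth, both in packets/s; gamma is the total arrival rate.

def avg_network_delay(flows_pps, capacities_pps, total_arrival_pps):
    delay = 0.0
    for lam, cap in zip(flows_pps, capacities_pps):
        if lam >= cap:                     # unstable link: treat as an infeasible assignment
            return float("inf")
        delay += lam / (cap - lam)
    return delay / total_arrival_pps

# Example: a candidate bandwidth assignment decoded from a GA chromosome (illustrative numbers)
flows = [300.0, 450.0, 120.0]              # packets/s routed on each link
caps = [500.0, 600.0, 200.0]               # packets/s of bandwidth allocated to each link
print(avg_network_delay(flows, caps, total_arrival_pps=870.0))
```

A GA fitness function would simply return the negative of this delay, or a large penalty for assignments that make any link unstable.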

Enhanced Differentiation of Stromal Cells and Embryonic Stem Cells with Vitamin D3

In-vitro mouse co-culture of E14 embryonic stem cells (ESCs) and OP9 stromal cells can recapitulate the earliest stages of haematopoietic development, not accessible in human embryos, supporting both haemogenic precursors and their primitive haematopoietic progeny. 1α,25-Dihydroxy-vitamin D3 (VD3) has been demonstrated to be a powerful differentiation inducer for a wide variety of neoplastic cells, and could enhance early differentiation of ESCs into blood cells in E14/OP9 co-culture. This study aims to ascertain whether VD3 is key in promoting differentiation and suppressing proliferation, by separately investigating the effects of VD3 on the proliferation phase of the E14 cell line and on stromal OP9 cells. The results showed that VD3 inhibited the proliferation of the cells in a dose-dependent manner, quantitatively by decreased cell number, and qualitatively by alkaline-phosphatase staining that revealed significant differences between VD3-treated and untreated cells, characterised by decreased enzyme expression (colourless cells). Propidium-iodide cell-cycle analyses showed no significant percentage change in VD3-treated E14 and OP9 cells within their G and S-phases, compared to the untreated controls, despite the increased percentage of the G-phase compared to the S-phase in a dose-dependent manner. These results with E14 and OP9 cells indicate that adequate VD3 concentration enhances cellular differentiation and inhibits proliferation. The results also suggest that if E14 and OP9 cells were co-cultured and VD3-treated, there would be further enhanced differentiation of ESCs into blood cells.

Finite Element Application to Estimate Inservice Material Properties using Miniature Specimen

This paper presents a method for determining uniaxial tensile properties such as Young's modulus, yield strength and the flow behaviour of a material in a virtually non-destructive manner. To achieve this, a new dumb-bell shaped miniature specimen has been designed. This avoids the removal of large material samples from the in-service component for the evaluation of current material properties. The proposed miniature specimen has an advantage in finite element modelling with respect to computational time and memory space. Test fixtures have been developed to enable tension tests on the miniature specimen in a testing machine. The studies have been conducted on a chromium (H11) steel and an aluminum alloy (AR66). The output from the miniature test, viz. the load-elongation diagram, is obtained and the finite element simulation of the test is carried out using a 2D plane stress analysis. The results are compared with the experimental results. It is observed that the results from the finite element simulation agree well with the miniature test results. The approach appears to have the potential to predict the mechanical properties of materials, which could be used in remaining life estimation of various in-service structures.

Sorting Primitives and Genome Rearrangement in Bioinformatics: A Unified Perspective

Bioinformatics and computational biology involve the use of techniques from applied mathematics, informatics, statistics, computer science, artificial intelligence, chemistry, and biochemistry to solve biological problems, usually at the molecular level. Research in computational biology often overlaps with systems biology. Major research efforts in the field include sequence alignment, gene finding, genome assembly, protein structure alignment, protein structure prediction, prediction of gene expression and protein-protein interactions, and the modeling of evolution. Various global rearrangements of permutations, such as reversals and transpositions, have recently become of interest because of their applications in computational molecular biology. A reversal is an operation that reverses the order of a substring of a permutation. A transposition is an operation that swaps two adjacent substrings of a permutation. The problem of determining the smallest number of reversals required to transform a given permutation into the identity permutation is called sorting by reversals. Similar problems can be defined for transpositions and other global rearrangements. In this work we study some genome rearrangement primitives. We show how a genome is modelled by a permutation, introduce some of the existing primitives and the lower and upper bounds on them, and then provide a comparison of the introduced primitives.
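To make the primitives concrete, here is a minimal sketch (an illustration only, not one of the algorithms surveyed in the paper) that applies a reversal and a transposition to a small permutation and computes the classical breakpoint count, which lower-bounds sorting by reversals since each reversal removes at most two breakpoints.

```python
# Sketch: genome rearrangement primitives on a permutation modelling gene order.

def reversal(perm, i, j):
    """Reverse the segment perm[i..j] (inclusive)."""
    return perm[:i] + perm[i:j + 1][::-1] + perm[j + 1:]

def transposition(perm, i, j, k):
    """Swap the adjacent blocks perm[i..j-1] and perm[j..k-1]."""
    return perm[:i] + perm[j:k] + perm[i:j] + perm[k:]

def breakpoints(perm):
    """Count adjacent pairs with non-consecutive values; ceil(b/2) lower-bounds sorting by reversals."""
    ext = [0] + perm + [len(perm) + 1]
    return sum(1 for a, b in zip(ext, ext[1:]) if abs(a - b) != 1)

p = [3, 1, 2, 4]
print(reversal(p, 0, 2))          # [2, 1, 3, 4]
print(transposition(p, 0, 1, 3))  # [1, 2, 3, 4]: one transposition reaches the identity
print(breakpoints(p))             # 3 breakpoints: (0,3), (3,1) and (2,4) are non-consecutive
```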

Knowledge and Attitude among Women and Men in Decision Making on Pap Smear Screening in Kelantan, Malaysia

This paper explores the knowledge and attitude of women and men in decision making on Pap smear screening. This qualitative study recruited 52 respondents, 44 women and 8 men, using purposive sampling with a snowballing technique, through in-depth interviews. The study demonstrates several key findings: female respondents had better knowledge than male respondents; most of the women perceived that Pap smear screening is beneficial and important, but were still doubtful about proceeding with the test; male respondents were supportive in terms of taking their spouses to health facilities or giving their wives more freedom to choose and make decisions about their own health, for the prominent reason that women know their own health best. It is expected that the results from this study will provide a useful guideline for healthcare providers preparing actions or interventions to deliver extensive education and improve people's knowledge of and attitude towards Pap smear screening.

Eclectic Rule-Extraction from Support Vector Machines

Support vector machines (SVMs) have shown superior performance compared to other machine learning techniques, especially in classification problems. Yet one limitation of SVMs is the lack of an explanation capability, which is crucial in some applications, e.g. in the medical and security domains. In this paper, a novel approach for eclectic rule-extraction from support vector machines is presented. This approach utilizes the knowledge acquired by the SVM and represented in its support vectors, as well as the parameters associated with them. The approach includes three stages: training, propositional rule-extraction and rule quality evaluation. Results from four different experiments have demonstrated the value of the approach for extracting comprehensible rules of high accuracy and fidelity.
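As a hedged illustration of the eclectic idea (not the authors' exact algorithm, which uses the support vectors and their parameters in its own specific way), the sketch below trains an SVM, fits a shallow decision tree to the SVM's predictions on its support vectors, reads propositional rules off the tree, and scores them by fidelity (agreement with the SVM) and accuracy (agreement with the true labels). The dataset, tree depth and use of scikit-learn are assumptions of this sketch.

```python
# Sketch of an eclectic rule-extraction pipeline (assumed, simplified variant).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)

# Stage 1: training the SVM (the "black box" whose knowledge we want to express as rules)
svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)

# Stage 2: propositional rule-extraction -- fit an interpretable tree to the SVM's
# predictions on its support vectors, then read IF-THEN rules off the tree paths.
sv = svm.support_vectors_
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(sv, svm.predict(sv))
print(export_text(tree))

# Stage 3: rule quality evaluation
fidelity = np.mean(tree.predict(X) == svm.predict(X))   # agreement with the SVM
accuracy = np.mean(tree.predict(X) == y)                 # agreement with the ground truth
print(f"fidelity = {fidelity:.3f}, accuracy = {accuracy:.3f}")
```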

Contact Problem for an Elastic Layered Composite Resting on Rigid Flat Supports

In this study, the contact problem of a layered composite which consists of two materials with different elastic constants and heights resting on two rigid flat supports with sharp edges is considered. The effect of gravity is neglected. While friction between the layers is taken into account, it is assumed that there is no friction between the supports and the layered composite so that only compressive tractions can be transmitted across the interface. The layered composite is subjected to a uniform clamping pressure over a finite portion of its top surface. The problem is reduced to a singular integral equation in which the contact pressure is the unknown function. The singular integral equation is evaluated numerically and the results for various dimensionless quantities are presented in graphical forms.

Modulation Identification Algorithm for Adaptive Demodulator in Software Defined Radios Using Wavelet Transform

A generalized digital modulation identification algorithm for an adaptive demodulator has been developed and is presented in this paper. The algorithm is verified using the wavelet transform and histogram computation to identify QPSK and QAM along with GMSK and M-ary FSK modulations. It has been found that the histogram peaks simplify the identification procedure. The simulated results show that correct modulation identification is possible down to a lower bound of 5 dB and 12 dB for GMSK and QPSK respectively. When the SNR is above 5 dB, the throughput of the proposed algorithm is more than 97.8%. The receiver operating characteristic (ROC) has been computed to measure the performance of the proposed algorithm, and the analysis shows that the probability of detection (Pd) drops rapidly when the SNR is 5 dB and the probability of false alarm (Pf) is smaller than 0.3. The performance of the proposed algorithm has been compared with existing methods, and it is found that it identifies all the considered digital modulation schemes at low SNR.
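The sketch below illustrates the underlying idea in a hedged way (synthetic baseband signals, a single-scale Haar wavelet transform and a crude peak counter of our own; the paper's exact decision procedure and thresholds are not reproduced): the histogram of the wavelet-transform magnitude of an FSK signal shows one peak per tone, whereas a constant-envelope, single-carrier signal such as QPSK or GMSK concentrates in essentially one peak.

```python
# Sketch: Haar wavelet-transform magnitude histograms for modulation identification (illustrative).
import numpy as np

def haar_wt_mag(s, scale):
    """|Haar wavelet transform| of a complex baseband signal at a single dyadic scale."""
    c = np.cumsum(np.concatenate(([0j], s)))
    idx = np.arange(len(s) - 2 * scale)
    left = c[idx + scale] - c[idx]                 # sum over the first half-window
    right = c[idx + 2 * scale] - c[idx + scale]    # sum over the second half-window
    return np.abs(right - left) / np.sqrt(2 * scale)

def histogram_peaks(mag, bins=60, rel_thresh=0.2):
    """Crude count of dominant local maxima in the magnitude histogram."""
    h, _ = np.histogram(mag, bins=bins)
    t = rel_thresh * h.max()
    return sum(1 for i in range(1, bins - 1) if h[i] > t and h[i] >= h[i - 1] and h[i] >= h[i + 1])

rng = np.random.default_rng(0)
fs, sps, nsym = 8000, 64, 400
t = np.arange(nsym * sps) / fs

# QPSK on a 600 Hz carrier offset: constant envelope, random phase each symbol
phases = rng.choice(np.pi / 4 + np.pi / 2 * np.arange(4), nsym)
qpsk = np.exp(1j * (2 * np.pi * 600 * t + np.repeat(phases, sps)))

# 2-FSK: one of two tones per symbol
tones = rng.choice([500.0, 1800.0], nsym)
fsk = np.exp(1j * 2 * np.pi * np.repeat(tones, sps) * t)

for name, sig in [("QPSK", qpsk), ("2-FSK", fsk)]:
    print(name, "dominant histogram peaks:", histogram_peaks(haar_wt_mag(sig, scale=4)))
```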

Some Third Order Methods for Solving Systems of Nonlinear Equations

Based on Traub's methods for solving the nonlinear equation f(x) = 0, we develop two families of third-order methods for solving systems of nonlinear equations F(x) = 0. The families include well-known existing methods as special cases. The stability is corroborated by numerical results. Comparison with well-known methods shows that the present methods are robust. These higher-order methods may be very useful in numerical applications requiring high precision, because they yield a clear reduction in the number of iterations.
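For concreteness, here is a minimal sketch of one classical third-order, two-step scheme of this type (a Potra-Pták-style predictor-corrector that reuses the Jacobian of the Newton step); it illustrates the class of methods discussed, not necessarily a member of the paper's new families.

```python
# Sketch: a third-order two-step method for F(x) = 0 in R^n (Potra-Ptak type).
import numpy as np

def third_order_solve(F, J, x0, tol=1e-12, maxit=50):
    x = np.asarray(x0, dtype=float)
    for k in range(1, maxit + 1):
        Jx = J(x)
        y = x - np.linalg.solve(Jx, F(x))              # Newton predictor
        x_new = x - np.linalg.solve(Jx, F(x) + F(y))   # corrector reusing the same Jacobian
        if np.linalg.norm(x_new - x) < tol:
            return x_new, k
        x = x_new
    return x, maxit

# Example system: x^2 + y^2 = 4 and exp(x) + y = 1
F = lambda v: np.array([v[0] ** 2 + v[1] ** 2 - 4.0, np.exp(v[0]) + v[1] - 1.0])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [np.exp(v[0]), 1.0]])
root, iters = third_order_solve(F, J, [1.0, -1.0])
print(root, iters)
```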

Establish a Methodology for Testing and Optimizing GPRS Performance Case Study: Libya GSM

The main goal of this paper is to establish a methodology for testing and optimizing GPRS performance over the Libya GSM network, as well as to propose a suitable optimization technique to improve performance. Measurements of download, upload, throughput, round-trip time, reliability, handover, security enhancement and packet loss over a GPRS access network were carried out. Measured values are compared to theoretical values that can be calculated beforehand. The data is processed and delivered by the server across the wireless network to the client, which processes the received pieces of data on the fly. We also illustrate the results by describing the main parameters that affect the quality of service. Finally, Libya's two mobile operators, Libyana Mobile Phone and Al-Madar Al-Jadeed Company, are selected as a case study to validate our methodology.

A High Speed 8-Transistor Full Adder Design Using Novel 3-Transistor XOR Gates

The paper proposes a novel design of a 3T XOR gate combining complementary CMOS with pass transistor logic. The design has been compared with earlier proposed 4T and 6T XOR gates, and a significant improvement in silicon area and power-delay product has been obtained. An eight-transistor full adder has been designed using the proposed three-transistor XOR gate, and its performance has been investigated using 0.15 µm and 0.35 µm technologies. Compared to the earlier 10-transistor full adder, the proposed adder shows a significant improvement in silicon area and power-delay product. The whole simulation has been carried out using HSPICE.
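A logic-level view of how an 8T adder is typically assembled from XOR gates (two XORs produce the sum, and a small multiplexer driven by A XOR B selects the carry) is sketched below; this models only the assumed boolean decomposition, not the transistor-level circuit evaluated in the paper.

```python
# Sketch: boolean model of an XOR-based full adder (gate-level view, not the transistor design).

def xor(a, b):
    return a ^ b

def full_adder(a, b, cin):
    p = xor(a, b)             # first XOR gate: the "propagate" signal
    s = xor(p, cin)           # second XOR gate: the sum bit
    cout = cin if p else a    # 2-to-1 multiplexer: carry-out is Cin when A != B, else A (= B)
    return s, cout

# Exhaustive check against plain binary addition
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, cout = full_adder(a, b, cin)
            assert (cout << 1) | s == a + b + cin
print("full adder verified for all 8 input combinations")
```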

Adjustment of a PET Scanner for PEPT

Positron emission particle tracking (PEPT) is a technique in which a single radioactive tracer particle can be accurately tracked as it moves. A limitation of PET is that, in order to reconstruct a tomographic image, it is necessary to acquire a large volume of data (millions of events), so it is difficult to study rapidly changing systems. PEPT, by contrast, is a very fast process. In PEPT, detecting both photons defines a line, and the annihilation is assumed to have occurred somewhere along this line. The location of the tracer can be determined to within a few mm by triangulation from the coincident detection of a small number of pairs of back-to-back gamma rays. This can be achieved many times per second, and the track of a moving particle can be reliably followed. The technique was invented at the University of Birmingham [1]. The aim in PEPT is not to form an image of the tracer particle but simply to determine its location over time. If the tracer is followed for a long enough period within a closed, circulating system, it explores all possible types of motion. The application of PEPT to industrial process systems carried out at the University of Birmingham falls into two areas: the behaviour of granular materials and of viscous fluids. Granular materials are processed in industry, for example in the manufacture of pharmaceuticals, ceramics, food and polymers, and PEPT has been used in a number of ways to study the behaviour of these systems [2]. PEPT allows the possibility of tracking a single particle within the bed [3]. PEPT has also been used to study fluid flow, for example viscous fluids in mixers [4], using a neutrally-buoyant tracer particle [5].
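As an illustration of the triangulation step, the sketch below locates the tracer as the least-squares closest point to a set of lines of response, each defined by a detected point and direction; the data are synthetic, and the real PEPT algorithm additionally discards corrupted events iteratively, so this is only a minimal version of the idea.

```python
# Sketch: least-squares triangulation of a tracer from lines of response (illustrative).
import numpy as np

def locate_tracer(points, directions):
    """Minimise the summed squared perpendicular distance to lines (p_i, d_i):
    solve  sum_i (I - d_i d_i^T) x = sum_i (I - d_i d_i^T) p_i  for the 3D position x."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector perpendicular to the line
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# Synthetic example: 50 lines that each pass within ~1 mm of the true tracer position
rng = np.random.default_rng(0)
true_pos = np.array([10.0, -5.0, 30.0])                    # mm
directions = rng.normal(size=(50, 3))
points = true_pos + rng.normal(scale=1.0, size=(50, 3))    # a point on each line of response
print(locate_tracer(points, directions))                   # close to true_pos
```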

Investigating Cultural, Artistic and Architectural Consequences of Mongolian Invasion of Iran and Establishment of Ilkhanate Dynasty

The social, cultural and artistic status of a society in various historical eras is affected by numerous, and sometimes imposed, factors, and a better understanding requires analysis of such conditions. Throughout its history, Iran has been involved in decisive and significant events, and examining each of these events can improve the understanding of the social conditions of the country at the time in question. The Mongolian conquest of Iran is one of the most significant events in the history of Iran, with consequences that never left Iranian society. During this tragic invasion and the subsequent devastating wars, which led to the establishment of the Ilkhanate dynasty, numerous cultural and artistic changes occurred both among the Mongolian conquerors and in Iranian society. This study examines these changes with a focus on art and architecture as an important part of culture and social communication.

Genetic Algorithms and Kernel Matrix-based Criteria Combined Approach to Perform Feature and Model Selection for Support Vector Machines

Feature and model selection are at the center of attention of much research because of their impact on classifiers' performance. Both selections are usually performed separately, but recent developments suggest using a combined GA-SVM approach to perform them simultaneously. This approach improves the performance of the classifier by identifying the best subset of variables and the optimal parameter values. Although GA-SVM is an effective method, it is computationally expensive, so a computationally cheaper (rough) criterion can be considered. The paper investigates a joint approach of genetic algorithms and kernel matrix criteria to perform feature and model selection simultaneously for the SVM classification problem. The purpose of this research is to improve the classification performance of SVM through an efficient approach, the Kernel Matrix Genetic Algorithm method (KMGA).
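The sketch below shows, under assumptions of our own (an RBF kernel, kernel-target alignment as the kernel-matrix criterion, and a chromosome made of a binary feature mask plus the kernel width gamma), how a candidate solution can be scored from the kernel matrix alone instead of training a full SVM; the paper's actual KMGA criterion and encoding may differ.

```python
# Sketch: kernel-matrix fitness for a GA chromosome = (feature mask, gamma). Illustrative only.
import numpy as np

def rbf_kernel(X, gamma):
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_target_alignment(K, y):
    """A(K, yy^T) = <K, yy^T>_F / (||K||_F * n) for labels y in {-1, +1}."""
    return np.sum(K * np.outer(y, y)) / (np.linalg.norm(K) * len(y))

def fitness(chromosome, X, y):
    mask, gamma = chromosome
    if not mask.any():
        return -np.inf                        # an empty feature subset is infeasible
    K = rbf_kernel(X[:, mask], gamma)
    return kernel_target_alignment(K, y)

# Hypothetical chromosome: keep features 0 and 2, gamma = 0.5
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 4))
y = np.sign(X[:, 0] + 0.1 * rng.normal(size=40))
print(fitness((np.array([True, False, True, False]), 0.5), X, y))
```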

Project Based Learning for IT Personnel Resources Development Using TVML

Using animated videos of teaching materials is an effective learning method. However, we thought that a more effective learning method is for the learners themselves to produce the teaching videos. The learners who act as producers must learn and understand the material well in order to produce and present teaching videos to others. The purpose of this study is to propose a project based learning (PBL) technique of co-producing videos of IT (information technology) teaching materials. We used the T2V player to produce the videos based on TVML, a TV program description language. Using the proposed method, we assigned the learners to produce animated videos for the "National Examination for Information Processing Technicians (IPA examination)" in Japan, in order for them to learn various knowledge and skills in the IT field. Experimental results showed that a learning effect occurred during the video production process, which is useful for IT personnel resources development.

Edge Detection in Digital Images Using Fuzzy Logic Technique

The fuzzy technique is an operator introduced in order to simulate, at a mathematical level, the compensatory behaviour in processes of decision making or subjective evaluation. The following paper introduces such operators by means of a computer vision application. In this paper a novel method based on a fuzzy logic reasoning strategy is proposed for edge detection in digital images without determining a threshold value. The proposed approach begins by segmenting the images into regions using a floating 3x3 binary matrix. The edge pixels are mapped to a range of values distinct from each other. To assess robustness, the results of the proposed method for different captured images are compared to those obtained with the linear Sobel operator. The method gives consistently smoother and straighter straight lines and good roundness for curved lines. At the same time, the corners become sharper and can be defined more easily.
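To illustrate the flavour of the approach (though not the paper's actual rule base), here is a minimal sketch that slides a 3x3 window over the image and maps the local intensity spread through a simple fuzzy membership function "this neighbourhood contains an edge"; the membership breakpoints are assumed values.

```python
# Sketch: fuzzy-style edge detection on floating 3x3 windows (illustrative rule and thresholds).
import numpy as np

def fuzzy_edges(img, low=10.0, high=60.0):
    img = img.astype(float)
    h, w = img.shape
    out = np.zeros_like(img)
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            win = img[r - 1:r + 2, c - 1:c + 2]     # floating 3x3 window
            spread = win.max() - win.min()          # local contrast
            # ramp membership: 0 below `low`, 1 above `high`, linear in between
            out[r, c] = np.clip((spread - low) / (high - low), 0.0, 1.0)
    return out

# Tiny example: a vertical step edge of height 100
img = np.zeros((8, 8))
img[:, 4:] = 100.0
print(np.round(fuzzy_edges(img)[4], 2))   # membership 1.0 on the two columns straddling the step
```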

Computational Investigation of Air-Gas Venturi Mixer for Powered Bi-Fuel Diesel Engine

In a bi-fuel diesel engine, the carburetor plays a vital role in switching from fuel gas to petrol mode operation and vice versa. The carburetor is the most important part of the fuel system of a diesel engine. All diesel engines carry variable venturi mixer carburetors. The basic operation of the carburetor mainly depends on the restriction barrel called the venturi. When air flows through the venturi, its speed increases and its pressure decreases. The main challenge is to design a mixing device which mixes the supplied gas with the incoming air at an optimum ratio. In order to surmount the identified problems, the way the fuel gas and air flow in the mixer has to be analyzed. In this case, the computational fluid dynamics (CFD) approach is applied in the design of the prototype mixer. The present work aims at further understanding of the air and fuel flow structure by performing CFD studies using a software code. In this study, several mixers have been designed for mixing air and gas under the conditions mentioned above. Then, using computational fluid dynamics, the optimum mixer has been selected. The results indicated that the mixer with 12 holes produces a more homogeneous mixture than the 8-hole and 6-hole mixers. The results also showed that if the inlet convergence is smoother than the outlet divergence, the mixture becomes more homogeneous; the reason is the increased turbulence in the outlet divergence.
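For reference, the venturi effect mentioned above follows directly from continuity and Bernoulli's equation; the short sketch below computes the throat velocity and the pressure drop that draws the fuel gas into the air stream, using illustrative inlet conditions and diameters rather than values from the study.

```python
# Sketch: venturi throat velocity and pressure drop (incompressible flow, illustrative numbers).
import math

rho = 1.2                         # air density, kg/m^3
v_inlet = 20.0                    # inlet air velocity, m/s
d_inlet, d_throat = 0.05, 0.03    # pipe and throat diameters, m (assumed values)

a_inlet = math.pi * d_inlet ** 2 / 4.0
a_throat = math.pi * d_throat ** 2 / 4.0

v_throat = v_inlet * a_inlet / a_throat            # continuity: A1 * v1 = A2 * v2
dp = 0.5 * rho * (v_throat ** 2 - v_inlet ** 2)    # Bernoulli: p1 - p2

print(f"throat velocity = {v_throat:.1f} m/s, pressure drop = {dp:.0f} Pa")
```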