Delay Analysis of Sampled-Data Systems in Hard RTOS

In this paper, we present the effect of varying time delays on the performance and stability of a single-channel multirate sampled-data system in a hard real-time (RT-Linux) environment. The sampling task requires a response time that might exceed the capacity of RT-Linux, so a straightforward implementation in RT-Linux is not feasible because of system latency; the sampling period must therefore be kept short enough to handle this task. The best sampling rate chosen for the sampled-data system is the slowest rate that meets all performance requirements. RT-Linux is consistent with its specifications, and a real-time resolution of 0.01 seconds is used to achieve efficient results. The results of our laboratory experiments show that the multirate control technique in a hard real-time operating system (RTOS) can mitigate the stability problems caused by random access delays and asynchronization.
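
To make the interplay between loop delay and sampling period concrete, the following minimal Python sketch (not the authors' RT-Linux implementation; the first-order plant, controller gain, and delay values are illustrative assumptions) simulates a sampled proportional control loop with a delayed actuation signal:

    def simulate(T, delay_steps, K=2.0, a=1.0, steps=200):
        """Proportional control of dx/dt = -a*x + u, sampled with period T,
        with the actuation delayed by delay_steps samples (forward-Euler plant)."""
        x = 1.0
        u_queue = [0.0] * delay_steps          # pending control samples
        for _ in range(steps):
            u_queue.append(-K * x)             # control computed from the current sample
            u_applied = u_queue.pop(0)         # value computed delay_steps samples ago
            x = x + T * (-a * x + u_applied)   # plant advanced by one sampling period
        return abs(x)

    for T in (0.01, 0.05, 0.2, 0.5):
        print(f"T={T:4}s  no delay: {simulate(T, 0):.2e}   2-sample delay: {simulate(T, 2):.2e}")

With these illustrative numbers, the two-sample delay leaves the loop stable at the shorter sampling periods but drives it unstable at T = 0.5 s, which mirrors the point that the sampling period must be kept short relative to the worst-case latency.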

Data Mining for Cancer Management in Egypt Case Study: Childhood Acute Lymphoblastic Leukemia

Data mining aims at discovering knowledge from data and presenting it in a form that is easily comprehensible to humans. One useful application in Egypt is cancer management, especially the management of Acute Lymphoblastic Leukemia (ALL), which is the most common type of cancer in children. This paper discusses the process of designing a prototype that can help in the management of childhood ALL, which has great significance in the health care field. It also has a social impact by helping to decrease the incidence of the disease among children in Egypt, and it provides valuable information about the distribution and segmentation of ALL in Egypt, which may be linked to possible risk factors. Undirected knowledge discovery is used since, in this research project, there is no target field, as the data provided are mainly subjective; this is done in order to quantify the subjective variables. The computer is therefore asked to identify significant patterns in the provided medical data about ALL. This is achieved by collecting the data necessary for the system, determining the data mining technique to be used, and choosing the most suitable implementation tool for the domain. The research makes use of a data mining tool, Clementine, to apply the decision tree technique. We feed it with data extracted from real-life cases taken from specialized cancer institutes. Relevant medical case details, such as patient medical history and diagnosis, are analyzed, classified, and clustered in order to improve the disease management.
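
The paper's prototype is built in Clementine; purely as an illustrative sketch of the same decision-tree workflow, the fragment below uses scikit-learn instead, with invented feature names and toy records standing in for the real hospital data:

    from sklearn.tree import DecisionTreeClassifier, export_text

    # Hypothetical ALL case records: [age_years, white_cell_count, region_code]
    X = [[4, 12.0, 1], [6, 55.0, 2], [3, 8.5, 1], [9, 80.0, 3], [5, 20.0, 2]]
    y = ["standard_risk", "high_risk", "standard_risk", "high_risk", "standard_risk"]

    tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
    print(export_text(tree, feature_names=["age_years", "white_cell_count", "region_code"]))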

An Improved Illumination Normalization based on Anisotropic Smoothing for Face Recognition

Robust face recognition under various illumination environments is very difficult and must be achieved for successful commercialization. In this paper, we propose an improved illumination normalization method for face recognition. The illumination normalization algorithm based on anisotropic smoothing is known to be effective among illumination normalization methods, but it degrades the intensity contrast of the original image and produces less sharp edges. The method proposed in this paper improves the previous anisotropic smoothing-based illumination normalization method so that it increases the intensity contrast and enhances the edges while diminishing the effect of illumination variations. As a result of these improvements, face images preprocessed by the proposed illumination normalization method yield more distinctive Gabor feature vectors for face recognition. Through face recognition experiments based on Gabor feature vector similarity, the effectiveness of the proposed illumination normalization method is verified.
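
As a rough sketch of the reflectance-style normalization being improved here (an isotropic Gaussian blur stands in for the anisotropic smoothing step, and the smoothing scale is an arbitrary assumption), the idea is to estimate the illumination field, divide it out, and re-stretch the contrast:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def normalize_illumination(image, sigma=8.0, eps=1e-6):
        """image: 2-D float array in [0, 1]; returns a contrast-stretched I / L."""
        luminance = gaussian_filter(image, sigma=sigma)   # smooth illumination estimate L
        reflectance = image / (luminance + eps)           # divide out the illumination
        reflectance -= reflectance.min()                  # re-stretch to [0, 1] for contrast
        return reflectance / (reflectance.max() + eps)

    # Usage: normalized = normalize_illumination(face_gray.astype(float) / 255.0)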

Transient Stress Analysis on Medium Modules Spur Gear by Using Mode Super Position Technique

Natural frequencies and the dynamic response of a spur gear sector are investigated using a two-dimensional finite element model that offers significant advantages for dynamic gear analyses. The gear teeth are analyzed for different operating speeds. A primary feature of this modeling is the determination of mesh forces using a detailed contact analysis for each time step as the gears roll through the mesh. ANSYS software has been used on the proposed model to find the natural frequencies by the Block Lanczos technique and the displacements and dynamic stresses by the transient mode superposition method. The effect of the rotational speed of the gear on the dynamic response of the gear tooth has been studied and design limits have been discussed.
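
The mode superposition step can be illustrated on a toy two-degree-of-freedom system (the paper performs it in ANSYS on the full finite element gear model; the mass and stiffness values below are arbitrary): solve the generalized eigenproblem for natural frequencies and mode shapes, then superpose the modal responses.

    import numpy as np
    from scipy.linalg import eigh

    M = np.diag([1.0, 2.0])                                # illustrative mass matrix
    K = np.array([[400.0, -150.0], [-150.0, 300.0]])       # illustrative stiffness matrix

    # Natural frequencies and mode shapes from K.phi = w^2 M.phi
    w2, phi = eigh(K, M)                                   # phi is M-orthonormal
    print("natural frequencies [rad/s]:", np.sqrt(w2))

    # Undamped free response to an initial displacement, built mode by mode
    x0 = np.array([0.01, 0.0])
    q0 = phi.T @ M @ x0                                    # modal initial amplitudes
    t = np.linspace(0.0, 0.5, 6)
    x_t = sum(q0[i] * np.outer(phi[:, i], np.cos(np.sqrt(w2[i]) * t)) for i in range(2))
    print(x_t)                                             # displacement history, one row per DOF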

Stochastic Scheduling to Minimize Expected Lateness in Multiple Identical Machines

There are many real-world problems in which parameters like the arrival time of new jobs, failure of resources, and completion time of jobs change continuously. This paper tackles the problem of scheduling jobs with random due dates on multiple identical machines in a stochastic environment. First, LPT (longest processing time) scheduling is used to assign jobs to the different machine centers; after that, the particular sequence of jobs to be processed on each machine is found using simple stochastic techniques. The performance parameter under consideration is the maximum lateness with respect to the stochastic due dates, which are independent and exponentially distributed. At the end, a relevant problem is solved using the techniques presented in the paper.
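
The LPT assignment step mentioned above can be sketched as follows (the job processing times and machine count are illustrative; the subsequent stochastic sequencing with exponential due dates is not shown):

    import heapq

    def lpt_assign(processing_times, n_machines):
        """Assign jobs to identical machines: longest jobs first, each to the
        currently least-loaded machine. Returns per-machine job lists."""
        loads = [(0.0, m) for m in range(n_machines)]      # (current load, machine id)
        heapq.heapify(loads)
        schedule = [[] for _ in range(n_machines)]
        for job, p in sorted(enumerate(processing_times), key=lambda jp: -jp[1]):
            load, m = heapq.heappop(loads)
            schedule[m].append(job)
            heapq.heappush(loads, (load + p, m))
        return schedule

    print(lpt_assign([7, 3, 9, 4, 6, 2], n_machines=2))    # -> [[2, 3, 1], [0, 4, 5]]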

Partial Derivatives and Optimization Problem on Time Scales

The optimization problem on time scales is studied. A time scale is a model of time, and the language of time scales seems to be an ideal tool to unify the continuous-time and discrete-time theories. In this work we present necessary conditions for a solution of an optimization problem on time scales. To obtain this result we use properties and results of the partial diamond-alpha derivatives for continuous multivariable functions, which are also presented here.
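
For reference, the diamond-alpha derivative used in this setting is commonly expressed, at points where both one-sided derivatives exist, as a convex combination of the delta (forward) and nabla (backward) derivatives on a time scale; in LaTeX notation:

    f^{\diamondsuit_\alpha}(t) \;=\; \alpha\, f^{\Delta}(t) \;+\; (1-\alpha)\, f^{\nabla}(t), \qquad \alpha \in [0,1].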

An Application of the Sinc-Collocation Method to a Three-Dimensional Oceanography Model

In this paper, we explore the applicability of the Sinc-Collocation method to a three-dimensional (3D) oceanography model. The model describes a wind-driven current with depth-dependent eddy viscosity in the complex-velocity system. In general, Sinc-based methods excel over other traditional numerical methods because of their exponentially decaying errors, rapid convergence, and ability to handle problems with singularities at the end-points. Together with these advantages, the Sinc-Collocation approach that we utilize exploits first-derivative interpolation, whose integration is much less sensitive to numerical errors. We present several model problems to demonstrate the accuracy, stability, and computational efficiency of the method. The approximate solutions determined by the Sinc-Collocation technique are compared with exact solutions and with those obtained by the Sinc-Galerkin approach in earlier studies. Our findings indicate that the Sinc-Collocation method outperforms the other Sinc-based methods of past studies.
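
The cardinal (sinc) expansion that underlies a Sinc-Collocation discretization can be sketched briefly; the step size h, truncation level N, and the sample function below are illustrative choices, not the oceanography model itself:

    import numpy as np

    def sinc_basis(k, h, x):
        return np.sinc((x - k * h) / h)            # numpy sinc: sin(pi z) / (pi z)

    def sinc_expansion(f, h, N, x):
        """Truncated cardinal expansion sum_{k=-N..N} f(k h) S(k, h)(x)."""
        return sum(f(k * h) * sinc_basis(k, h, x) for k in range(-N, N + 1))

    f = lambda t: np.exp(-t**2)
    x = np.linspace(-3.0, 3.0, 13)
    err = np.max(np.abs(sinc_expansion(f, h=0.5, N=20, x=x) - f(x)))
    print(f"max interpolation error: {err:.1e}")   # decays rapidly as h shrinks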

Computer-aided Lenke Classification of Scoliotic Spines

The identification and classification of spine deformity play an important role in surgical planning for adolescent patients with idiopathic scoliosis. The subject of this article is the Lenke classification of scoliotic spines using Cobb angle measurements. The purpose is two-fold: (1) to design a rule-based diagram to assist clinicians in the classification process and (2) to investigate a computer classifier that improves classification time and accuracy. The efficiency of the rule-based diagram was evaluated in a series of scoliotic classifications by 10 clinicians. The computer classifier was tested on a radiographic measurement database of 603 patients. Classification accuracy was 93% using the rule-based diagram and 99% for the computer classifier. Both the computer classifier and the rule-based diagram can efficiently assist clinicians in their Lenke classification of spine scoliosis.
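
As a heavily simplified illustration of what a rule-based classification step looks like (this is not the complete Lenke system, which also uses side-bending films, sagittal modifiers, and the lumbar modifier; the structural flags below are assumed to have been determined beforehand from the Cobb measurements):

    def curve_type(pt_structural, mt_structural, tl_structural):
        """Toy curve-type rules from three structural-curve flags:
        proximal thoracic (PT), main thoracic (MT), thoracolumbar/lumbar (TL/L)."""
        if mt_structural and not pt_structural and not tl_structural:
            return "Type 1 (main thoracic)"
        if mt_structural and pt_structural and not tl_structural:
            return "Type 2 (double thoracic)"
        if mt_structural and tl_structural and not pt_structural:
            return "Type 3 (double major)"
        if mt_structural and tl_structural and pt_structural:
            return "Type 4 (triple major)"
        if tl_structural and not mt_structural:
            return "Type 5/6 (thoracolumbar/lumbar; distinction omitted here)"
        return "indeterminate in this simplified fragment"

    print(curve_type(pt_structural=False, mt_structural=True, tl_structural=False))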

A Subtractive Clustering Based Approach for Early Prediction of Fault Proneness in Software Modules

In this paper, a subtractive-clustering-based fuzzy inference system approach is used for early detection of faults in function-oriented software systems. The approach has been tested with defect datasets of the real-time NASA software projects PC1 and CM1. Both the code-based model and the joined model (a combination of requirement-based and code-based metrics) of the datasets are used for training and testing of the proposed approach. The performance of the models is recorded in terms of accuracy, MAE, and RMSE values, and the proposed approach performs better in the case of the joined model. As evidenced by the results obtained, it can be concluded that clustering and fuzzy logic together provide a simple yet powerful means to model early detection of faults in function-oriented software systems.
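
A compact sketch of a subtractive clustering step (in the style of Chiu's potential method) of the kind used to seed a fuzzy inference system is shown below; the neighbourhood radii and the toy metric vectors are illustrative assumptions, not the PC1/CM1 data:

    import numpy as np

    def subtractive_clustering(X, ra=0.5, rb=0.75, n_clusters=2):
        """Return the indices of data points selected as cluster centres."""
        alpha, beta = 4.0 / ra**2, 4.0 / rb**2
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)      # pairwise squared distances
        potential = np.exp(-alpha * d2).sum(axis=1)               # density potential of each point
        centres = []
        for _ in range(n_clusters):
            c = int(np.argmax(potential))
            centres.append(c)
            potential -= potential[c] * np.exp(-beta * d2[:, c])  # suppress points near the centre
        return centres

    X = np.array([[0.1, 0.2], [0.15, 0.22], [0.9, 0.8], [0.95, 0.85], [0.5, 0.5]])
    print(subtractive_clustering(X))   # picks one centre in each dense group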

Sequence-based Prediction of Gamma-turn Types using a Physicochemical Property-based Decision Tree Method

γ-turns play important roles in protein folding and molecular recognition. The prediction and analysis of γ-turn types are important both for protein structure prediction and for better understanding the characteristics of different γ-turn types. This study proposes a physicochemical property-based decision tree (PPDT) method to predict γ-turn types interpretably. In addition to the good prediction performance of PPDT, three simple, human-interpretable IF-THEN rules are extracted from the decision tree constructed by PPDT. The identified informative physicochemical properties and concise rules provide a simple way of discriminating and understanding γ-turn types.
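
Purely to illustrate the form of the extracted knowledge (the property names, thresholds, and class labels below are invented placeholders, not the three rules reported by PPDT), an IF-THEN rule set reads like this:

    def classify_gamma_turn(hydrophobicity, flexibility):
        """Toy IF-THEN rules over physicochemical property values."""
        if hydrophobicity <= 0.42 and flexibility > 0.61:
            return "classic gamma-turn"
        if hydrophobicity > 0.42:
            return "inverse gamma-turn"
        return "undetermined (toy fallback)"

    print(classify_gamma_turn(hydrophobicity=0.30, flexibility=0.70))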

Natural Radioactivity Measurements of Basalt Rocks in Sidakan District Northeastern of Kurdistan Region-Iraq

The amount of radioactivity in igneous rocks has been investigated; samples were collected from a total of eight basalt rock types in the northeast of the Kurdistan region, Iraq. The activity concentrations of the 226Ra (238U) series, the 228Ac (232Th) series, 40K, and 137Cs were measured using planar HPGe and NaI(Tl) detectors. Across the study area, the radium equivalent activities Raeq of the samples under investigation were found to be in the range of 22.16 to 77.31 Bq/kg with an average value of 44.8 Bq/kg; this value is well below the internationally accepted value of 370 Bq/kg. To estimate the health effects of this natural radioactivity, the average values of the absorbed gamma dose rate D (55 nGy h^-1), the indoor and outdoor annual effective dose rates Eied (0.11 mSv y^-1) and Eoed (0.03 mSv y^-1), the external hazard index Hex (0.138), the internal hazard index Hin (0.154), and the representative level index Iγr (0.386) have been calculated and found to be lower than the worldwide average values.
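
For reference, the radium equivalent activity quoted above is conventionally computed from the specific activities of 226Ra, 232Th, and 40K (all in Bq/kg) as:

    \mathrm{Ra_{eq}} \;=\; C_{\mathrm{Ra}} + 1.43\, C_{\mathrm{Th}} + 0.077\, C_{\mathrm{K}} \quad \text{(Bq/kg)}.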

Study of Base-Isolation Building System

In order to improve the effectiveness of isolation structures, the principles and behaviour of the base-isolation system are studied, and the types and characteristics of base isolation are also discussed. Compared with traditional aseismic structures, base-isolated structures decrease the seismic response markedly: the total structural seismic response decreases to 1/4-1/32 and the seismic shear stress in the upper structure decreases to 1/14-1/23. Even in a large earthquake, the structure retains an obvious aseismic effect.

Conceptual Overview of Housing Affordability in Selangor, Malaysia

The socioeconomic stability and development of a country can be described in part by housing affordability. The aim is to ensure that the housing provided is affordable by every income group, whether low-income, middle-income, or high-income. This research was carried out to find out the level of home-ownership affordability of a first medium-cost landed house for the middle-income group in Selangor, Malaysia. It is also hoped that the study contributes to the knowledge and understanding of the housing affordability level of the middle-income group and of the variables that influence this group's ability to own a first medium-cost house.

Network Intrusion Detection Design Using Feature Selection of Soft Computing Paradigms

The network traffic data provided for the design of intrusion detection are always large, contain ineffective information, and carry only limited and ambiguous information about users' activities. We study these problems and propose a two-phase approach in our intrusion detection design. In the first phase, we develop a correlation-based feature selection algorithm to remove worthless information from the original high-dimensional database. Next, we design an intrusion detection method to solve the problems of uncertainty caused by limited and ambiguous information. In the experiments, we choose six UCI databases and the DARPA KDD99 intrusion detection data set as our evaluation tools. Empirical studies indicate that our feature selection algorithm is capable of reducing the size of the data set. Our intrusion detection method achieves better performance than that of the participating intrusion detectors.
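
A minimal sketch of a correlation-based feature selection pass of the kind described above (the redundancy threshold and the synthetic data are illustrative assumptions; the actual algorithm in the paper may rank and filter differently):

    import numpy as np

    def correlation_feature_selection(X, y, max_inter_corr=0.8):
        """X: (n_samples, n_features), y: numeric labels. Keep features that are
        relevant to y and not highly correlated with already-selected features."""
        relevance = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
        selected = []
        for j in np.argsort(-relevance):                      # most label-relevant first
            redundant = any(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > max_inter_corr
                            for k in selected)
            if not redundant:
                selected.append(int(j))
        return selected

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=200)           # feature 1 duplicates feature 0
    y = (X[:, 0] > 0).astype(float)
    print(correlation_feature_selection(X, y))                # the redundant copy is dropped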

A New Approach to Polynomial Neural Networks based on Genetic Algorithm

Recently, much attention has been devoted to advanced techniques of system modeling. PNN (polynomial neural network) is a GMDH-type (Group Method of Data Handling) algorithm and a useful method for modeling nonlinear systems, but PNN performance depends strongly on the number of input variables and the order of the polynomial, which are determined by trial and error. In this paper, we introduce GPNN (genetic polynomial neural network) to improve the performance of PNN. GPNN determines the number of input variables and the order of all neurons with a GA (genetic algorithm); the GA searches over all possible values of the number of input variables and the order of the polynomial. GPNN performance is evaluated on two nonlinear systems: a quadratic equation and the Dow Jones stock index time series are the two case studies used to obtain the GPNN performance.
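
A single second-order PNN (GMDH-type) neuron can be sketched as a least-squares fit of a quadratic polynomial in two inputs; in GPNN the genetic algorithm would choose which inputs and which polynomial order each neuron uses, and that outer GA loop, together with the real case-study data, is omitted from this illustrative fragment:

    import numpy as np

    def fit_pnn_neuron(x1, x2, y):
        """Fit y ~ c0 + c1*x1 + c2*x2 + c3*x1*x2 + c4*x1^2 + c5*x2^2."""
        A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
        coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
        return coeffs, A @ coeffs                      # coefficients and fitted output

    rng = np.random.default_rng(1)
    x1, x2 = rng.normal(size=100), rng.normal(size=100)
    y = 1.0 + 2.0 * x1 - 0.5 * x1 * x2 + 0.1 * rng.normal(size=100)
    coeffs, y_hat = fit_pnn_neuron(x1, x2, y)
    print(np.round(coeffs, 2))   # approximately [1.0, 2.0, 0.0, -0.5, 0.0, 0.0]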

Study of Chest Pain and its Risk Factors in Over 30 Year-Old Individuals

Chest pain is one of the most prevalent complaints among adults and causes people to attend medical centers. The aim was to determine the prevalence and risk factors of chest pain among people over 30 years old in Tehran. In this cross-sectional study, 787 adults took part from April 2005 until April 2006. The sampling method was random cluster sampling with 25 clusters. In each cluster, interviews were performed with 32 people over 30 years old living in the selected houses. In cases with chest pain, extra questions were asked. The prevalence of CP was 9% (71 cases). Of them, 21 cases (6.5%) were in the 41-60 year age range and the remainder were over 61 years old. Nineteen cases (26.8%) mentioned CP in the resting state and all of the cases had exertion-onset CP. The CP duration was 10 minutes or less in all of the cases, and in most of them (84.5%) the location of pain was the left anterior part of the chest, the left anterior part of the sternum, and/or the left arm. There was a positive history of myocardial infarction in 12 cases (17%). There were significant relations between CP and age and sex, and between history of myocardial infarction and marital status of the study participants. Our results are similar to the results of other studies in most respects; however, it is necessary to perform supplementary tests and follow-up studies to differentiate exactly between cardiac and non-cardiac CP.

High-Speed Train Planning in France, Lessons from Mediterranean TGV-Line

To fight against the economic crisis, the French Government, like many others in Europe, has decided to give a boost to high-speed line projects. This paper explores the implementation and decision-making process of TGV projects and their evolution, especially since the Mediterranean TGV line. This project was probably the most controversial, but paradoxically it represents today a huge success for all the actors involved. What lessons can we learn from this experience? How can we evaluate the impact of this project on TGV-line planning? How can we characterize this implementation and decision-making process with regard to sustainability challenges? The construction of the Mediterranean TGV line was the occasion for several innovations: introducing more dialogue into the decision-making process, taking the environment into account, and introducing new project management practices and technological innovations. That is why this project appears today as an example of the integration of sustainable development. In this paper we examine the different kinds of innovations developed in this project, using concepts from the sociology of innovation to understand how these solutions emerged in a controversial situation. Then we analyze the lessons that were drawn from this decision-making process (both immediately and a posteriori) and the way in which procedures evolved: the creation of new tools and devices (public consultation, project management...). Finally, we try to highlight the impact of this evolution on the governance of TGV projects. In particular, new methods of implementation and financing involve a reconfiguration of the system of actors. The aim of this paper is to define the impact of this reconfiguration on negotiations between stakeholders.

The Analysis of Two-Phase Jet in Pneumatic Powder Injection into Liquid Alloys

The paper presents the results of an analysis of the two-phase gas-solid jet in the pneumatic powder injection process. The research was conducted on a model set-up in which the jet movement was recorded with a high-speed camera. The recorded material was then analyzed to estimate the main parameters of particle movement. The values obtained from this direct measurement were compared with those calculated using the well-known formulas for two-phase flows (pneumatic conveying). Moreover, they were compared with experimental results previously achieved by the authors. The analysis led to conclusions that to some extent changed the assumptions, used even by the authors, regarding the two-phase jet in the pneumatic powder injection process. Additionally, the visual analysis of the recorded clips supplied data for a more complete evaluation of the jet behavior at the lance outlet than was previously possible.

Analysis of DNA-Recognizing Enzyme Interaction using Deaminated Lesions

Deaminated lesions are produced by nitrosative oxidation of the natural nucleobases: uracil (Ura, U) from cytosine (Cyt, C), hypoxanthine (Hyp, H) from adenine (Ade, A), and xanthine (Xan, X) and oxanine (Oxa, O) from guanine (Gua, G). Such damaged nucleobases may cause mutagenic problems, so much attention and effort have been devoted to revealing their mechanisms in vivo and in vitro. In this study, we employed these deaminated lesions as useful probes for the analysis of DNA-binding/recognizing proteins or enzymes. Since purine lesions such as Hyp, Oxa, and Xan can be employed as analogues of guanine, their comparative use is informative for analyzing the role of Gua in a DNA sequence in DNA-protein interactions. Several DNA oligomers containing Hyp, Oxa, or Xan substituted for Gua were designed to reveal the molecular interaction between DNA and protein. From this approach, we obtained useful information for understanding the molecular mechanisms of DNA-recognizing enzymes that could not be observed using conventional DNA oligomers composed only of natural nucleobases.

Artificial Neural Networks and Multi-Class Support Vector Machines for Classifying Magnetic Measurements in Tokamak Reactors

This paper is mainly concerned with the application of a novel data interpretation technique for classifying measurements of plasma columns in Tokamak reactors for nuclear fusion applications. The proposed method exploits several concepts derived from soft computing theory. In particular, Artificial Neural Networks and Multi-Class Support Vector Machines are exploited to classify magnetic variables useful for determining the shape and position of the plasma with reduced computational complexity. The proposed technique is used to analyze simulated databases of plasma equilibria based on the ITER geometry configuration. As well as demonstrating the successful recovery of scalar equilibrium parameters, we show that the technique can yield practical advantages compared with earlier methods.
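
The multi-class SVM stage can be sketched with synthetic data as follows (scikit-learn's SVC, which handles the multi-class case via one-vs-one voting, stands in for whatever implementation the paper used; the Gaussian blobs below are placeholders for the simulated ITER equilibria):

    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(42)
    n_per_class, n_sensors = 50, 12
    # Three hypothetical plasma-shape classes, each a Gaussian blob of sensor readings
    X = np.vstack([rng.normal(loc=c, scale=0.3, size=(n_per_class, n_sensors))
                   for c in (0.0, 1.0, 2.0)])
    y = np.repeat([0, 1, 2], n_per_class)

    clf = SVC(kernel="rbf", C=1.0).fit(X, y)
    print("training accuracy:", clf.score(X, y))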