Virtualization Technology as a Tool for Teaching Computer Networks

This paper describes a possible use of virtualization technology in teaching computer networks. Virtualization can serve as a suitable tool for creating virtual network laboratories that supplement real laboratories and network simulation software in teaching networking concepts. A short description is given of characteristic projects that use virtualization technology for network simulation, network experiments and engineering education. A method for implementing the laboratory is also explained, together with possible laboratory usage and the design of laboratory exercises. Finally, test results of the virtual laboratory are presented.

Patterns of Sports Supplement Use among Iranian Female Athletes

Supplement use is common among athletes. Besides their cost, supplements may have side effects on health and performance. 250 questionnaires were distributed among female athletes (mean age 27.08 years). The questionnaire aimed to explore the frequency, type, beliefs, attitudes and knowledge regarding dietary supplements. Knowledge was good in 30.3%, fair in 60.2%, and poor in 9.1% of respondents. 65.3% of athletes did not use supplements regularly. The most widely used supplements were vitamins (48.4%), minerals (42.9%), energy supplements (21.3%), and herbals (20.9%). 68.9% of athletes believed in their efficacy. 34.4% experienced performance enhancement and 6.8% reported side effects. 68.2% reported little knowledge and 60.9% were eager to learn more. In conclusion, many female athletes believe in the efficacy of supplements and think they are an unavoidable part of competitive sports. However, their information is not sufficient. Emphasis should therefore be placed on education, consulting sessions, and rational prescription.

Design of an SNMP Agent for OSGi Service Platforms

On one hand, SNMP (Simple Network Management Protocol) allows different enterprise elements connected through the Internet to be integrated into standardized remote management. On the other hand, as a consequence of the success of intelligent houses, these can now be connected to the Internet by means of a residential gateway according to a common standard called OSGi (Open Services Gateway initiative). Due to the specifics of OSGi Service Platforms and their dynamic nature, specific design criteria should be defined to implement SNMP Agents for OSGi in order to integrate them into SNMP remote management. Based on an analysis of the relation between the two standards (SNMP and OSGi), this paper shows how OSGi Service Platforms can be included in the SNMP management of a global enterprise, giving implementation details of an SNMP Agent solution and the definition of a new MIB (Management Information Base) for managing OSGi platforms that takes the specifics and dynamic nature of OSGi into account.
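
As a rough illustration of the manager side of such a setup, the sketch below performs a single SNMPv2c GET using the pysnmp library; the gateway address, community string and OID are hypothetical placeholders, not the actual MIB defined in the paper:

```python
from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

# One SNMPv2c GET against a residential gateway; host, community string and
# OID are placeholders (the real OIDs would come from the paper's OSGi MIB).
iterator = getCmd(
    SnmpEngine(),
    CommunityData('public', mpModel=1),                     # SNMPv2c
    UdpTransportTarget(('gateway.example.org', 161)),
    ContextData(),
    ObjectType(ObjectIdentity('1.3.6.1.4.1.99999.1.1.0')),  # placeholder OID
)

error_indication, error_status, error_index, var_binds = next(iterator)
if error_indication:
    print(error_indication)                                 # e.g. timeout
else:
    for name, value in var_binds:
        print(f'{name} = {value}')
```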

A Novel Multiplex Real-Time PCR Assay Using TaqMan MGB Probes for Rapid Detection of Trisomy 21

Cytogenetic analysis still remains the gold-standard method for prenatal diagnosis of trisomy 21 (Down syndrome, DS). Nevertheless, conventional cytogenetic analysis needs live cultured cells and is too time-consuming for clinical application. In contrast, molecular methods such as FISH, QF-PCR, MLPA and quantitative real-time PCR are rapid assays with results available within 24 h. In the present study, we have successfully used a novel MGB TaqMan probe-based real-time PCR assay for rapid diagnosis of trisomy 21 status in Down syndrome samples. We have also compared the results of this molecular method with corresponding results obtained by cytogenetic analysis. Blood samples obtained from DS patients (n=25) and normal controls (n=20) were tested by quantitative real-time PCR in parallel to standard G-banding analysis. Genomic DNA was extracted from peripheral blood lymphocytes. A high-precision TaqMan probe quantitative real-time PCR assay was developed to determine the gene dosage of DSCAM (target gene on 21q22.2) relative to PMP22 (reference gene on 17p11.2). The DSCAM/PMP22 ratio was calculated according to the formula ratio = 2^(-ΔΔCT). The quantitative real-time PCR was able to distinguish between trisomy 21 samples and normal controls, with gene ratios of 1.49±0.13 and 1.03±0.04 respectively (p value
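
A minimal sketch of the 2^(-ΔΔCT) gene-dosage calculation described above; the Ct values used here are hypothetical, purely for illustration:

```python
# All Ct values below are hypothetical, for illustration only.

def ddct_ratio(ct_target_sample, ct_ref_sample, ct_target_calib, ct_ref_calib):
    """Relative gene dosage via the 2^(-ΔΔCt) method."""
    dct_sample = ct_target_sample - ct_ref_sample   # ΔCt of the test sample
    dct_calib = ct_target_calib - ct_ref_calib      # ΔCt of a normal calibrator
    return 2.0 ** -(dct_sample - dct_calib)         # 2^(-ΔΔCt)

# DSCAM (target) vs. PMP22 (reference): a ratio near 1.5 suggests trisomy 21,
# a ratio near 1.0 a normal disomic sample.
print(ddct_ratio(24.1, 25.0, 24.8, 25.1))           # ≈ 1.52
```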

Development of a Catchment Water Quality Model for Continuous Simulations of Pollutants Build-up and Wash-off

Estimation of runoff water quality parameters is required to determine appropriate water quality management options. Various models are used to estimate runoff water quality parameters. However, most models provide event-based estimates of water quality parameters for specific sites. The work presented in this paper describes the development of a model that continuously simulates the accumulation and wash-off of water quality pollutants in a catchment. The model allows estimation of pollutant build-up during dry periods and pollutant wash-off during storm events. The model was developed by integrating two individual models: a rainfall-runoff model and a catchment water quality model. The rainfall-runoff model is based on the time-area runoff estimation method. The model allows users to estimate the time of concentration using a range of established methods. The model also allows estimation of the continuing runoff losses using any of the available estimation methods (i.e., constant, linearly varying or exponentially varying). Pollutant build-up in a catchment is represented by one of three pre-defined functions: power, exponential, or saturation. Similarly, pollutant wash-off is represented by one of three different functions: power, rating-curve, or exponential. The developed runoff water quality model was set up to simulate the build-up and wash-off of total suspended solids (TSS), total phosphorus (TP) and total nitrogen (TN). The application of the model was demonstrated using available runoff and TSS field data from road and roof surfaces in the Gold Coast, Australia. The model provided an excellent representation of the field data, demonstrating the simplicity yet effectiveness of the proposed model.
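
As a rough sketch of the six functional forms named above, the snippet below implements the three build-up and three wash-off functions with hypothetical, uncalibrated coefficients:

```python
import numpy as np

# Coefficients (a, b, k, m_max) are hypothetical placeholders, not calibrated.

def buildup(t_dry, form="power", a=2.0, b=0.5, m_max=50.0, k=0.3):
    """Pollutant mass on the surface after t_dry dry days."""
    if form == "power":
        return a * t_dry ** b
    if form == "exponential":
        return m_max * (1.0 - np.exp(-k * t_dry))
    if form == "saturation":                  # half-saturation constant k
        return m_max * t_dry / (k + t_dry)
    raise ValueError(form)

def washoff(m0, q, dt, form="exponential", k=0.2, a=1.0, b=1.5):
    """Mass washed off by runoff rate q during a time step of length dt."""
    if form == "exponential":
        return m0 * (1.0 - np.exp(-k * q * dt))
    if form == "rating-curve":
        return min(m0, a * q ** b * dt)
    if form == "power":
        return min(m0, k * m0 * q ** b * dt)
    raise ValueError(form)

m = buildup(7.0, "saturation")                # mass after a 7-day dry spell
print(m, washoff(m, q=0.8, dt=1.0))           # mass washed off in one step
```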

Infrared Face Recognition Using Distance Transforms

In this work, we present an efficient approach for face recognition in the infrared spectrum. In the proposed approach, physiological features are extracted from thermal images in order to build a unique thermal faceprint. A distance transform is then used to obtain an invariant representation for face recognition. The extracted physiological features are related to the distribution of blood vessels under the skin of the face. This blood-vessel network is unique to each individual and can be used in infrared face recognition. The obtained results are promising and show the effectiveness of the proposed scheme.
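
A minimal sketch of the distance-transform step, assuming a binary vascular-network mask has already been segmented from the thermal image (the mask below is a random stand-in):

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
vessel_mask = rng.random((128, 128)) > 0.98   # random stand-in "faceprint"

# Euclidean distance from every non-vessel pixel to the nearest vessel pixel;
# this smooth map around the vascular pattern is the representation that is
# matched between faces.
dist_map = ndimage.distance_transform_edt(~vessel_mask)
print(dist_map.shape, dist_map.max())
```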

Effect of Different Lactic Acid Bacteria on Phytic Acid Content and Quality of Whole Wheat Toast Bread

Nowadays, consumption of whole flours and flours with a high extraction rate is recommended because of their high amounts of fiber, vitamins and minerals. Despite the nutritional benefits of whole flour, the concentration of some undesirable components such as phytic acid is higher than in white flour. In this study, the effect of several lactic acid bacteria sourdoughs on toast bread is investigated. Sourdoughs from lactic acid bacteria (Lb. plantarum, Lb. reuteri) with different dough yields (DY = 250 and 300) were made and incubated at 30°C for 20 hours, then added to the dough at 10, 20 and 30% replacement. Breads supplemented with Lb. plantarum sourdough had lower phytic acid. Higher sourdough replacement and higher DY caused a greater decrease in phytic acid content. Sourdough from Lb. plantarum with DY = 300 and 30% replacement caused the highest decrease in phytic acid content (49.63 mg/100 g). As indicated by panelists, Lb. reuteri sourdough had the greatest effect on the overall quality score of the breads. DY reduction caused a decrease in bread quality score. The sensory score of the toast bread was 81.71 in samples treated with Lb. reuteri sourdough with DY = 250 and 20% replacement.

Study on the Mechanical Behavior of the Varactor of a Micro-Phase Shifter

In this paper, the static and dynamic responses of a varactor of a micro-phase shifter to DC, step DC and AC voltages are studied. A mathematical model is presented, and a Galerkin-based step-by-step linearization method (SSLM) and a Galerkin-based reduced-order model are used to solve the governing static and dynamic equations, respectively. The calculated static and dynamic pull-in voltages are validated against previous experimental and theoretical results, and good agreement is achieved. The frequency response and phase diagram of the system are then studied. It is shown that applying the DC voltage shifts the phase diagram and frequency response down, while increasing the damping ratio shifts the phase diagram up.
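
For intuition, the snippet below evaluates the classic lumped parallel-plate pull-in voltage; this is only a sanity-check approximation with hypothetical parameter values, not the Galerkin/SSLM beam model used in the paper:

```python
import numpy as np

eps0 = 8.854e-12        # vacuum permittivity, F/m
k = 2.0                 # effective spring stiffness, N/m (hypothetical)
d = 2.0e-6              # initial gap, m (hypothetical)
A = 100e-6 * 100e-6     # electrode area, m^2 (hypothetical)

# Lumped-model pull-in voltage: V_PI = sqrt(8 k d^3 / (27 eps0 A)).
v_pi = np.sqrt(8.0 * k * d**3 / (27.0 * eps0 * A))
print(f"pull-in voltage ≈ {v_pi:.2f} V")
```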

Identifying Significant Factors of Brick Laying Process through Design of Experiment and Computer Simulation: A Case Study

Improving performance measures in construction processes has been a major concern for managers and decision makers in the industry. They seek ways to recognize the key factors that have the largest effect on the process. Identifying such factors can guide them to focus on the right parts of the process in order to obtain the best possible result. In the present study, design of experiments (DOE) has been applied to a computer simulation model of the brick-laying process to determine the significant factors, with productivity chosen as the response of the experiment. To this end, four controllable factors and their interactions were examined, and the best level was determined for each factor. The results indicate that three factors, namely brick labor, mortar labor and mortar inter-arrival time, along with the interaction of brick labor and mortar labor, are significant.
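
A minimal sketch of such a screening experiment: a two-level full factorial design run against a stand-in simulation function, with main effects estimated from the responses (the simulate() function and factor names are hypothetical):

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

def simulate(brick_labor, mortar_labor, interarrival, crew_shift):
    # Toy response surface standing in for the simulation model; a real study
    # would run the brick-laying simulation here and return productivity.
    return (5 * brick_labor + 4 * mortar_labor
            + 2 * brick_labor * mortar_labor
            - 3 * interarrival + rng.normal(scale=0.5))

runs = list(itertools.product([-1, 1], repeat=4))   # 2^4 full factorial
y = np.array([simulate(*r) for r in runs])
X = np.array(runs)

# Main effect of each factor: mean response at the high level minus the mean
# response at the low level.
for name, col in zip(["brick labor", "mortar labor",
                      "inter-arrival", "crew shift"], X.T):
    print(f"{name:13s} effect ≈ {y[col == 1].mean() - y[col == -1].mean():+.2f}")
```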

Oil Refinery Emissions, Source and Impact: A Study Using AERMOD

The main objectives of this paper are to measure pollutant concentrations in the oil refinery area of Kuwait over three periods during one year; to obtain a recent emission inventory for the three refineries of Kuwait; to use AERMOD together with the emission inventory to predict pollutant concentrations and their distribution; to compare model predictions against measured data; and to perform numerical experiments to determine the conditions under which emission rates, and the resulting pollutant dispersion, remain below the maximum allowable limits.
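
For orientation, the snippet below evaluates a simplified Gaussian-plume ground-level concentration; AERMOD itself uses far more sophisticated meteorology and terrain treatment, and all inputs here are hypothetical:

```python
import numpy as np

Q = 50.0      # emission rate, g/s
u = 4.0       # wind speed at stack height, m/s
H = 60.0      # effective stack height, m
x = 2000.0    # downwind distance, m

# Briggs-style rural dispersion coefficients for neutral stability (class D).
sigma_y = 0.08 * x / np.sqrt(1 + 0.0001 * x)
sigma_z = 0.06 * x / np.sqrt(1 + 0.0015 * x)

# Ground-level centerline concentration with full ground reflection, g/m^3.
C = (Q / (np.pi * u * sigma_y * sigma_z)) * np.exp(-H**2 / (2 * sigma_z**2))
print(f"{C * 1e6:.1f} µg/m^3")
```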

Approximations to the Distribution of the Sample Correlation Coefficient

Given a bivariate normal sample of correlated variables, (Xi, Yi), i = 1, . . . , n, an alternative estimator of Pearson's correlation coefficient is obtained in terms of the ranges, |Xi − Yi|. An approximate confidence interval for ρX,Y is then derived, and a simulation study reveals that the resulting coverage probabilities are in close agreement with the set confidence levels. In addition, a new approximant is provided for the density function of R, the sample correlation coefficient. A mixture involving the proposed approximate density of R, denoted by hR(r), and a density function determined from a known approximation due to R. A. Fisher is shown to accurately approximate the distribution of R. Finally, nearly exact density approximants are obtained by adjusting hR(r) with a seventh-degree polynomial.
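
For context, the snippet below illustrates the classical Fisher z-transform confidence interval for ρ, the standard benchmark such approximations are compared against; it is not the paper's new range-based estimator:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, rho = 50, 0.6
x, y = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n).T

r = np.corrcoef(x, y)[0, 1]
z = np.arctanh(r)                         # Fisher z-transform of r
se = 1.0 / np.sqrt(n - 3)                 # approximate standard error of z
half = stats.norm.ppf(0.975) * se
lo, hi = np.tanh(z - half), np.tanh(z + half)
print(f"r = {r:.3f}, 95% CI for rho: ({lo:.3f}, {hi:.3f})")
```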

Scheduling a Flexible Flow Shop Problem Using DEA

This paper considers a scheduling problem in a flexible flow shop environment with the aim of minimizing two important criteria: makespan and cumulative tardiness of jobs. Since the proposed problem is known to be NP-hard in the literature, we have to develop a meta-heuristic to solve it. We consider the general structure of the Genetic Algorithm (GA) and develop a new version of it based on Data Envelopment Analysis (DEA). The two objective functions are treated as two different inputs for each Decision Making Unit (DMU). In this paper, we focus on the efficiency scores of the DMUs and the efficient frontier concept of the DEA technique. After introducing the method, we define two different scenarios considering two types of mutation operator. We also provide an experimental design with computational results to show the performance of the algorithm. The results show that the algorithm runs in a reasonable time.
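
A minimal sketch of the efficient-frontier idea behind the fitness assignment: each chromosome is a DMU whose two inputs are makespan and cumulative tardiness, and non-dominated chromosomes lie on the frontier. This simplification uses a Pareto-dominance check rather than a full DEA linear program, and the objective values are hypothetical:

```python
import numpy as np

def efficient_mask(objs):
    """objs: (n, 2) array of (makespan, tardiness); True = on the frontier."""
    eff = np.ones(len(objs), dtype=bool)
    for i in range(len(objs)):
        dominated = np.any(np.all(objs <= objs[i], axis=1)
                           & np.any(objs < objs[i], axis=1))
        eff[i] = not dominated
    return eff

# Hypothetical (makespan, cumulative tardiness) values for five chromosomes.
objs = np.array([[120, 30], [110, 45], [130, 25], [125, 40], [115, 35]])
print(efficient_mask(objs))   # frontier members get selection preference
```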

Comparison of SVC and STATCOM in Static Voltage Stability Margin Enhancement

One of the major causes of voltage instability is the reactive power limit of the system. Improving the system's reactive power handling capacity via Flexible AC Transmission System (FACTS) devices is a remedy for the prevention of voltage instability and hence voltage collapse. In this paper, the effects of SVC and STATCOM on static voltage stability margin enhancement are studied. AC and DC representations of the SVC and STATCOM are used in the continuation power flow process of the static voltage stability study. The IEEE 14-bus system is simulated to test the increase in loadability. It is found that these controllers significantly increase the loadability margin of power systems.
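
For intuition about the loadability margin, the sketch below traces the upper branch of a two-bus PV ("nose") curve until voltage collapse; shunt compensation such as an SVC or STATCOM pushes this nose outward (all per-unit values are hypothetical):

```python
import numpy as np

E, X = 1.0, 0.5                                  # source voltage, line reactance
for P in np.linspace(0.05, 1.10, 22):            # active power demand, pu
    # Stable (upper) branch of the two-bus load flow with Q = 0:
    # V^2 = E^2/2 + sqrt(E^4/4 - (P X)^2)
    disc = E**4 / 4 - (P * X) ** 2
    if disc < 0:
        print(f"nose of the PV curve reached near P ≈ {P:.2f} pu")
        break
    V = np.sqrt(E**2 / 2 + np.sqrt(disc))
    print(f"P = {P:.2f} pu -> V = {V:.3f} pu")
```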

Human's Anthropological Appearance in Abai Kunanbayev's Works

The issue of human anthropology has played an important role throughout past epochs and still has not lost its importance. Scientists of different countries have been interested in investigating the appearance of the human being and the idea of life after death. While writing this article, we noticed that the scientists who researched this issue, despite living in different countries and different epochs, had similarities in their opinions. In this article, we present the great Kazakh poet Abai Kunanbayev's philosophical view of the problem of human anthropology.

Probabilistic Bayesian Framework for Infrared Face Recognition

Face recognition in the infrared spectrum has attracted a lot of interest in recent years. Many of the techniques used in the infrared are based on their visible-spectrum counterparts, especially linear techniques like PCA and LDA. In this work, we introduce a probabilistic Bayesian framework for face recognition in the infrared spectrum. In the infrared spectrum, variations can occur between face images of the same individual due to pose, metabolic and time changes, etc. Bayesian approaches make it possible to account for intrapersonal variation, which makes them very interesting for infrared face recognition. This framework is compared with classical linear techniques. Nonlinear techniques we developed recently for infrared face recognition are also presented and compared to the Bayesian face recognition framework. A new approach for infrared face extraction based on SVM is introduced. Experimental results show that the Bayesian technique is promising and leads to interesting results in the infrared spectrum when a sufficient number of face images is used in the intrapersonal learning process.
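
A minimal sketch of the Moghaddam-Pentland-style idea assumed here: difference images are modeled under intrapersonal and extrapersonal Gaussian classes and classified by comparing likelihoods (the data below are random placeholders):

```python
import numpy as np

rng = np.random.default_rng(2)
d = 20
intra = rng.normal(scale=0.5, size=(200, d))   # small same-person differences
extra = rng.normal(scale=2.0, size=(200, d))   # larger cross-person differences

def gaussian_logpdf(x, data):
    """Log-density under a diagonal Gaussian fitted to `data`."""
    mu, var = data.mean(0), data.var(0) + 1e-6
    return -0.5 * np.sum((x - mu) ** 2 / var + np.log(2 * np.pi * var))

delta = rng.normal(scale=0.5, size=d)          # a test difference image
is_same = gaussian_logpdf(delta, intra) > gaussian_logpdf(delta, extra)
print("same person" if is_same else "different person")
```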

A Two-Individual Genetic Algorithm

The particular interest of this paper is to explore whether a simple Genetic Algorithm (GA) that starts with a population of only two individuals and applies different crossover techniques to these parents to produce 104 children, each with different attributes inherited from the parents, performs better than a GA that starts with a population of 100 individuals and uses only one crossover type (order crossover, OX). For this reason, we implement a GA with 52 different crossover techniques, each producing two children, so that 104 different children are produced, which may explore more of the search space. We also implement a classic GA with order crossover. Many experiments were carried out on three Travelling Salesman Problem (TSP) instances to find out which method is better, and according to the results, the GA with multiple crossovers is much better.
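
A minimal sketch of the two-individual scheme: two parent tours, a pool of crossover operators applied to the same pair, and the best children kept as the next parents. Only two illustrative operators are shown, whereas the paper uses 52, and the second operator is hypothetical:

```python
import random

def order_crossover(p1, p2):
    """Classic OX: keep a slice of p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    hole = set(p1[a:b])
    rest = [c for c in p2 if c not in hole]
    return rest[:a] + p1[a:b] + rest[a:]

def segment_reorder_crossover(p1, p2):
    """Hypothetical simple operator: reorder one slice of p1 as in p2."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    return p1[:a] + sorted(p1[a:b], key=p2.index) + p1[b:]

def tour_length(t, d):
    return sum(d[t[i]][t[(i + 1) % len(t)]] for i in range(len(t)))

random.seed(0)
n = 8
dist = [[abs(i - j) for j in range(n)] for i in range(n)]   # toy distances
p1, p2 = random.sample(range(n), n), random.sample(range(n), n)

for _ in range(50):   # each generation: apply every operator to both orderings
    children = [op(a, b)
                for op in (order_crossover, segment_reorder_crossover)
                for a, b in ((p1, p2), (p2, p1))]
    p1, p2 = sorted(children + [p1, p2],
                    key=lambda t: tour_length(t, dist))[:2]

print(p1, tour_length(p1, dist))
```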

Finite Element Prediction and Experimental Verification of the Failure Pattern of Proximal Femur using Quantitative Computed Tomography Images

This paper presents a novel method for predicting the mechanical behavior of the proximal femur using the general framework of quantitative computed tomography (QCT)-based finite element analysis (FEA). A systematic imaging and modeling procedure was developed for reliable correspondence between the QCT-based FEA and the in-vitro mechanical testing. A specially-designed holding frame was used to define and maintain a unique geometrical reference system during the analysis and testing. The QCT images were directly converted into voxel-based 3D finite element models for linear and nonlinear analyses. The equivalent plastic strain and strain energy density measures were used to identify the critical elements and predict the failure patterns. The samples were destructively tested using a specially-designed gripping fixture (with five degrees of freedom) mounted within a universal mechanical testing machine. Very good agreement was found between the experimental and the predicted failure patterns and the associated load levels.
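
A minimal sketch of the voxel-to-material step common in such QCT pipelines: Hounsfield units are calibrated to bone density, and density is mapped to an elastic modulus per voxel element; the calibration constants below are placeholders, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(3)
hu = rng.uniform(-100, 1500, size=(16, 16, 16))    # toy CT volume, Hounsfield

# Linear HU-to-density calibration (phantom-derived constants would replace
# these placeholder values).
rho = np.clip(0.0008 * hu + 0.05, 0.01, None)      # g/cm^3

# Power-law density-to-modulus mapping, E = a * rho^b, assigned per voxel
# element; a and b are placeholder constants, not the paper's values.
E = 6.85e3 * rho ** 1.49                           # MPa
print(f"E range: {E.min():.0f} to {E.max():.0f} MPa")
```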

Numerical Study of MHD Effects on Drop Formation in a T-Shaped Microchannel

The effect of a uniform magnetic field on the formation of drops of a specific size has been investigated numerically in a T-shaped microchannel. Previous research indicated that the drop size of the secondary stream decreases with increasing main-stream flow rate and decreasing interfacial tension. In the present study, the effect of a uniform magnetic field on the main stream is considered, and it is shown that increasing the Hartmann number decreases the drop size of the secondary stream.
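
For reference, a quick worked example of the Hartmann number, Ha = B·L·sqrt(σ/μ), which measures the ratio of electromagnetic to viscous forces; the property values below are hypothetical:

```python
import numpy as np

B = 0.5          # magnetic flux density, T
L = 100e-6       # channel characteristic length, m
sigma = 5.0      # electrical conductivity of the fluid, S/m
mu = 1.0e-3      # dynamic viscosity, Pa·s

Ha = B * L * np.sqrt(sigma / mu)
print(f"Ha ≈ {Ha:.3f}")   # larger Ha -> stronger magnetic braking of the flow
```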

Spread Spectrum Code Estimation by Particle Swarm Algorithm

In the context of spectrum surveillance, a new method to recover the code of a spread spectrum signal is presented for the case where the receiver has no knowledge of the transmitter's spreading sequence. In our previous paper, we used a Genetic Algorithm (GA) to recover the spreading code. Although genetic algorithms (GAs) are well known for their robustness in solving complex optimization problems, increasing the length of the code often leads to unacceptably slow convergence. To solve this problem, we introduce Particle Swarm Optimization (PSO) into code estimation in spread spectrum communication systems. In the search process for code estimation, the PSO algorithm has the merits of rapid convergence to the global optimum without being trapped in local suboptima, and good robustness to noise. In this paper, we describe how to implement PSO as a component of the search algorithm in code estimation. Swarm intelligence boasts a number of advantages due to the use of mobile agents, among them scalability, fault tolerance, adaptation, speed, modularity, autonomy, and parallelism. These properties make swarm intelligence very attractive for spread spectrum code estimation, and also make it suitable for a variety of other kinds of channels. Our results compare swarm-based algorithms with genetic algorithms and show the performance of the PSO algorithm in the code estimation process.
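
A minimal PSO sketch for code estimation under simplifying assumptions: particles live in a continuous space, are thresholded to ±1 chips, and fitness is the correlation with a synthetic received signal (both the code and the signal below are placeholders):

```python
import numpy as np

rng = np.random.default_rng(4)
L, n_particles, iters = 31, 20, 200
true_code = rng.choice([-1.0, 1.0], size=L)             # unknown spreading code
received = true_code + rng.normal(scale=0.3, size=L)    # noisy observation

def fitness(x):
    # Correlation between the thresholded particle and the received signal.
    return float(np.dot(np.sign(x), received))

pos = rng.uniform(-1.0, 1.0, (n_particles, L))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmax()].copy()

w, c1, c2 = 0.7, 1.5, 1.5                               # standard PSO weights
for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([fitness(p) for p in pos])
    better = f > pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmax()].copy()

print("chips recovered:", int((np.sign(gbest) == true_code).sum()), "/", L)
```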

Comparison of Different Neural Network Approaches for the Prediction of Kidney Dysfunction

This paper presents the prediction of kidney dysfunction using different neural network (NN) approaches. Self-Organizing Maps (SOM), the Probabilistic Neural Network (PNN) and the Multi-Layer Perceptron Neural Network (MLPNN) trained with the Back-Propagation Algorithm (BPA) are used in this study. Six hundred and sixty-three sets of analytical laboratory tests were collected from one of the private clinical laboratories in Baghdad. For each subject, serum urea and serum creatinine levels were analyzed and tested using clinical laboratory measurements. The collected urea and creatinine levels are then used as inputs to the three NN models, in which training is carried out with the different neural approaches: SOM is a class of unsupervised network, whereas PNN and the BPA-trained MLPNN are supervised networks. These networks are used as classifiers to predict whether the kidney is normal or dysfunctional. The prediction accuracy, sensitivity and specificity were found for each type of the proposed networks. We conclude that the PNN gives faster and more accurate prediction of kidney dysfunction and works as a promising tool for predicting routine kidney dysfunction from clinical laboratory data.
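
A minimal sketch of a probabilistic neural network (a Parzen-window classifier) on the two features used above, urea and creatinine; the training data are synthetic placeholders, not the clinical measurements from the study:

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic (urea mg/dL, creatinine mg/dL) samples for the two classes.
normal = rng.normal([30.0, 0.9], [8.0, 0.15], size=(50, 2))
dysfunc = rng.normal([80.0, 2.5], [20.0, 0.60], size=(50, 2))

# Standardize features so a single Parzen width suits both of them.
stack = np.vstack([normal, dysfunc])
mu, sd = stack.mean(0), stack.std(0)
classes = [(normal - mu) / sd, (dysfunc - mu) / sd]

def pnn_predict(x, classes, sigma=0.5):
    """Index of the class with the largest average Gaussian kernel density."""
    scores = [np.mean(np.exp(-0.5 * np.sum(((x - c) / sigma) ** 2, axis=1)))
              for c in classes]
    return int(np.argmax(scores))

patient = (np.array([75.0, 2.1]) - mu) / sd     # hypothetical test values
print("dysfunction" if pnn_predict(patient, classes) else "normal")
```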