Generalized Predictive Control of Batch Polymerization Reactor

This paper describes the application of a model predictive controller to the problem of batch reactor temperature control. Although a great deal of work has been done to improve reactor throughput using batch sequence control, the control of the actual reactor temperature remains a difficult problem for many operators of these processes. Temperature control is important because many chemical reactions are sensitive to temperature with respect to the formation of desired products. The controller consists of two parts: (1) a nonlinear control method, GLC (Global Linearizing Control), used to create a linear model of the system, and (2) a model predictive controller used to obtain the optimal input control sequence. The reactor temperature is tuned to track a predetermined temperature trajectory applied to the batch reactor. To do so, two input signals are used: the electrical power and the coolant flow in the coil. Simulation results show that the proposed controller has remarkable performance in tracking the reference trajectory while remaining robust against noise imposed on the system output.
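
As a rough illustration of the receding-horizon optimization in part (2), the following Python sketch computes one GPC-style control move for a linear discrete-time model (the model, horizons and weights here are hypothetical placeholders, not the paper's batch reactor model obtained through GLC):

import numpy as np

# Minimal GPC-style receding-horizon step for a SISO linear model
#   y[k+1] = a*y[k] + b*u[k]   (hypothetical model, e.g. one obtained after feedback linearization)
a, b = 0.9, 0.1          # assumed model parameters
Np, Nu = 10, 3           # prediction and control horizons
lam = 0.01               # control-effort weight

def gpc_step(y0, u_prev, r):
    """Return the next control input for a reference trajectory r of length Np."""
    # Free response: future outputs if the input is held at u_prev
    f = np.empty(Np)
    y = y0
    for i in range(Np):
        y = a * y + b * u_prev
        f[i] = y
    # Forced-response matrix: effect of future input increments on future outputs
    step = np.array([b * a**i for i in range(Np)]).cumsum()   # step-response coefficients
    G = np.zeros((Np, Nu))
    for j in range(Nu):
        G[j:, j] = step[:Np - j]
    # Minimize ||G du + f - r||^2 + lam*||du||^2  (unconstrained least squares)
    du = np.linalg.solve(G.T @ G + lam * np.eye(Nu), G.T @ (np.asarray(r) - f))
    return u_prev + du[0]          # receding horizon: apply only the first move

# Example: track a constant setpoint of 1.0
u = gpc_step(y0=0.0, u_prev=0.0, r=np.ones(10))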

Design and Implementation of an Optimal Winner Determination Algorithm in Combinatorial e-Auctions

Heuristic and metaheuristic approaches are among the most robust search techniques for large-scale search spaces. They are particularly useful when the combinatorial explosion of the search space prevents the problem from being solved by classical exact methods; such problems are NP-complete. In this research, the winner determination problem in combinatorial auctions is formulated and, after assessing earlier heuristic approaches, solved using a genetic algorithm. We show that this method yields better performance than other heuristics such as simulated annealing and the greedy approach.
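
A minimal sketch of how a genetic algorithm can be applied to the winner determination problem is given below (the bid data, encoding and operators are illustrative assumptions, not necessarily those used in this work): each chromosome selects a subset of bids, and selections that sell an item twice are penalized as infeasible.

import random

# Hypothetical toy instance: each bid is (set of item ids, price).
bids = [({1, 2}, 8.0), ({2, 3}, 6.0), ({1}, 5.0), ({3, 4}, 7.0), ({4}, 3.0)]

def fitness(chrom):
    """Total revenue of accepted bids; infeasible (overlapping) selections score 0."""
    taken, revenue = set(), 0.0
    for gene, (items, price) in zip(chrom, bids):
        if gene:
            if taken & items:          # an item sold twice -> infeasible
                return 0.0
            taken |= items
            revenue += price
    return revenue

def ga_wdp(pop_size=30, generations=100, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in bids] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        next_pop = pop[:2]                                   # elitism
        while len(next_pop) < pop_size:
            p1, p2 = random.sample(pop[:pop_size // 2], 2)   # truncation selection
            cut = random.randrange(1, len(bids))             # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]  # bit-flip mutation
            next_pop.append(child)
        pop = next_pop
    best = max(pop, key=fitness)
    return best, fitness(best)

best_selection, revenue = ga_wdp()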

The Design and Development of Multimedia Pronunciation Learning Management System

The proposed Multimedia Pronunciation Learning Management System (MPLMS) in this study is a technology with profound potential for improving pronunciation learning. The MPLMS optimizes digitised phonetic symbols with the integration of text, sound and mouth-movement video. The components are designed and developed in an online management system which turns the web into a dynamic, user-centric collection of consistent and timely information for quality sustainable learning. The aim of this study is to design and develop the MPLMS, which serves as an innovative tool to improve English pronunciation. This paper discusses the iterative methodology and the three-phase Alessi and Trollip model in the development of the MPLMS. To align with the flexibility required in the development of educational software, an iterative approach comprising planning, design, development, evaluation and implementation is followed. To ensure the instructional appropriateness of the MPLMS, the instructional system design (ISD) model of Alessi and Trollip serves as a platform to guide the important instructional factors and processes. It is expected that the results of future empirical research will support the efficacy of the MPLMS and its place as a premier pronunciation learning system.

Electric Field and Potential Distributions along Surface of Silicone Rubber Polymer Insulators Using Finite Element Method

This paper presents the simulation results of electric field and potential distributions along the surface of silicone rubber polymer insulators. With nearly the same leakage distance, subjected to 15 kV in a 50 Hz salt fog ageing test, the alternate-sheds silicone rubber polymer insulator showed better contamination performance than the straight-sheds silicone rubber polymer insulator. Severe surface ageing was observed on the straight-sheds insulator. The objective of this work is to show that the electric field distribution along the straight-sheds insulator is higher than that along the alternate-sheds insulator in the salt fog ageing test. The finite element method (FEM) is adopted for this work. The simulation results confirm the experimental data.
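
For reference, the field computation in such FEM studies is governed by the electrostatic equations (standard background, not a result of this paper):

\nabla \cdot (\varepsilon \nabla V) = 0, \qquad \mathbf{E} = -\nabla V,

where V is the electric potential, \varepsilon the permittivity of each material region and \mathbf{E} the electric field; the FEM solves the first equation over the discretized insulator geometry subject to the applied 15 kV boundary condition, and the field is obtained from the gradient of the solution.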

Incidence of Chronic Disease and Lipid Profile in Veteran Rugby Athletes

Recently, the health of retired National Football League players, particularly linemen, has been investigated. A number of studies have reported increased cardiometabolic risk, premature cardiovascular disease and incidence of type 2 diabetes. Rugby union players have somatotypes very similar to National Football League players, which suggests that rugby players may have similar health risks. The International Golden Oldies World Rugby Festival (GORF) provided a unique opportunity to investigate the demographics of veteran rugby players. METHODOLOGIES: A cross-sectional, observational study was completed using an online web-based questionnaire that consisted of medical history and physiological measures. Data analysis was completed using a one-sample t-test (50 yrs) and Chi-square test. RESULTS: A total of 216 veteran rugby competitors (response rate = 6.8%) representing 10 countries, aged 35-72 yrs (mean 51.2, S.D. ±8.0), participated in the online survey. As a group, the incidence of current smokers was low at 8.8% (avg 72.4 cigs/wk) whilst the percentage consuming alcohol was high at 93.1% (avg 11.2 drinks/wk). Competitors reported the following top six chronic diseases/disorders: hypertension (18.6%), arthritis (OA/RA, 11.5%), asthma (9.3%), hyperlipidemia (8.2%), diabetes (all types, 7.5%) and gout (6%); there were significant differences between groups with regard to cancer (all types) and migraines. When compared to the Australian general population (Australian Bureau of Statistics data, n=18,000), GORF competitors had a significantly lower incidence of anxiety (p

Using Morphological and Microsatellite (SSR) Markers to Assess the Genetic Diversity in Alfalfa (Medicago sativa L.)

Utilization of diverse germplasm is needed to enhance the genetic diversity of cultivars. The objective of this study was to evaluate the genetic relationships of 98 alfalfa germplasm accessions using morphological traits and SSR markers. Of the 98 tested populations, 81 were local accessions originating in Europe and 17 were introductions from the USA, Australia, New Zealand and Canada. Three primers generated 67 polymorphic bands. The average polymorphic information content (PIC) was very high (> 0.90) over all three primer combinations used. Cluster analysis using the Unweighted Pair Group Method with Arithmetic Means (UPGMA) and Jaccard's coefficient grouped the accessions into 2 major clusters with 4 sub-clusters, with no correlation between genetic and morphological diversity. The SSR analysis clearly indicated that, even with three polymorphic primers, a reliable estimation of genetic diversity could be obtained.
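
As an illustration of the marker analysis described above (with hypothetical data, not the study's genotyping results), the simplified PIC formula and UPGMA clustering on Jaccard distances can be sketched in Python as:

import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical 0/1 band-scoring matrix: rows = accessions, columns = SSR bands
bands = np.random.default_rng(0).integers(0, 2, size=(98, 67))

def pic(allele_freqs):
    """Simplified polymorphic information content, PIC = 1 - sum(p_i^2)."""
    p = np.asarray(allele_freqs, dtype=float)
    return 1.0 - np.sum(p ** 2)

example_pic = pic([0.4, 0.35, 0.25])    # = 0.655 for these hypothetical frequencies

# UPGMA (average linkage) clustering on Jaccard distances between accessions
dist = pdist(bands, metric="jaccard")
tree = linkage(dist, method="average")               # UPGMA
groups = fcluster(tree, t=2, criterion="maxclust")   # cut into 2 major clusters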

Reutilization of Organic and Peat Soils by Deep Cement Mixing

Limited infrastructure development on peat and organic soils is a serious geotechnical issue common to many countries of the world, especially Malaysia, which has about 1.5 million ha of these problematic soils. These soils have high water and organic contents, exhibit different mechanical properties, and may also change chemically and biologically with time. Constructing structures on peaty ground involves the risk of ground failure and extreme settlement. Nowadays, much effort is needed to make peatlands usable for construction because of increased land use. The deep mixing method, employing cement as a binder, is generally used as a measure against peaty/organic ground failure problems; the technique is widely adopted because it can improve the ground considerably in a short period of time. An understanding of geotechnical properties such as the shear strength, stiffness and compressibility behaviour of these soils is required before construction on them can proceed. Therefore, 1-1.5 m peat soil samples from the state of Johor and an organic soil from Melaka, Malaysia, were investigated. Cement was added to the soil in the pre-mixing stage with water-cement ratios in the range of 3.5, 7, 14 and 140 for peats and 5, 10 and 30 for organic soils, essentially to modify the original soil textures and properties. The mixtures, in slurry form, were poured into polyvinyl chloride (PVC) tubes and cured at a room temperature of 25°C for 7, 14 and 28 days. Laboratory experiments were conducted, including unconfined compressive strength and bender element tests, to monitor the improved strength and stiffness of the stabilised mixed soils. In addition, scanning electron microscope (SEM) observations were made to investigate changes in the microstructure of the stabilised soils and to evaluate the hardening effect of cement-stabilised peat and organic soils. This preliminary effort indicates that pre-mixing peat and organic soils contributes to gaining soil strength and helps engineers establish a new method for improving such problematic ground in further practical and long-term applications.

Mutational Effect on the Particular Interaction Energy of the Cycloguanil Drug with Plasmodium falciparum Dihydrofolate Reductase Enzymes

In order to find the particular interaction energies between cycloguanil and the amino acids surrounding the binding pocket of the wild-type and quadruple-mutant PfDHFR enzymes, calculations were performed with the MP2 method at the 6-31G(d,p) basis set level. The obtained interaction energies show that Asp54 has the strongest interaction with both the wild type and the mutant type, at -12.439 and -11.250 kcal/mol, respectively, and that three amino acids, Asp54, Ile164 and Ile14, form hydrogen bonds with the cycloguanil drug. Importantly, the Ser108Asn mutation is the key contributor to cycloguanil resistance, showing a repulsive interaction energy.

The Rank-scaled Mutation Rate for Genetic Algorithms

A novel method of individual-level adaptive mutation rate control, called the rank-scaled mutation rate for genetic algorithms, is introduced. The rank-scaled mutation rate controlled genetic algorithm varies the mutation parameters based on the rank of each individual within the population. Thereby the distribution of the fitness of the population is taken into consideration in forming the new mutation rates. The best fit mutate at the lowest rate and the least fit mutate at the highest rate. The complexity of the algorithm is of the order of an individual adaptation scheme and is lower than that of a self-adaptation scheme. The proposed algorithm is tested on two common problems, namely, numerical optimization of a function and the traveling salesman problem. The results show that the proposed algorithm outperforms both fixed and deterministic mutation rate schemes. It is best suited for problems with several local optima and without a high demand for excessive mutation rates.
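
A minimal sketch of the rank-scaling idea is shown below (the linear scaling between a minimum and maximum rate is an assumption for illustration; the exact scaling function of the proposed method may differ):

import numpy as np

def rank_scaled_mutation_rates(fitness, p_min=0.001, p_max=0.1):
    """Assign each individual a mutation rate from its fitness rank:
    the best-ranked individual mutates at p_min, the worst at p_max."""
    fitness = np.asarray(fitness, dtype=float)
    order = np.argsort(-fitness)             # best first (maximization assumed)
    ranks = np.empty_like(order)
    ranks[order] = np.arange(len(fitness))   # 0 = best, n-1 = worst
    return p_min + (p_max - p_min) * ranks / max(len(fitness) - 1, 1)

def mutate(population, rates, rng=np.random.default_rng()):
    """Bit-flip mutation of a 0/1 population, one rate per individual."""
    population = np.asarray(population)
    mask = rng.random(population.shape) < rates[:, None]
    return np.where(mask, 1 - population, population)

# Example: 4 individuals, 8-bit chromosomes
pop = np.random.default_rng(1).integers(0, 2, size=(4, 8))
rates = rank_scaled_mutation_rates(fitness=[3.0, 7.5, 1.2, 5.0])
new_pop = mutate(pop, rates)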

IFS on the Multi-Fuzzy Fractal Space

The IFS is a scheme for describing and manipulating complex fractal attractors using simple mathematical models. More precisely, the most popular "fractal-based" algorithms for both representation and compression of computer images have involved some implementation of the method of Iterated Function Systems (IFS) on complete metric spaces. In this paper a new generalized space, called the multi-fuzzy fractal space, is constructed. On this space a distance function is defined, and its completeness is proved. The completeness property of this space ensures the existence of a fixed-point theorem for the family of continuous mappings. This theorem is the fundamental result on which the IFS methods are based and the fractals are built. The defined mappings are proved to satisfy some generalizations of the contraction condition.
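
For context, the classical IFS construction on a complete metric space (X, d) that this work generalizes can be summarized as (standard background, not the paper's new definitions):

d(w_i(x), w_i(y)) \le s_i \, d(x, y), \quad 0 \le s_i < 1, \qquad W(B) = \bigcup_{i=1}^{n} w_i(B),

where the w_i are the contraction mappings of the IFS; by the Banach fixed-point theorem the operator W has a unique fixed point A (the attractor), with W(A) = A and A = \lim_{k \to \infty} W^{\circ k}(B) in the Hausdorff metric for any nonempty compact B \subset X. The paper establishes analogous completeness and fixed-point results on the multi-fuzzy fractal space.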

A MATLAB Simulink Library for Transient Flow Simulation of Gas Networks

An efficient transient flow simulation for gas pipelines and networks is presented. The proposed transient flow simulation is based on transfer function models and MATLAB-Simulink. The equivalent transfer functions of the nonlinear governing equations are derived for different types of boundary conditions. Next, a MATLAB-Simulink library is developed that accommodates any boundary condition type. To verify the accuracy and the computational efficiency of the proposed simulation, the results obtained are compared with those of conventional finite difference schemes (such as TVD, the method of lines, and other implicit and explicit finite difference schemes). The effects of flow inertia and pipeline inclination are incorporated in this simulation. It is shown that the proposed simulation has sufficient accuracy and is computationally more efficient than the other methods.

An Alternative Method for Generating Almost Infinite Sequence of Gaussian Variables

Most of the well-known methods for generating Gaussian variables require at least one standard uniformly distributed value for each Gaussian variable generated. The length of the random number generator therefore limits the number of independent Gaussian distributed variables that can be generated, whereas the statistical solution of complex systems requires a large number of random numbers for their statistical analysis. We propose an alternative simple method of generating an almost infinite number of Gaussian distributed variables using a limited number of standard uniformly distributed random numbers.
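
A typical example of such methods is the Box-Muller transform, shown below in Python only as the kind of baseline the paper contrasts with (it consumes two uniform variates for every two Gaussian variates produced); the proposed alternative method itself is not reproduced here:

import math, random

def box_muller():
    """Classical Box-Muller transform: two U(0,1) variates -> two N(0,1) variates."""
    u1 = random.random()
    u2 = random.random()
    r = math.sqrt(-2.0 * math.log(u1 if u1 > 0 else 1e-300))
    z1 = r * math.cos(2.0 * math.pi * u2)
    z2 = r * math.sin(2.0 * math.pi * u2)
    return z1, z2

# 1000 Gaussian variates consume 1000 uniform variates
samples = [z for _ in range(500) for z in box_muller()]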

A Study on Early Prediction of Fault Proneness in Software Modules using Genetic Algorithm

Fault-proneness of a software module is the probability that the module contains faults. To predict the fault-proneness of modules, different techniques have been proposed, including statistical methods, machine learning techniques, neural network techniques and clustering techniques. The aim of the proposed study is to explore whether metrics available in the early lifecycle (i.e. requirement metrics), metrics available in the late lifecycle (i.e. code metrics), and the combination of the two can be used to identify fault-prone modules using a genetic algorithm technique. This approach has been tested on real-time defect datasets of NASA software projects written in the C programming language. The results show that the fusion of requirement and code metrics yields the best prediction model for detecting faults, compared with the commonly used code-based model.
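
As a generic illustration of applying a genetic algorithm to such a prediction task (the data, chromosome encoding and operators below are assumptions for the sketch, not the study's formulation or its NASA datasets), one can evolve the weights and threshold of a simple linear classifier over module metrics:

import random

# Hypothetical training data: rows of module metrics (requirement and/or code metrics)
# and a 0/1 label indicating whether the module was fault prone.
X = [[12, 3, 0.4], [45, 9, 1.2], [7, 1, 0.1], [60, 14, 2.0]]
y = [0, 1, 0, 1]

def accuracy(chrom):
    """Chromosome = metric weights followed by a decision threshold."""
    *w, thr = chrom
    preds = [1 if sum(wi * xi for wi, xi in zip(w, row)) > thr else 0 for row in X]
    return sum(p == t for p, t in zip(preds, y)) / len(y)

def evolve(pop_size=20, generations=50, p_mut=0.2):
    n = len(X[0]) + 1
    pop = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=accuracy, reverse=True)
        children = pop[:2]                                   # elitism
        while len(children) < pop_size:
            p1, p2 = random.sample(pop[:10], 2)              # truncation selection
            child = [(a + b) / 2 for a, b in zip(p1, p2)]    # arithmetic crossover
            child = [g + random.gauss(0, 0.3) if random.random() < p_mut else g
                     for g in child]                         # Gaussian mutation
            children.append(child)
        pop = children
    return max(pop, key=accuracy)

best_model = evolve()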

Assessment of Vulnerability Curves Using Vulnerability Index Method for Reinforced Concrete Structures

Seismic feedback experiences in Algeria have shown a high percentage of damage to non-code-conforming reinforced concrete (RC) buildings. Furthermore, the vulnerability of these buildings is further aggravated by the presence of many factors (e.g. weak seismic capacity, short columns, pounding effects, etc.). Consequently, seismic risk assessments were carried out on populations of buildings to identify the buildings most likely to undergo losses during an earthquake. The results of such studies are important in the mitigation of losses under future seismic events, as they allow strengthening interventions and disaster management plans to be drawn up. Within this paper, the state of the existing structures is assessed using the vulnerability index method. This method allows the classification of RC constructions taking into account both structural and non-structural parameters, considered to be among the main parameters governing the vulnerability of the structure. Based on seismic feedback from past earthquakes, damage probability matrices (DPM) were also developed.

A Practical Scheme for Transmission Loss Allocation to Generators and Loads in Restructured Power Systems

This paper presents a practical scheme that can be used for allocating the transmission loss to generators and loads. In this scheme, the share of a generator or load in the current through a branch is first determined using a modified Z-bus matrix. Then the current components are decomposed and the branch loss allocation is obtained. A motivation for the proposed scheme is to improve the results of the Z-bus method and to reach a fairer allocation. The proposed scheme has been implemented and tested on several networks. To achieve practical and applicable results, the proposed scheme is simulated and compared on the 400 kV transmission network of the Khorasan region in Iran and on the IEEE 14-bus standard network. The results show that the proposed scheme is comprehensive and fair in allocating the energy losses of a power market to its participants.
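
For reference, the standard Z-bus loss allocation that the scheme sets out to improve assigns to each bus k the loss component

L_k = \Re\!\left\{ I_k^{*} \sum_{j=1}^{n} R_{kj} I_j \right\}, \qquad R_{kj} = \Re\{Z_{kj}\}, \qquad \sum_{k=1}^{n} L_k = P_{\mathrm{loss}},

where I_k is the net injected current at bus k and Z is the bus impedance matrix; the proposed scheme instead decomposes the branch currents into generator and load components before allocating the corresponding branch losses.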

Fuzzy Group Decision Making for the Assessment of Health-Care Waste Disposal Alternatives in Istanbul

Disposal of health-care waste (HCW) is considered an important environmental problem, especially in large cities. Multiple criteria decision making (MCDM) techniques are apt to deal with the quantitative and qualitative considerations of health-care waste management (HCWM) problems. This research proposes a fuzzy multi-criteria group decision making approach with a multilevel hierarchical structure, including qualitative as well as quantitative performance attributes, for evaluating HCW disposal alternatives for Istanbul. Using the entropy weighting method, objective weights as well as subjective weights are taken into account to determine the importance weighting of the quantitative performance attributes. The results obtained using the proposed methodology are thoroughly analyzed.
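
A minimal sketch of the entropy weighting step for the quantitative attributes is given below (the alternative scores are hypothetical; the combination with subjective weights is indicated only schematically):

import numpy as np

def entropy_weights(decision_matrix):
    """Objective criteria weights via the entropy method:
    higher dispersion across alternatives -> larger weight."""
    X = np.asarray(decision_matrix, dtype=float)
    m, n = X.shape
    P = X / X.sum(axis=0)                        # column-wise normalization
    k = 1.0 / np.log(m)
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    e = -k * (P * logs).sum(axis=0)              # entropy of each criterion
    d = 1.0 - e                                  # degree of diversification
    return d / d.sum()

# Hypothetical scores of 4 HCW disposal alternatives on 3 quantitative attributes
scores = [[70, 0.3, 120], [55, 0.5, 200], [80, 0.2, 150], [60, 0.4, 100]]
w_objective = entropy_weights(scores)
# These objective weights can then be combined with subjective (expert) weights,
# e.g. w = alpha * w_subjective + (1 - alpha) * w_objective.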

Hybrid Modulation Technique for Fingerprinting

This paper addresses an efficient technique to embed and detect digital fingerprint codes. The orthogonal modulation method is a straightforward and widely used approach for digital fingerprinting, but it shows several limitations in computational cost and signal efficiency. The coded modulation method can resolve these limitations in theory; however, it is difficult for it to perform well in practice if host signals are not available while tracing colluders, if other kinds of attacks are applied, or if the fingerprint code becomes large. In this paper, we propose a hybrid modulation method in which the merits of the orthogonal modulation and coded modulation methods are combined so that we can achieve low computational cost and high signal efficiency. To analyze the performance, we design a new fingerprint code based on GD-PBIBD theory and modulate this code into images by our method using spread-spectrum watermarking in the frequency domain. The results show that the proposed method can efficiently handle large fingerprint codes and trace colluders against averaging attacks.

ORank: An Ontology Based System for Ranking Documents

The increasing growth of information volume on the internet creates an increasing need to develop new (semi-)automatic methods for retrieving documents and ranking them according to their relevance to the user query. In this paper, after a brief review of ranking models, a new ontology-based approach for ranking HTML documents is proposed and evaluated in various circumstances. Our approach is a combination of conceptual, statistical and linguistic methods. This combination preserves the precision of ranking without losing speed. Our approach exploits natural language processing techniques for extracting phrases and stemming words. Then an ontology-based conceptual method is used to annotate documents and expand the query. To expand a query, the spread activation algorithm is improved so that the expansion can be done in various respects. The annotated documents and the expanded query are processed to compute the relevance degree using statistical methods. The outstanding features of our approach are (1) combining conceptual, statistical and linguistic features of documents, (2) expanding the query with its related concepts before comparing it to documents, (3) extracting and using both words and phrases to compute the relevance degree, (4) improving the spread activation algorithm to do the expansion based on a weighted combination of different conceptual relationships and (5) allowing variable document vector dimensions. A ranking system called ORank is developed to implement and test the proposed model. The test results are included at the end of the paper.
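
A minimal sketch of a constrained spreading-activation pass used for query expansion is shown below (the toy ontology, relation weights and decay factor are illustrative assumptions, not the weighted combination of relationships proposed in the paper):

# Spreading activation over a weighted concept graph, starting from the query concepts.
def spread_activation(graph, seeds, decay=0.5, threshold=0.1, iterations=2):
    """graph: {concept: [(neighbor, relation_weight), ...]};
    seeds: initially activated query concepts."""
    activation = {c: 1.0 for c in seeds}
    frontier = dict(activation)
    for _ in range(iterations):
        next_frontier = {}
        for concept, act in frontier.items():
            for neighbor, weight in graph.get(concept, []):
                a = act * weight * decay
                if a >= threshold:                 # constrained spreading: prune weak activations
                    next_frontier[neighbor] = max(next_frontier.get(neighbor, 0.0), a)
        for concept, a in next_frontier.items():
            activation[concept] = max(activation.get(concept, 0.0), a)
        frontier = next_frontier
    return activation      # expanded query: concepts with their activation levels

ontology = {
    "car": [("vehicle", 0.9), ("engine", 0.7)],
    "vehicle": [("transport", 0.8)],
}
expanded_query = spread_activation(ontology, seeds=["car"])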

An Embedded System for Artificial Intelligence Applications

Conventional approaches to the implementation of logic programming applications on embedded systems are purely software-based. As a consequence, a compiler is needed that transforms the initial declarative logic program into its equivalent procedural one, to be programmed onto the microprocessor. This approach increases the complexity of the final implementation and reduces the overall system's performance. Conversely, hardware implementations that are only capable of supporting logic programs prevent their use in applications where logic programs need to be intertwined with traditional procedural ones. We exploit HW/SW codesign methods to present a microprocessor capable of supporting hybrid applications using both programming approaches. We take advantage of the close relationship between attribute grammar (AG) evaluation and knowledge engineering methods to present a programmable hardware parser that performs logic derivations, and combine it with an extension of a conventional RISC microprocessor that performs the unification process to report the success or failure of those derivations. The extended RISC microprocessor is still capable of executing conventional procedural programs, so hybrid applications can be implemented. The presented implementation is programmable, supports the execution of hybrid applications, increases the performance of logic derivations (experimental analysis yields an approximate 1000% increase in performance) and reduces the complexity of the final implemented code. The proposed hardware design is supported by a proposed extended C language called C-AG.

Solution of Nonlinear Second-Order Pantograph Equations via Differential Transformation Method

In this work, we successfully extend the one-dimensional differential transform method (DTM), by presenting and proving some theorems, to solve nonlinear high-order multi-pantograph equations. This technique provides a sequence of functions which converges to the exact solution of the problem. Some examples are given to demonstrate the validity and applicability of the present method, and a comparison is made with existing results.
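
For reference, the one-dimensional differential transform and the property that makes it suitable for pantograph terms are (standard definitions; the paper's theorems extend such rules to the nonlinear high-order case):

X(k) = \frac{1}{k!} \left[ \frac{d^{k} x(t)}{dt^{k}} \right]_{t=0}, \qquad x(t) = \sum_{k=0}^{\infty} X(k)\, t^{k},

and if X(k) is the transform of x(t), then the transform of the proportional-delay term x(qt), 0 < q < 1, is q^{k} X(k). Applying such rules to the equation converts it into a recurrence for the coefficients X(k), whose partial sums form the convergent sequence of approximate solutions mentioned above.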