Design and Implementation of a Hybrid Fuzzy Controller for a High-Performance Induction Motor Drive

This paper proposes an effective hybrid control algorithm that combines fuzzy logic with conventional control techniques to regulate the speed of an induction motor operating in a high-performance drive environment. Introducing fuzzy logic into the control system helps achieve a good dynamic response, strong disturbance rejection, and low sensitivity to parameter variations and external influences. Some fundamentals of fuzzy logic control are illustrated first. The developed control algorithm is robust, efficient, and simple, and it assures precise trajectory tracking with the prescribed dynamics. Experimental results have shown excellent tracking performance of the proposed control system and have convincingly demonstrated the validity and usefulness of the hybrid fuzzy controller in high-performance drives with parameter and load uncertainties. Satisfactory performance was observed for most reference tracks.
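
To make the fuzzy part of such a hybrid scheme concrete, the following minimal sketch implements a Mamdani-style speed controller with triangular membership functions and centroid defuzzification; the membership ranges, rule consequents, and weights are illustrative assumptions, not the paper's actual design.

```python
# Minimal sketch of a Mamdani-style fuzzy speed controller (illustrative only;
# membership functions, rule base, and gains are assumptions, not the paper's).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    return max(0.0, min((x - a) / (b - a) if b != a else 1.0,
                        (c - x) / (c - b) if c != b else 1.0))

def fuzzy_speed_controller(error, d_error):
    """Map normalized speed error and its change to a torque-command increment."""
    # Fuzzify both inputs with three labels: Negative, Zero, Positive.
    labels = {"N": (-1.0, -1.0, 0.0), "Z": (-1.0, 0.0, 1.0), "P": (0.0, 1.0, 1.0)}
    mu_e  = {k: tri(float(np.clip(error,  -1, 1)), *v) for k, v in labels.items()}
    mu_de = {k: tri(float(np.clip(d_error, -1, 1)), *v) for k, v in labels.items()}
    # Rule base: one output singleton per (error, d_error) pair (PI-like behaviour).
    singleton = {"N": -1.0, "Z": 0.0, "P": 1.0}
    num, den = 0.0, 0.0
    for e_lbl in labels:
        for de_lbl in labels:
            w = min(mu_e[e_lbl], mu_de[de_lbl])        # rule firing strength
            out = 0.7 * singleton[e_lbl] + 0.3 * singleton[de_lbl]
            num += w * out
            den += w
    return num / den if den > 0 else 0.0               # centroid defuzzification
```

In a hybrid arrangement this fuzzy output would typically correct or replace a conventional PI term near large errors, with the conventional loop retained for fine regulation.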

A Hybrid Method for Eyes Detection in Facial Images

This paper proposes a hybrid method for eye localization in facial images. The novelty lies in combining techniques that utilise colour, edge, and illumination cues to improve accuracy. The method is based on the observation that eye regions have dark colour, a high density of edges, and low illumination compared to other parts of the face. The first step of the method is to extract connected regions from facial images using the colour, edge density, and illumination cues separately. Some of the regions are then removed by applying rules based on the general geometry and shape of eyes. The remaining connected regions obtained through the three cues are then combined in a systematic way to enhance the identification of candidate eye regions. The geometry- and shape-based rules are then applied again to further remove false eye regions. The proposed method was tested on images from the PICS facial images database and achieved accuracies of 93.7% and 87% for initial blob extraction and final eye detection, respectively.
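
As a rough illustration of the three-cue idea, the OpenCV sketch below builds binary maps for darkness, edge density, and low illumination, intersects them, and filters the resulting connected regions with a simple geometric rule. All thresholds and kernel sizes are assumptions rather than the paper's tuned values, and the paper's systematic cue-combination step is reduced here to a plain intersection.

```python
# Illustrative sketch of the three-cue idea (dark colour, dense edges, low
# illumination); thresholds and kernel sizes are assumptions, not the paper's.
import cv2
import numpy as np

def eye_candidate_regions(face_bgr):
    gray = cv2.cvtColor(face_bgr, cv2.COLOR_BGR2GRAY)
    # Cue 1: dark colour -- low-intensity pixels.
    dark = (gray < 70).astype(np.uint8)
    # Cue 2: high edge density -- Canny edges smoothed into a density map.
    edges = cv2.Canny(gray, 100, 200)
    density = cv2.blur(edges.astype(np.float32), (15, 15))
    dense_edges = (density > 40).astype(np.uint8)
    # Cue 3: low illumination -- pixels below the local mean brightness.
    local_mean = cv2.blur(gray.astype(np.float32), (31, 31))
    low_illum = (gray.astype(np.float32) < 0.8 * local_mean).astype(np.uint8)
    # Combine the cues, then extract connected regions as eye candidates.
    combined = dark & dense_edges & low_illum
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(combined)
    # Geometry/shape rule (example): keep wide, flat regions in the upper face.
    h, w = gray.shape
    keep = [i for i in range(1, n)
            if stats[i, cv2.CC_STAT_WIDTH] > stats[i, cv2.CC_STAT_HEIGHT]
            and centroids[i][1] < 0.6 * h]
    return [tuple(stats[i][:4]) for i in keep]   # (x, y, width, height) boxes
```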

Modeling Oxygen Transfer by Multiple Plunging Jets Using Support Vector Machines and Gaussian Process Regression Techniques

The paper investigates the potential of support vector machine and Gaussian process based regression approaches to model the oxygen-transfer capacity of multiple plunging jet oxygenation systems from experimental data. The results suggest the utility of both modeling techniques in predicting the overall volumetric oxygen transfer coefficient (KLa) from the operational parameters of a multiple plunging jets oxygenation system. Correlation coefficient, root mean square error, and coefficient of determination values of 0.971, 0.002, and 0.945, respectively, were achieved by the support vector machine, compared with values of 0.960, 0.002, and 0.920 achieved by Gaussian process regression. Further, the performance of both regression approaches in predicting the overall volumetric oxygen transfer coefficient was compared with an empirical relationship for multiple plunging jets. The comparison suggests that the support vector machines approach works better than both the empirical relationship and the Gaussian process approach, and could successfully be employed in modeling oxygen transfer.
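
A minimal sketch of the two regressors follows, using scikit-learn with placeholder data; the feature set, kernels, and hyperparameters are assumptions, not the paper's settings.

```python
# Hedged sketch of fitting SVM and GP regressors to operational parameters
# (feature names and kernel choices are assumptions; data is synthetic).
import numpy as np
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import r2_score, mean_squared_error

# X: operational parameters of the jets system; y: measured KLa values.
rng = np.random.default_rng(0)
X = rng.uniform(size=(60, 4))                       # placeholder features
y = 0.01 * X[:, 0] + 0.005 * X[:, 1] + 0.001 * rng.standard_normal(60)

svm = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1e-4))
gpr = make_pipeline(StandardScaler(),
                    GaussianProcessRegressor(kernel=RBF() + WhiteKernel()))

for name, model in [("SVM", svm), ("GPR", gpr)]:
    model.fit(X[:45], y[:45])
    pred = model.predict(X[45:])
    print(name, "R2:", r2_score(y[45:], pred),
          "RMSE:", mean_squared_error(y[45:], pred) ** 0.5)
```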

Data Mining Usage in Production System Management

The paper presents the pilot results of a project oriented toward the use of data mining techniques for knowledge discovery from production systems, with the discovered knowledge applied to the management of those systems. Simulation models of manufacturing systems were developed to obtain the necessary production data. The authors developed a way of storing the data obtained from the simulation models in a data warehouse. A data mining model was created using specific methods and selected techniques for defined problems of production system management. The new knowledge was applied to the production management system and tested on simulation models of the production system. An important benefit of the project is the proposal of a new methodology, focused on mining the databases that store operational data about the production process.

The Association of Matrix Metalloproteinase-3 Gene -1612 5A/6A Polymorphism with Susceptibility to Coronary Artery Stenosis in an Iranian Population

Matrix metalloproteinase-3 (MMP-3) is a key member of the MMP family and is known to be present in coronary atherosclerotic lesions. Several studies have demonstrated that the MMP-3 5A/6A polymorphism modifies transcriptional activity in an allele-specific manner. We hypothesized that this polymorphism may act as a risk factor for the development of coronary stenosis. The aim of our study was to estimate the effect of the MMP-3 (5A/6A) gene polymorphism on interindividual variability in the risk of coronary stenosis in an Iranian population. DNA was extracted from white blood cells, and genotypes were obtained from coronary stenosis cases (n=95) and controls (n=100) by PCR (polymerase chain reaction) and restriction fragment length polymorphism techniques. Significant differences between cases and controls were observed for MMP-3 genotype frequencies (χ2 = 199.305, p < 0.001); the 6A allele was seen less frequently in the control group than in the disease group (85.79% vs. 78%, 6A/6A+5A/6A vs. 5A/5A, P ≤ 0.001). These data imply the involvement of the -1612 5A/6A polymorphism in coronary stenosis, and suggest that the 6A/6A MMP-3 genotype may be a genetic susceptibility factor for coronary stenosis.
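
For readers unfamiliar with the test, a genotype-association chi-square test of this kind can be run as sketched below; the genotype counts shown are hypothetical placeholders, not the study's actual table.

```python
# Hedged illustration of a genotype-association chi-square test (counts below
# are hypothetical placeholders, not the study's actual genotype table).
from scipy.stats import chi2_contingency

# Rows: cases (n=95), controls (n=100); columns: 5A/5A, 5A/6A, 6A/6A counts.
table = [[10, 40, 45],    # hypothetical case genotype counts
         [22, 44, 34]]    # hypothetical control genotype counts
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.3f}, p={p:.4g}, dof={dof}")
```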

Numerical Investigation of Flow Patterns and Thermal Comfort in Air-Conditioned Lecture Rooms

The present paper is concerned primarily with the analysis and simulation of airflow and thermal patterns in a lecture room. It numerically investigates the influence of the location and number of ventilation and air-conditioning supply and extract openings on the airflow properties of a lecture room. The work focuses on airflow patterns and thermal behaviour in a lecture room occupied by a large number of students. The effectiveness of an airflow system is commonly assessed by how successfully it removes sensible and latent loads from occupants while keeping air pollutants at a prescribed level, so as to attain human thermal comfort conditions and improve indoor air quality; this is the main target of the present paper. The study is carried out using computational fluid dynamics (CFD) simulation techniques as embedded in the commercially available CFD code FLUENT 6.2. The CFD model solves the continuity, momentum, and energy conservation equations, together with the standard k-ε model equations for turbulence closure. Throughout the investigation, the numerical results are validated by comparison with experimental results, and good agreement is found between the two.
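
For reference, the standard k-ε closure named above takes the familiar textbook form below (standard model constants shown; this is the general formulation, not a paper-specific variant):

```latex
\begin{align}
\frac{\partial(\rho k)}{\partial t} + \frac{\partial(\rho k u_i)}{\partial x_i}
 &= \frac{\partial}{\partial x_j}\!\left[\left(\mu + \frac{\mu_t}{\sigma_k}\right)
    \frac{\partial k}{\partial x_j}\right] + G_k - \rho\varepsilon, \\
\frac{\partial(\rho \varepsilon)}{\partial t}
 + \frac{\partial(\rho \varepsilon u_i)}{\partial x_i}
 &= \frac{\partial}{\partial x_j}\!\left[\left(\mu + \frac{\mu_t}{\sigma_\varepsilon}\right)
    \frac{\partial \varepsilon}{\partial x_j}\right]
    + C_{1\varepsilon}\frac{\varepsilon}{k} G_k
    - C_{2\varepsilon}\rho\frac{\varepsilon^2}{k}, \\
\mu_t &= \rho C_\mu \frac{k^2}{\varepsilon},
\end{align}

\text{with } C_\mu = 0.09,\; C_{1\varepsilon} = 1.44,\; C_{2\varepsilon} = 1.92,\;
\sigma_k = 1.0,\; \sigma_\varepsilon = 1.3.
```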

A CFD Study of Turbulent Convective Heat Transfer Enhancement in Circular Pipe Flow

The addition of milli- or micro-sized particles to a heat transfer fluid is one of many techniques employed to improve the heat transfer rate. Though this looks simple, the method has practical problems such as high pressure loss, clogging, and erosion of the material of construction. These problems can be overcome by using nanofluids, which are dispersions of nano-sized particles in a base fluid. Nanoparticles increase the thermal conductivity of the base fluid manifold, which in turn increases the heat transfer rate. Nanoparticles also increase the viscosity of the base fluid, resulting in a higher pressure drop for the nanofluid compared to the base fluid. It is therefore imperative that the Reynolds number (Re) and the volume fraction be optimal for good thermal-hydraulic effectiveness. In this work, heat transfer enhancement with aluminium oxide nanofluids at low and high volume fractions in turbulent pipe flow with constant wall temperature has been studied by computational fluid dynamic modeling of the nanofluid flow, adopting the single-phase approach. The nanofluid is found to be an effective heat transfer enhancement technique up to a volume fraction of 1%. The Nusselt number (Nu) and friction factor predictions for the low volume fractions (0.02%, 0.1%, and 0.5%) agree very well with the experimental values of Sundar and Sharma (2010), while predictions for the high volume fraction nanofluids (1%, 4%, and 6%) show reasonable agreement with both experimental and numerical results available in the literature. The computationally inexpensive single-phase approach can therefore be used for heat transfer and pressure drop prediction of new nanofluids.
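
In the single-phase approach the nanofluid is treated as a homogeneous fluid with effective properties. The sketch below uses two common property models (Maxwell for conductivity, Brinkman for viscosity); the paper's exact correlations may differ.

```python
# Common single-phase effective-property models for a nanofluid (Maxwell for
# conductivity, Brinkman for viscosity); the paper's correlations may differ.
def nanofluid_properties(phi, rho_f, rho_p, cp_f, cp_p, k_f, k_p, mu_f):
    """phi: particle volume fraction (e.g. 0.01 for 1%)."""
    rho_nf = (1 - phi) * rho_f + phi * rho_p                      # mixture density
    cp_nf = ((1 - phi) * rho_f * cp_f + phi * rho_p * cp_p) / rho_nf
    k_nf = k_f * (k_p + 2 * k_f + 2 * phi * (k_p - k_f)) / \
                 (k_p + 2 * k_f - phi * (k_p - k_f))              # Maxwell model
    mu_nf = mu_f / (1 - phi) ** 2.5                               # Brinkman model
    return rho_nf, cp_nf, k_nf, mu_nf

# Example: 1% Al2O3 in water (approximate property values).
print(nanofluid_properties(0.01, 998.2, 3970, 4182, 765, 0.6, 40, 1.003e-3))
```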

CScheme in Traditional Concurrency Problems

CScheme, a concurrent programming paradigm based on the scheme concept, enables concurrency schemes to be constructed from smaller synchronization units through a GUI-based composer and later reused on other concurrency problems of a similar nature. This paradigm is particularly important in the multi-core environments prevalent nowadays. In this paper, we demonstrate techniques to separate concurrency from functional code using the CScheme paradigm. We then illustrate how the CScheme methodology can be used to solve some traditional concurrency problems, namely the critical section problem and the readers-writers problem, using synchronization schemes such as the Single Threaded Execution Scheme and the Readers Writers Scheme.
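
For orientation, a plain-threading analogue of a readers-writers scheme is sketched below; this is the classic first readers-writers solution, not CScheme's GUI-composed synchronization units.

```python
# Plain-threading analogue of a Readers Writers Scheme (illustrative; the
# classic first readers-writers solution, in which writers may starve).
import threading

class ReadersWriterLock:
    """Many concurrent readers, or one exclusive writer."""
    def __init__(self):
        self._readers = 0
        self._mutex = threading.Lock()       # protects the reader count
        self._write_gate = threading.Lock()  # held while anyone is writing

    def acquire_read(self):
        with self._mutex:
            self._readers += 1
            if self._readers == 1:           # first reader blocks writers
                self._write_gate.acquire()

    def release_read(self):
        with self._mutex:
            self._readers -= 1
            if self._readers == 0:           # last reader admits writers
                self._write_gate.release()

    def acquire_write(self):
        self._write_gate.acquire()

    def release_write(self):
        self._write_gate.release()
```

Separating such a unit from the functional code that calls acquire/release pairs is precisely the kind of decomposition the paradigm aims to make composable and reusable.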

Intellectual Capital and Competitive Advantage: An Analysis of the Biotechnology Industry

Intellectual capital measurement is a central aspect of knowledge management. The measurement and evaluation of intangible assets play a key role in the effective management of these assets as sources of competitiveness. For these reasons, managers and practitioners need conceptual and analytical tools that take into account the unique characteristics and economic significance of Intellectual Capital. Following this lead, we propose an efficiency and productivity analysis of Intellectual Capital as a determinant of a company's competitive advantage. The analysis is carried out by means of Data Envelopment Analysis (DEA) and the Malmquist Productivity Index (MPI). These techniques identify best-practice companies that have achieved competitive advantage by implementing successful Intellectual Capital management strategies, and offer inefficient companies development paths through benchmarking. The proposed methodology is applied to the biotechnology industry over the period 2007-2010.
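
In the standard input-oriented CCR formulation that DEA studies of this kind typically rely on, each company o is scored by the linear program below, and productivity change between periods t and t+1 is captured by the Malmquist index built from the corresponding distance functions; here the inputs x would be Intellectual Capital indicators and the outputs y the performance measures, an assumption about this paper's exact model.

```latex
% Input-oriented CCR efficiency score for company o:
\min_{\theta,\lambda}\; \theta
\quad \text{s.t.} \quad
\sum_{j=1}^{n} \lambda_j x_{ij} \le \theta x_{io} \;(i=1,\dots,m), \quad
\sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{ro} \;(r=1,\dots,s), \quad
\lambda_j \ge 0.

% Malmquist productivity change between periods t and t+1:
M_o = \left[
\frac{D_o^{t}(x^{t+1},y^{t+1})}{D_o^{t}(x^{t},y^{t})} \cdot
\frac{D_o^{t+1}(x^{t+1},y^{t+1})}{D_o^{t+1}(x^{t},y^{t})}
\right]^{1/2},
\qquad M_o > 1 \text{ indicating productivity growth.}
```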

A Visual Educational Modeling Language to Help Teachers in Learning Scenario Design

The success of an e-learning system is highly dependent on the quality of its educational content and on how effective, complete, and simple the design tool is for teachers. Educational modeling languages (EMLs) are design languages intended for teachers to model diverse teaching-learning experiences, independently of the pedagogical approach and in different contexts. However, most existing EMLs are criticized for being too abstract and too complex to be understood and manipulated by teachers. In this paper, we present a visual EML that simplifies the process of designing learning scenarios for teachers with no programming background. Based on the conceptual framework of activity theory, our visual EML uses domain-specific modeling techniques to provide a pedagogical level of abstraction in the design process.

Collaboration of Multi-Agent and Hyper-Heuristics Systems for Production Scheduling Problem

This paper introduces a framework based on the collaboration of multi-agent and hyper-heuristic systems to solve a real single-machine production problem. Many techniques have been used to solve this problem, each with its own advantages and disadvantages. Through the collaboration of a multi-agent system and hyper-heuristics, a better solution can be obtained. The hyper-heuristic approach operates on a search space of heuristics rather than directly on a search space of solutions. The proposed framework consists of several agents: a problem agent, a trainer agent, algorithm agents (GPHH, GAHH, and SAHH), an optimizer agent, and a solver agent. The low-level heuristics used in this paper are MRT, SPT, LPT, EDD, LDD, and MON.
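
To illustrate what such low-level heuristics do, the sketch below implements three of the listed dispatching rules (SPT, LPT, EDD) on a toy single-machine instance; MRT, LDD, and MON are omitted because their exact definitions are not reproduced here. A hyper-heuristic then searches over selections or sequences of such rules rather than over job sequences directly.

```python
# Sketch of three of the listed low-level heuristics (SPT, LPT, EDD) on a
# single machine; the toy jobs and the tardiness objective are illustrative.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    processing_time: float
    due_date: float

def schedule(jobs, rule):
    keys = {
        "SPT": lambda j: j.processing_time,    # shortest processing time first
        "LPT": lambda j: -j.processing_time,   # longest processing time first
        "EDD": lambda j: j.due_date,           # earliest due date first
    }
    return sorted(jobs, key=keys[rule])

def total_tardiness(sequence):
    t, tardiness = 0.0, 0.0
    for job in sequence:
        t += job.processing_time
        tardiness += max(0.0, t - job.due_date)
    return tardiness

jobs = [Job("A", 4, 10), Job("B", 2, 6), Job("C", 6, 8)]
for rule in ("SPT", "LPT", "EDD"):
    seq = schedule(jobs, rule)
    print(rule, [j.name for j in seq], "tardiness:", total_tardiness(seq))
```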

Using Automated Database Reverse Engineering for Database Integration

One important problem in today's organizations is the existence of non-integrated information systems, inconsistency, and a lack of suitable correlation between legacy and modern systems. One main solution is to transfer the local databases into a global one. In this regard, we need to extract the data structures from the legacy systems and integrate them with the new-technology systems. In legacy systems, huge amounts of data are stored in legacy databases. These require particular attention, since more effort is needed to normalize and reformat them and move them to modern database environments. Designing the new integrated (global) database architecture and applying reverse engineering require data normalization. This paper proposes the use of database reverse engineering to integrate legacy and modern databases in organizations. The suggested approach consists of methods and techniques for generating the data transformation rules needed for data structure normalization.
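
As a hedged sketch of the schema-extraction step such reverse engineering begins with, the snippet below pulls tables, columns, and foreign keys from a database using SQLAlchemy's inspector; the connection URL is a placeholder, and the paper's own tooling and rule-generation logic are not shown.

```python
# Hedged sketch of schema extraction with SQLAlchemy's inspector (the
# connection URL is a placeholder; the paper's own tooling is not shown).
from sqlalchemy import create_engine, inspect

engine = create_engine("sqlite:///legacy.db")   # placeholder legacy database
insp = inspect(engine)

for table in insp.get_table_names():
    print("table:", table)
    for col in insp.get_columns(table):
        print("  column:", col["name"], col["type"], "nullable:", col["nullable"])
    for fk in insp.get_foreign_keys(table):
        # Foreign keys hint at correlations to preserve during integration.
        print("  fk ->", fk["referred_table"], fk["referred_columns"])
```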

Image Segmentation Based on a Graph-Theoretical Approach to Improve the Quality of Image Segmentation

Graph-based image segmentation techniques are considered among the most efficient segmentation techniques and are mainly used as time- and space-efficient methods for real-time applications. However, there is a need to improve the quality of the segmented images obtained from the earlier graph-based methods. This paper proposes an improvement to the graph-based image segmentation methods already described in the literature. We contribute to the existing method by proposing the use of a weighted Euclidean distance to calculate the edge weights, which are the key element in building the graph. We also propose a slight modification of the segmentation method already described in the literature, which results in the selection of more prominent edges in the graph. The experimental results show an improvement in segmentation quality compared to existing methods, with a slight compromise in efficiency.
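
The sketch below illustrates the edge-weight idea: a weighted Euclidean distance over colour channels between 4-connected neighbouring pixels, with edges sorted as Felzenszwalb-style methods require. The channel weights shown are assumptions, not the paper's values.

```python
# Sketch of the edge-weight idea: a weighted Euclidean distance over RGB
# channels between neighbouring pixels (channel weights are assumptions).
import numpy as np

def edge_weights(image, w=(0.30, 0.59, 0.11)):
    """image: H x W x 3 float array; returns a list of (weight, p, q) edges
    over 4-connected pixel pairs p and q (given as flat indices)."""
    h, wd, _ = image.shape
    wvec = np.asarray(w, dtype=np.float64)
    edges = []
    for y in range(h):
        for x in range(wd):
            p = y * wd + x
            for dy, dx in ((0, 1), (1, 0)):        # right and down neighbours
                ny, nx = y + dy, x + dx
                if ny < h and nx < wd:
                    diff = image[y, x] - image[ny, nx]
                    dist = float(np.sqrt(np.sum(wvec * diff * diff)))
                    edges.append((dist, p, ny * wd + nx))
    edges.sort()  # graph-merging methods process edges in nondecreasing order
    return edges
```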

A Technique for Execution of Written Values on Shared Variables

This paper conceptualizes the technique of release consistency together with the concept of user-defined synchronization. A programming model built on objects and classes is illustrated and demonstrated. The essence of the paper is phases, events, and parallel computing execution. The technique by which written values become visible on shared variables is implemented. The second part of the paper consists of the implementation of user-defined high-level synchronization primitives and a system architecture with memory protocols. Techniques central to deciding when to validate and invalidate a stale page are also proposed.

Categorical Missing Data Imputation Using Fuzzy Neural Networks with Numerical and Categorical Inputs

There are many situations in which input feature vectors are incomplete, and methods to tackle the problem have been studied for a long time. A commonly used procedure is to replace each missing value with an imputation. This paper presents a method for imputing categorical missing data from numerical and categorical variables. The imputations are based on Simpson's fuzzy min-max neural networks, in which the input variables for learning and classification are purely numerical. The proposed method extends the input to categorical variables by introducing new fuzzy sets, a new operation, and a new architecture. The procedure is tested and compared with other methods using opinion poll data.
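
For context, the numerical core the paper extends is Simpson's hyperbox membership function, sketched below for a single hyperbox with min point v and max point w; the sensitivity parameter value and the test points are illustrative.

```python
# Simpson's fuzzy min-max membership function for a hyperbox with min point v
# and max point w (gamma controls fuzziness); numeric inputs only, as in the
# original network that the paper extends to categorical variables.
import numpy as np

def hyperbox_membership(x, v, w, gamma=4.0):
    x, v, w = map(np.asarray, (x, v, w))
    n = x.size
    above = np.maximum(0, 1 - np.maximum(0, gamma * np.minimum(1, x - w)))
    below = np.maximum(0, 1 - np.maximum(0, gamma * np.minimum(1, v - x)))
    return float(np.sum(above + below) / (2 * n))

# Full membership inside the box, decaying outside it:
print(hyperbox_membership([0.5, 0.5], v=[0.4, 0.4], w=[0.6, 0.6]))  # -> 1.0
print(hyperbox_membership([0.9, 0.5], v=[0.4, 0.4], w=[0.6, 0.6]))  # < 1.0
```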

Dynamic Feature Selection for Heart Disease Classification

The healthcare environment is generally perceived as being information rich yet knowledge poor, as there is a lack of effective analysis tools to discover hidden relationships and trends in the data. In fact, valuable knowledge can be discovered by applying data mining techniques to healthcare systems. In this study, a proficient methodology is presented for extracting significant patterns from coronary heart disease data warehouses for heart attack prediction, heart disease unfortunately remaining a leading cause of mortality worldwide. For this purpose, we propose to dynamically enumerate the optimal subsets of reduced features of high interest by using the rough sets technique combined with dynamic programming, and to validate the classification using a Random Forest (RF) decision-tree ensemble to identify risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions, based on the medical profiles of patients. Moreover, experts' knowledge in this field has been taken into consideration in order to define the disease and its risk factors, and to establish significant knowledge relationships among the medical factors. A computer-aided system is developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated against a set of benchmark techniques applied to this classification problem.
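
A hedged sketch of the classification stage follows, with synthetic stand-in data; the rough-set/dynamic-programming feature reduction is represented only by a fixed feature subset, since that step is specific to the paper.

```python
# Hedged sketch of the Random Forest classification stage (synthetic stand-in
# data; the rough-set/dynamic-programming reduction is a placeholder subset).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(525, 13))                 # 525 adults, 13 medical factors
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=525) > 0).astype(int)

selected = [0, 2, 5]                           # stand-in for a rough-set reduct
X_tr, X_te, y_tr, y_te = train_test_split(X[:, selected], y, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, rf.predict(X_te)))
```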

Conversion of Methanol to Propylene over a High Silica B-HZSM-5 Catalyst

Hydrothermally synthesized high-silica borosilicates with the MFI structure were subjected to several characterization techniques. The effect of boron on the structure and acidity of the HZSM-5 catalyst was studied by XRD, SEM, N2 adsorption, solid-state NMR, and NH3-TPD. It was confirmed that boron had entered the framework in the boron-containing samples. The results also revealed that, compared with the parent catalyst, strong acidity was weakened and weak acidity was strengthened by the boron added to the zeolite framework. The catalytic performance for the methanol-to-propylene (MTP) reaction was evaluated in a fixed bed at 460°C. The MTP results showed a great increase in propylene selectivity and excellent stability for B-HZSM-5. The catalyst exhibited about 81% selectivity to C2=–C4= olefins, with 40% selectivity to propylene as the major component, at near 100% methanol conversion, and maintained stable performance over the 100 h study period.

Slug Tracking Simulation of Severe Slugging Experiments

Experimental data from an atmospheric air/water terrain slugging case have been made available by the Shell Amsterdam research center and have been subjected to numerical simulation and comparison with a one-dimensional two-phase slug tracking simulator under development at the Norwegian University of Science and Technology. The code is based on tracking liquid slugs in pipelines by means of a Lagrangian grid formulation implemented in C++ using object-oriented techniques. An existing hybrid spatial discretization scheme is tested, in which the stratified regions are modelled by the two-fluid model. The slug regions are treated as incompressible, thus requiring a single momentum balance over the whole slug. Upon comparison with the experimental data, the period of the simulated severe slugging cycle is observed to be sensitive to slug generation in the horizontal parts of the system. Two different slug initiation methods have been tested with the slug tracking code, and grid dependency has been investigated.

Value Engineering and Its Effect on the Reduction of Industrial Organization Energy Expenses

A review of energy consumption patterns and rates in Iran shows that, unfortunately, the optimization and conservation of energy in the country's active industries lacks a practical and effective method, and in most factories energy consumption and rates are higher than in similar industries in industrialized countries. The increasing demand for electrical energy and the overhead costs it imposes on organizations force companies to search for suitable approaches to optimize energy consumption and manage demand. The application of value engineering techniques is among these approaches. Value engineering is considered a powerful tool for improving profitability; it is used to reduce expenses, increase profits, improve quality, increase market share, perform work in shorter durations, and make more efficient use of resources. In this article, we review value engineering and its capability to create effective transformations in industrial organizations in order to reduce energy costs. The results are investigated and described through a case study in the Mazandaran wood and paper industries, the biggest consumer of energy in northern Iran, to present the effects of the tasks performed to optimize energy consumption using value engineering techniques.

Implementing an Intuitive Reasoner with a Large Weather Database

In this paper, the implementation of a rule-based intuitive reasoner is presented. The implementation comprises two parts: the rule induction module and the intuitive reasoner itself. A large weather database was acquired as the data source. Twelve weather variables from those data were chosen as the "target variables" whose values were predicted by the intuitive reasoner. A "complex" situation was simulated by making only subsets of the data available to the rule induction module. As a result, the induced rules were based on incomplete information with variable levels of certainty. The certainty level was modeled by a metric called "Strength of Belief", which was assigned to each rule or datum as ancillary information about the confidence in its accuracy. Two techniques were employed to induce rules from the data subsets: decision trees and multi-polynomial regression, for the discrete and the continuous target variables respectively. The intuitive reasoner was tested for its ability to use the induced rules to predict the classes of the discrete target variables and the values of the continuous target variables. The intuitive reasoner implemented two types of reasoning, fast and broad, where, by analogy to human thought, the former corresponds to fast decision making and the latter to deeper contemplation. For reference, a weather data analysis approach that had been applied to similar tasks was adopted to analyze the complete database and create predictive models for the same 12 target variables. The values predicted by the intuitive reasoner and the reference approach were compared with actual data. The intuitive reasoner reached near-100% accuracy for two continuous target variables. For the discrete target variables, the intuitive reasoner predicted at least 70% as accurately as the reference reasoner. Since the intuitive reasoner operated on rules derived from only about 10% of the total data, it demonstrated potential advantages in dealing with sparse data sets compared with conventional methods.
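
A hedged sketch of the rule-induction step for a discrete target follows: a decision tree is trained on roughly 10% of a synthetic data set, and each leaf is tagged with a stand-in "Strength of Belief" combining purity and support. The data, the depth limit, and the belief formula are assumptions, not the paper's definitions.

```python
# Hedged sketch of decision-tree rule induction on a data subset, with a
# stand-in "Strength of Belief" per extracted rule (purity times support).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 6))                     # stand-in weather variables
y = (X[:, 0] + X[:, 3] > 0).astype(int)            # stand-in discrete target

subset = rng.choice(len(X), size=500, replace=False)   # ~10% of the data
tree = DecisionTreeClassifier(max_depth=3).fit(X[subset], y[subset])

# Strength of Belief per leaf: class purity weighted by the fraction of the
# subset that the leaf covers (one stand-in choice among many possible).
leaves = tree.apply(X[subset])
for leaf in np.unique(leaves):
    mask = leaves == leaf
    purity = np.bincount(y[subset][mask]).max() / mask.sum()
    belief = purity * mask.sum() / len(subset)
    print(f"leaf {leaf}: purity={purity:.2f}, strength_of_belief={belief:.2f}")
```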