Controllability of Efficiency of Antiviral Therapy in Hepatitis B Virus Infections

An optimal control problem for a mathematical model of the efficiency of antiviral therapy in hepatitis B virus infections is considered. The aim of the study is to control new viral production, block new cell infections and maintain the number of uninfected cells within a given range. The optimal controls represent the efficiency of antiviral therapy in inhibiting viral production and preventing new infections. By defining a cost functional, the optimal control problem is converted into a constrained optimization problem and the first-order optimality system is derived. For the numerical simulation, we propose a steepest descent algorithm based on the adjoint variable method. A computer program in MATLAB is developed for the numerical simulations.
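
A minimal sketch of the forward-backward steepest-descent scheme alluded to above, applied here to a toy scalar linear-quadratic problem rather than the HBV model itself; the dynamics, cost weights, step size and iteration count are illustrative assumptions.

```python
import numpy as np

# Toy problem standing in for the HBV model: minimize J = int(x^2 + u^2) dt
# subject to x' = -x + u, x(0) = 1.  Hamiltonian: H = x^2 + u^2 + p(-x + u).
T, N = 5.0, 500
dt = T / N
x0, alpha, iters = 1.0, 0.1, 200

u = np.zeros(N + 1)                       # initial guess for the control
for _ in range(iters):
    # Forward sweep: state equation x' = -x + u (explicit Euler).
    x = np.empty(N + 1); x[0] = x0
    for i in range(N):
        x[i + 1] = x[i] + dt * (-x[i] + u[i])
    # Backward sweep: adjoint p' = -dH/dx = -2x + p with p(T) = 0.
    p = np.empty(N + 1); p[-1] = 0.0
    for i in range(N - 1, -1, -1):
        p[i] = p[i + 1] - dt * (-2.0 * x[i + 1] + p[i + 1])
    # Steepest-descent update along -dH/du = -(2u + p).
    u -= alpha * (2.0 * u + p)

print("approximate cost:", dt * np.sum(x**2 + u**2))
```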

Thermodynamic Optimization of Turboshaft Engine using Multi-Objective Genetic Algorithm

In this paper, multi-objective genetic algorithms are employed for Pareto-based optimization of ideal turboshaft engines. In multi-objective optimization, a number of conflicting objective functions are to be optimized simultaneously. The objective functions considered for optimization are the specific thrust (F/ṁ₀), the specific fuel consumption (S_P), the specific output shaft power (Ẇ_shaft/ṁ₀) and the overall efficiency (η_O). These objectives usually conflict with each other. The design variables consist of thermodynamic parameters (compressor pressure ratio, turbine temperature ratio and Mach number). In the first stage, single-objective optimization is investigated, and the NSGA-II method is then used for multi-objective optimization. Optimization procedures are performed for two and four objective functions and the results are compared for the ideal turboshaft engine. In order to investigate the optimal thermodynamic behavior of two objectives, different sets, each including two of the output parameters as objectives, are considered individually. For each set the Pareto front is depicted. The sets of decision variables selected on the basis of this Pareto front yield the best possible combinations of the corresponding objective functions. No point on the Pareto front is superior to another, but all of them are superior to any other point. In the case of four-objective optimization the results are given in tables.
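
For reference, the Pareto-front concept used above amounts to keeping the non-dominated candidates; a small self-contained sketch with made-up two-objective values (fuel consumption minimized, specific thrust maximized and therefore stored negated):

```python
import numpy as np

def pareto_front(points):
    """Return the non-dominated rows of `points`, assuming every
    objective is to be minimized (negate any maximized objective first)."""
    pts = np.asarray(points, dtype=float)
    keep = np.ones(len(pts), dtype=bool)
    for i, p in enumerate(pts):
        if not keep[i]:
            continue
        # p is dominated if some other point is <= in all objectives
        # and strictly < in at least one.
        dominated = np.all(pts <= p, axis=1) & np.any(pts < p, axis=1)
        if dominated.any():
            keep[i] = False
    return pts[keep]

# Hypothetical engine cycles: [specific fuel consumption, -specific thrust].
candidates = np.array([[0.30, -900.0],
                       [0.28, -850.0],
                       [0.33, -950.0],
                       [0.31, -840.0]])
print(pareto_front(candidates))   # the last cycle is dominated and dropped
```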

A Combinatorial Approach to Planning Manufacturing Safety Programme

Despite many success stories of manufacturing safety, many organizations are still reluctant, perceiving it as cost-increasing and time-consuming. A clear contributor may be the use of lagging indicators rather than leading-indicator measures. The study therefore proposes a combinatorial model for determining the best safety strategy. Combination theory and cost-benefit analysis were employed to develop a monetary saving/loss function in terms of the value of preventions and the cost of the prevention strategy. Documentation, interviews and a structured questionnaire were employed to collect before-and-after safety programme records from a tobacco company for the periods 1993-2001 (pre-safety) and 2002-2008 (safety period) for the model application. Three combinatorial alternatives A, B and C were obtained, resulting in 4, 6 and 4 strategies respectively, with PPE and training being predominant. A total of 728 accidents were recorded over the 9-year pre-safety period and 163 accidents over the 7-year safety period. Six prevention activities (alternative B) yielded the best results in all the years of operation except 2004. The study provides a leading resource for planning a successful safety programme.
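
A toy sketch of the combinatorial search behind such a model: every subset of prevention activities is a candidate strategy whose net saving is the value of the accidents it prevents minus its cost. The activities, costs and values below are invented for illustration, not the company's data.

```python
from itertools import combinations

# Hypothetical prevention activities with assumed annual cost and assumed
# monetary value of the accidents they prevent.
activities = {
    "PPE":        {"cost": 10_000, "value": 35_000},
    "Training":   {"cost": 12_000, "value": 40_000},
    "Inspection": {"cost":  8_000, "value": 18_000},
    "Signage":    {"cost":  3_000, "value":  2_000},
}

def net_saving(strategy):
    """Monetary saving/loss of a strategy = value of preventions - cost."""
    value = sum(activities[a]["value"] for a in strategy)
    cost = sum(activities[a]["cost"] for a in strategy)
    return value - cost

# Enumerate every combination (each subset is one candidate strategy)
# and rank the strategies by net saving.
strategies = [c for r in range(1, len(activities) + 1)
              for c in combinations(activities, r)]
best = max(strategies, key=net_saving)
print("best strategy:", best, "net saving:", net_saving(best))
```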

Design of Variable Fractional-Delay FIR Differentiators

In this paper, the least-squares design of variable fractional-delay (VFD) finite impulse response (FIR) digital differentiators is proposed. The transfer function is formulated so that the Farrow structure can be applied to realize the designed system. The symmetric characteristics of the filter coefficients are also derived, which reduces the complexity by saving almost half of the coefficients. Moreover, all the elements of the vectors and matrices involved in the optimization process can be represented in closed form, which makes the design easier. A design example is presented to illustrate the effectiveness of the proposed method.
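
A brute-force numerical sketch of the least-squares idea (not the paper's closed-form solution, and without the coefficient-symmetry savings): the Farrow sub-filter coefficients are fitted over a frequency/delay grid against the ideal VFD differentiator response jω·e^{-jω(n0+p)}. The filter length, polynomial order and grids are illustrative choices.

```python
import numpy as np

N, M = 9, 3                 # FIR length and Farrow polynomial order (assumed)
n0 = (N - 1) / 2            # nominal integer-part group delay
w = np.linspace(0.05, 0.9, 60) * np.pi      # frequency grid
p = np.linspace(-0.5, 0.5, 21)              # fractional-delay grid

# Unknowns c[n, m]: impulse response h(n, p) = sum_m c[n, m] * p**m.
rows, rhs = [], []
for wk in w:
    for pl in p:
        basis = np.array([pl**m * np.exp(-1j * wk * n)
                          for n in range(N) for m in range(M + 1)])
        rows.append(basis)
        rhs.append(1j * wk * np.exp(-1j * wk * (n0 + pl)))  # ideal VFD differentiator
A, b = np.array(rows), np.array(rhs)

# Real least-squares problem obtained by stacking real and imaginary parts.
A_ri = np.vstack([A.real, A.imag])
b_ri = np.concatenate([b.real, b.imag])
c = np.linalg.lstsq(A_ri, b_ri, rcond=None)[0].reshape(N, M + 1)

# Check the designed response at one (frequency, delay) pair.
wk, pl = 0.4 * np.pi, 0.25
H = sum(c[n, m] * pl**m * np.exp(-1j * wk * n)
        for n in range(N) for m in range(M + 1))
print(H, 1j * wk * np.exp(-1j * wk * (n0 + pl)))
```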

Integration of Image and Patient Data, Software and International Coding Systems for Use in a Mammography Research Project

The analysis of mammographic images and data to facilitate modelling or computer-aided diagnostic (CAD) software development is best done using a common database that can handle various mammographic image file formats and relate these to other patient information. This would optimize the use of the data, as both primary reporting and enhanced information extraction from research data could be performed on a single dataset. One desired improvement is the integration of DICOM file header information into the database as an efficient and reliable source of supplementary patient information intrinsically available in the images. The purpose of this paper was to design a suitable database to link and integrate different types of image files and to gather common information that can be further used for research purposes. An interface was developed for accessing, adding, updating, modifying and extracting data from the common database, enhancing the possible future application of the data in CAD processing. Envisaged future developments include an advanced search function that selects image files based on descriptor combinations, whose results can then be used for specific CAD processing and other research, and a user-friendly configuration utility for importing the required fields from the DICOM files.
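
As an illustration of the kind of integration described, a sketch that pulls a few common header fields from a DICOM file with pydicom and stores them alongside the file path in an SQLite table; the schema and the chosen fields are hypothetical, not the project's actual design.

```python
import sqlite3
import pydicom   # pip install pydicom

def index_dicom(db_path, dicom_paths):
    """Read selected DICOM header fields and register them in a small table."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS images (
                       file_path   TEXT PRIMARY KEY,
                       patient_id  TEXT,
                       study_date  TEXT,
                       modality    TEXT)""")
    for path in dicom_paths:
        ds = pydicom.dcmread(path, stop_before_pixels=True)  # header only
        con.execute("INSERT OR REPLACE INTO images VALUES (?, ?, ?, ?)",
                    (path,
                     str(ds.get("PatientID", "")),
                     str(ds.get("StudyDate", "")),
                     str(ds.get("Modality", ""))))
    con.commit()
    con.close()

# index_dicom("mammo.db", ["case001.dcm", "case002.dcm"])  # hypothetical files
```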

Component Based Framework for Authoring and Multimedia Training in Mathematics

The new programming technologies allow for the creation of components which can be automatically or manually assembled to reach a new experience in understanding and mastering knowledge, or in acquiring skills for a specific knowledge area. The project proposes an interactive framework that permits the creation, combination and utilization of components that are specific to mathematical training in high schools. The main objectives of the framework are:
• authoring lessons by the teacher or the students; all they need are simple operating skills for Equation Editor (or something similar, or LaTeX); the rest are just drag & drop operations, inserting data into a grid, or navigating through menus;
• allowing audio presentations of mathematical texts and solving hints (more easily understood by the students);
• offering graphical representations of a mathematical function edited in Equation Editor;
• storing learning objects in a database;
• storing predefined lessons (efficient for expressions and commands, the rest being calculations; this allows high compression);
• viewing and/or modifying predefined lessons, according to the curricula.
The whole framework is focused on a minicompiler for mathematical expressions, which stores code that is later used for different purposes (tables, graphics and optimisations); a toy sketch of this idea is given below. As for programming technologies, a Visual C# .NET implementation is proposed. New and innovative digital learning objects for mathematics will be developed; they are capable of interpreting, contextualizing and reacting depending on the architecture in which they are assembled.
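
A toy illustration of the expression "minicompiler" idea, written here in Python rather than the C# the project proposes: a one-variable expression is compiled once into a callable and then tabulated, e.g. for a value table or a graph. The allowed operators and the reliance on Python's ast module are assumptions of this sketch.

```python
import ast
import math
import operator

# Supported binary operators for the toy compiler.
_OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
        ast.Mult: operator.mul, ast.Div: operator.truediv,
        ast.Pow: operator.pow}

def compile_expr(text):
    """Compile a one-variable expression such as 'x**2 - 3*x + sin(x)'."""
    tree = ast.parse(text, mode="eval")

    def _eval(node, x):
        if isinstance(node, ast.Expression):
            return _eval(node.body, x)
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left, x), _eval(node.right, x))
        if isinstance(node, ast.UnaryOp) and isinstance(node.op, ast.USub):
            return -_eval(node.operand, x)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.Name) and node.id == "x":
            return x
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            return getattr(math, node.func.id)(*(_eval(a, x) for a in node.args))
        raise ValueError("unsupported expression element")

    return lambda x: _eval(tree, x)

f = compile_expr("x**2 - 3*x + 2")
table = [(x, f(x)) for x in range(-2, 5)]   # value table for a graph
print(table)
```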

Fenestration Effects on Cross Ventilation for a Typical Taiwanese School Building When Applying Wind Profile

Appropriate ventilation in a classroom helps enhance the air exchange rate and student concentration. This study focuses on the effects of fenestration in a four-story school building by performing numerical simulation of the building while considering the indoor and outdoor environments simultaneously. The wind profile function embedded in the PHOENICS code was set as the inlet boundary condition for a suburban environment. Sixteen fenestration combinations were compared in a classroom containing thirty seats. This study evaluates the mean age of air (AGE) and the airflow pattern of a classroom on different floors. Considering both the wind profile and fenestration effects, the airflow on higher floors is channeled toward the area near the ceiling and results in an older mean age of air in the breathing zone. The results of this study serve as a useful guide for enhancing natural ventilation in a typical school building.
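
For reference, an atmospheric-boundary-layer inlet of this kind is commonly prescribed with a power-law profile; a minimal sketch follows, where the exponent 0.25 for suburban terrain and the reference values are illustrative textbook assumptions, not the exact PHOENICS settings used in the study.

```python
import numpy as np

def wind_profile(z, u_ref=3.0, z_ref=10.0, alpha=0.25):
    """Power-law wind speed profile U(z) = U_ref * (z / z_ref)**alpha."""
    return u_ref * (np.asarray(z, dtype=float) / z_ref) ** alpha

heights = np.array([1.5, 4.5, 7.5, 10.5])    # assumed mid-heights of four storeys
print(wind_profile(heights))                 # inlet speeds seen by each floor
```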

Requirements and Design of an RFID-Based E-Manufacturing System

This paper proposes the requirements and design of an RFID-based system for shop floor control (SFC) in order to achieve real-time controllability of the factory, allowing an E-Manufacturing system to be developed. The detailed logical specifications of the core functions and the design diagrams of the RFID-based system are developed. RFID deployment in E-Manufacturing systems is then investigated.

Delay-Dependent Stability Criteria for Linear Time-Delay System of Neutral Type

This paper proposes improved delay-dependent stability conditions for linear time-delay systems of neutral type. The proposed method employs a suitable Lyapunov-Krasovskii functional and a new form of the augmented system. New delay-dependent stability criteria for the systems are established in terms of linear matrix inequalities (LMIs), which can be easily solved by various effective optimization algorithms. Numerical examples show that the proposed method is effective and provides less conservative results.
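
To illustrate how such LMI conditions are checked numerically, a sketch using cvxpy for the classical, much simpler delay-independent test for a retarded system x'(t) = A x(t) + A1 x(t-τ) (Lyapunov-Krasovskii functional V = xᵀPx + ∫ xᵀQx). The matrices are invented, and this is not the paper's delay-dependent, neutral-type criterion.

```python
import numpy as np
import cvxpy as cp

# Illustrative retarded system x'(t) = A x(t) + A1 x(t - tau).
A = np.array([[-3.0, 0.0], [0.0, -3.0]])
A1 = np.array([[0.5, 0.2], [0.1, 0.5]])
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)
eps = 1e-6

# Delay-independent condition:
# [[A'P + PA + Q, P A1], [A1' P, -Q]] < 0 with P > 0, Q > 0.
lmi = cp.bmat([[A.T @ P + P @ A + Q, P @ A1],
               [A1.T @ P,            -Q]])
constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               lmi << -eps * np.eye(2 * n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print("stability LMI feasible:", prob.status == cp.OPTIMAL)
```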

Evaluating the Response of Rainfed-Chickpea to Population Density in Iran, Using Simulation

The response of the growth and yield of rainfed chickpea to population density should be evaluated on the basis of long-term experiments so as to include climate variability; this is achievable only through simulation. In this simulation study, the evaluation was done by running the CYRUS model on long-term daily weather data for five locations in Iran. The tested population densities were 7 to 59 stands per square meter (with an interval of 2). Various functions, including quadratic, segmented, beta, broken-linear and dent-like functions, were tested. Considering the root mean square of deviations and the linear regression statistics [intercept (a), slope (b) and correlation coefficient (r)] for predicted versus observed variables, the quadratic and broken-linear functions appeared appropriate for describing the changes in biomass and grain yield, and in harvest index, respectively. Results indicated that in all locations grain yield tends to increase with increasing population density but subsequently decreases; the same was true for biomass in all five locations. The harvest index appeared to plateau across low population densities but decreased as density increased further. The turning point (optimum population density) for grain yield was 30.68 stands per square meter in Isfahan, 30.54 in Shiraz, 31.47 in Kermanshah, 34.85 in Tabriz and 32.00 in Mashhad. The optimum population density for biomass ranged from 24.6 (Tabriz) to 35.3 stands per square meter (Mashhad). For harvest index it varied between 35.87 and 40.12 stands per square meter.
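
As an illustration of how a turning point of a quadratic response is located, a sketch that fits yield versus density with numpy and reads the optimum off the fitted parabola; the density-yield pairs below are made up, not the CYRUS simulation outputs.

```python
import numpy as np

# Hypothetical grain-yield response (kg/ha) to population density (stands/m^2).
rng = np.random.default_rng(0)
density = np.arange(7, 60, 2, dtype=float)
yield_kg = -0.9 * (density - 31.0) ** 2 + 1500.0 + rng.normal(0, 20, density.size)

# Fit y = a*d^2 + b*d + c and locate the turning point at d* = -b / (2a).
a, b, c = np.polyfit(density, yield_kg, 2)
optimum = -b / (2.0 * a)
rmsd = np.sqrt(np.mean((np.polyval([a, b, c], density) - yield_kg) ** 2))
print(f"optimum density ~ {optimum:.2f} stands/m^2, RMSD = {rmsd:.1f}")
```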

Predicting the Impact of the Defect on the Overall Environment in Function Based Systems

A great deal of work has been done on predicting the fault proneness of software systems. However, the severity of the faults is more important than the number of faults existing in the developed system, since major faults matter most to a developer and need immediate attention. In this paper, we try to predict the level of impact of the existing faults in software systems. A Neuro-Fuzzy based predictor model is applied to NASA's public domain defect dataset, coded in the C programming language. Correlation-based Feature Selection (CFS) evaluates the worth of a subset of attributes by considering the individual predictive ability of each feature along with the degree of redundancy between them; CFS is therefore used to select the metrics that are most highly correlated with the level of severity of faults. The results are compared with the prediction results of Logistic Model Trees (LMT), which was earlier reported as the best technique in [17]. The results are recorded in terms of Accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). They show that the Neuro-Fuzzy based model provides relatively better prediction accuracy than the other models and can therefore be used for modeling the level of impact of faults in function-based systems.
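
A simplified stand-in for the CFS step: the real CFS uses a subset-merit heuristic, whereas the sketch below merely ranks metrics by correlation with severity and greedily drops metrics that are redundant with already-kept ones. The synthetic data frame and threshold are hypothetical.

```python
import numpy as np
import pandas as pd

def select_features(df, target, redundancy_threshold=0.85):
    """Greedy correlation-based selection: keep metrics most correlated with
    the target, skipping metrics highly correlated with an already-kept one."""
    corr_with_target = df.corr()[target].drop(target).abs().sort_values(ascending=False)
    selected = []
    for metric in corr_with_target.index:
        redundant = any(abs(df[metric].corr(df[kept])) > redundancy_threshold
                        for kept in selected)
        if not redundant:
            selected.append(metric)
    return selected

# Hypothetical module metrics and a numeric severity level.
rng = np.random.default_rng(0)
loc = rng.integers(10, 500, 200)
data = pd.DataFrame({
    "loc": loc,
    "cyclomatic": loc * 0.05 + rng.normal(0, 1, 200),   # redundant with loc
    "operands": rng.integers(5, 200, 200),
    "severity": (loc > 250).astype(int) + rng.integers(0, 2, 200),
})
print(select_features(data, "severity"))
```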

Characterisation and Classification of Natural Transients

Monitoring lightning electromagnetic pulses (sferics) and other terrestrial as well as extraterrestrial transient radiation signals is of considerable interest for practical and theoretical purposes in astro- and geophysics as well as meteorology. To manage a continuous flow of data, automation of the detection and classification process is important. Features based on a combination of wavelet and statistical methods proved efficient for the analysis and characterisation of transients, and serve as input to a radial basis function network that is trained to discriminate transients ranging from pulse-like to wave-like.
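
A minimal sketch of the kind of wavelet-plus-statistics feature vector described, using PyWavelets; the choice of 'db4', four decomposition levels and the synthetic test pulse are assumptions of this sketch.

```python
import numpy as np
import pywt                      # pip install PyWavelets
from scipy.stats import kurtosis, skew

def transient_features(signal, wavelet="db4", levels=4):
    """Statistical descriptors of each wavelet sub-band of a transient."""
    coeffs = pywt.wavedec(signal, wavelet, level=levels)
    feats = []
    for band in coeffs:
        feats += [band.mean(), band.std(), skew(band), kurtosis(band),
                  np.sum(band ** 2)]          # sub-band energy
    return np.array(feats)

# Synthetic pulse-like transient for illustration.
t = np.linspace(0, 1, 1024)
pulse = np.exp(-200 * (t - 0.3) ** 2) + 0.05 * np.random.randn(t.size)
print(transient_features(pulse).shape)        # one fixed-length feature vector
```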

H∞ Approach to Functional Projective Synchronization for Chaotic Systems with Disturbances

This paper presents a method for the functional projective H∞ synchronization problem of chaotic systems with external disturbance. Based on Lyapunov theory and a linear matrix inequality (LMI) formulation, a novel feedback controller is established that not only guarantees stable synchronization of the drive and response systems but also reduces the effect of the external disturbance to an H∞ norm constraint.

More on Gaussian Quadratures for Fuzzy Functions

In this paper, Gaussian-type quadrature rules for fuzzy functions are discussed. The error representation and convergence theorems are given. Moreover, four kinds of Gaussian-type quadrature rules with error terms for approximating fuzzy integrals are presented. The present paper complements the theoretical results of T. Allahviranloo and M. Otadi [T. Allahviranloo, M. Otadi, Gaussian quadratures for approximate of fuzzy integrals, Applied Mathematics and Computation 170 (2005) 874-885]. The obtained results are illustrated by solving some numerical examples.
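
A small numerical sketch of the underlying idea: a fuzzy-valued integrand is handled level-wise, applying the same Gauss-Legendre rule to its lower and upper branches at each membership level r. The particular fuzzy function below is a made-up illustration, not one of the paper's examples.

```python
import numpy as np

def gauss_legendre_fuzzy(f_lower, f_upper, a, b, r, n=5):
    """Apply an n-point Gauss-Legendre rule to both branches of a
    fuzzy-valued function at membership level r in [0, 1]."""
    nodes, weights = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (b - a) * nodes + 0.5 * (b + a)     # map [-1, 1] -> [a, b]
    w = 0.5 * (b - a) * weights
    lower = np.sum(w * f_lower(x, r))
    upper = np.sum(w * f_upper(x, r))
    return lower, upper

# Illustrative fuzzy integrand: f(x) = x * u, with u a triangular fuzzy
# number whose r-cut is [r, 2 - r]; the exact integral over [0, 1] is
# [r/2, (2 - r)/2].
lo, up = gauss_legendre_fuzzy(lambda x, r: r * x,
                              lambda x, r: (2 - r) * x,
                              0.0, 1.0, r=0.5)
print(lo, up)        # ~0.25 and ~0.75
```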

An Ant-based Clustering System for Knowledge Discovery in DNA Chip Analysis Data

Biological data has several characteristics that strongly differentiate it from typical business data: it is much more complex, usually large in size, and continuously changing. Until recently, business data has been the main target for discovering trends, patterns or future expectations. However, with the recent rise of biotechnology, the powerful technology that was used for analyzing business data is now being applied to biological data. With this advanced technology at hand, the main trend in biological research is rapidly changing from structural DNA analysis to understanding the cellular functions of DNA sequences. DNA chips are now being used to perform experiments, and DNA analysis processes are widely used by researchers. Clustering is one of the important processes used for grouping together similar entities. There are many clustering algorithms, such as hierarchical clustering, self-organizing maps, K-means clustering and so on. In this paper, we propose a clustering algorithm that imitates an ecosystem, taking into account the features of biological data. We implemented the system using an Ant-Colony clustering algorithm, and the system decides the number of clusters automatically. The system processes the input biological data, runs the Ant-Colony algorithm, draws the Topic Map, assigns clusters to the genes and displays the output. We tested the algorithm with test data of 100 to 1,000 genes and 24 samples, and the results are promising for applying this algorithm to clustering DNA chip data.
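
For orientation, ant-based clustering is typically driven by local pick-up and drop decisions of the Lumer-Faieta type; the sketch below shows those decision rules with invented constants and a toy similarity scaling, and is not necessarily the exact rule set of the system described above.

```python
import numpy as np

def local_density(item, neighbours, alpha=10.0):
    """Average similarity of `item` to the expression profiles around it."""
    if len(neighbours) == 0:
        return 0.0
    d = np.linalg.norm(neighbours - item, axis=1)
    return max(0.0, float(np.mean(1.0 - d / alpha)))

def pick_probability(f, k1=0.1):
    """An unladen ant picks an item up more readily in a dissimilar spot."""
    return (k1 / (k1 + f)) ** 2

def drop_probability(f, k2=0.15):
    """A laden ant drops its item more readily among similar items."""
    return 2.0 * f if f < k2 else 1.0

# Toy gene expression profiles (rows = genes, columns = 24 samples).
rng = np.random.default_rng(1)
genes = rng.normal(size=(6, 24))
f = local_density(genes[0], genes[1:4])
print(pick_probability(f), drop_probability(f))
```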

Fingerprint Compression Using Contourlet Transform and Multistage Vector Quantization

This paper presents a new fingerprint coding technique based on the contourlet transform and multistage vector quantization. Wavelets have shown their ability to represent natural images that contain smooth areas separated by edges. However, wavelets cannot efficiently exploit the fact that the edges found in fingerprints are usually smooth curves. This issue is addressed by directional transforms, known as contourlets, which have the property of preserving edges. The contourlet transform is a new extension of the wavelet transform in two dimensions using nonseparable and directional filter banks. The computation and storage requirements are the major difficulty in implementing a vector quantizer: in the full-search algorithm, the computation and storage complexity is an exponential function of the number of bits used in quantizing each frame of spectral information. The storage requirement of multistage vector quantization is lower than that of full-search vector quantization. The coefficients of the contourlet transform are quantized by multistage vector quantization, and the quantized coefficients are encoded by Huffman coding. The results obtained are tabulated and compared with those of existing wavelet-based techniques.
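
A compact sketch of the two-stage vector quantization step, using scikit-learn's k-means in place of a dedicated codebook trainer and random vectors standing in for contourlet coefficient blocks; the codebook sizes and data are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
coeffs = rng.normal(size=(2048, 16))     # stand-in for contourlet coefficient blocks

# Stage 1: coarse codebook on the coefficient vectors.
stage1 = KMeans(n_clusters=64, n_init=5, random_state=0).fit(coeffs)
residual = coeffs - stage1.cluster_centers_[stage1.labels_]

# Stage 2: a smaller codebook quantizes the stage-1 residuals, so storage is
# 64 + 32 codewords instead of a single 64*32 full-search codebook.
stage2 = KMeans(n_clusters=32, n_init=5, random_state=0).fit(residual)
reconstructed = (stage1.cluster_centers_[stage1.labels_]
                 + stage2.cluster_centers_[stage2.labels_])

mse = np.mean((coeffs - reconstructed) ** 2)
print(f"indices per vector: 2, reconstruction MSE: {mse:.4f}")
# The index streams (stage1.labels_, stage2.labels_) would then be
# entropy coded, e.g. with Huffman coding, as in the paper.
```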

Usability and Functionality: A Comparison of Key Project Personnel's and Potential Users' Evaluations

Meeting users' requirements is one of the predictors of project success. There should be a match between the expectations of the users and the perception of key project personnel with respect to usability and functionality. The aim of this study is to compare key project personnel's and potential users' (customer representatives') evaluations of the relative importance of usability and functionality factors in a software design project. The Analytic Network Process (ANP) was used to analyze the relative importance of the factors. The results show that navigation and interaction are the most significant factors, and satisfaction and efficiency are the least important factors, for both groups. Further, the similar orderings and scores of the usability and functionality factors for the two groups indicate that key project personnel have captured the expectations and requirements of potential users accurately.
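
ANP builds on pairwise comparison matrices whose priorities come from the principal eigenvector (as in AHP); a sketch with invented 3x3 judgments on the Saaty 1-9 scale, including the usual consistency check, is given below as a reminder of that core computation.

```python
import numpy as np

# Hypothetical pairwise comparisons of three factors
# (e.g. navigation vs interaction vs efficiency) on the Saaty 1-9 scale.
A = np.array([[1.0, 2.0, 5.0],
              [1/2, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                     # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
priorities = w / w.sum()                        # normalized priority vector

# Saaty's consistency index/ratio as a sanity check on the judgments.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.58                                  # random index for n = 3
print(priorities, f"CR = {cr:.3f}")
```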

A Cohesive Lagrangian Swarm and Its Application to Multiple Unicycle-like Vehicles

Swarm principles are increasingly being used to design controllers for the coordination of multi-robot systems or, in general, multi-agent systems. This paper proposes a two-dimensional Lagrangian swarm model that enables the planar agents, modeled as point masses, to swarm whilst effectively avoiding each other and obstacles in the environment. A novel method, based on an extended Lyapunov approach, is used to construct the model. Importantly, the Lyapunov method ensures a form of practical stability that guarantees an emergent behavior, namely, a cohesive and well-spaced swarm with a constant arrangement of individuals about the swarm centroid. Computer simulations illustrate this basic feature of collective behavior. As an application, we show how multiple planar mobile unicycle-like robots swarm to eventually form patterns in which their velocities and orientations stabilize.
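
A toy planar swarm of point masses, combining a centroid-attraction term with a short-range inter-agent repulsion term; it reproduces the cohesive, well-spaced behavior qualitatively but is not the paper's Lyapunov-derived control law, and the gains, ranges and time step are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n, dt, steps = 20, 0.05, 400
pos = rng.uniform(-5, 5, size=(n, 2))        # planar point-mass agents
vel = np.zeros((n, 2))

for _ in range(steps):
    centroid = pos.mean(axis=0)
    force = 1.0 * (centroid - pos)           # cohesion: attraction to centroid
    for i in range(n):
        diff = pos[i] - pos                  # vectors from every agent to agent i
        dist = np.linalg.norm(diff, axis=1)
        close = (dist > 0) & (dist < 1.5)    # short-range repulsion only
        if close.any():
            force[i] += np.sum(diff[close] / dist[close, None] ** 2, axis=0)
    vel = 0.9 * vel + dt * force             # damped point-mass dynamics
    pos = pos + dt * vel

spacing = np.linalg.norm(pos - pos.mean(axis=0), axis=1)
print("spread about centroid:", spacing.min().round(2), spacing.max().round(2))
```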

An Ontology Abstract Machine

As more people from non-technical backgrounds are becoming directly involved with large-scale ontology development, the focal point of ontology research has shifted from the more theoretical ontology issues to problems associated with the actual use of ontologies in real-world, large-scale collaborative applications. Recently the National Science Foundation funded a large collaborative ontology development project for which a new formal ontology model, the Ontology Abstract Machine (OAM), was developed to satisfy some unique functional and data representation requirements. This paper introduces the OAM model and the related algorithms that enable maintenance of an ontology that supports node-based user access. The successful software implementation of the OAM model and its subsequent acceptance by a large research community prove its validity and its real-world application value.

A Novel Approach for Protein Classification Using Fourier Transform

Discovering new biological knowledge from high-throughput biological data is a major challenge for bioinformatics today. To address this challenge, we developed a new approach to protein classification. Proteins that are evolutionarily, and thereby functionally, related are said to belong to the same classification. Identifying protein classification is of fundamental importance for documenting the diversity of the known protein universe. It also provides a means to determine the functional roles of newly discovered protein sequences. Our goal is to predict the functional classification of novel protein sequences based on a set of features extracted from each protein sequence. The proposed technique uses datasets extracted from the Structural Classification of Proteins (SCOP) database. A set of spectral-domain features based on the Fast Fourier Transform (FFT) is used. The proposed classifier uses a multilayer back-propagation (MLBP) neural network for protein classification. The maximum classification accuracy is about 91% when applying the classifier to the full four levels of the SCOP database, but it reaches a maximum of 96% when the classification is limited to the family level. The classification results reveal that the spectral domain contains information that can be used for classification with high accuracy. In addition, the results emphasize that sequence similarity measures are of great importance, especially at the family level.
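
A rough sketch of the general pipeline (numeric encoding of the residues, FFT magnitude spectrum as features, then a back-propagation network via scikit-learn's MLPClassifier); the residue-to-number mapping, padding length and toy sequences are placeholders rather than the paper's actual choices.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Placeholder residue-to-number mapping (a real system would use a
# physicochemical scale such as hydrophobicity or EIIP values).
AMINO = "ACDEFGHIKLMNPQRSTVWY"
ENCODE = {aa: i + 1 for i, aa in enumerate(AMINO)}

def fft_features(seq, n_fft=128, n_keep=32):
    """Zero-pad the encoded sequence and keep the low-frequency magnitudes."""
    x = np.zeros(n_fft)
    codes = [ENCODE[a] for a in seq[:n_fft]]
    x[:len(codes)] = codes
    return np.abs(np.fft.rfft(x))[:n_keep]

# Toy training data: two hypothetical "families" with different composition.
rng = np.random.default_rng(0)
def random_seq(bias):
    return "".join(rng.choice(list(bias), size=60))
seqs = [random_seq("ACDEFG") for _ in range(30)] + \
       [random_seq("KLMNPQ") for _ in range(30)]
labels = [0] * 30 + [1] * 30

X = np.array([fft_features(s) for s in seqs])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```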