Adaptive Filtering of Heart Rate Signals for an Improved Measure of Cardiac Autonomic Control

In order to provide accurate heart rate variability indices of sympathetic and parasympathetic activity, the low frequency and high frequency components of an RR heart rate signal must be adequately separated. This is not always possible by applying spectral analysis alone, as power from the high and low frequency components often leaks into the adjacent band. Furthermore, without the respiratory spectrum it is not obvious whether the low frequency component is in fact another respiratory component, which can appear in the lower band. This paper describes an adaptive filter that aids the separation of the low frequency sympathetic and high frequency parasympathetic components of an ECG R-R interval signal, enabling more accurate heart rate variability measures. The algorithm is applied to simulated signals and to heart rate and respiratory signals acquired from an ambulatory monitor incorporating a single-lead ECG and inductive plethysmography sensors embedded in a garment. The results show an improvement over standard heart rate variability spectral measurements.
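
To make the separation concrete, the following minimal sketch (not the paper's exact algorithm) shows an LMS adaptive noise canceller that uses a simultaneously recorded respiration signal as the reference input: the filter estimates the respiration-driven (high frequency) part of the RR series, and the residual is dominated by the low frequency component. The function names, sampling rate and synthetic signal model are illustrative assumptions.

```python
import numpy as np

def lms_separate(rr, resp, n_taps=8, mu=0.01):
    """Adaptive (LMS) separation of a respiration-driven component from an RR series.

    rr   : zero-mean RR-interval series resampled at a uniform rate
    resp : simultaneously sampled respiration signal used as the reference input
    Returns (hf_estimate, lf_residual): the respiration-correlated part of `rr`
    and the residual, which is dominated by the low frequency component.
    """
    w = np.zeros(n_taps)              # adaptive filter weights
    hf = np.zeros_like(rr)
    for n in range(n_taps, len(rr)):
        x = resp[n - n_taps:n][::-1]  # most recent reference samples
        hf[n] = w @ x                 # predicted respiration-driven component
        e = rr[n] - hf[n]             # residual (LF + noise)
        w += 2 * mu * e * x           # LMS weight update
    return hf, rr - hf

# Synthetic example: 0.1 Hz "LF" component plus a 0.25 Hz respiration-driven "HF" component
fs = 4.0
t = np.arange(0, 300, 1 / fs)
resp = np.sin(2 * np.pi * 0.25 * t)
rr = 0.05 * np.sin(2 * np.pi * 0.10 * t) + 0.03 * resp + 0.005 * np.random.randn(t.size)
hf_est, lf_est = lms_separate(rr, resp)
```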

Applying Fuzzy FP-Growth to Mine Fuzzy Association Rules

In data mining, association rules are used to find associations between the different items in a transaction database. As data are collected and stored, valuable rules can be found through association rule mining, which can help managers execute marketing strategies and establish sound market frameworks. This paper aims to use Fuzzy Frequent Pattern growth (FFP-growth) to derive fuzzy association rules. First, we apply fuzzy partition methods and determine a membership function over the quantitative value of each transaction item. Next, we implement FFP-growth to carry out the data mining process. In addition, to understand the impact of the Apriori algorithm and the FFP-growth algorithm on execution time and the number of generated association rules, experiments are performed using databases of different sizes and different thresholds. Finally, the experimental results show that the FFP-growth algorithm is more efficient than the other existing methods.
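
As an illustration of the fuzzy partition step, the sketch below (an assumption, not the paper's exact membership functions) converts a quantitative purchase amount into membership degrees in three triangular fuzzy regions; these degrees would then feed the FFP-growth mining step. The region names and breakpoints are hypothetical.

```python
def triangular(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical fuzzy regions for the purchased quantity of an item
regions = {"Low": (0, 1, 6), "Middle": (3, 6, 9), "High": (6, 11, 16)}

def fuzzify(quantity):
    """Return the membership degree of `quantity` in each fuzzy region."""
    return {name: round(triangular(quantity, *abc), 3) for name, abc in regions.items()}

print(fuzzify(5))   # e.g. {'Low': 0.2, 'Middle': 0.667, 'High': 0.0}
```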

An Interactive Web-based Simulation Tool for Surgical Thread

Interactive web-based computer simulations are needed by the medical community to replicate the experience of surgical procedures as closely and realistically as possible without the need to practice on cadavers, animals and/or plastic models. In this paper, we review the current state of research on simulations of surgical thread, identify future needs and present our proposed plans to meet them. Our goal is to create a physics-based simulator that predicts the behavior of surgical thread when subjected to conditions commonly encountered during surgery. To that end, we will i) develop three-dimensional finite element models based on the Cosserat theory of elasticity, ii) test the results and obtain feedback from the medical community, and iii) develop a web-based user interface to run our simulator and visualize the results. The impacts of our research are that i) it will contribute to the development of a new generation of training for medical school students and ii) the simulator will be useful to expert surgeons in developing new, better and less risky procedures.

Ethics, Identity and Organizational Learning – Challenges for South African Managers

As a result of the ever-changing environment and the demands of organisations' customers, it is important to recognise several key managerial challenges. It is our sincere belief that failure to meet these challenges will ultimately contribute to inevitable problems for organisations. This recognition requires managers, and by implication organisations, to engage in ethical behaviour, identity awareness and organisational learning. All of these aspects reflect the importance of intellectual capital as the competitive weapon of organisations in the future.

Mechanical Design and Theoretical Analysis of a Skip-Cycle Mechanism for an Internal Combustion Engine

Skip cycle is a working strategy for spark ignition engines that allows the effective stroke of an engine to be changed by skipping some of the four-stroke cycles. This study proposes a new mechanism to achieve the desired skip-cycle strategy for internal combustion engines. Air and fuel leakage, which occurs during gas exchange, negatively affects the efficiency of the engine at high speeds and loads. Absolute sealing is assured by the direct use of poppet valves, which are kept fully closed during the skipped mode. All components of the mechanism were designed according to the real dimensions of Anadolu Motor's gasoline engine and modeled in 3D by means of CAD software. As the mechanism operates in two modes, two dynamically equivalent models are established to carry out the force and strength analysis of the critical components.

Visualizing Transit Through a Web Based Geographic Information System

Currently, in many major cities, public transit schedules are disseminated through lists of routes, grids of stop times and static maps. This paper describes a web-based geographic information system which disseminates the same schedule information through intuitive GIS techniques. Using data from Calgary, Canada, a map-based interface has been created to allow users to see routes, stops and moving buses all at once. Zoom and pan controls as well as satellite imagery allow users to apply their personal knowledge of the local geography to obtain faster and more pertinent transit results. Using asynchronous requests to web services, users are immersed in an application where buses and stops can be added and removed interactively, without the need to wait for responses to HTTP requests.

Vector Control of Multimotor Drive

Three-phase induction machines are today a standard for industrial electrical drives. Cost, reliability, robustness and maintenance-free operation are among the reasons these machines are replacing dc drive systems. The development of power electronics and signal processing systems has eliminated one of the greatest disadvantages of such ac systems, which is the issue of control. With modern techniques of field-oriented vector control, variable speed control of induction machines is no longer a disadvantage. The need to increase system performance, particularly when facing limits on the power ratings of power supplies and semiconductors, motivates the use of phase numbers other than three. In this paper, a novel scheme for connecting two three-phase induction motors in parallel, fed by two inverters (a VSI and a CSI), and their vector control is presented.
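
As a minimal illustration of the coordinate change underlying field-oriented vector control, the sketch below implements the standard amplitude-invariant Clarke and Park transforms that map three-phase stator currents into the rotating d-q frame; it is a generic textbook building block, not the parallel-motor control scheme proposed in the paper.

```python
import numpy as np

def clarke(i_a, i_b, i_c):
    """Amplitude-invariant Clarke transform: three-phase currents -> stationary alpha-beta frame."""
    i_alpha = (2 / 3) * (i_a - 0.5 * i_b - 0.5 * i_c)
    i_beta = (2 / 3) * (np.sqrt(3) / 2) * (i_b - i_c)
    return i_alpha, i_beta

def park(i_alpha, i_beta, theta):
    """Park transform: rotate alpha-beta quantities into the rotor-flux-oriented d-q frame."""
    i_d = i_alpha * np.cos(theta) + i_beta * np.sin(theta)
    i_q = -i_alpha * np.sin(theta) + i_beta * np.cos(theta)
    return i_d, i_q

# Balanced sinusoidal currents aligned with the frame angle give a constant d-axis current.
theta = 0.7
i_a = np.cos(theta)
i_b = np.cos(theta - 2 * np.pi / 3)
i_c = np.cos(theta + 2 * np.pi / 3)
print(park(*clarke(i_a, i_b, i_c), theta))   # approximately (1.0, 0.0)
```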

Research of a Multistep Method Applied to Numerical Solution of Volterra Integro-Differential Equation

The solution of many practical problems reduces to the solution of integro-differential equations. For the numerical solution of such equations, quadrature methods, or their combination with multistep or one-step methods, are commonly used. The quadrature methods are applied mainly to compute the integral appearing on the right-hand side of the integro-differential equation. As this integral is of Volterra type, replacing it with an integral sum gives an upper summation limit that depends on the current point at which the integral is evaluated. We therefore obtain an integral sum with a variable boundary, which is difficult to work with. A multistep method with constant coefficients, which is free from this drawback, is presented, together with a way of finding its coefficients.
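
For illustration only, the sketch below shows the kind of quadrature-plus-step scheme the abstract contrasts with: the Volterra integral on the right-hand side of y'(t) = f(t, y) + ∫ K(t, s, y(s)) ds is replaced by a trapezoidal sum with a variable upper limit inside an explicit Euler step. It is not the constant-coefficient multistep method proposed here, and the test problem and step size are assumptions.

```python
import numpy as np

def solve_volterra_ide(f, K, y0, t0, t_end, h):
    """Explicit Euler step for y'(t) = f(t, y) + integral_{t0}^{t} K(t, s, y(s)) ds,
    with the Volterra integral approximated by the trapezoidal rule over past nodes."""
    ts = np.arange(t0, t_end + h, h)
    ys = np.zeros_like(ts)
    ys[0] = y0
    for n in range(len(ts) - 1):
        t_n = ts[n]
        if n == 0:
            integral = 0.0
        else:
            # trapezoidal sum whose upper limit grows with the current point t_n
            vals = np.array([K(t_n, ts[j], ys[j]) for j in range(n + 1)])
            integral = h * (0.5 * vals[0] + vals[1:-1].sum() + 0.5 * vals[-1])
        ys[n + 1] = ys[n] + h * (f(t_n, ys[n]) + integral)
    return ts, ys

# Test problem: y' = 1 + integral_0^t y(s) ds, y(0) = 0, exact solution y(t) = sinh(t)
ts, ys = solve_volterra_ide(lambda t, y: 1.0, lambda t, s, y: y, 0.0, 0.0, 2.0, 0.01)
print(ys[-1], np.sinh(ts[-1]))   # the two values should be close
```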

Solving the Teacher Assignment-Course Scheduling Problem by a Hybrid Algorithm

This paper presents a hybrid algorithm for solving a timetabling problem, which is commonly encountered in many universities. The problem combines the teacher assignment and course scheduling problems simultaneously, and is presented as a mathematical programming model. However, this problem becomes intractable, and it is unlikely that a proven optimal solution can be obtained by an integer programming approach, especially for large problem instances. A hybrid algorithm that combines an integer programming approach, a greedy heuristic and a modified simulated annealing algorithm is proposed to solve the problem. Several randomly generated data sets, of sizes comparable to those of an institution in Indonesia, are solved using the proposed algorithm. Computational results indicate that the algorithm can overcome the difficulties of large problem sizes encountered in previous related works.
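
A minimal sketch of the simulated annealing component is given below; the cost function, neighborhood move and cooling schedule are placeholders, not the paper's exact design.

```python
import math
import random

def simulated_annealing(initial, cost, neighbor, t0=100.0, cooling=0.95,
                        iters_per_temp=50, t_min=0.01):
    """Generic simulated annealing loop used here to improve a feasible timetable."""
    current, current_cost = initial, cost(initial)
    best, best_cost = current, current_cost
    t = t0
    while t > t_min:
        for _ in range(iters_per_temp):
            cand = neighbor(current)
            delta = cost(cand) - current_cost
            # accept improving moves always, worsening moves with Boltzmann probability
            if delta < 0 or random.random() < math.exp(-delta / t):
                current, current_cost = cand, current_cost + delta
                if current_cost < best_cost:
                    best, best_cost = current, current_cost
        t *= cooling   # geometric cooling schedule
    return best, best_cost

# Illustrative use: a timetable represented as {course: (teacher, timeslot)}, with a
# hypothetical `timetable_cost` counting clashes and preference violations, and a
# neighbor move that reassigns one randomly chosen course to another feasible pair.
```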

Molecular Dynamic Simulation and Receptor-based Pharmacophore Modeling on Human Renin for Discovery of Novel Inhibitors

Hypertension is characterized by stress on the heart and blood vessels, thus increasing the risk of heart attack and renal disease. The renin-angiotensin system (RAS) plays a major role in blood pressure control, and renin is the enzyme that controls the RAS at the rate-limiting step. Our aim is to develop new drug-like leads that can inhibit renin and thereby emerge as therapeutics for hypertension. To achieve this, molecular dynamics (MD) simulation and receptor-based pharmacophore modeling were employed, and three renin-inhibitor complex structures were selected based on IC50 values and the scaffolds of the inhibitors. Three pharmacophore models were generated, considering the conformations induced by the inhibitors. The compounds mapped to these models were selected and subjected to drug-like screening. The identified hits were docked into the active site of renin. Finally, hit1, which satisfied the binding mode and interaction energy criteria, was selected as a possible lead candidate for developing novel renin inhibitors.

Structure-vibration Analysis of a Power Transformer (154 kV/60 MVA/Single Phase)

The most common cause of power transformer failures is mechanical defects brought about by excessive vibration, which is composed of combinations of multiples of a 120 Hz fundamental frequency. In this paper, the types of mechanical exciting forces applied to the power transformer were classified, and the mechanical damage mechanism of the power transformer was identified from the vibration transfer route to the machine or structure. The general effects of 120 Hz vibration on the enclosure, bushing, Buchholz relay, pressure release valve and tap changer of the transformer were also examined.

Philosophy of Education: The Challenges of Globalization and Innovation in the Information Society

The information society is an entirely new social formation in which the infrastructure and social relations correspond to the socialized essence of mankind's «information genotype». The information society is a natural social environment that allows a person to fully realize his or her informational nature and to use intelligence to create, jointly with other people, new information on the basis of the knowledge accumulated by previous generations.

Using Genetic Algorithms in Closed Loop Identification of the Systems with Variable Structure Controller

This work presents a recursive identification algorithm for closed-loop systems with a variable structure controller. The suggested approach consists of two stages. In the first stage, a genetic algorithm is used to obtain the parameters of the switching function that give a control signal rich in commutations (i.e., a control signal whose spectral characteristics are as close as possible to those of a white noise signal). The second stage consists of identifying the system parameters by the instrumental variable method, using the optimal switching function parameters obtained with the genetic algorithm. In order to test the validity of this algorithm, a simulation example is presented.
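
The following sketch outlines a generic real-coded genetic algorithm of the kind that could be used in the first stage; the fitness function (scoring how "white" the resulting control signal is) is left as a user-supplied placeholder, and the selection, crossover and mutation operators are illustrative assumptions rather than the authors' exact choices.

```python
import numpy as np

def genetic_optimize(fitness, n_params, pop_size=40, n_gen=100, bounds=(-5.0, 5.0),
                     mut_sigma=0.2, elite=2, rng=np.random.default_rng(0)):
    """Simple real-coded GA: tournament selection, arithmetic crossover, Gaussian mutation."""
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, n_params))
    for _ in range(n_gen):
        scores = np.array([fitness(ind) for ind in pop])
        order = np.argsort(scores)[::-1]                  # maximize fitness
        new_pop = [pop[i].copy() for i in order[:elite]]  # elitism
        while len(new_pop) < pop_size:
            i = rng.integers(0, pop_size, 2)
            j = rng.integers(0, pop_size, 2)
            p1 = pop[i[np.argmax(scores[i])]]             # binary tournament winners
            p2 = pop[j[np.argmax(scores[j])]]
            alpha = rng.random()
            child = alpha * p1 + (1 - alpha) * p2         # arithmetic crossover
            child += rng.normal(0, mut_sigma, n_params)   # Gaussian mutation
            new_pop.append(np.clip(child, lo, hi))
        pop = np.array(new_pop)
    scores = np.array([fitness(ind) for ind in pop])
    return pop[np.argmax(scores)], scores.max()

# The fitness would run a closed-loop simulation with the candidate switching-function
# parameters and score the flatness of the control signal's spectrum (placeholder here).
```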

Maximizer of the Posterior Marginal Estimate for Noise Reduction of JPEG-compressed Image

We constructed a noise reduction method for JPEG-compressed images based on Bayesian inference using the maximizer of the posterior marginal (MPM) estimate. In this method, we applied the MPM estimate using two kinds of likelihood, both defined for grayscale images degraded by lossy JPEG compression: a deterministic model of the likelihood and a probabilistic one expressed by a Gaussian distribution. Then, using Monte Carlo simulation for grayscale images, such as the 256-grayscale standard image "Lena" with 256 × 256 pixels, we examined the performance of the MPM estimate using the mean square error as the performance measure. We clarified that the MPM estimate via the Gaussian probabilistic model of the likelihood is effective for reducing noise, such as blocking artifacts and mosquito noise, if the parameters are set appropriately. On the other hand, we found that the MPM estimate via the deterministic model of the likelihood is not effective for noise reduction due to the low acceptance ratio of the Metropolis algorithm.
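
As a simplified analogue (not the JPEG-specific likelihoods used in the paper), the sketch below runs a Metropolis sampler over pixel values with a Gaussian likelihood and a pairwise smoothness prior, and averages the samples per pixel to approximate a marginal posterior estimate; all parameter values and the energy model are assumptions.

```python
import numpy as np

def mpm_denoise(noisy, beta=2.0, sigma=20.0, sweeps=30, prop_sigma=8.0,
                rng=np.random.default_rng(1)):
    """Metropolis sampling from a posterior with Gaussian likelihood and pairwise
    smoothness prior; the per-pixel average over samples approximates a marginal
    posterior estimate of the clean image."""
    x = noisy.astype(float)
    acc = np.zeros_like(x)
    h, w = x.shape
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                old = x[i, j]
                new = old + rng.normal(0, prop_sigma)      # random-walk proposal
                nbrs = [x[i2, j2] for i2, j2 in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                        if 0 <= i2 < h and 0 <= j2 < w]
                def energy(v):
                    like = (v - noisy[i, j]) ** 2 / (2 * sigma ** 2)          # data term
                    prior = beta * sum((v - nb) ** 2 for nb in nbrs) / (2 * sigma ** 2)
                    return like + prior
                # Metropolis acceptance rule
                if rng.random() < np.exp(min(0.0, energy(old) - energy(new))):
                    x[i, j] = new
        acc += x
    return np.clip(acc / sweeps, 0, 255)
```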

Development of Web-based Teams Management System in Construction

Construction project control attempts to obtain real-time information and to effectively enhance dynamic control and management via information sharing and analysis among project participants, in order to eliminate construction conflicts and project delays. However, survey results for Taiwan indicate that commercial construction project management software is not widely accepted by subcontractors and suppliers. To solve the project communication problems among participants, this study presents a novel system called the Construction Dynamic Teams Communication Management (Con-DTCM) system for small-to-medium sized subcontractors and suppliers in the Taiwanese construction industry, and demonstrates that the Con-DTCM system delivers the most recent project information efficiently and enhances the management of project teams (general contractor, suppliers and subcontractors) through a web-based environment. Web-based technology effectively enhances information sharing during construction project management and generates cost savings via the Internet. The main unique characteristic of the proposed Con-DTCM system is that it is extremely user-friendly and easy to configure compared with current commercial project management applications. The Con-DTCM system is applied to a case study of a building construction project in Taiwan to confirm the proposed methodology and demonstrate the effectiveness of information sharing during the construction phase. The advantages of the Con-DTCM system lie in improving project control and management efficiency for general contractors, and in providing dynamic project tracking and management, which enables subcontractors and suppliers to acquire the most recent project-related information. Furthermore, this study presents and implements a generic system architecture.

Optimized Data Fusion in an Intelligent Integrated GPS/INS System Using Genetic Algorithm

Most integrated inertial navigation system (INS) and global positioning system (GPS) implementations have used the Kalman filtering technique, with its drawbacks related to the need for a predefined INS error model and the observability of at least four satellites. Most recently, a method using a hybrid adaptive network-based fuzzy inference system (ANFIS) has been proposed, which is trained while the GPS signal is available to map the error between the GPS and the INS; it is then used to predict the error of the INS position components during GPS signal blockage. This paper introduces a genetic optimization algorithm that updates the ANFIS parameters with the INS/GPS error function used as the objective function to be minimized. The results demonstrate the advantages of the genetically optimized ANFIS for INS/GPS integration in comparison with a conventional ANFIS, especially in the case of satellite outages. Coping with this problem plays an important role in the assessment of the fusion approach for land navigation.
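
The structure of the correction step can be sketched as follows; here an ordinary least-squares regressor stands in for the genetically tuned ANFIS, and all feature and variable names are hypothetical. While GPS is available the model is fitted to the GPS-INS difference; during an outage its prediction is added to the raw INS position.

```python
from dataclasses import dataclass

import numpy as np

@dataclass
class ErrorModel:
    """Hypothetical error predictor trained while GPS is available, mapping INS-derived
    features (e.g. velocity, heading, time since last fix) to the INS-GPS position error.
    Any regressor could stand in for the genetically tuned ANFIS of the abstract."""
    coeffs: np.ndarray

    def predict(self, features):
        return features @ self.coeffs

def fit_error_model(ins_features, gps_minus_ins_error):
    """Least-squares fit used here as a stand-in for ANFIS training."""
    coeffs, *_ = np.linalg.lstsq(ins_features, gps_minus_ins_error, rcond=None)
    return ErrorModel(coeffs)

def corrected_position(ins_position, model, features):
    """During a GPS outage, add the predicted error to the raw INS position."""
    return ins_position + model.predict(features)
```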

Discovery of Quantified Hierarchical Production Rules from Large Set of Discovered Rules

Automated rule discovery is, due to its applicability, one of the most fundamental and important methods in KDD, and it has been an active research area in the recent past. Hierarchical representation allows us to easily manage the complexity of knowledge, to view the knowledge at different levels of detail, and to focus our attention on the interesting aspects only. One such efficient and easy-to-understand system is the Hierarchical Production Rule (HPR) system. An HPR, a standard production rule augmented with generality and specificity information, is of the following form: Decision If <condition> Generality <generality information> Specificity <specificity information>. HPR systems are capable of handling the taxonomical structures inherent in knowledge about the real world. This paper focuses on the issue of mining quantified rules with a crisp hierarchical structure using a Genetic Programming (GP) approach to knowledge discovery. The post-processing scheme presented in this work uses quantified production rules as the initial individuals of GP and discovers a hierarchical structure. In the proposed approach, rules are quantified using Dempster-Shafer theory. Suitable genetic operators are proposed for the suggested encoding, and an appropriate fitness function is suggested based on the Subsumption Matrix (SM). Finally, Quantified Hierarchical Production Rules are generated from the discovered hierarchy using Dempster-Shafer theory. Experimental results are presented to demonstrate the performance of the proposed algorithm.
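
A minimal data-structure sketch of such a rule, with generality and specificity links and a belief value for quantification, might look as follows; the field names and the toy taxonomy are illustrative assumptions, not the paper's representation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class HPR:
    """Hierarchical production rule: a production rule augmented with links to more
    general and more specific rules, plus a belief value used for quantification
    (in the spirit of Dempster-Shafer theory)."""
    decision: str
    condition: str
    generality: List["HPR"] = field(default_factory=list)   # links to more general rules
    specificity: List["HPR"] = field(default_factory=list)  # links to more specific rules
    belief: float = 1.0                                      # quantification of the rule

# Illustrative taxonomy: "bird" is more general than "sparrow"
bird = HPR(decision="can_fly", condition="is_bird", belief=0.9)
sparrow = HPR(decision="can_fly", condition="is_sparrow", generality=[bird], belief=0.95)
bird.specificity.append(sparrow)
```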

Image Restoration in Non-Linear Filtering Domain Using MDB Approach

This paper proposes a new technique for image restoration based on a nonlinear Minmax Detector Based (MDB) filter. The aim of image enhancement is to reconstruct the true image from the corrupted image. The process of image acquisition frequently leads to degradation, and the quality of the digitized image becomes inferior to that of the original. Image degradation can be due to the addition of different types of noise to the original image. Image noise can be modeled in many ways, and impulse noise is one of them. Impulse noise generates pixels with gray values not consistent with their local neighborhood; it appears as a sprinkle of both light and dark, or only light, spots in the image. Filtering is a technique for enhancing the image. In linear filtering, the value of an output pixel is a linear combination of neighborhood values, which can blur the image. Thus a variety of non-linear smoothing techniques have been developed. The median filter is one of the most popular non-linear filters: it is highly effective for small neighborhoods, but for large windows and in the case of high noise it causes more blurring of the image. The Centre Weighted Mean (CWM) filter has a better average performance than the median filter; however, uncorrupted pixels may also be modified, and the substantial noise reduction achieved under high-noise conditions means that this technique, too, has a blurring effect on the image. To illustrate the superiority of the proposed approach, the new scheme has been simulated along with the standard ones and various restoration performance measures have been compared.
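
In the spirit of a detector-based filter, the sketch below replaces a pixel with the neighborhood median only when a simple min-max detector flags it as a likely impulse; this is an illustrative baseline, not the authors' exact MDB formulation, and the test image is synthetic.

```python
import numpy as np

def minmax_detector_filter(img, win=3):
    """Replace a pixel with the neighborhood median only when the detector flags it:
    here a pixel is flagged as a likely impulse if it equals the local min or max."""
    pad = win // 2
    padded = np.pad(img, pad, mode="edge")
    out = img.copy()
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            window = padded[i:i + win, j:j + win]
            if img[i, j] == window.min() or img[i, j] == window.max():
                out[i, j] = np.median(window)   # only flagged pixels are altered
    return out

# Example: salt-and-pepper corruption of a synthetic gradient image
img = np.tile(np.arange(64, dtype=float), (64, 1))
noisy = img.copy()
mask = np.random.default_rng(0).random(img.shape) < 0.1
noisy[mask] = np.random.default_rng(1).choice([0.0, 255.0], size=mask.sum())
restored = minmax_detector_filter(noisy)
```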

Diagnosis of the Abdominal Aorta Aneurysm in Magnetic Resonance Imaging Images

This paper presents a technique for diagnosis of the abdominal aorta aneurysm in magnetic resonance imaging (MRI) images. First, our technique segments the aorta in MRI images; this is a required step in determining the volume of the aorta, which is important for diagnosing an abdominal aorta aneurysm. Our proposed technique detects the volume of the aorta in MRI images using a new external energy for the snakes model, calculated from Laws' texture measures. The new external energy increases the capture range of the snakes model more efficiently than the old external energy of snakes models. Second, our technique diagnoses the abdominal aorta aneurysm with a Bayesian classifier, a classification model based on statistical theory. The features for classification of the abdominal aorta aneurysm are derived from the aorta contour obtained by segmentation with our snakes model, namely area, perimeter and compactness. We also compare the proposed technique with the traditional snakes model. In our experiments, 30 images are used for training and 20 images are tested and compared with expert opinion. The experimental results show that our technique achieves an accuracy of more than 95%.
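
The shape features mentioned above can be computed directly from the segmented contour, as the following sketch shows; it pairs them with a Gaussian naive Bayes classifier (the use of scikit-learn is an assumption here, and the training values and labels are placeholders, not the paper's data).

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

def contour_features(points):
    """Area (shoelace formula), perimeter, and compactness = perimeter^2 / (4*pi*area)
    for a closed contour given as an (N, 2) array of (x, y) points."""
    x, y = points[:, 0], points[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    closed = np.vstack([points, points[:1]])
    perimeter = np.sum(np.linalg.norm(np.diff(closed, axis=0), axis=1))
    return np.array([area, perimeter, perimeter ** 2 / (4 * np.pi * area)])

# Placeholder training data: rows of [area, perimeter, compactness] with labels
# 1 = aneurysm, 0 = normal (illustrative values only).
X_train = np.array([[300, 65, 1.12], [320, 68, 1.15], [900, 130, 1.49], [950, 140, 1.64]])
y_train = np.array([0, 0, 1, 1])
clf = GaussianNB().fit(X_train, y_train)

# Classify a new aorta cross-section from its segmented contour
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
contour = np.c_[18 * np.cos(theta), 15 * np.sin(theta)]   # an ellipse-like contour
print(clf.predict(contour_features(contour).reshape(1, -1)))
```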

Constraint Based Frequent Pattern Mining Technique for Solving GCS Problem

The Generalized Center String (GCS) problem generalizes the Common Approximate Substring and Common Substring problems. GCS is known to be NP-hard, and the difficulty lies in the explosion of potential candidates. In a given biological gene process it is not known in advance which sequences may not contain any motif, so the longest center string must be found without relying on that knowledge. GCS can be solved by frequent pattern-mining techniques and is known to be fixed-parameter tractable with respect to the input sequence length and symbol-set size. Efficient methods known as the Bpriori algorithms can solve GCS with reasonable time/space complexity; the Bpriori 2 and Bpriori 3-2 algorithms have been proposed to find center strings of any length together with the positions of all their instances in the input sequences. In this paper, we reduce the time/space complexity of the Bpriori algorithm with a Constraint Based Frequent Pattern mining (CBFP) technique, which integrates the ideas of constraint-based mining and FP-tree mining. The CBFP mining technique not only solves the GCS problem for center strings of any length but also reports the positions of all their mutated copies in the input sequences. It constructs a TRIE-like FP-tree to represent the mutated copies of center strings of any length, with constraints to restrain the growth of the consensus tree. Complexity analyses of the CBFP mining technique and the Bpriori algorithm are carried out for the worst case and the average case, and the correctness of the algorithm is demonstrated by comparison with the Bpriori algorithm on artificial data.