Shape Restoration of the Left Ventricle

This paper describes an automatic algorithm for restoring the shape of three-dimensional (3D) left ventricle (LV) models created from magnetic resonance imaging (MRI) data, using a geometry-driven optimization approach. Our basic premise is to restore the LV shape such that the LV epicardial surface is smooth after the restoration. A geometrical measure known as the minimum principal curvature (κ2) is used to assess the smoothness of the LV. This measure is used to construct the objective function of a two-step optimization process, whose objective is to achieve a smooth epicardial shape by iterative in-plane translation of the MRI slices. Quantitatively, this corresponds to minimizing the sum of the magnitudes of κ2 where κ2 is negative. A limited-memory quasi-Newton algorithm, L-BFGS-B, is used to solve the optimization problem. We tested our algorithm on an in vitro theoretical LV model and on 10 in vivo patient-specific models containing significant motion artifacts. The results show that our method is able to automatically restore the LV models to a smooth shape without altering the general shape of the model. The magnitudes of the in-plane translations are also consistent with existing registration techniques and experimental findings.
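To make the optimization concrete, the following toy sketch casts slice restoration as minimizing a smoothness objective over in-plane slice translations. It replaces the paper's κ2-based surface measure with second differences of the per-slice centers and the L-BFGS-B solver with a gradient-free coordinate search; all names and values are illustrative assumptions, not the paper's implementation.

```python
def smoothness_cost(offsets):
    """Discrete smoothness proxy: penalize second differences of the
    per-slice in-plane centers (a crude stand-in for integrating
    |kappa2| over the epicardial surface where kappa2 < 0)."""
    cost = 0.0
    for k in range(1, len(offsets) - 1):
        for d in (0, 1):  # x and y in-plane directions
            dd = offsets[k - 1][d] - 2 * offsets[k][d] + offsets[k + 1][d]
            cost += dd * dd
    return cost

def restore(offsets, step=0.5, iters=200):
    """Greedy coordinate search over in-plane slice translations.
    (The paper uses L-BFGS-B; this gradient-free search is only a
    toy substitute sharing the same objective structure.)"""
    offs = [list(o) for o in offsets]
    for _ in range(iters):
        improved = False
        for k in range(len(offs)):
            for d in (0, 1):
                base = smoothness_cost(offs)
                for delta in (step, -step):
                    offs[k][d] += delta
                    if smoothness_cost(offs) < base:
                        improved = True
                        break
                    offs[k][d] -= delta  # revert a non-improving move
        if not improved:
            step *= 0.5
            if step < 1e-4:
                break
    return offs

# Misaligned slice centers simulating breath-hold motion artifacts
slices = [(0.0, 0.0), (3.0, -2.0), (-1.5, 2.5), (0.5, 0.0), (0.0, 0.0)]
restored = restore(slices)
```

After the search, the translated slice centers form a noticeably smoother stack than the input, mirroring the paper's restore-to-smoothness premise.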

A Block World Problem Based Sudoku Solver

There are many approaches proposed for solving Sudoku puzzles. One of them is to model the puzzles as block world problems. Three models of Sudoku solvers based on this approach have been proposed, each expressing the Sudoku solver as a parameterized multi-agent system. In this work, we propose a new model that improves on the existing models. This paper presents the development of a Sudoku solver that implements all of the proposed models, and experiments have been conducted to determine the performance of each model.
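For comparison with the multi-agent block-world models, the usual baseline is a plain backtracking solver (this is not one of the paper's models, just the conventional reference approach):

```python
def solve(board):
    """Classic backtracking Sudoku solver. board: 9x9 list of lists,
    0 = empty cell. Solves in place; returns True on success."""
    def ok(r, c, v):
        # value must not repeat in the row, column, or 3x3 block
        if v in board[r]:
            return False
        if any(board[i][c] == v for i in range(9)):
            return False
        br, bc = 3 * (r // 3), 3 * (c // 3)
        return all(board[br + i][bc + j] != v
                   for i in range(3) for j in range(3))

    for r in range(9):
        for c in range(9):
            if board[r][c] == 0:
                for v in range(1, 10):
                    if ok(r, c, v):
                        board[r][c] = v
                        if solve(board):
                            return True
                        board[r][c] = 0  # undo and backtrack
                return False
    return True  # no empty cell left

puzzle = [
    [5, 3, 0, 0, 7, 0, 0, 0, 0],
    [6, 0, 0, 1, 9, 5, 0, 0, 0],
    [0, 9, 8, 0, 0, 0, 0, 6, 0],
    [8, 0, 0, 0, 6, 0, 0, 0, 3],
    [4, 0, 0, 8, 0, 3, 0, 0, 1],
    [7, 0, 0, 0, 2, 0, 0, 0, 6],
    [0, 6, 0, 0, 0, 0, 2, 8, 0],
    [0, 0, 0, 4, 1, 9, 0, 0, 5],
    [0, 0, 0, 0, 8, 0, 0, 7, 9],
]
solved = solve(puzzle)
```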

Fuzzy Logic PID Control of Automatic Voltage Regulator System

The application of a simple microcontroller to a fuzzy logic controller with three input variables and a single output, with built-in Proportional-Integral-Derivative (PID) response control, has been tested on an automatic voltage regulator. The fuzzifiers are based on a fixed range of the output voltage variables. The control output drives the wiper motor of the autotransformer to adjust the voltage, using fuzzy logic principles, so that the voltage is stabilized. In this report, the author demonstrates how fuzzy logic can provide elegant and efficient solutions in the design of multivariable control based on experimental results rather than on mathematical models.

Optimizing Usage of ICTs and Outsourcing Strategy in Business Models and Customer Satisfaction

Nowadays, developing countries seek to advance in science and technology and to narrow the technological gap with developed countries by building capacity and by transferring technology from developed countries. To remain competitive, industry continually searches for new methods to evolve its products. The business model is one of the latest buzzwords in the Internet and electronic business world. To be successful, organizations must look into the needs and wants of their customers. This research attempts to identify a specific feature of a company with a strong competitive advantage by analyzing the causes of customer satisfaction. Due to the rapid development of knowledge and information technology, business environments have become much more complicated, and information technology can help a firm aiming to gain a competitive advantage. This study explores the role and effect of information and communication technologies (ICTs) in business models and customer satisfaction in firms, as well as the relationship between ICTs and outsourcing strategy.

Reversible, Embedded and Highly Scalable Image Compression System

In this work a new method for low-complexity image coding is presented that permits different settings and great scalability in the generation of the final bit stream. The method is a continuous-tone still image compression system that combines lossy and lossless compression by making use of finite-arithmetic reversible transforms. Both the color-space transformation and the wavelet transformation are reversible. The transformed coefficients are coded by means of a coding system based on a subdivision into smaller components (CFDS), similar to codification by bit significance. The subcomponents so obtained are reordered by a highly configurable, application-dependent alignment system that makes it possible to reconfigure the elements of the image and to obtain different importance levels from which the bit stream will be generated. The subcomponents of each importance level are coded using a variable-length entropy coding system (VBLm) that permits the generation of an embedded bit stream. This bit stream by itself already encodes a compressed still image. Moreover, the use of a packing system on the bit stream after the VBLm stage allows a final, highly scalable bit stream to be built from a basic image level and one or several improvement levels.
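The reversible color-space transform mentioned above can be illustrated with the integer reversible color transform used in JPEG 2000 (the paper's exact transform may differ). It is lossless because the integer inverse recovers the original samples exactly:

```python
def rct_forward(r, g, b):
    """JPEG 2000-style reversible color transform (integer, lossless):
    luma approximation plus two chroma differences."""
    y = (r + 2 * g + b) >> 2   # floor((R + 2G + B) / 4)
    cb = b - g
    cr = r - g
    return y, cb, cr

def rct_inverse(y, cb, cr):
    """Exact integer inverse of rct_forward: no rounding loss."""
    g = y - ((cb + cr) >> 2)
    r = cr + g
    b = cb + g
    return r, g, b
```

Round-tripping any integer triple through the pair returns it unchanged, which is what makes a lossy/lossless combined pipeline possible on the same transform.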

COTT – A Testability Framework for Object-Oriented Software Testing

Testable software has two inherent properties: observability and controllability. Observability facilitates observation of the internal behavior of software to the required degree of detail. Controllability allows the creation of difficult-to-achieve states prior to the execution of various tests. In this paper, we describe COTT, a Controllability and Observability Testing Tool, to create testable object-oriented software. COTT provides a framework that helps the user instrument object-oriented software to build the required controllability and observability. During testing, the tool facilitates the creation of difficult-to-achieve states required for testing difficult-to-test conditions, and the observation of internal details of execution at the unit, integration and system levels. The execution observations are logged in a test log file, which is used for post-test analysis and for generating test coverage reports.
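Both properties can be sketched in miniature: a logging wrapper for observability and a state-setting hook for controllability. This Python decorator is our own illustrative analogue of such instrumentation, not COTT's actual API:

```python
import functools

TEST_LOG = []  # stand-in for COTT's test log file

def observe(fn):
    """Observability instrument: record entry and exit (with the return
    value) of a method in the test log for post-test analysis."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        TEST_LOG.append(("enter", fn.__name__))
        result = fn(*args, **kwargs)
        TEST_LOG.append(("exit", fn.__name__, result))
        return result
    return wrapper

class Account:
    """Toy class under test; force_state is a controllability hook that
    lets a test create a hard-to-reach state (an overdrawn balance)
    directly, without replaying the scenario that would cause it."""
    def __init__(self, balance=0):
        self.balance = balance

    def force_state(self, balance):
        self.balance = balance

    @observe
    def deposit(self, amount):
        self.balance += amount
        return self.balance

acct = Account()
acct.force_state(-50)   # controllability: set a difficult state first
acct.deposit(60)        # observability: call is logged with its result
```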

Density, Strength, Thermal Conductivity and Leachate Characteristics of Light-Weight Fired Clay Bricks Incorporating Cigarette Butts

Several trillion cigarettes produced worldwide annually lead to many thousands of kilograms of toxic waste. Cigarette butts (CBs) accumulate in the environment due to the poor biodegradability of their cellulose acetate filters. This paper presents some of the results from a continuing study on recycling CBs into fired clay bricks. The physico-mechanical properties of fired clay bricks manufactured with different percentages of CBs are reported and discussed. The results show that the density of the fired bricks was reduced by up to 30%, depending on the percentage of CBs incorporated into the raw materials. Similarly, the compressive strength of the bricks tested decreased according to the percentage of CBs included in the mix. The thermal conductivity performance of the bricks improved by 51% and 58% for 5% and 10% CB content, respectively. Leaching tests were carried out to investigate the levels of possible leachates of heavy metals from the manufactured clay-CB bricks. The results revealed only trace amounts of heavy metals.

Nonlinear Modeling and Analysis of AAC-Infilled Sandwich Panels for Out-of-Plane Loads

Sandwich panels are widely used in the construction industry for their ease of assembly, light weight and efficient thermal performance. They are composed of two thin reinforced concrete (RC) outer layers separated by an insulating inner layer. In this research, the inner insulating layer is made of lightweight Autoclaved Aerated Concrete (AAC) blocks, which have good thermal insulation properties and yet possess reasonable mechanical strength. The shear strength of the AAC infill is relied upon to replace the traditionally used insulating foam and to provide the shear capacity of the panel. A comprehensive experimental program was conducted on full-scale sandwich panels subjected to bending. In this paper, detailed numerical modeling of the tested sandwich panels is reported. Nonlinear 3-D finite element models of the composite action of the sandwich panel are developed using ANSYS. Solid elements with different crushing and cracking capabilities and different constitutive laws were selected for the concrete and the AAC. Contact interface elements are used in this research to adequately model the shear transfer at the interface between the different layers. The numerical results showed good correlation with the experimental ones, indicating the adequacy of the model in estimating the loading capacity of the panels.

Curvature Ductility Factor of Rectangular Sections of Reinforced Concrete Beams

The present work presents a method for calculating the ductility of rectangular sections of beams considering the nonlinear behavior of concrete and steel. This calculation procedure allows us to trace the curvature of the section as a function of the bending moment, and consequently to deduce the ductility. It also allows us to study the various parameters that affect the value of the ductility. A comparison was made of the effect of the maximum tension steel ratios adopted by the ACI [1], EC8 [2] and RPA [3] codes on the value of the ductility. It was concluded that the maximum steel ratios permitted by the ACI [1] and RPA [3] codes are almost similar in their effect on the ductility and are too high; the ductility mobilized in the event of an earthquake is therefore low, unlike with the EC8 [2] code. Recommendations have been made in this direction.
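For reference, the curvature ductility factor discussed above is conventionally defined from the moment-curvature response of the section (standard notation, not taken from the paper):

```latex
\mu_\varphi = \frac{\varphi_u}{\varphi_y},
\qquad
\varphi_y = \frac{\varepsilon_{sy}}{d - c_y},
\qquad
\varphi_u = \frac{\varepsilon_{cu}}{c_u}
```

where \(\varphi_y\) is the curvature at first yield of the tension steel (yield strain \(\varepsilon_{sy}\), effective depth \(d\), neutral-axis depth \(c_y\)) and \(\varphi_u\) is the curvature when the extreme concrete fiber reaches its ultimate strain \(\varepsilon_{cu}\) with neutral-axis depth \(c_u\). A higher tension steel ratio deepens the neutral axis at ultimate, reducing \(\varphi_u\) and hence the ductility, which is the mechanism behind the comparison in the abstract.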

Analytical Model Based Evaluation of Human Machine Interfaces Using Cognitive Modeling

Cognitive models allow predicting some aspects of the utility and usability of human machine interfaces (HMI), and simulating the interaction with these interfaces. The prediction is based on a task analysis, which investigates what a user is required to do, in terms of actions and cognitive processes, to achieve a task. Task analysis facilitates the understanding of the system's functionalities. Cognitive models belong to the analytical approaches, which do not involve users during the development process of the interface. This article presents a study on the evaluation of human machine interaction with a contextual assistant's interface using the ACT-R and GOMS cognitive models. The present work shows how these techniques may be applied in HMI evaluation, design and research by emphasizing, firstly, the task analysis and, secondly, the execution time of the task. In order to validate and support our results, an experimental study of user performance was conducted at the DOMUS laboratory during interaction with the contextual assistant's interface. The results of our models show that the GOMS and ACT-R models give good and excellent predictions, respectively, of users' performance at the task level as well as the object level. The simulated results are therefore very close to the results obtained in the experimental study.
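At the task level, GOMS-style execution-time prediction reduces to summing standard operator times, as in the Keystroke-Level Model. The operator values below are the commonly cited textbook defaults, not the paper's measurements:

```python
# Keystroke-Level Model operator times in seconds (textbook defaults):
# K = keystroke, P = point with mouse, H = home hands on device,
# M = mental preparation, B = mouse button press
KLM = {"K": 0.28, "P": 1.1, "H": 0.4, "M": 1.35, "B": 0.1}

def klm_time(operators):
    """Predict task execution time as the sum of KLM operator times.
    operators: a string of operator codes, e.g. 'HPBMKKKK'."""
    return sum(KLM[op] for op in operators)

# Hypothetical task: home on mouse, point at a field, click,
# mentally prepare, then type four characters.
t = klm_time("HPBMKKKK")
```

The same additive structure underlies the task-level GOMS predictions compared against the DOMUS measurements; ACT-R predictions come from simulating the cognitive architecture rather than from a fixed operator table.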

An OWL Ontology for CommonKADS Template Knowledge Models

This paper gives an overview of how an OWL ontology has been created to represent the template knowledge models, defined in CML, that are provided by CommonKADS. CommonKADS is a mature knowledge engineering methodology which proposes the use of template knowledge models for knowledge modelling. The aim of developing this ontology is to present the template knowledge models in a knowledge representation language that can be easily understood and shared in the knowledge engineering community. Hence OWL is used, as it has become a standard for ontologies and already has user-friendly tools for viewing and editing.

A Framework for a Product Development Process Including HW and SW Components

This paper proposes a framework for a product development process that includes hardware and software components. It provides for the separation of hardware-dependent software, modifications of the current product development process, and integration of software modules with existing product configuration models and assembly product structures. In order to identify the dependent software, the framework considers product configuration modules and engineering changes of the associated software and hardware components. In order to support efficient integration of the two different development streams, hardware and software, a modified product development process is proposed. The process integrates the dependent software development into product development through exchanges of specific product information. By using existing product data models in Product Data Management (PDM), the framework represents software as modules for product configurations and as software parts for the product structure. The framework is applied to the development of a robot system in order to show its effectiveness.

Sovereign Credit Risk Measures

This paper focuses on sovereign credit risk, a hot topic related to the current Eurozone crisis. In the light of the recent financial crisis, market perception of the creditworthiness of individual sovereigns has changed significantly. Before the outbreak of the financial crisis, market participants did not differentiate between the credit risk borne by individual states, despite different levels of public indebtedness. As the financial crisis proceeded, market participants became aware of the worsening fiscal situation in European countries and started to discriminate among government issuers. Concerns about the increasing sovereign risk were reflected in a surging sovereign risk premium. The main aim of this paper is to shed light on the characteristics of sovereign risk, with special attention paid to the mutual relation between the credit spread and the CDS premium as the main measures of the sovereign risk premium.
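A common way to relate the two measures mentioned above is the CDS-bond basis (generic notation of ours, not the paper's):

```latex
\text{basis}_t = s^{\mathrm{CDS}}_t - \underbrace{\left( y_t - r_t \right)}_{\text{credit spread}}
```

where \(s^{\mathrm{CDS}}_t\) is the CDS premium, \(y_t\) the sovereign bond yield and \(r_t\) a risk-free benchmark yield. In a frictionless market the basis would be near zero; a persistently nonzero basis signals that the two measures of the sovereign risk premium diverge, which is exactly the mutual relation the paper examines.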

Development of Mathematical Model for Overall Oxygen Transfer Coefficient of an Aerator and Comparison with CFD Modeling

The value of the overall oxygen transfer coefficient (KLa), which is the best measure of oxygen transfer into water through aeration, is obtained by a simple approach, which sufficiently explains the utility of the method in eliminating the discrepancies due to inaccurate assumptions about the saturation dissolved oxygen concentration. The rate of oxygen transfer depends on a number of factors, such as the intensity of turbulence, which in turn depends on the speed of rotation, the size and number of blades, the diameter and immersion depth of the rotor, and the size and shape of the aeration tank, as well as on the physical, chemical, and biological characteristics of the water. An attempt is made in this paper to correlate the overall oxygen transfer coefficient (KLa), as the dependent parameter, with the other influencing parameters mentioned above. It has been estimated that the simulation equation developed predicts the values of KLa and power with average standard errors of estimation of 0.0164 and 7.66, respectively, and with R2 values of 0.979 and 0.989, respectively, when compared with experimentally determined values. This model was compared with a model generated using computational fluid dynamics (CFD), and the two models were found to be in good agreement with each other.
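To make KLa concrete: the textbook way to estimate it from a re-aeration test is a linear fit of ln(Cs − C(t)) against time, since C(t) = Cs − (Cs − C0)·exp(−KLa·t). This sketch (with synthetic data and assumed values) illustrates the quantity only and is not the paper's simulation equation:

```python
import math

def fit_kla(times, do_conc, c_sat):
    """Estimate KLa from a re-aeration test by ordinary least squares
    on the linearized two-film model:
        ln(Cs - C(t)) = ln(Cs - C0) - KLa * t
    times: sampling times, do_conc: dissolved-oxygen readings,
    c_sat: saturation DO concentration."""
    ys = [math.log(c_sat - c) for c in do_conc]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(times, ys))
             / sum((x - mx) ** 2 for x in times))
    return -slope  # KLa is minus the slope of the fitted line

# Synthetic test data: Cs = 9.1 mg/L, C0 = 2.0 mg/L, true KLa = 0.25 /min
cs, c0, kla_true = 9.1, 2.0, 0.25
t = [i * 2.0 for i in range(10)]
c = [cs - (cs - c0) * math.exp(-kla_true * ti) for ti in t]
kla_hat = fit_kla(t, c, cs)
```

Because the synthetic readings follow the model exactly, the fit recovers the assumed KLa; with real data the residuals of this fit are where errors in the assumed saturation concentration show up, which is the discrepancy the paper's approach aims to eliminate.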

Effect of Physical Contact (Hand-Holding) on Heart Rate Variability

The heart's electric field can be measured anywhere on the surface of the body (ECG). When individuals touch, one person's ECG signal can be registered in the other person's EEG and elsewhere on their body. The aim of this study was to test the hypothesis that physical contact (hand-holding) between two persons changes their heart rate variability. The subjects were sixteen healthy females (age: 20-26), divided into eight pairs. Each pair consisted of two friends who had passed the intimacy test of J. Sternberg. The ECG of the two subjects in each pair was acquired for 5 minutes before hand-holding (control condition) and for 5 minutes while they held hands (experimental condition). Heart rate variability signals were then extracted from the subjects' ECG and analyzed in a linear feature space (time and frequency domain) and a nonlinear feature space. Considering the results, we conclude that physical contact (hand-holding between two friends) increases parasympathetic activity, as indicated by increases in SD1, SD1/SD2, HF and MF power (p
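The SD1/SD2 descriptors mentioned above are the standard Poincaré-plot measures of heart rate variability; the sketch below uses those textbook formulas with made-up example RR intervals (not study data):

```python
import math
import statistics

def poincare_sd1_sd2(rr):
    """Poincare-plot descriptors from an RR-interval series (ms),
    using the standard relations:
        SD1^2 = var(successive differences) / 2   (short-term HRV,
                the parasympathetic-linked measure)
        SD2^2 = 2 * SDRR^2 - SD1^2                (long-term HRV)"""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    sd1 = math.sqrt(statistics.pvariance(diffs) / 2.0)
    sd2 = math.sqrt(max(2.0 * statistics.pvariance(rr) - sd1 ** 2, 0.0))
    return sd1, sd2

# Illustrative RR intervals in milliseconds
rr = [800, 810, 790, 805, 795, 815, 800]
sd1, sd2 = poincare_sd1_sd2(rr)
```

An increase in SD1 (and the SD1/SD2 ratio) between the control and hand-holding recordings is what the abstract reads as increased parasympathetic activity.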

Partial Oxidation of Methane in the Pulsed Compression Reactor: Experiments and Simulation

The Pulsed Compression Reactor promises to be a compact, economical and energy-efficient alternative to conventional chemical reactors. In this article, the production of synthesis gas using the Pulsed Compression Reactor is investigated, both experimentally and with simulations. The experiments are performed in a single-shot reactor, which replicates a representative single reciprocation of the Pulsed Compression Reactor with great control over the reactant composition, the reactor temperature and pressure, and the temperature history. Simulations are done with a relatively simple method, which uses different models for the chemistry and the thermodynamic properties of the species in the reactor. The simulation results show very good agreement with the experimental data, and give great insight into the reaction processes that occur within the cycle.
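As a back-of-envelope for why a compression cycle can drive the chemistry: under ideal adiabatic compression the gas temperature rises as T2 = T1·CR^(γ−1). The heat-capacity ratio below is an assumed effective value for a methane/oxygen mixture, not taken from the paper:

```python
def adiabatic_temperature(t1_k, compression_ratio, gamma=1.35):
    """Ideal adiabatic compression temperature, T2 = T1 * CR**(gamma-1).
    t1_k: initial temperature (K); compression_ratio: V1/V2;
    gamma: assumed effective heat-capacity ratio of the mixture."""
    return t1_k * compression_ratio ** (gamma - 1.0)

# From room temperature, a compression ratio of 30 already reaches
# temperatures where partial-oxidation chemistry becomes fast.
t2 = adiabatic_temperature(300.0, 30.0)
```

This is only a first-order estimate; the reactor's real temperature history, which the single-shot experiments control and the simulations model in detail, also reflects heat losses and reaction heat release.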

Increasing Convergence Rate of a Fractionally-Spaced Channel Equalizer

In this paper, a technique for increasing the convergence rate of a fractionally spaced channel equalizer is proposed. Instead of symbol-spaced updating of the equalizer filter, a mechanism has been devised to update the filter at a higher rate. This ensures convergence of the equalizer filter at a higher rate and is therefore less time-consuming. The proposed technique has been simulated and tested for two-ray channel models with various delay spreads, including minimum-phase and nonminimum-phase channels. Simulation results suggest that the proposed technique outperforms the conventional technique of symbol-spaced updating of the equalizer filter.
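For reference, the conventional symbol-spaced LMS update that the proposed scheme improves upon can be sketched as follows; this is a toy real-valued simulation with an assumed minimum-phase two-ray channel, not the paper's setup:

```python
import random

def lms_equalizer(x, d, n_taps=9, mu=0.02):
    """Baseline LMS equalizer with one update per training symbol
    (the conventional symbol-spaced scheme).  x: received samples,
    d: training symbols.  Returns the per-iteration squared error."""
    w = [0.0] * n_taps      # equalizer tap weights
    buf = [0.0] * n_taps    # delay line of recent received samples
    errs = []
    for xi, di in zip(x, d):
        buf = [xi] + buf[:-1]                      # shift in new sample
        y = sum(wi * bi for wi, bi in zip(w, buf))  # equalizer output
        e = di - y                                  # training error
        w = [wi + mu * e * bi for wi, bi in zip(w, buf)]  # LMS update
        errs.append(e * e)
    return errs

# Two-ray channel h = [1.0, 0.4] acting on random +/-1 training symbols
random.seed(0)
sym = [random.choice((-1.0, 1.0)) for _ in range(2000)]
rx = [sym[k] + 0.4 * (sym[k - 1] if k else 0.0) for k in range(len(sym))]
errs = lms_equalizer(rx, sym)
```

The squared error decays as the taps adapt; the paper's contribution is to perform such updates more often than once per symbol in a fractionally spaced (sub-symbol-sampled) filter, which this baseline does not do.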

Energy-Efficient Electrical Power Distribution with Multi-Agent Control at Parallel DC/DC Converters

Consumer electronics are pervasive. It is impossible to imagine a household or office without DVD players, digital cameras, printers, mobile phones, shavers, electrical toothbrushes, etc. All these devices operate at different voltage levels, ranging from 1.8 to 20 VDC, in the absence of universal standards. The supply voltages available, however, are usually 120/230 VAC at 50/60 Hz. This situation makes an individual electrical energy conversion system necessary for each device. Such converters usually involve several conversion stages and often operate with excessive losses and poor reliability. The aim of the project presented in this paper is to design and implement a multi-channel DC/DC converter system that customizes the output voltage and current ratings according to the requirements of the load. Distributed multi-agent techniques will be applied for the control of the DC/DC converters.
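As a first-order sizing aid for such per-load channels, an ideal buck (step-down) stage in continuous conduction needs a duty cycle of D = Vout/Vin. The bus voltage and load voltages below are example assumptions, not values from the project:

```python
def buck_duty_cycle(v_in, v_out):
    """Duty cycle of an ideal buck converter in continuous conduction:
    D = Vout / Vin.  Losses, ripple and dropout are ignored, so real
    controllers regulate around this value rather than fixing it."""
    if not 0.0 < v_out <= v_in:
        raise ValueError("a buck stage requires 0 < Vout <= Vin")
    return v_out / v_in

# Example channels fed from an assumed 24 VDC distribution bus
duties = {v: buck_duty_cycle(24.0, v) for v in (5.0, 3.3, 12.0)}
```

In the multi-agent scheme, each converter's controller would regulate its own channel around such an operating point while coordinating with the others over the shared bus.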

Region-Based Image Fusion with Artificial Neural Network

Most image fusion algorithms separate the image into individual pixels and treat them more or less independently. In addition, their parameters have to be adjusted for different times or weather conditions. In this paper, we propose a region-based image fusion method, which combines aspects of feature-level and pixel-level fusion, to replace purely pixel-based fusion. The basic idea is to segment only the far-infrared image and to add the information of each region from the segmented image to the visual image respectively. We then determine different fusion parameters for each region. Finally, we adopt an artificial neural network to deal with changes in time or weather, because the relationship between the fusion parameters and the image features is nonlinear. This allows the fusion parameters to be produced automatically for different conditions. The experimental results show that the proposed method indeed has good adaptive capacity, with automatically determined fusion parameters, and that the architecture can be used for many applications.
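A stripped-down sketch of the region-based idea: threshold-segment the infrared image and apply a different fusion weight per region. The fixed weights and threshold here stand in for the ANN-produced parameters of the paper; all values are illustrative:

```python
def segment_ir(ir, threshold):
    """Label each pixel 'hot' (1) or 'background' (0) by thresholding
    the far-infrared image -- a deliberately crude stand-in for the
    paper's segmentation stage."""
    return [[1 if p >= threshold else 0 for p in row] for row in ir]

def region_fuse(visual, ir, threshold=128, w_hot=0.7, w_bg=0.2):
    """Region-based fusion: blend IR into the visual image with a
    per-region weight, so hot regions draw mostly on the IR channel
    and background regions stay close to the visual image."""
    labels = segment_ir(ir, threshold)
    fused = []
    for vrow, irow, lrow in zip(visual, ir, labels):
        fused.append([
            (1.0 - (w_hot if lab else w_bg)) * v + (w_hot if lab else w_bg) * t
            for v, t, lab in zip(vrow, irow, lrow)
        ])
    return fused

visual = [[100, 100], [100, 100]]   # flat visual patch
ir = [[200, 50], [50, 200]]         # two hot IR pixels on the diagonal
out = region_fuse(visual, ir)
```

In the full method, the per-region weights would come from the ANN as a nonlinear function of image features, so the same pipeline adapts across times and weather conditions.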

Production Throughput Modeling under Five Uncertain Variables Using Bayesian Inference

Throughput is an important measure of the performance of a production system. Analyzing and modeling production throughput is complex in today's dynamic production systems due to the uncertainties of the production system. The main reasons are that uncertainties materialize when the production line faces changes in setup time, machinery breakdown, manufacturing lead time, and scrap. Besides, demand fluctuates from time to time for each product type. These uncertainties affect production performance. This paper proposes Bayesian inference for throughput modeling under five production uncertainties. The Bayesian model uses prior distributions that capture previous information about the uncertainties, while likelihood distributions are associated with the observed data. The Gibbs sampling algorithm, a robust Markov chain Monte Carlo procedure, was employed for sampling the unknown parameters and estimating the posterior means of the uncertainties. The Bayesian model was validated with respect to the convergence and efficiency of its outputs. The results show that the proposed Bayesian models were capable of predicting the production throughput with an accuracy of 98.3%.
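A one-variable analogue of the sampling step: a Gibbs sampler that alternates the full conditionals of the mean and precision of a normal model, just as each production uncertainty would be drawn in turn from its full conditional. The priors (flat on the mean, Jeffreys on the precision) and the data are generic illustrations, not the paper's five-uncertainty model:

```python
import math
import random

def gibbs_normal(data, n_iter=3000, burn=500):
    """Gibbs sampler for Normal(mu, 1/tau) data with a flat prior on mu
    and a Jeffreys prior on the precision tau.  Alternates the two full
    conditionals and returns the posterior mean of mu after burn-in."""
    n = len(data)
    mean_d = sum(data) / n
    mu, tau = 0.0, 1.0          # initial state; tau = 1 / sigma^2
    mus = []
    for it in range(n_iter):
        # mu | tau, data ~ Normal(mean_d, 1/(n*tau))
        mu = random.gauss(mean_d, 1.0 / math.sqrt(n * tau))
        # tau | mu, data ~ Gamma(shape=n/2, rate=sum((x-mu)^2)/2)
        rate = sum((x - mu) ** 2 for x in data) / 2.0
        tau = random.gammavariate(n / 2.0, 1.0 / rate)  # scale = 1/rate
        if it >= burn:
            mus.append(mu)
    return sum(mus) / len(mus)

# Synthetic 'throughput observations' (illustrative, units arbitrary)
random.seed(1)
data = [random.gauss(50.0, 4.0) for _ in range(200)]
post_mean = gibbs_normal(data)
```

The paper's model does this jointly for five uncertainty parameters, with priors encoding previous production knowledge, and then checks the chains for convergence before reading off posterior means.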