Constitutive Equations for Human Saphenous Vein Coronary Artery Bypass Graft

Coronary artery bypass grafts (CABG) are widely studied with respect to the hemodynamic conditions, which play an important role in the development of restenosis. However, papers concerned with the constitutive modeling of CABG are lacking in the literature. The purpose of this study is to find a constitutive model for CABG tissue. A sample of the CABG obtained during an autopsy underwent an inflation-extension test. Displacements were recorded by CCD cameras and subsequently evaluated by digital image correlation. Pressure-radius and axial force-elongation data were used to fit the material model. The tissue was modeled as a one-layered composite reinforced by two families of helical fibers. The material is assumed to be locally orthotropic, nonlinear, incompressible and hyperelastic. Material parameters are estimated for two strain energy functions (SEF). The first is a classical exponential SEF. The second is a logarithmic SEF, which allows interpretation in terms of limiting (finite) fiber extensibility. The presented material parameters are estimated by optimization based on the radial and axial equilibrium equations of a thick-walled tube. Both material models fit the experimental data successfully. The exponential model fits the relationship between axial force and axial strain significantly better than the logarithmic one.
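
As an illustration only (the specific SEFs fitted in the paper are not reproduced here), a widely used exponential form for an incompressible matrix reinforced by two fiber families is the Holzapfel-type SEF

\[ \Psi = \frac{c}{2}\,(I_1 - 3) + \frac{k_1}{2k_2}\sum_{i=4,6}\Big\{\exp\!\big[k_2\,(I_i-1)^2\big]-1\Big\}, \]

while a logarithmic form with limiting fiber extensibility, in the spirit of Gent and Horgan-Saccomandi models, may be written as

\[ \Psi = \frac{c}{2}\,(I_1-3) - \frac{k_1}{2k_2}\sum_{i=4,6}\ln\!\big[1 - k_2\,(I_i - 1)^2\big], \]

where \(I_1\) is the first invariant of the right Cauchy-Green tensor, \(I_4\) and \(I_6\) are the squared stretches along the two fiber families, and \(c\), \(k_1\), \(k_2\) are material parameters. The logarithmic form requires \(k_2\,(I_i-1)^2 < 1\), which is exactly the limiting (finite) extensibility interpretation mentioned above.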

Parallel Branch and Bound Model Using Logarithmic Sampling (PBLS) for Symmetric Traveling Salesman Problem

Very large and/or computationally complex optimization problems sometimes require parallel or high-performance computing to achieve reasonable computation times. One of the most popular and most complicated problems of this family is the Traveling Salesman Problem. In this paper we introduce a Branch & Bound based algorithm for the solution of such complicated problems; the main focus of the algorithm is the symmetric traveling salesman problem. We review some of the already available algorithms and identify the need for a new algorithm that gives an optimal or near-optimal solution. Using logarithmic sampling, the proposed algorithm was found to produce near-optimal solutions and to deliver excellent performance compared with the traditional algorithms of this family.
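
For readers unfamiliar with the baseline technique, the following minimal Python sketch shows a generic depth-first branch-and-bound skeleton for the symmetric TSP. It uses a simple cheapest-incident-edge lower bound; it does not reproduce the paper's logarithmic sampling or its parallel execution model.

```python
import math

def tsp_branch_and_bound(dist):
    """Depth-first branch and bound for the symmetric TSP.
    dist: symmetric matrix (list of lists) of pairwise distances."""
    n = len(dist)
    best_cost = math.inf
    best_tour = None

    # Cheapest edge incident to each city, used for a simple admissible lower bound.
    min_edge = [min(d for j, d in enumerate(row) if j != i)
                for i, row in enumerate(dist)]

    def bound(cost_so_far, remaining):
        # Optimistic estimate: current cost plus the cheapest edge of every unvisited city.
        return cost_so_far + sum(min_edge[c] for c in remaining)

    def dfs(tour, cost_so_far, remaining):
        nonlocal best_cost, best_tour
        if not remaining:
            total = cost_so_far + dist[tour[-1]][tour[0]]  # close the tour
            if total < best_cost:
                best_cost, best_tour = total, tour[:]
            return
        if bound(cost_so_far, remaining) >= best_cost:
            return  # prune: this branch cannot beat the incumbent
        last = tour[-1]
        # Expand nearest cities first so good incumbents are found early.
        for city in sorted(remaining, key=lambda c: dist[last][c]):
            dfs(tour + [city], cost_so_far + dist[last][city], remaining - {city})

    dfs([0], 0.0, set(range(1, n)))
    return best_tour, best_cost

if __name__ == "__main__":
    # Small symmetric instance for illustration.
    d = [[0, 2, 9, 10],
         [2, 0, 6, 4],
         [9, 6, 0, 8],
         [10, 4, 8, 0]]
    print(tsp_branch_and_bound(d))
```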

Schmitt Trigger Based SRAM Using FinFET Technology - Shorted Gate Mode

The most widely used semiconductor memory types are Dynamic Random Access Memory (DRAM) and Static Random Access Memory (SRAM). Competition among memory manufacturers drives the need to decrease power consumption and reduce the probability of read failure. FinFET technology is relatively new and has not been fully explored for this purpose. In this paper, a single-cell Schmitt Trigger Based Static RAM using FinFET technology is proposed and analyzed. The accuracy of the result is validated by means of HSPICE simulations with 32 nm FinFET technology, and the results are then compared with a 6T SRAM using the same technology.

Dynamic Routing to Multiple Destinations in IP Networks using Hybrid Genetic Algorithm (DRHGA)

In this paper we propose a novel dynamic least-cost multicast routing protocol for IP networks based on a hybrid genetic algorithm. The protocol finds the multicast tree with minimum cost subject to delay, degree, and bandwidth constraints. The proposed protocol has the following features: (i) a heuristic local search function has been devised and embedded within the normal genetic operations to increase speed and to obtain the optimized tree; (ii) it efficiently handles the dynamic situations arising from changes in multicast group membership or node/link failures; (iii) two different crossover and mutation probabilities are used to maintain the diversity of solutions and to achieve quick convergence (illustrated by the sketch below). Simulation results show that the proposed protocol generates dynamic multicast trees with lower cost. The results also show that the proposed algorithm has a better convergence rate, a better dynamic request success rate and less execution time than other existing algorithms. The effects of the degree and delay constraints on the multicast tree have also been analyzed in terms of the search success rate.
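
The following Python sketch illustrates feature (iii) only: an adaptive choice between two crossover/mutation probability pairs. The tree encoding, constraints and heuristic local search of DRHGA are abstracted behind a placeholder fitness function, and all numeric values are assumptions.

```python
import random

# Two probability pairs (assumed values): a higher pair for below-average individuals
# (diversity) and a lower pair for above-average individuals (quick convergence).
PC_HIGH, PC_LOW = 0.9, 0.6   # crossover probabilities
PM_HIGH, PM_LOW = 0.1, 0.01  # mutation probabilities

def fitness(chrom):
    # Placeholder; in DRHGA this would evaluate the constrained multicast tree cost.
    return sum(chrom)

def crossover(a, b, pc):
    if random.random() < pc:
        point = random.randrange(1, len(a))
        return a[:point] + b[point:], b[:point] + a[point:]
    return a[:], b[:]

def mutate(chrom, pm):
    return [bit ^ 1 if random.random() < pm else bit for bit in chrom]

def evolve(pop, generations=100):
    for _ in range(generations):
        scores = [fitness(c) for c in pop]
        avg = sum(scores) / len(scores)
        new_pop = []
        while len(new_pop) < len(pop):
            a, b = random.sample(pop, 2)
            # Exploration probabilities for below-average parents, exploitation otherwise.
            below_avg = min(fitness(a), fitness(b)) < avg
            pc = PC_HIGH if below_avg else PC_LOW
            pm = PM_HIGH if below_avg else PM_LOW
            c1, c2 = crossover(a, b, pc)
            new_pop += [mutate(c1, pm), mutate(c2, pm)]
        pop = new_pop[:len(pop)]
    return max(pop, key=fitness)

if __name__ == "__main__":
    population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
    print(evolve(population))
```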

Application of Micro-continuum Approach in the Estimation of Snow Drift Density, Velocity and Mass Transport in Hilly Bound Cold Regions

We estimate snow velocity and snow drift density on hilly terrain under the assumption that the drifting snow mass can be represented using a micro-continuum approach (i.e., a non-classical continuum mechanics approach based on a class of fluids for which the balance equations of mass, momentum and energy have been derived). In our model, the theory of couple stress fluids proposed by Stokes [1] has been employed for the computation of flow parameters. Analyses of bulk drift velocity, drift density, drift transport and mass transport of snow particles have been carried out and computations made, considering various parametric effects. Results are compared with those of classical mechanics (the logarithmic wind profile). The results indicate that particle size affects the flow characteristics significantly.
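
For reference, the classical comparison case mentioned above is the logarithmic wind profile

\[ u(z) = \frac{u_*}{\kappa}\,\ln\!\left(\frac{z}{z_0}\right), \]

with friction velocity \(u_*\), von Kármán constant \(\kappa \approx 0.4\) and aerodynamic roughness length \(z_0\). This is only the classical baseline, not part of the couple stress formulation itself.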

Identification of Cardiac Arrhythmias using Natural Resonance Complex Frequencies

An electrocardiogram (ECG) feature extraction system based on the calculation of complex resonance frequencies using Prony's method is developed. Prony's method is applied to five different classes of ECG arrhythmia signals, representing each as a finite sum of exponentials depending on the signal's poles and resonant complex frequencies. These poles and resonance frequencies are evaluated for a large number of examples of each arrhythmia. The lead II (ML II) ECG signals were taken from the MIT-BIH database for five different types: ventricular couplet (VC), ventricular tachycardia (VT), ventricular bigeminy (VB), ventricular fibrillation (VF) and normal rhythm (NR). This novel method can be extended to any number of arrhythmias. Different classification techniques were tried, including neural networks (NN), K-nearest neighbor (KNN), linear discriminant analysis (LDA) and multi-class support vector machines (MC-SVM).
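
A minimal NumPy sketch of the underlying Prony analysis is given below; the model order, sampling rate and toy test signal are illustrative assumptions, and the classification stage is omitted.

```python
import numpy as np

def prony(x, p, dt=1.0):
    """Classical Prony analysis: fit x[n] ~ sum_k A_k * exp(s_k * n * dt).
    Returns the complex resonance frequencies s_k and amplitudes A_k."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    # 1) Linear prediction: x[n] = -a1*x[n-1] - ... - ap*x[n-p], solved in least squares.
    A = np.column_stack([x[p - 1 - k : N - 1 - k] for k in range(p)])
    b = -x[p:N]
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    # 2) Poles are the roots of the characteristic polynomial.
    poles = np.roots(np.concatenate(([1.0], a)))
    # 3) Complex natural frequencies (real part: damping, imaginary part: angular frequency).
    s = np.log(poles) / dt
    # 4) Amplitudes from a Vandermonde least-squares fit.
    V = np.vander(poles, N, increasing=True).T
    amps, *_ = np.linalg.lstsq(V, x, rcond=None)
    return s, amps

if __name__ == "__main__":
    fs = 360.0                                          # MIT-BIH sampling rate
    t = np.arange(200) / fs
    beat = np.exp(-8 * t) * np.sin(2 * np.pi * 12 * t)  # toy damped oscillation
    s, amps = prony(beat, p=4, dt=1 / fs)
    print(np.round(s, 2))
```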

Evaluation of Handover Latency in Intra-Domain Mobility

Mobile IPv6 (MIPv6) describes how a mobile node can change its point of attachment from one access router to another. As the demand for wireless mobile devices increases, many enhancements to macro-mobility (inter-domain) protocols have been proposed, designed and implemented in Mobile IPv6. Hierarchical Mobile IPv6 (HMIPv6) is one of them; it is designed to reduce the amount of signaling required and to improve handover speed for mobile connections. This is achieved by introducing a new network entity called the Mobility Anchor Point (MAP). This paper presents a comparative study of the HMIPv6 and MIPv6 protocols, with the scope narrowed down to micro-mobility (intra-domain). The architecture and operation of each protocol are studied, and they are evaluated based on one Quality of Service (QoS) parameter: handover latency. The simulation was carried out using Network Simulator-2 and the outcomes are discussed. The results show that HMIPv6 performs best under intra-domain mobility compared with MIPv6, which suffers from large handover latency. As an enhancement to HMIPv6, we propose locating the MAP in the middle of the domain with respect to all Access Routers. This gives approximately the same, and possibly shorter, distance between the MAP and the Mobile Node (MN) regardless of the MN's new location, and therefore reduces the delay. As future work, a performance analysis is to be carried out for the proposed enhancement and compared with standard HMIPv6.

A Unified Approach to Thermodynamics of Power Yield in Thermal, Chemical and Electrochemical Systems

This paper unifies power optimization approaches for various energy converters, such as thermal, solar, chemical, and electrochemical engines, in particular fuel cells. Thermodynamics leads to the converter's efficiency and limiting power. Efficiency equations serve to solve problems of upgrading and downgrading of resources. While the optimization of steady systems applies differential calculus and Lagrange multipliers, dynamic optimization involves variational calculus and dynamic programming. In reacting systems the chemical affinity constitutes a prevailing component of the overall efficiency, thus the power is analyzed in terms of the active part of the chemical affinity. The main novelty of the present paper in the energy-yield context consists in showing that the generalized heat flux Q (the traditional heat flux q plus the product of temperature and the sum of products of partial entropies and fluxes of species) plays in complex cases (solar, chemical and electrochemical) the same role as the traditional heat flux q in pure heat engines. The presented methodology is also applied to power limits in fuel cells, treated as electrochemical flow engines propelled by chemical reactions. The performance of fuel cells is determined by the magnitudes and directions of the participating streams and by the mechanism of electric current generation. Voltage lowering below the reversible voltage is a proper measure of cell imperfection. The voltage losses, called polarization, include the contributions of three main sources: activation, ohmic and concentration. Examples show power maxima in fuel cells and prove the relevance of extending the thermal machine theory to chemical and electrochemical systems. The main novelty of the present paper in the fuel cell context consists in introducing an effective or reduced Gibbs free energy change between products p and reactants s which takes into account the decrease of voltage and power caused by the incomplete conversion of the overall reaction.
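
Using one possible notation (the symbols here are an assumption, not the paper's), the generalized heat flux described above can be written as

\[ Q = q + T\sum_k s_k\, n_k, \]

where \(q\) is the traditional heat flux, \(T\) the temperature, and \(s_k\), \(n_k\) the partial entropy and the flux of species \(k\); \(Q\) then plays in the solar, chemical and electrochemical cases the role that \(q\) plays in a pure heat engine.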

CAD/CAM Algorithms for 3D Woven Multilayer Textile Structures

This paper proposes new algorithms for the computer-aided design and manufacture (CAD/CAM) of 3D woven multi-layer textile structures. Existing commercial CAD/CAM systems are often restricted to the design and manufacture of 2D weaves. Those CAD/CAM systems that do support the design and manufacture of 3D multi-layer weaves are often limited to manual editing of design paper grids on the computer display and weave retrieval from stored archives. This complex design activity is time-consuming, tedious and error-prone and requires the considerable experience and skill of a technical weaver. Recent research reported in the literature has addressed some of the shortcomings of commercial 3D multi-layer weave CAD/CAM systems. However, earlier research results have shown the need for further work on weave specification, weave generation, yarn path editing and layer binding. Analysis of 3D multi-layer weaves in this research has led to the design and development of efficient and robust algorithms for the CAD/CAM of 3D woven multi-layer textile structures. The resulting algorithmically generated weave designs can be used as a basis for lifting plans that can be loaded onto looms equipped with electronic shedding mechanisms for the CAM of 3D woven multi-layer textile structures.
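
As a toy illustration of the kind of output such algorithms produce, the Python sketch below assembles a lifting plan for stacked plain-woven layers under a common layer-interleaving convention (warps belonging to upper layers are lifted while a lower layer's pick is inserted). It is not the paper's algorithm and omits weave specification, yarn path editing and layer binding.

```python
import numpy as np

def plain_weave(ends, picks):
    """Single-layer plain weave: a warp end is lifted where (end + pick) is even."""
    grid = np.fromfunction(lambda i, j: (i + j) % 2 == 0, (ends, picks))
    return grid.astype(int)

def multilayer_lifting_plan(layers, ends_per_layer, picks_per_layer):
    """Assemble a lifting plan for `layers` stacked plain-woven layers.

    Assumed convention: warp ends and weft picks are interleaved layer by layer
    (layer 0 on top). When a pick of layer L is inserted, every warp end belonging
    to a layer above L is lifted so that the layers remain separate."""
    total_ends = layers * ends_per_layer
    total_picks = layers * picks_per_layer
    plan = np.zeros((total_ends, total_picks), dtype=int)
    base = plain_weave(ends_per_layer, picks_per_layer)
    for e in range(total_ends):
        e_layer, e_local = e % layers, e // layers
        for p in range(total_picks):
            p_layer, p_local = p % layers, p // layers
            if e_layer < p_layer:
                plan[e, p] = 1                       # warp of an upper layer: always lifted
            elif e_layer == p_layer:
                plan[e, p] = base[e_local, p_local]  # weave within its own layer
            # warps of lower layers stay down (0)
    return plan

if __name__ == "__main__":
    print(multilayer_lifting_plan(layers=2, ends_per_layer=4, picks_per_layer=4))
```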

Different Teaching Methods for Program Design and Algorithmic Language

This paper covers the present situation and problems of experimental teaching in the mathematics specialty in recent years, and puts forward and demonstrates experimental teaching methods for different levels of education. From the aspects of content and teaching approach, it uses the course "Experiment for Program Designing & Algorithmic Language" as an example and discusses teaching practice and laboratory course work. In addition, a series of successful methods and measures for experimental teaching are introduced.

Effect Comparison of Speckle Noise Reduction Filters on 2D-Echocardiographic Images

Echocardiography is one of the most common diagnostic imaging tests, widely used for assessing abnormalities of regional ventricular function. The main goal of the image enhancement task in 2D-echocardiography (2DE) is to address two major problems: speckle noise and low image quality. Speckle noise reduction is therefore an important pre-processing step used to reduce distortion effects in 2DE image segmentation. In this paper, we consider common filters based on some form of low-pass spatial smoothing, such as the Mean, Gaussian, and Median filters. The Laplacian filter was used as a high-pass sharpening filter. A comparative analysis is presented to test the effectiveness of these filters after being applied to original 2DE images of 4-chamber and 2-chamber views. Three statistical quality measures, root mean square error (RMSE), peak signal-to-noise ratio (PSNR) and signal-to-noise ratio (SNR), are used to evaluate the filter performance quantitatively on the output enhanced image.
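
A minimal Python sketch of the filtering and evaluation pipeline is shown below; the kernel sizes, Gaussian sigma and the random stand-in image are assumptions, and in practice the metrics would be computed against a reference 2DE frame.

```python
import numpy as np
from scipy import ndimage

def rmse(ref, img):
    return np.sqrt(np.mean((ref.astype(float) - img.astype(float)) ** 2))

def psnr(ref, img, peak=255.0):
    return 20 * np.log10(peak / rmse(ref, img))

def snr(ref, img):
    noise = ref.astype(float) - img.astype(float)
    return 10 * np.log10(np.sum(ref.astype(float) ** 2) / np.sum(noise ** 2))

def apply_filters(image):
    """Apply the compared smoothing filters plus a Laplacian sharpening step."""
    return {
        "mean": ndimage.uniform_filter(image, size=3),
        "gaussian": ndimage.gaussian_filter(image, sigma=1.0),
        "median": ndimage.median_filter(image, size=3),
        "laplacian_sharpened": image - ndimage.laplace(image.astype(float)),
    }

if __name__ == "__main__":
    img = np.random.randint(0, 256, (128, 128)).astype(float)  # stand-in for a 2DE frame
    for name, out in apply_filters(img).items():
        # For illustration, each output is compared against the unfiltered frame.
        print(name, "RMSE=%.2f" % rmse(img, out), "PSNR=%.2f dB" % psnr(img, out))
```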

Efficient Supplies to Assembly Areas from Storage Stages

Guaranteeing the availability of the required parts at the scheduled time represents a key logistical challenge. This is especially important when several parts are required together. This article describes a tool that supports positioning within the trade-off between low stock costs and a high service level for the consumer.

Combination of Different Classifiers for Cardiac Arrhythmia Recognition

This paper describes a new supervised fusion (hybrid) electrocardiogram (ECG) classification solution consisting of a new QRS complex geometrical feature extraction method and a new version of the learning vector quantization (LVQ) classification algorithm aimed at overcoming the stability-plasticity dilemma. Toward this objective, after detection and delineation of the major events of the ECG signal via an appropriate algorithm, each QRS region and its corresponding discrete wavelet transform (DWT) are treated as virtual images and each of them is divided into eight polar sectors. Then, the curve length of each excerpted segment is calculated and used as an element of the feature space. To increase the robustness of the proposed classification algorithm against noise, artifacts and arrhythmic outliers, a fusion structure consisting of five different classifiers, namely a Support Vector Machine (SVM), a Modified Learning Vector Quantization (MLVQ) network and three Multi-Layer Perceptron-Back Propagation (MLP-BP) neural networks with different topologies, was designed and implemented. The proposed algorithm was applied to all 48 MIT-BIH Arrhythmia Database records (within-record analysis) and the discrimination power of the classifier over the different beat types of each record was assessed, yielding an average accuracy of Acc = 98.51%. The proposed method was also applied to six arrhythmia classes (Normal, LBBB, RBBB, PVC, APB, PB) belonging to 20 different records of the aforementioned database (between-record analysis), achieving an average accuracy of Acc = 95.6%. To evaluate the quality of the new hybrid learning machine, the obtained results were compared with similar peer-reviewed studies in this area.
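
The polar-sector curve-length feature can be sketched as follows; the centring and normalisation details are assumptions, and the DWT branch and the five-classifier fusion are omitted.

```python
import numpy as np

def polar_sector_curve_lengths(segment, n_sectors=8):
    """Curve-length feature of a delineated QRS excerpt.

    The segment is treated as a planar curve (sample index, amplitude), centred on its
    mean, split into n_sectors polar sectors, and the incremental curve length falling
    in each sector is accumulated. This follows the paper's description in spirit; the
    exact centring and normalisation are assumptions."""
    t = np.linspace(-1.0, 1.0, len(segment))
    y = np.asarray(segment, dtype=float) - np.mean(segment)
    angles = np.mod(np.arctan2(y, t), 2 * np.pi)
    dlen = np.hypot(np.diff(t), np.diff(y))        # incremental curve length
    sector = (angles[:-1] / (2 * np.pi / n_sectors)).astype(int)
    features = np.zeros(n_sectors)
    for s, dl in zip(sector, dlen):
        features[s] += dl
    return features
```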

Atrial Fibrillation Analysis Based on Blind Source Separation in 12-lead ECG

Atrial fibrillation is the most common sustained arrhythmia encountered by clinicians. Because the atrial activity waveform in atrial fibrillation is difficult to observe visually, it is necessary to develop an automatic diagnosis system. The 12-lead ECG is now widely available in hospitals and is suitable for applying Independent Component Analysis (ICA) to estimate the atrial activity (AA). In this research, we also adopt a second-order blind identification (SOBI) approach to refine the sources extracted by ICA, and then use a frequency-domain algorithm for classification. In experiments, we obtain significant results on clinical data.
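
A simplified Python sketch of the ICA-plus-spectral-analysis idea is given below; it omits the SOBI refinement step, and the 4 to 9 Hz atrial-frequency band and function names are assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

def atrial_dominant_frequency(ecg, fs):
    """Estimate an atrial activity source and its dominant frequency from 12-lead ECG.

    ecg: array of shape (n_samples, 12). ICA separates the leads into independent
    sources, and the source whose spectral peak lies in a typical AF band (4-9 Hz)
    is taken as the atrial activity."""
    ica = FastICA(n_components=12, random_state=0)
    sources = ica.fit_transform(ecg)                  # shape (n_samples, 12)
    freqs = np.fft.rfftfreq(len(ecg), d=1.0 / fs)
    best = None
    for k in range(sources.shape[1]):
        spectrum = np.abs(np.fft.rfft(sources[:, k]))
        peak = freqs[np.argmax(spectrum)]
        if 4.0 <= peak <= 9.0 and (best is None or spectrum.max() > best[1]):
            best = (peak, spectrum.max(), k)
    return best  # (dominant frequency in Hz, peak magnitude, source index) or None
```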

Averaging Mechanisms to Decision Making for Handover in GSM

In cellular networks, the limited available resources have to be utilized to their fullest potential. With this in mind, a sophisticated averaging and voting technique is discussed in this paper, wherein the available radio resources are fully utilized by taking into consideration several network and radio parameters that decide when a handover has to be made, thereby reducing the load on the base station. The increased load on the base station is often due to unnecessary handovers, which can be eliminated by making judicious use of the radio and network parameters.
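
An illustrative Python sketch of the averaging-and-voting idea is shown below; the window lengths, vote threshold and hysteresis margin are assumed values, not GSM-specified ones.

```python
from collections import deque

class HandoverDecision:
    """Sliding-window averaging followed by P-out-of-N voting (illustrative values).

    A handover is recommended only if, over the last N averaged measurement periods,
    at least P of them favour the neighbour cell by more than a hysteresis margin."""
    def __init__(self, avg_len=4, n=8, p=6, hysteresis_db=3.0):
        self.samples = deque(maxlen=avg_len)   # raw RXLEV difference samples
        self.votes = deque(maxlen=n)
        self.p = p
        self.hysteresis = hysteresis_db

    def update(self, rxlev_neighbour, rxlev_serving):
        self.samples.append(rxlev_neighbour - rxlev_serving)
        average = sum(self.samples) / len(self.samples)
        self.votes.append(average > self.hysteresis)
        return len(self.votes) == self.votes.maxlen and sum(self.votes) >= self.p
```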

Obstacles as Switches between Different Cardiac Arrhythmias

Ventricular fibrillation is a very important health problem, as it is the cause of most sudden cardiac deaths worldwide. Waves of electrical activity are sent by the SA node, propagate through the cardiac tissue and activate the mechanisms of cell contraction, and are therefore responsible for pumping blood to the body in a coordinated manner. A spiral wave is an abnormal self-sustaining wave that is responsible for certain types of arrhythmias. When these waves break up, they give rise to the fibrillation regime, in which there is a complete loss of coordination in the contraction of the heart muscle. The interaction of spiral waves with obstacles is also of great importance, as it is believed that the attachment of a spiral wave to an obstacle can produce a transition between two different arrhythmias. An obstacle can be partially excitable or non-excitable. In this work, we present a numerical study of the interaction of meandering spiral waves with partially and non-excitable obstacles and focus on the problem where the obstacle plays a fundamental role in the switch between different spiral regimes, which represent different arrhythmic regimes. In particular, we study the phenomenon of destabilization of spiral waves due to the presence of obstacles, a phenomenon not completely understood. (This work will appear as a chapter in the book Cardiac Arrhythmias, published by INTECH, under the title "Spiral Waves, Obstacles and Cardiac Arrhythmias", ISBN 979-953-307-050-5.)
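
A minimal excitable-medium sketch (Barkley model) with a non-excitable circular obstacle is given below; the parameters, the cross-field initial condition and the periodic boundaries are illustrative choices and do not reproduce the meandering-spiral regimes studied in the paper.

```python
import numpy as np

# Minimal Barkley excitable-medium model with a circular non-excitable obstacle.
N, steps = 200, 10000
a, b, eps, D, dt, dx = 0.75, 0.06, 0.02, 1.0, 0.002, 0.5

u = np.zeros((N, N))
v = np.zeros((N, N))
u[:, : N // 2] = 1.0       # excited half-plane
v[: N // 2, :] = a / 2.0   # refractory half-plane: the broken wavefront curls into a spiral

# Non-excitable circular obstacle in the lower-right quadrant.
yy, xx = np.mgrid[0:N, 0:N]
obstacle = (xx - 0.7 * N) ** 2 + (yy - 0.7 * N) ** 2 < (0.1 * N) ** 2

def laplacian(f):
    # Five-point Laplacian with periodic boundaries (for simplicity).
    return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
            np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4 * f) / dx ** 2

for _ in range(steps):
    reaction = u * (1 - u) * (u - (v + b) / a) / eps
    u += dt * (reaction + D * laplacian(u))
    v += dt * (u - v)
    u[obstacle] = 0.0       # the obstacle cannot be excited
```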

Multi-algorithmic Iris Authentication System

The paper proposes a novel technique for iris recognition using texture and phase features. Texture features are extracted on the normalized iris strip using the Haar wavelet, while phase features are obtained using the Log-Gabor wavelet. The matching scores generated by the individual modules are combined using the sum-of-scores technique. The system is tested on databases obtained from Bath University and the Indian Institute of Technology Kanpur, achieving accuracies of 95.62% and 97.66%, respectively. The FAR and FRR of the combined system are also comparatively reduced.
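
The sum-of-scores fusion step can be sketched as follows; min-max normalisation of the module scores is an assumption.

```python
import numpy as np

def fuse_scores(texture_scores, phase_scores):
    """Sum-of-scores fusion of the two iris matchers (min-max normalisation assumed).

    texture_scores / phase_scores: matching scores of one probe against the gallery,
    produced by the Haar-wavelet texture module and the Log-Gabor phase module."""
    def minmax(s):
        s = np.asarray(s, dtype=float)
        return (s - s.min()) / (s.max() - s.min() + 1e-12)
    return minmax(texture_scores) + minmax(phase_scores)

# The identity with the best fused score (lowest, if the scores are distances) is
# accepted when it passes a threshold chosen from the FAR/FRR trade-off.
```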

Role of Investment in the Course of Economic Growth in Pakistan

The present research focused on investigating the role of investment in the course of economic growth with reference to Pakistan. The study analyzed the roles of public and private investment and the impact of political and macroeconomic uncertainty on the economic growth of Pakistan using a vector autoregressive (VAR) approach. In the long run both public and private investment showed a positive impact on economic growth, but growth was largely driven by private rather than public investment. Government consumption expenditure, economic uncertainty and political instability hampered the economic growth of Pakistan. In the short run, private investment positively influenced growth, but public investment and government consumption expenditure had a negative and insignificant effect. A positive relationship was found between economic uncertainty (proxied by inflation) and GDP in the short run.
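
An illustrative VAR setup in Python (statsmodels) is sketched below; the file name, variable names and lag selection are hypothetical and may differ from the paper's exact specification.

```python
import pandas as pd
from statsmodels.tsa.api import VAR

# Hypothetical annual dataset and column names, for illustration only.
data = pd.read_csv("pakistan_macro.csv", index_col="year")
series = data[["gdp_growth", "private_investment", "public_investment",
               "gov_consumption", "inflation"]]

model = VAR(series)
results = model.fit(maxlags=4, ic="aic")   # lag order chosen by AIC
print(results.summary())

# Short-run dynamics: impulse response of GDP growth to a private investment shock.
irf = results.irf(10)
irf.plot(impulse="private_investment", response="gdp_growth")
```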

N-Sun Decomposition of Complete, Complete Bipartite and Some Harary Graphs

Graph decompositions are vital in the study of combinatorial design theory. A decomposition of a graph G is a partition of its edge set. An n-sun graph is a cycle Cn with an edge terminating in a vertex of degree one attached to each vertex. In this paper, we define the n-sun decomposition of some even-order graphs with a perfect matching. We prove that the complete graph K2n, the complete bipartite graph K2n,2n and the Harary graph H4,2n have n-sun decompositions. A labeling scheme is used to construct the n-suns.
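
A small sketch of the n-sun construction, using networkx, may help fix the definition; vertex labels are an arbitrary choice.

```python
import networkx as nx

def n_sun(n):
    """Construct an n-sun: a cycle C_n with one pendant (degree-one) vertex
    attached to each cycle vertex."""
    G = nx.cycle_graph(n)         # vertices 0..n-1 form the cycle
    for v in range(n):
        G.add_edge(v, n + v)      # pendant vertex n+v hangs off cycle vertex v
    return G

if __name__ == "__main__":
    S4 = n_sun(4)
    print(sorted(S4.degree()))    # cycle vertices have degree 3, pendant vertices degree 1
```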

The Resource Description Framework (RDF) as a Modern Structure for Medical Data

The amount and heterogeneity of data in biomedical research, notably in interdisciplinary fields, require new methods for the collection, presentation and analysis of information. Important data from laboratory experiments as well as patient trials are available but originate from distributed resources. The Charité - University Hospital Berlin has established, together with the German Research Foundation (DFG), a new information service centre for kidney diseases and transplantation (Open European Nephrology Science Centre - OpEN.SC). Besides the collaborative aspect of creating new research groups, every partner or institution of this science information centre that makes its own data available is allowed to search the whole data pool of the various participating centres. A core task is the implementation of a non-restricting, open data structure for the various different data sources. We decided to use a modern RDF model and, in a first phase, transformed original data from the web-based Electronic Patient Record database TBase©.
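
A minimal rdflib sketch of representing a patient record as RDF triples is given below; the namespace and property names are hypothetical placeholders, not the OpEN.SC or TBase© schema.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

# Hypothetical vocabulary; the project's actual schema will differ.
EX = Namespace("http://example.org/opensc/")

g = Graph()
patient = EX["patient/12345"]
g.add((patient, RDF.type, EX.Patient))
g.add((patient, EX.diagnosis, Literal("chronic kidney disease")))
g.add((patient, EX.serumCreatinine, Literal(2.1, datatype=XSD.decimal)))

# RDF keeps the data source-neutral: any participating centre can merge and query
# such graphs regardless of the originating database.
print(g.serialize(format="turtle"))
```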