Endothelial Specificity of ICAM2, Flt-1, and Tie2 Promoters In Vitro and In Vivo

To identify an endothelial cell-specific promoter suitable for vascular-specific targeting, we tested five promoters (Tie2SE, Tie2LE, ICAM2, Flt-1, and vWF) in vitro for promoter activity and specificity in endothelial cells, smooth muscle cells, and non-vascular resident cells, as well as in tissues. All of these promoters except vWF exhibited good endothelial activity and specificity in vitro. In a syngeneic heart transplantation model, the ICAM2 promoter was variably functional in coronary endothelial cells of donor hearts. Thus, the ICAM2, Flt-1, Tie2SE, and Tie2LE promoters hold promise for endothelial-specific targeting, but in vitro expression may not predict in vivo expression.

Combination of Different Classifiers for Cardiac Arrhythmia Recognition

This paper describes a new supervised fusion (hybrid) electrocardiogram (ECG) classification solution consisting of a new QRS complex geometrical feature extraction scheme and a new version of the learning vector quantization (LVQ) classification algorithm aimed at overcoming the stability-plasticity dilemma. Toward this objective, after detection and delineation of the major events of the ECG signal via an appropriate algorithm, each QRS region and its corresponding discrete wavelet transform (DWT) are treated as virtual images, and each is divided into eight polar sectors. The curve length of each extracted segment is then calculated and used as an element of the feature space. To increase the robustness of the proposed classification algorithm against noise, artifacts, and arrhythmic outliers, a fusion structure consisting of five different classifiers, namely a Support Vector Machine (SVM), a Modified Learning Vector Quantization (MLVQ) network, and three Multi-Layer Perceptron-Back Propagation (MLP-BP) neural networks with different topologies, was designed and implemented. The proposed algorithm was applied to all 48 MIT-BIH Arrhythmia Database records (within-record analysis), and the discrimination power of the classifier in isolating the different beat types of each record was assessed, yielding an average accuracy of Acc = 98.51%. The proposed method was also applied to six arrhythmia classes (Normal, LBBB, RBBB, PVC, APB, PB) belonging to 20 records of the same database (between-record analysis), and an average accuracy of Acc = 95.6% was achieved. To evaluate the performance of the new hybrid learning machine, the obtained results were compared with similar peer-reviewed studies in this area.
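
As a rough illustration of the polar-sector curve-length idea (not the authors' implementation), the sketch below treats a QRS window centred on the R peak as a planar curve in a normalised time-amplitude plane, assigns each sample to one of eight angular sectors around the centre, and accumulates curve length per sector. The function name, normalisation, and sector assignment are assumptions made for illustration.

```python
import numpy as np

def polar_sector_curve_lengths(qrs, n_sectors=8):
    """Toy polar-sector curve-length features for a QRS window.

    qrs is a 1-D array of samples assumed to be centred on the R peak.
    The window is treated as a planar curve, and the arc length falling
    in each of n_sectors angular sectors around the centre is returned.
    """
    n = len(qrs)
    x = (np.arange(n) - n // 2) / n                    # normalised time axis
    y = (qrs - qrs[n // 2]) / (np.ptp(qrs) + 1e-12)    # normalised amplitude

    theta = np.arctan2(y, x)                           # angle w.r.t. the R peak
    sector = ((theta + np.pi) / (2 * np.pi) * n_sectors).astype(int)
    sector = np.clip(sector, 0, n_sectors - 1)

    seg_len = np.hypot(np.diff(x), np.diff(y))         # length of each small segment
    feats = np.zeros(n_sectors)
    for s, length in zip(sector[:-1], seg_len):
        feats[s] += length                             # credit to the sector of its left end
    return feats
```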

Efficient Web-Learning Collision Detection Tool on Five-Axis Machine

As networking has become widespread, Web-based learning has become a trend in tool design. Five-axis machining is now widely used in industry, but it carries the risk of collisions between the axes and the table. This paper therefore proposes an efficient Web-learning collision detection tool for five-axis machining. Because collision detection is computationally heavy and few devices can support it, this research uses a systematic, Web-based approach to detect collisions. The methodology includes kinematic analysis of the five-axis motions, the separating-axis method for collision detection, and computer simulation for verification. The machine structure is modeled in STL format in CAD software. The input to the detection system is the G-code part program, which describes the tool motions that produce the part surface. This research produced a simulation program written in the C programming language and demonstrated a five-axis machining example with collision detection on a web site. The system simulates the five-axis CNC motion along the tool trajectory, detects collisions according to the input G-codes, and supports high-performance web service thanks to the C implementation. The results show that our method improves computational efficiency by a factor of 4.5 compared with the conventional detection method.
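
The paper's detector is written in C; purely as an illustration of the separating-axis method it relies on, the following sketch tests two convex 2-D cross-sections for overlap. The polygons and function names are invented for the example.

```python
import numpy as np

def _axes(poly):
    """Edge normals of a convex polygon given as a list of 2-D vertices."""
    poly = np.asarray(poly, dtype=float)
    edges = np.roll(poly, -1, axis=0) - poly
    normals = np.stack([-edges[:, 1], edges[:, 0]], axis=1)
    return normals / np.linalg.norm(normals, axis=1, keepdims=True)

def collide_sat(poly_a, poly_b):
    """Separating-axis test: True if the two convex polygons overlap."""
    a = np.asarray(poly_a, float)
    b = np.asarray(poly_b, float)
    for axis in np.vstack([_axes(a), _axes(b)]):
        pa, pb = a @ axis, b @ axis           # project both polygons onto the axis
        if pa.max() < pb.min() or pb.max() < pa.min():
            return False                      # found a separating axis
    return True                               # no separating axis -> collision

# e.g. a tool cross-section against a table edge (made-up coordinates)
tool  = [(0, 0), (2, 0), (2, 1), (0, 1)]
table = [(1.5, 0.5), (3, 0.5), (3, 2), (1.5, 2)]
print(collide_sat(tool, table))               # True: the parts overlap
```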

Hippocampus Segmentation using a Local Prior Model on its Boundary

Segmentation techniques based on Active Contour Models have benefited greatly from the use of prior information during their evolution. Shape prior information is captured from a training set and introduced into the optimization procedure to restrict the evolution to allowable shapes. In this way, the evolution converges even in regions with weak boundaries. Although significant effort has been devoted to different ways of capturing and analyzing prior information, very little attention has been paid to how image information should be combined with prior information. This paper focuses on a more natural way of incorporating prior information in the level set framework. As a proof of concept, the method is applied to hippocampus segmentation in T1-weighted MR images. Hippocampus segmentation is a very challenging task because of the heterogeneous surrounding region and the missing boundary with the neighboring amygdala, whose intensities are identical. The proposed method mimics the way a human expert segments and thus improves segmentation accuracy.
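
As a loose sketch of how an image-driven term and a prior term can act together in a level set update (not the paper's formulation), the following toy step combines a Chan-Vese-style region force with a simple attraction towards a prior level set; all parameter names and values are assumptions.

```python
import numpy as np

def level_set_step(phi, img, phi_prior, dt=0.5, lam1=1.0, lam2=1.0,
                   alpha=0.2, eps=1.5):
    """One toy evolution step of a level set phi (inside where phi <= 0).

    Combines a Chan-Vese-like region term on image img with a simple
    attraction towards a prior level set phi_prior (same shape as phi).
    """
    inside = phi <= 0
    c1 = img[inside].mean() if inside.any() else img.mean()      # mean inside
    c2 = img[~inside].mean() if (~inside).any() else img.mean()  # mean outside

    delta = (eps / np.pi) / (eps**2 + phi**2)          # smoothed Dirac delta
    # push phi down (inward) where the pixel looks like the inside mean
    region = lam1 * (img - c1) ** 2 - lam2 * (img - c2) ** 2
    prior = alpha * (phi_prior - phi)                  # pull phi towards the prior

    return phi + dt * (delta * region + prior)
```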

Application of Wavelet Neural Networks in Optimization of Skeletal Buildings under Frequency Constraints

The main goal of the present work is to decrease the computational burden of the optimum design of steel frames with frequency constraints by using a new type of neural network called the Wavelet Neural Network (WNN). A suitable neural network is trained to approximate the frequencies and serve in place of the analysis program. The combination of wavelet theory and Neural Networks (NN) has led to the development of wavelet neural networks. Wavelet neural networks are feed-forward networks that use wavelets as activation functions. Wavelets are mathematical functions with adjustable inner parameters, which help them approximate arbitrary functions. The WNN was used to predict the frequencies of the structures. In the WNN, a RAtional function with Second-order Poles (RASP) wavelet was used as the transfer function. It is shown that the convergence speed was faster than that of other neural networks. Comparisons of the WNN with a standard Artificial Neural Network (ANN), with approximate techniques, and with analytical solutions available in the literature are also presented.
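
A minimal sketch of a single-hidden-layer wavelet network is given below; it uses a simple rational wavelet x/(x^2+1)^2 as a stand-in for the paper's RASP transfer function, and all weights and dimensions are placeholders.

```python
import numpy as np

def rational_wavelet(x):
    """Simple rational wavelet, a stand-in for the paper's RASP function."""
    return x / (x**2 + 1.0) ** 2

def wnn_forward(X, w_in, t, d, w_out, b_out):
    """Forward pass of a single-hidden-layer wavelet neural network.

    X      : (n_samples, n_inputs) design variables (e.g. member sizes)
    w_in   : (n_inputs, n_wavelons) input weights
    t, d   : (n_wavelons,) translations and dilations of each wavelon
    w_out  : (n_wavelons,) output weights, b_out : scalar bias
    Returns the predicted quantity (e.g. a natural frequency).
    """
    z = X @ w_in                        # project inputs onto each wavelon
    h = rational_wavelet((z - t) / d)   # translated and dilated wavelet activations
    return h @ w_out + b_out

# tiny usage example with random (untrained) parameters
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 6))             # 4 candidate designs, 6 variables each
w_in = rng.normal(size=(6, 10)); t = rng.normal(size=10)
d = np.ones(10); w_out = rng.normal(size=10)
print(wnn_forward(X, w_in, t, d, w_out, 0.0))
```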

Improving the Shunt Active Power Filter Performance Using Synchronous Reference Frame PI Based Controller with Anti-Windup Scheme

In this paper, the reference current for the Voltage Source Converter (VSC) of the Shunt Active Power Filter (SAPF) is generated using the Synchronous Reference Frame method, incorporating a PI controller with an anti-windup scheme. The proposed method improves harmonic filtering by compensating for the windup phenomenon caused by the integral term of the PI controller. Using the reference frame transformation, the current is transformed from the stationary a-b-c frame to the rotating 0-d-q frame. Using the PI controller, the current in the 0-d-q frame is controlled to obtain the desired reference signal. A controller with integral action combined with an actuator that saturates can produce undesirable effects. If the control error is so large that the integrator saturates the actuator, the feedback path becomes ineffective, because the actuator remains saturated even if the process output changes. The integrator, being an unstable element, may then integrate to a very large value, a phenomenon known as integrator windup. Implementing an integrator anti-windup circuit turns off the integral action when the actuator saturates, hence improving the performance of the SAPF and dynamically compensating harmonics in the power network. In this paper, the system performance is examined with a Shunt Active Power Filter simulation model.
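
As a generic illustration of the anti-windup idea (not the paper's SAPF controller), the sketch below implements a discrete PI step with back-calculation: when the output saturates, the excess is fed back into the integrator so it stops winding up. Gains, limits, and the toy plant are invented.

```python
def pi_antiwindup(error, state, kp=0.5, ki=50.0, kb=10.0,
                  u_min=-1.0, u_max=1.0, dt=1e-4):
    """One step of a discrete PI controller with back-calculation anti-windup.

    error : current control error (e.g. d-axis current error)
    state : integrator state from the previous step
    Returns (saturated output, updated integrator state).
    """
    u_unsat = kp * error + ki * state               # raw PI output
    u_sat = min(max(u_unsat, u_min), u_max)         # actuator saturation
    # back-calculation: feed the saturation excess back into the integrator
    state += dt * (error + kb * (u_sat - u_unsat))
    return u_sat, state

# usage: track a reference step while the output is limited (crude first-order plant)
state, y = 0.0, 0.0
for _ in range(2000):
    err = 1.0 - y
    u, state = pi_antiwindup(err, state)
    y += 1e-4 * (u * 20.0 - y)     # toy plant, for illustration only
print(f"output after 0.2 s: {y:.3f}")
```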

A Formative Assessment Tool for Effective Feedback

In this study we present a formative assessment tool we developed for students' assignments. The tool enables lecturers to define assignments for the course and to assign each problem in each assignment a list of criteria and weights by which the students' work is evaluated. During assessment, the lecturers enter the scores for each criterion together with justifications. Once the scores of the current assignment have all been entered, the tool automatically generates reports for both students and lecturers. The students receive a report by email including a detailed description of their assessed work, their relative score, and their progress across the criteria along the course timeline. This information is presented via charts generated automatically by the tool from the entered scores. The lecturers receive a report that includes summative (e.g., averages, standard deviations) and detailed (e.g., histogram) data for the current assignment. This information enables the lecturers to follow the class achievements and adjust the learning process accordingly. The tool was examined on two pilot groups of college students taking courses in (1) Object-Oriented Programming and (2) Plane Geometry. Results reveal that most of the students were satisfied with the assessment process and the reports produced by the tool. The lecturers who used the tool were also satisfied with the reports and their contribution to the learning process.
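
A minimal sketch of the weighted-criteria scoring such a tool performs might look as follows; the criterion names, weights, and scores are invented for illustration.

```python
def weighted_score(scores, weights):
    """Combine per-criterion scores (0-100) using normalised weights."""
    total_w = sum(weights.values())
    return sum(scores[c] * w for c, w in weights.items()) / total_w

# hypothetical criteria for one programming assignment
weights = {"correctness": 50, "design": 30, "documentation": 20}
scores  = {"correctness": 85, "design": 70, "documentation": 90}
print(weighted_score(scores, weights))   # 81.5
```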

An Intelligent System for Phish Detection, using Dynamic Analysis and Template Matching

Phishing, the stealing of sensitive information on the web, has dealt a major blow to Internet security in recent times. Most existing anti-phishing solutions fail to handle the fuzziness involved in phish detection, leading to a large number of false positives. This fuzziness is attributed to the use of HTML, a language that is highly flexible and, at the same time, highly ambiguous. We introduce a new perspective on phishing detection that tries to systematically establish whether a given page is phished, using the corresponding original page as the basis of comparison. It analyzes the layout of the pages under consideration to determine the percentage distortion between them, which is indicative of malicious alteration. The design represents an intelligent system employing dynamic assessment that identifies brand-new phishing attacks and should prove effective in reducing the number of false positives. This framework could potentially be used as a knowledge base for educating Internet users about phishing.
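
As a simplified stand-in for the paper's layout comparison, the sketch below reduces each page to its sequence of opening tags and reports the percentage dissimilarity between a suspect page and the genuine one; the signature choice and the example pages are assumptions.

```python
from html.parser import HTMLParser
from difflib import SequenceMatcher

class TagCollector(HTMLParser):
    """Collect the sequence of opening tags as a crude layout signature."""
    def __init__(self):
        super().__init__()
        self.tags = []
    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

def layout_distortion(original_html, suspect_html):
    """Percentage dissimilarity between the tag sequences of two pages."""
    sigs = []
    for doc in (original_html, suspect_html):
        p = TagCollector()
        p.feed(doc)
        sigs.append(p.tags)
    similarity = SequenceMatcher(None, sigs[0], sigs[1]).ratio()
    return 100.0 * (1.0 - similarity)

genuine = "<html><body><form><input><input></form></body></html>"
suspect = "<html><body><form><input><input><script></script></form></body></html>"
print(f"{layout_distortion(genuine, suspect):.1f}% distortion")
```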

Study on the Particle Removal Efficiency of Multi Inner Stage Cyclone by CFD Simulation

A new multi inner stage (MIS) cyclone was designed to remove the acidic gas and fine particles produced by the electronics industry. To characterize the gas flow in the MIS cyclone, the pressure and velocity distributions were calculated using a CFD program. The trajectories of the fine particles and the particle removal efficiency were also analyzed by the Lagrangian method. The particle removal efficiency was highest in this study when the outlet pressure was -100 mmAq.

Averaging Mechanisms to Decision Making for Handover in GSM

In cellular networks, the limited available resources have to be used to their fullest potential. With this in mind, this paper discusses a sophisticated averaging and voting technique in which the available radio resources are fully utilized by taking into consideration several network and radio parameters that decide when a handover has to be made, thereby reducing the load on the base station. The increase in base station load is often due to unnecessary handovers, which can be eliminated by making judicious use of the radio and network parameters.
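
A generic sketch of such an averaging-and-voting rule is shown below: the serving and neighbour levels are averaged over a sliding window, each comparison casts a vote, and a handover is requested only after a majority of recent votes agree. The window sizes, margin, and sample values are invented.

```python
from collections import deque

def handover_decision(rxlev_serving, rxlev_neighbour,
                      avg_window=8, vote_window=6, votes_needed=4, margin=3):
    """Decide a handover from sliding-window averages plus majority voting.

    rxlev_serving, rxlev_neighbour : per-measurement-period level samples.
    A handover is requested only if, in the last vote_window comparisons,
    the averaged neighbour level beats the averaged serving level by
    `margin` at least `votes_needed` times.
    """
    serving = deque(maxlen=avg_window)
    neighbour = deque(maxlen=avg_window)
    votes = deque(maxlen=vote_window)
    for s, n in zip(rxlev_serving, rxlev_neighbour):
        serving.append(s)
        neighbour.append(n)
        avg_s = sum(serving) / len(serving)
        avg_n = sum(neighbour) / len(neighbour)
        votes.append(avg_n > avg_s + margin)
        if len(votes) == vote_window and sum(votes) >= votes_needed:
            return True
    return False

serving   = [30, 28, 26, 24, 22, 20, 19, 18, 17, 16, 15, 14, 13, 12]
neighbour = [20, 22, 24, 26, 28, 30, 31, 32, 33, 34, 35, 36, 37, 38]
print(handover_decision(serving, neighbour))   # True once the neighbour wins consistently
```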

Zero Inflated Strict Arcsine Regression Model

The zero-inflated strict arcsine model is a newly developed model that has been found appropriate for modeling overdispersed count data. In this study, we extend the zero-inflated strict arcsine model to a zero-inflated strict arcsine regression model that accounts for the extra variability caused by excess zeros and for covariates in the count data. Maximum likelihood estimation is used to estimate the parameters of the zero-inflated strict arcsine regression model.
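
The strict arcsine probability mass function itself is not reproduced here; to show the structure of a zero-inflated regression log-likelihood and its maximization, the sketch below uses a Poisson pmf as a stand-in for the strict arcsine component. The data and parameterization are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def zi_loglik(params, y, X):
    """Zero-inflated count regression log-likelihood.

    A Poisson pmf stands in for the strict arcsine pmf used in the paper.
    params = [gamma (logit of zero-inflation prob), beta_0, ..., beta_p],
    with log(mu_i) = X_i @ beta.
    """
    gamma, beta = params[0], params[1:]
    pi = 1.0 / (1.0 + np.exp(-gamma))                 # probability of a structural zero
    mu = np.exp(X @ beta)
    log_pmf = y * np.log(mu) - mu - gammaln(y + 1)    # Poisson stand-in
    ll_zero = np.log(pi + (1 - pi) * np.exp(-mu))     # y = 0: structural or sampling zero
    ll_pos = np.log(1 - pi) + log_pmf                 # y > 0
    return np.where(y == 0, ll_zero, ll_pos).sum()

# hypothetical data: y counts, X design matrix with an intercept column
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = rng.poisson(np.exp(0.5 + 0.3 * X[:, 1]))
y[rng.random(200) < 0.2] = 0                          # inject excess zeros
res = minimize(lambda p: -zi_loglik(p, y, X), x0=np.zeros(3), method="BFGS")
print(res.x)   # [logit zero-inflation, intercept, slope] estimates
```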

A New Hardware Implementation of Manchester Line Decoder

In this paper, we present a simple Manchester decoding circuit that does not use any complicated or programmable devices. The circuit can decode encoded data transmitted at 90 kbps; higher transmission rates can be decoded if faster devices are used. We also present a new method for extracting the embedded clock from the Manchester data so that it can be used for serial-to-parallel conversion. All of our experimental measurements were carried out in simulation.
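
In software terms, the decoding rule the circuit implements can be illustrated as below, assuming the IEEE 802.3 convention in which a low-to-high half-bit pair encodes a logical 1 and a high-to-low pair encodes a 0; the function is illustrative, not a model of the hardware.

```python
def manchester_decode(half_bits):
    """Decode a Manchester stream sampled at two samples per data bit.

    Assumes the IEEE 802.3 convention: a (0, 1) pair is a logical 1 and
    a (1, 0) pair is a logical 0.  Raises on invalid pairs, which in
    hardware would indicate loss of bit synchronisation.
    """
    if len(half_bits) % 2:
        raise ValueError("odd number of half-bit samples")
    bits = []
    for first, second in zip(half_bits[0::2], half_bits[1::2]):
        if (first, second) == (0, 1):
            bits.append(1)
        elif (first, second) == (1, 0):
            bits.append(0)
        else:
            raise ValueError("invalid Manchester pair - clock slip?")
    return bits

# '1011' encoded as 01 10 01 01 under this convention
print(manchester_decode([0, 1, 1, 0, 0, 1, 0, 1]))   # [1, 0, 1, 1]
```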

Wind Load Characteristics in Libya

Recent trends in building construction in Libya lean toward tall (high-rise) building projects. As a consequence, better estimation of the lateral loading in the design process is becoming the focal point of a safe and cost-effective building industry. By and large, Libya is not considered a potential earthquake-prone zone, making wind the dominant design lateral load. Current design practice in the country estimates wind speeds on an essentially arbitrary basis by applying a chosen factor of safety to a selected wind speed. Therefore, the need for a more accurate estimation of wind speeds in Libya was the motivation behind this study. Wind speed records were collected from 22 meteorological stations in Libya and statistically analysed. The analysis of more than four decades of wind speed records suggests that the country can be divided into four zones of distinct wind speeds. A computer "survey" program was used to draw a design wind speed contour map for Libya. The paper presents the statistical analysis of Libya's recorded wind speed data and proposes design wind speed values for a 50-year return period covering the entire country.
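
The abstract does not name the fitted distribution; a common way to obtain a 50-year design value is to fit a Gumbel (Type I extreme value) distribution to annual maximum wind speeds, as sketched below with invented station data.

```python
from scipy import stats

def design_wind_speed(annual_maxima, return_period=50):
    """Fit a Gumbel distribution to annual maximum wind speeds (m/s)
    and return the speed with the given return period in years."""
    loc, scale = stats.gumbel_r.fit(annual_maxima)
    # non-exceedance probability 1 - 1/T for a T-year return period
    return stats.gumbel_r.ppf(1.0 - 1.0 / return_period, loc, scale)

# invented annual maxima for one hypothetical station (m/s)
maxima = [24.1, 27.5, 22.8, 30.2, 26.4, 25.0, 29.1, 23.7, 28.3, 26.9]
print(f"50-year design wind speed ~ {design_wind_speed(maxima):.1f} m/s")
```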

Stability Issues on an Implemented All-Pass Filter Circuitry

So-called all-pass filter circuits are commonly used in the fields of signal processing, control, and measurement. When connected to capacitive loads, these circuits tend to lose their stability; therefore, a thorough analysis of their dynamic behavior is necessary. Compensation methods intended to increase the stability of such circuits are discussed in this paper, with the so-called lead-lag compensation technique treated in detail. For the dynamic modeling, a two-port network model of the all-pass filter is derived. The results of the model analysis show that effective lead-lag compensation can be achieved solely by optimizing the circuit parameters; therefore, no additional electrical components are needed to fulfill the stability requirement.
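
As a generic numerical illustration of the phase lead such a compensator contributes near crossover (not the paper's circuit), the sketch below evaluates a lead-lag transfer function with an arbitrarily placed zero and pole.

```python
import numpy as np
from scipy import signal

# lead-lag compensator C(s) = (1 + s/wz) / (1 + s/wp), zero below the pole
wz, wp = 2.0e4, 2.0e5                    # rad/s, illustrative placement only
comp = signal.TransferFunction([1.0 / wz, 1.0], [1.0 / wp, 1.0])

w = np.logspace(3, 7, 400)
w, mag, phase = signal.bode(comp, w)
print(f"max phase lead ~ {phase.max():.1f} deg "
      f"near {w[np.argmax(phase)]:.2e} rad/s")
```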

Abstraction Hierarchies for Engineering Design

Complex engineering design problems consist of numerous factors of varying criticality. Treating fundamental design features and minor details alike results in an extensive waste of time and effort. Design parameters should be introduced gradually, based on their significance in the problem context. This motivates representing design parameters at multiple levels of an abstraction hierarchy. However, developing abstraction hierarchies is an area that is not well understood. Our research proposes a novel hierarchical abstraction methodology for planning effective engineering designs and processes. It provides a theoretically sound foundation to represent, abstract, and stratify engineering design parameters and tasks according to causality and criticality. The methodology creates abstraction hierarchies in a recursive, bottom-up approach that guarantees no backtracking across any of the abstraction levels. It consists of three main phases: representation, abstraction, and layering into multiple hierarchical levels. The effectiveness of the developed methodology is demonstrated on a design problem.

Optical Coherence Tomography Combined with the Confocal Microscopy Method and Fluorescence for Class V Cavities Investigations

The purpose of this study is to present a non-invasive method for evaluating marginal adaptation in class V composite restorations. Standardized class V cavities, prepared in extracted human teeth, were filled with Premise (Kerr) composite. The specimens were thermocycled. The interfaces were examined by optical coherence tomography (OCT) combined with confocal microscopy and fluorescence. The optical configuration uses two single-mode directional couplers with a superluminescent diode as the source at 1300 nm. The scanning procedure is similar to that used in any confocal microscope, where the fast scanning is en-face (at the line rate) and the depth scanning is much slower (at the frame rate). Gaps at the interfaces as well as inside the composite resin materials were identified. OCT has numerous advantages over conventional techniques which justify its use both in vivo and in vitro.

Selective Transverse Modes in a Diode End-Pumped Nd:YAG Pulsed Laser

The output beam quality of a laser operating on multiple transverse modes is relatively poor. To obtain better beam quality, one may use an aperture inside the laser resonator, with which various transverse modes can be selected. We have selected various transverse modes both in simulation and in experiment. By inserting a circular aperture inside the diode end-pumped Nd:YAG pulsed laser resonator, we have obtained the TEM00, TEM01, and TEM20 modes and have studied which parameters can change the mode shape. We have then determined the beam quality factor of the TEM00 Gaussian mode.
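
For reference, the intensity patterns of the low-order Hermite-Gaussian TEM modes mentioned above can be generated as sketched below; the beam waist and grid are arbitrary and the code is only an illustration of the mode shapes.

```python
import numpy as np
from numpy.polynomial.hermite import hermval

def tem_intensity(m, n, x, y, w0=1.0):
    """Intensity of a Hermite-Gaussian TEM_mn mode at the beam waist w0."""
    cm = np.zeros(m + 1); cm[m] = 1.0      # coefficient vector selecting H_m
    cn = np.zeros(n + 1); cn[n] = 1.0      # coefficient vector selecting H_n
    u = hermval(np.sqrt(2) * x / w0, cm) * np.exp(-x**2 / w0**2)
    v = hermval(np.sqrt(2) * y / w0, cn) * np.exp(-y**2 / w0**2)
    return np.outer(v, u) ** 2             # |E|^2, unnormalised

x = y = np.linspace(-3, 3, 201)
for m, n in [(0, 0), (0, 1), (2, 0)]:
    I = tem_intensity(m, n, x, y)
    print(f"TEM{m}{n}: peak at grid index {np.unravel_index(I.argmax(), I.shape)}")
```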

Qualification and Provisioning of xDSL Broadband Lines using a GIS Approach

This paper presents a Geographic Information System (GIS) approach for qualifying and monitoring broadband lines in an efficient way. The methodology used for interpolation is the Delaunay Triangular Irregular Network (TIN). The method is applied to a case study of an ISP in Greece monitoring 120,000 broadband lines.
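
SciPy's LinearNDInterpolator builds exactly such a Delaunay TIN and interpolates linearly on it; a minimal sketch with invented line coordinates and measured rates might look as follows.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# hypothetical measured lines: (longitude, latitude) and attainable rate in Mbps
points = np.array([[23.72, 37.98], [23.75, 37.97], [23.70, 38.00],
                   [23.78, 38.02], [23.74, 38.01]])
rates = np.array([18.0, 6.5, 12.0, 3.0, 9.5])

# piecewise-linear interpolation on the Delaunay TIN of the measured points
tin = LinearNDInterpolator(points, rates)

# estimate the attainable rate of a not-yet-provisioned line inside the hull
print(tin(23.73, 37.99))    # interpolated Mbps at the query location
```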

An Effective Framework for Chinese Syntactic Parsing

This paper presents an effective framework for Chinese syntactic parsing, which consists of two parts. The first is a parsing framework based on an improved bottom-up chart parsing algorithm, which integrates the beam search strategy of the N-best algorithm and the heuristic function of the A* algorithm for pruning, and produces multiple parse trees. The second is a novel evaluation model, which integrates contextual and partial lexical information into the traditional PCFG model and defines a new score function. Using this model, the tree with the highest score is selected as the best parse tree. Finally, comparative experimental results are given. Keywords: syntactic parsing, PCFG, pruning, evaluation model.
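
As a compact illustration of bottom-up chart parsing with beam pruning under a PCFG (not the paper's grammar, pruning heuristics, or evaluation model), a probabilistic CKY sketch over a toy grammar is shown below.

```python
from collections import defaultdict

# toy PCFG in Chomsky normal form: (lhs, rhs, probability)
LEX   = [("N", "dogs", 0.5), ("N", "cats", 0.5), ("V", "chase", 1.0),
         ("NP", "dogs", 0.3), ("NP", "cats", 0.3)]
RULES = [("S", ("NP", "VP"), 1.0), ("VP", ("V", "NP"), 1.0),
         ("NP", ("N", "N"), 0.4)]

def cky_beam(words, beam=5):
    """Probabilistic CKY with a per-cell beam; returns the best score per symbol."""
    n = len(words)
    chart = defaultdict(dict)                     # (i, j) -> {symbol: best probability}
    for i, w in enumerate(words):                 # fill lexical cells
        for lhs, word, p in LEX:
            if word == w:
                chart[i, i + 1][lhs] = max(chart[i, i + 1].get(lhs, 0.0), p)
    for span in range(2, n + 1):                  # bottom-up over longer spans
        for i in range(n - span + 1):
            j = i + span
            cell = {}
            for k in range(i + 1, j):             # all split points
                for lhs, (b, c), p in RULES:
                    if b in chart[i, k] and c in chart[k, j]:
                        score = p * chart[i, k][b] * chart[k, j][c]
                        cell[lhs] = max(cell.get(lhs, 0.0), score)
            # beam pruning: keep only the `beam` best symbols in this cell
            chart[i, j] = dict(sorted(cell.items(), key=lambda kv: -kv[1])[:beam])
    return chart[0, n]

print(cky_beam("dogs chase cats".split()))        # {'S': best probability of a full parse}
```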