A Reconfigurable Processing Element for Cholesky Decomposition and Matrix Inversion

Fixed-point simulation results are used to measure the performance of matrix inversion by Cholesky decomposition. The fixed-point Cholesky decomposition algorithm is implemented on a fixed-point reconfigurable processing element that provides all the mathematical operations Cholesky decomposition requires. The fixed-point word-length analysis is based on simulations over different condition numbers and matrix sizes. Simulation results show that a 16-bit word length gives sufficient performance for small matrices with low condition numbers; larger matrices and higher condition numbers require more dynamic range in a fixed-point implementation.
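
As an illustration, the following minimal Python sketch pairs a textbook Cholesky factorization with a simple fixed-point quantizer so that word-length effects on the inverted matrix can be reproduced; the quantizer, word lengths, and test matrix are illustrative and not the authors' processing element.

    import numpy as np

    def quantize(x, word_length=16, frac_bits=12):
        # Signed fixed-point grid: 1 sign bit, (word_length - 1 - frac_bits)
        # integer bits, frac_bits fractional bits; saturate on overflow.
        scale = 2.0 ** frac_bits
        max_val = (2.0 ** (word_length - 1) - 1) / scale
        return np.clip(np.round(x * scale) / scale, -max_val, max_val)

    def cholesky_fixed(A, q):
        # Lower-triangular factor with quantization after each arithmetic result.
        n = A.shape[0]
        L = np.zeros_like(A)
        for j in range(n):
            d = A[j, j] - np.sum(L[j, :j] ** 2)
            L[j, j] = q(np.sqrt(d))
            for i in range(j + 1, n):
                s = A[i, j] - np.sum(L[i, :j] * L[j, :j])
                L[i, j] = q(s / L[j, j])
        return L

    rng = np.random.default_rng(0)
    B = rng.standard_normal((4, 4))
    A = B @ B.T / 4 + np.eye(4)                 # symmetric positive definite
    q = lambda x: quantize(x, word_length=16, frac_bits=12)
    L = cholesky_fixed(q(A), q)
    # Invert A = L L^T by two triangular solves against the identity.
    A_inv = np.linalg.solve(L.T, np.linalg.solve(L, np.eye(4)))
    print(np.abs(A_inv @ A - np.eye(4)).max())  # error grows with condition number

Sweeping word_length and the condition number of A reproduces the qualitative trade-off reported above.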

Atrial Fibrillation Analysis Based on Blind Source Separation in 12-lead ECG

Atrial fibrillation is the most common sustained arrhythmia encountered by clinicians. Because the atrial-activity waveform of atrial fibrillation is practically invisible to the human eye in the surface ECG, an automatic diagnosis system is needed. The 12-lead ECG is now widely available in hospitals and is well suited to Independent Component Analysis (ICA) for estimating the atrial activity (AA). In this research we also adopt a second-order blind identification (SOBI) approach to transform the sources extracted by ICA into more precise signals, and we then apply a frequency-domain algorithm for classification. In experiments on clinical data, we obtain significant results.
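
A minimal sketch of this pipeline, assuming scikit-learn's FastICA as the source-separation step (the paper's SOBI refinement is omitted here) and a simple spectral-peak rule in the typical 4-9 Hz fibrillatory band as the frequency-domain classifier; the sampling rate, band limits, and synthetic data are illustrative.

    import numpy as np
    from sklearn.decomposition import FastICA

    def dominant_frequency(x, fs):
        # Peak of the magnitude spectrum; a Welch estimate would be more robust.
        spectrum = np.abs(np.fft.rfft(x - x.mean()))
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        return freqs[np.argmax(spectrum)]

    def classify_af(ecg_12lead, fs=500.0):
        # ecg_12lead: (n_samples, 12) array; ICA separates candidate atrial sources.
        sources = FastICA(n_components=12, random_state=0).fit_transform(ecg_12lead)
        peaks = [dominant_frequency(sources[:, k], fs) for k in range(12)]
        return any(4.0 <= f <= 9.0 for f in peaks)   # illustrative AF band

    rng = np.random.default_rng(0)
    t = np.arange(5000) / 500.0
    toy = rng.standard_normal((5000, 12)) + np.sin(2 * np.pi * 6.0 * t)[:, None]
    print(classify_af(toy))   # True: the shared 6 Hz component is detected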

Hippocampus Segmentation using a Local Prior Model on its Boundary

Segmentation techniques based on Active Contour Models have benefited strongly from the use of prior information during their evolution. Shape prior information is captured from a training set and introduced into the optimization procedure to restrict the evolution to allowable shapes. In this way, the evolution converges even onto regions with weak boundaries. Although significant effort has been devoted to different ways of capturing and analyzing prior information, very little thought has been given to how image information is combined with prior information. This paper focuses on a more natural way of incorporating prior information into the level set framework. As a proof of concept, the method is applied to hippocampus segmentation in T1-weighted MR images. Hippocampus segmentation is a very challenging task, due to the heterogeneous surrounding region and the missing boundary with the neighboring amygdala, whose intensities are identical. The proposed method mimics the way humans segment and thus improves segmentation accuracy.

Application of Wavelet Neural Networks in Optimization of Skeletal Buildings under Frequency Constraints

The main goal of the present work is to reduce the computational burden of the optimum design of steel frames with frequency constraints using a new type of neural network called the Wavelet Neural Network (WNN). The idea is to train a suitable neural network for frequency approximation to stand in for the analysis program. The combination of wavelet theory and neural networks (NN) has led to the development of wavelet neural networks, which are feed-forward networks that use wavelets as activation functions. Wavelets are mathematical functions with adjustable inner parameters, which help them approximate arbitrary functions. The WNN was used to predict the frequencies of the structures, with a RAtional function with Second-order Poles (RASP) wavelet as the transfer function. It is shown that the convergence speed is faster than that of other neural networks. Comparisons of the WNN with the embedded Artificial Neural Network (ANN), with approximation techniques, and with analytical solutions available in the literature are also presented.
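
A minimal wavelet-network sketch follows. The RASP1 rational wavelet psi(x) = x / (x^2 + 1)^2, which has second-order poles, stands in for the paper's transfer function (the exact RASP variant used is an assumption), and only the output weights are trained for brevity.

    import numpy as np

    def rasp1(x):
        # Rational wavelet with second-order poles (RASP1); the paper's exact
        # transfer function may differ.
        return x / (x ** 2 + 1) ** 2

    class WaveletNet:
        # Single hidden layer: y = sum_k w_k * prod_i psi((x_i - t_ki) / s_ki)
        def __init__(self, n_units, n_inputs, rng):
            self.t = rng.standard_normal((n_units, n_inputs))  # translations
            self.s = np.ones((n_units, n_inputs))              # dilations
            self.w = rng.standard_normal(n_units) * 0.1        # output weights

        def forward(self, x):
            h = rasp1((x - self.t) / self.s).prod(axis=1)
            return h @ self.w, h

        def train_step(self, x, y, lr=0.05):
            # Gradient descent on the output weights only; a full WNN would
            # also adapt the translations and dilations.
            pred, h = self.forward(x)
            self.w += lr * (y - pred) * h

    rng = np.random.default_rng(1)
    net = WaveletNet(n_units=20, n_inputs=2, rng=rng)
    for _ in range(2000):   # toy surrogate for a frequency response surface
        x = rng.uniform(-2, 2, size=2)
        net.train_step(x, np.sin(x[0]) + x[1] ** 2)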

Development of a Model for the Comprehensive Analysis and Evaluation of Service Productivity

Although services play a crucial role in the economy, service productivity management has not gained as much importance as productivity management in manufacturing. This paper presents key findings from the literature and from practice. Based on an initial definition of complex services, seven productivity concepts are briefly presented and assessed against relevant criteria specific to complex services. Following these findings, a complex-service productivity model is proposed. The novel model comprises all specific dimensions of service provision from both the provider's and the customer's perspective. A clear assignment of the identified value drivers and the relationships between them is presented. In order to verify the conceptual service productivity model, a case study from a project engineering department of a chemical plant development and construction company is presented.

The Link between Unemployment and Inflation Using Johansen’s Co-Integration Approach and Vector Error Correction Modelling

In this paper, biannual time series data on unemployment rates (from the Labour Force Survey) are expanded to quarterly rates and linked to quarterly unemployment rates (from the Quarterly Labour Force Survey). The resulting linked series and the consumer price index (CPI) series are examined using Johansen's cointegration approach and vector error correction modelling. The study finds that both series are integrated of order one and are cointegrated: a statistically significant cointegrating relationship exists between the time series of unemployment rates and the CPI. Given this significant relationship, the study models it using Vector Error Correction Models (VECM), one with a restriction on the deterministic term and the other without. A formal statistical confirmation of the existence of a unique linear and lagged relationship between inflation and unemployment for the period between September 2000 and June 2011 is presented. For this period, the CPI was found to be an unbiased predictor of the unemployment rate. The relationship can be explored further for the development of appropriate forecasting models incorporating other study variables.
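
The testing-and-modelling sequence can be reproduced with statsmodels; the synthetic series below merely stand in for the linked unemployment and CPI data, and the lag order is illustrative.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

    # A shared stochastic trend makes the toy pair cointegrated by construction.
    rng = np.random.default_rng(0)
    trend = np.cumsum(rng.normal(0, 1, 120))
    df = pd.DataFrame({
        "unemployment": trend + rng.normal(0, 0.5, 120),
        "cpi": 0.8 * trend + rng.normal(0, 0.5, 120),
    })

    # Johansen trace test; det_order=0 includes a constant term.
    res = coint_johansen(df, det_order=0, k_ar_diff=2)
    print("trace statistics:", res.lr1)
    print("5% critical values:", res.cvt[:, 1])

    # VECM with cointegrating rank 1; deterministic="ci" restricts the
    # constant to the cointegrating relation (the restricted variant).
    fit = VECM(df, k_ar_diff=2, coint_rank=1, deterministic="ci").fit()
    print(fit.summary())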

The Impact of Subsequent Stock Market Liberalization on the Integration of Stock Markets in ASEAN-4 + South Korea

To strengthen capital markets, there is a need to integrate the capital markets within the region by removing legal and informal restrictions, specifically through stock market liberalization. This paper therefore investigates the effects of subsequent stock market liberalization on stock market integration in four ASEAN countries (Malaysia, Indonesia, Thailand, Singapore) and South Korea from 1997 to 2007. The correlation between stock market liberalization and stock market integration is examined by analyzing stock prices and returns within the region and against the world MSCI index. An event study method is used with windows of ±12 months and T-7 + T. The results show that subsequent stock market liberalization generally has minor positive effects on stock returns, except in one or two countries, and that it integrates the markets in both the short run and the long run.
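
A minimal market-model event-study sketch follows; it uses daily returns, a short symmetric window, and OLS alpha/beta estimation, whereas the paper works with ±12-month windows and the MSCI index, so every parameter here is illustrative.

    import numpy as np
    import pandas as pd

    def cumulative_abnormal_returns(stock, market, event_date,
                                    est_len=120, window=7):
        # Estimate alpha/beta on pre-event data, then cumulate abnormal
        # returns over [event - window, event + window].
        data = pd.concat([stock, market], axis=1,
                         keys=["stock", "market"]).dropna()
        est = data.loc[:event_date].iloc[-(est_len + window):-window]
        beta, alpha = np.polyfit(est["market"], est["stock"], 1)
        pos = data.index.get_loc(event_date)
        ev = data.iloc[pos - window : pos + window + 1]
        return (ev["stock"] - (alpha + beta * ev["market"])).cumsum()

    dates = pd.bdate_range("2003-01-01", periods=400)
    rng = np.random.default_rng(0)
    mkt = pd.Series(rng.normal(0, 0.01, 400), index=dates)
    stk = 1.2 * mkt + pd.Series(rng.normal(0, 0.01, 400), index=dates)
    print(cumulative_abnormal_returns(stk, mkt, dates[300]).iloc[-1])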

A Formative Assessment Tool for Effective Feedback

In this study we present a formative assessment tool we developed for students' assignments. The tool enables lecturers to define assignments for a course and to assign each problem in each assignment a list of criteria and weights by which the students' work is evaluated. During assessment, the lecturers enter the scores for each criterion together with justifications. Once all the scores for the current assignment have been entered, the tool automatically generates reports for both students and lecturers. The students receive a report by email that includes a detailed description of their assessed work, their relative score, and their progress across the criteria along the course timeline. This information is presented via charts generated automatically by the tool from the entered scores. The lecturers receive a report that includes summative (e.g., averages, standard deviations) and detailed (e.g., histogram) data for the current assignment, enabling them to follow the class achievements and adjust the learning process accordingly. The tool was examined on two pilot groups of college students taking courses in (1) Object-Oriented Programming and (2) Plane Geometry. Results reveal that most of the students were satisfied with the assessment process and the reports produced by the tool. The lecturers who used the tool were also satisfied with the reports and their contribution to the learning process.
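
The scoring core of such a tool can be captured in a few lines; the criteria, weights, and names below are illustrative, not the tool's actual schema.

    import statistics

    criteria = {"correctness": 0.5, "design": 0.3, "documentation": 0.2}

    def problem_score(scores):
        # scores: {criterion: points in [0, 100]} as entered by the lecturer.
        return sum(criteria[c] * scores[c] for c in criteria)

    submissions = {
        "alice": {"correctness": 90, "design": 70, "documentation": 80},
        "bob": {"correctness": 60, "design": 85, "documentation": 75},
    }
    totals = {s: problem_score(v) for s, v in submissions.items()}
    print(totals)                            # per-student report data
    print(statistics.mean(totals.values()),  # class summary for the lecturer
          statistics.pstdev(totals.values()))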

Effect of Pectinase on the Physico-Chemical Properties of Juice from Pawpaw (Carica papaya) Fruits

A procedure for the preparation of clarified pawpaw juice was developed. About 750 ml of pawpaw pulp was measured into each of two 1-litre measuring cylinders, A and B, heated to 40°C and cooled to 20°C. 30 ml of pectinase was added to cylinder A, while 30 ml of distilled water was added to cylinder B. The enzyme-treated sample (A) was allowed to digest for 5 hours, after which it was heated to 90°C for 15 minutes to inactivate the enzyme. The heated sample was cooled, and the pulp was filtered through a muslin cloth to obtain the clarified pawpaw juice. The juice was filled into 100 ml plastic bottles, pasteurized at 95°C for 45 minutes, cooled, and stored at room temperature. The sample treated with 30 ml of distilled water underwent the same process. The freshly pasteurized samples were analyzed for specific gravity, titratable acidity, pH, sugars, and ascorbic acid; the remaining samples were stored for 2 weeks and the analyses repeated. There were differences between the freshly pasteurized and stored samples in pH and ascorbic acid levels, and the sample treated with pectinase yielded a higher volume of juice than that treated with distilled water.

Study on the Particle Removal Efficiency of Multi Inner Stage Cyclone by CFD Simulation

A new multi inner stage (MIS) cyclone was designed to remove the acidic gases and fine particles produced by the electronics industry. To characterize the gas flow in the MIS cyclone, the pressure and velocity distributions were calculated by means of a CFD program. The trajectories of fine particles and the particle removal efficiency were analyzed by a Lagrangian method. The efficiency was best in this study when the outlet pressure condition was -100 mmAq.
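
The Lagrangian step can be illustrated with a one-way-coupled Stokes-drag tracker; the swirling gas field below is a toy stand-in for the CFD solution, and the particle properties are illustrative.

    import numpy as np

    def track_particle(gas_velocity, x0, dp=10e-6, rho_p=1500.0, mu=1.8e-5,
                       dt=1e-5, steps=20000):
        tau = rho_p * dp ** 2 / (18.0 * mu)   # particle relaxation time
        x, v = np.array(x0, dtype=float), np.zeros(3)
        path = [x.copy()]
        for _ in range(steps):
            # Semi-implicit update of dv/dt = (u_gas - v) / tau (stable even
            # when dt exceeds tau).
            v = (v + (dt / tau) * gas_velocity(x)) / (1.0 + dt / tau)
            x = x + dt * v
            path.append(x.copy())
        return np.array(path)

    # Toy swirl-plus-downflow field standing in for the cyclone interior.
    swirl = lambda x: np.array([-50.0 * x[1], 50.0 * x[0], -1.0])
    path = track_particle(swirl, x0=[0.05, 0.0, 0.3])
    print(path[-1])   # heavy particles drift outward, toward the wall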

Virulent-GO: Prediction of Virulent Proteins in Bacterial Pathogens Utilizing Gene Ontology Terms

Prediction of bacterial virulent protein sequences can assist in the identification and characterization of novel virulence-associated factors and in the discovery of drug and vaccine targets against proteins indispensable to pathogenicity. Gene Ontology (GO) annotation, which describes the functions of genes and gene products using a controlled vocabulary of terms, has been shown to be effective for a variety of tasks such as gene expression studies, GO annotation prediction, and protein subcellular localization. In this study, we propose a sequence-based method, Virulent-GO, which mines informative GO terms as features for predicting bacterial virulent proteins. Each protein in the datasets used by the existing method VirulentPred is annotated by using BLAST to obtain its homologs with known accession numbers, from which GO terms are retrieved. After investigating various popular classifiers under the same five-fold cross-validation scheme, Virulent-GO, using the single kind of GO term features, achieves an accuracy of 82.5%, slightly better than VirulentPred with 81.8% using five kinds of sequence-based features. On the independent test, Virulent-GO also yields better results (82.0%) than VirulentPred (80.7%). When single kinds of features are evaluated with SVM, the GO term features perform much better than each of the five kinds of sequence-based features.
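
The feature construction and evaluation can be sketched as follows, with toy GO-term sets standing in for those retrieved via BLAST homologs; the RBF-kernel SVM matches the kind of classifier evaluated, but the data and fold count here are illustrative.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def go_feature_matrix(protein_go_terms, vocabulary):
        # Binary presence/absence vector of informative GO terms per protein.
        index = {t: i for i, t in enumerate(vocabulary)}
        X = np.zeros((len(protein_go_terms), len(vocabulary)))
        for row, terms in enumerate(protein_go_terms):
            for t in terms & index.keys():
                X[row, index[t]] = 1.0
        return X

    proteins = [{"GO:0009405", "GO:0005576"}, {"GO:0005737"},
                {"GO:0009405"}, {"GO:0005737", "GO:0005524"}]
    labels = np.array([1, 0, 1, 0])     # 1 = virulent, 0 = non-virulent
    vocab = sorted(set().union(*proteins))
    X = go_feature_matrix(proteins, vocab)
    # The paper uses five-fold cross-validation; cv=2 only because this toy
    # set is tiny.
    print(cross_val_score(SVC(kernel="rbf"), X, labels, cv=2))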

The Role of Private Equity during Global Crises

The term private equity usually refers to any type of equity investment in an asset in which the equity is not freely tradable on a public stock market. Some researchers believe that private equity contributed to the extent of the recent crisis and increased the pace of its spread around the world. We do not agree with this view. On the contrary, we argue that during an economic recession private equity can become an important source of funds for firms with special needs (e.g., firms seeking buyout financing, venture capital, expansion capital, or distressed debt financing). However, over-regulation of private equity in both the European Union and the US can slow down this specific funding channel to the economy and deepen the credit crunch during global crises.

Service Quality vs. Customer Satisfaction: Perspectives of Visitors to a Public University Library

This study proposes a conceptual model and empirically tests the relationships between the service quality dimensions of librarians (i.e., tangibles, responsiveness, assurance, reliability, and empathy) and a dependent variable (customer satisfaction) with respect to library services. The SERVQUAL instrument was administered to 100 respondents, comprising staff and students at a public higher learning institution in the Federal Territory of Labuan, Malaysia, all of them public university library users. Results revealed that all the service quality dimensions tested were significant and influenced the customer satisfaction of visitors to the public university library. Assurance is the most important factor influencing customer satisfaction with the services rendered by the librarians. It is imperative for the library management to note that the top five service attributes that gained the greatest attention from the library visitors' perspective are: employees' willingness to help customers, availability of customer representatives online to respond to queries, library staff actively and promptly providing services, clear signs in the building, and friendly and courteous library staff. This study provides valuable results concerning the determinants of service quality and customer satisfaction with public university library services from the users' perspective.
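
The hypothesis test behind such results is essentially a multiple regression of satisfaction on the five dimensions; the synthetic scores below stand in for the 100 survey responses.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    dims = ["tangibles", "responsiveness", "assurance", "reliability", "empathy"]
    df = pd.DataFrame(rng.integers(1, 8, size=(100, 5)), columns=dims)
    # Toy outcome in which assurance dominates, echoing the reported finding.
    df["satisfaction"] = (0.4 * df["assurance"] + 0.2 * df["reliability"]
                          + rng.normal(0, 0.5, 100))

    # Coefficient sizes and p-values rank the dimensions' influence.
    model = sm.OLS(df["satisfaction"], sm.add_constant(df[dims])).fit()
    print(model.summary())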

Improved Closed Set Text-Independent Speaker Identification by Combining MFCC with Evidence from Flipped Filter Banks

A state-of-the-art Speaker Identification (SI) system requires a robust feature extraction unit followed by a speaker modeling scheme for a generalized representation of these features. Over the years, Mel-Frequency Cepstral Coefficients (MFCC), modeled on the human auditory system, have been used as the standard acoustic feature set for SI applications. However, due to the structure of its filter bank, MFCC captures vocal tract characteristics more effectively in the lower frequency regions. This paper proposes a new feature set based on a complementary filter bank structure that improves the distinguishability of speaker-specific cues present in the higher frequency zone. Unlike high-level features, which are difficult to extract, the proposed feature set involves little computational burden during extraction. When combined with MFCC via a parallel implementation of speaker models, the proposed feature set significantly outperforms baseline MFCC. This proposition is validated by experiments conducted on two different kinds of public databases, YOHO (microphone speech) and POLYCOST (telephone speech), with Gaussian Mixture Models (GMM) as the classifier for various model orders.
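
One way to realize such a complementary bank is to mirror the standard mel filters along the frequency axis so that resolution becomes dense at high frequencies; whether this matches the paper's exact construction is an assumption.

    import numpy as np

    def mel_filter_bank(n_filters, n_fft, fs):
        # Standard triangular filters on the mel scale (dense at low f).
        mel = lambda f: 2595.0 * np.log10(1.0 + f / 700.0)
        inv_mel = lambda m: 700.0 * (10.0 ** (m / 2595.0) - 1.0)
        edges = inv_mel(np.linspace(mel(0.0), mel(fs / 2.0), n_filters + 2))
        bins = np.floor((n_fft + 1) * edges / fs).astype(int)
        fb = np.zeros((n_filters, n_fft // 2 + 1))
        for k in range(1, n_filters + 1):
            l, c, r = bins[k - 1], bins[k], bins[k + 1]
            fb[k - 1, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
            fb[k - 1, c:r] = (r - np.arange(c, r)) / max(r - c, 1)
        return fb

    def flipped_filter_bank(n_filters, n_fft, fs):
        # Complementary bank: reverse the frequency axis of the mel bank.
        return mel_filter_bank(n_filters, n_fft, fs)[:, ::-1]

    fb = mel_filter_bank(20, 512, 16000)
    fb_flip = flipped_filter_bank(20, 512, 16000)
    # Log filter-bank energies followed by a DCT would yield MFCC and the
    # complementary high-frequency coefficients, respectively.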

Experimental Studies on Multiphase Flow in Porous Media and Pore Wettability

Multiphase flow transport in porous media is very common and significant in science and engineering applications. For example, in CO2 storage and enhanced oil recovery processes, CO2 has to be delivered to the pore spaces in reservoirs and aquifers. CO2 storage and enhanced oil recovery are in fact displacement processes, in which oil or water is displaced by CO2. This displacement is controlled by the pore size, the chemical and physical properties of the pore surfaces and fluids, and the pore wettability. In this study, a technique was developed to measure the pressure profile required to drive gas or liquid to displace water in pores. From this pressure profile, the impact of pore size on multiphase flow transport and displacement can be analyzed. A second rig was developed to measure static and dynamic pore wettability and to investigate the effects of pore size, surface tension, viscosity, and the chemical structure of liquids on pore wettability.
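
The pore-scale balance behind such pressure profiles is the Young-Laplace relation P_c = 2*gamma*cos(theta)/r for a cylindrical pore; the fluid properties below are illustrative values for water/air at room temperature.

    import numpy as np

    def entry_pressure(radius, surface_tension=0.072, contact_angle_deg=40.0):
        # Capillary entry pressure needed to displace the wetting phase.
        theta = np.radians(contact_angle_deg)
        return 2.0 * surface_tension * np.cos(theta) / radius

    for r in (1e-6, 10e-6, 100e-6):   # pore radii in metres
        print(f"r = {r:.0e} m -> P_c = {entry_pressure(r):.0f} Pa")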

Estimation of Broadcast Probability in Wireless Ad Hoc Networks

Most routing protocols designed for wireless ad hoc networks (e.g., DSR, AODV) incorporate a broadcasting operation in their route discovery scheme. Probabilistic broadcasting techniques have been developed to optimize the broadcast operation, which is otherwise very expensive in terms of the redundancy and traffic it generates. In this paper we explore percolation theory to gain a different perspective on probabilistic broadcasting schemes, which have been actively researched in recent years. This theory lets us estimate the broadcast probability in a wireless ad hoc network as a function of the size of the network. We also show that operating at these optimal values of the broadcast probability yields at least a 25-30% reduction in packet regeneration during successful broadcasting.
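
The mechanism can be illustrated in simulation: the sketch below runs probabilistic flooding on a random geometric graph and counts retransmissions; the node count, radio range, and probabilities are illustrative.

    import numpy as np

    def broadcast(positions, radius, p, rng):
        # Each node that hears the packet for the first time rebroadcasts
        # with probability p (the source always transmits).
        n = len(positions)
        dist = np.linalg.norm(positions[:, None] - positions[None, :], axis=2)
        adj = (dist < radius) & ~np.eye(n, dtype=bool)
        received = np.zeros(n, dtype=bool)
        received[0], frontier, transmissions = True, [0], 1
        while frontier:
            nxt = []
            for u in frontier:
                for v in np.flatnonzero(adj[u]):
                    if not received[v]:
                        received[v] = True
                        if rng.random() < p:
                            nxt.append(v)
                            transmissions += 1
            frontier = nxt
        return received.mean(), transmissions

    rng = np.random.default_rng(0)
    pos = rng.uniform(0, 1, size=(200, 2))
    for p in (1.0, 0.7, 0.5):
        cov, tx = broadcast(pos, radius=0.15, p=p, rng=rng)
        print(f"p={p}: coverage={cov:.2f}, transmissions={tx}")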

Abstraction Hierarchies for Engineering Design

Complex engineering design problems consist of numerous factors of varying criticality. Considering fundamental features of the design and minor details alike results in an extensive waste of time and effort. Design parameters should be introduced gradually, as appropriate, based on their significance in the problem context. This motivates representing design parameters at multiple levels of an abstraction hierarchy. However, developing abstraction hierarchies is not well understood. Our research proposes a novel hierarchical abstraction methodology for planning effective engineering designs and processes. It provides a theoretically sound foundation to represent, abstract, and stratify engineering design parameters and tasks according to causality and criticality. The methodology creates abstraction hierarchies in a recursive, bottom-up approach that guarantees no backtracking across any of the abstraction levels. It consists of three main phases: representation, abstraction, and layering into multiple hierarchical levels. The effectiveness of the developed methodology is demonstrated on a design problem.

Corporate Credit Rating using Multiclass Classification Models with Order Information

Corporate credit rating prediction using statistical and artificial intelligence (AI) techniques has been one of the most attractive research topics in the literature. In recent years, multiclass classification models such as the artificial neural network (ANN) and the multiclass support vector machine (MSVM) have become very appealing machine learning approaches due to their good performance. However, most of them focus on classifying samples into nominal categories, so the unique characteristic of credit ratings, their ordinality, has seldom been considered in these approaches. This study proposes new types of ANN and MSVM classifiers, named OMANN and OMSVM respectively. OMANN and OMSVM extend binary ANN and SVM classifiers by applying an ordinal pairwise partitioning (OPP) strategy, which allows them to handle ordinal multiple classes efficiently and effectively. To validate the usefulness of these two models, we applied them to a real-world bond rating case and compared the results to those of conventional approaches. The experimental results showed that our proposed models improve classification accuracy relative to typical multiclass classification techniques while requiring fewer computational resources.
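
One common realization of the OPP idea (the paper covers several partitioning schemes) trains K-1 binary SVMs on the ordered splits "rating > k" and recombines their probabilities:

    import numpy as np
    from sklearn.svm import SVC

    class OrdinalSVM:
        # K-1 binary SVMs; the k-th separates ratings > k from ratings <= k.
        def __init__(self, n_classes):
            self.models = [SVC(probability=True) for _ in range(n_classes - 1)]

        def fit(self, X, y):
            for k, m in enumerate(self.models):
                m.fit(X, (y > k).astype(int))
            return self

        def predict(self, X):
            # P(y = k) = P(y > k - 1) - P(y > k), with boundary terms 1 and 0.
            gt = np.column_stack([m.predict_proba(X)[:, 1]
                                  for m in self.models])
            ones = np.ones((len(X), 1))
            zeros = np.zeros((len(X), 1))
            probs = np.hstack([ones, gt]) - np.hstack([gt, zeros])
            return probs.argmax(axis=1)

    # Usage: X features, y integer ratings 0..K-1 ordered from worst to best.
    # OrdinalSVM(n_classes=4).fit(X_train, y_train).predict(X_test)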

Qualification and Provisioning of xDSL Broadband Lines using a GIS Approach

This paper presents a Geographic Information System (GIS) approach for qualifying and monitoring xDSL broadband lines efficiently. The methodology used for interpolation is the Delaunay Triangular Irregular Network (TIN). The method is applied to a case study at an ISP in Greece, monitoring 120,000 broadband lines.
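
SciPy's LinearNDInterpolator, which builds exactly such a Delaunay TIN, illustrates the interpolation step; the coordinates and measured rates below are illustrative stand-ins for the monitored lines.

    import numpy as np
    from scipy.interpolate import LinearNDInterpolator

    # (x, y) locations of measured lines and their attainable rates in Mbps.
    points = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
                       [1.0, 1.0], [0.5, 0.4]])
    rate_mbps = np.array([2.0, 8.0, 6.0, 12.0, 5.0])

    tin = LinearNDInterpolator(points, rate_mbps)  # Delaunay triangulation
    print(tin(0.3, 0.3))   # qualification estimate at an unmeasured address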

Effect of the Step Size in the Response Surface Methodology using Nonlinear Test Functions

The response surface methodology (RSM) is a collection of mathematical and statistical techniques useful in the modeling and analysis of problems in which a dependent variable is influenced by several independent variables, with the aim of determining the conditions under which these variables should operate to optimize a production process. The RSM estimates a first-order regression model and sets the search direction using the method of maximum/minimum slope ascent/descent (MMS U/D). However, this method selects the step size intuitively, which can affect the efficiency of the RSM. This paper assesses how the step size affects that efficiency. The numerical examples are carried out through Monte Carlo experiments evaluating three response variables: the efficiency of the gain function, the distance to the optimum, and the number of iterations. The simulation experiments showed that the gain-function efficiency and the distance to the optimum were not affected by the step size, whereas the number of iterations was affected by both the step size and the type of test function used.
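
The procedure under study can be sketched as follows: fit a first-order model on a local design, step along the normalized gradient, and repeat; the toy response, design size, and step size are illustrative.

    import numpy as np

    def steepest_ascent_step(X, y, x_center, step_size):
        # Least-squares fit of y = b0 + b.x, then a move of length
        # 'step_size' along the gradient direction b.
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        b = coef[1:]
        return x_center + step_size * b / np.linalg.norm(b)

    rng = np.random.default_rng(0)
    f = lambda x: -(x[0] - 2) ** 2 - (x[1] - 3) ** 2 + rng.normal(0, 0.1)
    center = np.zeros(2)
    for _ in range(25):   # local design -> first-order fit -> MMS step
        X = center + rng.uniform(-0.25, 0.25, size=(8, 2))
        y = np.array([f(x) for x in X])
        center = steepest_ascent_step(X, y, center, step_size=0.3)
    print(center)   # approaches the optimum at (2, 3)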