Quality-Controlled Compression Method using Wavelet Transform for Electrocardiogram Signals

This paper presents a new quality-controlled, wavelet-based compression method for electrocardiogram (ECG) signals. Initially, the ECG signal is decomposed using the wavelet transform. The resulting coefficients are then iteratively thresholded to guarantee that a predefined goal percent root-mean-square difference (GPRD) is matched within tolerable bounds. The quantization strategy for the extracted non-zero wavelet coefficients (NZWC), combined with RLE, Huffman, and arithmetic encoding of the NZWC and the resulting lookup table, allows high compression ratios to be achieved with good-quality reconstructed signals.
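
As an illustration of the iterative, quality-controlled thresholding step, the sketch below bisects on a hard-threshold value until the reconstruction PRD reaches a goal value. It is a minimal sketch assuming the PyWavelets package; the quantization, RLE/Huffman/arithmetic coding, and lookup-table stages of the actual codec are not reproduced, and the wavelet, decomposition level, and tolerance are placeholder choices.

```python
# Minimal sketch of PRD-targeted wavelet thresholding (not the authors' full
# codec: quantization and RLE/Huffman/arithmetic coding are omitted).
import numpy as np
import pywt

def compress_to_gprd(ecg, gprd=5.0, tol=0.1, wavelet="bior4.4", level=5):
    """Bisect on the threshold until the PRD of the reconstruction meets gprd.
    `ecg` is a 1-D NumPy array."""
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    hi = max(np.abs(c).max() for c in coeffs)
    lo = 0.0
    for _ in range(50):                       # bisection on the threshold value
        thr = 0.5 * (lo + hi)
        thr_coeffs = [pywt.threshold(c, thr, mode="hard") for c in coeffs]
        rec = pywt.waverec(thr_coeffs, wavelet)[: len(ecg)]
        prd = 100.0 * np.sqrt(np.sum((ecg - rec) ** 2) / np.sum(ecg ** 2))
        if abs(prd - gprd) <= tol:
            break
        if prd > gprd:
            hi = thr                          # too lossy -> lower the threshold
        else:
            lo = thr                          # too conservative -> raise it
    return thr_coeffs, prd
```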

Simulation of Particle Damping under Centrifugal Loads

Particle damping is a technique for reducing structural vibrations by placing small metallic particles inside a cavity attached to the structure at locations of high vibration amplitude. In this paper, we present an analytical model to simulate the particle damping of two-dimensional transient vibrations in a structure operating under high centrifugal loads. The simulation results show that the technique remains effective as long as the ratio of the dynamic acceleration of the structure to the applied centrifugal load is greater than 0.1. Particle damping increases with the particle-to-structure mass ratio. However, unlike particle damping in the absence of centrifugal loads, where the damping efficiency depends strongly on the size of the cavity, here this dependence becomes very weak. Despite the simplicity of the model, the simulation results are in reasonably good agreement with the scarce experimental data available in the literature for particle damping under centrifugal loads.

Reform-Oriented Teaching of Introductory Statistics in the Health, Social and Behavioral Sciences – Historical Context and Rationale

There is widespread emphasis on reform in the teaching of introductory statistics at the college level. Underpinning this reform is a consensus among educators and practitioners that traditional curricular materials and pedagogical strategies have not been effective in promoting statistical literacy, a competency that is becoming increasingly necessary for effective decision-making and evidence-based practice. This paper explains the historical context of, and rationale for, reform-oriented teaching of introductory statistics (at the college level) in the health, social, and behavioral sciences (evidence-based disciplines). A firm understanding and appreciation of the basis for change in pedagogical approach is important in order to facilitate commitment to reform, consensus building on appropriate strategies, and adoption and maintenance of best practices. In essence, reform-oriented pedagogy, in this context, is a function of the interaction among content, pedagogy, technology, and assessment. The challenge is to create an appropriate balance among these domains.

Optimum Control Strategy of Three-Phase Shunt Active Filter System

The aim of this paper is to identify an optimum control strategy for three-phase shunt active filters that minimizes the total harmonic distortion factor of the supply current. A classical cascaded PI-PI control solution for the output current of the active filter and the voltage across the DC capacitor, based on the Modulus Optimum criterion, is taken into consideration. The control system operation has been simulated in the Matlab/Simulink environment, and the results agree with the theoretical expectations. It is shown that there is an optimum value of the DC-bus voltage that minimizes the supply current harmonic distortion factor; it corresponds to the equality of the apparent power at the output of the active filter and the apparent power across the capacitor. Finally, the predicted results are verified experimentally on a MaxSine active power filter.
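
For reference, a minimal sketch of Modulus Optimum PI tuning for a plant with one dominant and one small time constant, K/((1+sT)(1+sT_sigma)), is given below. The numeric values and function name are placeholders and are not taken from the paper.

```python
# Illustrative Modulus-Optimum PI tuning for a plant K/((1+s*T)(1+s*T_sigma)),
# where T is the dominant and T_sigma the small (parasitic) time constant.
# The numeric values below are placeholder assumptions.

def modulus_optimum_pi(K, T, T_sigma):
    """Return (Kp, Ti): the PI zero cancels the dominant pole and the gain is
    chosen so the closed loop approximates the Modulus-Optimum response."""
    Ti = T                            # integrator time constant cancels 1 + s*T
    Kp = T / (2.0 * K * T_sigma)      # standard Modulus-Optimum gain choice
    return Kp, Ti

if __name__ == "__main__":
    Kp, Ti = modulus_optimum_pi(K=2.0, T=0.05, T_sigma=0.001)
    print(f"Kp = {Kp:.2f}, Ti = {Ti * 1e3:.1f} ms")
```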

Design for Manufacturability and Concurrent Engineering for Product Development

In the 1980s, companies began to feel the effect of three major influences on their product development: newer and innovative technologies, increasing product complexity, and larger organizations. Companies were therefore forced to look for new product development methods. This paper focuses on two of these methods: Design for Manufacturability (DFM) and Concurrent Engineering (CE). The aim is to examine and analyze these product development methods, through which companies can benefit by shortening the product life cycle, reducing cost, and meeting delivery schedules. The paper also presents simplified models that can be adapted and used by different companies according to their objectives and requirements. The research methodology is based on case studies: two companies were selected and their product development processes analysed. Historical data were collected and interviews conducted at these companies; in addition, a survey of the literature and previous research on similar topics was carried out. The paper also presents a cost-benefit analysis of implementation and estimates the implementation time. The research found that the two companies did not meet the delivery times promised to their customers: for some of the most frequently ordered products, 50% to 80% were not delivered on time. The companies follow the traditional, sequential design-then-production approach to product development, which strongly affects time to market. The case studies indicate that, by implementing these new methods and by forming multidisciplinary teams for design and quality inspection, a company can reduce the workflow from 40 steps to 30.

Forecasting Enrollment Model Based on First-Order Fuzzy Time Series

This paper proposes a novel improvement of a forecasting approach based on time-invariant fuzzy time series. In contrast to traditional forecasting methods, fuzzy time series can also be applied to problems in which the historical data are linguistic values. It is shown that the proposed time-invariant method improves the performance of the forecasting process. The effect of using different numbers of fuzzy sets is tested as well. As in most of the cited papers, the historical enrollment of the University of Alabama is used to illustrate the forecasting process. The performance of the proposed method is then compared with that of existing time-invariant fuzzy time series models in terms of forecasting accuracy. The comparison reveals a certain performance superiority of the proposed method over the methods described in the literature.
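
For orientation, the sketch below implements a Chen-style first-order fuzzy time series forecast (fuzzification into intervals, first-order fuzzy logical relationship groups, defuzzification by interval midpoints). It illustrates the general approach only; the paper's time-invariant model differs in its details, and the interval count, margin, and example values below are illustrative assumptions.

```python
# Minimal Chen-style first-order fuzzy time series forecast (illustrative;
# the paper's time-invariant model and its refinements are not reproduced).
import numpy as np

def fuzzy_forecast(series, n_sets=7, margin=200):
    lo, hi = min(series) - margin, max(series) + margin
    edges = np.linspace(lo, hi, n_sets + 1)
    mids = 0.5 * (edges[:-1] + edges[1:])

    # Fuzzify: assign each observation to the interval (fuzzy set) it falls in.
    labels = np.clip(np.searchsorted(edges, series, side="right") - 1,
                     0, n_sets - 1)

    # First-order fuzzy logical relationship groups A_i -> {A_j, ...}.
    groups = {}
    for a, b in zip(labels[:-1], labels[1:]):
        groups.setdefault(a, set()).add(b)

    # Forecast rule: average of the midpoints of the consequent sets,
    # or the antecedent's own midpoint if it never appeared as an antecedent.
    forecasts = []
    for a in labels[:-1]:
        rhs = groups.get(a)
        forecasts.append(np.mean([mids[j] for j in rhs]) if rhs else mids[a])
    return forecasts  # forecasts for t = 2 .. len(series)

# Example with a short, enrollment-like series (illustrative values only):
enrolls = [13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861, 16807]
print([round(f) for f in fuzzy_forecast(enrolls)])
```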

Auction Theory: Bidder's Perspective in an English Auction Environment

This paper provides an overview of the auction theory literature. We present a general review of the literature on various auction formats and focus specifically on the English auction. We are interested in modelling a bidder's behavior in an English auction environment. Hence, we present an overview of the New Zealand wool auction, followed by a model that describes a bidder's decision-making behavior in that auction. The mathematical assumptions of an English auction environment are demonstrated from the perspective of the New Zealand wool auction.

A Simple Adaptive Algorithm for Norm-Constrained Optimization

In this paper, we propose a simple adaptive algorithm that iteratively solves the unit-norm-constrained optimization problem. Instead of the conventional normalization by the parameter-vector norm, the proposed algorithm incorporates a scalar normalization, which is computationally much simpler. A stationary-point analysis is presented to show that the proposed algorithm indeed solves the constrained optimization problem. The simulation results illustrate that the proposed algorithm performs as well as conventional ones while being computationally simpler.
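
The contrast between norm-based and scalar normalization can be illustrated on a toy problem: adaptively extracting the dominant eigenvector of a covariance matrix under a unit-norm constraint. The sketch below compares explicit renormalization by the vector norm with Oja's rule, whose scalar correction term keeps the norm close to one. This is an illustration of the general idea only, not the algorithm proposed in the paper, and the matrix and step size are arbitrary placeholders.

```python
# Toy illustration: dominant eigenvector of a covariance matrix under a
# unit-norm constraint. The "conventional" update renormalizes by the full
# parameter norm; Oja's rule uses only a scalar correction term.
import numpy as np

rng = np.random.default_rng(0)
G = rng.normal(size=(5, 5))
R = G @ G.T                          # symmetric PSD "covariance" matrix

mu, n_iter = 0.01, 2000
w_norm = rng.normal(size=5)          # conventional, norm-normalized iterate
w_oja = w_norm.copy()                # scalar-normalized (Oja) iterate

for _ in range(n_iter):
    # Conventional: gradient step, then explicit division by the vector norm.
    w_norm = w_norm + mu * R @ w_norm
    w_norm /= np.linalg.norm(w_norm)

    # Oja-style: the scalar (w^T R w) keeps the norm near one without
    # computing the full vector norm at every step.
    w_oja = w_oja + mu * (R @ w_oja - (w_oja @ R @ w_oja) * w_oja)

print(np.linalg.norm(w_oja), abs(w_norm @ w_oja))  # both close to 1
```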

Optimal Location of Multi-Type FACTS Devices for Multiple Contingencies Using Particle Swarm Optimization

In a deregulated operating regime, with generation and transmission unbundled, power system security is an issue that requires careful attention from researchers. Electric power systems are exposed to various contingencies, which often contribute to branch overloading and voltage violations and can lead to security and stability problems. To maintain system security, it is desirable to estimate the effect of contingencies so that pertinent control measures can be taken to improve security. This paper presents the application of a particle swarm optimization algorithm to find the optimal locations of multi-type FACTS devices in a power system in order to eliminate or alleviate line overloads. The optimization is performed over the locations of the devices, their types, their settings, and the installation cost of the FACTS devices, for single and multiple contingencies. TCSC, SVC, and UPFC devices are considered and modeled for steady-state analysis. Suitable locations for the UPFC and TCSC are selected on the basis of improved system security. The effectiveness of the proposed method is tested on the IEEE 6-bus and IEEE 30-bus test systems.
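
A generic particle swarm optimization skeleton of the kind used here is sketched below. In the paper, the fitness function would encode line overloads, device settings, and installation cost over the considered contingencies; a simple placeholder function is used instead, and all parameter values are illustrative assumptions.

```python
# Generic particle swarm optimization skeleton (the paper's security/cost
# fitness over contingencies is replaced by a placeholder function here).
import numpy as np

def pso(fitness, dim, bounds, n_particles=30, n_iter=100,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))      # particle positions
    v = np.zeros_like(x)                                  # particle velocities
    pbest = x.copy()                                      # personal bests
    pbest_f = np.array([fitness(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)].copy()              # global best

    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([fitness(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()

# Placeholder fitness: a quadratic bowl stands in for the security/cost index.
best, val = pso(lambda p: np.sum((p - 1.0) ** 2), dim=4, bounds=(-5.0, 5.0))
print(best, val)
```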

The Performance of Character Access in the Checking Phase of String Searching Algorithms

A new algorithm called Character-Comparison to Character-Access (CCCA) is developed to test the effect of two factors on the performance of the checking operation in string searching: 1) converting character comparison and number comparison into character access, and 2) the starting point of checking. An experiment is performed, and the results are compared with five algorithms, namely Naive, BM, Inf_Suf_Pref, Raita, and Circle. With the CCCA algorithm, the results suggest that the average number of comparisons is improved by up to 74.0%. Furthermore, the results suggest that the clock time is reduced by 28% to 68% relative to the other algorithms.

Leadership Branding for Sustainable Customer Engagement

The purpose of this paper is to examine the interrelationships among various leadership branding constructs of entrepreneurs in small and medium-sized enterprises (SMEs). We employ quantitative structural equation modeling with a new leadership branding engagement model comprising the constructs of the leader's or entrepreneur's personality, branding practice, and customer engagement. The results confirm that there are significant relationships between the three constructs, and the major fit indices indicate that the data fit the proposed model. The findings provide insights and fill a gap in the literature on a statistically validated representation of leadership branding for SMEs across the new economic regions of Malaysia, which may be relevant to other economic zones in similar situations. This study extends the leadership branding engagement model with a new mechanism that uses the leader's personality as a predictor of branding practice and customer engagement performance.

Some New Upper Bounds for the Spectral Radius of Iterative Matrices

In this paper, we present some new upper bounds for the spectral radius of iterative matrices based on the concept of a doubly α-diagonally dominant matrix. Subsequently, we give two examples to show that our results are better than the earlier ones.
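
For reference, one common formulation of doubly α-diagonal dominance found in the literature is given below; the paper's exact definition may differ in detail.

```latex
% Row and column off-diagonal sums of A = (a_{ij}):
\[
  \Lambda_i(A) = \sum_{k \neq i} |a_{ik}|, \qquad
  S_i(A)      = \sum_{k \neq i} |a_{ki}| .
\]
% One common formulation: A is doubly \alpha-diagonally dominant,
% for \alpha \in [0,1], if
\[
  |a_{ii}|\,|a_{jj}| \;\ge\;
  \bigl[\alpha \Lambda_i(A) + (1-\alpha) S_i(A)\bigr]
  \bigl[\alpha \Lambda_j(A) + (1-\alpha) S_j(A)\bigr],
  \qquad i \neq j .
\]
```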

Analysis of the Temperature-Dependent Thermal Conductivity Effect in the Numerical Modeling of Fin-Tube Radiators: Introduction of a New Method

In all heat-related industries, suitable thermal operating ranges are defined for each device. Respecting these limits requires a thermal control unit alongside the main system. The satellite thermal control unit exploits different methods and facilities, individually or in combination. To enhance heat transfer between the primary surface and the environment, radiating extended surfaces are commonly used. Especially for large temperature differences, variable thermal conductivity has a strong effect on the performance of such a surface. In most of the literature, thermophysical properties such as thermal conductivity are assumed constant. However, some recent studies consider the variation of these parameters, which is helpful for evaluating the fin's temperature distribution under relatively large temperature differences. A new method is introduced to evaluate temperature-dependent thermal conductivity values. The finite volume method is employed to simulate numerically the temperature distribution in a space radiating fin. The present model is applied to an aluminium fin and compared with a previous method. The present results are also compared with those of two other analytical methods, and good agreement is shown.
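
The sketch below illustrates the kind of finite volume iteration involved, for a one-dimensional radiating fin with temperature-dependent conductivity k(T) = k0(1 + βT). The geometry, material data, boundary conditions, and mesh are illustrative assumptions, not values from the paper.

```python
# Illustrative finite-volume iteration for a 1-D radiating fin with
# k(T) = k0*(1 + beta*T). All numbers are placeholder assumptions.
import numpy as np

L, t = 0.1, 0.002            # fin length and thickness [m]
k0, beta = 200.0, -5e-4      # conductivity law k(T) = k0*(1 + beta*T)
eps, sigma = 0.8, 5.67e-8    # emissivity and Stefan-Boltzmann constant
T_base, T_sur = 400.0, 4.0   # base and surroundings temperature [K]

N = 50
dx = L / N
T = np.full(N, T_base)       # cell-centred unknowns, initial guess

def k(temp):                 # temperature-dependent conductivity
    return k0 * (1.0 + beta * temp)

for sweep in range(5000):    # Gauss-Seidel sweeps with lagged coefficients
    T_old = T.copy()
    for i in range(N):
        if i == 0:           # west face is the fin base (Dirichlet)
            kw, TW = k(0.5 * (T_base + T[i])), T_base
            aW = 2.0 * kw / dx**2
        else:
            kw, TW = k(0.5 * (T[i - 1] + T[i])), T[i - 1]
            aW = kw / dx**2
        if i == N - 1:       # east face is the insulated tip (zero flux)
            aE, TE = 0.0, 0.0
        else:
            ke, TE = k(0.5 * (T[i + 1] + T[i])), T[i + 1]
            aE = ke / dx**2
        # Radiation sink (eps*sigma/t)*(T^4 - T_sur^4), linearised about T[i]
        h = eps * sigma * (T[i] ** 2 + T_sur ** 2) * (T[i] + T_sur) / t
        T[i] = (aW * TW + aE * TE + h * T_sur) / (aW + aE + h)
    if np.max(np.abs(T - T_old)) < 1e-6:
        break

print(f"converged in {sweep} sweeps, tip temperature = {T[-1]:.1f} K")
```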

The Situation in the Public Procurement Market in Post-Communist Countries: The Case of the Czech Republic

Public procurement is one of the most important areas of the public sector, and one that opens up possibilities for corruption. Due to the volume of funds allocated through this institution (in EU countries between 10% and 15% of GDP), it has very serious implications for the efficiency of public expenditure and for overall economic efficiency. Indicators usually used to measure corruption (such as the Corruption Perceptions Index, CPI) show that the worst situation is found in post-communist and Mediterranean countries. This paper uses the Czech Republic as an example of a post-communist country and analyses the factors that influence the scope of corruption in public procurement. Moreover, the paper discusses indicators that could point to inefficiency in the public procurement market. The results show that post-communist states use the institution of public contracts significantly more than the old member countries of continental Europe, an important finding because it leaves more room for corruption. Furthermore, the inefficient functioning of the public procurement market is clearly manifested in the low number of bids, the low level of market transparency, and an ineffective control system. Some of the observed indicators are statistically significantly correlated with the CPI.

Feature Subset Selection Approach Based on Maximizing the Margin of a Support Vector Classifier

Identifying cancer genes that might anticipate clinical behavior across different types of cancer is challenging due to the huge number of genes and the small number of patient samples. A new method is proposed based on supervised classification learning with support vector machines (SVMs). The solution introduces the maximized margin (MM) into the subset-selection criterion, which allows the generalization error rate to approach its minimum. In class prediction problems, gene selection is essential both to improve accuracy and to identify the genes relevant to the disease. The performance of the new method was evaluated in experiments on real-world data, and it yields better classification accuracy.
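
The sketch below shows margin-oriented gene selection with a linear SVM in the spirit of SVM-RFE, where genes with the smallest weight magnitudes are discarded iteratively. This is an illustration of the general idea, not the authors' exact maximized-margin criterion; the function name, parameters, and random data are placeholders.

```python
# Sketch of margin-oriented gene (feature) selection with a linear SVM,
# in the spirit of SVM-RFE (not the paper's exact MM criterion).
import numpy as np
from sklearn.svm import SVC

def select_genes(X, y, n_keep=20, drop_frac=0.1):
    """Iteratively drop the genes with the smallest |w| in a linear SVM."""
    remaining = np.arange(X.shape[1])
    while remaining.size > n_keep:
        clf = SVC(kernel="linear", C=1.0).fit(X[:, remaining], y)
        w = np.abs(clf.coef_).ravel()           # weight magnitude per gene
        n_drop = max(1, int(drop_frac * remaining.size))
        keep = np.argsort(w)[n_drop:]           # discard the smallest weights
        remaining = remaining[np.sort(keep)]
    return remaining

# Usage with random placeholder data standing in for a gene-expression matrix:
X = np.random.default_rng(0).normal(size=(40, 500))   # 40 samples, 500 genes
y = np.r_[np.zeros(20), np.ones(20)]                   # two classes
print(select_genes(X, y, n_keep=10))
```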

A Projection Method Based on Extended Krylov Subspaces for Solving Sylvester Equations

In this paper, we study numerical methods for solving Sylvester matrix equations of the form AX + XB^T + CD^T = 0. A new projection method is proposed. The union of the Krylov subspaces in A and its inverse and the union of the Krylov subspaces in B and its inverse are used as the right and left projection subspaces, respectively. The Arnoldi-like process for constructing the orthonormal bases of the projection subspaces is outlined. We show that the approximate solution is an exact solution of a perturbed Sylvester matrix equation. Moreover, an exact expression for the norm of the residual is derived, and results on finite termination and convergence are presented. Some numerical examples are presented to illustrate the effectiveness of the proposed method.
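
Schematically, such projection methods compute a low-rank approximation X_m = V Y W^T, where V and W are orthonormal bases of the right and left (extended Krylov) projection subspaces, and Y solves the small projected equation below. This is a generic Galerkin formulation given for orientation; the paper's exact construction may differ in its details.

```latex
% Generic Galerkin projection for A X + X B^T + C D^T = 0:
% with X_m = V Y W^T, the small unknown Y satisfies
\[
  (V^{T} A V)\, Y \;+\; Y\, (W^{T} B W)^{T} \;+\; (V^{T} C)(W^{T} D)^{T} \;=\; 0 .
\]
```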

Towards a New Methodology for Developing Web-Based Systems

Web-based systems have become increasingly important because the Internet and the World Wide Web have become ubiquitous, surpassing all other technological developments in our history. The Internet, and especially company websites, have rapidly evolved in scope and extent of use, from little more than fixed advertising material, i.e. a "web presence" with no particular influence on the company's business, to one of the most essential parts of the company's core business. Traditional software engineering approaches, with process models such as CMM and the waterfall model, do not work very well, since web system development differs from traditional development. The development differs in several ways: for example, there is a large gap between traditional software engineering designs and concepts and the low-level implementation model, and many web-based system development activities are business-oriented rather than engineering-oriented (for example, web applications are sales-oriented, and web applications and intranets are content-oriented). This paper introduces the Increment Iterative extreme Programming (IIXP) methodology for developing web-based systems. In contrast to existing methodologies, it is a combination of traditional and modern software engineering and web engineering principles.

Impact of the Decoder Connection Schemes on Iterative Decoding of GPCB Codes

In this paper, we present a study of the impact of connection schemes on the performance of iterative decoding of Generalized Parallel Concatenated Block (GPCB) codes constructed from one-step majority-logic decodable (OSMLD) codes, and we propose a new connection scheme for decoding them. All of the iterative decoding connection schemes use a soft-input soft-output threshold decoding algorithm as the component decoder. Numerical results for GPCB codes transmitted over the additive white Gaussian noise (AWGN) channel are provided. They show that the proposed scheme is better than Hagenauer's scheme and Lucas's scheme [1] and slightly better than Pyndiah's scheme.

Comparison of the Existing Methods in Determination of the Characteristic Polynomial

This paper presents a comparison among methods for determining the coefficients of the characteristic polynomial. First, the resulting systems are compared on the basis of frequency-domain criteria such as closed-loop bandwidth and gain and phase margins. Then, the step responses of the resulting systems are compared on the basis of transient-behavior criteria, including overshoot, rise time, settling time, and error (via the IAE, ITAE, ISE, and ITSE integral indices). The relative stability of the systems is also compared. Finally, the best choices with respect to these diverse criteria are presented.
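
The integral error indices referred to above are the standard ones, defined on the control error e(t):

```latex
\[
  \mathrm{IAE}  = \int_0^{\infty} |e(t)|\,dt, \qquad
  \mathrm{ISE}  = \int_0^{\infty} e^{2}(t)\,dt, \qquad
  \mathrm{ITAE} = \int_0^{\infty} t\,|e(t)|\,dt, \qquad
  \mathrm{ITSE} = \int_0^{\infty} t\,e^{2}(t)\,dt .
\]
```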

School Age and Building Defects: Analysis Using Condition Survey Protocol (CSP) 1 Matrix

Building condition assessment is a critical activity in Malaysia's Comprehensive Asset Management Model. It is closely related to building performance, which impacts users' lives and decision making. This study focuses on public primary schools, one of the most valuable assets of the country. The assessment was carried out based on the CSP1 Matrix in the Kuching Division of Sarawak, Malaysia. Using this matrix, three main criteria of the buildings were evaluated: the number of defects, the school ratings, and the total school rating. The analysis of 24 schools identified a total of 4,725 defects. The overall score obtained was 45,868, and the overall rating was 9.71, which corresponds to fair condition. These results were then related to building age to evaluate its impact on school building condition. The findings show that building condition is closely related to building age, supporting the theory that an ageing building has more defects than a new one.