A High Quality Speech Coder at 600 bps

This paper presents a vocoder that obtains high quality synthetic speech at 600 bps. To reduce the bit rate, the algorithm is based on a sinusoidally excited linear prediction model that extracts few coding parameters; three consecutive frames are grouped into a superframe and jointly vector quantized to obtain high coding efficiency. The inter-frame redundancy is exploited with distinct quantization schemes for the different unvoiced/voiced frame combinations in the superframe. Experimental results show that the quality of the proposed coder is better than that of 2.4 kbps LPC10e, is approximately the same as that of 2.4 kbps MELP, and is highly robust.
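
As a rough illustration of the joint quantization step, the sketch below performs a nearest-neighbor search of a superframe's stacked parameters against a joint codebook. The codebook size, parameter dimension, and plain squared-error criterion are assumptions for illustration, not the paper's actual bit allocation or distortion measure.

```python
import numpy as np

# Minimal sketch of joint vector quantization over a 3-frame superframe.
# Codebook size, parameter dimension, and the squared-error criterion
# are illustrative assumptions, not the paper's actual design.

rng = np.random.default_rng(0)
FRAMES, DIM = 3, 10                                   # e.g. 10 LSFs per frame
codebook = rng.standard_normal((256, FRAMES * DIM))   # 8-bit joint codebook

def quantize_superframe(params):
    """params: (3, 10) array of per-frame parameters -> (index, quantized)."""
    target = params.reshape(-1)                       # stack the 3 frames jointly
    dist = np.sum((codebook - target) ** 2, axis=1)   # exhaustive codebook search
    idx = int(np.argmin(dist))
    return idx, codebook[idx].reshape(FRAMES, DIM)

idx, quantized = quantize_superframe(rng.standard_normal((FRAMES, DIM)))
```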

Effective Keyword and Similarity Thresholds for the Discovery of Themes from User Web Access Patterns

Clustering techniques have been used by many intelligent software agents to group similar access patterns of Web users into high-level themes that express users' intentions and interests. However, such techniques have mostly focused on one salient feature of the Web documents visited by the user, namely the extracted keywords. The major aim of these techniques is to arrive at an optimal threshold for the number of keywords needed to produce more focused themes. In this paper we consider both keyword and similarity thresholds to generate more concentrated themes, and hence build a sounder model of user behavior. The purpose of this paper is twofold: to use distance-based clustering methods to recognize overall themes from the proxy log file, and to suggest efficient cut-off levels for the keyword and similarity thresholds that tend to produce clusters with better focus and appropriate size.
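
To make the two thresholds concrete, the following sketch clusters pages by their top-K keywords, admitting a page into a theme only above a similarity cut-off. The use of Jaccard similarity and the particular values of K and the threshold are illustrative assumptions; the paper's tuned cut-off levels are not reproduced.

```python
from collections import Counter

# Threshold-based theme discovery: a page joins the closest theme only if
# its similarity exceeds SIM_THRESHOLD; otherwise it seeds a new theme.
# K, the threshold value, and Jaccard similarity are illustrative choices.

K, SIM_THRESHOLD = 5, 0.3

def top_keywords(tokens, k=K):
    return {w for w, _ in Counter(tokens).most_common(k)}

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def cluster(pages):
    themes = []                          # each theme is a set of keywords
    for tokens in pages:
        kw = top_keywords(tokens)
        best = max(themes, key=lambda t: jaccard(t, kw), default=None)
        if best is not None and jaccard(best, kw) >= SIM_THRESHOLD:
            best |= kw                   # absorb page into the closest theme
        else:
            themes.append(kw)            # start a new theme
    return themes
```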

Accurate Crosstalk Analysis for RLC On-Chip VLSI Interconnect

This work proposes an accurate crosstalk noise estimation method in the presence of multiple RLC lines for use in design automation tools. The method correctly models the loading effects of non-switching aggressors and aggressor tree branches using the resistive shielding effect and realistic exponential input waveforms. Expressions for the noise peak and width have been derived. The results obtained are in good agreement with SPICE: the average error is 4.7% for the noise peak and 6.15% for the width, while allowing very fast analysis.

A Comparison of Some Spline-Based Methods for the One-Dimensional Heat Equation

In this paper, collocation methods based on the cubic B-spline and the extended cubic uniform B-spline are considered for solving the one-dimensional heat equation with a nonlocal initial condition. A finite difference scheme and a θ-weighted scheme are used for the time and space discretization, respectively. The stability of the method is analyzed by the von Neumann method. The accuracy of the methods is illustrated with an example, and the numerical results are compared with the analytical solutions.
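
For orientation, the θ-weighted time stepping for the generic heat equation u_t = α u_xx takes the following standard form (the paper's nonlocal initial condition and the specific B-spline collocation matrices are not reproduced):

```latex
% theta-weighted time stepping for u_t = alpha u_xx:
\[
\frac{u_j^{n+1} - u_j^{n}}{\Delta t}
  = \alpha \left[ \theta\,(u_{xx})_j^{n+1} + (1-\theta)\,(u_{xx})_j^{n} \right],
\qquad 0 \le \theta \le 1,
\]
% theta = 0: explicit; theta = 1: fully implicit; theta = 1/2: Crank-Nicolson.
% In the collocation setting, (u_xx) is evaluated from the (extended)
% cubic B-spline basis functions at the collocation points.
```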

MATLAB/SIMULINK Based Model of Single-Machine Infinite-Bus with TCSC for Stability Studies and Tuning Employing GA

Given constraints on data availability, it is adequate for power system stability studies to model the synchronous generator with the field circuit and one equivalent damper winding on the q-axis, known as model 1.1. This paper presents a systematic procedure for modelling and simulation of a single-machine infinite-bus power system installed with a thyristor controlled series compensator (TCSC), where the synchronous generator is represented by model 1.1, so that the impact of the TCSC on power system stability can be more reasonably evaluated. The model of the example power system is developed using MATLAB/SIMULINK and can be used for teaching power system stability phenomena, as well as for research, especially for developing generator controllers using advanced technologies. Further, the parameters of the TCSC controller are optimized using a genetic algorithm. Non-linear simulation results are presented to validate the effectiveness of the proposed approach.
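
The sketch below illustrates the GA tuning loop in schematic form. In the paper the fitness of a parameter set comes from the nonlinear SIMULINK simulation; here a hypothetical quadratic error index, invented parameter bounds, and simple truncation selection with arithmetic crossover stand in for it.

```python
import numpy as np

# Illustrative GA loop for tuning TCSC controller parameters. The fitness
# below is a placeholder for an error index (e.g. integral of squared
# error) returned by the nonlinear simulation; smaller is better.

rng = np.random.default_rng(1)
BOUNDS = np.array([[0.1, 50.0],    # e.g. controller gain K (assumed)
                   [0.01, 1.0]])   # e.g. lead time constant T1 (assumed)

def fitness(params):
    k, t1 = params                 # hypothetical stand-in for the simulation
    return (k - 20.0) ** 2 + 100 * (t1 - 0.3) ** 2

def ga(pop_size=30, generations=50, mutation=0.1):
    lo, hi = BOUNDS[:, 0], BOUNDS[:, 1]
    pop = rng.uniform(lo, hi, size=(pop_size, 2))
    for _ in range(generations):
        scores = np.apply_along_axis(fitness, 1, pop)
        parents = pop[np.argsort(scores)[: pop_size // 2]]   # truncation selection
        pairs = parents[rng.integers(len(parents), size=(pop_size, 2))]
        offspring = pairs.mean(axis=1)                       # arithmetic crossover
        offspring += mutation * rng.standard_normal(offspring.shape) * (hi - lo)
        pop = np.clip(offspring, lo, hi)
    return pop[np.argmin(np.apply_along_axis(fitness, 1, pop))]

best_params = ga()
```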

Vibration-Based Identification of Impact Force Using Genetic Algorithm

This paper presents the identification of an impact force acting on a simply supported beam. Force identification is an inverse problem in which the measured response of the structure is used to determine the applied force. The identification problem is formulated as an optimization problem, and a genetic algorithm is utilized to solve it. The objective function is computed from the difference between the analytical and measured responses, and the decision variables are the location and magnitude of the applied force. Simulation results show the effectiveness of the approach and its robustness with respect to measurement noise and sensor location.
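
A minimal sketch of this inverse formulation follows. The static deflection of a simply supported beam under a point load stands in for the paper's dynamic impact response, and SciPy's differential evolution (a related evolutionary method) stands in for the genetic algorithm; both substitutions, and all numeric values, are assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Recover the location a and magnitude F of a point load on a simply
# supported beam from "measured" deflections at a few sensors.

L, EI = 1.0, 1.0                       # beam length and stiffness (normalized)
sensors = np.array([0.2, 0.5, 0.8])    # sensor positions (assumed)

def deflection(x, a, F):
    # Classic closed-form deflection of a simply supported beam under a
    # point load F at position a (piecewise in x).
    b = L - a
    left = F * b * x * (L**2 - b**2 - x**2) / (6 * L * EI)
    right = F * a * (L - x) * (2 * L * x - a**2 - x**2) / (6 * L * EI)
    return np.where(x <= a, left, right)

# Synthetic measurement: true load at a = 0.4, F = 2.0, plus sensor noise.
rng = np.random.default_rng(2)
measured = deflection(sensors, 0.4, 2.0) + 1e-4 * rng.standard_normal(3)

def objective(p):
    a, F = p
    return np.sum((deflection(sensors, a, F) - measured) ** 2)

result = differential_evolution(objective, bounds=[(0.05, 0.95), (0.1, 10.0)])
a_est, F_est = result.x
```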

eLearning Tools Evaluation Based on Quality Concept Distance Computing: A Case Study

Despite the extensive use of eLearning systems, there is no consensus on a standard framework for evaluating the quality of such systems. Hence, there is only a minimal set of tools that can support this judgment and give information about the value of the course content. This paper presents two kinds of quality evaluation indicators for eLearning courses based on computing three well-known metrics: the Euclidean, Hamming and Levenshtein distances. The "distance" computation is applied to standard evaluation templates (i.e. the European Commission Programme procedures vs. the AFNOR Z 76-001 standard), determining a reference point in the evaluation of e-learning course quality against the optimal concept(s). The case study, based on the results of projects developed in the framework of the European Programme "Leonardo da Vinci" with Romanian contractors, attempts to demonstrate the benefits of such a method.
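
The three metrics themselves are standard; the sketch below computes them on toy criterion vectors and strings. The score values are invented for illustration and do not come from the AFNOR or European Commission templates.

```python
import math

# The three distances the evaluation indicators are built on, applied to
# toy quality-criterion scores measured against an "optimal" reference.

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def levenshtein(s, t):
    # Classic dynamic-programming edit distance.
    prev = list(range(len(t) + 1))
    for i, cs in enumerate(s, 1):
        cur = [i]
        for j, ct in enumerate(t, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                           prev[j - 1] + (cs != ct)))
        prev = cur
    return prev[-1]

course  = [3, 4, 2, 5]     # scores of an evaluated course (illustrative)
optimal = [5, 5, 5, 5]     # the "optimal concept" reference point
print(euclidean(course, optimal), hamming(course, optimal),
      levenshtein("ABBC", "ABCC"))
```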

A CTL Specification of Serializability for Transactions Accessing Uniform Data

Existing work in temporal logic on representing the execution of infinitely many transactions uses linear-time temporal logic (LTL) and models only two-step transactions. In this paper, we use the comparatively efficient branching-time computation tree logic (CTL) and extend the transaction model to a class of multi-step transactions by introducing distinguished propositional variables to represent the read and write steps of n multi-step transactions accessing m data items infinitely many times. We prove that the well-known correspondence between acyclicity of conflict graphs and serializability for finite schedules extends to infinite schedules. Furthermore, in the case of transactions accessing the same set of data items in (possibly) different orders, serializability corresponds to the absence of cycles of length two. This result is used to give an efficient encoding of the serializability condition into CTL.
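
The conflict-graph test on finite schedules, which the paper lifts to infinite schedules, can be sketched as follows; the schedule format is invented for illustration, the reduction to 2-cycles follows the uniform-data case described above, and the encoding into CTL itself is not reproduced.

```python
from itertools import combinations

# A schedule is a list of (transaction, action, item) steps in execution
# order. Two steps conflict when they come from different transactions,
# touch the same item, and at least one of them writes.

def conflict_edges(schedule):
    edges = set()
    for (t1, a1, x1), (t2, a2, x2) in combinations(schedule, 2):
        if t1 != t2 and x1 == x2 and "w" in (a1, a2):
            edges.add((t1, t2))          # t1's step precedes t2's
    return edges

def has_two_cycle(edges):
    # For transactions over the same item set, serializability reduces
    # to the absence of cycles of length two in the conflict graph.
    return any((b, a) in edges for (a, b) in edges)

s = [("T1", "r", "x"), ("T2", "w", "x"), ("T1", "w", "x")]
print(has_two_cycle(conflict_edges(s)))  # True: T1 -> T2 and T2 -> T1
```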

Enhancing the Energy Effectiveness of Convectors Used for Heating or Cooling

The objective of this paper is to present a research study of convectors used for heating or cooling of living rooms or industrial halls. The key points are experimental measurement and comprehensive numerical simulation of the flow passing through the components of the convector, such as the heat exchanger and the fan inlet. Based on the results obtained, the components of the convector are optimized to increase thermal power efficiency by improving heat convection or reducing aerodynamic drag. Both improvements lead to more effective service conditions and to energy savings. A significant part of the convector research is the design of a unique measurement laboratory and the adoption of suitable measurement techniques. The new laboratory makes it possible to measure thermal power efficiency and other relevant parameters under the specific service conditions of the convectors.

Adaptive Kernel Principal Component Analysis for Online Feature Extraction

Their batch nature limits standard kernel principal component analysis (KPCA) methods in numerous applications, especially for dynamic or large-scale data. In this paper, an efficient adaptive approach is presented for online extraction of the kernel principal components (KPC). The contribution of this paper is twofold. First, the kernel covariance matrix is correctly updated to adapt to the changing characteristics of the data. Second, the KPC are recursively formulated to overcome the batch nature of standard KPCA. This formulation is derived from the recursive eigen-decomposition of the kernel covariance matrix and captures the KPC variation caused by new data. The proposed method not only alleviates the sub-optimality of KPCA for non-stationary data, but also maintains constant update speed and memory usage as the data size increases. Experiments on simulated data and real applications demonstrate that our approach yields improvements in terms of both computational speed and approximation accuracy.
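
The recursive flavor of the update can be illustrated with a rank-one covariance recursion. The sketch below deliberately works in input space rather than the kernel-induced feature space, so it shows the shape of the recursion, not the paper's actual kernel-space derivation.

```python
import numpy as np

# Toy sketch: when a new sample arrives, the covariance estimate gets a
# rank-one update and the principal components are refreshed. The paper
# performs the analogous recursion in the kernel feature space.

def update(C, mean, n, x):
    """Rank-one online update of mean and (biased) covariance."""
    n1 = n + 1
    delta = x - mean
    mean = mean + delta / n1
    C = (n / n1) * C + (n / n1**2) * np.outer(delta, delta)
    return C, mean, n1

rng = np.random.default_rng(3)
d = 5
C, mean, n = np.zeros((d, d)), np.zeros(d), 0
for x in rng.standard_normal((200, d)):          # streaming data
    C, mean, n = update(C, mean, n, x)
eigvals, eigvecs = np.linalg.eigh(C)             # principal components so far
```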

WLAN Positioning Based on Joint TOA and RSS Characteristics

WLAN positioning has been addressed by many approaches in the literature using the characteristics of Received Signal Strength (RSS), Time of Arrival (TOA) or Time Difference of Arrival (TDOA), Angle of Arrival (AOA), and cell ID. Among these, the RSS approach is the simplest to implement because it requires no modification of either access points or client devices, but its accuracy suffers badly from the physical environment. For the TOA or TDOA approach, the accuracy is quite acceptable, but most studies have had to modify either software or hardware of the existing WLAN infrastructure, with the scale of modification ranging from the access card alone up to changes in the WLAN protocol. This makes TOA or TDOA unattractive for positioning systems. In this paper, a new concept of merging the RSS and TOA positioning techniques is proposed. In addition, a method is presented for obtaining the TOA characteristic for positioning a WLAN user without any extra modification to the existing system. The measurement results confirm that the proposed technique using both RSS and TOA characteristics provides better accuracy than using either RSS or TOA alone.
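
A schematic of the fusion idea: convert RSS to range via a log-distance path-loss model, convert round-trip time to range, blend the two, and solve for position by least squares. The path-loss constants, fusion weights, and AP layout are all invented for illustration; the paper's method of extracting TOA from an unmodified WLAN is not shown.

```python
import numpy as np
from scipy.optimize import least_squares

C = 3e8                       # speed of light, m/s
P0, PLE = -40.0, 3.0          # RSS at 1 m and path-loss exponent (assumed)
aps = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # AP positions (assumed)

def rss_range(rss_dbm):
    return 10 ** ((P0 - rss_dbm) / (10 * PLE))    # log-distance model

def toa_range(rtt_s):
    return C * rtt_s / 2                          # round-trip time to range

def locate(rss_dbm, rtt_s, w_rss=0.3, w_toa=0.7):
    # Blend the two range estimates, then trilaterate by least squares.
    d = w_rss * rss_range(rss_dbm) + w_toa * toa_range(rtt_s)
    residuals = lambda p: np.linalg.norm(aps - p, axis=1) - d
    return least_squares(residuals, x0=aps.mean(axis=0)).x

pos = locate(np.array([-65.0, -70.0, -58.0]),     # RSS readings (dBm)
             np.array([4.5e-8, 5.5e-8, 3.8e-8]))  # round-trip times (s)
```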

Cash Flow Optimization on Synthetic CDOs

Collateralized Debt Obligations are not as widely used nowadays as they were before the 2007 subprime crisis. Nonetheless, optimizing the cash flows associated with synthetic CDOs remains an enthralling challenge. A Gaussian-based model is used here, in which default correlation and unconditional probabilities of default are highlighted. Numerous simulations are then performed with this model for different scenarios in order to evaluate the associated cash flows given a specific number of defaults at different periods of time. Cash flows are calculated not on a single bought or sold tranche but on a combination of bought and sold tranches. Under some assumptions, the simplex algorithm gives a way to find the maximum cash flow as a function of default correlation and maturities. The Gaussian model used is not realistic in crisis situations, and the present system handles only whole tranches, not portions of a tranche. The work nevertheless provides the investor with relevant guidance on what to buy and sell, and when.
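
The two stages can be sketched as follows: a one-factor Gaussian copula simulates correlated defaults, and a linear program (solved by SciPy's LP routine, standing in for the simplex step) selects tranche positions that maximize expected simulated cash flow. All parameters, the flat premium, and the simple position bounds are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

rng = np.random.default_rng(4)
N_NAMES, N_SIMS, RHO, PD = 100, 5000, 0.3, 0.02        # made-up portfolio
attach = [(0.0, 0.03), (0.03, 0.07), (0.07, 0.15)]     # tranche boundaries

# (1) One-factor Gaussian copula: correlated latent variables -> defaults.
m = rng.standard_normal((N_SIMS, 1))                   # common factor
z = np.sqrt(RHO) * m + np.sqrt(1 - RHO) * rng.standard_normal((N_SIMS, N_NAMES))
loss = (z < norm.ppf(PD)).mean(axis=1)                 # portfolio loss fraction

def tranche_loss(loss, lo, hi):
    return np.clip(loss - lo, 0, hi - lo) / (hi - lo)

# Expected net cash flow per tranche: premium leg minus expected loss
# (flat 5% premium assumed for every tranche, purely for illustration).
exp_cf = np.array([0.05 - tranche_loss(loss, lo, hi).mean()
                   for lo, hi in attach])

# (2) Maximize exp_cf @ w over whole-tranche positions |w_i| <= 1
# (linprog minimizes, so negate the objective).
res = linprog(-exp_cf, bounds=[(-1, 1)] * len(attach))
weights = res.x
```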

On the Use of Semiconductor Detector Arrays on the COMPASS Tokamak

Semiconductor detector arrays are widely used in high-temperature plasma diagnostics. They have a fast response, which allows the observation of many processes and instabilities in tokamaks. This paper reviews several diagnostics based on semiconductor arrays, such as cameras, AXUV photodiodes (often referred to as fast "bolometers"), and detectors of both soft X-rays and visible light, recently installed on the COMPASS tokamak. Fresh results from the spring and summer campaigns of 2012 are introduced. Examples of the use of the detectors are given for plasma shape determination, fast calculation of the radiation center, two-dimensional plasma radiation tomography in different spectral ranges, observation of impurity inflow, and the investigation of MHD activity in COMPASS discharges.

Tele-Diagnosis System for Rural Thailand

Thailand's health system is challenged by a rising number of patients and a decreasing ratio of medical practitioners to patients, especially in rural areas. This may tempt inexperienced GPs to rush through the process of anamnesis, with the risk of incorrect diagnosis. Patients have to travel far to the hospital and wait a long time to present their case, and many try to cure themselves with traditional Thai medicine. Many countries are making use of the Internet for medical information gathering, distribution and storage. Telemedicine applications are a relatively new field of study in Thailand, where the ICT infrastructure has hampered the widespread use of the Internet for medical information. With recent improvements, health and technology professionals can develop novel applications and systems to advance telemedicine for the benefit of the people. Here we explore the use of telemedicine for people with health problems in rural areas of Thailand and present a Telemedicine Diagnosis System for Rural Thailand (TEDIST) for diagnosing certain conditions, which people with Internet access can use to establish contact with Community Health Centers, e.g. by mobile phone. The system uses a Web-based input method for individual patients' symptoms, which are passed to an expert system for the analysis of conditions and likely diseases. The analysis harnesses a knowledge base and a backward-chaining component to determine which health professionals should be presented with the case. Doctors have the opportunity to exchange emails or chat with the patients they are responsible for, or with other specialists. Patients' data are then stored in a Personal Health Record.
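
The inference step can be illustrated with a minimal backward-chaining sketch; the rules and symptom names below are invented for illustration and are not TEDIST's actual knowledge base.

```python
# Minimal backward-chaining inference: a goal holds if it is a reported
# fact, or if all premises of some rule concluding it can be proven.

RULES = {
    "malaria_suspected": [{"fever", "chills", "recent_mosquito_exposure"}],
    "refer_to_specialist": [{"malaria_suspected"}],
}

def prove(goal, facts, rules=RULES):
    if goal in facts:
        return True
    return any(all(prove(p, facts, rules) for p in premises)
               for premises in rules.get(goal, []))

symptoms = {"fever", "chills", "recent_mosquito_exposure"}
print(prove("refer_to_specialist", symptoms))  # True -> route case to specialist
```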

Performance of Soft Handover Algorithm in Varied Propagation Environments

CDMA cellular networks support soft handover, which guarantees the continuity of wireless services and enhanced communication quality. Cellular networks must support multimedia services under varied propagation conditions. In this paper, we show the effect of the characteristic parameters of the cellular environment on soft handover performance. We consider the path-loss exponent, the standard deviation of shadow fading, and the correlation coefficient of shadow fading as the characteristic parameters of the radio propagation environment. A very useful statistical measure for characterizing the performance of a mobile radio system is the probability of outage. Numerical results show that the above parameters have a decisive effect on the probability of outage and hence on the overall performance of the soft handover algorithm.
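
For context, with log-normal shadow fading the outage probability at distance d takes the standard form below (the paper's joint treatment of correlated shadowing between base stations is not reproduced):

```latex
% Outage probability under log-normal shadowing: received power (in dB)
% is Gaussian around the path-loss mean with standard deviation sigma.
\[
P_{\mathrm{out}}(d) = \Pr\!\left[ P_r(d) < \gamma_{th} \right]
 = Q\!\left( \frac{\overline{P}_r(d) - \gamma_{th}}{\sigma} \right),
\qquad
\overline{P}_r(d) = P_t - 10\, n \log_{10}(d/d_0)\ \text{(dB)},
\]
% where n is the path-loss exponent, gamma_th the receiver threshold,
% and Q the Gaussian tail function.
```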

Investigation of VMAT Algorithms and Dosimetry

Purpose: Planning and dosimetry of different VMAT algorithms (SmartArc, Ergo++, Autobeam) are compared with IMRT for head and neck cancer patients. Modelling was performed to rule out the causes of discrepancies between planned and delivered dose. Methods: Five HNC patients previously treated with IMRT were re-planned with SmartArc (SA), Ergo++ and Autobeam. Plans were compared with each other and against IMRT, and evaluated using DVHs for PTVs and OARs, delivery time, monitor units (MU) and dosimetric accuracy. Modelling of control point (CP) spacing, leaf-end separation and MLC/aperture shape was performed to rule out causes of discrepancies between planned and delivered doses. Additionally, estimated arc delivery times, overall plan generation times, and the effect of CP spacing and number of arcs on plan generation times were recorded. Results: Single-arc SmartArc plans (SA4d) were generally better than IMRT and double-arc plans (SA2Arcs) in terms of homogeneity and target coverage. Double-arc plans seemed to have a positive role in achieving an improved Conformity Index (CI) and better sparing of some organs at risk (OARs) compared with step-and-shoot IMRT (ss-IMRT) and SA4d. Overall, Ergo++ plans achieved the best CI for both PTVs. Dosimetric validation of all VMAT plans without modelling was found to be lower than for ss-IMRT. Total MUs required for delivery were on average 19%, 30%, 10.6% and 6.5% lower than ss-IMRT for the SA4d, SA2d (single arc with 2° gantry spacing), SA2Arcs and Autobeam plans, respectively. Autobeam was most efficient in terms of actual treatment delivery times, whereas Ergo++ plans took longest to deliver. Conclusion: Overall, SA single-arc plans on average achieved the best target coverage and homogeneity for both PTVs. SA2Arc plans showed improved CI and sparing of some OARs. Very good dosimetric results were achieved with modelling. Ergo++ plans achieved the best CI. Autobeam resulted in the fastest treatment delivery times.

An Interactive Tool for Teaching and Learning English at Upper Primary Level for Mauritius

E-learning refers to the specific kind of learning experienced within the domain of educational technology, which can be used in or out of the classroom. In this paper, we give an overview of 'An Innovative Interactive and Online English Platform for Upper Primary Students', an interactive web-based application that will serve as an aid to primary school students in Mauritius. The objectives of this platform are to offer quality learning resources for English at the upper primary level of education, to encourage self-learning, and hence to promote e-learning. The platform consists of several interesting features, for example the English Verb Conjugation tool, the Negative Form tool, the Interrogative Form tool and the Cloze Test Generator. This learning platform will thus be useful at a time when our country is looking for an alternative to private tuition and also looking forward to increasing the pass rate.

High Precision Draw Bending of Asymmetric Channel Section with Restriction Dies and Axial Tension

In recent years, aluminum alloy stock with asymmetric cross-sections has found increasing use in various industrial manufacturing areas, such as general structures and automotive components. In these areas, components are generally required to have complex curved configurations and, as such, a bending process is required during manufacture. Undesirable deformations such as flattening or wrinkling can easily occur when thin-walled sections are bent, so a thorough understanding of the bending behavior of such sections is needed to prevent them. In this study, the bending behavior of an asymmetric channel section was examined using finite element analysis (FEA). Typical methods of preventing undesirable deformation, such as asymmetric laminated elastic mandrels, were included in the FEA model of draw bending. Additionally, axial tension was applied to prevent wrinkling. Through the FE simulations, the effect of the restriction dies and axial tension on undesirable deformation during the process was clarified.

An Improved ICI Self-Cancellation Scheme for Multi-Carrier Communication Systems

Orthogonal frequency division multiplexing (OFDM) is a suitable modulation scheme for broadband wireless mobile communication systems. The frequency offset between the transmitter and receiver local oscillators is the main drawback of OFDM systems; it causes intercarrier interference (ICI) among the subcarriers and degrades the bit error rate (BER) performance of the system. In this paper, an improved ICI self-cancellation scheme, based on the discrete Fourier transform and inverse discrete Fourier transform (DFT-IDFT), is proposed to improve the system performance. The simulation results show a satisfactory improvement in the BER performance of the proposed scheme.
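
For context, the classical adjacent-pair self-cancellation mapping that such schemes build on can be sketched as follows; the paper's DFT-IDFT based improvement itself is not reproduced here.

```python
import numpy as np

# Classical ICI self-cancellation: each data symbol d_k is transmitted on
# a pair of adjacent subcarriers as (d_k, -d_k), so the ICI contributions
# of the two carriers largely cancel when the pair is combined.

N = 64                                   # subcarriers (illustrative)
rng = np.random.default_rng(5)
data = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=N // 2)  # QPSK

def sc_modulate(d):
    x = np.empty(2 * d.size, dtype=complex)
    x[0::2], x[1::2] = d, -d             # map d_k -> (d_k, -d_k)
    return np.fft.ifft(x)                # OFDM modulation (IDFT)

def sc_demodulate(y):
    X = np.fft.fft(y)                    # DFT at the receiver
    return (X[0::2] - X[1::2]) / 2       # combine each pair

tx = sc_modulate(data)
eps = 0.1                                # normalized carrier frequency offset
n = np.arange(tx.size)
rx = tx * np.exp(2j * np.pi * eps * n / tx.size)   # model the CFO
recovered = sc_demodulate(rx)            # close to `data` despite the offset
```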

A Cell-Centered Diffusion Finite Volume Scheme and Its Application to Magnetic Flux Compression Generators

A cell-centered finite volume scheme for discretizing diffusion operators on distorted quadrilateral meshes has recently been designed and added to APMFCG to enable that code to be used as a tool for studying explosive magnetic flux compression generators. This paper describes this scheme. Comparisons with analytic results for 2-D test cases are presented, as well as 2-D results from a test of a "realistic" generator configuration.
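
For orientation, a generic cell-centered finite-volume discretization of the diffusion operator has the following form; the paper's specific face-flux construction on distorted quadrilaterals, which must correct the two-point flux for non-orthogonality, is not reproduced.

```latex
% Integrating div(D grad u) over a cell Omega_i and applying the
% divergence theorem converts the operator into a sum of face fluxes:
\[
\int_{\Omega_i} \nabla \cdot \left( D\, \nabla u \right) dV
  = \sum_{f \in \partial\Omega_i} \int_{f} D\, \nabla u \cdot \mathbf{n}_f \, dA
  \approx \sum_{f} D_f \, \frac{u_{j(f)} - u_i}{\delta_f} \, A_f ,
\]
% where j(f) is the neighboring cell across face f, delta_f a
% cell-center distance, and A_f the face area; on distorted meshes the
% two-point flux above requires a non-orthogonality correction.
```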