Complexity of Mathematical Expressions in Adaptive Multimodal Multimedia System Ensuring Access to Mathematics for Visually Impaired Users

Our adaptive multimodal system aims at correctly presenting a mathematical expression to visually impaired users. Given an interaction context (i.e. the combination of user, environment and system resources), the complexity of the expression itself and the user's preferences, the suitability scores of the different presentation formats are calculated. Unlike current state-of-the-art solutions, our approach takes the user's situation into account and does not impose a solution that is unsuited to his or her context and capacities. In this work, we present our methodology for calculating the complexity of a mathematical expression and the results of our experiment. Finally, this paper discusses the concepts and principles applied in our system as well as their validation through case studies. This work is our original contribution to ongoing research on making informatics more accessible to handicapped users.

Fusion of ETM+ Multispectral and Panchromatic Texture for Remote Sensing Classification

This paper proposes to use ETM+ multispectral data and the panchromatic band, as well as texture features derived from the panchromatic band, for land cover classification. Four texture features, including one 'internal texture' and three GLCM-based textures, namely correlation, entropy, and inverse difference moment, were used in combination with the ETM+ multispectral data. Two data sets involving combinations of the multispectral data, the panchromatic band and its textures were used, and the results were compared with those obtained by using the multispectral data alone. A decision tree classifier, with and without boosting, was used to classify the different datasets. Results from this study suggest that the dataset consisting of the panchromatic band, four of its texture features and the multispectral data was able to increase the classification accuracy by about 2%. In comparison, a boosted decision tree was able to increase the classification accuracy by about 3% with the same dataset.
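
As an illustration only (the paper's window size, quantisation and 'internal texture' definition are not given in the abstract), the three GLCM textures named above can be computed on a panchromatic window with scikit-image roughly as follows:

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def glcm_textures(pan_window, levels=64):
        """Correlation, entropy and inverse difference moment (homogeneity)
        for one window of the panchromatic band, quantised to `levels` grey levels."""
        den = max(float(pan_window.max()), 1.0)
        q = (pan_window.astype(float) / den * (levels - 1)).astype(np.uint8)
        glcm = graycomatrix(q, distances=[1], angles=[0], levels=levels,
                            symmetric=True, normed=True)
        p = glcm[:, :, 0, 0]
        entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
        correlation = graycoprops(glcm, "correlation")[0, 0]
        idm = graycoprops(glcm, "homogeneity")[0, 0]  # inverse difference moment
        return correlation, entropy, idm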

Generation of Sets of Synthetic Classifiers for the Evaluation of Abstract-Level Combination Methods

This paper presents a new technique for generating sets of synthetic classifiers to evaluate abstract-level combination methods. The sets differ in terms of both the recognition rates of the individual classifiers and their degree of similarity. For this purpose, each abstract-level classifier is considered as a random variable producing one class label as the output for an input pattern. From the initial set of classifiers, new, slightly different sets are generated by applying specific operators defined for this purpose. Finally, the sets of synthetic classifiers have been used to estimate the performance of combination methods for abstract-level classifiers. The experimental results demonstrate the effectiveness of the proposed approach.
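
A minimal sketch of this idea, under assumed operators that are not necessarily those defined in the paper: each synthetic classifier is a vector of output labels with a target recognition rate, and a slightly different classifier is derived by re-drawing a fraction of its erroneous outputs.

    import numpy as np

    def synthetic_classifier(true_labels, recognition_rate, n_classes, rng):
        out = true_labels.copy()
        wrong = rng.random(len(out)) >= recognition_rate
        # Wrong outputs receive a random label different from the true one.
        out[wrong] = (true_labels[wrong] + rng.integers(1, n_classes, wrong.sum())) % n_classes
        return out

    def perturb(outputs, true_labels, fraction, n_classes, rng):
        """Derive a slightly different classifier: re-draw `fraction` of the
        outputs while keeping the recognition rate roughly unchanged."""
        new = outputs.copy()
        idx = rng.random(len(new)) < fraction
        keep_correct = outputs[idx] == true_labels[idx]
        new[idx] = np.where(keep_correct, true_labels[idx],
                            (true_labels[idx] + rng.integers(1, n_classes, idx.sum())) % n_classes)
        return new

    rng = np.random.default_rng(0)
    y = rng.integers(0, 10, 1000)            # true labels: 1000 patterns, 10 classes
    c1 = synthetic_classifier(y, 0.90, 10, rng)
    c2 = perturb(c1, y, 0.20, 10, rng)
    print((c1 == y).mean(), (c1 == c2).mean())  # recognition rate and similarity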

Immune Response in Mice Immunized with Live Cold-Adapted Influenza Vaccine in Combination with Chitosan-Based Adjuvants

The influence of combined intranasal administration of a live cold-adapted influenza vaccine with chitosan derivatives as adjuvants on the subpopulation structure of mouse spleen mononuclear leukocytes, which reflects the orientation of the immune response, was studied. It was found that the inclusion of chitosan preparations promotes activation of the cellular immune response.

Electroremediation of Cu-Contaminated Soil

This study investigated the removal efficiency of electrokinetic remediation of copper-contaminated soil at different combinations of enhancement reagents used as anolyte and catholyte. Sodium hydroxide (at 0.1, 0.5, and 1.0 M concentrations) and distilled water were used as anolytes, while lactic acid (at 0.01, 0.1, and 0.5 M concentrations), ammonium citrate (also at 0.01, 0.1, and 0.5 M concentrations) and distilled water were used as catholytes. A continuous voltage (1.0 VDC/cm) was applied for 240 hours in each experiment, and the copper content of the catholyte was determined at the end of the 240-hour period. Optimization was carried out with a Response Surface Methodology - Optimal Design, including an F-test and a multiple comparison method, to determine which anolyte-catholyte pair was the most significant for the removal efficiency. 1.0 M NaOH was found to be the most significant anolyte, while lactic acid was established as the most significant catholyte for the most successful electrokinetic experiments. The lactic acid concentration should be in the range of 0.1 M to 0.5 M to achieve maximum percent removal.

Optimal Data Compression and Filtering: The Case of Infinite Signal Sets

We present a theory for optimal filtering of infinite sets of random signals. There are several new distinctive features of the proposed approach. First, we provide a single optimal filter for processing any signal from a given infinite signal set. Second, the filter is presented in the special form of a sum with p terms, where each term is represented as a combination of three operations. Each operation is a special stage of the filtering aimed at facilitating the associated numerical work. Third, an iterative scheme is implemented in the filter structure to provide an improvement in the filter performance at each step of the scheme. The final step of the scheme concerns signal compression and decompression. This step is based on the solution of a new rank-constrained matrix approximation problem. The solution to the matrix problem is described in this paper. A rigorous error analysis is given for the new filter.

Classifier Combination Approach in Motion Imagery Signals Processing for Brain Computer Interface

In this study we focus on improving the performance of a cue-based Motor Imagery Brain Computer Interface (BCI). For this purpose, a data fusion approach is applied to the results of different classifiers to make the best decision. In the first step, the Distinction Sensitive Learning Vector Quantization method is used as a feature selection method to determine the most informative frequencies in the recorded signals, and its performance is evaluated against a frequency search method. Then informative features are extracted by the wavelet packet transform. In the next step, five different classification methods are applied. The methodologies are tested on BCI Competition II dataset III; the best accuracy obtained is 85% and the best kappa value is 0.8. In the final step, the ordered weighted averaging (OWA) method is used to provide a proper aggregation of the classifier outputs. Using OWA enhances the system accuracy to 95% and the kappa value to 0.9, and applying OWA requires just 50 milliseconds of computation.
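
A minimal sketch of the OWA aggregation step, assuming soft per-class classifier outputs and hand-chosen OWA weights (the abstract does not state how the paper's weights are obtained):

    import numpy as np

    def owa(scores, weights):
        """scores: values for one class from n classifiers; weights sum to 1
        and are applied after sorting the scores in descending order."""
        return np.sort(scores)[::-1] @ weights

    def owa_decision(classifier_scores, weights):
        """classifier_scores: (n_classifiers, n_classes) matrix of soft outputs.
        Returns the class whose OWA-aggregated score is largest."""
        aggregated = [owa(classifier_scores[:, c], weights)
                      for c in range(classifier_scores.shape[1])]
        return int(np.argmax(aggregated))

    # Example: five classifiers, two classes (e.g. left/right hand motor imagery),
    # with OWA weights biased toward the most confident classifiers.
    scores = np.array([[0.7, 0.3], [0.6, 0.4], [0.2, 0.8], [0.9, 0.1], [0.55, 0.45]])
    print(owa_decision(scores, weights=np.array([0.4, 0.3, 0.15, 0.1, 0.05])))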

Research of a Multistep Method Applied to Numerical Solution of Volterra Integro-Differential Equation

The solution of many practical problems reduces to the solution of integro-differential equations. For the numerical solution of such equations, quadrature methods, or their combination with multistep or one-step methods, are typically used. The quadrature methods are mainly applied to compute the integral appearing on the right-hand side of the integro-differential equation. As this integral is of Volterra type, replacing it by an integral sum yields a sum whose upper limit depends on the current point at which the values of the integral are defined. We therefore obtain an integral sum with a variable boundary, which is difficult to work with. For this reason, a multistep method with constant coefficients, which is free from the noted shortcoming, is presented, together with a way of finding its coefficients.
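
For context, a conventional quadrature/multistep hybrid of the kind discussed above (trapezoidal quadrature for the Volterra integral combined with a two-step Adams-Bashforth advance) can be sketched as follows; this is not the constant-coefficient method proposed in the paper.

    import numpy as np

    # Solves y'(x) = f(x, y(x)) + integral_{x0}^{x} K(x, s, y(s)) ds, y(x0) = y0.
    def solve_vide(f, K, x0, y0, h, n_steps):
        x = x0 + h * np.arange(n_steps + 1)
        y = np.zeros(n_steps + 1)
        y[0] = y0

        def rhs(i):
            # Trapezoidal approximation of the Volterra integral up to x[i];
            # note the upper limit of the sum grows with the current node.
            if i == 0:
                integral = 0.0
            else:
                vals = np.array([K(x[i], x[j], y[j]) for j in range(i + 1)])
                integral = h * (0.5 * vals[0] + vals[1:i].sum() + 0.5 * vals[i])
            return f(x[i], y[i]) + integral

        # One Euler step to start, then two-step Adams-Bashforth.
        y[1] = y[0] + h * rhs(0)
        for i in range(1, n_steps):
            y[i + 1] = y[i] + h * (1.5 * rhs(i) - 0.5 * rhs(i - 1))
        return x, y

    # Example: y' = 1 + integral_0^x y(s) ds, y(0) = 0 (exact solution y = sinh x).
    xs, ys = solve_vide(lambda x, y: 1.0, lambda x, s, v: v, 0.0, 0.0, 0.01, 100)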

Structure-Vibration Analysis of a Power Transformer (154 kV/60 MVA/Single Phase)

The most common cause of power transformer failures is a mechanical defect brought about by excessive vibration, which is formed by a combination of multiples of a 120 Hz frequency. In this paper, the types of mechanical exciting forces applied to the power transformer were classified, and the mechanical damage mechanism of the power transformer was identified using the vibration transfer route to the machine or structure. The general effects of 120 Hz vibration on the enclosure, bushing, Buchholz relay, pressure release valve and tap changer of the transformer were also examined.

An EEG Case Study of Arithmetical Reasoning by Four Individuals Varying in Imagery and Mathematical Ability: Implications for Mathematics Education

The main issue of interest here is whether individuals who differ in arithmetical reasoning ability and levels of imagery ability display different brain activity during the conduct of mental arithmetical reasoning tasks. This was a case study of four participants (Ps) who represented four extreme combinations of Maths-Imagery abilities, i.e., low-low, high-high, high-low and low-high, respectively. As the Ps performed a series of 60 arithmetical reasoning tasks, 128-channel EEG recordings were taken and the pre-response interval was subsequently analysed using EGI GeoSource™ software. The P who was high in both imagery and maths ability showed peak activity prior to response in BA7 (superior parietal cortex), but the other Ps did not show peak activity in this region. The results are considered in terms of the diverse routes that may be employed by individuals during the conduct of arithmetical reasoning tasks and the possible implications of this for mathematics education.

Image Restoration in Non-Linear Filtering Domain Using MDB Approach

This paper proposes a new technique based on the nonlinear Minmax Detector Based (MDB) filter for image restoration. The aim of image enhancement is to reconstruct the true image from the corrupted image. The process of image acquisition frequently leads to degradation, and the quality of the digitized image becomes inferior to the original image. Image degradation can be due to the addition of different types of noise to the original image. Image noise can be modeled in many ways, and impulse noise is one of them. Impulse noise generates pixels with gray values not consistent with their local neighborhood; it appears as a sprinkle of either both light and dark or only light spots in the image. Filtering is a technique for enhancing the image. In linear filtering, the value of an output pixel is a linear combination of neighborhood values, which can blur the image. Thus a variety of non-linear smoothing techniques have been developed. The median filter is one of the most popular non-linear filters: with a small neighborhood it is highly effective, but for large windows and high noise levels it introduces more blurring into the image. The Centre Weighted Mean (CWM) filter has a better average performance than the median filter; however, under high-noise conditions original pixels can be corrupted even when the noise reduction is substantial, so this technique also has a blurring effect on the image. To illustrate the superiority of the proposed approach, the new scheme has been simulated along with the standard ones, and various restoration performance measures have been compared.
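
For reference, the two baseline non-linear filters discussed above can be sketched as below; the centre-weighted variant is implemented here in its widely used centre-weighted median form (the paper's CWM filter may differ), and neither function is the proposed MDB filter.

    import numpy as np

    def median_filter(img, k=3):
        """Plain median filter with a k x k window (k odd), edge-replicated."""
        pad = k // 2
        padded = np.pad(img, pad, mode="edge")
        out = np.empty_like(img)
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                out[i, j] = np.median(padded[i:i + k, j:j + k])
        return out

    def centre_weighted_median_filter(img, k=3, w=3):
        """Centre-weighted median: the centre pixel is repeated w times before
        taking the median, so it is better preserved than in the plain filter."""
        pad = k // 2
        padded = np.pad(img, pad, mode="edge")
        out = np.empty_like(img)
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                window = padded[i:i + k, j:j + k].ravel()
                centre = padded[i + pad, j + pad]
                out[i, j] = np.median(np.concatenate([window, np.repeat(centre, w - 1)]))
        return out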

Two-Phase Optimization for Selecting Materialized Views in a Data Warehouse

A data warehouse (DW) is a system whose value and role lie in supporting decision-making through querying. Queries to a DW are critical with regard to their complexity and length: they often access millions of tuples and involve joins between relations and aggregations. Materialized views are able to provide better performance for DW queries. However, these views have a maintenance cost, so materializing all views is not possible. An important challenge of the DW environment is materialized view selection, because we have to balance the trade-off between query performance and view maintenance cost. Therefore, in this paper, we introduce a new approach to this challenge based on Two-Phase Optimization (2PO), a combination of Simulated Annealing (SA) and Iterative Improvement (II), with the use of a Multiple View Processing Plan (MVPP). Our experiments show that 2PO outperforms the original algorithms in terms of query processing cost and view maintenance cost.
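
A minimal sketch of the 2PO search itself, with a toy cost model standing in for the MVPP-based query-processing and maintenance costs used in the paper: each candidate view has a hypothetical query-saving benefit if materialized and a maintenance cost.

    import math
    import random

    def total_cost(selection, benefit, maintenance):
        saved = sum(benefit[v] for v in selection)
        upkeep = sum(maintenance[v] for v in selection)
        return sum(benefit.values()) - saved + upkeep

    def neighbour(selection, views):
        v = random.choice(views)          # flip one view's materialization status
        return selection ^ {v}

    def two_phase_optimization(benefit, maintenance, iters_sa=5000, iters_ii=1000,
                               t0=100.0, alpha=0.999):
        views = list(benefit)
        cost = lambda s: total_cost(s, benefit, maintenance)
        current = best = frozenset(random.sample(views, len(views) // 2))
        t = t0
        # Phase 1: simulated annealing, occasionally accepting uphill moves.
        for _ in range(iters_sa):
            cand = neighbour(current, views)
            delta = cost(cand) - cost(current)
            if delta < 0 or random.random() < math.exp(-delta / t):
                current = cand
                if cost(current) < cost(best):
                    best = current
            t *= alpha
        # Phase 2: iterative improvement, a pure downhill local search from the SA result.
        current = best
        for _ in range(iters_ii):
            cand = neighbour(current, views)
            if cost(cand) < cost(current):
                current = cand
        return current

    # Example with ten hypothetical candidate views.
    benefit = {f"v{i}": random.uniform(5, 50) for i in range(10)}
    maintenance = {f"v{i}": random.uniform(1, 30) for i in range(10)}
    print(sorted(two_phase_optimization(benefit, maintenance)))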

Optimization of Parametric Studies Using Strategies of Sampling Techniques

To improve the efficiency of parametric studies or test planning, a method is proposed that takes all input parameters into account while performing only a few simulation runs to assess the relative importance of each input parameter. For K input parameters with N input values each, the total number of possible combinations of input values equals N^K. To limit the number of runs, only some (N in total) of the possible combinations are taken into account. The sampling procedure Updated Latin Hypercube Sampling is used to choose the optimal combinations. To measure the relative importance of each input parameter, the Spearman rank correlation coefficient is proposed. The sensitivity and the influence of all parameters are analyzed within one procedure, and the key parameters with the largest influence are immediately identified.
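
A minimal sketch of the screening procedure, using plain Latin Hypercube Sampling rather than the Updated LHS variant, and a hypothetical model function in place of the actual simulation:

    import numpy as np
    from scipy.stats import spearmanr

    def latin_hypercube(n_runs, n_params, rng=None):
        rng = np.random.default_rng(rng)
        # One stratified sample per run and parameter, randomly permuted per column.
        u = (rng.random((n_runs, n_params)) + np.arange(n_runs)[:, None]) / n_runs
        for j in range(n_params):
            u[:, j] = rng.permutation(u[:, j])
        return u  # values in (0, 1); rescale to each parameter's range as needed

    def rank_importance(samples, model):
        response = np.array([model(x) for x in samples])
        rho = [spearmanr(samples[:, j], response)[0] for j in range(samples.shape[1])]
        return np.abs(rho)  # larger |rho| means a more influential parameter

    # Hypothetical model with 4 parameters, only the first two being influential.
    x = latin_hypercube(n_runs=32, n_params=4, rng=0)
    print(rank_importance(x, lambda p: 3.0 * p[0] - 2.0 * p[1] + 0.01 * p[2]))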

Java Based Automatic Curriculum Generator for Children with Trisomy 21

An Early Intervention Program (EIP) is required to improve the overall development of children with Trisomy 21 (Down syndrome). In order to help trainers and parents in the implementation of the EIP, a support system has been developed. The support system is able to screen data automatically, store and analyze data, generate an individual EIP (curriculum) with optimal training duration, and generate training automatically. The system consists of hardware and software, where the software has been implemented using the Java language on Linux Fedora. The software has been tested to ensure its functionality and reliability, and the prototype has also been tested in Down syndrome centers. Test results show that the system is reliable for generating an individual curriculum which includes the training program to improve the motor, cognitive, and combination abilities of Down syndrome children under 6 years.

Density Estimation using Generalized Linear Model and a Linear Combination of Gaussians

In this paper we present a novel approach for density estimation. The proposed approach is based on using the logistic regression model to obtain an initial density estimate for the given empirical density. The empirical data does not exactly follow the logistic regression model, so there will be a deviation between the empirical density and the density estimated using the logistic regression model. This deviation may be positive and/or negative. In this paper we use a linear combination of Gaussians (LCG) with positive and negative components as a model for this deviation, and we use the expectation maximization (EM) algorithm to estimate the parameters of the LCG. Experiments on real images demonstrate the accuracy of our approach.
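
For orientation only, the EM machinery can be sketched for a plain all-positive Gaussian mixture in one dimension; the paper's LCG additionally allows negative-weight components, which requires modified updates not shown here.

    import numpy as np

    def em_gmm_1d(x, n_components=3, n_iter=100, rng=0):
        rng = np.random.default_rng(rng)
        w = np.full(n_components, 1.0 / n_components)
        mu = rng.choice(x, n_components, replace=False)
        var = np.full(n_components, np.var(x))
        for _ in range(n_iter):
            # E-step: responsibilities of each component for each sample.
            d = x[:, None] - mu[None, :]
            pdf = np.exp(-0.5 * d**2 / var) / np.sqrt(2 * np.pi * var)
            resp = w * pdf
            resp /= resp.sum(axis=1, keepdims=True)
            # M-step: re-estimate weights, means and variances.
            nk = resp.sum(axis=0)
            w = nk / len(x)
            mu = (resp * x[:, None]).sum(axis=0) / nk
            var = (resp * (x[:, None] - mu)**2).sum(axis=0) / nk
        return w, mu, var

    # Example: fit three components to synthetic data.
    x = np.concatenate([np.random.normal(m, 0.5, 300) for m in (-2.0, 0.0, 3.0)])
    print(em_gmm_1d(x))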

A Comparison Study of the Removal of Selected Pharmaceuticals in Waters by Chemical Oxidation Treatments

The degradation of selected pharmaceuticals in several water matrices was studied using several chemical treatments. The pharmaceuticals selected were the beta-blocker metoprolol, the nonsteroidal anti-inflammatory naproxen, the antibiotic amoxicillin, and the analgesic phenacetin; their degradation was conducted by using UV radiation alone, ozone, Fenton's reagent, a Fenton-like system, a photo-Fenton system, and combinations of UV radiation and ozone with H2O2, TiO2, Fe(II), and Fe(III). The water matrices, in addition to ultra-pure water, were a reservoir water, a groundwater, and two secondary effluents from two municipal WWTPs. The results reveal that the presence of any second oxidant enhanced the oxidation rates, with the systems UV/TiO2 and O3/TiO2 providing the highest degradation rates. It is also observed in most of the investigated oxidation systems that the degradation rate followed the sequence: amoxicillin > naproxen > metoprolol > phenacetin. Lower rates were obtained with the pharmaceuticals dissolved in natural waters and secondary effluents due to the organic matter present, which consumes some of the oxidant agents.

2D and 3D Finite Element Method Packages of CEMTool for Engineering PDE Problems

CEMTool is a command-style design and analysis package for scientific and technological algorithms and a matrix-based computation language. In this paper, we present new 2D & 3D finite element method (FEM) packages for CEMTool. We discuss the detailed structures and the important features of the pre-processor, solver, and post-processor of the CEMTool 2D & 3D FEM packages. In contrast to the existing MATLAB PDE Toolbox, our proposed FEM packages can deal with combinations of the reserved words. Also, we can control the mesh in a very effective way. With the introduction of a new mesh generation algorithm and a fast solving technique, our FEM packages can guarantee shorter computation times than the MATLAB PDE Toolbox. Consequently, with our new FEM packages, we can overcome some disadvantages or limitations of the existing MATLAB PDE Toolbox.

Examination of Flood Runoff Reproducibility for Different Rainfall Sources in Central Vietnam

This paper presents the combination of different precipitation data sets with a distributed hydrological model in order to examine the flood runoff reproducibility of catchments with scattered observations. The precipitation data sets were obtained from rain-gauge observations, satellite-based estimates (TRMM), and a numerical weather prediction model (NWP), and were then coupled with the super tank model. The case study was conducted in three basins (small, medium, and large) located in Central Vietnam. Calculated hydrographs based on ground-observed rainfall showed the best fit to the measured stream flow, while those obtained from TRMM and NWP showed high uncertainty in the peak discharges. However, calculated hydrographs using the adjusted rain field represent a promising alternative for the application of TRMM and NWP in flood modeling for catchments with scattered observations, especially for the extension of the forecast lead time.

The Influence of Heat Treatment on Antimicrobial Proteins in Milk

The obligatory step during the immunoglobulin and lysozyme concentration process is thermal treatment. The combination of temperature and time used in processing can affect the structure of the proteins and cause unfolding and aggregation. The aim of the present study was to evaluate the heat stability of total Igs, the particular immunoglobulin classes, and lysozyme in milk. Milk samples were obtained from a conventional dairy herd in Latvia. Raw milk samples were pasteurized under different regimes: 63 °C for 30 min, and 72 °C, 78 °C, 85 °C and 95 °C for 15-20 s. The concentrations of Igs (IgA, IgG, IgM) and lysozyme were determined by a turbidimetric method. The research established that the activity of the antimicrobial proteins decreases to different extents; the smallest reduction in concentration was found in the case of lysozyme.

Tipover Stability Enhancement of Wheeled Mobile Manipulators Using an Adaptive Neuro-Fuzzy Inference Controller System

In this paper, an algorithm based on an adaptive neuro-fuzzy controller is provided to enhance the tipover stability of mobile manipulators when they are subjected to predefined trajectories for the end-effector and the vehicle. The controller creates proper configurations for the manipulator to prevent the robot from being overturned. The optimal configuration, and thus the most favorable control, is obtained through soft computing approaches including a combination of genetic algorithm, neural networks, and fuzzy logic. In the proposed algorithm, a look-up table is designed from the values obtained by the genetic algorithm so as to minimize the performance index; using this database, rule bases are designed for the ANFIS controller and applied to the actuators to enhance the tipover stability of the mobile manipulator. A numerical example is presented to demonstrate the effectiveness of the proposed algorithm.