Complex Method for Localized Muscle Fatigue Evaluation

The research was designed to examine the relationship between the development of muscle fatigue and its effect on sport performance, specifically during maximal voluntary contraction. This kind of investigation, using simultaneous electrophysiological and mechanical recordings processed with advanced mathematical methods, allows us to obtain parameters, indexes and, finally, a mapping in a short time, which can be used for a thorough investigation of muscle contraction force and of the phenomenon of local muscle fatigue, both in athletes and in other subjects.

Optimum Cascaded Design for Speech Enhancement Using Kalman Filter

Speech enhancement is the process of removing noise and improving the quality of a speech signal that is contaminated by noise and other kinds of distortion. This paper addresses the development of an optimum cascaded system for speech enhancement. This aim is attained without losing any relevant speech information and without excessive computational or time complexity. The LMS algorithm, spectral subtraction and the Kalman filter are deployed as the main de-noising algorithms in this work. Since each of these algorithms has its own shortcomings, cascaded systems are designed in different combinations and evaluated by qualitative (listening) and quantitative (SNR) tests.
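As an illustration only, the following Python sketch shows one stage such a cascade might contain: a simple magnitude spectral-subtraction front end. The frame length, overlap and number of noise-only leading frames are assumptions, not values from the paper, and the output would then feed a further LMS or Kalman stage of the cascade.

```python
# A minimal sketch, not the paper's implementation: magnitude spectral
# subtraction as one stage of a de-noising cascade. Frame length, overlap and
# the number of noise-only leading frames are assumptions.
import numpy as np

def spectral_subtraction(x, frame_len=256, noise_frames=10):
    """Subtract an average noise magnitude spectrum estimated from the first
    few frames (assumed speech-free), keeping the noisy phase."""
    hop = frame_len // 2
    window = np.hanning(frame_len)
    starts = range(0, len(x) - frame_len, hop)
    spectra = [np.fft.rfft(x[s:s + frame_len] * window) for s in starts]
    noise_mag = np.mean([np.abs(sp) for sp in spectra[:noise_frames]], axis=0)
    out = np.zeros(len(x))
    for s, sp in zip(starts, spectra):
        mag = np.maximum(np.abs(sp) - noise_mag, 0.0)      # floor at zero
        frame = np.fft.irfft(mag * np.exp(1j * np.angle(sp)), frame_len)
        out[s:s + frame_len] += frame                      # overlap-add
    return out

# enhanced_stage1 = spectral_subtraction(noisy_speech)
# enhanced_stage1 would then be passed to an LMS or Kalman filtering stage.
```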

Separation of Manganese and Cadmium from Cobalt Electrolyte Solution by Solvent Extraction

Impurity metals such as manganese and cadmium were selectively removed from a high-tenor cobalt electrolyte solution by solvent extraction using Co-D2EHPA, prepared by converting the functional group of D2EHPA with Co2+ ions. Process parameters such as pH, organic concentration, O/A ratio and kinetics were investigated in laboratory bench-scale batch tests. The results showed that a significant amount of manganese and cadmium can be extracted using Co-D2EHPA for the optimum processing of the cobalt electrolyte solution at an equilibrium pH of about 3.5. The McCabe-Thiele diagram constructed from the extraction studies showed that 100% of the impurities can be extracted in four stages for manganese and three stages for cadmium, using O/A ratios of 0.65 and 1.0, respectively. From the stripping study, it was found that 100% of the manganese and cadmium can be stripped from the loaded organic using 0.4 M H2SO4 in a single contact. The loading capacity of Co-D2EHPA for manganese and cadmium was also investigated at different O/A ratios and numbers of contact stages between the aqueous and organic phases. Valuable information was obtained for the design of an impurity-removal process for the production of pure cobalt with fewer problems in the electrowinning circuit.
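For reference, the quantities behind such stage calculations are the distribution ratio and the single-contact extraction fraction at a given phase ratio; these are the standard definitions, not equations reproduced from the paper:

\[
D = \frac{[\mathrm{M}]_{\mathrm{org}}}{[\mathrm{M}]_{\mathrm{aq}}}, \qquad
E = \frac{D\,(O/A)}{1 + D\,(O/A)},
\]

and the McCabe-Thiele construction steps off the number of counter-current stages between the equilibrium isotherm and an operating line whose slope is set by the A/O ratio.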

Road Extraction Using Stationary Wavelet Transform

In this paper, a novel road extraction method using the Stationary Wavelet Transform is proposed. To detect road features in colour aerial satellite imagery, Mexican hat wavelet filters are applied through the Stationary Wavelet Transform in a multiresolution, multi-scale sense, and products of wavelet coefficients at different scales are formed to locate and identify road features across scales. In addition, the shifting of road-feature locations across scales is taken into account for robust road extraction when the road-feature profiles are asymmetric. The experimental results show that the proposed method provides a useful basis for road-feature extraction. The method is also general and can be applied to other features in imagery.
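A minimal sketch of the multiscale-product idea is given below; it is an assumption about the formulation rather than the paper's code, and it uses the 'db2' wavelet because the Mexican hat wavelet is not one of PyWavelets' built-in discrete SWT filters.

```python
# A minimal sketch (assumed formulation, not the paper's code) of a multiscale
# product of stationary-wavelet-transform detail coefficients; edges such as
# road borders that persist over scales give large products.
import numpy as np
import pywt

def swt_multiscale_product(gray, levels=3):
    """gray: 2-D grayscale image. Returns per-orientation coefficient products."""
    h, w = gray.shape
    # SWT requires dimensions divisible by 2**levels; crop as a simple fix.
    gray = gray[:h - h % 2**levels, :w - w % 2**levels].astype(float)
    coeffs = pywt.swt2(gray, 'db2', level=levels)   # [(cA, (cH, cV, cD)), ...]
    prod_h = np.ones_like(gray)
    prod_v = np.ones_like(gray)
    prod_d = np.ones_like(gray)
    for _, (cH, cV, cD) in coeffs:
        prod_h *= cH
        prod_v *= cV
        prod_d *= cD
    return prod_h, prod_v, prod_d

# prod_h, prod_v, prod_d = swt_multiscale_product(gray_image)
# response = np.abs(prod_h) + np.abs(prod_v)   # candidate road-edge map
```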

Control Improvement of a C Sugar Cane Crystallization Using an Auto-Tuning PID Controller Based on Linearization of a Neural Network

The industrial sugar cane crystallization process produces a residue that still contains a large amount of soluble sucrose, and the objective of the factory is to improve its extraction. The resulting losses are substantial, which justifies the search for an optimization of the process. The crystallization process studied on the industrial site is based on the "three massecuites process". The third step of this process constitutes the final stage of exhaustion of the sucrose dissolved in the mother liquor. During this third crystallization step (C crystallization), the phase that is studied and whose control is to be improved is the growing phase (crystal growth phase). The study of this process on the industrial site is a problem in its own right. A control scheme is proposed to improve the standard PID control law used in the factory. An auto-tuning PID controller based on instantaneous linearization of a neural network is then proposed.
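The sketch below illustrates, under stated assumptions, what instantaneous linearization of a neural plant model looks like: a one-hidden-layer model of the crystal-growth dynamics is differentiated at the current operating point and the local gains are used to retune a PI(D) law. The network structure, variable names and the tuning rule are illustrative assumptions, not the authors' scheme.

```python
# A minimal sketch (assumptions throughout, not the authors' scheme): a neural
# model y[k+1] = f(y[k], u[k]) is linearized at the current operating point and
# the local gains are used to retune a PID law.
import numpy as np

def nn_plant(y, u, W1, b1, W2, b2):
    """One-hidden-layer network predicting the next process output."""
    h = np.tanh(W1 @ np.array([y, u]) + b1)
    return float(W2 @ h + b2)

def instantaneous_linearization(y, u, W1, b1, W2, b2):
    """Local gains a = df/dy, b = df/du by analytic differentiation of tanh."""
    z = W1 @ np.array([y, u]) + b1
    dh = 1.0 - np.tanh(z) ** 2            # derivative of tanh
    jac = (W2 * dh) @ W1                  # Jacobian [df/dy, df/du]
    return jac[0], jac[1]

def retune_pid(a, b, Ts, lam=5.0):
    """A simple IMC/lambda-style rule (an assumption, not the authors' rule):
    treat y[k+1] = a*y[k] + b*u[k] as a first-order plant with gain
    K = b/(1-a) and time constant T = -Ts/ln|a|, then set a PI law."""
    K = b / (1.0 - a)
    T = -Ts / np.log(abs(a)) if 0 < abs(a) < 1 else Ts
    kp = T / (K * lam * Ts) if abs(K) > 1e-9 else 0.0
    ki = kp / T
    kd = 0.0
    return kp, ki, kd

# Example with hypothetical weights:
# rng = np.random.default_rng(0)
# W1, b1, W2, b2 = rng.normal(size=(6, 2)), np.zeros(6), rng.normal(size=6), 0.0
# a, b = instantaneous_linearization(0.4, 0.2, W1, b1, W2, b2)
# kp, ki, kd = retune_pid(a, b, Ts=10.0)
```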

Memory and Higher Cognition

Working memory (WM) can be defined as the system that actively holds information in the mind in order to perform tasks despite distraction. By contrast, short-term memory (STM) is a system that represents the capacity for actively storing information in the absence of distraction. There is accumulating evidence that these types of memory are related to higher cognition (HC). The aim of this study was to examine the relationship between HC and memory (visual STM and WM, auditory STM and WM). Fifty-nine primary school children were tested with an intelligence test, mathematical tasks (HC) and memory subtests. We show that visual, but not auditory, memory is a significant predictor of higher cognition. The relevance of these results is discussed.

Control of Pressure Gradient in the Contraction of a Wind Tunnel

Subsonic wind tunnel experiments were conducted to study the effect of a tripped boundary layer on the pressure distribution in the contraction region of the tunnel. Measurements were performed by installing a trip strip at two different positions in the concave portion of the contraction. The results show that installation of the trip strips has significant effects on both turbulence and the pressure distribution. The reductions in free-stream turbulence and in the wall static pressure differed significantly with the location of the trip strip.
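For context, such measurements are usually reported in terms of the wall pressure coefficient and the free-stream turbulence intensity; these are the standard definitions, not expressions taken from the paper:

\[
C_p = \frac{p - p_\infty}{\tfrac{1}{2}\rho U_\infty^2}, \qquad
Tu = \frac{u'_{\mathrm{rms}}}{U_\infty}.
\]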

SystemC Modeling of Adaptive Least Mean Square Filter

In this paper, we demonstrate the modeling of an adaptive least-mean-square (LMS) filter using SystemC. SystemC is a modeling language that allows a designer to model both hardware and software components and makes it possible to design from a high level of abstraction down to a low level of abstraction. Using SystemC, we produced five adaptive least-mean-square filter models at five abstraction levels, proceeding from the most abstract model to the most concrete one.
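The SystemC models themselves are not reproduced here; purely as a point of reference, the following Python sketch shows the standard LMS update rule that such behavioural and more concrete models implement. The filter order and step size are assumptions.

```python
# A reference sketch of the classic LMS algorithm (not the SystemC models):
# w <- w + 2*mu*e*x, returning the filter output and error sequences.
import numpy as np

def lms(desired, x, order=8, mu=0.05):
    """desired: target signal; x: input signal; returns (output, error)."""
    w = np.zeros(order)
    y = np.zeros(len(x))
    e = np.zeros(len(x))
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]      # most recent samples first
        y[n] = w @ u                  # filter output
        e[n] = desired[n] - y[n]      # estimation error
        w += 2 * mu * e[n] * u        # weight update
    return y, e
```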

Support Vector Machine Prediction Model of Early-stage Lung Cancer Based on Curvelet Transform to Extract Texture Features of CT Image

Purpose: To explore the use of the Curvelet transform to extract texture features of pulmonary nodules in CT images and of a support vector machine to establish a prediction model for small solitary pulmonary nodules, in order to improve the detection and diagnosis rate of early-stage lung cancer. Methods: 2461 benign or malignant small solitary pulmonary nodules in CT images from 129 patients were collected. Fourteen Curvelet transform textural features were used as parameters to establish the support vector machine prediction model. Results: Compared with other methods, using 252 texture features as parameters to establish the prediction model is more appropriate. The classification consistency, sensitivity and specificity of the model are 81.5%, 93.8% and 38.0%, respectively. Conclusion: Based on texture features extracted with the Curvelet transform, the support vector machine prediction model is sensitive to lung cancer and can improve the diagnosis rate for early-stage lung cancer to some extent.
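A minimal sketch of the classification stage is given below; the feature matrix, labels, kernel choice and split are placeholders and assumptions, not the authors' pipeline, and the Curvelet feature extraction itself is not shown.

```python
# A minimal sketch (assumed setup): an RBF-kernel SVM trained on texture-
# feature vectors; X and y (1 = malignant, 0 = benign) are stand-in data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, recall_score

X = np.random.rand(200, 252)            # stand-in feature matrix (nodules x features)
y = np.random.randint(0, 2, 200)        # stand-in benign/malignant labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print("consistency (accuracy):", accuracy_score(y_te, pred))
print("sensitivity:", recall_score(y_te, pred, pos_label=1))
print("specificity:", recall_score(y_te, pred, pos_label=0))
```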

Smart Surveillance using PDA

The aim of this research is to develop a fast and reliable surveillance system based on a personal digital assistant (PDA) device. The first objective is to extend to the device the capability to detect moving objects, which is already available on personal computers. The second is to compare the performance of background subtraction (BS) and temporal frame differencing (TFD) techniques to determine which is more suitable for the PDA platform. In order to reduce noise and to prepare the frames for the moving-object detection stage, each frame is first converted to a grey-scale representation and then smoothed using a Gaussian low-pass filter. Two moving-object detection schemes, BS and TFD, have been analyzed. The background frame is updated using an Infinite Impulse Response (IIR) filter so that it adapts to varying illumination conditions and geometry settings. To reduce the effect of noise pixels resulting from frame differencing, the morphological filters erosion and dilation are applied. In this research, it has been found that the TFD technique is more suitable for motion detection than BS in terms of speed; on average, TFD is approximately 170 ms faster than the BS technique.
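A minimal desktop sketch of the two detectors is shown below; the update rate, threshold, kernel size and video path are assumptions, and the PDA-specific implementation details are not reproduced.

```python
# A minimal sketch (assumed parameter values) of the two detectors compared in
# the paper: temporal frame differencing (TFD) and background subtraction (BS)
# with an IIR-updated background, plus Gaussian smoothing and morphology.
import cv2
import numpy as np

ALPHA = 0.05          # IIR background update rate (assumption)
THRESH = 25           # binarization threshold (assumption)
KERNEL = np.ones((3, 3), np.uint8)

def preprocess(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.GaussianBlur(gray, (5, 5), 0)

def clean(mask):
    return cv2.dilate(cv2.erode(mask, KERNEL), KERNEL)

cap = cv2.VideoCapture("input.avi")      # placeholder path
ok, frame = cap.read()
prev = preprocess(frame)
background = prev.astype(np.float32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = preprocess(frame)

    # TFD: difference against the previous frame.
    tfd = clean(cv2.threshold(cv2.absdiff(gray, prev), THRESH, 255,
                              cv2.THRESH_BINARY)[1])
    prev = gray

    # BS: difference against an IIR-updated background model.
    bs = clean(cv2.threshold(cv2.absdiff(gray, cv2.convertScaleAbs(background)),
                             THRESH, 255, cv2.THRESH_BINARY)[1])
    background = (1 - ALPHA) * background + ALPHA * gray.astype(np.float32)
cap.release()
```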

Study of Equilibrium and Mass Transfer of Co-Extraction of Different Mineral Acids with Iron(III) from Aqueous Solution by Tri-n-Butyl Phosphate Using Liquid Membrane

Extraction of Fe(III) from aqueous solution using Tri-n-butyl Phosphate (TBP) as carrier requires a highly acidic medium (>6 N), as this favours formation of the chelating complex FeCl3·TBP. Conversely, stripping of Iron(III) from the loaded organic solvent requires a neutral or alkaline medium to dissociate the same complex. It is observed that TBP co-extracts acids along with the metal, which reverses the driving force of extraction, so that iron(III) is re-extracted from the strip phase back into the feed phase during Liquid Emulsion Membrane (LEM) pertraction. Therefore, the rate of extraction of different mineral acids (HCl, HNO3, H2SO4) by TBP, with and without the presence of the metal Fe(III), was examined. It is revealed that acid extraction is enhanced in the presence of the metal. The mass transfer coefficients of both acid and metal extraction were determined using a Bulk Liquid Membrane (BLM). The average mass transfer coefficient was obtained by fitting the derived model equation to the experimentally obtained data. The mass transfer coefficients of the mineral acid extraction are in the order kHNO3 = 3.3x10^-6 m/s > kHCl = 6.05x10^-7 m/s > kH2SO4 = 1.85x10^-7 m/s. The distribution equilibria of the above-mentioned acids between the aqueous feed solution and a solution of tri-n-butyl phosphate (TBP) in organic solvents have been investigated. The stoichiometry of acid extraction reveals the formation of TBP·2HCl, HNO3·2TBP and TBP·H2SO4 complexes. Moreover, extraction of Iron(III) by TBP from HCl aqueous solution forms the complex FeCl3·TBP·2HCl, while in HNO3 medium it forms the complex 3FeCl3·TBP·2HNO3.
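The paper's derived model equation is not reproduced here; as a hedged illustration, BLM transport studies of this kind commonly fit a pseudo-first-order expression of the form

\[
\ln\frac{C_0}{C(t)} = \frac{k\,A}{V}\,t,
\]

where C0 and C(t) are the feed-phase concentrations, A is the membrane contact area and V is the feed volume, so that the mass transfer coefficient k follows from the slope of ln(C0/C) versus t.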

Extraction of Phenol, o-Cresol, and p-Cresol from Coal Tar: Effect of Temperature and Mixing

Coal tar is a liquid by-product of coal gasification and carbonization. This liquid oil mixture contains various useful compounds such as phenol, o-cresol and p-cresol, which are widely used as raw materials for insecticides, dyes, medicines, perfumes, colouring matters and many other products. This research was carried out to determine the optimum conditions for the separation of phenol, o-cresol and p-cresol from coal tar by solvent extraction. The aim of the present work was to study the effects of two aqueous solvents (methanol and acetone solutions), of temperature (298, 306 and 313 K) and of mixing speed (30, 35 and 40 rpm) on the separation of phenol, o-cresol and p-cresol from coal tar by solvent extraction. The results indicated that phenol, o-cresol and p-cresol in coal tar were selectively extracted into the solvent phase and that these components could be separated by solvent extraction. An aqueous methanol solution, a solvent-to-feed mass ratio Eo/Ro = 1, an extraction temperature of 306 K and mixing at 35 rpm were the most efficient conditions for the extraction of phenol, o-cresol and p-cresol from coal tar.

Blind Source Separation for Convoluted Signals Based on Properties of Acoustic Transfer Function in Real Environments

Frequency-domain independent component analysis suffers from a scaling indeterminacy and a permutation problem. The scaling indeterminacy can be solved by using a decomposed spectrum. For the permutation problem, we have proposed rules in terms of the gain ratio and the phase difference derived from the decomposed spectra and the sources' coarse directions. The present paper clarifies experimentally that the gain ratio and the phase difference work effectively in a real environment, but that their performance depends on the frequency band, the microphone spacing and the source-microphone distance. These facts show that it is difficult to solve the permutation problem perfectly in a real environment with either the gain ratio or the phase difference alone. This paper therefore presents a combined solution to the problem for real environments. The proposed method is simple, its computational cost is small, and it achieves high correction performance independently of the frequency band and of the distances from the source signals to the microphones. The proposed method has been verified by several experiments in a real room.
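As a hedged illustration of the quantities involved (the standard two-microphone formulation, not equations taken from the paper), the decomposed spectra of a source arriving from direction θ at two microphones spaced d apart give a gain ratio and phase difference of approximately

\[
r(f) = \frac{|H_2(f)|}{|H_1(f)|}, \qquad
\Delta\phi(f) = \angle H_2(f) - \angle H_1(f) \approx \frac{2\pi f\,d\sin\theta}{c},
\]

where c is the speed of sound; permutations across frequency bins are then aligned by clustering these quantities.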

Segmentation and Recognition of Handwritten Numeric Chains

In this paper we present an off-line system for the recognition of handwritten numeric chains. Our work is divided into two main parts. The first part is the realization of a recognition system for isolated handwritten digits. Here the study is based mainly on the evaluation of the performance of neural networks trained with the gradient back-propagation algorithm. The parameters used to form the input vector of the neural network are extracted from the binary images of the digits by several methods: the distribution sequence, the Barr features and the centred moments of the different projections and profiles. The second part is the extension of our system to the reading of handwritten numeric chains consisting of a variable number of digits. The vertical projection is used to segment the numeric chain into isolated digits, and each digit (or segment) is presented separately to the input of the system developed in the first part (the recognition system for isolated handwritten digits). The recognition result for the numeric chain is displayed at the output of the global system.
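A minimal sketch of the vertical-projection segmentation step is given below; it is an illustrative assumption about the formulation, not the authors' code, and assumes a clean binary image with gaps between digits.

```python
# A minimal sketch (not the authors' code) of vertical-projection segmentation:
# column sums of the binary image split the chain into isolated digits wherever
# the projection falls to zero.
import numpy as np

def segment_digits(binary_img):
    """binary_img: 2-D array with foreground pixels = 1. Returns digit crops."""
    projection = binary_img.sum(axis=0)          # vertical projection profile
    in_digit = projection > 0
    segments, start = [], None
    for col, active in enumerate(in_digit):
        if active and start is None:
            start = col                          # a digit begins
        elif not active and start is not None:
            segments.append(binary_img[:, start:col])   # a digit ends
            start = None
    if start is not None:
        segments.append(binary_img[:, start:])
    return segments   # each crop is then fed to the isolated-digit recognizer
```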

Leaching Behaviour of a Low-grade South African Nickel Laterite

The morphology and the mineralogical and chemical composition of a low-grade nickel ore from Mpumalanga, South Africa, were studied by scanning electron microscopy (SEM), X-ray diffraction (XRD) and X-ray fluorescence (XRF), respectively. The ore was subjected to atmospheric agitation leaching with sulphuric acid to investigate the effects of acid concentration, leaching temperature, leaching time and particle size on the extraction of nickel and cobalt. The analyses indicated the ore to be a saprolitic nickel laterite belonging to the serpentine group of minerals. Sulphuric acid was found to be able to extract nickel from the ore. Increasing the acid concentration and temperature produced only low nickel extractions but improved cobalt extraction. Nickel extraction as high as 77.44% was achieved when leaching the -106+75 μm fraction with a 4.0 M acid concentration at 25°C. The kinetics of nickel leaching from the saprolitic ore were studied, and the activation energy was determined to be 18.16 kJ/mol, indicating that the nickel leaching reaction is diffusion controlled.
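For reference only (the paper's exact rate expression is not reproduced here), a diffusion-controlled shrinking-core mechanism is commonly tested with the expression

\[
1 - \tfrac{2}{3}x - (1 - x)^{2/3} = k_d\,t,
\]

and the activation energy then follows from the Arrhenius plot of the apparent rate constants,

\[
\ln k_d = \ln A - \frac{E_a}{RT},
\]

where a value of roughly 18 kJ/mol is consistent with diffusion control.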

Blind Source Separation based on the Estimation for the Number of the Blind Sources under a Dynamic Acoustic Environment

Independent component analysis can estimate unknown source signals from their mixtures under the assumption that the source signals are statistically independent. In a real environment, however, the separation performance often deteriorates because the number of source signals differs from the number of sensors. In this paper, we propose a method for estimating the number of sources based on the joint distribution of the observed signals under a two-sensor configuration. Several simulation results show that the number of sources coincides with the number of peaks in the histogram of this distribution. The proposed method can estimate the number of sources even when it is larger than the number of observed signals, and it has been verified by several experiments.
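The sketch below illustrates the peak-counting idea under stated assumptions: for sparse sources, samples dominated by a single source cluster along lines through the origin of the two-sensor scatter plot, so counting peaks in a histogram of sample angles estimates the source count. Bin count and peak threshold are assumptions, not the paper's values.

```python
# A minimal sketch (assumed formulation): estimate the number of sources from
# the peaks of the angle histogram of the two observed signals.
import numpy as np
from scipy.signal import find_peaks

def estimate_num_sources(x1, x2, bins=180, min_height_ratio=0.1):
    """x1, x2: observed signals from the two sensors (1-D arrays)."""
    angles = np.arctan2(x2, x1)              # direction of each sample pair
    angles = np.mod(angles, np.pi)           # fold opposite directions together
    hist, _ = np.histogram(angles, bins=bins, range=(0.0, np.pi))
    peaks, _ = find_peaks(hist, height=min_height_ratio * hist.max())
    return len(peaks)                        # estimated number of sources
```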

Evaluation of the Effects of Climate Change on the Destruction Process of Iran's Historic Buildings

Climate change could lead to changes in cultural environments and landscapes as we know them. It presents an immediate and significant threat to our natural and built environments and to the ways of life that co-exist with these environments. In most traditional buildings, harmony between the built fabric and the natural environment has always been a central consideration: houses and cities have been blended into their natural surroundings with astonishing skill, and materials have been selected and used so as to achieve the utmost conformity with the environment, with the result that the areas thus created have a unique beauty and attraction. The extent to which climate change contributes to the destruction process of Iran's historic buildings is a subject of current discussion. Cities, towns and built-up areas also have their own characteristics that might make them particularly vulnerable to climate change.

Topographic Arrangement of 3D Design Components on 2D Maps by Unsupervised Feature Extraction

As a result of the daily workflow in the design development departments of companies, databases containing huge numbers of 3D geometric models are generated. For a given problem, engineers create CAD drawings based on their design ideas and evaluate the performance of the resulting design, e.g. by computational simulations. Usually, new geometries are built either by utilizing and modifying sets of existing components or by adding single newly designed parts to a more complex design. The present paper addresses two facets of this workflow: acquiring components from large design databases automatically and providing the engineer with a reasonable overview of the parts. A unified framework based on topographic non-negative matrix factorization (TNMF) is proposed which addresses both aspects simultaneously. First, meaningful components are extracted from a given database into a parts-based representation in an unsupervised manner. Second, the extracted components are organized and visualized on square-lattice 2D maps. It is shown, using the example of turbine-like geometries, that these maps efficiently provide a well-structured overview of the database content and, at the same time, define a measure of spatial similarity that allows easy access to and reuse of components in the design development process.
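A minimal sketch of the unsupervised parts-based decomposition is given below; it uses plain NMF from scikit-learn as a stand-in for the paper's TNMF (the topographic 2-D map constraint is not reproduced), and the data matrix and component count are assumptions.

```python
# A minimal sketch using plain NMF as a stand-in for TNMF: an unsupervised
# parts-based decomposition of a non-negative design database matrix.
import numpy as np
from sklearn.decomposition import NMF

# V: one rasterized/voxelized 3D geometry per row (stand-in random data here).
V = np.random.rand(500, 4096)

model = NMF(n_components=49, init="nndsvd", max_iter=500, random_state=0)
W = model.fit_transform(V)     # activations: designs x components
H = model.components_          # parts-based component basis

# In TNMF the 49 components would additionally be arranged on a 7x7 map so that
# neighbouring grid cells hold spatially similar parts.
```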

Investigation of Pre-Treatment Parameters of Rye and Triticale for Bioethanol Production

This paper presents new results on the pre-treatment of the energy crops rye and triticale, harvested at yellow ripeness and at full ripeness, in a high-pressure steam reactor, and on the subsequent monosaccharide extraction. The influence of steam pressure (20 to 22 bar), retention time (180 to 240 s) and catalytic sulphuric acid concentration (0 to 0.5%) on the pre-treatment process and on the contents of monosaccharides (glucose, arabinose, xylose, mannose) and undesirable by-products (furfural and HMF) in the reactor was investigated. The study determined that the largest amounts of monosaccharides (37.2% glucose, 2.7% arabinose, 8.4% xylose and 1.3% mannose) were obtained from fully ripe rye, the samples of which were mixed with 0.5% catalytic sulphuric acid and hydrolysed in the reactor at a pressure of 20 bar and a reaction time of 240 s.

Concept Indexing using Ontology and Supervised Machine Learning

Nowadays, ontologies are the only widely accepted paradigm for managing sharable and reusable knowledge in a way that allows its automatic interpretation. They are collaboratively created across the Web and used to index, search and annotate documents. The vast majority of ontology-based approaches, however, focus on indexing texts at the document level. Recently, with the advances in ontological engineering, it became clear that information indexing can benefit greatly from the use of general-purpose ontologies, which aid the indexing of documents at the word level. This paper presents a concept indexing algorithm which adds ontology information to words and phrases and allows full text to be searched, browsed and analyzed at different levels of abstraction. The algorithm uses a general-purpose ontology, OntoRo, and an ontologically tagged corpus, OntoCorp, both developed for the purposes of this research. OntoRo and OntoCorp are used in a two-stage supervised machine learning process aimed at generating ontology tagging rules. The first experimental tests show a tagging accuracy of 78.91%, which is encouraging with regard to further improvement of the algorithm.
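As a toy illustration only (hypothetical data and feature design, not OntoRo, OntoCorp or the paper's rule-learning stages), the sketch below shows the general kind of supervised tagger such a process is meant to produce: a classifier that assigns an ontology concept to an ambiguous word based on its surrounding context.

```python
# A toy sketch (hypothetical concepts and contexts): a supervised classifier
# that picks an ontology concept for a word from its context window.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Each training example: the context window of an ambiguous word, labelled with
# the ontology concept it was manually tagged with in the corpus.
contexts = ["deposit money in the bank account",
            "fishing on the bank of the river",
            "the bank approved the loan",
            "trees along the river bank"]
concepts = ["FINANCIAL_INSTITUTION", "RIVERSIDE",
            "FINANCIAL_INSTITUTION", "RIVERSIDE"]

tagger = make_pipeline(CountVectorizer(ngram_range=(1, 2)),
                       LogisticRegression(max_iter=1000))
tagger.fit(contexts, concepts)
print(tagger.predict(["opened an account at the bank"]))
```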