Novel Adaptive Channel Equalization Algorithms by Statistical Sampling

In this paper, novel statistical-sampling-based equalization techniques and Cellular Neural Network (CNN) based detection are proposed to increase the spectral efficiency of multiuser communication systems over fading channels. Multiuser communication combined with selective fading can result in interference which severely deteriorates the quality of service in wireless data transmission (e.g. CDMA in mobile communication). The paper introduces new equalization methods to combat interference by minimizing the Bit Error Rate (BER) as a function of the equalizer coefficients. This provides higher performance than traditional Minimum Mean Square Error equalization. Since calculating the BER as a function of the equalizer coefficients is of exponential complexity, statistical sampling methods are proposed to approximate the gradient, which yields fast equalization and performance superior to the traditional algorithms. Efficient estimation of the gradient is achieved by using stratified sampling and the Li-Silvester bounds. A simple mechanism is derived to identify the dominant samples in real time, for the sake of efficient estimation. The equalizer weights are adapted recursively by minimizing the estimated BER. The near-optimal performance of the new algorithms is demonstrated by extensive simulations. The paper also develops a CNN based approach to detection, in which fast quadratic optimization is carried out by the CNN, while the task of the equalizer is to ensure the required template structure (sparseness) for the CNN. The performance of this method is also analyzed by simulations.
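The BER-as-cost idea can be illustrated with a plain Monte-Carlo estimator for a linear equalizer over a known ISI channel with BPSK input. This is only a naive-sampling stand-in for the paper's stratified, Li-Silvester-bounded estimator; the channel taps, noise level, and zero-delay decision rule below are our own illustrative assumptions:

```python
import random

def ber_estimate(w, channel, n_bits=20000, noise=0.4, seed=1):
    """Monte-Carlo estimate of the BER achieved by linear equalizer taps w
    on a known ISI channel with BPSK input.  Plain sampling only -- a
    stand-in for the paper's stratified estimator, which concentrates
    samples on the dominant error patterns."""
    rng = random.Random(seed)
    bits = [rng.choice((-1.0, 1.0)) for _ in range(n_bits)]
    L, M = len(channel), len(w)
    # channel output (ISI) plus additive Gaussian noise
    r = [sum(channel[k] * bits[t - k] for k in range(L)) + rng.gauss(0.0, noise)
         for t in range(L - 1, n_bits)]
    errors = 0
    for t in range(M - 1, len(r)):
        y = sum(w[j] * r[t - j] for j in range(M))   # equalizer output
        errors += (y > 0) != (bits[t + L - 1] > 0)   # zero-delay decision
    return errors / (len(r) - M + 1)
```

Comparing candidate weight vectors by this estimated BER, rather than by mean square error, is the optimization target the paper's adaptive algorithms pursue.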

Partially Known Least Support Orthogonal Matching Pursuit (PKLS-OMP) for Signal Recovery

Given a large sparse signal, the goal is to reconstruct it precisely and accurately from as few measurements as possible. Although this is possible in theory, the difficulty lies in building an algorithm that reconstructs the signal both accurately and efficiently. This paper proposes a new method for sparse signal reconstruction that merges Least Support Orthogonal Matching Pursuit (LS-OMP) with the theory of Partial Known Support (PKS), yielding Partially Known Least Support Orthogonal Matching Pursuit (PKLS-OMP). Both methods use a greedy algorithm to compute the support, whose cost depends on the number of iterations; to speed this up, PKLS-OMP incorporates partially known support into the algorithm. The method recovers the original signal simply, efficiently, and accurately whenever the sampling matrix satisfies the Restricted Isometry Property (RIP). Simulation results show that it outperforms many existing algorithms, especially for compressible signals.
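The greedy recovery idea, with the support seeded by partially known indices, can be sketched as follows. This is our own simplified rendering of the concept, not the exact LS-OMP/PKLS-OMP algorithm; in practice the sampling matrix would be a RIP-satisfying random matrix rather than the trivial one used in the usage example:

```python
import numpy as np

def omp_partial_support(A, y, k, known_support=()):
    """Greedy OMP that seeds its support with partially known indices --
    a minimal sketch of the PKLS-OMP idea (names and details are ours)."""
    support = list(known_support)
    residual = y.astype(float).copy()
    coef = np.zeros(0)
    if support:
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    while len(support) < k:
        corr = np.abs(A.T @ residual)
        corr[support] = -np.inf            # never re-select a chosen column
        support.append(int(np.argmax(corr)))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat, sorted(support)
```

Seeding the support shortens the greedy loop: every known index is one iteration (and one least-squares solve) the algorithm no longer has to spend discovering it.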

Feature Selection with Kohonen Self Organizing Classification Algorithm

In this paper a one-dimensional Self Organizing Map (SOM) algorithm to perform feature selection is presented. The algorithm is based on a first classification of the input dataset in a similarity space. From this classification, a set of positive and negative features is computed for each class; this set of features is the result of the procedure. The procedure is evaluated on an in-house dataset from a Knowledge Discovery from Text (KDT) application and on a set of publicly available datasets used in international feature selection competitions. These datasets come from KDT applications, drug discovery, and other applications. The correct classification available for the training and validation datasets is used to optimize the parameters for positive and negative feature extraction. The process becomes feasible for large and sparse datasets, such as those obtained in KDT applications, by using compression techniques to store the similarity matrix together with speed-up techniques for the Kohonen algorithm that take advantage of the sparsity of the input matrix. These improvements, combined with grid computing, make it feasible to apply the methodology to massive datasets.
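The first, classification stage can be sketched with a generic online one-dimensional Kohonen SOM. The paper's similarity-space input, compression, and sparse speed-ups are not reproduced here; the learning-rate and neighborhood schedules are illustrative assumptions:

```python
import math, random

def train_som_1d(data, n_units, epochs=40, lr0=0.5, seed=0):
    """One-dimensional Kohonen SOM, trained online.  A generic sketch of
    the classification stage only; all schedule parameters are ours."""
    rng = random.Random(seed)
    dim = len(data[0])
    lo = [min(x[d] for x in data) for d in range(dim)]
    hi = [max(x[d] for x in data) for d in range(dim)]
    w = [[rng.uniform(lo[d], hi[d]) for d in range(dim)] for _ in range(n_units)]
    sigma0 = max(n_units / 2.0, 1.0)
    for e in range(epochs):
        lr = lr0 * (1.0 - e / epochs)                  # decaying learning rate
        sigma = max(sigma0 * (1.0 - e / epochs), 0.5)  # shrinking neighborhood
        for x in data:
            # best-matching unit by squared Euclidean distance
            bmu = min(range(n_units),
                      key=lambda i: sum((w[i][d] - x[d]) ** 2 for d in range(dim)))
            for i in range(n_units):
                h = math.exp(-(i - bmu) ** 2 / (2.0 * sigma ** 2))
                for d in range(dim):
                    w[i][d] += lr * h * (x[d] - w[i][d])
    return w
```

After training, each input is assigned to its best-matching unit, and per-class positive and negative features would then be derived from those assignments.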

Some Computational Results on MPI Parallel Implementation of Dense Simplex Method

There are two major variants of the Simplex Algorithm: the revised method and the standard, or tableau, method. Today, most serious implementations are based on the revised method because it is more efficient for sparse linear programming problems. However, a number of applications lead to dense linear programming problems, so our aim in this paper is to present some computational results on a parallel implementation of the dense Simplex Method. Our implementation runs on an SMP cluster and is written in the C programming language using the Message Passing Interface (MPI). Preliminary computational results on randomly generated dense linear programs are reported.
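For reference, the dense tableau iteration that such a method parallelizes can be sketched serially. This is a textbook standard-simplex step, not the authors' MPI code; the per-row eliminations in the inner loop are exactly the work an MPI implementation distributes across processes:

```python
def dense_simplex(c, A, b):
    """Standard (tableau) simplex for: max c.x  s.t.  A x <= b, x >= 0,
    with b >= 0.  Dense row operations throughout -- the structure that a
    row-distributed MPI implementation exploits."""
    m, n = len(A), len(c)
    # tableau: constraint rows with slack columns and RHS, then objective row
    T = [row[:] + [1.0 if i == j else 0.0 for j in range(m)] + [b[i]]
         for i, row in enumerate(A)]
    T.append([-ci for ci in c] + [0.0] * (m + 1))
    basis = list(range(n, n + m))
    while True:
        # entering variable: most negative reduced cost (Dantzig rule)
        col = min(range(n + m), key=lambda j: T[-1][j])
        if T[-1][col] >= -1e-9:
            break                                   # optimal
        # leaving variable: minimum ratio test
        ratios = [(T[i][-1] / T[i][col], i) for i in range(m) if T[i][col] > 1e-9]
        if not ratios:
            raise ValueError("unbounded")
        _, row = min(ratios)
        piv = T[row][col]
        T[row] = [v / piv for v in T[row]]
        for i in range(m + 1):                      # dense eliminations
            if i != row and abs(T[i][col]) > 1e-12:
                f = T[i][col]
                T[i] = [a - f * p for a, p in zip(T[i], T[row])]
        basis[row] = col
    x = [0.0] * n
    for i, bi in enumerate(basis):
        if bi < n:
            x[bi] = T[i][-1]
    return x, T[-1][-1]
```

Because every pivot touches every row of the dense tableau, the rows can be block-partitioned across MPI ranks, with only the pivot row broadcast each iteration.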

Characterization of an Acetobacter Strain Isolated from Iranian Peach that Tolerates High Temperatures and Ethanol Concentrations

Vinegar is a precious food additive and complement as well as an effective preservative against food spoilage. Recently, traditional vinegar production has been improved using various natural substrates and fruits such as grape, palm, cherry, coconut, date, sugarcane, rice, and balsam. These neoclassical fermentations have resulted in several vinegar types with different tastes, fragrances, and nutritional values, because various acetic acid bacteria are applied as starters. Acetic acid bacteria include the genera Acetobacter, Gluconacetobacter and Gluconobacter according to the latest edition of Bergey's Manual of Systematic Bacteriology, which classifies genera on the basis of their 16S rRNA differences. Acetobacter spp., the main vinegar starters, belong to the family Acetobacteraceae; they are gram-negative, obligately aerobic, chemoorganotrophic bacilli that are oxidase negative and oxidize ethanol to acetic acid. In this research we isolated and identified, from Iranian peach, a native Acetobacter strain with high acetic acid productivity and tolerance of high ethanol concentrations; peach is a delicious summer fruit that is very susceptible to spoilage and decay. We used selective and specific laboratory culture media such as standard GYC, Frateur and Carr media. We also used a new industrial culture medium and a miniature fermentor with a new aeration system developed by Pars Yeema Biotechnologists Co., Isfahan Science and Technology Town (ISTT), Isfahan, Iran. The isolated strain was successfully cultivated in modified Carr media with 2.5% and 5% ethanol at high temperatures (34-40 °C) after a 96-hour incubation period. We showed that increasing the ethanol concentration increased the strain's sensitivity to high temperature.
In conclusion, we isolated and characterized a new Acetobacter strain from Iranian peach that can be considered a potential strain for the production of a new vinegar type, peach vinegar, with a delicious taste and advantageous nutritional value in food biotechnology and industrial microbiology.

Dynamic Modeling of Underwater Manipulator and Its Simulation

High redundancy and strong uncertainty are two main characteristics of underwater robotic manipulators, giving them unlimited workspace and mobility but also making motion planning and control difficult and complex. In order to set up the groundwork for research on control schemes, the mathematical representation is built using the Denavit-Hartenberg (D-H) method [9], [12], and the geometry of the manipulator is studied to establish the direct and inverse kinematics. Then, the dynamic model is developed by employing the Lagrange formulation. Furthermore, derivation and computer simulation are carried out in the MATLAB environment, and the results are compared with those of ADAMS, a mechanical system dynamics analysis package. In addition, the creation of intelligent artificial skin using Interlink Force Sensing Resistor™ technology is presented as groundwork for future work.
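The D-H step can be made concrete: in the standard convention, the homogeneous transform between consecutive links is fully determined by the four D-H parameters (joint angle θ, link offset d, link length a, link twist α):

```python
import math

def dh_transform(theta, d, a, alpha):
    """4x4 homogeneous transform between consecutive links from the four
    Denavit-Hartenberg parameters (standard convention)."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [[ct, -st * ca,  st * sa, a * ct],
            [st,  ct * ca, -ct * sa, a * st],
            [0.0,      sa,       ca,      d],
            [0.0,     0.0,      0.0,    1.0]]
```

Multiplying these transforms along the kinematic chain gives the direct kinematics; the inverse kinematics inverts that mapping.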

A Maximum Parsimony Model to Reconstruct Phylogenetic Network in Honey Bee Evolution

Phylogenies, the evolutionary histories of groups of species, are among the most widely used tools throughout the life sciences, as well as objects of research within systematics and evolutionary biology. Phylogenetic analysis ordinarily reconstructs trees representing the evolutionary histories of groups of organisms. However, in bacteria, because of horizontal gene transfer, and in plants, because of hybridization, evolution is reticulate, and tree-constructing methods fail to capture the resulting reticulate networks. In this paper a model is employed to reconstruct a phylogenetic network in the honey bee, representing reticulate evolution in this species. The maximum parsimony approach is used to obtain this reticulate network.

A Patricia-Tree Approach for Frequent Closed Itemsets

In this paper, we propose an adaptation of the Patricia-Tree for sparse datasets to generate non-redundant association rules. Using this adaptation, we can generate frequent closed itemsets, which are more compact than the frequent itemsets used in the Apriori approach. The adaptation has been evaluated on a set of benchmark datasets.
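The target output can be defined by brute force: a frequent itemset is closed iff no proper superset has the same support. The enumeration below is exponential and only specifies what a Patricia-Tree-based miner computes efficiently; it is not the paper's algorithm:

```python
from itertools import combinations

def frequent_closed_itemsets(transactions, minsup):
    """Brute-force frequent closed itemsets: keep a frequent itemset only
    if no proper superset has the same support.  Exponential -- for
    specification and tiny examples only."""
    items = sorted({i for t in transactions for i in t})
    support = {}
    for r in range(1, len(items) + 1):
        for cand in combinations(items, r):
            s = sum(1 for t in transactions if set(cand) <= set(t))
            if s >= minsup:
                support[frozenset(cand)] = s
    return {I: s for I, s in support.items()
            if not any(I < J and s == sj for J, sj in support.items())}
```

The closed itemsets carry the same support information as all frequent itemsets in far fewer sets, which is why they yield non-redundant rules.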

Turbulent Forced Convection Flow in a Channel over Periodic Grooves Using Nanofluids

Turbulent forced convection flow in a two-dimensional channel over periodic grooves is numerically investigated using the finite volume method, and the effect of the turbulence model is studied. The Reynolds number is varied from 10000 to 30000 for a rib-height to channel-height ratio (B/H) of 2. The downstream wall is heated by a uniform heat flux while the upstream wall is insulated. Different types of nanoparticles, such as SiO2, Al2O3, and ZnO, with water as the base fluid, are analyzed. The volume fraction is varied from 1% to 4% and the nanoparticle diameter from 20 nm to 50 nm. The results reveal a 114% heat transfer enhancement compared to water in the grooved channel when using SiO2 nanoparticles with a volume fraction of 4% and a nanoparticle diameter of 20 nm.

Investigation on the HRSG Installation at South Pars Gas Complex Phases 2&3

This article investigates the installation of heat recovery steam generators (HRSG) on the exhaust of the turbo-generators of Phases 2 & 3 at the South Pars Gas Complex. The exhaust gas temperature is approximately 665 °C. The installation of the heat recovery boiler was simulated in ThermoFlow 17.0.2 software, based on test operation data and the site operating conditions of the equipment in the Pars Special Economic Energy Zone. The effect of installing the HRSG package on the available gas turbines and their operating parameters, the ambient temperature, the exhaust temperatures, and the steam flow rate were investigated. Based on the results, the recommended HRSG package should have a capacity of 98 tons per hour of high-pressure steam generation for this refinery, using the exhaust of three gas turbines per package, under the operating conditions of each refinery at an ambient temperature of 30 °C. Besides saving energy, this project will be environmentally friendly. The payback period is estimated at approximately 1.8 years when the Clean Development Mechanism is taken into account.

A Computer Aided Detection (CAD) System for Microcalcifications in Mammograms - MammoScan μCaD

Clusters of microcalcifications in mammograms are an important sign of breast cancer. This paper presents a complete Computer Aided Detection (CAD) scheme for automatic detection of clustered microcalcifications in digital mammograms. The proposed system, MammoScan μCaD, consists of three main steps. First, all potential microcalcifications are detected using a feature extraction method, VarMet, and adaptive thresholding; this also yields a number of false detections. The goal of the second step, Classifier level 1, is to remove everything but microcalcifications. The last step, Classifier level 2, uses learned dictionaries and sparse representations as a texture classification technique to distinguish single, benign microcalcifications from clustered microcalcifications, and to remove some remaining false detections. The system is trained and tested on real digital data from Stavanger University Hospital, and the results are evaluated by radiologists. The overall results are promising, with a sensitivity above 90% and a low false detection rate (approximately 1 unwanted detection per image, or 0.3 false detections per image).
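The adaptive-thresholding stage can be sketched generically: flag pixels that exceed their local mean by k local standard deviations. VarMet itself is not reproduced here; the window size and k are illustrative assumptions:

```python
def adaptive_threshold(img, win=1, k=1.0):
    """Flag pixels brighter than their local mean by k local standard
    deviations -- a generic adaptive-thresholding detector in the spirit
    of the first detection stage (not the paper's VarMet)."""
    h, w = len(img), len(img[0])
    out = [[False] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            # statistics over a (2*win+1)-square window, clipped at borders
            vals = [img[a][b]
                    for a in range(max(0, i - win), min(h, i + win + 1))
                    for b in range(max(0, j - win), min(w, j + win + 1))]
            mean = sum(vals) / len(vals)
            var = sum((v - mean) ** 2 for v in vals) / len(vals)
            out[i][j] = img[i][j] > mean + k * var ** 0.5
    return out
```

Because the threshold adapts to local statistics, small bright spots stand out even on a varying background, at the cost of the false detections the two classifier levels must then remove.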

Impact of Changes in Excise Tax Rate for Strong Alcohol on Consumption and State Revenues in Latvia

State tax revenues in most countries started to decrease during the recession. The Government of Latvia decided to compensate for the decline by increasing the rates of several taxes, including the excise tax on strong alcohol. The total increase in 2009 was 42%, with the rate rising from €896 to €1,266 per 100 l of absolute alcohol. Since then, this has had a negative impact on consumption volumes and on the split between the legal and illegal market. Legal alcohol sales decreased by almost 50% (by volume), consequently having a negative effect on State revenues from VAT and excise tax. Estimated results for 2010 indicate a €54 million decrease in VAT, excise tax, and other taxes versus 2008 (excise tax: -€19 million, VAT: -€30 million, other taxes: -€5 million). The paper aims to analyze the impact of the increase in excise tax on consumption patterns, State revenues, and the competitiveness of local companies, and to draw up proposals for the state authorities regarding more effective tax policies. The analysis reveals a relationship between the excise tax rate, the illegal alcohol market, and State revenues. The results can be used to improve the excise tax system and its effectiveness in Latvia.

Boundary-Element-Based Finite Element Methods for Helmholtz and Maxwell Equations on General Polyhedral Meshes

We present new finite element methods for Helmholtz and Maxwell equations on general three-dimensional polyhedral meshes, based on domain decomposition with boundary elements on the surfaces of the polyhedral volume elements. The methods use the lowest-order polynomial spaces and produce sparse, symmetric linear systems despite the use of boundary elements. Moreover, piecewise constant coefficients are admissible. The resulting approximation on the element surfaces can be extended throughout the domain via representation formulas. Numerical experiments confirm that the convergence behavior on tetrahedral meshes is comparable to that of standard finite element methods, and equally good performance is attained on more general meshes.

Quantitative Estimation of Periodicities in Lyari River Flow Routing

Hydrologic time series often display periodic structure, and the periodic autoregressive (PAR) process has received considerable attention in the modeling of such series. In this communication, a long-term record of the monthly waste flow of the Lyari river is used to quantify this periodicity using the PAR modeling technique. The model parameters are estimated using the Franses & Paap methodology. This study shows that a periodic autoregressive model of order 2 is the most parsimonious model for assessing periodicity in the waste flow of the river. A careful statistical analysis of the residuals of the PAR(2) model is used to establish goodness of fit. Forecasts from the proposed model confirm its significance and effectiveness.
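A PAR fit can be sketched as one least-squares regression per season. This is a minimal illustration of what a periodic AR model is; it does not reproduce the Franses & Paap estimation and testing methodology used in the paper:

```python
import numpy as np

def fit_par(y, period, order=2):
    """Per-season least-squares fit of a periodic AR(order) model:
    y_t = phi[s][0]*y_{t-1} + ... + phi[s][order-1]*y_{t-order},
    where s = t mod period.  A minimal sketch only."""
    phi = np.zeros((period, order))
    for s in range(period):
        X, z = [], []
        for t in range(order, len(y)):
            if t % period == s:
                X.append([y[t - k] for k in range(1, order + 1)])
                z.append(y[t])
        phi[s] = np.linalg.lstsq(np.array(X), np.array(z), rcond=None)[0]
    return phi
```

With monthly data, `period=12` gives each calendar month its own pair of AR(2) coefficients, which is what makes the model "periodic".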

Greek Compounds: A Challenging Case for the Parsing Techniques of PC-KIMMO v.2

In this paper we describe the recognition process of Greek compound words using the PC-KIMMO software. We try to show certain limitations of the system with respect to the principles of compound formation in Greek. Moreover, we discuss the computational processing of phenomena such as stress and syllabification, which are indispensable for the analysis of such constructions, and we try to propose linguistically acceptable solutions within the particular system.

Influence of Various Factors on Stability of CoSPc in LPG Sweetening Process

The IFP Group's "Sulfrex" process is used in the refineries of Iran's South Pars Gas Complex for removing sulfur compounds such as mercaptans, carbonyl sulfide and hydrogen sulfide; it uses sulfonated cobalt phthalocyanine (CoSPc) dispersed in an alkaline solution as catalyst. In this technology, the catalyst and alkaline solution are recirculated. However, the stability of the catalyst decreases as the unit runs, owing to several parameters, and sweetening efficiency therefore drops. Hence, the aim of this research is to study the factors affecting the stability of the catalyst.

Deep Web Content Mining

The rapid expansion of the web causes constant growth of information, leading to several problems such as the increased difficulty of extracting potentially useful knowledge. Web content mining confronts this problem by gathering explicit information from different web sites for access and knowledge discovery. Query interfaces of web databases share common building blocks. After extracting information with a parsing approach, we use a new data mining algorithm to match a large number of database schemas at a time; using this algorithm increases the speed of information matching. In addition, instead of simple 1:1 matching, it performs complex (m:n) matching between query interfaces. In this paper we present a novel correlation mining algorithm that matches correlated attributes at lower cost. The algorithm uses the Jaccard measure to distinguish positively and negatively correlated attributes. The system then matches the user query against the query interfaces of a given domain and finally chooses the query interface closest to the user query to answer it.
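The Jaccard-based correlation step can be sketched directly: attributes that co-occur on many of the same interfaces score high (positively correlated, likely synonyms to group), while attributes that rarely co-occur score low. The thresholds below are illustrative assumptions, not the paper's values:

```python
def jaccard(a, b):
    """Jaccard similarity of two attribute-occurrence sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def classify_pairs(occurrences, t_pos=0.5, t_neg=0.1):
    """Label attribute pairs as positively or negatively correlated by
    their Jaccard score.  `occurrences` maps each attribute to the set
    of query interfaces it appears on; thresholds are illustrative."""
    positive, negative = [], []
    attrs = sorted(occurrences)
    for i, p in enumerate(attrs):
        for q in attrs[i + 1:]:
            s = jaccard(occurrences[p], occurrences[q])
            if s >= t_pos:
                positive.append((p, q))
            elif s <= t_neg:
                negative.append((p, q))
    return positive, negative
```

Grouping positively correlated attributes and separating negatively correlated ones is what turns pairwise scores into complex (m:n) schema matches.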

Thematic Role Extraction Using Shallow Parsing

Extracting thematic (semantic) roles is one of the major steps in representing text meaning. It refers to finding the semantic relations between a predicate and the syntactic constituents in a sentence. In this paper we present a rule-based approach to extract semantic roles from Persian sentences. The system exploits a two-phase architecture to (1) identify the arguments and (2) label them for each predicate. For the first phase we developed a rule-based shallow parser to chunk Persian sentences, and for the second phase we developed a knowledge-based system to assign 16 selected thematic roles to the chunks. The experimental results of testing each phase are shown at the end of the paper.

Weight Functions for Signal Reconstruction Based On Level Crossings

Although the level crossing concept has been the subject of intensive investigation over the last few years, certain problems of great interest remain unsolved. One of these concerns the distribution of threshold levels. This paper presents new threshold level allocation schemes for level-crossing-based nonuniform sampling. Intuitively, it is more reasonable if the information-rich regions of the signal are sampled finely and those with sparse information are sampled coarsely. To achieve this objective, we propose non-linear quantization functions which dynamically assign the number of quantization levels depending on the importance of the given amplitude range. Two new approaches to determining the importance of a given amplitude segment are presented, based on exponential and logarithmic functions, respectively. Various aspects of the proposed techniques are discussed and experimentally validated, and their efficacy is investigated by comparison with uniform sampling.
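One way to realize such a non-linear level allocation is to map uniformly spaced points through an exponential (inverse μ-law style) compander, which makes the levels denser near zero amplitude, and then sample the signal at its level crossings. The specific functions below are our own illustration, not the paper's exact quantization functions:

```python
import math

def exp_levels(n, vmax, mu=4.0):
    """Map n uniformly spaced points in (-1, 1) through an inverse
    mu-law (exponential) compander, giving threshold levels denser
    near zero amplitude.  Illustrative importance function only."""
    levels = []
    for k in range(1, n + 1):
        u = 2.0 * k / (n + 1) - 1.0                 # uniform in (-1, 1)
        v = math.copysign(((1.0 + mu) ** abs(u) - 1.0) / mu, u)
        levels.append(vmax * v)
    return levels

def level_crossings(signal, levels):
    """Nonuniform sampling: emit a (sample index, level) pair each time
    the signal crosses one of the threshold levels."""
    samples = []
    for i in range(1, len(signal)):
        lo, hi = sorted((signal[i - 1], signal[i]))
        samples.extend((i, L) for L in levels if lo < L <= hi)
    return samples
```

Replacing the exponential map with a logarithmic one inverts the allocation, concentrating levels near the amplitude extremes instead; the choice encodes which amplitude ranges are deemed information-rich.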

Quantitative Study for Exchange of Gases from Open Sewer Channel to Atmosphere

In this communication a quantitative modeling approach is applied to construct a model for the exchange of gases from an open sewer channel to the atmosphere. Data on the gas exchange of the open sewer channel from January 1979 to December 2006 are used for the construction of the model. The study reveals that the stream flow of the open sewer channel exchanges toxic gases continuously, on a time-varying scale, and that the quantitative modeling approach yields a parsimonious model for these exchanges. The usual diagnostic tests are applied to establish model adequacy. The model is useful for planning and managerial bodies in improving implemented policies to overcome future environmental problems.