A Grey-Fuzzy Controller for Optimization Technique in Wireless Networks

Progress in wireless and mobile communications provides opportunities for introducing new standards and improving existing services, but supporting multimedia traffic over wireless networks requires quality of service (QoS) guarantees. In this paper, a grey-fuzzy controller for radio resource management (GF-RRM) is presented to maximize the number of served calls and the QoS provision in wireless networks. In a wireless network, the call arrival rate, the call duration and the communication overhead between the base stations and the control center are vague and uncertain. We develop a method to predict the cell load and to solve the RRM problem based on the GF-RRM, and the supporting facility is built at the application level of the wireless network. The GF-RRM exhibits better adaptability, fault tolerance and performance than other algorithms. Through simulations, we evaluate the blocking rate, update overhead, and channel acquisition delay of the proposed method. The results demonstrate that our algorithm has a lower blocking rate, less update overhead, and a shorter channel acquisition delay.
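The abstract does not specify the grey predictor's internals; as a hedged illustration of the grey prediction step, here is a minimal sketch of the classical GM(1,1) grey forecasting model in Python, applied to a hypothetical per-cell load series (the function and data are our own illustration, not the paper's implementation):

    import numpy as np

    def gm11_forecast(x0, steps=1):
        """Classical GM(1,1) grey model: fit the series x0, forecast `steps` values ahead."""
        x0 = np.asarray(x0, dtype=float)
        x1 = np.cumsum(x0)                                 # accumulated generating operation (AGO)
        z1 = 0.5 * (x1[1:] + x1[:-1])                      # mean sequence of consecutive AGO values
        B = np.column_stack([-z1, np.ones(len(z1))])
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # least-squares estimate of a, b
        n = len(x0)
        k = np.arange(n + steps)
        x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # solution of dx1/dt + a*x1 = b
        return np.diff(x1_hat)[n - 1:]                     # inverse AGO gives original-scale forecasts

    # Hypothetical per-cell load history (e.g., offered calls per interval).
    loads = [52, 55, 61, 58, 66, 70]
    print(gm11_forecast(loads, steps=1))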

A Review on Terrestrial Multimedia Communication using OFDM Technology

The development of wireless communication technologies has changed our way of living on a global scale. After the international success of mobile telephony standards, location- and time-independent voice connectivity has become a default feature of daily telecommunications. Today, highly advanced multimedia messaging plays a key role in value-added services. Along with evolving data services, the need for more complex applications can be seen, including the mobile use of broadcast technologies. Here, the performance of a system design for terrestrial multimedia content is examined with emphasis on mobile reception. This review covers the role of the physical layer and the effects of the terrestrial channel on terrestrial multimedia transmission using OFDM, with DVB-H as the benchmark standard.
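As a hedged illustration of the OFDM physical layer discussed in the review, the following Python sketch builds one baseband OFDM symbol (QPSK mapping, IFFT, cyclic prefix); the FFT size and prefix length are illustrative and far simpler than the actual DVB-H parameterization:

    import numpy as np

    def ofdm_symbol(bits, n_fft=64, cp_len=16):
        """Map QPSK bits onto one OFDM symbol: IFFT over subcarriers + cyclic prefix."""
        assert len(bits) == 2 * n_fft
        b = np.asarray(bits).reshape(-1, 2)
        qpsk = ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)  # Gray-mapped QPSK
        time = np.fft.ifft(qpsk) * np.sqrt(n_fft)       # orthogonal subcarriers -> time domain
        return np.concatenate([time[-cp_len:], time])   # cyclic prefix guards against multipath

    rng = np.random.default_rng(0)
    tx = ofdm_symbol(rng.integers(0, 2, 128))
    # Receiver: drop the prefix and FFT back to recover the subcarrier symbols.
    rx = np.fft.fft(tx[16:]) / np.sqrt(64)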

Particle Swarm Optimization with Reduction for Global Optimization Problems

This paper presents a particle swarm optimization algorithm with reduction for global optimization problems. Particle swarm optimization is inspired by the collective motion of flocks of birds or schools of fish; it is a multi-point search algorithm that seeks the best solution using multiple particles. Particle swarm optimization is flexible enough to adapt to a wide range of optimization problems. However, when an objective function has many intricate local minima, the particles may fall into one of them. To avoid this, a large number of particles are initially prepared and their positions are updated by particle swarm optimization. The particles are then sequentially reduced, based on their evaluation values, until a predetermined number remains, and particle swarm optimization continues until the termination condition is met. To show the effectiveness of the proposed algorithm, we search for the minima of test functions and compare the results with existing algorithms. Furthermore, the influence of the initial number of particles on the best value found by our algorithm is discussed.
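As a hedged sketch of the approach, the following Python code implements standard PSO updates with a particle-reduction step; the reduction schedule (cull the worst ten particles every 20 iterations until the target count is reached) is our own assumption, since the abstract does not give the paper's exact rule:

    import numpy as np

    def pso_with_reduction(f, dim, n_init=60, n_final=15, iters=200,
                           w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
        rng = np.random.default_rng(seed)
        lo, hi = bounds
        x = rng.uniform(lo, hi, (n_init, dim))        # positions
        v = np.zeros_like(x)                          # velocities
        pbest, pval = x.copy(), np.apply_along_axis(f, 1, x)
        for t in range(iters):
            g = pbest[np.argmin(pval)]                # global best so far
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)
            fx = np.apply_along_axis(f, 1, x)
            better = fx < pval
            pbest[better], pval[better] = x[better], fx[better]
            if len(x) > n_final and t % 20 == 19:     # assumed schedule: cull every 20 iterations
                keep = np.argsort(pval)[:max(n_final, len(x) - 10)]
                x, v, pbest, pval = x[keep], v[keep], pbest[keep], pval[keep]
        return pbest[np.argmin(pval)], pval.min()

    # Rastrigin test function: many local minima, global minimum 0 at the origin.
    rastrigin = lambda z: 10 * len(z) + np.sum(z**2 - 10 * np.cos(2 * np.pi * z))
    print(pso_with_reduction(rastrigin, dim=5))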

E-government Adoption in Romania

The Romanian government has been making significant efforts to make its services and information available on the Internet. According to the UN e-government survey conducted in 2008, Romania ranks among mid-range countries in e-government utilization (41%). Romania's national portal, www.e-guvernare.ro, aims at progressively making all services and information accessible through the portal. However, the success of these efforts depends, to a great extent, on how well the targeted users of such services, citizens in general, make use of them. For this reason, the purpose of the present study was to identify the factors that could affect citizens' adoption of e-government services. The study is an extension of the Technology Acceptance Model. The proposed model was validated using data collected from 481 citizens. The results provided substantial support for all proposed hypotheses and showed the significance of the extended constructs.

Management and Control of Industrial Effluents Discharged to Public Sewers: A Case Study

An overview of the important aspects of managing and controlling industrial effluent discharges to public sewers, namely sampling, characterization, quantification and legislative controls, is presented. The findings have been validated by means of a case study covering three industrial sectors: tanning, textile finishing and food processing. Industrial effluent discharges were found to be best monitored by systematic and automatic sampling, and quantified using water meter readings corrected for evaporative and consumptive losses. Based on the treatment processes employed in the publicly owned treatment works and the chemical oxygen demand and biochemical oxygen demand levels obtained, the effluents from all three industrial sectors studied were found to lie in the toxic zone. Thus, physico-chemical treatment of these effluents is required to bring them into the biodegradable zone. The kL values (quoted to base e) were greater than 0.50 day⁻¹, compared with 0.39 day⁻¹ for typical municipal wastewater.
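For reference, the quoted kL values belong to the first-order BOD model, BOD_t = L(1 - e^(-kL t)); the following short Python sketch compares the fraction of ultimate BOD exerted after the standard 5-day test window for the two rate constants mentioned:

    import math

    def bod_exerted_fraction(k_L, t_days):
        """First-order BOD model: fraction of ultimate BOD exerted after t days."""
        return 1 - math.exp(-k_L * t_days)

    # After 5 days (the standard BOD5 test window):
    print(bod_exerted_fraction(0.50, 5))  # effluents studied (kL > 0.50/day): ~0.92
    print(bod_exerted_fraction(0.39, 5))  # typical municipal wastewater: ~0.86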

An Automated Test Setup for the Characterization of Antenna in CATR

This paper describes the development of fully automated measurement software for antenna radiation pattern measurements in a Compact Antenna Test Range (CATR). The CATR covers a frequency range of 2-40 GHz, and the measurement hardware includes a network analyzer for transmitting and receiving the microwave signal and a positioner controller to control the motion of the Styrofoam column. The measurement process includes calibration of the CATR with a Standard Gain Horn (SGH) antenna, followed by gain-versus-angle measurement of the antenna under test (AUT). The software is designed to control a variety of microwave transmitters/receivers and two-axis positioner controllers through the standard General Purpose Interface Bus (GPIB). Support for new network analyzers can be added through a slight modification of the hardware control module. Time-domain gating is implemented to remove unwanted signals and obtain the isolated response of the AUT. The gated response of the AUT is compared with the calibration data in the frequency domain to obtain the desired results. Data acquisition and processing are implemented in Agilent VEE and Matlab. A variety of experimental measurements with SGH antennas were performed to validate the accuracy of the software. A comparison of results with existing commercial software is presented, and the measured results are found to agree within 0.2 dBm.
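As a hedged illustration of the time-domain gating step (our own minimal sketch in Python, not the paper's Agilent VEE/Matlab implementation): the sampled frequency response is transformed to the time domain, a tapered gate isolates the early direct response, and the gated data is transformed back:

    import numpy as np

    def time_gate(s21, gate_start, gate_stop):
        """Gate a sampled frequency response: IFFT -> window -> FFT.

        s21: complex S21 samples on a uniform frequency grid.
        gate_start/gate_stop: gate limits as indices into the time-domain response.
        """
        h = np.fft.ifft(s21)                 # impulse-response estimate
        gate = np.zeros(len(h))
        gate[gate_start:gate_stop] = np.hanning(gate_stop - gate_start)  # tapered gate
        return np.fft.fft(h * gate)          # back to frequency domain, reflections removed

    # Synthetic example: direct path plus a delayed reflection across 401 frequency points.
    f = np.arange(401)
    s21 = np.exp(-2j * np.pi * 20 * f / 401) + 0.3 * np.exp(-2j * np.pi * 120 * f / 401)
    gated = time_gate(s21, 10, 40)           # keep only the early (direct) response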

Analysis of Physicochemical Properties on Prediction of R5, X4 and R5X4 HIV-1 Coreceptor Usage

Bioinformatics methods for predicting T cell coreceptor usage from the HIV-1 membrane protein sequence are investigated. In this study, we aim to propose an effective prediction method for the three-class classification problem of CXCR4 (X4), CCR5 (R5) and CCR5/CXCR4 (R5X4) usage. We address the coreceptor prediction problem as follows: 1) proposing a feature set of informative physicochemical properties which, combined with an SVM, achieves a high prediction test accuracy of 81.48%, compared with 70.00% for the existing method; 2) establishing a large, up-to-date data set by increasing its size from 159 to 1225 sequences to verify the proposed prediction method, on which the mean test accuracy is 88.59%; and 3) analyzing the set of 14 informative physicochemical properties to further understand the characteristics of HIV-1 coreceptors.
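As a hedged sketch of the classification setup, the following Python code trains a standard RBF-kernel SVM on physicochemical feature vectors; the data here are random placeholders, since the paper's 14 properties and 1225 sequences are not reproduced in the abstract:

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Hypothetical data: each sequence encoded as a vector of physicochemical
    # property values; labels are the three coreceptor classes.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 14))                  # 14 informative properties per sequence
    y = rng.choice(["R5", "X4", "R5X4"], size=300)  # three-class problem

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    print(cross_val_score(clf, X, y, cv=5).mean())  # mean test accuracy across folds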

Use of Fuzzy Edge Image in Block Truncation Coding for Image Compression

An image compression method has been developed using a fuzzy edge image with the basic Block Truncation Coding (BTC) algorithm. The fuzzy edge image was validated against classical edge detectors, taking the results of the well-known Canny edge detector as a reference, before being applied in the proposed method. The bit plane generated by the conventional BTC method is replaced with a fuzzy bit plane, generated by a logical OR between the fuzzy edge image and the corresponding conventional BTC bit plane. The input image is encoded with the block mean, the standard deviation and the fuzzy bit plane. The proposed method has been tested with 8 bits/pixel test images of size 512×512 and found to be superior, with a better Peak Signal to Noise Ratio (PSNR), compared to the conventional BTC and adaptive bit plane selection BTC (ABTC) methods. The raggedness, jagged appearance and ringing artifacts at sharp edges are greatly reduced in images reconstructed by the proposed method with the fuzzy bit plane.
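As a hedged illustration of the BTC machinery the method modifies, here is a minimal Python sketch of block encoding and reconstruction; `fuzzy_edge` stands in as an assumed binary edge map, since the fuzzy edge detector itself is not specified in the abstract:

    import numpy as np

    def btc_encode_block(block, fuzzy_edge=None):
        """Encode one block as (mean, std, bit plane); optionally OR in a fuzzy edge map."""
        mean, std = block.mean(), block.std()
        plane = block >= mean                     # conventional BTC bit plane
        if fuzzy_edge is not None:
            plane = plane | fuzzy_edge            # proposed fuzzy bit plane (logical OR)
        return mean, std, plane

    def btc_decode_block(mean, std, plane):
        """Reconstruct a block from its two-level BTC representation."""
        q, m = plane.sum(), plane.size
        if q in (0, m):
            return np.full(plane.shape, mean)
        lo = mean - std * np.sqrt(q / (m - q))    # level for pixels below the mean
        hi = mean + std * np.sqrt((m - q) / q)    # level for pixels at or above the mean
        return np.where(plane, hi, lo)

    block = np.array([[10., 200., 12., 12.], [11., 198., 13., 12.],
                      [10., 199., 12., 11.], [12., 201., 13., 12.]])
    mean, std, plane = btc_encode_block(block)
    print(btc_decode_block(mean, std, plane))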

A New Model for Question Answering Systems

Most Question Answering (QA) systems are composed of three main modules: question processing, document processing and answer processing. The question processing module plays an important role in QA systems: if it does not work properly, it causes problems for the other modules. Answer processing, moreover, is an emerging topic in Question Answering, where systems are often required to rank and validate candidate answers. These techniques, aimed at finding short and precise answers, are often based on semantic classification. This paper discusses a new model for question answering which improves the two main modules of question processing and answer processing. Two components form the basis of question processing. The first is question classification, which specifies the types of the question and the answer. The second is reformulation, which converts the user's question into a form the QA system can understand within a specific domain. The answer processing module consists of candidate answer filtering and candidate answer ordering components, and also has a validation section for interacting with the user, which makes it better suited to finding exact answers. In this paper we describe the question and answer processing modules by modeling, implementing and evaluating the system. The system was implemented in two versions. The results show that Version No. 1 gave correct answers to 70% of the questions (30 correct answers to 50 asked questions) and Version No. 2 gave correct answers to 94% of the questions (47 correct answers to 50 asked questions).
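As a hedged illustration of the question classification component, here is a minimal pattern-based sketch in Python; the patterns and answer types are our own assumptions, since the paper's classifier and domain are not detailed in the abstract:

    import re

    # Hypothetical mapping from question patterns to expected answer types.
    RULES = [
        (r"^(who|whom)\b",         "PERSON"),
        (r"^where\b",              "LOCATION"),
        (r"^when\b|\bwhat year\b", "DATE"),
        (r"^how (many|much)\b",    "QUANTITY"),
        (r"^why\b",                "REASON"),
    ]

    def classify_question(question):
        """Return the expected answer type, defaulting to DEFINITION."""
        q = question.lower().strip()
        for pattern, answer_type in RULES:
            if re.search(pattern, q):
                return answer_type
        return "DEFINITION"

    print(classify_question("Who invented the transistor?"))    # PERSON
    print(classify_question("How many states are in the US?"))  # QUANTITY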

Recent Accounting Standard Setting Changes for Consolidated Financial Statements

In the current context of globalization, a large number of companies seek to develop as a group in order to reach other markets or to meet the criteria for listing on a stock exchange. The issue of consolidated financial statements prepared by a parent, an investor or a venturer, and of the financial reporting standards guiding them, therefore becomes even more important. The aim of our paper is to expose this issue in a consistent manner, first by summarizing the international accounting and financial reporting standards applicable before 1 January 2013 and considering the role of the crisis in shaping the standard-setting process, and secondly by analyzing the newly issued or modified standards and the main changes they bring.

Exploiting Two Intelligent Models to Predict Water Level: A Field Study of Urmia Lake, Iran

Water level forecasting from records of past time series is of importance in water resources engineering and management. For example, the water level affects groundwater tables in low-lying coastal areas, as well as the hydrological regimes of some coastal rivers. A reliable prediction of water level variations is therefore required in coastal engineering and hydrologic studies. During the past two decades, approaches based on Genetic Programming (GP) and Artificial Neural Networks (ANNs) have been developed. In the present study, GP is used to forecast daily water level variations over a set of time intervals using observed water levels. Measurements from a single gauge at Urmia Lake, Northwest Iran, were used to train and validate the GP approach for the period from January 1997 to July 2008. Two statistics, the root mean square error and the correlation coefficient, are used to verify the model by comparing its outputs with those of an Artificial Neural Network model. The results show that both artificial intelligence methodologies are satisfactory and can be considered as alternatives to conventional harmonic analysis.
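The two verification statistics are standard; as a brief illustration, the following Python sketch computes them for hypothetical observed and predicted level series (the values are placeholders, not the paper's data):

    import numpy as np

    def rmse(obs, pred):
        return np.sqrt(np.mean((np.asarray(obs) - np.asarray(pred)) ** 2))

    def corr_coef(obs, pred):
        return np.corrcoef(obs, pred)[0, 1]

    # Hypothetical daily water levels (m) for illustration only.
    observed  = [1272.10, 1272.08, 1272.05, 1272.01, 1271.98]
    predicted = [1272.11, 1272.07, 1272.06, 1272.00, 1271.96]
    print(rmse(observed, predicted), corr_coef(observed, predicted))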

Novel Rao-Blackwellized Particle Filter for Mobile Robot SLAM Using Monocular Vision

This paper presents a novel Rao-Blackwellized particle filter (RBPF) for mobile robot simultaneous localization and mapping (SLAM) using monocular vision. The particle filter is combined with an unscented Kalman filter (UKF), which extends the path posterior by sampling new poses that integrate the current observation and thereby drastically reduces the uncertainty about the robot pose. Landmark position estimation and update are also implemented through the UKF. Furthermore, the number of resampling steps is determined adaptively, which greatly reduces the particle depletion problem, and evolution strategies (ES) are introduced to avoid particle impoverishment. The 3D natural point landmarks are constructed from matched Scale Invariant Feature Transform (SIFT) feature pairs, and matching of the multi-dimensional SIFT features is implemented with a KD-tree at a time cost of O(log₂ N). Experimental results with a real robot in our indoor environment show the advantages of our methods over previous approaches.
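A common way to make the number of resampling steps adaptive, and a plausible reading of the abstract, is to resample only when the effective sample size of the particle weights falls below a threshold; the following Python sketch (our own illustration) shows this with systematic resampling:

    import numpy as np

    def effective_sample_size(weights):
        """N_eff = 1 / sum(w_i^2) for normalized weights; low values signal degeneracy."""
        w = np.asarray(weights) / np.sum(weights)
        return 1.0 / np.sum(w ** 2)

    def maybe_resample(particles, weights, rng, threshold=0.5):
        """Systematic resampling, performed only when N_eff falls below threshold * N."""
        n = len(particles)
        if effective_sample_size(weights) >= threshold * n:
            return particles, weights                  # skip: weights still healthy
        w = np.asarray(weights) / np.sum(weights)
        positions = (rng.random() + np.arange(n)) / n  # systematic (low-variance) draw
        idx = np.searchsorted(np.cumsum(w), positions)
        return particles[idx], np.full(n, 1.0 / n)

    rng = np.random.default_rng(0)
    particles = rng.normal(size=(100, 3))              # e.g. sampled robot poses (x, y, theta)
    weights = rng.random(100)
    particles, weights = maybe_resample(particles, weights, rng)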

Avoiding Catastrophic Forgetting by a Dual-Network Memory Model Using a Chaotic Neural Network

In neural networks, when new patterns are learned, the new information radically interferes with previously stored patterns. This drawback is called catastrophic forgetting or catastrophic interference. In this paper, we propose a biologically inspired neural network model which overcomes this problem. The proposed model consists of two distinct networks: one is a Hopfield-type chaotic associative memory and the other is a multilayer neural network. We consider these networks to correspond to the hippocampus and the neocortex of the brain, respectively. Incoming information is first stored in the hippocampal network with a fast learning algorithm. The stored information is then recalled through the chaotic behavior of each neuron in the hippocampal network. Finally, it is consolidated in the neocortical network by using pseudopatterns. Computer simulation results show that the proposed model avoids catastrophic forgetting much better than conventional models.
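As a hedged illustration of the hippocampal store, here is a minimal Python sketch of Hopfield-type Hebbian storage and recall on bipolar patterns; the paper's chaotic neuron dynamics and pseudopattern consolidation are beyond this snippet:

    import numpy as np

    def hopfield_store(patterns):
        """Hebbian one-shot storage of ±1 patterns into a weight matrix."""
        P = np.asarray(patterns, dtype=float)
        W = P.T @ P / P.shape[1]
        np.fill_diagonal(W, 0.0)                 # no self-connections
        return W

    def hopfield_recall(W, state, steps=20):
        """Synchronous recall: iterate sign(W @ s) until it settles."""
        s = np.asarray(state, dtype=float)
        for _ in range(steps):
            s = np.where(W @ s >= 0, 1.0, -1.0)
        return s

    pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
    W = hopfield_store([pattern])
    noisy = pattern.copy(); noisy[0] = -noisy[0]   # flip one bit
    print(hopfield_recall(W, noisy))               # recovers the stored pattern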

Unconstrained Arabic Online Handwritten Words Segmentation using New HMM State Design

In this paper we propose a segmentation system for unconstrained Arabic online handwriting, an essential problem for analytical word recognition systems. The system is composed of two stages: the first is a newly designed hidden Markov model (HMM) and the second is a rule-based stage. In our system, handwritten words are broken up into characters by simultaneous segmentation-recognition using HMMs of unique design, trained using online features, most of which are novel. The character boundaries output by the HMM represent the proposed segmentation points (PSPs), which are then validated by the rule-based post-processing stage, without the help of any contextual information, to resolve different segmentation errors. The HMM has been designed and tested using a self-collected dataset (OHASD) [1]. Most error cases are corrected, and remarkable segmentation enhancement is achieved. Very promising word and character segmentation rates are obtained, given the difficulty of unconstrained Arabic handwriting and the absence of contextual help.
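For readers unfamiliar with HMM decoding, the following generic Viterbi sketch in Python shows how a best state path, whose state changes mark candidate segmentation points, is recovered; the paper's special state design and online features are not reproduced here:

    import numpy as np

    def viterbi(log_A, log_B, log_pi):
        """Most likely state path for an observation sequence.

        log_A: (S, S) log transition matrix; log_B: (T, S) per-frame log likelihoods;
        log_pi: (S,) log initial state probabilities.
        """
        T, S = log_B.shape
        delta = log_pi + log_B[0]
        back = np.zeros((T, S), dtype=int)
        for t in range(1, T):
            scores = delta[:, None] + log_A            # best score into each state
            back[t] = np.argmax(scores, axis=0)
            delta = scores[back[t], np.arange(S)] + log_B[t]
        path = [int(np.argmax(delta))]
        for t in range(T - 1, 0, -1):
            path.append(int(back[t][path[-1]]))
        return path[::-1]   # state changes along this path mark candidate boundaries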

A Novel Adaptive E-Learning Model Based on Developed Learner's Styles

Adaptive e-learning today gives the student a central role in his or her own learning process. It allows learners to try things out, participate in courses like never before, and get more out of learning. In this paper, an adaptive e-learning model for logic design, the simplification of Boolean functions and related fields is presented. The model offers suitable courses to each student in a dynamic and adaptive manner using existing database and workflow technologies. The main objective of this research is to provide an adaptive e-learning model based on the learner's personality, using explicit and implicit feedback. To recognize learners, we develop dimensions to determine each individual's learning style, in order to accommodate the different abilities of users and to develop vital skills. The proposed model thus becomes more powerful, user friendly, and easy to use and interpret. Finally, it suggests a learning strategy and appropriate electronic media that match the learner's preferences.

Horizontal Aspects of Planning Climate Change Adapted Management of Wetlands

Climate change has severe effects on natural habitats, especially wetlands. These challenges require that their management be adapted to the probable effects of climate change. A compilation of the necessary changes in land management was assembled for a Hungarian area that is both a national park and a Natura 2000 SAC and SCI site, with the aim of increasing resilience and reducing vulnerability. Several factors, such as ecological aspects, nature conservation and climatic adaptation, should be combined with social and economic factors when developing climate-change-adapted management of vulnerable wetlands. Planning adaptive management should be guided by a priority order of conservation aims and an evaluation of factors for the chosen planning unit. Mowing techniques, frequency and exact dates should be considered, as well as the grazing species and their breeds, owing to differences in grazing, group-forming and trampling habits. Integrating landscape history and historical land development into the planning process is essential.

Sentiment Analysis: Popularity of Candidates for the President of the United States

This article deals with the popularity of the candidates for President of the United States of America. Popularity is assessed from public comments on the Web 2.0. Social networking, blogging and online forums (collectively, Web 2.0) are the easiest way for ordinary Internet users to share their personal opinions, thoughts and ideas with the entire world. However, the diversity of web content, the variety of technologies and the differences in website structure make the Web 2.0 a network of heterogeneous data in which things are difficult for ordinary users to find. The introductory part of the article describes a methodology for gathering and processing data from the Web 2.0. The remainder of the article focuses on the evaluation and content analysis of the obtained information that discusses the presidential candidates.
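As a hedged illustration of the kind of content analysis involved, here is a minimal lexicon-based sentiment sketch in Python; the lexicon and scoring rule are our own placeholders, since the article's actual method is not detailed in the abstract:

    # Hypothetical sentiment lexicon; real analyses use much larger, weighted lexicons.
    POSITIVE = {"great", "honest", "strong", "support", "best"}
    NEGATIVE = {"weak", "corrupt", "worst", "against", "liar"}

    def sentiment_score(comment):
        """(positives - negatives) / tokens: crude polarity of one Web 2.0 comment."""
        tokens = comment.lower().split()
        pos = sum(t.strip(".,!?") in POSITIVE for t in tokens)
        neg = sum(t.strip(".,!?") in NEGATIVE for t in tokens)
        return (pos - neg) / max(len(tokens), 1)

    comments = ["He is the best, honest candidate!", "Worst and weakest campaign ever."]
    print([sentiment_score(c) for c in comments])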

Automatic Microaneurysm Quantification for Diabetic Retinopathy Screening

A microaneurysm (MA) is a key indicator of diabetic retinopathy that can potentially cause damage to the retina. Early detection and automatic quantification are the keys to preventing further damage. In this paper, which focuses on automatic microaneurysm detection in images acquired through non-dilated pupils, we present a series of experiments on feature selection and automatic microaneurysm pixel classification. We found that the best feature set is a combination of 10 features: the pixel's intensity in the shade-corrected image, the pixel hue, the standard deviation of the shade-corrected image, DoG4, and the area, perimeter, eccentricity, circularity, mean intensity on the shade-corrected image, and ratio of major to minor axis length of the candidate MA. The overall sensitivity, specificity, precision, and accuracy are 84.82%, 99.99%, 89.01%, and 99.99%, respectively.
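For reference, the four reported measures follow directly from pixel-level confusion counts; a short Python sketch (with placeholder counts, not the paper's data):

    def classification_metrics(tp, fp, tn, fn):
        """Sensitivity, specificity, precision, accuracy from confusion counts."""
        sensitivity = tp / (tp + fn)          # fraction of true MA pixels detected
        specificity = tn / (tn + fp)          # fraction of non-MA pixels kept
        precision   = tp / (tp + fp)          # fraction of detections that are real
        accuracy    = (tp + tn) / (tp + fp + tn + fn)
        return sensitivity, specificity, precision, accuracy

    # Hypothetical counts for illustration only (not the paper's data).
    print(classification_metrics(tp=950, fp=120, tn=9_000_000, fn=170))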

Electrical Characteristics of SCR-based ESD Device for I/O and Power Rail Clamp in 0.35 μm Process

This paper presents SCR-based ESD protection devices for an I/O clamp and a power rail clamp, respectively. These devices have a lower trigger voltage and a higher holding voltage than a conventional SCR device. They were fabricated in a 0.35 μm BCD (Bipolar-CMOS-DMOS) process and validated using a TLP (transmission line pulse) system. From the experimental results, the device for the I/O ESD clamp has a trigger voltage of 5.8 V, and the device for the power rail ESD clamp has a holding voltage of 7.7 V.

Multiple-Points Fault Signature's Dynamics Modeling for Bearing Defect Frequencies

The occurrence of a multiple-point fault in machine operation can produce complex fault signatures, which can lower fault diagnosis accuracy. In this study, a multiple-points defect model (MPDM) is proposed which can simulate the fault signature dynamics for n-point bearing faults. Furthermore, this study shows that, in the case of a multiple-point fault in a rotary machine, the location of the dominant component of the defect frequency shifts depending on the relative locations of the fault points, which can mislead the fault diagnostic model into inaccurate detections. Analytical and experimental results are presented to characterize and validate this variation in the dominant component of the defect frequency. Based on envelope detection analysis, a modification is recommended to existing fault diagnostic models: consider the multiples of the defect frequency rather than only the frequency spectrum at the defect frequency itself, in order to incorporate the impact of multiple-point faults.
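As a hedged illustration of the envelope detection analysis the recommendation rests on, the following Python sketch demodulates a synthetic bearing vibration signal with the Hilbert transform; the sampling rate and defect frequency are illustrative:

    import numpy as np
    from scipy.signal import hilbert

    fs = 12_000                                   # sampling rate (Hz), illustrative
    t = np.arange(0, 1, 1 / fs)
    fault_hz = 105                                # hypothetical defect frequency
    # Vibration model: resonance bursts amplitude-modulated at the defect frequency.
    signal = (1 + 0.8 * np.cos(2 * np.pi * fault_hz * t)) * np.sin(2 * np.pi * 3000 * t)

    envelope = np.abs(hilbert(signal))            # demodulate: recover the burst envelope
    spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
    freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
    # Peaks appear at the defect frequency and its multiples; diagnosis should
    # consider these harmonics, not only the component at fault_hz itself.
    print(freqs[np.argmax(spectrum)])             # ~105 Hz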