A New Fast Intra Prediction Mode Decision Algorithm for H.264/AVC Encoders

The H.264/AVC video coding standard contains a number of advanced features. One of the new features introduced in this standard is multiple-mode intra prediction, which exploits directional spatial correlation with adjacent blocks. With this feature, intra coding in H.264/AVC offers considerably higher coding efficiency than earlier compression standards, but the computational complexity increases significantly when the brute-force rate-distortion optimization (RDO) algorithm is used. In this paper, we propose a new fast intra prediction mode decision method to reduce the complexity of H.264 video coding. For luma intra prediction, the proposed method consists of two steps. In the first step, RDO is performed for four modes of the intra 4x4 block; based on the distribution of the RDO costs of these modes and on the strong correlation between adjacent modes, the best mode of the intra 4x4 block is selected. In the second step, based on the fact that the dominant direction of a smaller block is similar to that of a bigger block, the candidate modes of 8x8 blocks and 16x16 macroblocks are determined. For chroma intra prediction, since the variance of the chroma pixel values is much smaller than that of the luma values, the proposed method uses only the DC mode. Experimental results show that the new fast intra mode decision algorithm speeds up intra coding significantly with negligible loss of PSNR.
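
A minimal sketch of this kind of two-step candidate pruning is given below, assuming a hypothetical rd_cost() callback; the initial 4x4 mode subset, the mode neighbourhood and the 4x4-to-16x16 mapping are illustrative assumptions, not the exact rules of the proposed method.

```python
# Illustrative sketch (not the authors' code) of fast intra mode pruning.
# rd_cost(block, mode) is a hypothetical callback returning the RDO cost of
# coding a block with a given H.264 intra prediction mode.

I4_MODES = list(range(9))         # H.264 luma 4x4 intra modes 0..8
I16_MODES = [0, 1, 2, 3]          # 16x16 modes: vertical, horizontal, DC, plane
INITIAL_I4 = [0, 1, 2, 8]         # assumed first-step subset

# Assumed mapping from a dominant 4x4 direction to 16x16 candidates
# (the smaller block's direction is used as a hint for the bigger block).
I4_TO_I16 = {0: [0, 2], 1: [1, 2], 2: [2, 3]}

def fast_intra_4x4(block, rd_cost):
    """Step 1: RDO over a small mode subset, then refine around the winner."""
    costs = {m: rd_cost(block, m) for m in INITIAL_I4}
    best = min(costs, key=costs.get)
    # Assumed neighbourhood: modes with adjacent indices to the winner.
    for m in (best - 1, best + 1):
        if m in I4_MODES and m not in costs:
            costs[m] = rd_cost(block, m)
    return min(costs, key=costs.get)

def fast_intra_16x16(mb, dominant_4x4_mode, rd_cost):
    """Step 2: restrict 16x16 candidates using the dominant 4x4 direction."""
    candidates = I4_TO_I16.get(dominant_4x4_mode, I16_MODES)
    return min(candidates, key=lambda m: rd_cost(mb, m))

CHROMA_MODE = "DC"  # chroma prediction fixed to the DC mode, as proposed above
```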

Identification and Classification of Plastic Resins using Near Infrared Reflectance Spectroscopy

In this paper, an automated system is presented for the identification and separation of plastic resins based on near infrared (NIR) reflectance spectroscopy. For identification and separation of resins, a "Two-Filter" identification method is proposed that is capable of distinguishing among polyethylene terephthalate (PET), high density polyethylene (HDPE), polyvinyl chloride (PVC), polypropylene (PP) and polystyrene (PS). By surveying the effects of parameters such as surface contamination, sample thickness, and the presence of labels and caps, it was shown that the "Two-Filter" method identifies resins with high efficiency. It is shown that accurate identification and separation of the five major resins can be achieved by calculating the relative reflectance at two wavelengths in the NIR region.
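
The decision rule can be pictured as below: compute the ratio of reflectances measured through two NIR filters and map it to a resin class. The filter wavelengths and thresholds in this sketch are placeholders, since the abstract does not give the values used in the study.

```python
import numpy as np

# Illustrative sketch of a two-wavelength ("Two-Filter") decision rule.
# Wavelengths and decision boundaries are placeholders, not the paper's values.
LAMBDA_1, LAMBDA_2 = 1660, 1720   # nm, hypothetical filter centres

def relative_reflectance(spectrum, wavelengths):
    """Return the reflectance ratio R(LAMBDA_1)/R(LAMBDA_2) for one spectrum."""
    r1 = np.interp(LAMBDA_1, wavelengths, spectrum)
    r2 = np.interp(LAMBDA_2, wavelengths, spectrum)
    return r1 / r2

def classify_resin(ratio):
    """Map the ratio to one of the five resins via hypothetical thresholds."""
    bins = [(0.80, "PET"), (0.90, "PVC"), (1.00, "PS"), (1.10, "PP")]
    for upper, resin in bins:
        if ratio < upper:
            return resin
    return "HDPE"
```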

Challenges of Implementing Urban Master Plans: The Lahore Experience

A master plan is a tool to guide and manage the growth of cities in a planned manner. The soul of a master plan lies in its implementation framework. If it is not implemented, people are trapped in a mess of urban problems and laissez-faire development with serious long-term repercussions. Unfortunately, the Master Plans prepared for several major cities of Pakistan could not be fully implemented due to a host of reasons, and Lahore is no exception. Being the second largest city of Pakistan with a population of over 7 million people, Lahore holds the distinction that the first ever Master Plan in the country was prepared for this city in 1966. Recently, in 2004, a new plan titled 'Integrated Master Plan for Lahore-2021' was approved for implementation. This paper provides a comprehensive account of the weaknesses and constraints in the plan preparation process and implementation strategies of the Master Plans prepared for Lahore. It also critically reviews the new Master Plan, particularly with respect to the proposed implementation framework. The paper discusses the prospects and pre-conditions for successful implementation of the new Plan in the light of historical analysis, interviews with stakeholders, and the new institutional context under the devolution plan.

Analysis of Blind Decision Feedback Equalizer Convergence: Interest of a Soft Decision

In this paper, the behavior of decision feedback equalizers (DFEs) adapted by the decision-directed or constant modulus blind algorithms is presented. An analysis of the error surface of the corresponding cost functions is first developed. With the intention of avoiding ill-convergence of the algorithm, the paper proposes to modify the shape of the cost function error surface by using a soft decision instead of the hard one. This is shown to reduce the influence of false decisions and to smooth out the undesirable minima. Modified algorithms using the soft decision during a pseudo-training phase, with an automatic switch to the proper tracking phase, are then derived. Computer simulations show that these modified algorithms have a better ability to avoid local minima than conventional ones.
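
The sketch below illustrates the idea for a decision-directed DFE: the hard slicer in the adaptation loop is replaced by a smooth (tanh) decision, which is the kind of soft decision discussed above. Filter lengths, step size and the softness parameter are illustrative choices, not the paper's settings, and BPSK symbols are assumed.

```python
import numpy as np

def soft_decision(y, alpha=2.0):
    """Smooth slicer replacing the hard sign() decision (BPSK assumed)."""
    return np.tanh(alpha * y)

def dd_dfe(rx, n_ff=8, n_fb=4, mu=1e-3, alpha=2.0):
    """Decision-directed DFE adaptation using soft decisions (sketch only)."""
    ff = np.zeros(n_ff); ff[0] = 1.0   # feed-forward taps
    fb = np.zeros(n_fb)                # feedback taps
    past = np.zeros(n_fb)              # previously decided symbols
    out = []
    for n in range(n_ff - 1, len(rx)):
        x = rx[n - n_ff + 1:n + 1][::-1]
        y = ff @ x - fb @ past
        d = soft_decision(y, alpha)    # pseudo-training with soft decisions
        e = d - y                      # decision-directed error
        ff += mu * e * x               # LMS-style tap updates
        fb -= mu * e * past
        past = np.roll(past, 1); past[0] = d
        out.append(d)
    return np.array(out)
```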

Fusion of ETM+ Multispectral and Panchromatic Texture for Remote Sensing Classification

This paper proposes to use ETM+ multispectral data and the panchromatic band, as well as texture features derived from the panchromatic band, for land cover classification. Four texture features, including one 'internal texture' and three GLCM-based textures, namely correlation, entropy, and inverse difference moment, were used in combination with ETM+ multispectral data. Two data sets involving combinations of multispectral data, the panchromatic band, and its texture were used, and the results were compared with those obtained by using multispectral data alone. A decision tree classifier, with and without boosting, was used to classify the different datasets. Results from this study suggest that the dataset consisting of the panchromatic band, four of its texture features, and the multispectral data was able to increase the classification accuracy by about 2%. In comparison, a boosted decision tree was able to increase the classification accuracy by about 3% with the same dataset.
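
A sketch of this feature-stacking idea is given below for the three GLCM measures (the 'internal texture' feature is omitted); the window size, grey-level quantisation and classifier settings are assumptions, not the values used in the paper, and scikit-image/scikit-learn are assumed available.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # skimage >= 0.19
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

def glcm_features(window, levels=32):
    """Correlation, entropy and inverse difference moment of one pan window."""
    q = (window / window.max() * (levels - 1)).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1], angles=[0], levels=levels,
                        symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    correlation = graycoprops(glcm, "correlation")[0, 0]
    idm = graycoprops(glcm, "homogeneity")[0, 0]   # inverse difference moment
    return [correlation, entropy, idm]

def train_boosted_tree(ms_pixels, pan_windows, labels):
    """ms_pixels: (n, bands) multispectral values; pan_windows: panchromatic
    windows centred on the same pixels; labels: land cover classes."""
    X = np.hstack([ms_pixels,
                   np.array([glcm_features(w) for w in pan_windows])])
    clf = AdaBoostClassifier(DecisionTreeClassifier(max_depth=5),
                             n_estimators=50)
    return clf.fit(X, labels)
```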

Multi-level Metadata Integration System: XML, RDF and RuleML

Our work falls within the field of heterogeneous data integration, with the definition of a structural and semantic mediation model. Our aim is to propose an architecture for the mediation of metadata from heterogeneous sources represented by the XML, RDF and RuleML models, providing metadata transparency to the user. This is achieved by accommodating data structures of fundamentally different natures and by allowing a query involving multiple sources to be decomposed into queries specific to those sources, after which the results are recomposed.
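
The mediator pattern described above can be sketched as follows, with hypothetical interfaces (this is not the paper's implementation): a global query is decomposed into sub-queries addressed to source-specific wrappers, and the partial results are then recomposed.

```python
class XMLWrapper:
    def execute(self, subquery):
        return [("xml", subquery)]        # stand-in for an XPath/XQuery call

class RDFWrapper:
    def execute(self, subquery):
        return [("rdf", subquery)]        # stand-in for a SPARQL call

class RuleMLWrapper:
    def execute(self, subquery):
        return [("ruleml", subquery)]     # stand-in for rule evaluation

class Mediator:
    def __init__(self, wrappers):
        self.wrappers = wrappers          # {source name: wrapper object}

    def decompose(self, query):
        """Assumed decomposition: one sub-query per registered source."""
        return {name: query for name in self.wrappers}

    def answer(self, query):
        parts = [self.wrappers[name].execute(sub)
                 for name, sub in self.decompose(query).items()]
        return [row for part in parts for row in part]    # recompose results

mediator = Mediator({"xml": XMLWrapper(), "rdf": RDFWrapper(),
                     "ruleml": RuleMLWrapper()})
print(mediator.answer("title = 'metadata integration'"))
```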

FSM-based Recognition of Dynamic Hand Gestures via Gesture Summarization Using Key Video Object Planes

The use of the human hand as a natural interface for human-computer interaction (HCI) serves as the motivation for research in hand gesture recognition. Vision-based hand gesture recognition involves visual analysis of hand shape, position and/or movement. In this paper, we use the concept of object-based video abstraction to segment the frames into video object planes (VOPs), as used in MPEG-4, with each VOP corresponding to one semantically meaningful hand position. Next, the key VOPs are selected on the basis of the amount of change in hand shape: for a given key frame in the sequence, the next key frame is the one in which the hand changes its shape significantly. Thus, an entire video clip is transformed into a small number of representative frames that are sufficient to represent a gesture sequence. Subsequently, we model a particular gesture as a sequence of key frames, each bearing information about its duration. These constitute a finite state machine (FSM). For recognition, the states of the incoming gesture sequence are matched with the states of all the FSMs contained in the database of the gesture vocabulary. The core idea of our proposed representation is that the redundant frames of the gesture video sequence bear only the temporal information of a gesture and hence are discarded for computational efficiency. The experimental results demonstrate the effectiveness of the proposed scheme for key frame extraction, subsequent gesture summarization, and finally gesture recognition.
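
A compact sketch of key-VOP selection and FSM matching is given below. The pixel-wise shape distance and the thresholds are illustrative assumptions, not the exact measures used in the paper.

```python
import numpy as np

def shape_distance(mask_a, mask_b):
    """Fraction of pixels on which two binary hand masks disagree."""
    return np.mean(mask_a.astype(bool) ^ mask_b.astype(bool))

def key_vops(masks, threshold=0.15):
    """Keep the first VOP, then every VOP whose shape differs enough from the
    last key VOP; redundant frames contribute only to the duration count."""
    keys, durations = [masks[0]], [1]
    for m in masks[1:]:
        if shape_distance(m, keys[-1]) > threshold:
            keys.append(m); durations.append(1)
        else:
            durations[-1] += 1
    return keys, durations

def match_gesture(keys, fsm_states, tol=0.15):
    """Sequentially match key shapes against one FSM's state templates."""
    state = 0
    for k in keys:
        if state < len(fsm_states) and shape_distance(k, fsm_states[state]) <= tol:
            state += 1
    return state == len(fsm_states)   # accepted only if all states are visited
```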

Modeling the Influence of Socioeconomic and Land-Use Factors on Mode Choice: A Comparison of Riyadh, Saudi Arabia, and Melbourne, Australia

Metropolitan areas have suffered from traffic problems, which have steadily increased in many monocentric cities. Urban expansion, population growth, and road network development have resulted in a structural shift toward urban sprawl, increasing commuters' dependence on private modes of transport. This paper aims to model the influence of socioeconomic and land-use factors on mode choice using multinomial and nested logit models. Land-use patterns, such as residential, commercial, retail, educational, and employment-related uses, affect the choice of mode and destination in the short and medium term. Socioeconomic factors, such as age, gender, income, household size, and house type, also affect mode choice, while residential location is affected in the long term. Riyadh in Saudi Arabia and Melbourne in Australia were chosen as case studies. Riyadh is a car-dependent city with limited public transport, whereas Melbourne has good public transport but increasing car dependence. Aggregate-level land-use data and disaggregate-level individual, household, and journey-to-work data are used to determine the effects of land-use and socioeconomic factors on mode choice. The model results indicate that urban sprawl is the main factor affecting mode choice, together with income and house type.
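
For readers unfamiliar with the model form, the sketch below computes multinomial logit choice probabilities from a linear-in-parameters utility specification. The variables and coefficient values are illustrative only; the paper's estimated models for Riyadh and Melbourne are not reproduced here.

```python
import numpy as np

MODES = ["car", "public_transport", "walk"]

def utilities(person):
    """Illustrative utilities from socioeconomic and land-use inputs."""
    beta = {"car":              dict(asc=0.0,  income=0.4,  density=-0.3),
            "public_transport": dict(asc=-0.5, income=-0.2, density=0.6),
            "walk":             dict(asc=-1.0, income=-0.3, density=0.8)}
    return np.array([b["asc"] + b["income"] * person["income"]
                     + b["density"] * person["residential_density"]
                     for b in (beta[m] for m in MODES)])

def choice_probabilities(person):
    v = utilities(person)
    e = np.exp(v - v.max())              # numerically stable logit (softmax)
    return dict(zip(MODES, e / e.sum()))

# Example: a high-income commuter living in a low-density suburb.
print(choice_probabilities({"income": 2.0, "residential_density": 0.3}))
```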

The Knowledge Representation of the Genetic Regulatory Networks Based on Ontology

Understanding biological behavior and phenomena at the system level requires elements such as gene sequences, protein structures, gene functions and metabolic pathways. Challenging problems include representing, learning and reasoning about these biochemical reactions, gene and protein structures, the relation between genotype and phenotype, and the expression systems built on those interactions. The goal of our work is to understand the behavior of interaction networks and to model their evolution in time and in space. In this study, we propose an ontological meta-model for the knowledge representation of genetic regulatory networks. In artificial intelligence, an ontology defines the fundamental categories and relations that provide a framework for knowledge models. Domain ontologies are now commonly used to enable heterogeneous information resources, such as knowledge-based systems, to communicate with each other. The interest of our model lies in its ability to represent spatial, temporal and spatio-temporal knowledge. We validated our propositions on the genetic regulatory network of the Arabidopsis thaliana flower.

Effects of Nanolayer Structure and Brownian Motion of Particles in Thermal Conductivity Enhancement of Nanofluids

Nanofluids are novel fluids that are expected to play an important role in future industrial thermal device designs. Studies are predominantly being conducted on the mechanisms of their heat transfer. The key to this attraction is the increase in thermal conductivity brought about by nanofluids compared with the base fluid. Different models have been proposed for calculating the effective thermal conductivity, and these have gradually been modified. In this investigation, the effects of nanolayer structure and the Brownian motion of particles are studied, and a new modified thermal conductivity model is proposed. Temperature, concentration, nanolayer thickness and particle size are taken as variables, and their effects on the thermal conductivity of the fluid are studied simultaneously, showing that the concentration of the nanoparticles affects the nanolayer thickness, which in turn affects the Brownian motion.
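
To illustrate how nanolayer thickness and particle size enter such models, the sketch below evaluates a Maxwell-type effective conductivity with a nanolayer (the Yu-Choi renovated Maxwell form). It is not the modified model proposed in the paper, and the Brownian-motion contribution is left as a placeholder since its exact form is not given in the abstract.

```python
def k_effective(k_f, k_p, phi, d_p, h, k_layer, k_brownian=0.0):
    """Yu-Choi-type static model plus an optional Brownian term (sketch only).
    k_f, k_p, k_layer: conductivities of fluid, particle, nanolayer (W/m K);
    phi: particle volume fraction; d_p: particle diameter (m);
    h: nanolayer thickness (m)."""
    r = d_p / 2.0
    beta = h / r                               # layer-to-radius ratio
    gamma = k_layer / k_p
    # Equivalent conductivity of the particle together with its nanolayer.
    k_pe = k_p * gamma * (2 * (1 - gamma) + (1 + beta) ** 3 * (1 + 2 * gamma)) \
           / (-(1 - gamma) + (1 + beta) ** 3 * (1 + 2 * gamma))
    phi_e = phi * (1 + beta) ** 3              # enlarged volume fraction
    k_static = k_f * (k_pe + 2 * k_f + 2 * (k_pe - k_f) * phi_e) \
               / (k_pe + 2 * k_f - (k_pe - k_f) * phi_e)
    return k_static + k_brownian               # add a Brownian term if modelled

# Example: 1 % alumina-like particles (k_p ~ 40 W/m K) in water (k_f ~ 0.6),
# 30 nm particles with an assumed 2 nm nanolayer of conductivity 3 W/m K.
print(k_effective(k_f=0.6, k_p=40.0, phi=0.01, d_p=30e-9, h=2e-9, k_layer=3.0))
```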

Generation of Sets of Synthetic Classifiers for the Evaluation of Abstract-Level Combination Methods

This paper presents a new technique for generating sets of synthetic classifiers to evaluate abstract-level combination methods. The sets differ in terms of both the recognition rates of the individual classifiers and their degree of similarity. For this purpose, each abstract-level classifier is considered as a random variable producing one class label as the output for an input pattern. From the initial set of classifiers, new slightly different sets are generated by applying specific operators defined for this purpose. Finally, the sets of synthetic classifiers are used to estimate the performance of combination methods for abstract-level classifiers. The experimental results demonstrate the effectiveness of the proposed approach.
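
A minimal sketch of this generation scheme follows: a synthetic abstract-level classifier is a vector of output labels produced at a target recognition rate, and a perturbation operator derives a slightly different classifier from it. The specific perturbation operator and the majority-vote combiner below are assumptions for illustration, not the operators defined in the paper.

```python
import random

def synth_classifier(truth, recognition_rate, n_classes, rng=random):
    """Random labels that agree with the truth at the target recognition rate."""
    out = []
    for t in truth:
        if rng.random() < recognition_rate:
            out.append(t)                                    # correct label
        else:
            out.append(rng.choice([c for c in range(n_classes) if c != t]))
    return out

def perturb(classifier, n_classes, flip_fraction=0.05, rng=random):
    """Derive a similar classifier by re-deciding a small fraction of patterns."""
    new = classifier[:]
    for i in rng.sample(range(len(new)), int(flip_fraction * len(new))):
        new[i] = rng.choice(range(n_classes))
    return new

def majority_vote(classifiers):
    """A simple abstract-level combination method evaluated on the sets."""
    return [max(set(votes), key=votes.count) for votes in zip(*classifiers)]

truth = [random.randrange(10) for _ in range(1000)]
base = [synth_classifier(truth, 0.85, 10) for _ in range(5)]
similar = [perturb(c, 10) for c in base]
combined = majority_vote(similar)
print(sum(c == t for c, t in zip(combined, truth)) / len(truth))
```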

Applying GQM Approach towards Development of Criterion-Referenced Assessment Model for OO Programming Courses

The most influential programming paradigm today is object-oriented (OO) programming, and it is widely used in education and industry. Recognizing the importance of equipping students with OO knowledge and skills, it is not surprising that most Computer Science degree programs offer OO-related courses. How do we assess whether students have acquired the right object-oriented skills after they have completed their OO courses? What are object-oriented skills? Currently, none of the existing assessment techniques can provide this answer. Traditional forms of OO programming assessment provide a way of assigning numerical scores to determine letter grades, but this rarely reveals information about how students actually understand OO concepts. It therefore appears reasonable that a better understanding of how to define and assess OO skills is needed, which we address by developing a criterion-referenced model. This is even more critical in the context of Malaysia, where there is currently a growing concern over the level of competency of Malaysian IT graduates in object-oriented programming. This paper discusses the approach used to develop the criterion-referenced assessment model. The model can serve as a guideline when conducting OO programming assessment. The proposed model is derived using the Goal Question Metric (GQM) methodology, which helps formulate the metrics of interest. The paper concludes with a few suggestions for further study.
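
To show the shape of a GQM decomposition in this setting, the snippet below encodes one hypothetical goal with example questions and metrics for OO programming assessment; these are illustrative placeholders, not the criteria defined in the paper's model.

```python
# Hypothetical Goal-Question-Metric structure for OO programming assessment.
gqm_model = {
    "goal": "Assess students' object-oriented design and programming skills",
    "questions": [
        {"question": "Can the student model a problem with appropriate classes?",
         "metrics": ["number of identified classes matching a reference design",
                     "correct use of encapsulation (private fields, accessors)"]},
        {"question": "Can the student apply inheritance and polymorphism?",
         "metrics": ["presence of a justified class hierarchy",
                     "overridden methods invoked through base-class references"]},
    ],
}

for q in gqm_model["questions"]:
    print(q["question"])
    for m in q["metrics"]:
        print("  metric:", m)
```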

High Impedance Fault Detection using LVQ Neural Networks

This paper presents a new method to detect high impedance faults in radial distribution systems. The magnitudes of the third and fifth harmonic components of voltages and currents are used as the feature vector for fault discrimination. The proposed methodology uses a learning vector quantization (LVQ) neural network as a classifier for identifying high impedance arc-type faults. The network learns from data obtained from the simulation of a simple radial system under different fault and system conditions. Compared to a feed-forward neural network, a properly tuned LVQ network gives a quicker response.
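
A rough sketch of the pipeline is given below: the third and fifth harmonic magnitudes are extracted from one window of voltage and current samples via the FFT and fed to a basic LVQ1 classifier. The sampling assumptions (50 Hz fundamental, integer-cycle windows), prototype initialization and learning parameters are placeholders, not the paper's settings.

```python
import numpy as np

def harmonic_features(v, i, fs, f0=50.0):
    """Return |V3|, |V5|, |I3|, |I5| from one window of samples at rate fs."""
    def mag(x, k):
        spec = np.fft.rfft(x) / len(x)
        bin_ = int(round(k * f0 * len(x) / fs))
        return 2 * abs(spec[bin_])
    return np.array([mag(v, 3), mag(v, 5), mag(i, 3), mag(i, 5)])

def lvq1_train(X, y, prototypes, proto_labels, lr=0.05, epochs=30):
    """Basic LVQ1: attract the winning prototype on a correct label, repel it
    otherwise."""
    P = prototypes.copy()
    for _ in range(epochs):
        for x, label in zip(X, y):
            w = np.argmin(np.linalg.norm(P - x, axis=1))    # winning prototype
            step = lr if proto_labels[w] == label else -lr
            P[w] += step * (x - P[w])
    return P

def lvq1_classify(x, P, proto_labels):
    return proto_labels[np.argmin(np.linalg.norm(P - x, axis=1))]
```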

Asymptotic Stability of Input-saturated System with Linear-growth-bound Disturbances via Variable Structure Control: An LMI Approach

Variable Structure Control (VSC) is one of the most useful tools for handling practical systems with uncertainties and disturbances. Up to now, unfortunately, not enough studies on input-saturated systems with linear-growth-bound disturbances via VSC have been presented. Therefore, this paper proposes an asymptotic stability condition for such systems via VSC. The designed VSC controller consists of two parts: the linear control part plays a role in stabilizing the system while, simultaneously, the nonlinear control part rejects the linear-growth-bound disturbances perfectly. All conditions derived in this paper are expressed as Linear Matrix Inequalities (LMIs), which can be easily solved with an LMI toolbox in MATLAB.
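
To give a flavour of how such LMI conditions are checked numerically, the sketch below solves only the standard Lyapunov inequality A^T P + P A < 0 with P > 0 as a feasibility problem in Python (cvxpy with its bundled SDP solver is assumed). The paper's VSC-specific LMIs with input saturation and disturbance bounds are not reproduced here.

```python
import numpy as np
import cvxpy as cp

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])               # example stable system matrix

n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),                    # P positive definite
               A.T @ P + P @ A << -eps * np.eye(n)]     # Lyapunov inequality
prob = cp.Problem(cp.Minimize(0), constraints)          # pure feasibility
prob.solve()
print("LMI feasible:", prob.status == cp.OPTIMAL)
```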

Unit Commitment Solution Methods

An effort to develop a unit commitment approach capable of handling large power systems consisting of both thermal and hydro generating units offers a large profitable return. In order to be feasible, the method to be developed must be flexible, efficient and reliable. In this paper, various proposed methods are described along with their strengths and weaknesses. As all of these methods have some weaknesses, a comprehensive algorithm that combines the strengths of the different methods and overcomes their individual weaknesses would be a suitable approach for solving industry-grade unit commitment problems.

Web Traffic Mining using Neural Networks

With the explosive growth of data available on the Internet, personalization of this information space has become a necessity. With the rapidly increasing popularity of the WWW, websites are playing a crucial role in conveying knowledge and information to end users. Discovering hidden and meaningful information about Web users' usage patterns is critical for determining effective marketing strategies and for optimizing Web server usage to accommodate future growth. The task of mining useful information becomes more challenging when the Web traffic volume is enormous and keeps growing. In this paper, we propose an intelligent model to discover and analyze useful knowledge from the available Web log data.
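
The sketch below illustrates one possible preprocessing and mining step of this kind: server log entries in Common Log Format are grouped into per-visitor feature vectors and mapped onto a small grid of prototype vectors to expose groups of similar usage patterns. The log format, feature choice and the simplified network (a self-organizing-map-style competitive layer without a neighbourhood update) are assumptions, not the model proposed in the paper.

```python
import re
import numpy as np
from collections import defaultdict

LOG_RE = re.compile(r'(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)')

def session_features(log_lines, pages):
    """One vector per client IP: fraction of requests to each tracked page."""
    counts = defaultdict(lambda: np.zeros(len(pages)))
    for line in log_lines:
        m = LOG_RE.match(line)
        if m and m.group(2) in pages:
            counts[m.group(1)][pages.index(m.group(2))] += 1
    return {ip: v / v.sum() for ip, v in counts.items() if v.sum() > 0}

def train_prototypes(X, grid=4, epochs=200, lr=0.3, rng=np.random):
    """Tiny competitive layer: each grid cell holds a prototype usage pattern
    (neighbourhood update omitted for brevity)."""
    W = rng.rand(grid * grid, X.shape[1])
    for t in range(epochs):
        eta = lr * (1 - t / epochs)
        for x in X:
            w = np.argmin(np.linalg.norm(W - x, axis=1))   # best matching unit
            W[w] += eta * (x - W[w])
    return W
```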

A Black-box Approach for Response Quality Evaluation of Conversational Agent Systems

The evaluation of conversational agent or chatterbot question answering systems is a major research area that needs much attention. Before the rise of domain-oriented conversational agents based on natural language understanding and reasoning, evaluation was never a problem, as information retrieval-based metrics were readily available for use. However, when chatterbots began to become more domain specific, evaluation became a real issue. This is especially true when understanding and reasoning are required to cater for a wider variety of questions and, at the same time, to achieve high quality responses. This paper discusses the inappropriateness of the existing measures for response quality evaluation, and the call for new standard measures and related considerations is brought forward. As a short-term solution for evaluating the response quality of conversational agents, and to demonstrate the challenges in evaluating systems of different natures, this research proposes a black-box approach using observation, a classification scheme and a scoring mechanism to assess and rank three example systems: AnswerBus, START and AINI.

Single Image Defogging Method Using Variational Approach for Edge-Preserving Regularization

In this paper, we propose a variational approach to the single image defogging problem. In the inference of the atmospheric veil, we define a new functional for the atmospheric veil that satisfies an edge-preserving regularization property. By using the fundamental lemma of the calculus of variations, we derive the Euler-Lagrange equation for the atmospheric veil, whose solution extremizes the functional. This equation is solved using a gradient descent method with a time parameter. We then obtain the estimated atmospheric veil and conduct image restoration using the inferred veil. Finally, we improve the contrast of the restored image using various histogram equalization methods. The experimental results show that the proposed method achieves good defogging results.
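
The overall pipeline can be sketched as below: the atmospheric veil V is estimated by gradient descent on an energy with a data term and an edge-preserving (total-variation-like) regularizer, and the scene is then restored with the standard fog model I = J(1 - V/A) + V. The energy, step sizes and bounds are generic choices for illustration; this is not the exact functional derived in the paper.

```python
import numpy as np

def estimate_veil(I_min, lam=0.1, tau=0.2, iters=200, eps=1e-3):
    """I_min: per-pixel minimum over colour channels (rough veil upper bound).
    Descends E(V) = 0.5*(V - I_min)^2 + lam*sqrt(|grad V|^2 + eps)."""
    V = I_min.copy()
    for _ in range(iters):
        gx = np.gradient(V, axis=1)
        gy = np.gradient(V, axis=0)
        mag = np.sqrt(gx ** 2 + gy ** 2 + eps)
        # Divergence of the normalised gradient: edge-preserving smoothing term.
        div = np.gradient(gx / mag, axis=1) + np.gradient(gy / mag, axis=0)
        V -= tau * ((V - I_min) - lam * div)        # explicit gradient descent
        V = np.clip(V, 0.0, I_min)                  # veil cannot exceed I_min
    return V

def restore(I, V, A=1.0, t_min=0.1):
    """Invert the fog model using the inferred veil (I is HxWx3 in [0, 1])."""
    t = np.clip(1.0 - V / A, t_min, 1.0)            # transmission from the veil
    return np.clip((I - V[..., None]) / t[..., None], 0.0, 1.0)
```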

A Modified Fuzzy C-Means Algorithm for Natural Data Exploration

In data mining, fuzzy clustering algorithms have demonstrated advantages over crisp clustering algorithms in dealing with the challenges posed by large collections of vague and uncertain natural data. This paper reviews the concepts of fuzzy logic and fuzzy clustering. The classical fuzzy c-means algorithm is presented and its limitations are highlighted. Based on a study of the fuzzy c-means algorithm and its extensions, we propose a modification of the c-means algorithm to overcome its limitations in calculating the new cluster centers and in finding the membership values with natural data. The efficiency of the new modified method is demonstrated on real data collected for Bhutan's Gross National Happiness (GNH) program.
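
For reference, the classical fuzzy c-means updates that the paper sets out to modify are sketched below: the algorithm alternates between the centre update and the membership update until the memberships stop changing. This is the standard algorithm, not the proposed modification.

```python
import numpy as np

def fcm(X, c, m=2.0, tol=1e-4, max_iter=100, rng=np.random):
    """Classical fuzzy c-means: X is (n, d) data, c clusters, fuzzifier m."""
    n = X.shape[0]
    U = rng.dirichlet(np.ones(c), size=n)            # random fuzzy partition
    for _ in range(max_iter):
        Um = U ** m
        # Cluster centres: membership-weighted means of the data.
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Distances of every point to every centre.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-10
        # Membership update: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1)).
        U_new = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)),
                             axis=2)
        if np.max(np.abs(U_new - U)) < tol:
            return centers, U_new
        U = U_new
    return centers, U
```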

Removal of Boron from Waste Waters by Ion- Exchange in a Batch System

Boron minerals are very useful for various industrial activities, such as the glass and detergent industries, due to their mechanical and chemical properties. During the production of boron compounds, many of these are introduced into the environment in the form of waste. Boron is also an important micronutrient for plant growth, but if it is present in high concentrations it can have toxic effects. The maximum boron level in drinking water for human health is given as 0.3 mg/L in the World Health Organization (WHO) standards. The toxic effects of boron are of particular concern in dry regions; thus, in recent years, increasing attention has been paid to removing boron from waste waters. In this study, boron removal is implemented by an ion exchange process using Amberlite IRA-743 resin. Amberlite IRA-743 is a boron-specific resin belonging to the group of polymer sorbents with aminopolyol functional groups. Batch studies were performed to investigate the effects of various experimental parameters, such as adsorbent dose, initial concentration and pH, on the removal of boron. It is found that as the adsorbent dose increases, the removal of boron from the liquid phase increases; however, an increase in the initial concentration decreases the removal of boron. The effective pH values for the removal of boron are determined to be between 8.5 and 9. The equilibrium isotherms were also analyzed using the Langmuir and Freundlich isotherm models, and the data obey the Langmuir isotherm better than the Freundlich isotherm.
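
The isotherm comparison can be reproduced in outline as below, fitting the two models to equilibrium data with nonlinear least squares. The data points in this sketch are placeholders, not measurements from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, q_max, K_L):
    """Langmuir isotherm: qe = q_max * K_L * Ce / (1 + K_L * Ce)."""
    return q_max * K_L * Ce / (1.0 + K_L * Ce)

def freundlich(Ce, K_F, n):
    """Freundlich isotherm: qe = K_F * Ce^(1/n)."""
    return K_F * Ce ** (1.0 / n)

Ce = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])     # mg/L (placeholder data)
qe = np.array([1.1, 1.9, 3.0, 4.6, 5.6, 6.1])       # mg/g (placeholder data)

for name, model, p0 in [("Langmuir", langmuir, (7.0, 0.5)),
                        ("Freundlich", freundlich, (2.0, 2.0))]:
    params, _ = curve_fit(model, Ce, qe, p0=p0)
    ss_res = np.sum((qe - model(Ce, *params)) ** 2)
    r2 = 1 - ss_res / np.sum((qe - qe.mean()) ** 2)
    print(f"{name}: params={params.round(3)}, R^2={r2:.3f}")
```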