Effects of Road Disturbance on Plant Biodiversity

Urbanization and related anthropogenic modifications cause extensive habitat fragmentation and directly drive the decline of local biodiversity. Conservation biologists advocate corridor creation as one approach to rescuing biodiversity. Here we examine the utility of roads as corridors for preserving plant diversity by investigating roadside vegetation in the Yellow River Delta (YRD), China. We examined the spatio-temporal distribution patterns of plant species richness, diversity and composition along roadsides. The results suggest that roads, as dispersal conduits, increase the probability that new settlers will arrive in an area, while also accumulating greater propagule pressure and offering favourable survival conditions during the operation phase. As a result, more species, including native and alien plants, non-halophytes and halophytes, and threatened and cosmopolitan species, were found thriving at roadsides. Roadsides may thus serve as a refuge for many species, and the pattern of vegetation distribution is affected by road age and the distance from the road verge.

Over-Height Vehicle Detection in Low Headroom Roads Using Digital Video Processing

In this paper we present a new method for over-height vehicle detection on low-headroom streets and highways using digital video processing. Its accuracy, its lower price compared to existing detectors such as laser radars, and its capability to provide extra information such as speed and height measurements make this method more reliable and efficient. In this algorithm, features are selected and tracked using the KLT algorithm. A blob extraction algorithm is also applied, using background estimation and subtraction. The world coordinates of the features that lie inside the blobs are then estimated using a novel calibration method. Once the heights of the features are calculated, we apply a threshold to select over-height features and eliminate the others. The over-height features are segmented using several association criteria and grouped using an undirected graph, then tracked through sequential frames. The resulting groups correspond to over-height vehicles in the scene.
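
As an illustration of the tracking and blob-extraction stages, the following minimal Python/OpenCV sketch combines KLT feature tracking with background subtraction; the calibration and grouping stages are omitted, and names such as traffic.avi and pixel_to_world are hypothetical placeholders, not the paper's implementation.

    import cv2
    import numpy as np

    # Background model for blob extraction (estimation + subtraction).
    backsub = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

    cap = cv2.VideoCapture("traffic.avi")  # hypothetical input clip
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    # Shi-Tomasi corners serve as the features tracked by KLT.
    features = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                       qualityLevel=0.01, minDistance=7)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # KLT: pyramidal Lucas-Kanade optical flow tracks each feature.
        features, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                       features, None)
        tracked = features[status.flatten() == 1]

        # Foreground blobs from background subtraction.
        mask = backsub.apply(frame)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                                np.ones((5, 5), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)

        # Keep only features that fall inside a foreground blob; their
        # world heights would then come from the calibration step.
        for (x, y) in tracked.reshape(-1, 2):
            inside = any(cv2.pointPolygonTest(c, (float(x), float(y)),
                                              False) >= 0
                         for c in contours)
            # height = pixel_to_world(x, y)[2]  # hypothetical calibration hook
            # over-height test would go here: inside and height > LIMIT

        prev_gray = gray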

On the Need for an Additional Methodology for Psychological Product Measurement and Evaluation

Cognitive science appeared about 40 years ago, in the wake of the challenge posed by artificial intelligence, as common territory for several scientific disciplines: computer science, mathematics, psychology, neurology, philosophy, sociology, and linguistics. The newborn science was justified by the complexity of the problems related to human knowledge on the one hand, and on the other by the fact that none of the above-mentioned sciences could explain mental phenomena alone. Based on data supplied by experimental sciences such as psychology and neurology, cognitive science builds models of the operation of the human mind. These models are implemented in computer programs and/or electronic circuits (specific to artificial intelligence) – cognitive systems – whose competences and performances are compared to human ones, leading to the reinterpretation of psychological and neurological data and to the construction of new models. In these processes, psychology provides the experimental basis, while philosophy and mathematics provide the level of abstraction utterly necessary for the mediation between the sciences mentioned. The general problematic of the cognitive approach yields two important types of approach: the computational one, starting from the idea that mental phenomena can be reduced to calculus operations on 1s and 0s, and the connectionist one, which considers the products of thinking to be a result of the interaction between all the composing (included) systems. In psychology, measurements in the computational register use classical inquiries and psychometric tests, generally based on calculus methods. Considering both sides of cognitive science, we can notice a gap in the possibilities for measuring psychological products from the connectionist perspective, which requires a unitary understanding of the quality-quantity whole. In such an approach, measurement by calculus proves inefficient. Our research, conducted over more than 20 years, leads to the conclusion that measurement by forms properly fits the laws and principles of connectionism.

An Ensemble of Weighted Support Vector Machines for Ordinal Regression

Instead of traditional (nominal) classification, we investigate the subject of ordinal classification, or ranking. An enhanced method based on an ensemble of Support Vector Machines (SVMs) is proposed. Each binary classifier is trained with specific weights for each object in the training data set. Experiments on benchmark datasets and synthetic data indicate that the performance of our approach is comparable to state-of-the-art kernel methods for ordinal regression. The ensemble method, which is straightforward to implement, provides a very good sensitivity-specificity trade-off for the highest and lowest ranks.
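
The abstract does not publish the exact weighting scheme, but the general construction can be sketched with the common reduction of K-rank ordinal regression to K-1 weighted binary SVMs ("is the rank greater than k?"). The sketch below uses scikit-learn and passes per-object weights through sample_weight; uniform weights are the default since the paper's weights are not given here.

    import numpy as np
    from sklearn.svm import SVC

    class OrdinalSVMEnsemble:
        """K-rank ordinal classifier built from K-1 binary SVMs.

        Classifier k answers "rank > k?"; predictions are combined by
        counting the positive votes, a standard reduction for ordinal
        regression.
        """

        def __init__(self, n_ranks, C=1.0):
            self.n_ranks = n_ranks
            self.models = [SVC(C=C, kernel="rbf")
                           for _ in range(n_ranks - 1)]

        def fit(self, X, y, sample_weights=None):
            # sample_weights: optional list of per-object weight arrays,
            # one array per binary classifier.
            for k, model in enumerate(self.models, start=1):
                target = (y > k).astype(int)            # task "rank > k"
                w = None if sample_weights is None else sample_weights[k - 1]
                model.fit(X, target, sample_weight=w)   # per-object weights
            return self

        def predict(self, X):
            # Rank = 1 + number of binary classifiers voting "greater".
            votes = np.sum([m.predict(X) for m in self.models], axis=0)
            return 1 + votes

    # usage on synthetic data with 4 ordinal ranks
    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 4))
    y = np.digitize(X[:, 0], [-1, 0, 1]) + 1
    model = OrdinalSVMEnsemble(n_ranks=4).fit(X, y)
    print(model.predict(X[:5]))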

Comparison of Conventional and “ECO” Transportation Pavements in Cyprus using Life Cycle Approach

The road industry has taken up the challenge of eco-construction, as pavements can fit within the framework of sustainable development. Hence, research assesses the environmental impacts of conventional pavements using a life cycle approach. To meet global, and often national, targets on pollution control, newly introduced pavement designs are under study. This is the case of the Cyprus demonstration carried out within the EcoLanes project. This alternative pavement differs in its concrete layer, which is reinforced with a tire-recycling product: processing of post-consumer tires produces steel fibers that improve strength capacity against cracking. Maintenance works are thus considerably limited in comparison to flexible pavement, making it more eco-friendly according to the outputs of the current study. More specifically, the life cycle of the proposed concrete pavement emits 15% less air pollutants and consumes 28% less embodied energy than that of the asphalt pavement. In addition, there is also a cost reduction of 0.06%.

Multiscale Analysis and Change Detection Based on a Contrario Approach

Automatic methods of detecting changes through satellite imaging are the object of growing interest, especially because of numerous applications linked to analysis of the Earth's surface or the environment (monitoring vegetation, updating maps, risk management, etc.). This work implemented spatial analysis techniques using images with different spatial and spectral resolutions acquired on different dates. The work was based on the principle of control charts, setting the upper and lower limits beyond which a change is noted. The a contrario approach was then applied by testing different thresholds at which the difference calculated between two pixels becomes significant. Finally, labeled images were considered, yielding a particularly low difference, which meant that the number of “false changes” could be estimated according to a given limit.
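
A minimal NumPy sketch of the control-chart step described above, assuming limits set at the mean ± k standard deviations of the per-pixel difference (k = 3 is the conventional choice); the subsequent a contrario validation is not shown.

    import numpy as np

    def control_chart_changes(img_t1, img_t2, k=3.0):
        """Flag candidate changes between two co-registered images.

        The per-pixel difference is compared against control limits set
        at mean +/- k standard deviations, mirroring the control-chart
        principle described in the abstract.
        """
        diff = img_t2.astype(np.float64) - img_t1.astype(np.float64)
        mu, sigma = diff.mean(), diff.std()
        upper, lower = mu + k * sigma, mu - k * sigma
        return (diff > upper) | (diff < lower)   # boolean change mask

    # usage on synthetic 8-bit-range images
    rng = np.random.default_rng(1)
    before = rng.integers(0, 256, size=(64, 64))
    after = before.copy()
    after[20:30, 20:30] += 120                    # inject a change
    mask = control_chart_changes(before, after)
    print(mask.sum(), "pixels flagged")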

Bank Business Models and Their Changes in CEE Countries

The aim of this article is to assess the business models used by banks operating in the CEE countries in the period from 2006 to 2011. To obtain the research results, the authors performed a qualitative analysis of the scientific literature on bank business models, which were grouped into clusters consisting of the following components: 1) capital and reserves; 2) assets; 3) deposits; and 4) loans. In turn, the bank business models were developed based on the banks' types of core activities and divided into four groups: Wholesale, Investment, Retail and Universal banks. Descriptive statistics were used to analyse the models, determining the mean, minimum and maximum values of the constituent cluster components, as well as the standard deviation. The analysis of the data is based on bank performance indicators such as Return on Assets (ROA) and Return on Equity (ROE).
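
For illustration only, the descriptive-statistics step can be sketched with pandas on hypothetical data; the figures below are invented placeholders, not the study's observations.

    import pandas as pd

    # Hypothetical frame: one row per bank-year with the four cluster
    # components named in the abstract plus ROA and ROE (invented values).
    banks = pd.DataFrame({
        "model": ["Retail", "Retail", "Wholesale", "Universal", "Investment"],
        "capital_reserves": [1.2, 1.5, 3.1, 2.2, 4.0],
        "assets": [10.0, 12.5, 40.2, 25.1, 33.3],
        "deposits": [8.0, 9.1, 12.4, 18.0, 5.2],
        "loans": [6.5, 7.9, 30.0, 17.5, 12.1],
        "ROA": [0.9, 1.1, 0.5, 0.8, 1.4],
        "ROE": [9.5, 11.2, 6.1, 8.4, 15.3],
    })

    # Mean, min, max and standard deviation per business model group.
    summary = banks.groupby("model").agg(["mean", "min", "max", "std"])
    print(summary[["ROA", "ROE"]])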

A New Self-Adaptive EP Approach for ANN Weight Training

Evolutionary Programming (EP) represents a branch of Evolutionary Algorithms (EA) in which mutation is the main reproduction operator. This paper presents a novel EP approach to Artificial Neural Network (ANN) learning. The proposed strategy consists of two components: a self-adaptive component, which carries phenotype information, and a dynamic component, which is described by the genotype. Self-adaptation is achieved by adding a value, called the network weight, which depends on the total number of hidden layers and the average number of neurons per hidden layer. The dynamic component changes its value depending on the fitness of the chromosome exposed to mutation. Thus, the mutation step size is controlled by two components, encapsulated in the algorithm, which adjust it according to the characteristics of the predefined ANN architecture and the fitness of a particular chromosome. A comparative analysis of the proposed approach and classical EP (Gaussian mutation) showed that a significant acceleration of the evolution process is achieved by using both phenotype and genotype information in the mutation strategy.
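
The abstract does not give the exact formulas for the two components, so the sketch below uses assumed placeholder forms purely to show how a phenotype-derived network weight and a fitness-dependent dynamic term could jointly scale the Gaussian mutation step.

    import numpy as np

    def network_weight(n_hidden_layers, avg_neurons):
        """Phenotype component: a scalar derived from the ANN topology.

        The abstract states only that it depends on the number of hidden
        layers and the average neurons per hidden layer; this inverse
        form is an assumed placeholder, not the paper's formula.
        """
        return 1.0 / np.sqrt(n_hidden_layers * avg_neurons)

    def dynamic_component(fitness, best_fitness):
        """Genotype component: grows with the chromosome's error so that
        poor chromosomes mutate more aggressively (assumed form)."""
        return 1.0 + (fitness - best_fitness) / (abs(best_fitness) + 1e-12)

    def mutate(weights, fitness, best_fitness, n_layers, avg_neurons):
        # Step size = phenotype component x genotype component, replacing
        # the fixed sigma of classical EP Gaussian mutation.
        sigma = network_weight(n_layers, avg_neurons) * \
                dynamic_component(fitness, best_fitness)
        return weights + np.random.normal(0.0, sigma, size=weights.shape)

    # usage: mutate the flattened weight vector of a 2-hidden-layer ANN
    w = np.random.normal(size=50)
    child = mutate(w, fitness=0.30, best_fitness=0.10,
                   n_layers=2, avg_neurons=8)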

High Quality Speech Coding using Combined Parametric and Perceptual Modules

A novel approach to speech coding using a hybrid architecture is presented. The advantages of parametric and perceptual coding methods are combined to create a speech coding algorithm that assures better signal quality than a traditional CELP parametric codec. Two approaches are discussed. The first is based on separating the signal into voiced components, which are encoded parametrically, unvoiced components, which are encoded perceptually, and transients, which remain unencoded. The second approach uses perceptual encoding of the residual signal in a CELP codec. The algorithm applied for precise transient selection is described. The signal quality achieved with the proposed hybrid codec is compared to that of several standard speech codecs.

Optimization Approaches for a Complex Dairy Farm Simulation Model

This paper describes the optimization of a complex dairy farm simulation model using two quite different optimization methods: the genetic algorithm (GA) and the Lipschitz branch-and-bound (LBB) algorithm. These techniques were used to improve an agricultural system model developed by Dexcel Limited, New Zealand, which provides a detailed representation of pastoral dairying scenarios over an 8-dimensional parameter space. The model incorporates sub-models of pasture growth and animal metabolism, which are themselves complex in many cases. Each evaluation of the objective function, a composite 'Farm Performance Index' (FPI), requires simulating at least a one-year period of farm operation with a daily time step, and is therefore computationally expensive. The problem of visualizing the objective function (response surface) in high-dimensional spaces is also considered in the context of the farm optimization problem. Adaptations of the Sammon mapping and parallel coordinates visualizations are described which help visualize some important properties of the model's output topography. This study found that the GA requires fewer function evaluations than the LBB algorithm.
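
As a sketch of the GA side of the study only (the FPI simulator is not public, so a cheap stand-in objective over an 8-dimensional box is used), a minimal real-coded GA might look like this.

    import numpy as np

    def genetic_algorithm(objective, bounds, pop_size=40, generations=100,
                          mutation_sigma=0.1, elite=2, seed=0):
        """Minimal real-coded GA for an expensive black-box objective
        (minimisation), standing in for the farm simulation."""
        rng = np.random.default_rng(seed)
        lo, hi = bounds[:, 0], bounds[:, 1]
        pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
        for _ in range(generations):
            fitness = np.array([objective(x) for x in pop])  # costly step
            pop = pop[np.argsort(fitness)]                   # best first
            children = [pop[i].copy() for i in range(elite)]  # elitism
            while len(children) < pop_size:
                # Parents drawn from the better half (truncation selection).
                i, j = rng.integers(0, pop_size // 2, size=2)
                # Arithmetic crossover followed by Gaussian mutation.
                alpha = rng.random()
                child = alpha * pop[i] + (1 - alpha) * pop[j]
                child += rng.normal(0, mutation_sigma * (hi - lo))
                children.append(np.clip(child, lo, hi))
            pop = np.array(children)
        fitness = np.array([objective(x) for x in pop])
        return pop[np.argmin(fitness)]

    # usage with a stand-in objective over an 8-dimensional box
    bounds = np.array([[0.0, 1.0]] * 8)
    best = genetic_algorithm(lambda x: np.sum((x - 0.3) ** 2), bounds)
    print(best.round(2))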

Identification of Seat Belt Wearing Compliance Associated Factors in Malaysia: An Evidence-Based Approach

The aim of the study was to identify factors associated with seat belt wearing among road users in Malaysia. An evidence-based approach through in-depth crash investigation was used to meet this objective. The scope was limited to crashes involving passenger vehicles investigated by the Malaysian Institute of Road Safety Research (MIROS) between 2007 and 2010. Crash information on a total of 99 cases involving 240 vehicles and 864 occupants was obtained during the study period. Statistical tests and logistic regression analysis were performed. The results revealed that gender, seat position and age were associated with seat belt wearing compliance in Malaysia. Males were 97.6% more likely to wear a seat belt than females (95% CI 1.317 to 2.964). By seat position, front occupants were 82 times more likely to be wearing a seat belt than rear occupants (95% CI 30.199 to 225.342). It is also worth noting that the odds of seat belt wearing increased by about 2.64% (95% CI 1.0176 to 1.0353) for every one-year increase in age. This study is essential to understanding Malaysians' tendency to belt up while occupying a vehicle. The factors highlighted here should be emphasized in road safety education in order to increase the seat belt wearing rate in the country and ultimately to prevent deaths due to road crashes.
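
The reported effects are odds ratios from the logistic regression. The following worked example shows how such figures arise from a fitted coefficient via OR = exp(beta) and 95% CI = exp(beta ± 1.96·SE); the beta and SE below are back-solved to be consistent with the reported gender effect, not taken from the paper.

    import math

    def odds_ratio_ci(beta, se, z=1.96):
        """Odds ratio and 95% confidence interval from a logistic
        regression coefficient: OR = exp(beta), CI = exp(beta +/- z*SE)."""
        return (math.exp(beta),
                math.exp(beta - z * se),
                math.exp(beta + z * se))

    # Illustrative coefficient for "male", chosen so that OR is about
    # 1.976 ("97.6% more likely"); not the study's fitted values.
    or_, lo, hi = odds_ratio_ci(beta=0.681, se=0.207)
    print(f"OR = {or_:.3f}  (95% CI {lo:.3f} to {hi:.3f})")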

Clustering Categorical Data Using Hierarchies (CLUCDUH)

Clustering large populations is an important problem when the data contain noise and clusters of different shapes. A good clustering algorithm should be efficient enough to detect clusters sensitively, and as the data size grows, time complexity gains importance alongside space complexity. Using hierarchies, we developed a new algorithm that splits attributes according to the values they take, choosing the splitting dimension so as to divide the database into roughly equal parts as far as possible. At each node we calculate certain descriptive statistical features of the data residing there, and by pruning we generate the natural clusters with a complexity of O(n).
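
A minimal Python sketch of the splitting idea, under the assumption that "dividing the database into roughly equal parts" means choosing the attribute whose largest value-group is smallest; the pruning step is reduced here to a simple leaf-size threshold, which is an illustration rather than the paper's rule.

    from collections import Counter

    def most_balanced_attribute(records, attributes):
        """Pick the attribute whose values split the records most evenly,
        scored by the size of the largest resulting part (smaller is
        more balanced)."""
        def largest_part(attr):
            return max(Counter(r[attr] for r in records).values())
        return min(attributes, key=largest_part)

    def build_tree(records, attributes, min_size=5):
        # Leaf: keep the resident data (descriptive statistics of the
        # leaf would be computed here).
        if len(records) <= min_size or not attributes:
            return {"size": len(records), "records": records}
        attr = most_balanced_attribute(records, attributes)
        children = {}
        for r in records:
            children.setdefault(r[attr], []).append(r)
        rest = [a for a in attributes if a != attr]
        return {"split_on": attr,
                "children": {v: build_tree(rs, rest, min_size)
                             for v, rs in children.items()}}

    # usage with toy categorical records
    data = [{"colour": c, "shape": s}
            for c in ("red", "blue") for s in ("cube", "ball", "cone")] * 4
    tree = build_tree(data, ["colour", "shape"])
    print(tree["split_on"])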

An In Silico Approach for Prioritizing Drug Targets in the Metabolic Pathway of Mycobacterium tuberculosis

There is an urgent need to develop novel Mycobacterium tuberculosis (Mtb) drugs that are active against drug-resistant bacteria but, more importantly, kill persistent bacteria. Our study is structured around an integrated analysis of metabolic pathways, small-molecule screening and similarity search in the PubChem database. Metabolic analysis approaches based on unified weighting were used for potent target selection. Our results suggest pantothenate synthetase (panC) and 3-methyl-2-oxobutanoate hydroxymethyltransferase (panB) as appropriate drug targets. In this study, we focused on pantothenate synthetase because inhibitors of it already exist. We report the discovery of new antitubercular compounds through ligand-based approaches using computational tools.

Network State Classification Based on the Statistical Properties of RTT for an Adaptive Multi-State Proactive Transport Protocol for Satellite-Based Networks

This paper attempts to establish that multi-state network classification is essential for enhancing the performance of transport protocols over satellite-based networks. A model for classifying multi-state network conditions, taking into consideration both congestion and channel error, is evolved. To arrive at such a model, an analysis of the impact of congestion and channel error on RTT values was carried out using ns2; the analysis results are also reported in the paper. The inferences drawn from this analysis are used to develop a novel statistical RTT-based model for multi-state network classification. An adaptive multi-state proactive transport protocol, consisting of proactive slow start, state-based error recovery, timeout action and proactive reduction, is proposed, which uses the multi-state network classification model. This paper also confirms, through detailed simulation and analysis, that prior knowledge of the overall characteristics of the network helps enhance the performance of the protocol over a satellite channel, which is significantly affected by channel noise and congestion. The necessary augmentation of the ns2 simulator was done to simulate the multi-state network classification logic, and this simulation was used in a detailed evaluation of the protocol under varied levels of congestion and channel noise. The performance enhancement of this protocol with reference to established protocols, namely TCP SACK and Vegas, is discussed. The results clearly reveal that the proposed protocol always outperforms its peers and shows a significant improvement under the very high error conditions envisaged in its design.
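
The paper's statistical classification rule is not spelled out in the abstract; the toy sketch below only illustrates the underlying intuition that congestion inflates RTT through queuing while channel errors cause losses without an RTT rise. The state names and thresholds are assumptions, not the paper's fitted values.

    import statistics

    def classify_network_state(rtt_window, loss_seen,
                               base_rtt, congestion_factor=1.5):
        """Toy multi-state classifier in the spirit of the paper's model:
        an inflated mean RTT suggests queuing (congestion), while losses
        with a stable RTT suggest channel error."""
        mean_rtt = statistics.mean(rtt_window)
        congested = mean_rtt > congestion_factor * base_rtt
        if congested and loss_seen:
            return "CONGESTION_WITH_LOSS"
        if congested:
            return "CONGESTION"
        if loss_seen:
            return "CHANNEL_ERROR"
        return "NORMAL"

    # usage: a satellite path with ~550 ms base RTT (values in ms)
    print(classify_network_state([560, 548, 571], loss_seen=True,
                                 base_rtt=550))   # -> CHANNEL_ERROR
    print(classify_network_state([905, 940, 980], loss_seen=False,
                                 base_rtt=550))   # -> CONGESTION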

A Trainable Neural Network Ensemble for ECG Beat Classification

This paper illustrates the use of a combined neural network model for the classification of electrocardiogram (ECG) beats. We present a trainable neural network ensemble approach to developing a customized ECG beat classifier, in an effort to further improve the performance of ECG processing and to offer individualized health care. We propose a three-stage technique for the detection of premature ventricular contractions (PVC) among normal beats and other heart diseases, comprising denoising, feature extraction and classification. First, we investigate the application of the stationary wavelet transform (SWT) for noise reduction of the ECG signals. The feature extraction module then extracts 10 ECG morphological features and one timing-interval feature. Next, a number of multilayer perceptron (MLP) neural networks with different topologies are designed. The performance of the different combination methods, as well as the efficiency of the whole system, is presented. Among them, stacked generalization, as the proposed trainable combined neural network model, possesses the highest recognition rate, around 95%; this network therefore proves to be a suitable candidate for ECG signal diagnosis systems. ECG samples of the different beat types were extracted from the MIT-BIH arrhythmia database for the study.
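
As an illustration of the stacked-generalization combiner only (on synthetic stand-in features rather than MIT-BIH data, and without the SWT denoising stage), a scikit-learn sketch might read:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Stand-in for the 11 extracted features (10 morphological + 1 timing).
    X, y = make_classification(n_samples=1000, n_features=11, n_classes=2,
                               n_informative=6, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # Base level: MLPs with different topologies, as in the paper.
    base = [(f"mlp{i}", MLPClassifier(hidden_layer_sizes=h, max_iter=1000,
                                      random_state=i))
            for i, h in enumerate([(10,), (20,), (10, 5)])]

    # Stacked generalization: a meta-learner is trained on the base
    # models' out-of-fold predictions instead of simple voting.
    stack = StackingClassifier(estimators=base,
                               final_estimator=LogisticRegression(), cv=5)
    stack.fit(X_tr, y_tr)
    print(f"accuracy: {stack.score(X_te, y_te):.3f}")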

Task Modeling for User Interface Design: A Layered Approach

The model-based approach to user interface design relies on developing separate models that capture various aspects of users, tasks, the application domain, and presentation and dialog representations. This paper presents a task modeling approach for user interface design and aims at exploring the mappings between task, domain and presentation models. The basic idea of our approach is to identify typical configurations in task and domain models and to investigate how they relate to each other. Special emphasis is put on application-specific functions and on mappings between domain objects and operational task structures. In this respect, we distinguish three layers in the task decomposition: a functional layer, a planning layer, and an operational layer.

Research on IBR-Driven Distributed Collaborative Visualization System

Image-based rendering (IBR) techniques have recently spread into broad fields, which poses a critical challenge: building an IBR-driven visualization platform that meets the requirements of high performance, large-scale aggregation and concentration of distributed visualization resources, deployment of multiple operators, and CSCW design. This paper presents a unique IBR-based visualization dataflow model that reflects the specific characteristics of IBR techniques, discusses the prominent features of an IBR-driven distributed collaborative visualization (DCV) system, and finally proposes a novel prototype. The prototype comprises three well-defined levels of modules, namely a Central Visualization Server, Local Proxy Servers and a Visualization Aid Environment, through which data and control for collaboration flow according to the aforementioned dataflow model. With the aid of this triple-hierarchy architecture, constructing IBR-oriented applications becomes easy. The employed augmented collaboration strategy not only achieves convenient synchronous control by multiple users and stable processing management, but is also extendable and scalable.

Design of an M-Channel Cosine Modulated Filter Bank by New Cosh Window Based FIR Filters

In this paper, the recently reported cosh window function is used in the design of the prototype filter for an M-channel Near Perfect Reconstruction (NPR) Cosine Modulated Filter Bank (CMFB). A local search optimization algorithm is used to minimize the distortion parameters by optimizing the coefficients of the prototype filter. Design examples are presented, and a comparison is made with the Kaiser window based filter bank design of recently reported work. The results show that the proposed design approach provides lower distortion parameters and improved far-end suppression compared with the Kaiser window based design.
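
For reference, the cosh window is commonly defined analogously to the Kaiser window with cosh replacing the modified Bessel function I0. The sketch below generates the window and a windowed-sinc prototype lowpass filter; the cosine modulation stage, the local search optimization, and the paper's exact parameterisation are not shown, so treat the forms as assumptions.

    import numpy as np

    def cosh_window(N, alpha):
        """Cosh window: the Kaiser window form with I0 replaced by cosh
        (as reported in the cosh-window literature; the paper's exact
        parameterisation may differ)."""
        n = np.arange(N)
        x = 2.0 * n / (N - 1) - 1.0           # map n to [-1, 1]
        return np.cosh(alpha * np.sqrt(1.0 - x ** 2)) / np.cosh(alpha)

    def prototype_lowpass(N, cutoff, alpha):
        """Windowed-sinc prototype filter for an M-channel CMFB; the
        cosine modulation of this prototype is omitted here."""
        n = np.arange(N) - (N - 1) / 2.0
        ideal = 2 * cutoff * np.sinc(2 * cutoff * n)   # ideal lowpass
        return ideal * cosh_window(N, alpha)

    # usage: prototype for an 8-channel bank, cutoff at 1/(4M)
    M = 8
    h = prototype_lowpass(N=129, cutoff=1.0 / (4 * M), alpha=8.0)
    print(h[:5].round(5))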

A Survey: Clustering Ensembles Techniques

Clustering ensembles combine multiple partitions generated by different clustering algorithms into a single clustering solution. They have emerged as a prominent method for improving the robustness, stability and accuracy of unsupervised classification solutions. So far, many contributions have been made toward finding consensus clusterings, and one of the major problems in clustering ensembles is the consensus function. In this paper, we first introduce clustering ensembles, the representation of multiple partitions, and their challenges, and present a taxonomy of combination algorithms. Second, we describe consensus functions in clustering ensembles, including hypergraph partitioning, the voting approach, mutual information, co-association based functions and the finite mixture model, and explain their advantages, disadvantages and computational complexity. Finally, we compare the characteristics of clustering ensemble algorithms, such as computational complexity, robustness, simplicity and accuracy, on different datasets across previous techniques.
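
As an illustration of one of the surveyed consensus functions, a minimal co-association sketch in Python/SciPy might look as follows; this is the generic textbook construction, not any single surveyed algorithm.

    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform

    def co_association_consensus(partitions, n_clusters):
        """Co-association consensus function: entry (i, j) counts the
        fraction of base partitions placing objects i and j together;
        the consensus is obtained by hierarchically clustering the
        resulting similarity matrix."""
        partitions = np.asarray(partitions)   # shape (n_runs, n_objects)
        n = partitions.shape[1]
        co = np.zeros((n, n))
        for labels in partitions:
            co += (labels[:, None] == labels[None, :])
        co /= len(partitions)
        dist = 1.0 - co                       # similarity -> distance
        np.fill_diagonal(dist, 0.0)
        Z = linkage(squareform(dist, checks=False), method="average")
        return fcluster(Z, t=n_clusters, criterion="maxclust")

    # usage: three noisy base partitions of six objects
    runs = [[1, 1, 1, 2, 2, 2],
            [1, 1, 2, 2, 2, 2],
            [1, 1, 1, 1, 2, 2]]
    print(co_association_consensus(runs, n_clusters=2))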