Small Sample Bootstrap Confidence Intervals for Long-Memory Parameter

The log periodogram regression is widely used in empirical applications because of its simplicity (only a least squares regression is required to estimate the memory parameter d), its good asymptotic properties, and its robustness to misspecification of the short-term behavior of the series. However, the asymptotic distribution is a poor approximation of the (unknown) finite sample distribution when the sample size is small. Here, the finite sample performance of different nonparametric residual bootstrap procedures is analyzed when they are used to construct confidence intervals. In particular, in addition to the basic residual bootstrap, the local and block bootstrap are considered, since they may adequately replicate the structure that can arise in the regression errors when the series shows weak dependence in addition to the long memory component. A bias-correcting bootstrap that adjusts the bias caused by that structure is also considered. Finally, the performance of bootstrap-based confidence intervals for the log periodogram regression is assessed for different types of models, together with how this performance changes as the sample size increases.
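To make the basic procedure concrete, the following is a minimal sketch of a GPH-type log periodogram regression for d together with a basic residual bootstrap percentile confidence interval; the bandwidth choice m = sqrt(n), the regressor form, and the percentile interval are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def log_periodogram_d(x, m=None):
    """GPH-type estimate of the memory parameter d from the first m periodogram ordinates."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        m = int(np.sqrt(n))                      # illustrative bandwidth choice
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n  # Fourier frequencies
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    y = np.log(I)
    X = np.column_stack([np.ones(m), -2.0 * np.log(lam)])  # slope on -2*log(lambda) estimates d
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return beta, X, y, resid

def residual_bootstrap_ci(x, B=999, alpha=0.05, seed=None):
    """Basic residual bootstrap percentile CI for d in the log periodogram regression."""
    rng = np.random.default_rng(seed)
    beta, X, y, resid = log_periodogram_d(x)
    d_hat = beta[1]
    resid_c = resid - resid.mean()
    d_boot = np.empty(B)
    for b in range(B):
        y_star = X @ beta + rng.choice(resid_c, size=len(resid_c), replace=True)
        beta_star, *_ = np.linalg.lstsq(X, y_star, rcond=None)
        d_boot[b] = beta_star[1]
    lo, hi = np.percentile(d_boot, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return d_hat, (lo, hi)
```

The local and block bootstrap variants would only change how the residuals are resampled (from neighboring frequencies, or in blocks) inside the loop.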

Vertex Configurations and Their Relationships in Orthogonal Pseudo-Polyhedra

The vertex configuration of a vertex in an orthogonal pseudo-polyhedron is the identity of that vertex, determined by the number of edges, dihedral angles, and non-manifold properties meeting at the vertex. There are up to sixteen vertex configurations for any orthogonal pseudo-polyhedron (OPP). Understanding the relationships between these vertex configurations gives insight into the structure of an OPP and helps in designing better algorithms for many 3-dimensional geometric problems. In this paper, the 16 vertex configurations for OPP are described first. This is followed by a number of formulas that give insight into the relationships between different vertex configurations in an OPP. These formulas extend the usefulness of orthogonal polyhedra to pattern analysis in 3D digital images.
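The paper's sixteen configurations are not reproduced here, but a common way to determine the configuration of a grid vertex in an orthogonal (voxel) solid is to examine the occupancy of the eight octants (unit cubes) incident to that vertex; the sketch below is only an illustration under that assumption and is not the paper's classification.

```python
import itertools

# Eight octant offsets around a grid vertex: each octant is the unit cube whose
# min corner is vertex + offset, with offsets in {-1, 0}^3.
OCTANT_OFFSETS = list(itertools.product((-1, 0), repeat=3))

def octant_pattern(filled_cubes, vertex):
    """Return the 8-bit octant-occupancy pattern at `vertex` and the count of filled octants.

    `filled_cubes` is a set of integer (x, y, z) min-corners of filled unit cubes.
    A vertex-configuration classifier would inspect this pattern together with
    edge counts, dihedral angles, and non-manifold tests at the vertex.
    """
    vx, vy, vz = vertex
    mask = 0
    for bit, (dx, dy, dz) in enumerate(OCTANT_OFFSETS):
        if (vx + dx, vy + dy, vz + dz) in filled_cubes:
            mask |= 1 << bit
    return mask, bin(mask).count("1")
```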

Effects of Superheating on Thermodynamic Performance of Organic Rankine Cycles

Recently, the Organic Rankine Cycle (ORC) has attracted much attention due to its potential for reducing consumption of fossil fuels and its favorable characteristics for exploiting low-grade heat sources. In this work, the thermodynamic performance of ORCs with superheating of the vapor is comparatively assessed for various working fluids. Special attention is paid to the effects of system parameters such as the evaporating temperature and the turbine inlet temperature on characteristics of the system such as the maximum possible work extraction from the given source, the volumetric flow rate per 1 kW of net work, and the quality of the working fluid at the turbine exit, as well as the thermal and exergy efficiencies. Results show that, for a given source, the thermal efficiency increases as the superheating decreases, but the exergy efficiency may have a maximum with respect to the superheating of the working fluid. Results also show that the selection of a working fluid requires consideration of various performance criteria, not only thermal efficiency.
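The following is a minimal sketch of the kind of cycle calculation involved, using CoolProp for fluid properties; the choice of R245fa, the ideal (isentropic) pump and turbine, the state-point temperatures, and the simple second-law definition are illustrative assumptions, not the paper's exact model.

```python
from CoolProp.CoolProp import PropsSI

fluid = "R245fa"          # example working fluid (assumption)
T_evap = 100 + 273.15     # evaporating temperature [K]
T_in   = 130 + 273.15     # turbine inlet temperature (superheated) [K]
T_cond = 35 + 273.15      # condensing temperature [K]

p_evap = PropsSI("P", "T", T_evap, "Q", 1, fluid)
p_cond = PropsSI("P", "T", T_cond, "Q", 0, fluid)

# State 1: saturated liquid leaving the condenser; ideal pump to evaporator pressure.
h1 = PropsSI("H", "T", T_cond, "Q", 0, fluid)
s1 = PropsSI("S", "T", T_cond, "Q", 0, fluid)
h2 = PropsSI("H", "P", p_evap, "S", s1, fluid)

# State 3: superheated vapor at turbine inlet; ideal expansion to condenser pressure.
h3 = PropsSI("H", "T", T_in, "P", p_evap, fluid)
s3 = PropsSI("S", "T", T_in, "P", p_evap, fluid)
h4 = PropsSI("H", "P", p_cond, "S", s3, fluid)

w_net = (h3 - h4) - (h2 - h1)      # specific net work [J/kg]
q_in  = h3 - h2                    # specific heat input [J/kg]
eta_th = w_net / q_in              # thermal efficiency

# One simple exergy (second-law) efficiency: net work divided by the exergy of the
# heat input, taking T0 as the dead state and T_in as the assumed source temperature.
T0 = 25 + 273.15
eta_ex = w_net / (q_in * (1 - T0 / T_in))
print(f"thermal efficiency = {eta_th:.3f}, exergy efficiency = {eta_ex:.3f}")
```

The turbine-exit quality and the volumetric flow rate per kW of net work can be obtained from the same state points (e.g., from h4 and the vapor density at state 3).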

An Evaluation of Digital Elevation Models for Short-Term Monitoring of a High-Energy Barrier Island, Northeast Brazil

The short-term morphological evolution of Ponta do Tubarão Island (PTI) was investigated through highly accurate surveys based on post-processed kinematic (PPK) relative positioning with Global Navigation Satellite Systems (GNSS). PTI is part of a barrier island system on a high-energy northeastern Brazilian coastal environment and is also an area of high environmental sensitivity. Surveys were carried out quarterly over a two-year period, from May 2010 to May 2012. This paper statistically assesses the performance of digital elevation models (DEMs) derived from different interpolation methods in representing morphologic features and quantifying volumetric changes; TIN models showed the best results for these purposes. The DEMs allowed quantifying surfaces and volumes in detail, identifying the segments of PTI most vulnerable to erosion and/or accumulation of sediments, and relating the alterations to climate conditions. The coastal setting and geometry of PTI protect a significant mangrove ecosystem and some oil and gas facilities installed in the vicinity from the damaging effects of strong ocean waves and currents. Thus, the maintenance of PTI is essential, but the prediction of its longevity is uncertain, because the results indicate an irregular sedimentary balance and a substantial decline in sediment supply to this coastal area.
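A minimal sketch of the kind of assessment involved is given below: scattered GNSS points are interpolated onto a grid (SciPy's Delaunay-based linear interpolation is essentially a TIN surface), accuracy is checked by RMSE on withheld points, and the volume change between two surveys is estimated. The grid spacing, the method names, and the check-point split are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import griddata

def build_dem(xyz, grid_x, grid_y, method="linear"):
    """Interpolate scattered survey points (x, y, z) onto a regular grid.
    method='linear' uses Delaunay triangulation, i.e. essentially a TIN surface."""
    return griddata(xyz[:, :2], xyz[:, 2], (grid_x, grid_y), method=method)

def rmse_on_checkpoints(xyz_train, xyz_check, method="linear"):
    """RMSE of the interpolated surface evaluated at withheld check points."""
    z_hat = griddata(xyz_train[:, :2], xyz_train[:, 2],
                     xyz_check[:, :2], method=method)
    ok = ~np.isnan(z_hat)
    return np.sqrt(np.mean((z_hat[ok] - xyz_check[ok, 2]) ** 2))

def volume_change(dem_t0, dem_t1, cell_size):
    """Net volumetric change between two co-registered DEMs [m^3]."""
    return np.nansum(dem_t1 - dem_t0) * cell_size ** 2
```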

Vector Space of the Extended Base-triplets over the Galois Field of five DNA Bases Alphabet

A plausible architecture of an ancient genetic code is derived from an extended base-triplet vector space over the Galois field of the extended base alphabet {D, G, A, U, C}, where the letter D represents one or more hypothetical bases with unspecific pairing. We hypothesized that the high degeneracy of a primeval genetic code with five bases, together with the gradual origin and improvement of a primitive DNA repair system, could have made possible the transition from the ancient to the modern genetic code. Our results suggest that the Watson-Crick base pairing and the non-specific base pairing of the hypothetical ancestral base D, used to define the sum and product operations, are sufficient to determine the coding constraints of the primeval and the modern genetic code, as well as the transition from the former to the latter. Geometrical and algebraic properties of this vector space reveal that the present codon assignment of the standard genetic code could have been induced from a primeval codon assignment. In addition, the Fourier spectrum of the extended DNA genome sequences derived from multiple sequence alignment suggests that the so-called period-3 property of present coding DNA sequences could also have existed in ancient coding DNA sequences.
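Since 5 is prime, the integers modulo 5 form a field, so componentwise operations on triplets give a three-dimensional vector space over GF(5). The sketch below illustrates the sum, product, and scalar-multiple operations; the particular assignment of {D, A, C, G, U} to the field elements {0, 1, 2, 3, 4} is an illustrative assumption and not necessarily the ordering used in the paper.

```python
# GF(5) is the integers modulo 5, since 5 is prime.
BASES = "DACGU"                        # illustrative base-to-element ordering (assumption)
TO_INT = {b: i for i, b in enumerate(BASES)}

def triplet_sum(t1, t2):
    """Componentwise sum of two extended base triplets over GF(5)."""
    return "".join(BASES[(TO_INT[a] + TO_INT[b]) % 5] for a, b in zip(t1, t2))

def triplet_product(t1, t2):
    """Componentwise product of two extended base triplets over GF(5)."""
    return "".join(BASES[(TO_INT[a] * TO_INT[b]) % 5] for a, b in zip(t1, t2))

def scalar_multiple(k, t):
    """Scalar multiple of a triplet, making the triplet set a vector space over GF(5)."""
    return "".join(BASES[(k * TO_INT[a]) % 5] for a in t)

# Example: "AUG" + "GCA" and 3 * "AUG" under the assumed ordering.
print(triplet_sum("AUG", "GCA"), scalar_multiple(3, "AUG"))
```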

Research Trend Analysis – A Sample in the Field of Information Systems

As research performance in academia is treated as one of the indices of national competitiveness, many countries devote much attention and many resources to increasing their research performance. Understanding research trends is the basic step toward improving research performance. The goal of this research is to design an analysis system to evaluate research trends by analyzing data from different countries. In this paper, information systems research in Taiwan and other countries, including Asian countries and the prominent countries represented by the Group of Eight (G8), is used as an example. We found that the trends vary across countries. Our research suggests that Taiwan's scholars could pay more attention to interdisciplinary applications and try to increase their collaboration with other countries, in order to increase Taiwan's competitiveness in the area of information systems.

A Parametric Assessment of Friction Damper in Eccentric Braced Frame

In this paper, the behavior of the eccentric braced frame (EBF) is studied when a friction damper (FD) replaces the link at the confluence of the braces, in 5- and 10-storey steel frames. For the FD system, the main step is to determine the slip load; for this reason, performance indices including roof displacement, base shear, dissipated energy, and relative performance should be investigated. In nonlinear dynamic analysis, the response of the structures to three earthquake records is obtained, and the values of roof displacement, base shear, and column axial force for the FD and EBF frames are compared. The results demonstrate that use of the FD in frames, in comparison with the EBF, substantially reduces the roof displacement, column axial force, and base shear. The results also show that the FD performs better in the taller structure in comparison with the EBF.
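As a minimal sketch of one of the performance indices mentioned above, the energy dissipated by the friction damper can be computed as the area enclosed by its force-displacement hysteresis loop from a nonlinear time-history analysis; the input arrays are assumed to come from such an analysis.

```python
import numpy as np

def dissipated_energy(force, displacement):
    """Energy dissipated by a damper: integral of F du over the recorded
    hysteresis loop (trapezoidal rule on the time-ordered samples)."""
    force = np.asarray(force, dtype=float)
    disp = np.asarray(displacement, dtype=float)
    return np.sum(0.5 * (force[1:] + force[:-1]) * np.diff(disp))

# Illustration: an idealized rectangular friction loop with slip load 100 kN and
# slip amplitude +/-10 mm dissipates about 4 * 100e3 * 0.010 = 4000 J per cycle.
```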

A New Approach for Classifying Large Number of Mixed Variables

The issue of classifying objects into one of several predefined groups when the measured variables are a mixture of different types has been of interest to statisticians for many years. Several methods for dealing with such situations have been introduced, including parametric, semi-parametric, and nonparametric approaches. This paper discusses the problem of classifying data when the number of measured mixed variables is larger than the sample size. A proposed idea that integrates a dimensionality reduction technique, via principal component analysis, with a discriminant function based on the location model is discussed. The study aims to offer practitioners another potential tool for classification problems in which the observed variables are mixed and high-dimensional.
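A minimal sketch of this idea, under simplifying assumptions, is shown below: the continuous variables are first reduced with principal component analysis, and a location-model-style discriminant is then built by fitting a separate linear discriminant within each cell defined by the binary variables. The cell handling, the fallback model, and the use of scikit-learn's LDA are illustrative choices, not the paper's exact procedure.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

class PCALocationModelClassifier:
    """Location-model-style classifier: PCA on the continuous variables, then one
    linear discriminant per multinomial cell defined by the binary variables."""

    def __init__(self, n_components=5):
        self.n_components = n_components

    def fit(self, X_cont, X_bin, y):
        y = np.asarray(y)
        self.pca_ = PCA(n_components=self.n_components).fit(X_cont)
        Z = self.pca_.transform(X_cont)
        cells = [tuple(row) for row in np.asarray(X_bin, dtype=int)]
        self.models_ = {}
        for cell in set(cells):
            idx = [i for i, c in enumerate(cells) if c == cell]
            if len(set(y[idx])) > 1:            # need more than one group in the cell
                self.models_[cell] = LinearDiscriminantAnalysis().fit(Z[idx], y[idx])
        self.fallback_ = LinearDiscriminantAnalysis().fit(Z, y)  # used for sparse/unseen cells
        return self

    def predict(self, X_cont, X_bin):
        Z = self.pca_.transform(X_cont)
        out = []
        for z, row in zip(Z, np.asarray(X_bin, dtype=int)):
            model = self.models_.get(tuple(row), self.fallback_)
            out.append(model.predict(z.reshape(1, -1))[0])
        return np.array(out)
```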

Stability of Discrete Linear Systems with Periodic Coefficients under Parametric Perturbations

This paper studies the problem of exponential stability of perturbed discrete linear systems with periodic coefficients. Assuming that the unperturbed system is exponentially stable, we obtain conditions on the perturbations under which the perturbed system remains exponentially stable.
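For context, a minimal sketch of the standard stability check for the unperturbed system is given below: a discrete linear system x_{k+1} = A_k x_k with N-periodic coefficients is exponentially stable exactly when the spectral radius of the monodromy matrix A_{N-1}...A_1 A_0 is strictly less than one. The same numerical check can be applied to a perturbed sequence A_k + Delta_k; the paper's analytical perturbation bounds are not reproduced here.

```python
import numpy as np
from functools import reduce

def monodromy(A_list):
    """Monodromy matrix A_{N-1} ... A_1 A_0 of an N-periodic system x_{k+1} = A_k x_k."""
    n = A_list[0].shape[0]
    return reduce(lambda acc, A: A @ acc, A_list, np.eye(n))

def is_exponentially_stable(A_list):
    """Exponential stability of the periodic system <=> spectral radius of the
    monodromy matrix strictly less than one."""
    return np.max(np.abs(np.linalg.eigvals(monodromy(A_list)))) < 1.0

# Perturbed case (illustration): is_exponentially_stable([A + D for A, D in zip(As, Deltas)])
```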

The Inverse Eigenvalue Problem via Orthogonal Matrices

In this paper we study the inverse eigenvalue problem for special symmetric matrices and introduce sufficient conditions for obtaining nonnegative matrices. We take the HROU algorithm from [1] and introduce an extension of it. Given some eigenvectors of a matrix and their associated eigenvalues, this extension finds a symmetric matrix whose eigenvalues and eigenvectors are the given ones. Finally, we study some special cases and obtain some remarkable results.
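A minimal sketch of the final construction step is shown below (this is a generic construction, not the HROU algorithm of [1]): given a few orthonormal eigenvectors and their eigenvalues, complete them to an orthonormal basis, fill in the remaining eigenvalues, and form A = Q diag(lambda) Q^T, which is symmetric with the prescribed eigenpairs.

```python
import numpy as np

def symmetric_from_eigenpairs(vecs, vals, n, fill_value=0.0):
    """Build a symmetric n x n matrix whose eigenvectors include the orthonormal
    columns of `vecs` with eigenvalues `vals`; remaining eigenvalues are `fill_value`."""
    V = np.asarray(vecs, dtype=float).reshape(n, -1)
    k = V.shape[1]
    # Complete V to an orthonormal basis of R^n via QR of [V | random complement].
    M = np.hstack([V, np.random.default_rng(0).standard_normal((n, n - k))])
    Q, _ = np.linalg.qr(M)
    # QR may flip signs, so re-insert the given eigenvectors as the first k columns;
    # the remaining columns are orthogonal to their span, so Q stays orthogonal.
    Q[:, :k] = V
    lam = np.concatenate([np.asarray(vals, dtype=float), np.full(n - k, fill_value)])
    return Q @ np.diag(lam) @ Q.T

# Example: prescribe one eigenpair (v, 2.0) in R^3; the result is symmetric and A v = 2 v.
v = np.array([[1.0], [1.0], [1.0]]) / np.sqrt(3.0)
A = symmetric_from_eigenpairs(v, [2.0], n=3)
print(np.allclose(A @ v, 2.0 * v), np.allclose(A, A.T))
```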

Quantification of Periodicities in Fugitive Emission of Gases from Lyari Waterway

Periodicities in environmetric time series can be ideally assessed by utilizing periodic models. In this communication, fugitive emissions of gases from the Lyari open sewer channel, which exhibit periodic behaviour, are approximated by employing a periodic autoregressive model of order p. The order of the periodic model for each season is selected through examination of the periodic partial autocorrelation or information criteria. The parameters for the selected order of each season are estimated individually for each emitted air toxin. Subsequently, the adequacy of the fitted models is established by examining the properties of the residuals for each season. These models are beneficial for planners and administrative bodies in improving implemented policies to overcome future environmental problems.
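The following is a minimal sketch of fitting such a periodic autoregressive model: for each season s, the series value is regressed by ordinary least squares on its own most recent lags, with the order chosen by a simple AIC. The per-season OLS fit, the season indexing by t mod n_seasons, and the AIC-based order selection are illustrative choices, not the paper's exact estimation procedure.

```python
import numpy as np

def fit_par(x, n_seasons, max_order=4):
    """Fit a periodic AR model: a separate AR regression for each season.

    x is a 1-D series whose observation t belongs to season t % n_seasons.
    Returns, for each season, the selected order and OLS coefficients
    (intercept first), with the order chosen by a simple AIC.
    """
    x = np.asarray(x, dtype=float)
    models = {}
    for s in range(n_seasons):
        best = None
        for p in range(1, max_order + 1):
            idx = np.array([t for t in range(p, len(x)) if t % n_seasons == s])
            y = x[idx]
            X = np.column_stack([np.ones(len(idx))] +
                                [x[idx - k] for k in range(1, p + 1)])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            sigma2 = np.mean((y - X @ beta) ** 2)
            aic = len(y) * np.log(sigma2) + 2 * (p + 1)
            if best is None or aic < best[0]:
                best = (aic, p, beta)
        models[s] = {"order": best[1], "coef": best[2]}
    return models
```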

Effects of Network Dynamics on Routing Efficiency in P2P Networks

P2P networks are highly dynamic structures, since their nodes (peer users) keep joining and leaving continuously. In this paper, we study the effects of network change rates on query routing efficiency. First we describe some background and an abstract system model. The chosen routing technique makes use of cached metadata from previous answer messages and also employs a mechanism for broken path detection and metadata maintenance. Several metrics are used to show that the protocol behaves quite well even with a high rate of node departures, but above a certain threshold it breaks down and exhibits considerable efficiency degradation.
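A minimal sketch of the kind of metadata cache such a routing technique relies on is given below: query answers populate a cache of (resource key, next hop) entries, and entries are dropped when broken-path detection reports a failed peer. The data structure, time-based expiry, and eviction policy are illustrative assumptions, not the paper's protocol.

```python
import time

class RouteCache:
    """Cache of routing metadata learned from answer messages, with removal of
    entries on broken-path detection and simple time-based expiry."""

    def __init__(self, ttl=300.0):
        self.ttl = ttl
        self._entries = {}                      # key -> (next_hop, timestamp)

    def learn(self, key, next_hop):
        """Record metadata carried by an answer message."""
        self._entries[key] = (next_hop, time.time())

    def next_hop(self, key):
        """Return a cached next hop for the key, or None if unknown or expired."""
        entry = self._entries.get(key)
        if entry is None or time.time() - entry[1] > self.ttl:
            self._entries.pop(key, None)
            return None
        return entry[0]

    def path_broken(self, dead_peer):
        """Broken-path detection: drop every entry that routes through a dead peer."""
        self._entries = {k: v for k, v in self._entries.items() if v[0] != dead_peer}
```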

Improved Feature Processing for Iris Biometric Authentication System

Iris-based biometric authentication is gaining importance in recent times. Iris biometric processing, however, is a complex and computationally very expensive process. In the overall processing of iris biometrics in an iris-based biometric authentication system, feature processing is an important task. In feature processing, we extract iris features, which are ultimately used in matching. Since the number of iris features is large and computational time increases with the number of features, it is a challenge to develop an iris processing system with as few features as possible without compromising correctness. In this paper, we address this issue and present an approach to the feature extraction and feature matching processes. We apply the Daubechies D4 wavelet with 4 levels to extract features from iris images. These features are encoded with 2 bits each by quantizing them into 4 quantization levels. With our proposed approach it is possible to represent an iris template with only 304 bits, whereas existing approaches require as many as 1024 bits. In addition, we assign different weights to different iris regions when comparing two iris templates, which significantly increases the accuracy. Further, we match the iris templates based on a weighted similarity measure. Experimental results on several iris databases substantiate the efficacy of our approach.
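A minimal sketch of this pipeline is shown below: a 4-level 2-D Daubechies wavelet decomposition of the normalized iris image, 2-bit (4-level) quantization of the coarsest coefficients, and a weighted similarity between templates. Note that PyWavelets names the 4-tap Daubechies filter 'db2'; the quartile thresholds and uniform region weights are illustrative assumptions, and no claim is made that this reproduces the paper's exact 304-bit template.

```python
import numpy as np
import pywt

def iris_template(normalized_iris, levels=4, wavelet="db2"):
    """2-bit encode the coarsest wavelet approximation of a normalized iris image.
    ('db2' in PyWavelets is the 4-tap Daubechies filter, often called D4.)"""
    coeffs = pywt.wavedec2(normalized_iris.astype(float), wavelet, level=levels)
    approx = coeffs[0].ravel()
    # Quantize into 4 levels (2 bits per coefficient) using quartile thresholds.
    q = np.quantile(approx, [0.25, 0.5, 0.75])
    return np.digitize(approx, q).astype(np.uint8)          # values in {0, 1, 2, 3}

def weighted_similarity(t1, t2, weights=None):
    """Weighted fraction of matching 2-bit codes between two templates."""
    t1, t2 = np.asarray(t1), np.asarray(t2)
    if weights is None:
        weights = np.ones(len(t1))           # per-region weights (illustrative: uniform)
    weights = np.asarray(weights, dtype=float)
    return float(np.sum(weights * (t1 == t2)) / np.sum(weights))
```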

Order Statistics-based “Anti-Bayesian” Parametric Classification for Asymmetric Distributions in the Exponential Family

Although the field of parametric Pattern Recognition (PR) has been thoroughly studied for over five decades, the use of the Order Statistics (OS) of the distributions to achieve this has not been reported. The pioneering work on using OS for classification was presented in [1] for the Uniform distribution, where it was shown that optimal PR can be achieved in a counter-intuitive manner, diametrically opposed to the Bayesian paradigm, i.e., by comparing the testing sample to a few samples distant from the mean. This must be contrasted with the Bayesian paradigm in which, if we are allowed to compare the testing sample with only a single point in the feature space from each class, the optimal strategy would be to achieve this based on the (Mahalanobis) distance from the corresponding central points, for example, the means. In [2], we showed that these results could be extended to a few symmetric distributions within the exponential family. In this paper, we attempt to extend these results significantly by considering asymmetric distributions within the exponential family, for some of which even closed-form expressions of the cumulative distribution functions are not available. These distributions include the Rayleigh, Gamma, and certain Beta distributions. As in [1] and [2], the new scheme, referred to as Classification by Moments of Order Statistics (CMOS), attains an accuracy very close to the optimal Bayes' bound, as has been shown both theoretically and by rigorous experimental testing.
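A toy one-dimensional illustration of this "anti-Bayesian" flavor is sketched below for two Rayleigh classes: instead of comparing the test point with the class means, it is compared with points distant from the means, approximated here by the n/(n+1) and 1/(n+1) quantiles (stand-ins for the moments of the extreme order statistics with n = 2), and the nearer point wins. The Rayleigh choice, the quantile approximation, and the 2-OS setting are illustrative assumptions, not the paper's full scheme.

```python
import numpy as np
from scipy.stats import rayleigh

def cmos_toy_classifier(sigma1, sigma2, n=2):
    """Toy 1-D 'anti-Bayesian' rule for two Rayleigh classes (sigma1 < sigma2):
    compare the test point with order-statistic-like points shifted *away* from
    each class mean toward the other class, and pick the nearer one."""
    o1 = rayleigh(scale=sigma1).ppf(n / (n + 1.0))   # class-1 point, shifted toward class 2
    o2 = rayleigh(scale=sigma2).ppf(1.0 / (n + 1.0)) # class-2 point, shifted toward class 1
    def classify(x):
        return 1 if abs(x - o1) <= abs(x - o2) else 2
    return classify, (o1, o2)

# Quick check against sampled data from the two classes.
rng = np.random.default_rng(0)
clf, pts = cmos_toy_classifier(1.0, 3.0)
x1 = rayleigh(scale=1.0).rvs(size=5000, random_state=rng)
x2 = rayleigh(scale=3.0).rvs(size=5000, random_state=rng)
acc = (np.mean([clf(v) == 1 for v in x1]) + np.mean([clf(v) == 2 for v in x2])) / 2
print(pts, acc)
```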

On a Discrete-Time GIX/Geo/1/N Queue with Single Working Vacation and Partial Batch Rejection

This paper treats a discrete-time finite-buffer batch-arrival queue with a single working vacation and partial batch rejection, in which the inter-arrival and service times are, respectively, arbitrarily and geometrically distributed. The queue is analyzed by using the supplementary variable and the imbedded Markov-chain techniques. We obtain steady-state system-length distributions at prearrival, arbitrary, and outside observer's observation epochs. We also present the probability generating function (p.g.f.) of the actual waiting-time distribution in the system and some performance measures.

MinRoot and CMesh: Interconnection Architectures for Network-on-Chip Systems

The success of an electronic system in a System-on-Chip is highly dependent on the efficiency of its interconnection network, which is constructed from routers and channels (the routers move data across the channels between nodes). Since neither classical bus-based nor point-to-point architectures can provide scalable solutions and satisfy the tight power and performance requirements of future applications, the Network-on-Chip (NoC) approach has recently been proposed as a promising solution. Indeed, in contrast to the traditional solutions, the NoC approach can provide large bandwidth with moderate area overhead. The selected topology of the component interconnects plays a prime role in the performance of a NoC architecture, as do the routing and switching techniques that can be used. In this paper, we present two generic NoC architectures that can be customized to the specific communication needs of an application in order to reduce the area with minimal degradation of the latency of the system. An experimental study is performed to compare these structures with basic NoC topologies represented by the 2D mesh, the Butterfly Fat Tree (BFT), and SPIN. It is shown that the Cluster Mesh (CMesh) and MinRoot schemes achieve significant improvements in network latency and energy consumption, with only negligible area overhead and complexity, over existing architectures. In fact, compared with the basic NoC topologies, the CMesh and MinRoot schemes provide substantial savings in area as well, because they require fewer routers. The simulation results show that the CMesh and MinRoot networks outperform the mesh, BFT, and SPIN in the main performance metrics.

Video Quality Assessment using Visual Attention Approach for Sign Language

Visual information is very important in human perception of the surrounding world. Video is one of the most common ways to capture visual information. Video capability has many benefits and can be used in various applications. For the most part, video information is used for entertainment and relaxation; moreover, it can improve the quality of life of deaf people. Visual information is crucial for hearing-impaired people, as it allows them to communicate personally using sign language; some parts of the person being spoken to are more important than others (e.g., the hands and face). Therefore, information about the visually relevant parts of the image allows us to design an objective metric for this specific case. In this paper, we present an example of an objective metric based on human visual attention and detection of salient objects in the observed scene.
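A minimal sketch of an attention-weighted objective metric of this kind is given below: the per-pixel squared error between reference and distorted frames is weighted by a saliency map that emphasizes visually relevant regions (e.g., hands and face in sign-language video) and converted to a weighted PSNR. The saliency map is assumed to be supplied by a separate salient-object detector; the weighting scheme is illustrative, not the paper's exact metric.

```python
import numpy as np

def weighted_psnr(reference, distorted, saliency, max_value=255.0):
    """Attention-weighted PSNR of one frame.

    `saliency` is a per-pixel weight map (higher on visually relevant regions,
    e.g. the hands and face), assumed to come from a salient-object detector.
    """
    ref = reference.astype(float)
    dist = distorted.astype(float)
    w = saliency.astype(float)
    w = w / (w.sum() + 1e-12)                    # normalize the weights
    wmse = np.sum(w * (ref - dist) ** 2)         # saliency-weighted mean squared error
    return 10.0 * np.log10(max_value ** 2 / (wmse + 1e-12))

def video_score(ref_frames, dist_frames, saliency_maps):
    """Average weighted PSNR over the frames of a sequence."""
    return float(np.mean([weighted_psnr(r, d, s)
                          for r, d, s in zip(ref_frames, dist_frames, saliency_maps)]))
```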

Fingerprint Identification using Discretization Technique

Fingerprint-based identification is a well-known biometric approach in the area of pattern recognition and has long been studied because of its important role in forensic science, where it can help the government criminal justice community. In this paper, we propose an identification framework for individuals by means of fingerprints. Unlike most conventional fingerprint identification frameworks, the extracted geometrical element features (GEFs) go through a discretization process. The intention of discretization in this study is to obtain unique individual features that reflect individual variability, in order to discriminate one person from another. Previously, discretization has been shown to be particularly effective for identification of English handwriting, with an accuracy of 99.9%, and for discrimination of twins' handwriting, with an accuracy of 98%. Due to its high discriminative power, this method is adopted into this framework as an independent method to assess the accuracy of fingerprint identification. Finally, the experimental results show that the identification accuracy of the proposed system using discretization is 100% for FVC2000, 93% for FVC2002, and 89.7% for FVC2004, which is much better than conventional or existing fingerprint identification systems (72% for FVC2000, 26% for FVC2002, and 32.8% for FVC2004). The results indicate that the discretization approach boosts classification effectively and therefore proves suitable for other biometric features besides handwriting and fingerprints.
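A minimal sketch of a discretization step as it is commonly implemented is given below: each continuous geometrical element feature is mapped to an equal-width bin index learned from the training data, and identification is done by nearest match on the discretized vectors. The bin count and the matching rule are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def fit_discretizer(features, n_bins=8):
    """Learn equal-width bin edges for each feature column from training data."""
    lo, hi = features.min(axis=0), features.max(axis=0)
    return [np.linspace(l, h, n_bins + 1)[1:-1] for l, h in zip(lo, hi)]

def discretize(features, edges):
    """Map each continuous feature value to its bin index (a discrete code)."""
    features = np.atleast_2d(features)
    return np.column_stack([np.digitize(features[:, j], e)
                            for j, e in enumerate(edges)])

def identify(query, gallery_codes, labels, edges):
    """Identify a query by the gallery template whose discretized code is nearest
    (count of differing codes, a Hamming-style distance)."""
    q = discretize(query, edges)[0]
    dists = np.sum(gallery_codes != q, axis=1)
    return labels[int(np.argmin(dists))]
```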

Assessing Relationship between Type of Financial Market and Market Indices in Tehran Stock Exchange

The aim of this study was to examine and identify the type of the Iranian financial market in terms of being symmetric or asymmetric, and to measure the relationship between the type of market and the market's indices. In this study, daily information on the market's Share Price Index, Industrial Index, and Top Fifty Most Active Companies index during the years 1999-2010 has been used. In addition, to determine the type of the financial market, the rate of return on securities is taken into account. In this research, using logistic regression analysis, the relationship of the market type with the above-mentioned indices has been examined. The results showed that the type of the financial market has a significant positive association with the Share Price Index and the Industrial Index. The Top Fifty Most Active Companies index is also significantly associated with the type of the financial market; however, this relationship is inverse.
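A minimal sketch of the kind of logistic model used is shown below: a binary market-type indicator (symmetric vs. asymmetric) is regressed on the index series. How the indicator is actually constructed from returns is the paper's own choice, so the skewness-based rule, its threshold, and the column names in the usage comment are purely illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import skew

def market_type_from_returns(returns, window=60):
    """Illustrative (assumed) rule: label a window 'asymmetric' (1) when the
    rolling skewness of daily returns is far from zero."""
    roll_skew = pd.Series(returns).rolling(window).apply(skew, raw=True)
    return (roll_skew.abs() > 0.5).astype(int)     # 0.5 is an arbitrary threshold

def fit_logit(market_type, indices):
    """Logistic regression of the market-type indicator on the market indices
    (e.g. Share Price Index, Industrial Index, Top Fifty index)."""
    X = sm.add_constant(indices)
    return sm.Logit(market_type, X, missing="drop").fit(disp=0)

# Hypothetical usage, with made-up column names:
# model = fit_logit(market_type_from_returns(r), df[["share_price", "industrial", "top50"]])
# print(model.summary())
```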

A Genetic Algorithm Based Classification Approach for Finding Fault Prone Classes

Fault-proneness of a software module is the probability that the module contains faults. A correlation exists between the fault-proneness of software and measurable attributes of the code (i.e., the static metrics) and of the testing (i.e., the dynamic metrics). Early detection of fault-prone software components enables verification experts to concentrate their time and resources on the problem areas of the software system under development. This paper introduces genetic-algorithm-based software fault prediction models with object-oriented metrics. The contribution of this paper is that it uses metric values of the JEdit open source software to generate rules for classifying software modules into faulty and non-faulty categories, after which an empirical validation is performed. The results show that the genetic algorithm approach can be used for finding fault-proneness in object-oriented software components.
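A minimal sketch of a genetic algorithm that evolves threshold-based classification rules over object-oriented metrics (such as those available for JEdit classes) is given below; the chromosome encoding, the "faulty if any metric exceeds its threshold" rule, the accuracy fitness, and the GA operators are illustrative choices, not necessarily those used in the paper.

```python
import numpy as np

def evolve_rule(X, y, pop_size=50, generations=100, mutation=0.1, seed=None):
    """Evolve a rule 'faulty if any metric_j > threshold_j' with a simple GA.

    X: (n_samples, n_metrics) OO metric values; y: 1 = faulty, 0 = non-faulty.
    A chromosome is one threshold per metric; fitness is classification accuracy.
    """
    rng = np.random.default_rng(seed)
    lo, hi = X.min(axis=0), X.max(axis=0)

    def predict(thresholds):
        return (X > thresholds).any(axis=1).astype(int)

    def fitness(thresholds):
        return np.mean(predict(thresholds) == y)

    pop = rng.uniform(lo, hi, size=(pop_size, X.shape[1]))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]   # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, X.shape[1]) if X.shape[1] > 1 else 0
            child = np.concatenate([a[:cut], b[cut:]])             # one-point crossover
            mask = rng.random(X.shape[1]) < mutation               # per-gene mutation
            child[mask] = rng.uniform(lo[mask], hi[mask])
            children.append(child)
        pop = np.vstack([parents, children])
    best = pop[np.argmax([fitness(ind) for ind in pop])]
    return best, fitness(best)
```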