Abstract: The predictability of the behaviour of masonry arch bridges is
widely considered doubtful because of the lack of knowledge about the
condition of any given bridge. Established assessment methods for masonry
arch bridges include MEXE, ARCHIE, RING and the frame analysis method. The
material properties of the masonry and the fill material are extremely
difficult to determine accurately. Consequently, it is necessary to examine
the effect of the load dispersal angle through the fill material, of
variations in the stiffness of the masonry, and of the tensile and
compressive strengths of the masonry-mortar continuum. It is also important
to understand the effect of the fill material on the load dispersal angle in
order to determine its influence on ratings. In this paper, a series of
parametric studies is carried out to examine the sensitivity of assessment
ratings to the various sets of input data required by the frame analysis
method.
Abstract: This paper examines the predictability of stock returns in
developed and emerging markets by testing for long memory in stock returns
using a wavelet approach. The wavelet-based maximum likelihood estimator of
the fractional integration parameter is superior to the conventional Hurst
exponent and the Geweke and Porter-Hudak estimator in terms of asymptotic
properties and mean squared error. We use 4-year moving windows to estimate
the fractional integration parameter. The evidence suggests that stock
returns may not be predictable in the developed countries of the
Asia-Pacific region. However, predictability of stock returns in some
developing countries in this region, such as Indonesia, Malaysia and the
Philippines, may not be ruled out. Stock returns in the Thai stock market
appear not to be predictable after the political crisis in 2008.
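The wavelet maximum likelihood estimator itself is involved; as a hedged
illustration of long-memory testing, the simpler aggregated-variance
estimator of the Hurst exponent (an alternative to the estimator used in
the paper, shown here only to make the scaling idea concrete) can be
sketched as:

```python
import numpy as np

def hurst_aggvar(x, block_sizes=(4, 8, 16, 32, 64)):
    """Aggregated-variance estimate of the Hurst exponent H.

    For a long-memory series, the variance of block means scales as
    m**(2H - 2); H = 0.5 indicates no long memory (unpredictable returns).
    """
    x = np.asarray(x, dtype=float)
    log_m, log_v = [], []
    for m in block_sizes:
        n_blocks = len(x) // m
        means = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_v.append(np.log(means.var(ddof=1)))
    slope = np.polyfit(log_m, log_v, 1)[0]   # slope = 2H - 2
    return 1.0 + slope / 2.0

# White-noise "returns" have no long memory, so H should be near 0.5.
rng = np.random.default_rng(0)
h = hurst_aggvar(rng.standard_normal(8192))
```

An estimate well above 0.5 would point toward persistence, and hence
potential predictability, in the series.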
Abstract: Many systems in the natural world exhibit chaotic or non-linear behaviour whose complexity is so great that they appear to be random. Identifying chaos in experimental data is essential for characterizing the system and for analyzing the predictability of the data under analysis. The Lyapunov exponents provide a quantitative measure of the sensitivity to initial conditions and are the most useful dynamical diagnostic for chaotic systems. However, it is difficult to estimate accurately the Lyapunov exponents of chaotic signals that are corrupted by random noise. In this work, a method for estimating Lyapunov exponents from noisy time series using the unscented transformation is proposed. The proposed methodology was validated using time series obtained from known chaotic maps. The objective of the work, the proposed methodology and the validation results are discussed in detail.
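The unscented-transformation estimator is not detailed in the abstract; as a
minimal sketch of the quantity being estimated, the Lyapunov exponent of a
known chaotic map (the logistic map at r = 4, whose exponent is ln 2) can be
computed directly from the trajectory average of ln|f'(x)| when the signal
is noise-free:

```python
import math

def lyapunov_logistic(r=4.0, x0=0.1, n_transient=1000, n_iter=100_000):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)
    as the trajectory average of ln|f'(x)| = ln|r*(1 - 2x)|."""
    x = x0
    for _ in range(n_transient):            # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_iter):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / n_iter

lam = lyapunov_logistic()   # theory gives ln 2 for r = 4
```

Recovering this same exponent once the series is corrupted by noise is the
hard problem the paper's unscented-transformation method addresses.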
Abstract: An adaptive software reliability prediction model using an
evolutionary connectionist approach based on a Recurrent Radial Basis
Function architecture is proposed. Based on the currently available
software failure time data, the Fuzzy Min-Max algorithm is used to globally
optimize the number k of Gaussian nodes. The corresponding optimized neural
network architecture is iteratively and dynamically reconfigured in real
time as new actual failure time data arrive. The performance of the
proposed approach has been tested on sixteen real software failure
datasets. Numerical results show that the proposed approach is robust
across different software projects and performs better with respect to
next-step predictability than existing neural network models for failure
time prediction.
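The evolutionary recurrent architecture and the Fuzzy Min-Max optimization
are specific to the paper; the underlying step of fitting failure data with
k Gaussian radial basis functions can be sketched as follows (the centers,
widths and synthetic data below are simple heuristics for illustration, not
the paper's algorithm):

```python
import numpy as np

def fit_rbf(t, y, k=6):
    """Least-squares fit of y(t) with k Gaussian radial basis functions."""
    centers = np.linspace(t.min(), t.max(), k)
    width = (t.max() - t.min()) / k              # heuristic common width
    Phi = np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return lambda tq: np.exp(
        -((np.atleast_1d(tq)[:, None] - centers[None, :]) ** 2)
        / (2 * width ** 2)) @ w

# Synthetic cumulative-failure curve (illustrative, not a real dataset).
t = np.linspace(0, 10, 50)
y = 100 * (1 - np.exp(-0.3 * t))                 # saturating growth curve
model = fit_rbf(t, y, k=6)
rmse = float(np.sqrt(np.mean((model(t) - y) ** 2)))
```

In the paper's setting, the number k and the network configuration are
re-optimized each time a new failure observation arrives.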
Abstract: Key performance indicators (KPIs) are used for after-the-fact
evaluation of results in the construction industry, and they normally have
no provision for change. This paper proposes a set of dynamic key
performance indicators (d-KPIs) that predict the future performance of the
activity being measured and present the opportunity to change practice
accordingly. Critical to the predictability of a construction project is
the ability to achieve automated data collection. This paper proposes an
effective way to collect process and engineering management data from an
integrated construction management system. The d-KPI matrix developed in
this study, consisting of various indicators under seven categories, can be
applied to the close monitoring of aged-care facility development projects.
The d-KPI matrix also enables performance measurement and comparison at
both the project and organization levels.
Abstract: The prediction of financial time series is a very complicated
process. If the efficient market hypothesis holds, the predictability of
most financial time series is a rather controversial issue, because the
current price already contains all available information in the market.
This paper extends the Adaptive Neuro-Fuzzy Inference System (ANFIS) for
high-frequency trading, an expert system that combines fuzzy reasoning with
the pattern recognition capability of neural networks for financial
forecasting and trading at high frequency. To eliminate unnecessary input
in the training phase, a new event-based volatility model is proposed.
Taking volatility and the scaling laws of financial time series into
consideration led to the development of the Intraday Seasonality
Observation Model. This new model allows specific events and seasonalities
in the data to be observed, and subsequently removes any unnecessary data.
This new event-based volatility model provides the ANFIS system with more
accurate input and has increased the overall performance of the system.
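The Intraday Seasonality Observation Model itself is not specified in the
abstract; a minimal sketch of the general idea, filtering training data down
to high-volatility "events" using a rolling realized-volatility proxy (the
window and threshold here are assumptions, not the paper's parameters),
might look like:

```python
import numpy as np

def filter_by_volatility(returns, window=20, quantile=0.5):
    """Keep only observations whose rolling volatility exceeds the given
    quantile, discarding 'quiet' periods before model training."""
    returns = np.asarray(returns, dtype=float)
    # Rolling standard deviation as a simple realized-volatility proxy.
    vol = np.array([returns[max(0, i - window + 1): i + 1].std()
                    for i in range(len(returns))])
    threshold = np.quantile(vol, quantile)
    mask = vol >= threshold
    return returns[mask], mask

rng = np.random.default_rng(1)
# A calm first half followed by a volatile second half.
r = np.concatenate([0.001 * rng.standard_normal(500),
                    0.01 * rng.standard_normal(500)])
kept, mask = filter_by_volatility(r)
```

Only the volatile regime survives the filter, which is the kind of pruned
input the ANFIS system would then be trained on.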
Abstract: The inherently iterative nature of product design and development poses a significant challenge to reducing product design and development (PD) time. To shorten the time to market, organizations have adopted concurrent development, in which multiple specialized tasks and design activities are carried out in parallel. The iterative nature of the work, coupled with the overlap of activities, can result in unpredictable time to completion and significant rework. Many products have missed the time-to-market window because of unanticipated, or rather unplanned, iteration and rework. The iterative and often overlapped processes introduce greater ambiguity into design and development, where the traditional methods and tools of project management provide less value. In this context, identifying critical metrics for understanding iteration probability is an open research area in which a significant contribution can be made, given that iteration has been the key driver of cost and schedule risk in PD projects. Two important questions that the proposed study attempts to address are: Can we predict and identify the number of iterations in a product development flow? Can we provide managerial insights for better control over iteration? The proposal introduces the concept of decision points and, using this concept, intends to develop metrics that can provide managerial insights into iteration predictability. By characterizing the product development flow as a network of decision points, the proposed research intends to delve further into iteration probability and attempts to provide more clarity.
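As a back-of-envelope illustration of the decision-point idea (an assumption
of this sketch, not a metric from the proposal): if decision point i
independently triggers rework with probability p_i, the number of passes
through it is geometrically distributed with mean 1/(1 - p_i), so expected
passes accumulate along the flow as:

```python
def expected_passes(rework_probs):
    """Expected total passes through a chain of decision points, where
    decision point i triggers rework with probability p_i; the number of
    passes through each point is geometric with mean 1/(1 - p_i)."""
    return sum(1.0 / (1.0 - p) for p in rework_probs)

# Three decision points: one risky review dominates the iteration count.
passes = expected_passes([0.5, 0.2, 0.1])
```

Even this toy model makes the managerial point: halving the rework
probability at the riskiest decision point reduces expected iterations far
more than the same improvement elsewhere.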
Abstract: More recent satellite projects and programs make extensive use of
real-time embedded systems. 16-bit processors implementing the MIL-STD-1750
standard architecture have been used in on-board systems, and most space
applications have been written in Ada. Looking ahead, 32-bit and 64-bit
processors are needed in the area of spacecraft computing, so an effort to
study and survey 64-bit architectures for space applications is desirable.
This will also drive significant technology development in terms of VLSI
and software tools for Ada (as the legacy code is in Ada).
There are several basic requirements for a special-purpose processor of
this kind. They include radiation-hardened (RadHard) devices, very low
power dissipation, compatibility with existing operational systems,
scalable architectures for higher computational needs, reliability, higher
memory and I/O bandwidth, predictability, a real-time operating system and
manufacturability. Further considerations include the selection of FPGA
devices, the selection of EDA tool chains, design flow, partitioning of the
design, pin count, performance evaluation, timing analysis, etc.
This project comprises a brief study of the 32-bit and 64-bit processors
readily available in the market and the design and fabrication of a 64-bit
RISC processor, named RISC MicroProcessor, with the added functionality of
an extended double-precision floating-point unit and a 32-bit signal
processing unit acting as co-processors. In this paper, we emphasize the
ease and importance of using open cores (the OpenSparc T1 Verilog RTL) and
open-source EDA tools such as Icarus to develop FPGA-based prototypes
quickly. Commercial tools, such as Xilinx ISE for synthesis, are also used
where appropriate.
Abstract: If price and quantity are the fundamental building blocks of any
theory of market interactions, the importance of trading volume in
understanding the behavior of financial markets is clear. However, while
many economic models of financial markets have been developed to explain
the behavior of prices (predictability, variability, and information
content), far less attention has been devoted to explaining the behavior of
trading volume. In this article, we aim to expand our understanding of
trading volume by developing a new measure of herding behavior based on the
cross-sectional dispersion of volume betas. We apply our measure to the
Toronto Stock Exchange using monthly data from January 2000 to December
2002. Our findings show that the herd phenomenon consists of three
essential components: stationary herding, intentional herding and feedback
herding.
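A minimal sketch of a volume-beta dispersion measure (on toy simulated data;
the paper's exact beta specification and its decomposition into the three
herding components are not given in the abstract):

```python
import numpy as np

def volume_betas(stock_vol, market_vol):
    """OLS slope of each stock's volume series on the market volume."""
    x = market_vol - market_vol.mean()
    return (stock_vol - stock_vol.mean(axis=0)).T @ x / (x @ x)

def herding_dispersion(stock_vol, market_vol):
    """Cross-sectional standard deviation of volume betas; lower dispersion
    (betas clustering toward the market) is read as stronger herding."""
    return float(np.std(volume_betas(stock_vol, market_vol), ddof=1))

rng = np.random.default_rng(2)
T, N = 36, 20                          # 36 months, 20 stocks (toy data)
market = rng.standard_normal(T)
true_betas = rng.normal(1.0, 0.5, N)
stocks = market[:, None] * true_betas[None, :] \
    + 0.1 * rng.standard_normal((T, N))
disp = herding_dispersion(stocks, market)
```

Tracking this dispersion over rolling windows is one way such a measure
could be taken to the data.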
Abstract: Accounts of language acquisition differ significantly in their treatment of the role of prediction in language learning. In particular, nativist accounts posit that probabilistic learning about words and word sequences has little to do with how children come to use language. The accuracy of this claim was examined by testing whether distributional probabilities and frequency contributed to how well 3- to 4-year-olds repeat simple word chunks. Corresponding chunks were the same length, expressed similar content, and were all grammatically acceptable, yet the results of the study showed marked differences in performance when overall distributional frequency varied. A distributional model of language predicted the empirical findings better than a number of other models, replicating earlier findings and showing that children attend to distributional probabilities in an adult corpus. This suggests that language learning is based on prediction and error rather than on the abstract rules that nativist accounts propose.
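As a toy illustration of the distributional idea (a simple add-one-smoothed
bigram model over an invented corpus, not the specific model evaluated in
the study): chunks built from high-frequency word transitions score higher
than matched low-frequency chunks.

```python
from collections import Counter
import math

def bigram_logprob(chunk, corpus):
    """Add-one-smoothed bigram log-probability of a word chunk, given
    bigram counts from a toy corpus."""
    words = corpus.split()
    bigrams = Counter(zip(words, words[1:]))
    unigrams = Counter(words)
    vocab = len(set(words))
    chunk_words = chunk.split()
    score = 0.0
    for w1, w2 in zip(chunk_words, chunk_words[1:]):
        score += math.log((bigrams[(w1, w2)] + 1) / (unigrams[w1] + vocab))
    return score

# Same length, same structure, but one chunk is far more frequent.
corpus = ("a cup of tea " * 10 + "a cup of mud").strip()
frequent = bigram_logprob("a cup of tea", corpus)
rare = bigram_logprob("a cup of mud", corpus)
```

The frequent chunk receives the higher score, mirroring the repetition
advantage the study reports for high-frequency chunks.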
Abstract: The purpose of this study was to investigate the relationships
among students' process of study, creative self-efficacy and creativity
while attending college. A total of 60 students enrolled at the Hsiuping
Institute of Technology in central Taiwan were selected as the sample for
the study. The instruments for this study comprised three questionnaires
exploring these aspects.
The researchers tested process of study, creative self-efficacy and
creativity with Pearson correlation and hierarchical regression analyses.
The major findings of this research are that (1) the process of study was a
direct positive predictor of creativity, and (2) the relationship between
process of study and creativity is partially mediated by creative
self-efficacy.
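Partial mediation of the kind reported can be illustrated with the classic
regression-attenuation check on simulated data (the coefficients below are
invented for the sketch, not the study's estimates): the direct effect of
study process on creativity shrinks once the mediator is added.

```python
import numpy as np

def ols_coefs(X, y):
    """OLS coefficients with an intercept column prepended."""
    X = np.column_stack([np.ones(len(X)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

rng = np.random.default_rng(3)
n = 500
study = rng.standard_normal(n)                          # process of study
efficacy = 0.6 * study + 0.5 * rng.standard_normal(n)   # mediator
creativity = 0.3 * study + 0.5 * efficacy + 0.5 * rng.standard_normal(n)

c_total = ols_coefs(study[:, None], creativity)[1]       # total effect
c_direct = ols_coefs(np.column_stack([study, efficacy]),
                     creativity)[1]                      # direct effect
```

A direct effect that is smaller than the total effect but still positive is
the signature of partial (rather than full) mediation.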
Abstract: The management of natural resources, including water resources, requires reliable estimates of time-variant environmental parameters. Small improvements in the estimation of environmental parameters can have great effects on management decisions. Noise reduction using wavelet techniques is an effective approach for preprocessing practical data sets. The predictability enhancement of river flow time series is assessed using fractal approaches before and after applying wavelet-based preprocessing. The correlation and persistency of the time series, the minimum length sufficient for training the predicting model, and the maximum valid length of predictions were also investigated through a fractal assessment.
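The wavelet preprocessing step might be sketched, under the assumption of a
plain multi-level Haar transform with universal soft thresholding (the
paper's exact wavelet and threshold rule are not stated), as:

```python
import numpy as np

def haar_denoise(x, levels=3):
    """Multi-level Haar wavelet transform, soft-threshold the detail
    coefficients with the universal threshold, then reconstruct."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    approx, details = x.copy(), []
    for _ in range(levels):
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2.0)
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2.0)
        details.append(d)
        approx = a
    # Noise scale from the finest details; threshold sigma*sqrt(2 ln n).
    sigma = np.median(np.abs(details[0])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(n))
    details = [np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)
               for d in details]
    for d in reversed(details):
        out = np.empty(2 * len(approx))
        out[0::2] = (approx + d) / np.sqrt(2.0)
        out[1::2] = (approx - d) / np.sqrt(2.0)
        approx = out
    return approx

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 512)
clean = np.sin(2 * np.pi * 4 * t)        # smooth "flow" signal
noisy = clean + 0.3 * rng.standard_normal(512)
denoised = haar_denoise(noisy)
```

The fractal predictability assessment described above would then be run on
the series before and after this denoising step.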
Abstract: Thirty-three re-wetting tests were conducted with barley at
different combinations of temperature (5.7-46.3 °C) and relative humidity
(48.2-88.6%). The two most commonly used thin-layer drying and re-wetting
models, the Page and Diffusion models, were compared for their ability to
fit the experimental re-wetting data, based on the standard error of
estimate (SEE) between the measured and simulated moisture contents. The
comparison shows that both the Page and Diffusion models fit the re-wetting
experimental data of barley well. The average SEE values for the Page and
Diffusion models were 0.176% d.b. and 0.199% d.b., respectively. The Page
and Diffusion models were found to be suitable equations for describing the
thin-layer re-wetting characteristics of barley over a typical five-day
re-wetting period. These two models can be used for the simulation of the
deep-bed re-wetting of barley that occurs during ventilated storage and
deep-bed drying.
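The Page model used in the comparison, MR = exp(-k t^n), can be fitted by
the usual double-log linearization; the sketch below uses synthetic
moisture-ratio data (the experimental measurements are not reproduced
here), with k and n chosen purely for illustration:

```python
import numpy as np

def fit_page(t, mr):
    """Fit the Page thin-layer model MR = exp(-k * t**n) by linearizing:
    ln(-ln MR) = ln k + n * ln t, then ordinary least squares."""
    n, lnk = np.polyfit(np.log(t), np.log(-np.log(mr)), 1)
    return np.exp(lnk), n

t = np.linspace(1, 120, 40)              # hours
k_true, n_true = 0.05, 1.2
mr = np.exp(-k_true * t ** n_true)       # synthetic moisture ratio
k_hat, n_hat = fit_page(t, mr)
```

On noise-free data the linearization recovers the parameters exactly; on
measured data, the residual scatter around the fit is what the SEE in the
abstract quantifies.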
Abstract: The main focus of this paper is the integration of system process
information, obtained through an image processing system, with an evolving
knowledge database to improve the accuracy and predictability of wear
particle analysis. The objective is to intelligently automate the wear
particle analysis process using classification via self-organizing maps.
This is achieved using relationship measurements among corresponding
attributes of various wear particle measurements. Finally, a visualization
technique is proposed that helps the viewer understand and utilize these
relationships, enabling accurate diagnostics.
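Classification via self-organizing maps can be sketched with a minimal 1-D
SOM in plain NumPy (toy two-feature "particles"; the paper's attribute set,
map size and training schedule are assumptions here):

```python
import numpy as np

def train_som(data, n_nodes=4, epochs=200, lr0=0.5, sigma0=1.5, seed=0):
    """Train a 1-D self-organizing map: for each sample, pull the best-
    matching unit (BMU) and its neighbours toward the sample, with the
    learning rate and neighbourhood width decaying over epochs."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((n_nodes, data.shape[1]))
    idx = np.arange(n_nodes)
    for e in range(epochs):
        lr = lr0 * (1.0 - e / epochs)
        sigma = sigma0 * (1.0 - e / epochs) + 1e-3
        for x in data:
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))
            h = np.exp(-((idx - bmu) ** 2) / (2 * sigma ** 2))
            w += lr * h[:, None] * (x - w)
    return w

def classify(w, x):
    """Index of the best-matching unit for a sample."""
    return int(np.argmin(((w - np.asarray(x)) ** 2).sum(axis=1)))

rng = np.random.default_rng(5)
# Two toy wear-particle classes in a 2-D feature space (size, roundness).
small = rng.normal([1.0, 1.0], 0.1, (30, 2))
large = rng.normal([5.0, 5.0], 0.1, (30, 2))
data = np.vstack([small, large])
w = train_som(data)
```

After training, distinct particle classes map to distinct units, which is
the classification behaviour the paper builds its automated analysis on.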