Abstract: Our goal is development of an algorithm capable of
predicting the directional trend of the Standard and Poor’s 500 index
(S&P 500). Extensive research has been published attempting to
predict different financial markets using historical data, testing on
an in-sample and trend basis, with many authors employing excessively
complex mathematical techniques. In reviewing and evaluating these
in-sample methodologies, it became evident that this approach was
unable to achieve sufficiently reliable prediction performance for
commercial exploitation. For these reasons, we moved to an
out-of-sample strategy based on linear regression analysis of an extensive
set of financial data correlated with historical closing prices of the
S&P 500. We are pleased to report a directional trend accuracy of
greater than 55% for tomorrow (t+1) in predicting the S&P 500.
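The out-of-sample approach described in this abstract can be illustrated as follows. This is a minimal sketch only: the single lagged-return predictor and the 70/30 train/test split are assumptions for illustration, not the paper's actual feature set or procedure.

```python
def directional_accuracy(prices, train_frac=0.7):
    """Illustrative sketch (not the paper's model): fit a one-variable
    least-squares regression of tomorrow's return on today's return
    over a training window, then score sign (up/down) accuracy on the
    held-out, out-of-sample remainder."""
    rets = [(b - a) / a for a, b in zip(prices, prices[1:])]
    x, y = rets[:-1], rets[1:]            # today's return -> tomorrow's (t+1)
    n_train = int(len(x) * train_frac)
    xs, ys = x[:n_train], y[:n_train]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    var = sum((v - mx) ** 2 for v in xs)
    slope = sum((u - mx) * (v - my) for u, v in zip(xs, ys)) / var if var else 0.0
    intercept = my - slope * mx
    hits = sum((slope * u + intercept >= 0) == (v >= 0)
               for u, v in zip(x[n_train:], y[n_train:]))
    return hits / (len(x) - n_train)
```

On a strictly alternating price series the fitted slope is negative and every out-of-sample direction is predicted correctly; real market data would of course yield far lower accuracy.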
Abstract: Estimation of model parameters is necessary to predict
the behavior of a system. Model parameters are estimated using
optimization criteria. Most algorithms use historical data to estimate
model parameters. The known target values (actual) and the output
produced by the model are compared. The differences between the
two form the basis for estimating the parameters. To compare
different models developed using the same data, different criteria are
used. Data obtained from short-scale projects are used here. We
consider the software effort estimation problem using a radial basis
function network. Accuracy comparisons are made using various
existing criteria for one and two predictors. We then propose a new
criterion based on linear least squares for evaluation and compare
the results for one and two predictors. We have also considered
another data set and evaluated prediction accuracy using the new
criterion. The new criterion is easier to comprehend than a single
statistic. Although software effort estimation is considered here,
the method is applicable to any modeling and prediction task.
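The abstract does not define the proposed least-squares criterion, so the following is only a plausible sketch of such a criterion: regress the actual values on the model's predictions and read model calibration off the fitted slope and intercept (a slope near 1 and an intercept near 0 suggesting an unbiased predictor).

```python
def lls_criterion(actual, predicted):
    """Hypothetical sketch of a least-squares-based evaluation
    criterion (the exact formula is not given in the abstract):
    fit actual = slope * predicted + intercept by ordinary least
    squares and return (slope, intercept) as the two-number
    criterion, easier to read than a single aggregate statistic."""
    n = len(actual)
    mp = sum(predicted) / n
    ma = sum(actual) / n
    var = sum((p - mp) ** 2 for p in predicted)
    slope = sum((p - mp) * (a - ma) for p, a in zip(predicted, actual)) / var
    intercept = ma - slope * mp
    return slope, intercept
```

A perfect predictor returns slope 1 and intercept 0; systematic over- or under-estimation shows up directly in these two numbers.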
Abstract: One of the main features of the Maghreb is its linguistic richness. Multilingualism is a fact that has marked the Maghreb from the beginning of its history up to now. Since the arrival of the Phoenicians, followed by the Carthaginians, Romans, Arabs, etc., there has been a social group in the Maghreb that commanded two kinds of idioms. The Libyc one nevertheless remained the local language used by the major part of the population. This language had a written tradition attested by many inscriptions. Among all the forms of Maghreb writing, this alphabet, however, continues to raise a certain number of questions about the origin and date of its appearance. The archaeological, linguistic, and historical data remain insufficient to answer these questions. This has not prevented researchers from offering opinions. In order to address these questions we will present the various assumptions adopted by various authors, which are founded on more or less explicit arguments. We will also discuss the various forms taken by the Libyc writing during antiquity.
Abstract: Mobile robotics is gaining an increasingly important
role in modern society. Several potentially dangerous or laborious
tasks for humans are assigned to mobile robots, which are increasingly
capable. Many of these tasks must be performed within a specified
period, i.e., meet a deadline. Missing a deadline can result in
financial and/or material losses. Mechanisms for predicting deadline
misses are fundamental, because corrective actions can then be taken
to avoid or minimize the resulting losses. In this work we propose a
simple but reliable deadline-miss prediction mechanism for mobile
robots based on historical data; for experiments and simulations we
use the Pioneer 3-DX, one of the most popular robots in academia.
Abstract: The high utilization rate of Automated Teller Machines (ATMs) has inevitably caused the phenomenon of long waiting queues. This in turn has increased out-of-stock situations. ATM utilization analysis helps to determine the usage level and to assess the necessity of an ATM based on how the machine is used. The times at which an ATM is used most frequently (peak times) are identified, and based on the predicted results the necessary actions can be taken by the bank management. The analysis can be done using data mining concepts, with the major part based on predictive data mining. The results are predicted from the historical (past) data and point to the relevant solution required. The Weka tool is used for the analysis of the data based on predictive data mining.
Abstract: This study considers the problem of calculating safety stocks in inventory systems that face demand uncertainties in disaster situations. Safety stocks are essential to keep the supply chain, which is driven by forecasts of customer needs, responsive to demand uncertainties and to reach predefined target service levels. To address the uncertainties that disaster situations impose on the industry sector, the concept of Emergency Safety Stock (ESS) is proposed. While there exists a huge body of literature on determining safety stock levels, this literature does not address the problems arising from disasters or how to deal with such situations. In this paper, the Order Quantity Model is improved to deal with demand uncertainty due to disasters by incorporating a new idea called ESS, which is based on the probability of disaster occurrence and uses a probability matrix calculated from the historical data.
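As a rough illustration of the ESS idea (the abstract gives neither the exact formula nor the probability matrix), one might scale a classical safety-stock term by the empirical disaster probability estimated from historical records. The formula below is an assumption made for the sketch, not the paper's model.

```python
import statistics

def emergency_safety_stock(normal_demand, disaster_demand, history, z=1.65):
    """Illustrative ESS calculation (hypothetical, not the paper's
    exact model): estimate the disaster probability as the fraction
    of historical periods flagged as disasters, then scale the extra
    mean demand plus a z-score buffer of disaster-period demand
    variability by that probability."""
    p_disaster = sum(history) / len(history)   # history: 1 = disaster period
    extra = statistics.mean(disaster_demand) - statistics.mean(normal_demand)
    sigma = statistics.stdev(disaster_demand)
    return p_disaster * (max(extra, 0.0) + z * sigma)
```

The z value of 1.65 corresponds roughly to a 95% one-sided service level under a normality assumption.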
Abstract: Short-Term Load Forecasting (STLF) plays an important role in the economic and secure operation of power systems. In this paper, a Continuous Genetic Algorithm (CGA) is employed to evolve the optimum large neural network structure and connection weights for the one-day-ahead electric load forecasting problem. This study describes the process of developing three-layer feed-forward large neural networks for load forecasting and then presents a heuristic search algorithm for performing an important task of this process, i.e., optimal network structure design. The proposed method is applied to STLF for the local utility. Data are clustered according to the differences in their characteristics. Special days are extracted from the normal training sets and handled separately. In this way, a solution is provided for all load types, including working days, weekends, and special days. We find good performance for the large neural networks. The proposed methodology consistently gives lower percent errors. Thus, it can be applied to automatically design an optimal load forecaster based on historical data.
Abstract: In the 1980s, companies began to feel the effect of three major influences on their product development: newer and innovative technologies, increasing product complexity, and larger organizations. Companies were therefore forced to look for new product development methods. This paper focuses on two of these new product development methods, Design for Manufacturability (DFM) and Concurrent Engineering (CE). The aim of this paper is to examine and analyze different product development methods, specifically DFM and CE. Companies can benefit by shortening the product life cycle, reducing cost, and meeting delivery schedules. This paper also presents simplified models that can be modified and used by different companies based on their objectives and requirements. The research methodology followed is the case study: two companies were taken and analyzed with respect to their product development processes. Historical data were collected and interviews were conducted at these companies; in addition, a survey of the literature and of previous research on similar topics was carried out during this research. This paper also presents an implementation cost-benefit analysis and estimates the implementation time. From this research, it was found that the two companies did not meet the delivery time to the customer. Some of their most frequently ordered products were analyzed, and 50% to 80% of these products were not delivered on time to the customers. The companies follow the traditional, sequential design-and-production method of product development, which strongly affects time to market. The case study found that by implementing these new methods and by forming multidisciplinary teams for design and quality inspection, a company can reduce the workflow from 40 steps to 30.
Abstract: This paper proposes a novel improvement of a forecasting approach based on time-invariant fuzzy time series. In contrast to traditional forecasting methods, fuzzy time series can also be applied to problems in which the historical data are linguistic values. It is shown that the proposed time-invariant method improves the performance of the forecasting process. Further, the effect of using different numbers of fuzzy sets is tested as well. As in most of the cited papers, the historical enrollment of the University of Alabama is used in this study to illustrate the forecasting process. Subsequently, the performance of the proposed method is compared with existing time-invariant fuzzy time series models in terms of forecasting accuracy. It reveals a certain performance superiority of the proposed method over the methods described in the literature.
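A minimal sketch of a fuzzy time-series forecaster in the style of Chen's simplified method is shown below; the paper's time-invariant model differs in detail, so this is illustrative only.

```python
def fuzzy_ts_forecast(series, n_sets=7):
    """Sketch of a Chen-style fuzzy time-series forecaster (an
    assumption for illustration; the paper's model differs):
    1) partition the universe of discourse into n_sets equal intervals,
    2) fuzzify each observation to the interval containing it,
    3) build fuzzy logical relationship groups A_i -> {A_j},
    4) forecast the next value as the mean midpoint of the group
       reached from the last observation's fuzzy set."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / n_sets or 1.0
    mid = lambda i: lo + (i + 0.5) * width
    fuzz = lambda x: min(int((x - lo) / width), n_sets - 1)
    labels = [fuzz(x) for x in series]
    groups = {}
    for a, b in zip(labels, labels[1:]):
        groups.setdefault(a, set()).add(b)
    last = labels[-1]
    nxt = groups.get(last, {last})
    return sum(mid(j) for j in nxt) / len(nxt)
```

Varying `n_sets` corresponds to the paper's experiment on the number of fuzzy sets, which directly changes the interval resolution and hence the forecast.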
Abstract: The quality of short-term load forecasting can improve the efficiency of planning and operation of electric utilities. Artificial Neural Networks (ANNs) are employed for nonlinear short-term load forecasting owing to their powerful nonlinear mapping capabilities. At present, there is no systematic methodology for the optimal design and training of an artificial neural network; one often has to resort to trial and error. This paper describes the process of developing three-layer feed-forward large neural networks for short-term load forecasting and then presents a heuristic search algorithm for performing an important task of this process, i.e., optimal network structure design. Particle Swarm Optimization (PSO) is used to develop the optimum large neural network structure and connection weights for the one-day-ahead electric load forecasting problem. PSO is a novel random optimization method based on swarm intelligence with a powerful global optimization capability. Employing PSO algorithms in the design and training of ANNs allows the ANN architecture and parameters to be easily optimized. The proposed method is applied to STLF for the local utility. Data are clustered according to the differences in their characteristics. Special days are extracted from the normal training sets and handled separately. In this way, a solution is provided for all load types, including working days, weekends, and special days. The experimental results show that the proposed method optimized by PSO can quicken the learning speed of the network and improve the forecasting precision compared with the conventional Back Propagation (BP) method. Moreover, it is not only simple to compute, but also practical and effective. It also provides a greater degree of accuracy in many cases and consistently gives lower percent errors for the STLF problem compared to the BP method. Thus, it can be applied to automatically design an optimal load forecaster based on historical data.
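The PSO component can be sketched generically. The snippet below minimizes an arbitrary objective function; in the paper's setting that objective would be the network's training error as a function of its weights. The inertia and acceleration constants are standard textbook values, assumed here for illustration.

```python
import random

def pso_minimize(f, dim, n_particles=20, iters=100, seed=0):
    """Minimal particle swarm optimizer (illustrative sketch): each
    particle tracks its personal best, the swarm tracks a global best,
    and velocities blend inertia with attraction to both bests."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration constants
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

Because no gradient is needed, the same loop can optimize both discrete structure choices (suitably encoded) and continuous connection weights, which is what makes PSO attractive for ANN design.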
Abstract: Modeling of complex dynamic systems, for which
mathematical models are very complicated to establish, requires new
and modern methodologies that exploit existing expert knowledge,
human experience, and historical data. Fuzzy cognitive maps are very
suitable, simple, and powerful tools for simulation and analysis of
these kinds of dynamic systems. However, human experts are
subjective and can handle only relatively simple fuzzy cognitive
maps; therefore, there is a need to develop new approaches for the
automated generation of fuzzy cognitive maps from historical data.
In this study, a new learning algorithm, called Big Bang-Big
Crunch, is proposed for the first time in the literature for the
automated generation of fuzzy cognitive maps from data. Two
real-world examples, namely a process control system and a radiation
therapy process, and one synthetic model are used to demonstrate the
effectiveness and usefulness of the proposed methodology.
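For context, the dynamics that such a learned map encodes follow a simple update rule: each concept's next activation is a squashed weighted sum of the other concepts' activations. A minimal sketch with a hypothetical weight matrix (the paper learns the weights from historical data):

```python
import math

def fcm_step(state, W, lam=1.0):
    """One update of a fuzzy cognitive map: concept i's next
    activation is sigmoid(own activation + sum over j != i of
    W[j][i] * activation of j). W is hypothetical here; learning
    algorithms such as Big Bang-Big Crunch fit W to data."""
    sig = lambda x: 1.0 / (1.0 + math.exp(-lam * x))
    n = len(state)
    return [sig(state[i] + sum(W[j][i] * state[j] for j in range(n) if j != i))
            for i in range(n)]
```

Learning an FCM then amounts to searching for the weight matrix W whose simulated trajectories best match the historical state sequences.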
Abstract: Socio-economic variables strongly influence
transportation demand. Discrete choice model analyses consider
socio-economic variables to study travelers' mode choice and
demand. However, calibrating a discrete choice model requires
extensive questionnaire surveys. Therefore, an aggregate model is
proposed here. Historical data on passenger volumes for high-speed
rail and domestic civil aviation are employed to calibrate and
validate the model. In this study, models with different
socio-economic variables, namely oil price, GDP per capita, CPI, and
economic growth rate, are compared. The results show that the model
with the oil price is better than the models with the other
socio-economic variables.
Abstract: Overall cost is a significant consideration in any
decision-making process. Although many studies have been carried out
on overall cost in construction, few have treated the uncertainties
of the real life cycle. On the basis of several case studies, a
feedback process was performed on the historical data of the studied
buildings. This process enabled the identification of some factors
causing uncertainty during the operational period. As a result, the
research proposes a new method for assessing the overall cost during
part of the building's life cycle, taking into account the building's
actual value, its end-of-life value, and the influence of the
identified life cycle uncertainty factors. The findings are a step
towards a higher level of reliability in overall cost evaluation,
taking into account some usually unexpected uncertainty factors.
Abstract: A Fuzzy Cognitive Map (FCM) is a causal graph that shows the relations between the essential components of a complex system. Experts who are familiar with the system components and their relations can generate a related FCM. A big gap arises when human experts cannot produce an FCM, or when there is no expert to produce one at all. Therefore, a new mechanism must be used to bridge this gap. In this paper, a novel learning method is proposed to construct the causal graph from historical data using a metaheuristic, namely Tabu Search (TS). The efficiency of the proposed method is shown by comparing its results on some numerical examples with those of other methods.
Abstract: Fuzzy Cognitive Maps (FCMs) have successfully
been applied in numerous domains to show relations between
essential components. Some FCMs contain many interrelated nodes,
and more nodes mean more complex system behavior and analysis. In
this paper, a novel learning method is used to construct FCMs from
historical data, and a new method based on data mining and the
DEMATEL method is defined to reduce the number of nodes. This method
clusters the nodes of an FCM based on their cause-and-effect
behavior.
Abstract: A water reuse system in a wetland paddy was simulated in this paper to supply water for industrial use. A two-tank model was employed to represent the return flow of the wetland paddy. Historical data were used for parameter estimation and model verification. With parameters estimated from the data, the model was then used to simulate a reasonable return flow rate from the wetland paddy. The simulation results show that the return flow ratio was 11.56% in the first crop season and 35.66% in the second crop season, respectively; the difference may result from the heavy rainfall in the second crop season. With the existing pond's surplus active capacity, the water reuse ratio was 17.14%, and the water supplementary ratio was 21.56%. However, the pattern of rainfall, the active capacity of the pond, and the rate of water treatment limit the volume of reuse water. Increasing the irrigation water, dredging the pond before the rainy season, and enlarging the scale of the module would help develop the water reuse system to support industrial water use around the wetland paddy.
Abstract: The objective of the paper is to develop a forecast
model for HW flows. The research methodology comprised six
modules: historical data, assumptions, choice of indicators, data
processing, data analysis with STATGRAPHICS, and forecast
models. The proposed methodology was validated in a case study
for Latvia. Hypotheses on the changes in HW for the period
2010-2020 have been developed and mathematically described with
confidence levels of 95.0% and 50.0%. A sensitivity analysis for the
analyzed scenarios was done. The results show that GDP growth
affects the total amount of HW in the country. The total amount
of HW is projected to lie within a corridor of -27.7% in the
optimistic scenario up to +87.8% in the pessimistic scenario at a
confidence level of 50.0% for the period 2010-2020. The optimistic
scenario proved to be the least sensitive to changes in GDP
growth.
Abstract: The self-organizing map (SOM) provides both clustering and visualization capabilities for mining data. Dynamic self-organizing maps such as the Growing Self-Organizing Map (GSOM) have been developed to overcome the fixed structure of the SOM and enable better representation of the discovered patterns. However, when mining large datasets or historical data, the hierarchical structure of the data is also useful for viewing cluster formation at different levels of abstraction. In this paper, we present a technique to generate concept trees from the GSOM. The formation of trees from different spread factor values of the GSOM is also investigated, and the quality of the trees is analyzed. The results show that concept trees can be generated from the GSOM, thus eliminating the need to re-cluster the data from scratch to obtain a hierarchical view of the data under study.
Abstract: The discrete choice model is the methodology most used for studying travelers' mode choice and demand. However, calibrating a discrete choice model requires extensive questionnaire surveys. In this study, an aggregate model is proposed. Historical data on passenger volumes for high-speed rail and domestic civil aviation are employed to calibrate and validate the model. Different models are compared so as to propose the best one. The results show that systems of equations forecast better than a single equation does. Models with an external variable, namely oil price, are better than models based on a closed-system assumption.
Abstract: The article presents analysis results of maps of
expected subsidence in undermined areas for road repair
management. The analysis was done in the area of Karvina district in
the Czech Republic, including undermined areas with ongoing deep
mining activities or with deep mining finished in the years 2003-2009.
The article discusses the possibilities for local road maintenance
authorities to determine, with the limited data available, the areas
that will need the most repairs in the future. Using the expected
subsidence maps, a new map of surface curvature was calculated.
Combined with road maps and historical data about repairs, the result
is five main categories of undermined areas, providing a very simple
management tool.