Abstract: There are many situations in which input feature vectors are incomplete, and methods to tackle the problem have been studied for a long time. A commonly used procedure is to replace each missing value with an imputation. This paper presents a method for imputing categorical missing data from numerical and categorical variables. The imputations are based on Simpson's fuzzy min-max neural networks, whose input variables for learning and classification are purely numerical. The proposed method extends the input to categorical variables by introducing new fuzzy sets, a new operation and a new architecture. The procedure is tested and compared with other methods using opinion poll data.
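The fuzzy min-max classifier mentioned above is built from hyperbox fuzzy sets. As an illustrative sketch (the numeric case only, not the paper's categorical extension), Simpson-style hyperbox membership can be written as follows; the sensitivity parameter `gamma` and unit-interval inputs are standard assumptions:

```python
def hyperbox_membership(x, v, w, gamma=4.0):
    """Membership of point x in a hyperbox with min vertex v and max
    vertex w (Simpson-style); returns 1.0 inside the box and decays with
    distance outside, at a rate set by gamma."""
    total = 0.0
    for xi, vi, wi in zip(x, v, w):
        total += max(0.0, 1.0 - max(0.0, gamma * min(1.0, xi - wi)))
        total += max(0.0, 1.0 - max(0.0, gamma * min(1.0, vi - xi)))
    return total / (2 * len(x))

inside = hyperbox_membership([0.3, 0.5], [0.2, 0.4], [0.6, 0.7])
outside = hyperbox_membership([0.9, 0.5], [0.2, 0.4], [0.6, 0.7])
```

A point inside the box gets membership 1.0; membership falls off linearly in each violated dimension.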
Abstract: The healthcare environment is generally perceived as
information rich yet knowledge poor, since effective analysis tools to
discover hidden relationships and trends in the data are lacking.
Valuable knowledge can, however, be discovered by applying data
mining techniques to healthcare systems. In this study, a methodology
is presented for extracting significant patterns from Coronary Heart
Disease data warehouses for the prediction of heart attack, which
unfortunately continues to be a leading cause of mortality worldwide.
For this purpose, we propose to enumerate dynamically the optimal
subsets of reduced features of high interest by using rough set
techniques combined with dynamic programming. We then validate the
classification using a Random Forest (RF) decision tree to identify
risky heart disease cases. This work is based on a large amount of
data collected from several clinical institutions, covering the
medical profiles of patients. Moreover, expert knowledge in this
field has been taken into consideration in order to define the
disease and its risk factors, and to establish significant knowledge
relationships among the medical factors. A computer-aided system was
developed for this purpose based on a population of 525 adults. The
performance of the proposed model is analyzed and evaluated against a
set of benchmark techniques applied to this classification problem.
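The feature-reduction step above rests on rough set theory. As a minimal sketch (not the authors' implementation, and with hypothetical attribute names such as `bp` and `risk`), the rough-set lower approximation that underlies reduct computation groups objects into equivalence classes and keeps only those classes that fall entirely inside a decision class:

```python
from collections import defaultdict

def lower_approximation(rows, attrs, decision, target):
    """Rough-set lower approximation: indices of objects whose equivalence
    class under the chosen attributes lies entirely inside the target
    decision class."""
    blocks = defaultdict(list)
    for i, row in enumerate(rows):
        blocks[tuple(row[a] for a in attrs)].append(i)
    return sorted(
        i
        for block in blocks.values()
        if all(rows[j][decision] == target for j in block)
        for i in block
    )

# Hypothetical toy records; attribute and value names are illustrative.
rows = [
    {"bp": "high", "chol": "high", "risk": 1},
    {"bp": "high", "chol": "high", "risk": 1},
    {"bp": "low", "chol": "high", "risk": 1},
    {"bp": "low", "chol": "high", "risk": 0},
    {"bp": "low", "chol": "low", "risk": 0},
]
certain_risky = lower_approximation(rows, ("bp", "chol"), "risk", 1)
```

Objects 2 and 3 share attribute values but disagree on the decision, so only the unambiguous block {0, 1} survives.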
Abstract: Stable nonzero populations without random deaths
caused by the Verhulst factor (Verhulst-free) are a rarity. The
majority either grow without bounds or die out from excessive harmful
mutations.
To delay the accumulation of bad genes or diseases, a new
environmental parameter Γ is introduced in the simulation. Current
results demonstrate that stability may be achieved by setting Γ = 0.1.
These steady states approach a maximum size that scales inversely
with reproduction age.
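For context, the Verhulst factor referred to above makes each individual die at random with probability N/N_max. A minimal birth-death sketch of that term alone, without the paper's genetic component or the environmental parameter Γ (the birth rule here is an assumption):

```python
import random

def generation(n, n_max, birth_prob, rng):
    """One generation: each individual dies with Verhulst probability
    n/n_max (random death from overcrowding); each survivor reproduces
    with probability birth_prob."""
    survivors = sum(1 for _ in range(n) if rng.random() >= n / n_max)
    births = sum(1 for _ in range(survivors) if rng.random() < birth_prob)
    return survivors + births

rng = random.Random(1)
n = 100
for _ in range(200):
    n = generation(n, 1000, 1.0, rng)
# with birth_prob = 1 the expected map is logistic, fixed point n_max/2
```

With this term active the population fluctuates around n_max/2 rather than growing without bounds, which is exactly the random-death stabilization the abstract seeks to avoid.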
Abstract: This paper describes the work we have accomplished in
implementing a mobile payment mechanism that enables customers to pay
bills for groceries and other purchased items in a store by means of
a mobile phone, specifically a smartphone. For communication between
the customer's handset and the merchant's point of sale (POS), we
have chosen NFC (Near Field Communication). This choice reflects the
fact that, for the most part, Pakistani smartphone users have
handsets running the Android mobile OS, which supports NFC, whereas
iOS does not.
Abstract: In this paper, a benchmarking framework is presented
for the performance assessment of irrigation systems. Firstly, a data
envelopment analysis (DEA) is applied to measure the technical
efficiency of irrigation systems. This method, based on linear
programming, aims to determine a consistent efficiency ranking of
irrigation systems in which known inputs, such as water volume
supplied and total irrigated area, and a given output corresponding to
the total value of irrigation production are taken into account
simultaneously. Secondly, in order to examine irrigation efficiency
in more detail, a cross-system comparison is elaborated using a set
of performance indicators selected by IWMI. The above methodologies
were applied to the Thessaloniki plain in Northern Greece, and the
results of the application are presented and
discussed. The conjunctive use of DEA and performance indicators
seems to be a very useful tool for efficiency assessment and
identification of best practices in irrigation systems management.
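The DEA step described above reduces to one linear program per irrigation system. A minimal sketch of the input-oriented CCR envelopment model (a standard DEA formulation; the data below are illustrative, not from the study) using SciPy:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, o):
    """Input-oriented CCR efficiency of decision-making unit o.
    X: (n, m) array of inputs (e.g. water volume, irrigated area);
    Y: (n, s) array of outputs (e.g. value of irrigation production)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                          # minimize theta
    A_ub = np.zeros((m + s, 1 + n))
    b_ub = np.zeros(m + s)
    for i in range(m):                  # sum_j lam_j * x_ij <= theta * x_io
        A_ub[i, 0] = -X[o, i]
        A_ub[i, 1:] = X[:, i]
    for r in range(s):                  # sum_j lam_j * y_rj >= y_ro
        A_ub[m + r, 1:] = -Y[:, r]
        b_ub[m + r] = -Y[o, r]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
    return res.fun

# Illustrative data: one input, one output, three systems.
X = np.array([[2.0], [4.0], [3.0]])
Y = np.array([[2.0], [2.0], [3.0]])
scores = [dea_ccr_input(X, Y, o) for o in range(3)]
```

An efficiency score of 1.0 marks an efficient unit; the second unit uses twice the input of the first for the same output, so its score is 0.5.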
Abstract: In this paper, the implementation of a rule-based
intuitive reasoner is presented. The implementation included two
parts: the rule induction module and the intuitive reasoner. A large
weather database was acquired as the data source. Twelve weather
variables from those data were chosen as the "target variables"
whose values were predicted by the intuitive reasoner. A "complex"
situation was simulated by making only subsets of the data available
to the rule induction module. As a result, the rules induced were
based on incomplete information with variable levels of certainty.
The certainty level was modeled by a metric called "Strength of
Belief", which was assigned to each rule or datum as ancillary
information about the confidence in its accuracy. Two techniques
were employed to induce rules from the data subsets: decision tree
and multi-polynomial regression, respectively for the discrete and the
continuous type of target variables. The intuitive reasoner was tested
for its ability to use the induced rules to predict the classes of the
discrete target variables and the values of the continuous target
variables. The intuitive reasoner implemented two types of
reasoning: fast and broad, where, by analogy to human thought, the
former corresponds to fast decision making and the latter to deeper
contemplation. For reference, a weather data analysis approach
which had been applied on similar tasks was adopted to analyze the
complete database and create predictive models for the same 12
target variables. The values predicted by the intuitive reasoner and
the reference approach were compared with actual data. The intuitive
reasoner reached near-100% accuracy for two continuous target
variables. For the discrete target variables, the intuitive reasoner
predicted at least 70% as accurately as the reference reasoner. Since
the intuitive reasoner operated on rules derived from only about 10%
of the total data, it demonstrated the potential advantages in dealing
with sparse data sets as compared with conventional methods.
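Of the two induction techniques mentioned, the polynomial-regression step for a continuous target can be sketched as follows (an assumed minimal form; the paper's actual rule format and its Strength of Belief assignment may differ):

```python
import numpy as np

def induce_rule(x, y, degree=2):
    """Fit a polynomial 'rule' to a (possibly small) data subset; the fit
    residual is one conceivable input to a certainty metric like the
    paper's Strength of Belief (an assumption, not the paper's formula)."""
    coeffs = np.polyfit(x, y, degree)
    predict = np.poly1d(coeffs)
    rmse = float(np.sqrt(np.mean((predict(x) - y) ** 2)))
    return predict, rmse

x = np.linspace(-1.0, 1.0, 20)                 # a 'subset' of observations
predict, rmse = induce_rule(x, x ** 2 - 0.5 * x)
```

When the subset is exactly quadratic, the induced rule recovers it, and the residual (and hence any certainty metric built on it) reflects the fit quality.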
Abstract: Network management systems play an important role in information systems. Network management encompasses several functional areas, including configuration management, fault management, performance management, security management and accounting management; among these, configuration, fault and security management are the most essential and widely useful. Configuration management monitors and maintains the whole system or LAN, fault management detects and troubleshoots faults in the system, and security management controls access to the whole system. This paper aims to extend network management functionality across configuration, fault and security management. In configuration management, the system can detect USB ports and devices, read device configurations, and identify both hardware and software ports. In security management, the system provides user account settings, user management and a proxy server feature; all security history, such as user account and proxy server activity, is kept in a standard Java serializable file, so the user can view the security and proxy server history at any time. In fault management, the user can ping clients on the network and view the resulting messages, and the system can also check the network interface card (NIC) and display its settings. The system uses RMI (Remote Method Invocation) and JNI (Java Native Interface) technology and is implemented as a client/server network management system in Java 2 Standard Edition (J2SE), supporting more than 10 clients. The paper also presents the client/server data and message structures and shows how they operate over the TCP/IP protocol.
Abstract: In this paper, the variation of spot price and the total
profits of generating companies through wholesale electricity trading
are discussed, with and without the Central Generating Stations (CGS)
share, and seasonal variations are also considered. It demonstrates
how proper analysis of generators' efficiencies and capabilities,
types of generators owned, fuel costs, transmission losses and
settling price variation, using solutions of the Optimal Power Flow
(OPF), can allow companies to maximize overall revenue. It
illustrates how OPF solutions can be used to maximize companies'
revenue under different scenarios. The analysis is also extended to
the computation of Available Transfer Capability (ATC), which is very
important for transmission system security and market forecasting.
The results show how crucial it is for companies to plan their daily
operations, and the approach is certainly useful in an online
environment of a deregulated power system. In this paper, the above
tasks are demonstrated on the 124-bus real-life Indian utility power
system of the Andhra Pradesh State Grid, and the results are
presented and analyzed.
Abstract: This paper proposes a novel stereo vision technique for
top-view book scanners that provides dense 3D point clouds of page
surfaces. This is a precondition for dewarping bound volumes
independently of 2D information on the page. Our method is based on
algorithms that normally require the projection of pattern sequences
with structured light. Instead of an additional light projection, we
use image sequences of the moving stripe lighting of the top-view
scanner. Thus the stereo vision setup is simplified without losing
measurement accuracy. Furthermore, we improve a surface-model
dewarping method by introducing a difference vector based on real
measurements. Although our proposed method is inexpensive in both
computation time and hardware requirements, it achieves good
dewarping results even for difficult examples.
Abstract: The benefits of rooftop greenery systems in buildings (such
as energy savings, reduction of greenhouse gas emissions for
mitigating climate change and maintaining sustainable development,
indoor temperature control, etc.) are well recognized; however, very
little research has been conducted to quantify these benefits in
subtropical climates such as Australia's. This study focuses on
measuring the temperature profile and determining the air
conditioning energy savings achieved by implementing rooftop greenery
systems in subtropical Central Queensland, Australia. An experimental
set-up was installed at the Rockhampton campus of Central Queensland
University, where two standard shipping containers (6 m x 2.4 m x
2.4 m) were converted into small offices, one with a green roof and
one without. These were used for temperature, humidity and energy
consumption data collection. The study found that energy savings of
up to 11.70% and a temperature difference of up to 4°C can be
achieved in March in the subtropical Central Queensland climate. It
is expected that more energy can be saved on peak summer days
(December-February), as the temperature difference between the green
roof and the non-green roof is higher in those months.
Abstract: This paper reviews three physics simulation packages that can provide researchers with a virtual ground for modeling, implementing and simulating complex models, as well as for testing their control methods with lower development cost and time. The inverted pendulum model was used as a test bed for comparing ODE, DANCE and Webots, while linear state feedback was used to control its behavior. The packages were compared with respect to model creation, solving systems of differential equations, data storage, setting system variables, controlling the experiment, and ease of use. The purpose of this paper is to give an overview of our experience with these environments and to demonstrate some of the benefits and drawbacks of each package in practice.
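The control law used in the comparison, linear state feedback on the inverted pendulum, can be sketched independently of any of the three packages. This minimal simulation assumes the linearized dynamics theta'' = (g/l)*theta + u and places both closed-loop poles at -1 (illustrative gains, not those used in the paper):

```python
def simulate(theta0=0.1, omega0=0.0, g_over_l=9.81, dt=1e-3, t_end=10.0):
    """Linearized inverted pendulum theta'' = (g/l)*theta + u, stabilized
    by full-state feedback u = -k1*theta - k2*omega; the gains place both
    closed-loop poles at -1 (illustrative choice)."""
    k1, k2 = g_over_l + 1.0, 2.0
    theta, omega = theta0, omega0
    for _ in range(int(t_end / dt)):
        u = -k1 * theta - k2 * omega          # linear state feedback
        accel = g_over_l * theta + u          # closed loop: -theta - 2*omega
        theta += dt * omega                   # explicit Euler step
        omega += dt * accel
    return theta, omega

theta_end, omega_end = simulate()             # settles near upright
```

Without feedback the upright equilibrium is unstable; with the gains above the state decays exponentially to zero.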
Abstract: The main purpose of this study is to provide a detailed
statistical overview of the time and regional distribution and the
relative timing of economic crises and government changes in 51
economies over the 1990-2007 period. In addition, the predictive
power of economic crises for the onset of government changes is
examined using the "signal approach".
The results show that the percentage of government changes is
highest in transition economies (86 percent of observations) and
lowest in Latin American economies (39 percent of observations). The
percentages of government changes are the same in developed and
developing countries (43 percent of observations). However, the
average number of crises per year (crisis frequency) is higher in
developing countries than in developed countries. Also, the
predictive power of economic crises for the onset of a government
change is highest in transition economies (81 percent) and lowest in
Latin American countries (30 percent). The predictive power of
economic crises in developing countries (43 percent) is lower than
in developed countries (55 percent).
Abstract: The symmetric solution set Σ_sym is the set of all solutions to the linear systems Ax = b, where A is symmetric and lies componentwise between given lower and upper bound matrices, and b lies componentwise between given lower and upper bound vectors. We present a contractor for Σ_sym, an iterative method that starts with some initial enclosure of Σ_sym (by means of a Cartesian product of intervals) and sequentially makes the enclosure tighter. Our contractor is based on polyhedral approximation and on solving a series of linear programs. Even though it does not converge to the optimal bounds in general, it may significantly reduce the overestimation. Its efficiency is discussed through a number of numerical experiments.
Abstract: This paper reports work done to improve the modeling of complex processes when only small experimental data sets are available. Neural networks are used to capture the nonlinear underlying phenomena contained in the data set and to partly eliminate the burden of having to specify completely the structure of the model. Two different types of neural networks were applied to a pulping problem. Three-layer feed-forward neural networks, trained using Preconditioned Conjugate Gradient (PCG) methods, were used in this investigation. Preconditioning is a method to improve convergence by lowering the condition number and increasing the clustering of the eigenvalues. The idea is to solve the modified problem M^-1 Ax = M^-1 b, where M is a positive-definite preconditioner that is closely related to A. We mainly focused on PCG-based training methods that originated from optimization theory, namely Preconditioned Conjugate Gradient with Fletcher-Reeves Update (PCGF), Preconditioned Conjugate Gradient with Polak-Ribiere Update (PCGP) and Preconditioned Conjugate Gradient with Powell-Beale Restarts (PCGB). The behavior of the PCG methods in the simulations proved to be robust against phenomena such as oscillations due to large step sizes.
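The preconditioned system M^-1 Ax = M^-1 b described above is the core of all three training variants. A minimal linear PCG with a Jacobi preconditioner (the PCGF/PCGP/PCGB variants are nonlinear-training analogues not reproduced here) can be sketched as:

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=200):
    """Linear preconditioned conjugate gradient for SPD A: equivalent to
    solving the modified problem M^-1 A x = M^-1 b."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x                     # residual
    z = M_inv @ r                     # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv @ r
        rz_next = r @ z
        p = z + (rz_next / rz) * p    # conjugate direction update
        rz = rz_next
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
M_inv = np.diag(1.0 / np.diag(A))     # Jacobi preconditioner
b = np.array([1.0, 2.0])
x = pcg(A, b, M_inv)
```

The Jacobi choice M = diag(A) lowers the condition number of M^-1 A relative to A, which is exactly the convergence mechanism the abstract describes.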
Abstract: This paper explores the effectiveness of machine learning
techniques in detecting firms that issue fraudulent financial
statements (FFS) and deals with the identification of factors
associated with FFS. To this end, a number of experiments were
conducted using representative learning algorithms, which were
trained on a data set of 164 fraud and non-fraud Greek firms from the
period 2001-2002. Deciding which particular method to choose is a
complicated problem. A good alternative to choosing only one method
is to create a hybrid forecasting system incorporating a number of
possible solution methods as components (an ensemble of classifiers).
For this purpose, we have implemented a hybrid decision support
system that combines the representative algorithms using a stacking
variant methodology and achieves better performance than any of the
examined simple and ensemble methods. To sum up, this study indicates
that the investigation of financial information can be used in the
identification of FFS and underlines the importance of financial
ratios.
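The stacking idea described above can be sketched with scikit-learn; the base learners, the synthetic data and the 164-sample size below are illustrative assumptions, not the paper's actual algorithms or data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the 164-firm financial-ratio data (assumed shape).
X, y = make_classification(n_samples=164, n_features=10, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Base classifiers' cross-validated predictions become the meta-features.
stack = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(random_state=0)),
                ("nb", GaussianNB())],
    final_estimator=LogisticRegression(),  # meta-level combiner
    cv=5,
)
stack.fit(X_tr, y_tr)
score = stack.score(X_te, y_te)
```

The meta-learner is trained on out-of-fold base predictions, which is what distinguishes stacking from simple voting.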
Abstract: Recent years have seen a growing trend towards the
integration of multiple information sources to support large-scale
prediction of protein-protein interaction (PPI) networks in model
organisms. Despite advances in computational approaches, the
combination of multiple "omic" datasets representing the same type
of data, e.g. different gene expression datasets, has not been
rigorously studied. Furthermore, there is a need to further
investigate the inference capability of powerful approaches, such as
fully-connected Bayesian networks, in the context of the prediction
of PPI networks. This paper addresses these limitations by proposing
a Bayesian approach to integrate multiple datasets, some of which
encode the same type of "omic" data, to support the identification
of PPI networks. The reported case study involved the combination of
three gene expression datasets relevant to human heart failure (HF).
In comparison with two traditional methods, the Naive Bayesian and
maximum likelihood ratio approaches, the proposed technique can
accurately identify known PPIs and can be applied to infer
potentially novel interactions.
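As a minimal sketch of the kind of evidence integration compared above (naive-Bayes-style, assuming conditionally independent datasets; not the authors' full Bayesian network):

```python
def integrate_evidence(prior, likelihood_ratios):
    """Naive-Bayes-style integration: convert the prior interaction
    probability to odds, multiply in each dataset's likelihood ratio
    (datasets assumed conditionally independent), convert back."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)
```

Each dataset's likelihood ratio scales the odds that a protein pair interacts; the independence assumption is precisely what breaks down when several datasets encode the same type of "omic" data, which motivates the paper's approach.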
Abstract: This paper presents an algorithm combining ant colony
optimization with dynamic programming to solve a dynamic facility
layout problem. The problem is separated into two phases, a static
phase and a dynamic phase. In the static phase, ant colony
optimization is used to find the best-ranked layouts for each period.
Then a dynamic programming (DP) procedure is performed in the dynamic
phase to evaluate the layout set over the multi-period planning
horizon. The proposed algorithm is tested on many problems with sizes
ranging from 9 to 49 departments and with 2 and 4 periods. The
experimental results show that the proposed method offers the plant
layout designer an alternative way to determine layouts over a
multi-period planning horizon.
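The dynamic phase described above can be sketched as a shortest-path DP over the per-period candidate layouts that the ACO ranking would supply. The cost structure below (static handling costs plus a rearrangement cost between consecutive layouts) is an assumed minimal form:

```python
def plan_layouts(static_costs, switch_cost):
    """DP over a multi-period horizon: static_costs[t][i] is the material-
    handling cost of candidate layout i in period t (candidates would come
    from the ACO ranking); switch_cost(i, j) is the rearrangement cost
    between consecutive layouts."""
    best = list(static_costs[0])
    back = []
    for t in range(1, len(static_costs)):
        cur, bk = [], []
        for j, c in enumerate(static_costs[t]):
            i = min(range(len(best)), key=lambda k: best[k] + switch_cost(k, j))
            bk.append(i)
            cur.append(best[i] + switch_cost(i, j) + c)
        best, back = cur, back + [bk]
    j = min(range(len(best)), key=best.__getitem__)
    total, seq = best[j], [j]
    for bk in reversed(back):          # recover the chosen layout per period
        j = bk[j]
        seq.append(j)
    return total, seq[::-1]

# Two periods, two candidate layouts each; switching between layouts costs 4.
total, seq = plan_layouts([[10, 12], [10, 5]],
                          lambda i, j: 0 if i == j else 4)
```

Here the DP prefers keeping layout 1 in both periods (12 + 0 + 5 = 17) over switching to the per-period optima, which is the trade-off the dynamic phase resolves.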
Abstract: Since 1984, many schemes have been proposed for digital
signature protocols, among them schemes based on the discrete
logarithm and on factorization. Recently, however, a new
identification scheme based on iterated function systems (IFS) was
proposed and shown to be more efficient. In this study, the proposed
identification scheme is transformed into a digital signature scheme
by using a one-way hash function. It is a generalization of the GQ
signature schemes. The attractor of the IFS is used to obtain the
public key from a private one, and in the encryption and decryption
of a hash function. Our aim is to provide techniques and tools that
may be useful in developing cryptographic protocols. Comparisons
between the proposed scheme and a fractal digital signature scheme
based on the RSA setting, as well as the conventional
Guillou-Quisquater signature and RSA signature schemes, are performed
to show that the proposed scheme is efficient and offers high
performance.
Abstract: Autism Spectrum Disorder (ASD) is a pervasive developmental disorder which affects individuals with varying degrees of impairment. To date, ample research has been done on serious games for children with autism. Although serious games are traditionally associated with software development, developing them in the autism field involves studying the associated technology and paying attention to aspects related to interaction with the game. Serious games for autism cover matters related to education, therapy for communication, psychomotor treatment and social behavior enhancement. In this paper, a systematic review sets out the lines of development and research currently being conducted into serious games which pursue some form of benefit in the field of autism. The paper includes a literature review of relevant serious game developments since 2007 and examines new trends.
Abstract: In this paper, we discuss the paradigm shift in bank
capital from the "gone concern" to the "going concern" mindset. We
then propose a methodology for pricing a product of this shift,
Contingent Capital Notes ("CoCos"). The Merton Model can determine a
price for credit risk by treating the firm's equity as a call option
on its assets. Our pricing methodology for CoCos also uses the credit
spread implied by the Merton Model in a subsequent derivative form
created by John Hull et al. Here, a market-implied asset volatility
is calculated from observed market CDS spreads. This implied asset
volatility is then used to estimate the probability of triggering a
predetermined "contingency event" given the distance-to-trigger
(DTT). The paper then investigates the effect of varying DTTs and
recovery assumptions on the CoCo yield. We conclude with an
investment rationale.
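The distance-to-trigger calculation described above can be sketched in a Merton-style form; the lognormal asset dynamics and the parameter names below are standard modeling assumptions, not the paper's exact calibration:

```python
from math import log, sqrt
from statistics import NormalDist

def trigger_probability(asset_value, trigger_level, drift, vol, horizon):
    """Probability that lognormal assets fall below the trigger level at
    the horizon: N(-DTT), where DTT is the distance-to-trigger measured
    in standard deviations (Merton-style sketch)."""
    dtt = (log(asset_value / trigger_level)
           + (drift - 0.5 * vol ** 2) * horizon) / (vol * sqrt(horizon))
    return NormalDist().cdf(-dtt)
```

A larger DTT (assets further above the trigger, or lower implied asset volatility) lowers the conversion probability, which in turn lowers the required CoCo yield.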