Abstract: In this study we focus on improving the performance
of a cue-based Motor Imagery Brain-Computer Interface (BCI). For
this purpose, a data fusion approach is applied to the outputs of
different classifiers to make the best decision. In the first step, the
Distinction Sensitive Learning Vector Quantization method is used
for feature selection to determine the most informative frequencies in
the recorded signals, and its performance is evaluated by a frequency
search method. The informative features are then extracted by the
wavelet packet transform. In the next step, five different types of
classification methods are applied. The methodologies are tested on
BCI Competition II dataset III; the best obtained accuracy is 85% and
the best kappa value is 0.8. In the final step, the ordered weighted
averaging (OWA) method is used to provide a proper aggregation of
the classifier outputs. Using OWA enhances the system accuracy to
95% and the kappa value to 0.9. Applying OWA requires only 50
milliseconds of computation.
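The aggregation step described above can be sketched as follows; this is a minimal illustration of an OWA operator, and the weight vector and classifier scores below are hypothetical, not those used in the study.

```python
def owa(values, weights):
    """Ordered weighted averaging: sort the inputs in descending
    order, then take the weighted sum with the given weight vector
    (weights must sum to 1)."""
    assert abs(sum(weights) - 1.0) < 1e-9
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# Aggregate five classifier confidence scores for one class
# (illustrative numbers only).
scores = [0.9, 0.6, 0.8, 0.7, 0.5]
weights = [0.3, 0.25, 0.2, 0.15, 0.1]
fused = owa(scores, weights)
```

Because the weights attach to rank positions rather than to particular classifiers, OWA can emphasize the most (or least) confident votes regardless of which classifier produced them.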
Abstract: Trust management is one of the main challenges in Peer-to-Peer (P2P) systems. The lack of centralized control makes it difficult to govern the behavior of peers. A reputation system is one approach to providing trust assessment in a P2P system. In this paper, we use fuzzy logic to model trust in a P2P environment. Our trust model combines first-hand (direct experience) and second-hand (reputation) information to allow peers to represent and reason with uncertainty regarding other peers' trustworthiness. Fuzzy logic can help in handling the imprecise nature and uncertainty of trust. Linguistic labels are used to enable peers to assign a trust level intuitively. Our fuzzy trust model is flexible in that inference rules are used to weight first-hand and second-hand information accordingly.
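A minimal sketch of how first-hand and second-hand trust might be combined and mapped to linguistic labels. The weighting rule, the saturation constant of 10 interactions, and the label thresholds are illustrative assumptions, not the paper's actual fuzzy inference rules.

```python
def combine_trust(direct, reputation, n_interactions):
    """Combine first-hand (direct) and second-hand (reputation)
    trust, both in [0, 1]. The weight on direct experience grows
    with the number of interactions -- a crisp stand-in for the
    fuzzy inference rules described in the abstract."""
    w_direct = min(1.0, n_interactions / 10.0)  # assumed saturation
    return w_direct * direct + (1.0 - w_direct) * reputation

def linguistic_label(trust):
    """Map a numeric trust value to a linguistic label
    (thresholds are illustrative)."""
    if trust < 0.33:
        return "low"
    if trust < 0.66:
        return "medium"
    return "high"

t = combine_trust(direct=0.9, reputation=0.4, n_interactions=5)
label = linguistic_label(t)
```

With few interactions the peer leans on reputation; as direct experience accumulates, it dominates the assessment.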
Abstract: In many countries, digital city or ubiquitous city
(u-City) projects have been initiated to provide digitalized economic
environments to cities. Recently in Korea, Kangwon Province has
started the u-Kangwon project to boost local economy with digitalized
tourism services. We analyze the limitations of the ubiquitous IT
approach through the u-Kangwon case. We have found that travelers
are more interested in information quality than in access speed. For
improved service quality, we are looking to develop an
IT-convergence service design framework (ISDF). The ISDF is based
on the service engineering technique and composed of three parts:
Service Design, Service Simulation, and the Service Platform.
Abstract: This paper presents a hybrid algorithm for solving a timetabling problem commonly encountered in many universities. The problem combines the teacher assignment and course scheduling problems, and is presented as a mathematical programming model. However, the problem becomes intractable, and it is unlikely that a proven optimal solution can be obtained by an integer programming approach, especially for large problem instances. A hybrid algorithm that collaboratively combines an integer programming approach, a greedy heuristic, and a modified simulated annealing algorithm is proposed to solve the problem. Several randomly generated data sets of sizes comparable to that of an institution in Indonesia are solved using the proposed algorithm. Computational results indicate that the algorithm can overcome the difficulties of large problem sizes encountered in previous related works.
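The simulated annealing component can be sketched on a toy course-to-slot instance; the conflict data, cooling schedule, and acceptance rule below are illustrative assumptions rather than the paper's modified algorithm.

```python
import math
import random

# Toy instance: four courses, three time slots; conflicting course
# pairs must not share a slot (illustrative data only).
CONFLICTS = [(0, 1), (1, 2), (2, 3)]
N_SLOTS = 3

def cost(assignment):
    """Number of conflicting course pairs scheduled in the same slot."""
    return sum(1 for a, b in CONFLICTS if assignment[a] == assignment[b])

def neighbor(assignment, rng):
    """Move one randomly chosen course to a random slot."""
    cand = list(assignment)
    cand[rng.randrange(len(cand))] = rng.randrange(N_SLOTS)
    return tuple(cand)

def anneal(initial, t0=10.0, alpha=0.95, iters=2000, seed=0):
    """Simulated annealing: accept worse moves with probability
    exp(-delta / T), with geometric cooling."""
    rng = random.Random(seed)
    current = best = initial
    t = t0
    for _ in range(iters):
        cand = neighbor(current, rng)
        delta = cost(cand) - cost(current)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < cost(best):
                best = current
        t *= alpha
    return best

schedule = anneal((0, 0, 0, 0))
```

On this small instance the annealer reliably finds a clash-free schedule; in the paper the same mechanism operates on solutions seeded by the integer programming and greedy phases.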
Abstract: The purpose of this paper is to explore the relationship
between customer issues in company corporate governance and
financial performance. First, a theoretical background consisting of
stakeholder theory and corporate governance is presented. On this
theoretical background, the empirical research is built, collecting
data on the boards of 60 Czech joint-stock companies with regard to
their relationships with customers. Correlation analysis and
multivariate regression analysis were employed to test two
hypotheses on the sample. A weak positive correlation between the
stakeholder approach and company size was identified. However,
neither hypothesis was supported, because there was no significant
relation between the independent variables and financial performance.
Abstract: This work presents a recursive identification algorithm for closed-loop systems with a Variable Structure Controller. The suggested approach includes two stages. In the first stage, a genetic algorithm is used to obtain the parameters of the switching function that give a control signal rich in commutations (i.e., a control signal whose spectral characteristics are as close as possible to those of a white noise signal). The second stage consists of identifying the system parameters by the instrumental variable method, using the optimal switching function parameters obtained with the genetic algorithm. A simulation example is presented to test the validity of this algorithm.
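The variable-structure control law and the commutation-richness measure can be sketched as follows. The first-order switching function s = e + c·Δe, the gain, and the error sequence are all illustrative assumptions; the paper's actual switching function and its GA-tuned parameters may differ.

```python
def control_signal(errors, c, K=1.0):
    """Variable-structure control: u = -K * sign(s), with a
    hypothetical first-order switching function s = e + c * delta_e.
    The parameter c stands in for what the genetic algorithm tunes."""
    u, prev = [], errors[0]
    for e in errors:
        s = e + c * (e - prev)
        u.append(-K if s > 0 else K)
        prev = e
    return u

def commutations(u):
    """Count sign changes: a commutation-rich control signal is
    spectrally closer to white noise, which aids identification."""
    return sum(1 for a, b in zip(u, u[1:]) if a != b)

u = control_signal([0.5, -0.3, 0.2, -0.1], c=0.5)
```

A GA fitness function would reward parameter choices that maximize a measure like `commutations` (or a spectral flatness score) before the instrumental variable stage runs.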
Abstract: Most integrated inertial navigation systems (INS) and
global positioning systems (GPS) have been implemented using the
Kalman filtering technique with its drawbacks related to the need for
a predefined INS error model and the visibility of at least four
satellites. Most recently, a method using a hybrid-adaptive network
based fuzzy inference system (ANFIS) has been proposed which is
trained during the availability of GPS signal to map the error
between the GPS and the INS. Then it will be used to predict the
error of the INS position components during GPS signal blockage.
This paper introduces a genetic optimization algorithm that is used to
update the ANFIS parameters with respect to the INS/GPS error
function used as the objective function to be minimized. The results
demonstrate the advantages of the genetically optimized ANFIS for
INS/GPS integration in comparison with the conventional ANFIS,
especially in cases of satellite outages. Coping with this problem
plays an important role in assessment of the fusion approach in land
navigation.
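The genetic optimization step can be sketched with a minimal real-coded GA minimizing an error function with a known optimum. The toy objective, operators, and all settings below are illustrative stand-ins for tuning ANFIS parameters against the INS/GPS error function, not the paper's configuration.

```python
import random

def genetic_minimize(objective, dim, lo=-5.0, hi=5.0,
                     pop_size=30, gens=60, seed=1):
    """Minimal real-coded genetic algorithm: tournament selection,
    midpoint crossover, Gaussian mutation, and elitism."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for _ in range(dim)]
           for _ in range(pop_size)]

    def tournament():
        return min(rng.sample(pop, 3), key=objective)

    for _ in range(gens):
        elite = min(pop, key=objective)
        children = [elite]  # elitism: keep the best individual
        while len(children) < pop_size:
            a, b = tournament(), tournament()
            child = [(x + y) / 2 + rng.gauss(0.0, 0.3)
                     for x, y in zip(a, b)]
            children.append(child)
        pop = children
    return min(pop, key=objective)

# Toy "error function" with a known minimum at (1.0, 2.0),
# standing in for the INS/GPS position error to be minimized.
def error(p):
    return (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2

best = genetic_minimize(error, dim=2)
```

In the paper's setting, the chromosome would encode ANFIS premise and consequent parameters instead of a two-dimensional point.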
Abstract: The comparative analysis of different taxonomic
groups of microorganisms isolated from dark chernozem soils under
different crops (alfalfa, melilot, sainfoin, soybean, rapeseed) in the
Almaty region of Kazakhstan was conducted. It was shown that the
greatest number of micromycetes was typical of the soil planted with
alfalfa and rapeseed. Species diversity of micromycetes markedly
decreases toward the root surface, so that the species
composition in the rhizosphere is much more uniform than in the
virgin soil. Promising strains of microscopic fungi and yeasts with
plant growth-promoting activity toward these crops were selected. Among
the selected fungi there are representatives of Penicillium bilaiae,
Trichoderma koningii, Fusarium equiseti, Aspergillus ustus. The
highest rates of growth and development of plant seedlings were
observed under the influence of the yeasts Aureobasidium pullulans,
Rhodotorula mucilaginosa, and Metschnikowia pulcherrima. The
identification of the selected micromycetes was confirmed using
molecular genetic techniques.
Abstract: Automated rule discovery is, due to its applicability, one of the most fundamental and important methods in KDD, and it has been an active research area in the recent past. Hierarchical representation allows us to easily manage the complexity of knowledge, to view the knowledge at different levels of detail, and to focus attention on the interesting aspects only. One such efficient and easy-to-understand system is the Hierarchical Production Rule (HPR) system. An HPR, a standard production rule augmented with generality and specificity information, is of the following form: Decision If <condition> Generality Specificity. HPR systems are capable of handling the taxonomical structures inherent in knowledge about the real world. This paper focuses on the issue of mining quantified rules with a crisp hierarchical structure using a Genetic Programming (GP) approach to knowledge discovery. The post-processing scheme presented in this work uses quantified production rules as the initial individuals of GP and discovers the hierarchical structure. In the proposed approach, rules are quantified using Dempster-Shafer theory. Suitable genetic operators are proposed for the suggested encoding, and an appropriate fitness function based on the Subsumption Matrix (SM) is suggested. Finally, Quantified Hierarchical Production Rules (HPRs) are generated from the discovered hierarchy using Dempster-Shafer theory. Experimental results are presented to demonstrate the performance of the proposed algorithm.
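The quantification step rests on Dempster's rule of combination, which can be sketched directly; the two mass functions below are illustrative examples, not data from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose
    focal elements are frozensets: intersect focal elements, multiply
    masses, and renormalize by 1 - K, where K is the mass assigned
    to conflicting (empty) intersections."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: masses cannot be combined")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# Two mass functions over the frame {a, b} (illustrative numbers).
A, B = frozenset("a"), frozenset("b")
m1 = {A: 0.6, A | B: 0.4}
m2 = {A: 0.5, B: 0.3, A | B: 0.2}
m = dempster_combine(m1, m2)
```

Evidence from multiple rules supporting the same decision is pooled this way, yielding the certainty quantifiers attached to the discovered HPRs.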
Abstract: This paper proposes a new technique based on a nonlinear Minmax Detector Based (MDB) filter for image restoration. The aim of image enhancement is to reconstruct the true image from the corrupted image. The process of image acquisition frequently leads to degradation, and the quality of the digitized image becomes inferior to the original. Image degradation can be due to the addition of different types of noise to the original image; impulse noise is one of them. Impulse noise generates pixels with gray values not consistent with their local neighborhood, appearing as a sprinkle of both light and dark, or only light, spots in the image. Filtering is a technique for enhancing the image. In linear filtering, the value of an output pixel is a linear combination of neighborhood values, which can blur the image. Thus a variety of non-linear smoothing techniques have been developed. The median filter is one of the most popular non-linear filters: it is highly efficient for a small neighborhood, but for a large window and under high noise it introduces more blurring. The Centre Weighted Mean (CWM) filter has better average performance than the median filter; however, under high noise conditions original pixels are still corrupted even though noise reduction is substantial, so this technique also blurs the image. To illustrate the superiority of the proposed approach, the new scheme has been simulated alongside the standard ones, and various restoration performance measures have been compared.
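The baseline filters discussed above can be sketched in a few lines. This is a 3x3 median filter with an optional center weight (the common center-weighted variant, in which the center pixel is replicated before the median is taken); the border handling, window size, and test image are illustrative assumptions.

```python
def cwm_filter(img, center_weight=3):
    """Center-weighted median filter on a 3x3 window: the center
    pixel is replicated `center_weight` times before taking the
    median; center_weight=1 gives the plain median filter. Border
    pixels are left unchanged in this sketch."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            window = [img[i + di][j + dj]
                      for di in (-1, 0, 1) for dj in (-1, 0, 1)]
            window += [img[i][j]] * (center_weight - 1)
            window.sort()
            out[i][j] = window[len(window) // 2]
    return out

# A bright impulse (255) in a flat region is removed by both the
# plain median and the center-weighted variant.
noisy = [[10, 10, 10],
         [10, 255, 10],
         [10, 12, 10]]
```

Raising the center weight preserves fine detail at the cost of weaker impulse suppression, which is the trade-off the abstract attributes to the CWM filter.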
Abstract: The Generalized Center String (GCS) problem
generalizes the Common Approximate Substring and Common
Substring problems. GCS is known to be NP-hard; the difficulty
lies in the explosion of potential candidates. The longest center
string must be found without knowing in advance which sequences
may contain motifs in a particular biological gene process. GCS
can be solved by frequent pattern-mining techniques and is known
to be fixed-parameter tractable with respect to the input sequence
length and symbol set size. Efficient methods known as Bpriori
algorithms can solve GCS with reasonable time/space complexities;
the Bpriori 2 and Bpriori 3-2 algorithms find center strings of any
length together with the positions of all their instances in the
input sequences. In this paper, we reduce the time/space complexity
of the Bpriori algorithm with a Constraint-Based Frequent Pattern
mining (CBFP) technique that integrates the ideas of constraint-based
mining and FP-tree mining. The CBFP mining technique solves the
GCS problem not only for center strings of any length, but also for
the positions of all their mutated copies in the input sequences. It
constructs a TRIE-like FP-tree to represent the mutated copies of
center strings of any length, with constraints to restrain the growth
of the consensus tree. The complexity of the CBFP mining technique
and of the Bpriori algorithm is analyzed for both the worst case and
the average case. The correctness of the algorithm is demonstrated
by comparison with the Bpriori algorithm on artificial data.
Abstract: A virtualized and virtual approach is presented for
academically preparing students to engage successfully, from a
strategic perspective, with both the structured and unstructured
concerns and measures in the area of cyber security and
information assurance. The Master of Science in Cyber Security and
Information Assurance (MSCSIA) is a professional degree for those
who endeavor through technical and managerial measures to ensure
the security, confidentiality, integrity, authenticity, control,
availability and utility of the world's computing and information
systems infrastructure. The National University Cyber Security and
Information Assurance program is offered as a Master's degree. The
emphasis of the MSCSIA program uniquely includes hands-on
academic instruction using virtual computers. This past year, 2011,
the NU facility has become fully operational using system
architecture to provide a Virtual Education Laboratory (VEL)
accessible to both onsite and online students. The first student cohort
completed their MSCSIA training this past March 2, 2012 after
fulfilling 12 courses, for a total of 54 units of college credits. The
rapid pace scheduling of one course per month is immensely
challenging, perpetually changing, and virtually multifaceted. This
paper analyses these descriptive terms in consideration of those
globalization penetration breaches present in today's world of
cyber security. In addition, we present current NU practices to
mitigate risks.
Abstract: Hidden failure in a protection system has been
recognized as one of the main causes of power system instability
leading to system cascading collapse. This paper presents a
computationally systematic approach used to obtain the estimated
average probability of a system cascading collapse by considering
the effect of the probability of hidden failure in a protection
system. The estimated average probability of a system cascading
collapse is then used to determine the severe loading condition
contributing to the higher risk of critical system cascading collapse.
This information is essential to the system utility since it will assist
the operator to determine the highest point of increased system
loading condition prior to the event of critical system cascading
collapse.
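The role of the hidden-failure probability can be illustrated with a Monte Carlo sketch. The trip model, line count, collapse threshold, and all numbers below are invented for illustration and are not the paper's system model.

```python
import random

def collapse_probability(n_lines, p_hidden, threshold,
                         trials=20000, seed=0):
    """Monte Carlo sketch: an initiating fault trips one line, every
    other exposed line also trips with hidden-failure probability
    p_hidden, and the system counts as collapsed once `threshold`
    lines are lost. Returns the estimated average probability of
    collapse over the trials."""
    rng = random.Random(seed)
    collapses = 0
    for _ in range(trials):
        tripped = 1  # the initiating fault
        for _ in range(n_lines - 1):
            if rng.random() < p_hidden:
                tripped += 1
        if tripped >= threshold:
            collapses += 1
    return collapses / trials

low = collapse_probability(10, 0.05, threshold=3)
high = collapse_probability(10, 0.30, threshold=3)
```

Even in this toy model the estimated collapse probability rises sharply with the hidden-failure probability, which is the kind of sensitivity that makes the severe loading condition identifiable to the operator.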
Abstract: The paper proposes and validates a new method of solving instances of the vehicle routing problem (VRP). The approach is based on a multiple agent system paradigm. The paper contains the VRP formulation, an overview of the multiple agent environment used and a description of the proposed implementation. The approach is validated experimentally. The experiment plan and the discussion of experiment results follow.
Abstract: Collision is considered a time-dependent nonlinear
dynamic phenomenon. The majority of researchers have focused on
deriving the resultant damage of the ship collisions via analytical,
experimental, and finite element methods. In this paper, first, the
force-penetration curve for a head-on collision of a container ship
with a rigid barrier, based on Yang and Pedersen's methods for
internal mechanics, is studied. Next, the results obtained from the
different analytical methods are compared with each other. Then, through a
simulation of the container ship collision in Ansys Ls-Dyna, results
from finite element approach are compared with analytical methods
and the source of errors is discussed. Finally, the effects of
parameters such as velocity and collision angle on the
force-penetration curve are investigated.
Abstract: In this work, we suggest a new approach for the
control of a mobile robot capable of being a building block of an
intelligent agent. This approach includes obstacle avoidance and goal
tracking implemented as two different sliding mode controllers. A
geometry based behavior arbitration is proposed for fusing the two
outputs. The proposed structure is tested in simulations and on a real
robot. The results confirm the high performance of the method.
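A minimal sketch of the sliding mode idea for the goal-tracking behavior: a bang-bang input driven by the sign of a switching surface. The first-order surface s = theta_goal - theta, the gain, and the step size are illustrative assumptions, not the controllers designed in the paper.

```python
def sliding_mode_step(theta, theta_goal, k=1.0, dt=0.01):
    """One step of a minimal sliding mode heading controller:
    bang-bang input u = k * sign(s) with switching surface
    s = theta_goal - theta."""
    s = theta_goal - theta
    u = k if s > 0 else -k
    return theta + u * dt

# Drive the heading from 0 toward a goal of 1 rad.
theta = 0.0
for _ in range(200):
    theta = sliding_mode_step(theta, theta_goal=1.0)
# theta reaches the goal and then chatters in a small band around it
```

The chattering visible at the end is the classic sliding mode artifact; a behavior arbitration layer, as in the paper, switches between such controllers (goal tracking vs. obstacle avoidance) based on geometry.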
Abstract: Within dental-guided surgery, there has been a lack
of analytical methods for optimizing the treatment of the
rehabilitation concepts regarding geometrical variation. The purpose
of this study is to identify the greatest contributors to geometrical
variation and sensitivity with the help of virtual variation
simulation of a dental drill- and implant-guided surgery process,
using a methodical approach. It is believed that lower
geometrical variation will lead to better patient security and higher
quality of dental drill- and implant-guided surgeries. It was found
that the greatest contributor to variation, and hence where the focus
should be set in order to minimize geometrical variation, was the
assembly category (surgery). This was also the category most
sensitive to geometrical variation.
Abstract: Crude oil blending is an important unit operation in
petroleum refining industry. A good model for the blending system is
beneficial for supervisory operation, prediction of the export
petroleum quality, and realizing model-based optimal control. Since
the blending cannot follow the ideal mixing rule in practice, we
propose a static neural network to approximate the blending
properties. Using a dead-zone approach, we propose a new robust
learning algorithm and give a theoretical analysis. Real crude oil
blending data are used to illustrate the neuro-modeling approach.
Abstract: Many electronic voting systems, classified mainly as homomorphic-cryptography based, mix-net based, and blind-signature based, appeared after the eighties, when zero-knowledge proofs were introduced. The common ground of all three classes is that none of them works without real-time cryptologic calculations performed on a server. As far as is known, the agent-based approach has not been used in a secure electronic voting system. In this study, an agent-based electronic voting schema, which does not involve real-time calculations on the server side, is proposed. Conventional cryptologic methods are used in the proposed schema, and some of the requirements of an electronic voting system are constructed within it. The schema appears to be quite secure provided that the cryptologic methods and agents used are secure. In this paper, the proposed schema is explained and compared with already known electronic voting systems.
Abstract: A data warehouse (DW) is a system whose value and role lie in supporting decision-making through querying. Queries to a DW are critical with regard to their complexity and length; they often access millions of tuples and involve joins between relations and aggregations. Materialized views can provide better performance for DW queries. However, these views have a maintenance cost, so materializing all views is not possible. An important challenge of the DW environment is materialized view selection, because we have to balance the trade-off between query performance and view maintenance. Therefore, in this paper, we introduce a new approach to this challenge based on Two-Phase Optimization (2PO), which is a combination of Simulated Annealing (SA) and Iterative Improvement (II), with the use of a Multiple View Processing Plan (MVPP). Our experiments show that 2PO outperforms the original algorithms in terms of query processing cost and view maintenance cost.
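The two phases of 2PO can be sketched on a toy view-selection instance. The cost model (a fixed query-cost saving and maintenance cost per view), the candidate set, and the annealing settings are illustrative assumptions, not the paper's MVPP-based cost functions.

```python
import math
import random

# Toy instance: each candidate view has a query-cost saving and a
# maintenance cost (illustrative numbers only).
SAVING = [8.0, 5.0, 3.0, 1.0]
MAINTENANCE = [2.0, 6.0, 1.0, 4.0]
BASE_QUERY_COST = 20.0

def cost(selected):
    """Total cost = base query cost, minus savings, plus maintenance
    of the materialized views."""
    total = BASE_QUERY_COST
    for i, on in enumerate(selected):
        if on:
            total += MAINTENANCE[i] - SAVING[i]
    return total

def neighbors(selected):
    """Flip one view in or out of the materialized set."""
    for i in range(len(selected)):
        cand = list(selected)
        cand[i] = 1 - cand[i]
        yield tuple(cand)

def iterative_improvement(state):
    """Phase 1 of 2PO: greedy descent to a local optimum."""
    improved = True
    while improved:
        improved = False
        for cand in neighbors(state):
            if cost(cand) < cost(state):
                state, improved = cand, True
                break
    return state

def two_phase_optimize(initial, t0=2.0, alpha=0.9, iters=300, seed=0):
    """Phase 2 of 2PO: simulated annealing started from the II local
    optimum, with a low initial temperature."""
    rng = random.Random(seed)
    current = best = iterative_improvement(initial)
    t = t0
    for _ in range(iters):
        cand = rng.choice(list(neighbors(current)))
        delta = cost(cand) - cost(current)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < cost(best):
                best = current
        t *= alpha
    return best

selection = two_phase_optimize((0, 0, 0, 0))
```

The point of 2PO is visible in the structure: II quickly reaches a good local optimum, so SA can start at a low temperature and spend its budget refining rather than wandering.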