Abstract: The public sector holds large amounts of data in
various areas such as social affairs, economy, and tourism. Various
initiatives such as Open Government Data or the EU Directive on
public sector information aim to make these data available for public
and private service providers. Requirements for the provision of
public sector data are defined by legal and organizational
frameworks. Surprisingly, the defined requirements hardly cover
security aspects such as integrity or authenticity.
In this paper we discuss the importance of these missing
requirements and present a concept to assure the integrity and
authenticity of provided data based on electronic signatures. We
show that our concept is well suited to the provisioning of
unaltered data. We also show that it can be extended
to data that needs to be anonymized before provisioning by
incorporating redactable signatures. Our proposed concept enhances
trust and reliability of provided public sector data.
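The redaction-tolerant signing idea above can be sketched in Python. This is a minimal illustration, not the paper's actual scheme: HMAC-SHA256 stands in for the authority's public-key signature, and redaction keeps a record's hash while withholding its content, so the remaining data still verifies against the original signature.

```python
import hashlib
import hmac

SECRET = b"demo-key"  # stand-in for the authority's signing key

def record_hash(record: bytes) -> bytes:
    return hashlib.sha256(record).digest()

def sign_dataset(records):
    """Sign the list of per-record hashes, not the records themselves."""
    hashes = [record_hash(r) for r in records]
    sig = hmac.new(SECRET, b"".join(hashes), hashlib.sha256).digest()
    return hashes, sig

def redact(records, hashes, index):
    """Withhold a record's content; its hash stays, so the signature still verifies."""
    out = list(records)
    out[index] = None
    return out, hashes

def verify(records, hashes, sig):
    # Every disclosed record must match its hash, and the hash list must match the signature.
    for r, h in zip(records, hashes):
        if r is not None and record_hash(r) != h:
            return False
    expected = hmac.new(SECRET, b"".join(hashes), hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

records = [b"name=Alice", b"salary=50k", b"dept=IT"]
hashes, sig = sign_dataset(records)
redacted, hashes = redact(records, hashes, 1)  # anonymize the salary field
```

A tampered record fails verification, while the honestly redacted dataset still passes, which is the property the paper's redactable signatures provide.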
Abstract: This paper compares six approaches to object serialization
in both qualitative and quantitative terms: built-in Java object
serialization, IDL, XStream, Protocol Buffers, Apache Avro,
and MessagePack. With each approach, a common example object is
serialized to a file and the size of the file is measured. The
qualitative comparison examines whether a schema definition is
required, whether a schema compiler is required, whether
serialization is ASCII-based or binary, and which programming
languages are supported. It is clear that there is no single best
solution: each performs well in the context for which it was
developed.
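Protocol Buffers, Avro, XStream and MessagePack all require third-party libraries, so a stdlib-only Python sketch of the paper's measurement idea (serialize one common object, compare the resulting sizes of an ASCII-based versus a binary format) can use `json` and `pickle` as stand-ins:

```python
import json
import pickle

record = {"id": 42, "name": "sensor-7", "readings": [1.5, 2.25, 3.0]}

text_bytes = json.dumps(record).encode("utf-8")  # ASCII-based, schema-free, language-neutral
binary_bytes = pickle.dumps(record)              # binary, no schema compiler, Python-only

# The quantitative part of the comparison: size of each serialized form.
sizes = {"json": len(text_bytes), "pickle": len(binary_bytes)}
```

Both formats round-trip the object; which file is smaller depends on the payload, mirroring the paper's conclusion that no single approach wins everywhere.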
Abstract: Recently, many researchers have been attracted to retrieving
items from multimedia databases using impression words and their values.
Ikezoe's research is one representative approach and uses eight pairs of
opposite impression words. In previous work, we modified its retrieval
interface and proposed '2D-RIB'. The aim of the present paper
is to improve the user's satisfaction with the retrieval results in
2D-RIB by extending it. One of our extensions is
to define and introduce the following two measures: 'melody
goodness' and 'general acceptance'. Another extension is three types
of customization menus. Evaluation using a pilot system shows
that both measures, 'melody goodness'
and 'general acceptance', can contribute to the improvement.
Moreover, it is effective to introduce the customization menu
that enables a user to relax the strictness of the
retrieval condition for an impression pair according to his or her needs.
Abstract: In the Equivalent Transformation (ET) computation
model, a program is constructed by the successive accumulation of
ET rules. A meta-computation method for generating correct ET
rules has been proposed. Although the method covers a broad
range of ET-rule generation, it does not necessarily generate all
important ET rules. More ET rules can be generated by
supplementing it with generation methods that are specialized
for important ET rules. A Specialization-by-Equation (Speq) rule is
one of those important rules. A Speq rule describes a procedure in
which two variables included in an atom conjunction are equalized
due to predicate constraints. In this paper, we propose an algorithm
that systematically and recursively generates Speq rules, and we
discuss its effectiveness in the synthesis of ET programs. A Speq rule
is generated based on the proof of a logical formula consisting of a
given atom set and a disequality. The proof is carried out by utilizing
some ET rules, and the rules ultimately obtained are used in
generating Speq rules.
Abstract: Avionics software architecture has transitioned from a
federated architecture to integrated modular avionics (IMA).
ARINC 653 (Avionics Application Standard Software Interface) is a
software specification for space and time partitioning in
safety-critical avionics real-time operating systems. Methods for
transforming abstract avionics application logic functions into
executable models have been proposed, but with little consideration
of the code-generation input and output models specific to the
ARINC 653 platform and of inter-task synchronous dynamic
interaction sequences. In this paper, we propose an AADL-based
model-driven design methodology for automatically generating a Cµ
executable model on the ARINC 653 platform from the ARINC 653
architecture, defined as AADL653, in order to facilitate the
development of avionics software built on an ARINC 653 OS. This
paper presents the mapping rules between AADL653 elements and
Cµ language elements, defines the code-generation rules, and
designs an automatic Cµ code generator. We then use a case study
to illustrate our approach. Finally, we discuss related work and
future research directions.
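A mapping rule of the form "architecture element → code skeleton" can be sketched as below. The template, element fields, and names are hypothetical and far simpler than real AADL653-to-code rules; the point is only how a generator applies one mapping rule per element:

```python
# Hypothetical, much-simplified mapping from AADL-style thread
# declarations to C task skeletons (illustrative field names only).
C_TASK_TEMPLATE = """\
void {name}_entry(void) {{
    /* period: {period} ms, ARINC 653 partition: {partition} */
}}
"""

def generate_tasks(threads):
    """Apply the mapping rule 'AADL thread -> C task function' to each element."""
    return "\n".join(C_TASK_TEMPLATE.format(**t) for t in threads)

threads = [
    {"name": "nav_filter", "period": 20, "partition": "P1"},
    {"name": "display_mgr", "period": 50, "partition": "P2"},
]
code = generate_tasks(threads)
```

A real generator would additionally emit the partition configuration and the APEX process-creation calls the paper's rules cover.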
Abstract: Several studies have been carried out, using various techniques including neural networks, to discriminate vigilance states in humans from electroencephalographic (EEG) signals, but we are still far from satisfactorily usable results. The work presented in this paper aims at improving this status in two respects. Firstly, we introduce an original procedure combining two neural networks, a self-organizing map (SOM) and a learning vector quantization (LVQ) network, that automatically detects artefacted states and separates the different levels of vigilance, a major step forward in the field. Secondly, and more importantly, our study is oriented toward real-world use, and the resulting model can easily be implemented in a wearable device: it has modest computational and memory requirements, and data access is very limited in time. Furthermore, ongoing work suggests that this study should shortly result in the design of a non-invasive wearable electronic device.
Abstract: Much research has been carried out on the analysis of
traces in digital learning environments. These studies produce large
volumes of usage traces from the various actions performed by
users. However, exploiting these data to compare and improve
performance raises several issues, which several recent works have
begun to address. This research studied a series of questions about
the format and description of the data to be shared. Our goal is to
share thoughts on these issues by presenting our experience in the
analysis of trace-based log files, comparing several approaches to
automatic classification applied to e-learning platforms. Finally,
the obtained results are discussed.
Abstract: In this paper, application of artificial neural networks
in typical disease diagnosis has been investigated. The real procedure
of medical diagnosis which usually is employed by physicians was
analyzed and converted to a machine-implementable format. Then,
after selecting some symptoms of eight different diseases, a data set
containing the information of a few hundred cases was compiled and
applied to an MLP neural network. The results of the experiments,
and the advantages of using a fuzzy approach, are discussed as
well. The outcomes suggest the role of effective symptom selection and
the advantages of data fuzzification in a neural-network-based
automatic medical diagnosis system.
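The fuzzification step mentioned above can be illustrated with a minimal sketch: a crisp symptom value is mapped to membership degrees in overlapping fuzzy sets. The temperature break-points below are invented for illustration, not clinical values from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function: rises from a to a peak at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzify_temperature(t):
    # Illustrative break-points only; a real system would calibrate these.
    return {
        "normal":     tri(t, 35.0, 36.8, 37.8),
        "fever":      tri(t, 37.0, 38.5, 40.0),
        "high_fever": tri(t, 39.0, 41.0, 43.0),
    }
```

The resulting degrees (rather than the raw reading) would then feed the MLP inputs, smoothing hard symptom boundaries.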
Abstract: Data mining is used very frequently to extract
hidden information from large databases. This paper suggests the use
of decision trees for continuously extracting the clinical reasoning, in
the form of medical experts' actions, that is inherent in a large number
of EMRs (Electronic Medical Records). In this way, the extracted data
could be used to teach students of oral medicine a number of orderly
processes for dealing with patients who present with different
problems within the practice context over time.
Abstract: One of the essential sectors of Myanmar's economy is
agriculture, which is sensitive to climate variation. The most
important climatic element impacting the agricultural sector is
rainfall, so rainfall prediction becomes an important issue in an
agricultural country. Multivariate polynomial regression (MPR)
provides an effective way to describe complex nonlinear input-output
relationships so that an outcome variable can be predicted from
one or more predictor variables. In this paper, the modeling of monthly rainfall
prediction over Myanmar is described in detail by applying the
polynomial regression equation. The proposed model results are
compared to the results produced by multiple linear regression model
(MLR). Experiments indicate that the prediction model based on
MPR has higher accuracy than using MLR.
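A polynomial regression fit of the kind the paper builds on can be sketched in stdlib Python via the normal equations. This is a one-variable illustration of the fitting machinery, not the paper's multivariate rainfall model:

```python
def polyfit(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations (naive, for illustration)."""
    n = degree + 1
    # Build the normal-equation system A c = b for the Vandermonde design matrix.
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    coeffs = [0.0] * n
    for i in range(n - 1, -1, -1):
        coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j] for j in range(i + 1, n))) / A[i][i]
    return coeffs  # [c0, c1, ..., c_degree]

def predict(coeffs, x):
    return sum(c * x ** i for i, c in enumerate(coeffs))
```

The MLR baseline corresponds to `degree=1`; the MPR model generalizes this to several predictor variables with cross-terms.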
Abstract: This work proposes an approach to address automatic
text summarization. This approach is a trainable summarizer, which
takes into account several features, including sentence position,
positive keyword, negative keyword, sentence centrality, sentence
resemblance to the title, sentence inclusion of named entities, sentence
inclusion of numerical data, sentence relative length, Bushy path of
the sentence and aggregated similarity for each sentence to generate
summaries. First, we investigate the effect of each sentence feature on
the summarization task. Then we use a score function over all features
to train genetic algorithm (GA) and mathematical regression (MR)
models to obtain a suitable combination of feature weights. The
performance of the proposed approach is measured at several compression
rates on a data corpus composed of 100 English religious articles.
The results of the proposed approach are promising.
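The feature-weighted score function can be sketched as follows. The feature names and weights here are hypothetical; in the paper the weights are learned by the GA and MR models:

```python
# Hypothetical weights over a subset of the paper's sentence features.
WEIGHTS = {"position": 0.3, "title_resemblance": 0.4, "length": 0.1, "keywords": 0.2}

def score(features, weights=WEIGHTS):
    """Weighted sum of normalized feature values in [0, 1]."""
    return sum(weights[name] * features.get(name, 0.0) for name in weights)

def summarize(sentences, ratio=0.5):
    """Keep the top-scoring fraction of (text, features) pairs, in original order."""
    ranked = sorted(range(len(sentences)),
                    key=lambda i: score(sentences[i][1]), reverse=True)
    keep = set(ranked[: max(1, int(len(sentences) * ratio))])
    return [sentences[i][0] for i in range(len(sentences)) if i in keep]
```

The compression rate in the paper corresponds to `ratio` here.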
Abstract: In the paper an effective context-based lossless coding
technique is presented. Three principal and a few auxiliary contexts are
defined. The predictor adaptation technique is an improved CoBALP
algorithm, denoted CoBALP+. A cumulative predictor error combining
eight bias estimators is calculated. It is shown experimentally that
the new technique is indeed time-effective: it outperforms
well-known methods of reasonable time complexity and is
inferior only to extremely computationally complex ones.
Abstract: A new approach is adopted in this paper based
on Turk and Pentland's eigenface method. It was found that the
probability density function of the distance between the projection
vector of the input face image and the average projection vector of
a subject in the face database follows a Rayleigh distribution. In
order to decrease the false acceptance rate and increase the
recognition rate, the input face image is recognized using two
thresholds: an acceptance threshold and a rejection
threshold. We also find that the two threshold values
approach each other as the number of trials increases. During training,
in order to reduce the number of trials, the projection vectors for each
subject are averaged. Recognition experiments using the
proposed algorithm show that the recognition rate reaches
92.875%, while the average number of judgments is only 2.56.
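The two-threshold decision rule can be sketched as below. The threshold-update step is an invented illustration of the observation that the two thresholds approach each other over trials; it is not the authors' actual update rule:

```python
def decide(distance, accept_thr, reject_thr):
    """Accept below the acceptance threshold, reject above the rejection
    threshold, otherwise remain undecided and request another trial."""
    if distance <= accept_thr:
        return "accept"
    if distance >= reject_thr:
        return "reject"
    return "undecided"

def recognize(distances, accept_thr, reject_thr, shrink=0.9):
    """Narrow the gap between the thresholds on each undecided trial
    (hypothetical schedule), forcing a decision after few judgments."""
    for n, d in enumerate(distances, start=1):
        verdict = decide(d, accept_thr, reject_thr)
        if verdict != "undecided":
            return verdict, n
        accept_thr /= shrink   # relax acceptance
        reject_thr *= shrink   # tighten rejection
    return "reject", len(distances)
```

The returned trial count `n` corresponds to the paper's "average number of judgments" statistic.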
Abstract: Text document categorization involves large amounts
of data and features. The high dimensionality of the feature space is
troublesome and can affect classification performance.
Therefore, feature selection is considered one of the
crucial parts of text document categorization: selecting the best
features to represent documents reduces the dimensionality of the
feature space and hence increases performance. Many
approaches have been implemented by various researchers to
overcome this problem. This paper proposes a novel hybrid approach
for feature selection in text document categorization based on Ant
Colony Optimization (ACO) and Information Gain (IG). We also
review state-of-the-art algorithms by several other researchers.
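The Information Gain half of the hybrid can be sketched in stdlib Python. Representing documents as term sets is a simplification of the usual term-frequency representation:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class-label list."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(docs, labels, term):
    """IG of a term = H(class) - H(class | term present/absent)."""
    with_t = [l for d, l in zip(docs, labels) if term in d]
    without = [l for d, l in zip(docs, labels) if term not in d]
    n = len(labels)
    cond = sum(len(part) / n * entropy(part)
               for part in (with_t, without) if part)
    return entropy(labels) - cond
```

In the hybrid, such IG scores would guide the ACO ants' probabilistic choice of features.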
Abstract: In this paper, we propose a Connect6 solver which
adopts a hybrid approach based on a tree-search algorithm and image
processing techniques. The solver must deal with the complicated
computation and provide high performance in order to make real-time
decisions. The proposed approach enables the solver to be
implemented on a single Spartan-6 XC6SLX45 FPGA produced by
XILINX without using any external devices. The compact
implementation is achieved through image processing techniques to
optimize the tree-search algorithm of the Connect6 game. Tree
search is widely used in computer games, and an optimal search yields
the best move in every turn of a computer game. Thus, many
tree-search algorithms, such as the Minimax algorithm, and artificial
intelligence approaches have been widely proposed in this field.
However, there is one fundamental problem in this area: the
computation time increases rapidly with the growth of the game
tree. For hardware implementations, the larger the game tree, the
bigger the circuit, because of their highly parallel computation
characteristics.
Here, this paper aims to reduce the size of the Connect6 game tree using
image processing techniques and the position's symmetry property. The
proposed solver is composed of four computational modules: a
two-dimensional checkmate strategy checker, a template matching
module, a skilful-line predictor, and a next-move selector. These
modules work well together in selecting next moves from some
candidates and the total amount of their circuits is small. The details of
the hardware design for an FPGA implementation are described and
the performance of this design is also shown in this paper.
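The position-symmetry reduction can be illustrated in Python: canonicalizing a board over its eight symmetries lets symmetric positions share a single game-tree entry. This is a software sketch of the idea, not the FPGA design:

```python
def rotate(board):
    """Rotate an n x n board (tuple of row tuples) 90 degrees clockwise."""
    return tuple(zip(*board[::-1]))

def reflect(board):
    """Mirror the board horizontally."""
    return tuple(tuple(row[::-1]) for row in board)

def canonical(board):
    """Smallest of the 8 symmetric variants; all symmetric positions of a
    board map to the same canonical form, shrinking the game tree."""
    variants = []
    b = tuple(tuple(row) for row in board)
    for _ in range(4):
        variants.append(b)
        variants.append(reflect(b))
        b = rotate(b)
    return min(variants)
```

A transposition table keyed on `canonical(board)` then stores each equivalence class of positions only once.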
Abstract: On one hand, SNMP (Simple Network Management
Protocol) allows integrating different enterprise elements connected
through Internet into a standardized remote management. On the
other hand, as a consequence of the success of Intelligent Houses
they can be connected through Internet now by means of a residential
gateway according to a common standard called OSGi (Open
Services Gateway initiative). Due to the specifics of OSGi Service
Platforms and their dynamic nature, specific design criteria should
be defined to implement SNMP agents for OSGi in order to integrate
them into SNMP remote management. Based on the analysis of
the relation between both standards (SNMP and OSGi), this paper
shows how OSGi Service Platforms can be included into the SNMP
management of a global enterprise, giving implementation details
about an SNMP Agent solution and the definition of a new MIB
(Management Information Base) for managing OSGi platforms that
takes into account the specifics and dynamic nature of OSGi.
Abstract: A direct adaptive controller for a class of unknown nonlinear discrete-time systems is presented in this article. The proposed controller is constructed with a fuzzy rules emulated network (FREN). With its simple structure, human knowledge about the plant is transferred as if-then rules for setting up the network. The adjustable parameters inside the FREN are tuned by a learning mechanism with a time-varying step size, or learning rate. The variation of the learning rate is introduced by the main theorem to improve system performance and stabilization. Furthermore, the boundedness of the adjustable parameters is guaranteed through the on-line learning and the properties of the membership functions. The theoretical findings are validated by some illustrative examples.
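The time-varying learning rate idea can be sketched on a toy scalar plant. This is not the paper's FREN update law; it only illustrates a decaying step size together with parameter clamping that keeps the adjustable parameter bounded:

```python
def train(target, u, w=0.0, steps=50, eta0=0.8):
    """Tune a single gain w of the toy plant y = w * u so that y tracks
    a constant target, using a decaying learning rate (illustrative)."""
    for k in range(steps):
        y = w * u
        e = target - y
        eta = eta0 / (1.0 + 0.1 * k)   # time-varying step size for stability
        w += eta * e * u               # gradient-style update
        w = max(-10.0, min(10.0, w))   # keep the adjustable parameter bounded
    return w
```

A constant large step size could oscillate; the decaying schedule damps updates as learning progresses, which is the stabilization role the abstract attributes to the varying learning rate.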
Abstract: All-to-all personalized communication, also known as complete exchange, is one of the densest communication patterns in parallel computing. In this paper, we propose new indirect algorithms for complete exchange on all-port rings and tori. The new algorithms fully utilize all communication links and transmit messages along shortest paths to completely achieve the theoretical lower bounds on message transmission, which had not been achieved by other existing indirect algorithms. For a 2D r × c (r ≠ c) all-port torus, the algorithm has optimal transmission cost and O(c) message startup cost. In addition, the proposed algorithms accommodate non-power-of-two tori, where the number of nodes in each dimension need not be a power of two or square. Finally, the algorithms are conceptually simple and symmetrical for every message and every node, so that they can be easily implemented and achieve the optimum in practice.
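The transmission-cost accounting can be illustrated by simulating a naive complete exchange on a bidirectional ring. This is not the paper's indirect algorithm; it only shows what "every node sends a distinct message to every other node along a shortest path" costs in total link traversals:

```python
def ring_exchange(n):
    """Simulate complete exchange on an n-node bidirectional ring:
    each node sends one distinct message to every other node along the
    shortest ring path. Returns (received, total_hop_count)."""
    received = {i: {} for i in range(n)}
    hops = 0
    for src in range(n):
        for dst in range(n):
            if src == dst:
                continue
            cw = (dst - src) % n       # clockwise distance
            dist = min(cw, n - cw)     # shortest-path distance
            hops += dist
            received[dst][src] = f"msg-{src}->{dst}"
    return received, hops
```

The indirect algorithms of the paper schedule these transmissions so that all ports and links are busy simultaneously, meeting this transmission total with minimal startups.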
Abstract: Requirements management is critical to software
delivery success and project lifecycle. Requirements management
and their traceability provide assistance for many software
engineering activities like impact analysis, coverage analysis,
requirements validation, and regression testing. In addition,
requirements traceability is a recognized component of many
software process improvement initiatives. Requirements traceability
also helps to control and manage evolution of a software system.
This paper aims to provide an evaluation of current requirements
management and traceability tools. Managers and test managers
require an appropriate tool for the software under test. We hope the
evaluation presented here will help in selecting an efficient and
effective tool.
Abstract: In this paper a new fast simplification method is
presented. The method realizes Karnaugh maps with a large
number of variables. In order to accelerate the operation of the
proposed method, a new approach for fast detection of groups
of ones is presented. This approach is implemented in the
frequency domain: the search operation relies on performing
cross-correlation in the frequency domain rather than the time
domain. It is proved mathematically and practically that the
number of computation steps required by the presented method
is less than that needed by conventional cross-correlation.
Simulation results using MATLAB confirm the theoretical
computations. Furthermore, a powerful solution for the
realization of complex functions is given. The simplified
functions are implemented using a new design for neural
networks. Neural networks are used because they are fault
tolerant and, as a result, can recognize signals even with noise
or distortion, which is very useful for logic functions used in
data and computer communications. Moreover, the implemented
functions are realized with a minimum amount of components.
This is done by using modular neural networks (MNNs) that
divide the input space into several homogeneous regions. This
approach is applied to implement the XOR function, 16 logic
functions on one bit level, and a 2-bit digital multiplier.
Compared to previous non-modular designs, a clear reduction in
the order of computations and hardware requirements is achieved.
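Frequency-domain cross-correlation for locating a group of ones can be sketched in stdlib Python (radix-2 FFT, so lengths must be powers of two). This demonstrates the correctness of the frequency-domain search, not the paper's reduced step count:

```python
import cmath

def fft(a, invert=False):
    """Recursive radix-2 Cooley-Tukey FFT; invert=True gives the
    unnormalized inverse transform (divide by n once at the end)."""
    n = len(a)
    if n == 1:
        return a[:]
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = 1 if invert else -1
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + w
        out[k + n // 2] = even[k] - w
    return out

def cross_correlate(signal, pattern):
    """Circular cross-correlation via IFFT(FFT(signal) * conj(FFT(pattern)))."""
    n = len(signal)
    fs = fft([complex(x) for x in signal])
    fp = fft([complex(x) for x in pattern])
    prod = [a * b.conjugate() for a, b in zip(fs, fp)]
    return [x.real / n for x in fft(prod, invert=True)]

# Locate a group of two adjacent ones in a map row of length 8.
row = [0, 0, 1, 1, 0, 0, 0, 0]
pattern = [1, 1, 0, 0, 0, 0, 0, 0]
corr = cross_correlate(row, pattern)
best = max(range(len(corr)), key=corr.__getitem__)
```

The correlation peak marks the shift where the group of ones aligns with the search pattern, which is exactly the detection step the method performs in the frequency domain.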