Abstract: The European Rail Traffic Management System (ERTMS) is the European reference for interoperable and safer signaling systems that manage train operations efficiently. Once implemented, it allows trains to cross national borders within Europe seamlessly. ERTMS defines a secure communication protocol, EURORADIO, based on open communication networks. Its RadioInfill function can improve the reaction of the signaling system to changes in line conditions, avoiding unnecessary braking; its advantages in terms of power saving and travel time have been analyzed. In this paper a software implementation of the EURORADIO protocol with RadioInfill for ERTMS Level 1 using GSM-R is illustrated as part of the SR-Secure Italian project. In this building-blocks architecture the EURORADIO layers communicate with each other through modular Application Programming Interfaces. Security coding rules and the railway industry requirements specified by the EN 50128 standard have been respected. The proposed implementation has successfully passed conformity tests and has been tested on a computer-based simulator.
Abstract: Segmentation is an important step in medical image analysis and classification for radiological evaluation or computer-aided diagnosis. Computer-aided diagnosis (CAD) of lung CT generally first segments the area of interest (the lung) and then analyzes the segmented area for nodule detection in order to diagnose the disease. For a normal lung, segmentation can be performed by exploiting the excellent contrast between air and the surrounding tissues. However, this approach fails when the lung is affected by high-density pathology. Dense pathologies are present in approximately a fifth of clinical scans, and for computer analysis such as detection and quantification of abnormal areas it is vital that the entire lung region is provided exactly as present in the original image and that no part of it is eradicated. In this paper we propose a lung segmentation technique which accurately segments the lung parenchyma from lung CT scan images. The algorithm was tested on 25 datasets of different patients received from Akron University, USA and Aga Khan Medical University, Karachi, Pakistan.
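The contrast-based segmentation of a normal lung mentioned above can be sketched as a simple intensity threshold; the Hounsfield-unit cutoff and the synthetic slice below are illustrative assumptions, not the paper's algorithm (which must also cope with dense pathologies that defeat pure thresholding):

```python
import numpy as np

def threshold_lung(ct_slice, air_hu=-400):
    """Naive lung mask: voxels darker than the cutoff are treated as air/lung."""
    return ct_slice < air_hu

# Synthetic 2D "slice": soft tissue around 40 HU, lung parenchyma around -800 HU.
ct = np.full((8, 8), 40, dtype=np.int16)
ct[2:6, 2:6] = -800  # low-density lung region
mask = threshold_lung(ct)
print(mask.sum())  # 16 lung pixels
```

A dense pathology raises the intensity of lung voxels toward that of the surrounding tissue, which is exactly why this cheap threshold fails on the scans the paper targets.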
Abstract: The purpose of the study reported here was to design an E-learning-based Information Dissemination System (IDS) for Iranian agriculture. A questionnaire was developed for designing the Information Dissemination System. The questionnaire was distributed to 96 extension agents who work for the Management of Extension and Farming System of Khuzestan province of Iran. The data collected were analyzed using the Statistical Package for the Social Sciences (SPSS). Appropriate descriptive statistics (frequencies, percentages, means, and standard deviations) were used. In this study there was a significant relationship between age, IT skill and knowledge, years of extension work, the extent of information-seeking motivation, level of job satisfaction, and level of education on the one hand and the use of information technology by extension agents on the other. According to the extension agents, five factors were ranked as the five most essential items for designing an E-learning-based Information Dissemination System (IDS) for Iranian agriculture. These factors are: 1) Establish communication between farmers, coordinators (extension agents), agricultural experts, research centers, and the community through information technology. 2) The communication between all parties should be mutual. 3) The information must be based on farmers' needs. 4) The Internet should be used as a facility to transfer advanced agricultural information to the farming community. 5) Farmers may be illiterate or speak only a local language, and they are not expected to use the system directly. Knowledge produced by agricultural scientists must be transformed into a computer-understandable presentation. To design the Information Dissemination System, electronic communication in the agricultural society and rural areas must be developed. This communication must be mutual among all actors.
Abstract: This paper presents the benchmarking results and performance evaluation of different clusters built at the National Center for High-Performance Computing in Taiwan. The performance of the processor, memory subsystem and interconnect is a critical factor in the overall performance of high-performance computing platforms. The evaluation compares different system architectures and software platforms. Most supercomputer sites use HPL to benchmark their system performance, in accordance with the requirements of the TOP500 list. In this paper we consider system memory access factors that affect benchmark performance, such as processor and memory performance. We hope this work will provide useful information for the future development and construction of cluster systems.
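The memory-performance factor discussed above can be illustrated with a STREAM-style triad kernel that estimates sustainable memory bandwidth; this NumPy sketch is an assumption for illustration and is not one of the benchmarks used in the paper:

```python
import time
import numpy as np

def triad_bandwidth(n=10_000_000, repeats=5):
    """STREAM-style triad a = b + s*c; returns the best observed GB/s."""
    b = np.random.rand(n)
    c = np.random.rand(n)
    s = 3.0
    best = 0.0
    for _ in range(repeats):
        t0 = time.perf_counter()
        a = b + s * c  # one read-modify-write sweep over three arrays
        dt = time.perf_counter() - t0
        # Three arrays of 8-byte doubles move through memory per iteration.
        best = max(best, 3 * 8 * n / dt / 1e9)
    return best

print(f"{triad_bandwidth():.1f} GB/s (best of 5)")
```

Such a microbenchmark isolates memory-subsystem throughput, which is one of the factors behind the HPL score differences the paper analyzes.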
Abstract: This article presents the geometry and structure reconstruction procedure of an aircraft model for flutter research (based on the I22-IRYDA aircraft). For the reconstruction, Reverse Engineering techniques and advanced surface-modeling CAD tools are used. The authors discuss all stages of the data acquisition process and the computation and analysis of the measured data. For acquisition, a three-dimensional structured-light scanner was used. In the further sections, details of the reconstruction process are presented. The geometry reconstruction procedure transforms the measured input data (a point cloud) into a three-dimensional parametric computer model (a NURBS solid model) which is compatible with CAD systems. In parallel with the geometry of the aircraft, the internal structure (structural model) is extracted and modeled. In the last chapter the evaluation of the obtained models is discussed.
Abstract: Computer languages are usually lumped together into broad "paradigms", leaving us in want of a finer classification of kinds of language. Theories distinguishing between "genuine differences" in language have been called for, and we propose that such differences can be observed through a notion of expressive mode. We outline this concept, propose how it could be operationalized and indicate a possible context for the development of a corresponding theory. Finally we consider a possible application in connection with the evaluation of language revision. We illustrate this with a case study, investigating possible revisions of the relational algebra in order to overcome weaknesses of the division operator in connection with universal queries.
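The division operator and the universal ("for all") queries it supports can be illustrated in miniature with set comprehensions; the student/course relation below is a hypothetical example, not the paper's case study:

```python
def divide(r, s):
    """Relational division r(x, y) ÷ s(y): the x values related to every y in s."""
    xs = {x for (x, _) in r}
    return {x for x in xs if all((x, y) in r for y in s)}

# Universal query: "which students have taken every required course?"
takes = {("ann", "db"), ("ann", "os"), ("bob", "db")}
required = {"db", "os"}
print(divide(takes, required))  # {'ann'}
```

The weaknesses discussed in the abstract stem from exactly this pattern: division is derived rather than primitive, and expressing it through the other relational operators is notoriously awkward.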
Abstract: Quality of Service (QoS) routing aims to find a path between source and destination that satisfies the QoS requirements while using the network resources and the underlying routing algorithm efficiently, and to find low-cost paths that satisfy given QoS constraints. One of the key issues in providing end-to-end QoS guarantees in packet networks is determining a feasible path that satisfies a number of QoS constraints. We present an Optimized Multi-Constrained Routing (OMCR) algorithm for the computation of constrained paths for QoS routing in computer networks. OMCR applies distance-vector routing to construct a shortest path to each destination with reference to a given optimization metric, from which a set of feasible paths is derived at each node. OMCR is able to find feasible paths as well as optimize the utilization of network resources. OMCR operates with the hop-by-hop, connectionless routing model of the IP Internet and does not create any loops while finding the feasible paths. Nodes running OMCR do not necessarily maintain a global view of the network state, such as topology and resource information, and routing updates are sent only to neighboring nodes, whereas the counterpart link-state routing method depends on complete network state for constrained path computation, which incurs excessive communication overhead.
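The feasibility problem at the heart of multi-constrained routing can be sketched as follows; this brute-force enumeration only illustrates the constraint check that OMCR addresses, not the distance-vector algorithm itself (the graph and its cost/delay metrics are hypothetical):

```python
def feasible_paths(graph, src, dst, max_delay):
    """Enumerate loop-free paths and keep those meeting the delay bound,
    sorted by cost (the optimization metric)."""
    out = []

    def dfs(node, path, cost, delay):
        if delay > max_delay:          # QoS constraint violated: prune
            return
        if node == dst:
            out.append((cost, path))
            return
        for nxt, (c, d) in graph[node].items():
            if nxt not in path:        # loop-free paths only
                dfs(nxt, path + [nxt], cost + c, delay + d)

    dfs(src, [src], 0, 0)
    return sorted(out)

g = {  # neighbor -> (cost, delay)
    "A": {"B": (1, 5), "C": (4, 1)},
    "B": {"D": (1, 5)},
    "C": {"D": (1, 1)},
    "D": {},
}
print(feasible_paths(g, "A", "D", max_delay=4)[0])  # (5, ['A', 'C', 'D'])
```

Note that the cheapest path A-B-D (cost 2) violates the delay bound, so the feasible optimum is the more expensive A-C-D; OMCR derives such feasible paths without the exhaustive search used here.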
Abstract: Personal computers draw non-sinusoidal current in which odd harmonics are the most significant. The power quality of distribution networks is severely affected by the flow of these generated harmonics during the operation of electronic loads. In this paper, mathematical modeling of the odd current harmonics, such as the 3rd, 5th, 7th and 9th, that influence power quality is presented. Live signals have been captured with the help of a power quality analyzer for analysis purposes. An interesting feature, verified theoretically, is that the Total Harmonic Distortion (THD) of the current decreases with an increase in nonlinear loads. The results obtained using the mathematical expressions have been compared with the practical results, and encouraging results have been found.
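The current THD referred to above is the RMS of the harmonic components relative to the fundamental, THD_I = sqrt(I3^2 + I5^2 + ...) / I1. A minimal sketch (the amplitudes are illustrative, not the captured signals):

```python
import math

def thd(harmonics):
    """Current THD: RMS of the higher harmonics relative to the fundamental.
    `harmonics` lists amplitudes starting with the fundamental I1."""
    fundamental, *rest = harmonics
    return math.sqrt(sum(h * h for h in rest)) / fundamental

# Illustrative odd-harmonic amplitudes (I1, I3, I5, I7, I9) in amperes.
print(round(thd([1.0, 0.8, 0.6, 0.4, 0.2]), 3))  # 1.095
```

Because THD is normalized by the fundamental, aggregating many nonlinear loads can lower the ratio even while the absolute harmonic currents grow, which is the effect the paper verifies.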
Abstract: The weight-constrained shortest path problem (WCSPP) is one of the best-known basic problems in combinatorial optimization. Because of its importance in many areas of application, such as computer science, engineering and operations research, many researchers have studied the WCSPP extensively. This paper mainly concentrates on reducing the total search space for solving the WCSPP using an existing Genetic Algorithm (GA). For this purpose, some controlled schemes of genetic operators are adopted on a list chromosome representation. This approach gives a near-optimum solution in fewer elapsed generations than the classical GA technique. From further analysis of the matter, a new generalized schema theorem is also developed from the philosophy of Holland's theorem.
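A GA for the WCSPP can be sketched as follows; this skeleton uses plain random-walk chromosomes and a weight-violation penalty, not the paper's controlled operators or list-chromosome scheme (the graph and parameters are hypothetical):

```python
import random

def random_path(graph, src, dst, limit=10):
    """One chromosome: a random loop-free walk from src to dst (None if stuck)."""
    path = [src]
    while path[-1] != dst and len(path) < limit:
        choices = [n for n in graph[path[-1]] if n not in path]
        if not choices:
            return None
        path.append(random.choice(choices))
    return path if path[-1] == dst else None

def fitness(graph, path, w_max):
    """Path cost, heavily penalized when the weight constraint is violated."""
    cost = sum(graph[a][b][0] for a, b in zip(path, path[1:]))
    weight = sum(graph[a][b][1] for a, b in zip(path, path[1:]))
    return cost + (1000 if weight > w_max else 0)

def ga_wcspp(graph, src, dst, w_max, pop=20, gens=30):
    random.seed(0)  # deterministic for this sketch
    population = [p for p in (random_path(graph, src, dst) for _ in range(pop)) if p]
    for _ in range(gens):
        population.sort(key=lambda p: fitness(graph, p, w_max))
        survivors = population[: pop // 2]          # elitist selection
        children = [p for p in (random_path(graph, src, dst) for _ in survivors) if p]
        population = survivors + children
    return min(population, key=lambda p: fitness(graph, p, w_max))

g = {  # neighbor -> (cost, weight)
    "s": {"a": (1, 4), "b": (2, 1)},
    "a": {"t": (1, 4)},
    "b": {"t": (2, 1)},
    "t": {},
}
print(ga_wcspp(g, "s", "t", w_max=3))  # ['s', 'b', 't']: the cheap path breaks the bound
```

The penalty steers the population away from the cheap-but-overweight path s-a-t; the paper's controlled operators aim to reach such feasible near-optima in fewer generations than this naive scheme.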
Abstract: In view of the current IT integration development of SOA, this paper examines AIS design based on SOA, including information source collection, accounting business process integration and real-time financial reporting. The main objective of this exploratory paper is to facilitate AIS research that incorporates Web Services, a combination often ignored in accounting and computer research. It provides a conceptual framework that clarifies the interdependency between SOA and AIS, and also presents the major SOA functions in different areas of AIS.
Abstract: One of the main concerns in the Information Technology field is the adoption of new technologies in organizations, which may increase the pace of usage of these technologies. This study looks at the role of culture in accepting and using new technologies in organizations. The study examines the effect of culture on the acceptance of, and the intention to use, new technology in organizations. Studies show that culture is one of the most important barriers to adopting new technologies. The model used for accepting and using new technology is the Technology Acceptance Model (TAM), while for culture and its dimensions the well-known theory by Hofstede was used. The results of the study show a significant effect of culture on the intention to use new technologies. All four dimensions of culture were tested to find the strength of their relationship with the behavioral intention to use new technologies. The findings indicate the important role of culture in the level of intention to use new technologies and the different role of each dimension in improving the adaptation process. The study suggests that efforts to transfer new technologies are most likely to be successful if the parties are culturally aligned.
Abstract: In the management of industrial waste, conversion from the use of paper invoices to electronic forms is currently under way in developed countries. Difficulties in such computerization include the lack of synchronization between the actual goods and the corresponding data managed by the server. Consequently, a system has been developed which attaches a QR code to the waste material. The code is read at each stage, from discharge until disposal, and progress at each stage can be easily reported. This system can be linked with the Japanese public digital authentication service for waste, taking advantage of its good points, and can be used to submit reports to the regulatory authorities. Its usefulness was confirmed by a verification test, and the system has been put into actual practice.
Abstract: In this paper, a delayed prototype model is studied. Regarding the delay as a bifurcation parameter, we prove that a sequence of Hopf bifurcations will occur at the positive equilibrium as the delay increases. Using the normal form method and center manifold theory, some explicit formulae are worked out for determining the stability and the direction of the bifurcated periodic solutions. Finally, computer simulations are carried out to illustrate the mathematical conclusions.
Abstract: This paper presents a new classification algorithm using colour and texture for obstacle detection. Colour information is computationally cheap to learn and process. However, in many cases colour alone does not provide enough information for classification. Texture information can improve classification performance but usually comes at a high computational cost. Our algorithm uses both colour and texture features, but texture is only needed when colour is unreliable. During the training stage, texture features are learned specifically to improve the performance of a colour classifier. The algorithm learns a set of simple texture features and only the most effective features are used in the classification stage. Therefore our algorithm has a very good classification rate while still being fast enough to run on a limited computer platform. The proposed algorithm was tested with a challenging outdoor image set. Test results show the algorithm achieves a much better trade-off between classification performance and efficiency than a typical colour classifier.
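The colour-first, texture-as-fallback idea can be sketched as a confidence-gated cascade; the stand-in models and the threshold below are assumptions for illustration, not the paper's learned classifiers:

```python
def classify(pixel, colour_model, texture_model, conf_threshold=0.8):
    """Use the cheap colour classifier; fall back to texture only when unsure."""
    label, confidence = colour_model(pixel)
    if confidence >= conf_threshold:
        return label
    return texture_model(pixel)  # expensive path, taken only for ambiguous pixels

# Toy stand-in models (hypothetical): colour is confident for green pixels only.
colour = lambda p: ("ground", 0.9) if p == "green" else ("unknown", 0.3)
texture = lambda p: "obstacle"
print(classify("green", colour, texture))  # ground  (texture never consulted)
print(classify("grey", colour, texture))   # obstacle (texture resolves ambiguity)
```

Because most pixels take the cheap branch, the average cost stays close to that of a pure colour classifier, which is the trade-off the abstract reports.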
Abstract: In this paper, we propose a backup and recovery technique for Peer-to-Peer applications, such as the distributed asynchronous Web-Based Training system that we have previously proposed. In order to improve the scalability and robustness of this system, all contents and functions are realized as mobile agents. These agents are distributed to computers and can be located using a Peer-to-Peer network based on a modified Content-Addressable Network. In the proposed system, the service as a whole does not become impossible even if some computers break down, but the problem arises that contents disappear when their agents disappear. As a solution to this issue, backups of agents are distributed to other computers. If the failure of a computer is detected, the other computers continue the service using the backups of the agents that belonged to the failed computer.
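The backup-and-restore idea can be sketched as follows; this toy model replicates each agent's content on a second peer and restores it on failure, omitting the modified Content-Addressable Network routing entirely (all names are hypothetical):

```python
class Network:
    """Toy model of mobile agents with replicated backups on peer computers."""

    def __init__(self):
        self.peers = {}  # peer name -> {agent_id: content}

    def place(self, agent_id, content, primary, backup):
        self.peers.setdefault(primary, {})[agent_id] = content
        self.peers.setdefault(backup, {})["bak:" + agent_id] = content

    def fail(self, peer):
        """Simulate a crash: restore each lost agent from a surviving backup."""
        lost = self.peers.pop(peer, {})
        for agent_id in lost:
            if agent_id.startswith("bak:"):
                continue  # backups of other peers' agents are simply gone
            for store in self.peers.values():
                if "bak:" + agent_id in store:
                    store[agent_id] = store["bak:" + agent_id]
                    break

    def lookup(self, agent_id):
        for store in self.peers.values():
            if agent_id in store:
                return store[agent_id]
        return None

net = Network()
net.place("lesson1", "WBT content", primary="pc1", backup="pc2")
net.fail("pc1")
print(net.lookup("lesson1"))  # WBT content
```

In the real system the peer holding the backup would be chosen through the Content-Addressable Network, so that failure detection and restoration need no central coordinator.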
Abstract: Probabilistic techniques in computer programs are becoming more and more widely used. Therefore, there is great interest in the formal specification, verification, and development of probabilistic programs. In our work-in-progress project, we are attempting to build a constructive framework for developing probabilistic programs formally. The main contribution of this paper is to introduce an intermediate artifact of our work, a Z-based formalism called PZ, by which one can build set-theoretical models of probabilistic programs. We propose to use a constructive set theory, called CZ set theory, to interpret the specifications written in PZ. Since CZ has an interpretation in Martin-Löf's theory of types, this idea enables us to derive probabilistic programs from correctness proofs of their PZ specifications.
Abstract: Users of computer systems may often require the private transfer of messages/communications between parties across a network. Information warfare and the protection and dominance of information in the military context is a prime example of an application area in which the confidentiality of data needs to be maintained. The safe transportation of critical data is therefore often a vital requirement for many private communications. However, unwanted interception/sniffing of communications is also a possibility. An elementary stealthy transfer scheme is therefore proposed by the authors. This scheme makes use of encoding, splitting of a message, and a hashing algorithm to verify the correctness of the reconstructed message. As a proof of concept, the authors have experimented with the random sending of encoded parts of a message and their subsequent reconstruction to demonstrate how data can be transferred stealthily across a network so as to prevent the obvious retrieval of data.
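The encode-split-hash scheme can be sketched as follows; the chunking strategy and the SHA-256 digest are assumptions for illustration, since the abstract does not name the specific hashing algorithm:

```python
import hashlib
import random

def split_message(message, parts=3, seed=None):
    """Encode, split into indexed chunks, and attach a digest for verification."""
    data = message.encode()
    digest = hashlib.sha256(data).hexdigest()
    size = -(-len(data) // parts)  # ceiling division
    chunks = [(i, data[i * size:(i + 1) * size]) for i in range(parts)]
    random.Random(seed).shuffle(chunks)  # send the pieces in random order
    return chunks, digest

def reassemble(chunks, digest):
    """Reorder by index, rejoin, and verify against the original digest."""
    data = b"".join(c for _, c in sorted(chunks))
    if hashlib.sha256(data).hexdigest() != digest:
        raise ValueError("reconstructed message failed hash verification")
    return data.decode()

chunks, digest = split_message("attack at dawn", seed=42)
print(reassemble(chunks, digest))  # attack at dawn
```

A sniffer who captures only some of the randomly ordered parts cannot obviously reconstruct the message, while the receiver's hash check detects any corrupted or missing chunk.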
Abstract: There are multiple reasons to expect that detecting word order errors in a text will be a difficult problem, and detection rates reported in the literature are in fact low. Although grammatical rules constructed by computational linguists improve the performance of grammar checkers in word order diagnosis, the repair task is still very difficult. This paper presents an approach for repairing word order errors in English text by reordering words in a sentence and choosing the version that maximizes the number of trigram hits according to a language model. The novelty of this method concerns the use of an efficient confusion-matrix technique for reordering the words. The comparative advantage of this method is that it works with a large set of words and avoids the laborious and costly process of collecting word order errors for creating error patterns.
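Choosing the reordering that maximizes trigram hits can be sketched as follows; this brute-force permutation search is only an illustration of the scoring criterion, whereas the paper's confusion-matrix technique avoids enumerating all orderings (the tiny language model is hypothetical):

```python
from itertools import permutations

def trigram_hits(words, model):
    """Count how many consecutive word triples appear in the language model."""
    return sum(tuple(words[i:i + 3]) in model for i in range(len(words) - 2))

def repair(sentence, model):
    """Pick the word order with the most trigram hits (feasible for short sentences)."""
    best = max(permutations(sentence.split()), key=lambda w: trigram_hits(w, model))
    return " ".join(best)

# Tiny hypothetical language model: a set of known trigrams.
lm = {("the", "red", "car"), ("red", "car", "stopped")}
print(repair("car the red stopped", lm))  # the red car stopped
```

Exhaustive search grows factorially with sentence length, which is precisely why an efficient reordering technique is needed for realistic input.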
Abstract: The amount of information being churned out by the field of biology has grown manifold and now requires the extensive use of computer techniques for its management. Protein sequence similarity, a predominant kind of information in the biological information sea, is key for detecting protein evolutionary relationships. Protein sequence similarity typically implies homology, which in turn may imply structural and functional similarities. In this work, we propose a learning method for detecting remote protein homology. The proposed method uses a transformation that converts a protein sequence into a fixed-dimensional representative feature vector. Each feature vector records the sensitivity of a protein sequence to a set of amino acid substrings generated from the protein sequences of interest. These features are then used in conjunction with support vector machines for the detection of remote protein homology. The proposed method is tested and evaluated on two different benchmark protein datasets and is able to deliver improvements over most of the existing homology detection methods.
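The fixed-dimensional representation can be sketched with occurrence counts; the paper records "sensitivity" to substrings, which this simplified count-based sketch only approximates (the substring vocabulary is hypothetical):

```python
def substring_features(sequence, substrings):
    """Fixed-dimensional feature vector: occurrence counts of each substring.
    Every sequence maps to a vector of the same length, regardless of its own
    length, so the vectors can be fed directly to a support vector machine."""
    return [sequence.count(s) for s in substrings]

# Hypothetical amino-acid substrings drawn from the training sequences.
vocab = ["AG", "GV", "LL"]
print(substring_features("AGVAGLL", vocab))  # [2, 1, 1]
```

The key property is that variable-length sequences become comparable points in one feature space, which is what makes a kernel classifier such as an SVM applicable.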
Abstract: The security of computer networks plays a strategic role in modern computer systems. Intrusion Detection Systems (IDS) act as the 'second line of defense' placed inside a protected network, looking for known or potential threats in network traffic and/or audit data recorded by hosts. We developed an Intrusion Detection System using a LAMSTAR neural network to learn patterns of normal and intrusive activities and to classify observed system activities, and compared the performance of the LAMSTAR IDS with other classification techniques using 5 classes of the KDDCup99 data. The LAMSTAR IDS gives better performance at the cost of high computational complexity and long training and testing times when compared to the other classification techniques (Binary Tree classifier, RBF classifier, Gaussian Mixture classifier). We further reduced the computational complexity of the LAMSTAR IDS by reducing the dimension of the data using principal component analysis, which in turn reduces the training and testing times with almost the same performance.
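The PCA preprocessing step mentioned above can be sketched as follows; this is a generic principal component projection on illustrative data, not the KDDCup99 pipeline:

```python
import numpy as np

def pca_reduce(X, k):
    """Project the data onto the top-k principal components before classification."""
    Xc = X - X.mean(axis=0)                      # center the features
    cov = np.cov(Xc, rowvar=False)               # feature covariance matrix
    vals, vecs = np.linalg.eigh(cov)             # eigendecomposition (ascending)
    top = vecs[:, np.argsort(vals)[::-1][:k]]    # k components of largest variance
    return Xc @ top

# Illustrative data: three nearly collinear features reduced to one component.
X = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.1, 6.0],
              [3.0, 6.0, 9.2],
              [4.0, 8.1, 12.0]])
Z = pca_reduce(X, k=1)
print(Z.shape)  # (4, 1)
```

Training the classifier on the k-dimensional projection instead of the full feature set is what cuts the training and testing times while, as the abstract reports, largely preserving accuracy.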