Abstract: Recent scientific investigations indicate that
multimodal biometrics overcome the technical limitations of
unimodal biometrics, making them ideally suited for everyday life
applications that require a reliable authentication system. However,
for a successful adoption of multimodal biometrics, such systems
would require large heterogeneous datasets with complex multimodal
fusion and privacy schemes spanning various distributed
environments. From experimental investigations of current
multimodal systems, this paper reports the various issues related to
speed, error-recovery and privacy that impede the diffusion of such
systems in real life. This calls for a robust mechanism that delivers
the desired real-time performance, robust fusion schemes,
interoperability and adaptable privacy policies.
The main objective of this paper is to present a framework that
addresses the aforementioned issues by leveraging the
heterogeneous resource sharing capacities of Grid services and the
efficient machine learning capabilities of artificial neural networks
(ANN). Hence, this paper proposes a Grid-based neural network
framework for adopting multimodal biometrics with the view of
overcoming the barriers of performance, privacy and risk issues that
are associated with shared heterogeneous multimodal data centres.
The framework combines the concept of Grid services for reliable
brokering and privacy policy management of shared biometric
resources along with a momentum back propagation ANN (MBPANN)
model of machine learning for efficient multimodal fusion and
authentication schemes. Real-life applications would be able to adopt
the proposed framework to cater to the varying business requirements
and user privacy needs for a successful diffusion of multimodal
biometrics in various day-to-day transactions.
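The momentum back-propagation weight update at the core of an MBPANN can be sketched as follows; the learning rate, momentum coefficient and toy values are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a momentum back-propagation (MBP) weight update.
# Learning rate and momentum coefficient are illustrative assumptions.

def mbp_update(weights, gradients, velocities, lr=0.1, momentum=0.9):
    """Apply the classic momentum rule to each weight:
    v <- momentum * v - lr * grad;  w <- w + v."""
    new_w, new_v = [], []
    for w, g, v in zip(weights, gradients, velocities):
        v = momentum * v - lr * g
        new_v.append(v)
        new_w.append(w + v)
    return new_w, new_v

# One update step on two toy weights, starting from zero velocity.
weights, velocities = mbp_update([0.5, -0.3], [0.2, -0.1], [0.0, 0.0])
```

The velocity term carries a fraction of the previous update forward, which is what accelerates convergence relative to plain gradient descent.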
Abstract: Cloud computing is an approach that provides computation and storage services on demand to clients over the network, independent of device and location. In the last few years, cloud computing has become a trend in information technology, with many companies transferring their business processes and applications to the cloud. Cloud computing with service-oriented architecture has contributed to the rapid development of Geographic Information Systems (GIS). The Open Geospatial Consortium (OGC) standards provide the interfaces through which hosted spatial data and GIS functionality are integrated into GIS applications. Furthermore, with their enormous processing power, clouds provide an efficient environment in which data-intensive applications can be performed efficiently, with higher precision and greater reliability. This paper presents our work on geospatial data services within the cloud computing environment and its technology. A cloud computing environment, with the strengths and weaknesses of the geographic information system, is introduced. The OGC standards that address our application's interoperability are highlighted. Finally, we outline our system architecture, with utilities for requesting and invoking our data-intensive applications as a web service.
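As one concrete example of the OGC interfaces mentioned above, a WMS GetMap request can be assembled with nothing but the standard library; the endpoint and layer name below are placeholders, not a real service.

```python
# Building an OGC WMS 1.1.1 GetMap request URL with the Python standard
# library. Endpoint and layer are placeholders for illustration.
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, size=(512, 512)):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",                      # WGS84 lat/lon
        "BBOX": ",".join(str(c) for c in bbox),  # minx,miny,maxx,maxy
        "WIDTH": str(size[0]),
        "HEIGHT": str(size[1]),
        "FORMAT": "image/png",
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url("http://example.org/wms", "roads",
                     (20.4, 41.0, 23.0, 42.4))
```

Because the interface is a plain HTTP query, any OGC-compliant client or server can interoperate regardless of where the service is hosted.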
Abstract: Inter-organizational Workflow (IOW) is commonly
used to support the collaboration between heterogeneous and
distributed business processes of different autonomous organizations
in order to achieve a common goal. E-government is considered as an
application field of IOW. The coordination of the different
organizations is the fundamental problem in IOW and remains the
major cause of failure in e-government projects. In this paper, we
introduce a new coordination model for IOW that improves the
collaboration between government administrations and that respects
IOW requirements applied to e-government. For this purpose, we
adopt a Multi-Agent approach, which deals more easily with the
inter-organizational characteristics of digital government: distribution,
heterogeneity and autonomy. Our model also integrates different
technologies to deal with semantic and technological
interoperability. Moreover, it preserves the existing systems of
government administrations by offering distributed coordination
based on interface communication. This is especially relevant in
developing countries, where administrations are not necessarily
equipped with workflow systems. The use of our coordination
techniques allows an easier and cheaper migration to an
e-government solution. To illustrate the applicability of the proposed
model, we present a case study of an identity card creation in Tunisia.
Abstract: This paper describes the Clinical Document Architecture Release Two (CDA R2) standard and a client application for messaging with the SAĞLIK-NET project developed by the Ministry of Health of Turkey. CDA R2 was developed by the Health Level 7 (HL7) organization and approved by the American National Standards Institute (ANSI) in 2004 to standardize medical information so that it can be shared semantically and syntactically. In this study, a client application compatible with HL7 V3 was developed for SAĞLIK-NET, a project that aims to build a National Health Information System for Turkey. Moreover, the CDA conformance of this application is evaluated.
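A deliberately incomplete sketch of a CDA R2 document skeleton, for orientation only; a real CDA document carries many more required elements (templateIds, code systems, a structuredBody), and only the typeId values below are taken from the standard.

```python
# Skeleton of an HL7 CDA R2 document built with the standard library.
# Real CDA instances require far more content; ids here are placeholders.
import xml.etree.ElementTree as ET

CDA_NS = "urn:hl7-org:v3"
ET.register_namespace("", CDA_NS)

doc = ET.Element("{%s}ClinicalDocument" % CDA_NS)
# typeId identifying the CDA R2 schema (values from the HL7 standard).
ET.SubElement(doc, "{%s}typeId" % CDA_NS,
              root="2.16.840.1.113883.1.3", extension="POCD_HD000040")
# Document instance identifier (placeholder OID and extension).
ET.SubElement(doc, "{%s}id" % CDA_NS, root="1.2.3.4", extension="doc-001")
title = ET.SubElement(doc, "{%s}title" % CDA_NS)
title.text = "Discharge Summary"

xml_bytes = ET.tostring(doc, encoding="utf-8")
```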
Abstract: The changing economic climate has made global
manufacturing a growing reality over the last decade, forcing
companies from east and west and all over the world to
collaborate beyond geographic boundaries in the design,
manufacture and assembly of products. The ISO10303 and
ISO14649 Standards (STEP and STEP-NC) have been
developed to introduce interoperability into manufacturing
enterprises so as to meet the challenge of responding to
production on demand. This paper describes and illustrates a
STEP compliant CAD/CAPP/CAM System for the manufacture
of rotational parts on CNC turning centers. The information
models to support the proposed system together with the data
models defined in the ISO14649 standard used to create the NC
programs are also described. A structured view of a STEP
compliant CAD/CAPP/CAM system framework supporting the
next generation of intelligent CNC controllers for turn/mill
component manufacture is provided. Finally, a proposed
computational environment for a STEP-NC compliant system
for turning operations (SCSTO) is described. SCSTO is the
experimental part of the research supported by the specification
of information models and constructed using a structured
methodology and object-oriented methods. SCSTO was
developed to generate a Part 21 file based on machining
features to support the interactive generation of process plans
utilizing feature extraction. A case study component has been
developed to prove the concept for using the milling and turning
parts of ISO14649 to provide a turn-mill CAD/CAPP/CAM
environment.
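The Part 21 files mentioned above follow the ISO 10303-21 clear-text encoding; the following illustrative writer produces only the outer file structure, with placeholder entities rather than real ISO 14649 machining features.

```python
# Illustrative writer for the outer structure of an ISO 10303-21
# ("Part 21") exchange file. Entity instances are placeholders; a real
# STEP-NC file would use the ISO 14649 schemas and feature entities.
from datetime import datetime, timezone

def part21_file(entities, schema="MACHINING_SCHEMA"):
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S")
    header = (
        "ISO-10303-21;\n"
        "HEADER;\n"
        "FILE_DESCRIPTION(('STEP-NC sketch'),'2;1');\n"
        "FILE_NAME('example.stp','%s',(''),(''),'','','');\n"
        "FILE_SCHEMA(('%s'));\n"
        "ENDSEC;\n" % (stamp, schema)
    )
    # Each entity gets a #n instance name, as Part 21 requires.
    data = "DATA;\n" + "\n".join(
        "#%d=%s;" % (i + 1, e) for i, e in enumerate(entities)
    ) + "\nENDSEC;\nEND-ISO-10303-21;\n"
    return header + data

text = part21_file(["PROJECT('turn-mill demo',$,$)"])
```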
Abstract: A method of Parallel Joint Channel Coding and
Cryptography is analyzed and simulated in this paper. The
method is an extension of Soft Input Decryption with feedback,
which is used to improve the channel decoding of secured
messages. Parallel Joint Channel Coding and Cryptography results
in an improved coding gain of channel decoding of more
than 2 dB. These results follow from the combination of
receiver components and their interoperability.
Abstract: This paper presents the design and prototype implementation of an intelligent data processing framework for ubiquitous sensor networks. The focus is on handling the sensor data stream as well as on interoperability between low-level sensor data and application clients. Our framework first provides systematic middleware that mediates between the application layer and low-level sensors, analyzing large volumes of sensor data by filtering and integrating them to create value-added context information. Then, an agent-based architecture is proposed for real-time data distribution, efficiently forwarding each event to the appropriate applications registered in the directory service via an open interface. The prototype implementation demonstrates that our framework can host sophisticated applications on a ubiquitous sensor network and can autonomously evolve into new middleware, taking advantage of promising technologies such as software agents, XML, cloud computing, and the like.
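The directory-service forwarding described above can be sketched as a minimal publish/subscribe registry; all class, event and field names are our own illustration, not the paper's actual interfaces.

```python
# Minimal sketch of a directory service: applications register interest
# in event types, and the middleware forwards matching sensor events.
# Names are illustrative, not the framework's real API.
from collections import defaultdict

class Directory:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def register(self, event_type, handler):
        """An application registers a handler for one event type."""
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        """Forward an event to every application registered for it."""
        for handler in self._subscribers[event_type]:
            handler(payload)

directory = Directory()
received = []
directory.register("temperature.high", received.append)
directory.publish("temperature.high", {"sensor": "s1", "value": 42.5})
```

Decoupling producers from consumers through the registry is what lets new applications be added without touching the sensor layer.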
Abstract: A key requirement for e-learning materials is
reusability and interoperability, that is, the possibility of using at
least part of the contents in different courses, and of delivering
them through different platforms. These features make it possible to
limit the cost of new packages, but require the development of
material according to proper specifications. SCORM (Sharable Content
Object Reference Model) is a set of guidelines suitable for this
purpose. A specific adaptation project has been started to make it
possible to reuse existing materials. The paper describes the main
characteristics of the SCORM specification, and the procedure used to
modify the existing material.
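At the heart of a SCORM package is the imsmanifest.xml file; the following stripped-down sketch omits the required namespaces and schema metadata of a real manifest, and all identifiers and file names are placeholders.

```python
# Stripped-down sketch of a SCORM package manifest (imsmanifest.xml).
# Real manifests need additional namespaces and metadata; identifiers
# here are placeholders.
import xml.etree.ElementTree as ET

manifest = ET.Element("manifest", identifier="course-001", version="1.1")
organizations = ET.SubElement(manifest, "organizations", default="org-1")
org = ET.SubElement(organizations, "organization", identifier="org-1")
# An item links the course structure to a deliverable resource.
item = ET.SubElement(org, "item", identifier="item-1",
                     identifierref="res-1")
ET.SubElement(item, "title").text = "Lesson 1"
resources = ET.SubElement(manifest, "resources")
res = ET.SubElement(resources, "resource", identifier="res-1",
                    type="webcontent", href="lesson1.html")
ET.SubElement(res, "file", href="lesson1.html")

xml_text = ET.tostring(manifest, encoding="unicode")
```

The item/resource indirection is what allows the same content file to be reused under several course structures.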
Abstract: Project-based pedagogy has proven to be an active
learning method, used to develop learners' skills and
knowledge. The use of technology in the learning world has filled
several gaps in the implementation of teaching methods and the
online evaluation of learners. However, the project methodology
presents challenges for the online assessment of learners.
Indeed, interoperability between E-learning platforms (LMS) is
one of the major challenges of project-based learning assessment.
Firstly, we have reviewed the characteristics of online assessment
in the context of project-based teaching. We addressed the
constraints encountered during the peer evaluation process.
Our approach is to propose a meta-model that describes a
language dedicated to the design of peer-assessment scenarios in
project-based learning. We then illustrate our proposal with an
instantiation of the meta-model through a business process in a
scenario of collaborative online assessment.
Abstract: Virtually all existing networked system management
tools use a Manager/Agent paradigm. That is, distributed agents are
deployed on managed devices to collect local information and report
it back to some management unit. Even those that use standard
protocols such as SNMP fall into this model. Using standard protocols
has the advantage of interoperability among devices from different
vendors. However, they may not be able to provide the customized
information needed to satisfy specific management needs.
In this dissertation work, different approaches are used to
collect information regarding the devices attached to a Local Area
Network. An SNMP-aware application is being developed that will
manage the discovery procedure and serve as a data collector.
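The discovery procedure underlying such a collector is essentially an SNMP GetNext walk; the sketch below runs it against an in-memory mock of an agent's MIB rather than a real device, which would be queried over UDP port 161.

```python
# Sketch of the GetNext walk behind SNMP-based discovery, against an
# in-memory mock of an agent's MIB. OIDs and values are illustrative.

MIB = {  # OID -> value
    "1.3.6.1.2.1.1.1.0": "Linux router",  # sysDescr
    "1.3.6.1.2.1.1.5.0": "gw-01",         # sysName
    "1.3.6.1.2.1.2.1.0": 3,               # ifNumber
}

def oid_key(oid):
    """Compare OIDs numerically, component by component."""
    return tuple(int(p) for p in oid.split("."))

def get_next(oid):
    """Return the first (oid, value) pair lexicographically after oid."""
    for candidate in sorted(MIB, key=oid_key):
        if oid_key(candidate) > oid_key(oid):
            return candidate, MIB[candidate]
    return None

def walk(base):
    """Repeat GetNext until we leave the base subtree (simplified
    prefix test; a real walk compares OID components)."""
    oid, results = base, []
    while True:
        nxt = get_next(oid)
        if nxt is None or not nxt[0].startswith(base):
            return results
        results.append(nxt)
        oid = nxt[0]

system_group = walk("1.3.6.1.2.1.1")  # the MIB-II "system" subtree
```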
Abstract: This paper describes a platform that addresses the main
research areas for e-learning educational content. Reusability concerns
the possibility of using content in different courses, reducing costs
and exploiting available data from repositories. In our approach the
production of educational material is based on templates to reuse
learning objects. In terms of interoperability, the main challenge lies
in reaching the audience through different platforms. E-learning
solutions must track the evolution of social consumption, where nowadays
lots of multimedia contents are accessed through the social networks.
Our work addresses this by implementing a platform for generation of
multimedia presentations focused on the new paradigm related to
social media. The system produces videos-courses on top of web
standard SMIL (Synchronized Multimedia Integration Language)
ready to be published and shared. Regarding interfaces, it is
essential to satisfy user needs and ease communication. To
this end, the platform deploys virtual teachers that provide natural
interfaces while multimodal features remove barriers to pupils with
disabilities.
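A minimal SMIL skeleton of the kind such a platform might publish, playing a video track and a slide in parallel; file names are placeholders.

```python
# Minimal SMIL presentation skeleton: a video and a slide image played
# in parallel. File names are placeholders for illustration.
import xml.etree.ElementTree as ET

SMIL_NS = "http://www.w3.org/ns/SMIL"
ET.register_namespace("", SMIL_NS)

smil = ET.Element("{%s}smil" % SMIL_NS)
body = ET.SubElement(smil, "{%s}body" % SMIL_NS)
# <par> plays its children at the same time; <seq> would play in order.
par = ET.SubElement(body, "{%s}par" % SMIL_NS)
ET.SubElement(par, "{%s}video" % SMIL_NS, src="teacher.mp4")
ET.SubElement(par, "{%s}img" % SMIL_NS, src="slide1.png", dur="30s")

document = ET.tostring(smil, encoding="unicode")
```

Because SMIL is a W3C web standard, the resulting video-course can be published and shared across any compliant player.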
Abstract: As communications systems and technology become more advanced and complex, it will be increasingly important to focus on users' individual needs. Personalization and effective user profile management will be necessary to ensure the uptake and success of new services and devices, and it is therefore important to focus on users' requirements in this area and define solutions that meet these requirements. The work on personalization and user profiles emerged from earlier ETSI work on a Universal Communications Identifier (UCI), which is a unique identifier of the user rather than a range of identifiers of the user's many communication devices or services (e.g. numbers of fixed phones at home/work, mobile phones, fax and email addresses). This paper describes work on personalization, including standardized information and preferences, and an architectural framework describing how personalization can be integrated in Next Generation Networks, together with the UCI concept.
Abstract: The main aim of this paper is to present the research
findings on the solution of centralized Web-Services for students by
adopting a framework and a prototype for Service Oriented
Architecture (SOA) Web-Services. The current situation of students'
Web-based application services has been identified, and an effective
SOA is proposed to increase the operational efficiency of Web-Services
for them; to do so, it was necessary to identify the challenges in
delivering SOA technology to increase the operational efficiency of
Web-Services. Moreover, SOA is an emerging concept, used for
delivering efficient student SOA Web-Services. Therefore, service
reusability from SOA Web-Services is provided, and services are
logically divided into smaller services to increase reusability and
modularity. In this case each service is a modular unit in itself and
interoperates with other services.
Abstract: Standards for learning objects focus primarily on
content presentation. They have already been extended to support automatic evaluation, but only for exercises with a predefined
set of answers. The existing standards lack the metadata required by specialized evaluators to handle types of exercises with an indefinite
set of solutions. To address this issue, existing learning object standards were extended to meet the particular requirements of a
specialized domain. A definition of programming problems as learning objects, compatible both with Learning Management Systems and with systems performing automatic evaluation of
programs, is presented in this paper. The proposed definition includes
metadata that cannot be conveniently represented using existing standards, such as: the type of automatic evaluation; the requirements
of the evaluation engine; and the roles of different assets - test cases, program solutions, etc. The EduJudge project and its main services
are also presented as a case study on the use of the proposed definition of programming problems as learning objects.
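To make this kind of metadata concrete, a hypothetical serialization might look as follows; the field names are our own illustration, not the proposed standard's schema.

```python
# Hypothetical metadata for a programming problem as a learning object:
# evaluation type, engine requirements, and per-asset roles. All field
# names are invented for illustration.
import json

problem = {
    "title": "Sum of two integers",
    "evaluation": {
        "type": "automatic",
        "engine": {"name": "judge", "min_version": "1.0"},
    },
    "assets": [
        {"file": "solution.py", "role": "program-solution"},
        {"file": "t01.in",      "role": "test-case-input"},
        {"file": "t01.out",     "role": "test-case-output"},
    ],
}

serialized = json.dumps(problem, indent=2)
```

The per-asset roles are exactly the information a generic learning-object standard cannot express, and what an evaluation engine needs to run the tests.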
Abstract: The state of the art in instructional design for
computer-assisted learning has been strongly influenced by advances
in information technology, Internet and Web-based systems. The
emphasis of educational systems has shifted from training to
learning. The course delivered has also been changed from large
inflexible content to sequential small chunks of learning objects. The
concepts of learning objects together with the advanced technologies
of Web and communications support the reusability, interoperability,
and accessibility design criteria currently exploited by most learning
systems. These concepts enable just-in-time learning. We propose to
extend these design criteria further to include the learnability
concept, which will help adapt content to the needs of learners. The
learnability concept offers a better personalization leading to the
creation and delivery of course content more appropriate to
performance and interest of each learner. In this paper we present a
new framework of learning environments containing knowledge
discovery as a tool to automatically learn patterns of learning
behavior from learners' profiles and history.
Abstract: Nowadays, HPC, Grid and Cloud systems are evolving
very rapidly. However, the development of infrastructure solutions
related to HPC is lagging behind. While the existing infrastructure is
sufficient for simple cases, many computational problems have more
complex requirements. Such computational experiments use different
resources simultaneously to start a large number of computational
jobs. These resources are heterogeneous: they have different
purposes, architectures, performance and software. Users need a
convenient tool that allows them to describe and run complex
computational experiments under the conditions of an HPC environment.
This paper introduces a modular workflow system called SEGL
which makes it possible to run complex computational experiments
under the conditions of a real HPC organization. The system can be
used in a great number of organizations that provide HPC power.
Key requirements for this system are high efficiency and
interoperability with the existing HPC infrastructure of the
organization, without any changes to it.
Abstract: The importance of our country's communication
system is noticeable when a disaster occurs. The communication
system in our country includes wired and wireless telephone
networks, radio, satellite systems and, increasingly, the internet. Even
though our communication system is extensive and dependable,
extreme conditions can put a strain on it. Interoperability between
heterogeneous wireless networks can be used to provide efficient
communication for emergency first response. IEEE 802.21 specifies
Media Independent Handover (MIH) services to enhance the mobile
user experience by optimizing handovers between heterogeneous
access networks. This paper presents an algorithm to improve
congestion control in MIH framework. It is analytically shown that
by including time factor in network selection we can optimize
congestion in the network.
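One way such a time factor can enter network selection is as a multiplier on estimated utilization; the score below and its weights are our assumptions, not the paper's algorithm.

```python
# Hedged sketch of time-weighted network selection in the spirit of an
# MIH-based handover decision. The score's exact form is an assumption.

def congestion(load, capacity, dwell_time, horizon=60.0):
    """Estimated utilization, inflated for users expected to stay
    longer on the target network (the time factor)."""
    return (load / capacity) * (1.0 + dwell_time / horizon)

def select_network(candidates, dwell_time):
    """Pick the candidate with the lowest time-weighted congestion."""
    return min(candidates,
               key=lambda n: congestion(n["load"], n["capacity"],
                                        dwell_time))

nets = [
    {"name": "wlan-1", "load": 40.0, "capacity": 54.0},
    {"name": "lte-1",  "load": 30.0, "capacity": 100.0},
]
best = select_network(nets, dwell_time=120.0)
```

Steering long-dwell users away from already-loaded access networks is one way a handover policy can keep congestion from building up.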
Abstract: The Tropical Data Hub (TDH) is a virtual research environment that provides researchers with an e-research infrastructure to congregate significant tropical data sets for data reuse, integration, searching, and correlation. However, researchers often require data and metadata synthesis across disciplines for cross-domain analyses and knowledge discovery. A triplestore offers a semantic layer to achieve a more intelligent method of search to support the synthesis requirements by automating latent linkages in the data and metadata. Presently, the benchmarks to aid the decision of which triplestore is best suited for use in an application environment like the TDH are limited to performance. This paper describes a new evaluation tool developed to analyze both features and performance. The tool comprises a weighted decision matrix to evaluate the interoperability, functionality, performance, and support availability of a range of integrated and native triplestores, ranking them according to the requirements of the TDH.
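A weighted decision matrix of the kind the tool employs can be sketched as follows; the criterion weights and scores are invented for illustration.

```python
# Weighted decision matrix for ranking triplestores. Criteria, weights
# and scores are illustrative, not the paper's actual evaluation data.

def rank(candidates, weights):
    """Sort candidates by weighted score, best first."""
    def score(c):
        return sum(weights[k] * c["scores"][k] for k in weights)
    return sorted(candidates, key=score, reverse=True)

weights = {"interoperability": 0.3, "functionality": 0.25,
           "performance": 0.3, "support": 0.15}
stores = [
    {"name": "store-A",
     "scores": {"interoperability": 4, "functionality": 5,
                "performance": 3, "support": 4}},
    {"name": "store-B",
     "scores": {"interoperability": 5, "functionality": 3,
                "performance": 5, "support": 3}},
]
ranked = rank(stores, weights)
```

Separating weights from raw scores lets the same feature data be re-ranked whenever an application environment changes its priorities.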
Abstract: This article explains societal security, continuity scenarios, and a methodological cycling approach. Organizational challenges in societal security call for the implementation of the international standard BS 25999-2 and the global ISO 22300 family of standards for business continuity management systems. An efficient global organizational system is distinguished by high complexity, connectivity and interoperability, and does not, in fact, have only cooperative relations. Competing businesses have numerous participating 'enemies', which play apparent or hidden opponent and antagonistic roles against a prosperous organizational system, leading to a crisis scene or even a battle theatre. Organizational business continuity scenarios are necessary for preparedness, planning, management and mastery of such 'a play' in real environments.
Abstract: Disaster and emergency management are topics highly debated among experts. Fast communication helps in dealing with emergencies. The problem lies in network connectivity and data exchange. The paper suggests a solution: a new flexible communication platform, with its possibilities and perspectives, for the protection of communication systems for crisis management. This platform is used for everyday communication and for communication in crisis situations as well.