Abstract: The EEG signal is one of the oldest measures of brain
activity and has been used extensively for clinical diagnosis and
biomedical research. However, EEG signals are heavily
contaminated with various artifacts, originating both from the subject
and from equipment interference. Among these various kinds of
artifacts, ocular noise is the most significant. Since many applications
such as BCI require online and real-time processing of the EEG signal,
it is desirable that artifact removal also be performed online.
Recently, several methods for online ocular artifact removal have
been proposed. One of these methods is ARMAX modeling of the EEG
signal. This method assumes that the recorded EEG signal is a
combination of EOG artifacts and the background EEG; the
background EEG is then estimated via estimation of the ARMAX parameters.
Another recently proposed method is based on adaptive filtering.
This method uses the EOG signal as a reference input and subtracts
EOG artifacts from the recorded EEG signal. In this paper we
investigate the efficiency of each method for removing EOG
artifacts and compare the two methods. We conclude from this
comparison that the adaptive filtering method yields better results
than ARMAX modeling.
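As an illustration of the adaptive-filtering approach, the following is a minimal LMS sketch; the tap count, step size and the instantaneous contamination model are assumptions for illustration, not the settings used in the paper.

```python
import numpy as np

def lms_cancel(eeg, eog, n_taps=8, mu=0.01):
    """Remove the EOG-correlated component from a recorded EEG channel
    with an LMS adaptive filter using the EOG as reference input.

    eeg : recorded (contaminated) EEG samples
    eog : simultaneously recorded EOG reference samples
    Returns the cleaned EEG estimate (the background EEG)."""
    w = np.zeros(n_taps)                    # adaptive filter weights
    clean = np.zeros(len(eeg))
    for n in range(len(eeg)):
        # most recent n_taps reference samples, newest first (zero-padded early on)
        x = eog[max(0, n - n_taps + 1):n + 1][::-1]
        x = np.pad(x, (0, n_taps - len(x)))
        y = w @ x                           # estimated ocular artifact
        e = eeg[n] - y                      # error = cleaned EEG sample
        w += 2 * mu * e * x                 # LMS weight update
        clean[n] = e
    return clean
```

After convergence the filter output approximates the EOG contribution, so the error signal approximates the background EEG.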
Abstract: ebXML (Electronic Business using eXtensible
Markup Language) is an e-business standard, sponsored by
UN/CEFACT and OASIS, which enables enterprises to exchange
business messages, conduct trading relationships, communicate
data in common terms and define and register business
processes. While there is tremendous e-business value in
ebXML, security remains an unsolved problem and one of the
largest barriers to adoption. Recently emerging XML security
technologies offer the extensibility and flexibility suitable for
implementing security features such as encryption, digital
signatures, access control and authentication.
In this paper, we propose ebXML business transaction models
that allow trading partners to securely exchange XML based
business transactions by employing XML security technologies.
We show how each XML security technology meets the ebXML
standard by constructing test software and validating the messages
exchanged between trading partners.
Abstract: In this article, an approximate analytical technique called the Homotopy Perturbation Method (HPM) is employed to study the steady flow of a Walter's B' fluid in a vertical channel with a porous wall. The HPM is used to solve the nonlinear ordinary differential equation obtained by applying a similarity transformation to the continuity and momentum equations governing this kind of flow. The results obtained from the HPM are then compared with those from the Runge–Kutta method in order to verify the accuracy of the proposed method. The comparison shows that the HPM achieves good results in predicting the solution of such problems. Finally, we use this solution to obtain the velocity terms and to discuss their physical behavior.
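As a generic sketch of the HPM construction (standard notation; the paper's specific Walter's B' momentum equation is not reproduced here): write the nonlinear problem as $A(u) = f(r)$ with $A = L + N$, where $L$ is linear and $u_0$ is an initial guess, and embed it in a homotopy in $p$:

```latex
H(v,p) = (1-p)\bigl[L(v) - L(u_0)\bigr] + p\bigl[A(v) - f(r)\bigr] = 0,
\quad p \in [0,1], \qquad
v = v_0 + p\,v_1 + p^2 v_2 + \cdots, \qquad
u = \lim_{p \to 1} v = v_0 + v_1 + v_2 + \cdots
```

Matching powers of $p$ yields a sequence of linear problems for the $v_i$, whose sum at $p = 1$ approximates the solution.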
Abstract: Many matching algorithms with different characteristics have been introduced in recent years. For real-time systems these algorithms are usually based on minutiae features. In this paper we introduce a novel approach to feature extraction in which the extracted features are invariant to shift and rotation of the fingerprint, while at the same time the matching operation is performed much more easily, with higher speed and accuracy. In this new approach, a reference point and a reference orientation are first determined for each fingerprint, and the features are then converted into polar coordinates based on this information. Owing to the high speed and accuracy of this approach, the small volume of extracted features and the easy execution of the matching operation, the approach is well suited to real-time applications.
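The conversion step can be sketched as follows; how the reference point and orientation are detected is the paper's contribution and is not reproduced here, so they are simply taken as inputs.

```python
import math

def to_polar(minutiae, ref_point, ref_angle):
    """Convert minutiae (x, y, theta) into shift- and rotation-invariant
    polar features (r, phi, theta_rel) about a reference point and
    reference orientation. Illustrative sketch only."""
    rx, ry = ref_point
    out = []
    for x, y, theta in minutiae:
        dx, dy = x - rx, y - ry
        r = math.hypot(dx, dy)                          # radial distance
        phi = (math.atan2(dy, dx) - ref_angle) % (2 * math.pi)
        theta_rel = (theta - ref_angle) % (2 * math.pi)  # relative minutia angle
        out.append((r, phi, theta_rel))
    return out
```

Because every coordinate is expressed relative to the reference frame, translating or rotating the whole fingerprint (reference included) leaves the features unchanged.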
Abstract: Among collaboration tools, asynchronous tools, and discussion forums in particular, are the most widely used thanks to their flexibility in terms of time. To convey only the messages that belong to a theme of interest to the tutor, and thus to support the tutoring work, a tool for classifying these messages is indispensable. For this purpose we have proposed a semantic classification tool for discussion forum messages based on LSA (Latent Semantic Analysis), which includes a thesaurus to organize the vocabulary. The benefits offered by a formal ontology can overcome the shortcomings a thesaurus exhibits in use, which encourages us to employ one in our semantic classifier. In this work we propose the use of some of the constructs that an OWL ontology provides. We then explain how constructs such as "ObjectProperty", "SubClassOf" and "Datatype" properties make our classification more intelligent by integrating new terms. The new terms are generated from the initial terms introduced by the tutor and the semantic relations described by the OWL formalism.
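The LSA core of such a classifier can be sketched as below; the term-document matrix is assumed to have already been built from the (thesaurus- or ontology-expanded) vocabulary, and the rank k is an illustrative choice.

```python
import numpy as np

def lsa_similarity(term_doc, query_vec, k=2):
    """Rank documents (forum messages) against a query in a k-dimensional
    latent semantic space. term_doc is terms x documents; query_vec is a
    term-weight vector. Minimal LSA sketch, not the paper's full classifier."""
    U, s, Vt = np.linalg.svd(term_doc, full_matrices=False)
    Uk, sk, Vtk = U[:, :k], s[:k], Vt[:k]
    docs_k = (np.diag(sk) @ Vtk).T          # documents in latent space
    q_k = query_vec @ Uk                    # fold the query into latent space
    return docs_k @ q_k / (
        np.linalg.norm(docs_k, axis=1) * np.linalg.norm(q_k) + 1e-12)
```

Messages whose cosine similarity to a theme vector exceeds a threshold would then be routed to the tutor.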
Abstract: Transportation authorities need to provide the services
and facilities that are critical to every country's well-being and
development. Management of the road network is becoming
increasingly challenging as demands increase and resources are
limited. Public sector institutions are integrating performance
information into budgeting, managing and reporting via
implementing performance measurement systems. In the face of
growing challenges, performance measurement of road networks is
attracting growing interest in many countries. The large scale of
public investments makes the maintenance and development of road
networks an area where such systems are an important assessment
tool. Transportation agencies have been using performance
measurement and modeling as part of pavement and bridge
management systems. Recently the focus has been on extending the
process to applications in road construction and maintenance
systems, operations and safety programs, and administrative
structures and procedures. To avoid failure and dysfunctional
consequences, this paper presents the importance of obtaining
objective data and of implementing an evaluation instrument where
necessary.
Abstract: The paper deals with the results of the project "Interoperability Workplaces to Support Teaching of Security Management in a Computer Network". This project focuses on the perspectives and possibilities of "new approaches" to the education, training and crisis communication of rescue teams in the Czech Republic: common technologies reflecting these new perspectives are used to educate selected members of crisis management. The main part concentrates on the possibilities of applying new technologies and computer-aided tools to the education and training of Integrated Rescue System teams. The project uses the COST principle for the creation of specialized centers and for all communication between these workplaces.
Abstract: Direct application of sewage sludge is a relatively
cheap method of its disposal. In the past, increases in heavy metal
content were observed in soils treated with sewage sludge. In 2003,
Act No. 188/2003 on the application of sewage sludge to soils was
adopted. The basic philosophy of the act is the environmentally safe
application of sludge to soils. Soil samples from the areas chosen for
sludge application at the wastewater treatment plants (WTP) Poprad
(35 samples) and WTP Michalovce (33 samples) were analyzed.
According to the results, only 14 areas for Poprad and 25 areas for
Michalovce are suitable for sludge application under Act No.
188/2003. The application dose of sludge was calculated as 50 t.ha-1
or 75 t.ha-1 once in 5 years, to ensure that heavy metal contents in
the treated soils remain within limits.
Abstract: This work concerns the measurement of the S-parameters
of a Bulk Acoustic Wave (BAW) emission filter and their comparison
with the simulated prototypes. Using HP-ADS, a co-simulation
of the filters' characteristics in a digital radio-communication chain is
performed. Four modulation schemes are studied in order to
illustrate the impact of the spectral occupation of the modulated
signal. Results of the simulations and co-simulation are given in terms of
Error Vector Measurements, to be useful for a general sensitivity
analysis of 3rd/4th Generation emitters (wideband QAM and
OFDM signals).
Abstract: Citizens are increasingly provided with choice and
customization in public services and this has now also become a key
feature of higher education in terms of policy roll-outs on personal
development planning (PDP) and more generally as part of the
employability agenda. The goal here is to transform people, in this
case graduates, into active, responsible citizen-workers. A key part of
this rhetoric and logic is the inculcation of graduate attributes within
students. However, there has also been a concern with the issue of
student lack of engagement and perseverance with their studies. This
paper sets out to explore some of these conceptions that link graduate
attributes with citizenship as well as the notion of how identity is
forged through the higher education process. Examples are drawn
from a quality enhancement project that is being operated within the
context of the Scottish higher education system. This is further
framed within the wider context of competing and conflicting
demands on higher education, exacerbated by the current worldwide
economic climate. There are now pressures on students to develop
their employability skills as well as their capacity to engage with
global issues such as behavioural change in the light of
environmental concerns. It is argued that these pressures, in effect,
lead to a form of personalization that is concerned with how
graduates develop their sense of identity as something that is
engineered and re-engineered to meet these demands.
Abstract: Online discussions are an important component of
both blended and online courses. This paper examines the varieties of
online discussions and the perils, pitfalls and possibilities of this
rather new technological tool for enhanced learning. The discussion
begins with possible perils and pitfalls inherent in this educational
tool and moves to a consideration of the advantages of the varieties
of online discussions feasible for use in teacher education programs.
Abstract: This paper proposes rough set models with three
different levels of knowledge granules in an incomplete information
system, under a tolerance relation defined by the similarity between
objects according to their attribute values. By introducing a
dominance relation on the universe of discourse, each similarity class
is decomposed into three subclasses: a slightly better subclass, a
slightly worse subclass and a vague subclass, which in turn
decomposes the lower and upper approximations into three
components. Using these components, information can be retrieved
effectively to find naturally hierarchical expansions to queries and to
construct answers to elaborative queries. We illustrate the approach
by applying the rough set models to the design of an information
retrieval system that accesses documents expanded at different
granularities. The proposed method enhances the flexibility of rough
set models for query expansion and elaborative queries in
information retrieval.
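For orientation, the classical lower and upper approximations that the paper decomposes can be sketched as follows; the three-component decomposition under a dominance relation is the paper's contribution and is not reproduced here.

```python
def approximations(universe, classes, target):
    """Classical rough-set lower/upper approximations of a concept.
    classes : dict mapping each object to its similarity (tolerance) class
    target  : the set of objects forming the concept to approximate
    Simplified sketch of the standard definitions only."""
    lower = {x for x in universe if classes[x] <= target}   # class fully inside
    upper = {x for x in universe if classes[x] & target}    # class overlaps
    return lower, upper
```

Objects in the lower approximation certainly belong to the concept; those in the upper approximation possibly do, and the boundary between them is what the three subclasses refine.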
Abstract: A low-carbon economy implies energy conservation and emission reduction. How to measure and evaluate a regional low-carbon economy is an important problem that should be solved promptly. This paper proposes an eco-efficiency ratio, based on ecological efficiency, to evaluate the current state of the low-carbon economy in Jiangsu province and to analyze the efficiency of the low-carbon economy in Jiangsu and other provinces, comparing their advantages and disadvantages. The paper then puts forward suggestions for the government on formulating correct low-carbon development policy and on improving technological innovation capacity and the efficiency of resource allocation.
Abstract: In this paper, we propose an approach for the classification of fingerprint databases. It is based on the fact that a fingerprint image is composed of regular texture regions that can be successfully represented by co-occurrence matrices. We first extract features based on certain characteristics of the co-occurrence matrix and then use these features to train a neural network that classifies fingerprints into four common classes. The obtained results, compared with existing approaches, demonstrate the superior performance of the proposed approach.
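The co-occurrence step can be sketched as below; the quantization level, displacement and the particular texture features (contrast, energy, homogeneity) are common choices used here for illustration, not necessarily the paper's exact feature set.

```python
import numpy as np

def glcm_features(img, levels=8, dx=1, dy=0):
    """Build a gray-level co-occurrence matrix for one displacement
    (dx, dy) and derive a few classical texture features. Illustrative
    sketch of the representation the paper builds on."""
    q = (img * levels / (img.max() + 1)).astype(int)   # quantize gray levels
    glcm = np.zeros((levels, levels))
    h, w = q.shape
    for i in range(h - dy):
        for j in range(w - dx):
            glcm[q[i, j], q[i + dy, j + dx]] += 1      # count level pairs
    p = glcm / glcm.sum()                              # normalize to probabilities
    i_idx, j_idx = np.indices(p.shape)
    contrast = ((i_idx - j_idx) ** 2 * p).sum()
    energy = (p ** 2).sum()
    homogeneity = (p / (1 + np.abs(i_idx - j_idx))).sum()
    return np.array([contrast, energy, homogeneity])
```

Feature vectors computed over several displacements would then form the input to the neural network classifier.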
Abstract: This contribution outlines some topics around the introduction of compulsory electronic exchange of documents (so-called e-Boxes) in public administration. The research was conducted to gauge the difference between the expectations of those using internal e-mail and their experience in reality. Both qualitative and quantitative research is employed, leading also to an estimate of the willingness and readiness of government bodies, business units and citizens to adopt new technologies. At the same time, the most potent barriers to successful e-communication through the e-Boxes are identified.
Abstract: This paper discusses the development of a wireless
control structure for scalar-controlled induction motor drives,
realised over WiFi networks. The control strategy relies on
wireless ad hoc networks and a virtual network interface based on
VNC, which makes it possible to take remote control of a PC
connected to a wireless Ethernet network. The proposed control
strategy is verified by realistic experimental tests on scalar-controlled
induction motor drives, and the experimental results of the
implementation are presented and analysed in detail.
Abstract: Commodity delivery networks are an important design problem in daily life and in many transportation applications. Delivery performance is evaluated by the system reliability of delivering commodities from a source node to a sink node in the network, and the system reliability is maximized to find the optimal routing. The design problem is not simple, however, because (1) each path segment has randomly distributed attributes; (2) there are multiple commodities that consume various path capacities; (3) the optimal routing must complete the delivery process within the allowable time constraints. In this paper, we focus on the design optimization of the Multi-State Flow Network (MSFN) for multiple commodities. We propose an efficient approach to evaluate the system reliability in the MSFN with respect to randomly distributed path attributes and to find the optimal routing subject to the allowable time constraints. The delivery rates, also known as delivery currents, of the path segments are evaluated, and the minimal-current arcs are eliminated to reduce the complexity of the MSFN. Accordingly, the correct optimal routing is found and the worst-case reliability is evaluated; it is shown that the reliability of the optimal routing is at least as high as the worst-case measure. Two benchmark examples are used to demonstrate the proposed method, and comparisons between the original and the reduced networks show that the proposed method is very efficient.
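To make the reliability notion concrete, here is a Monte Carlo sketch of the success probability of one fixed route under randomly distributed arc capacities and times with a delivery deadline; the paper's exact MSFN evaluation and minimal-current arc reduction are not reproduced, and the discrete arc distributions are illustrative assumptions.

```python
import random

def route_reliability(route, demand, deadline, n_trials=20000, seed=0):
    """Estimate the probability that a fixed route delivers `demand` units
    within `deadline`. Each arc in `route` is a pair
    (capacity_choices, time_choices), each a list of (value, prob) tuples.
    Illustrative sketch only, not the paper's analytical method."""
    rng = random.Random(seed)

    def draw(choices):
        r, acc = rng.random(), 0.0
        for value, prob in choices:
            acc += prob
            if r <= acc:
                return value
        return choices[-1][0]

    ok = 0
    for _ in range(n_trials):
        caps = [draw(c) for c, _ in route]
        times = [draw(t) for _, t in route]
        # success: the bottleneck capacity meets the demand
        # and the total traversal time meets the deadline
        if min(caps) >= demand and sum(times) <= deadline:
            ok += 1
    return ok / n_trials
```

Maximizing this quantity over candidate routes corresponds to the optimal-routing objective described above.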
Abstract: Many studies have applied the Theory of Planned
Behavior (TPB) in predicting health behaviors among unique
populations. However, a new paradigm is emerging where focus is
now directed to modification and expansion of the TPB model rather
than utilization of the traditional theory. This review proposes new
models modified from the Theory of Planned Behavior and suggests
an appropriate study design that can be used to test the models within
physical activity and dietary practice domains among Type 2
diabetics in Kenya. The review was conducted by means of a
literature search in the fields of nutrition behavior, health psychology
and mixed methods, using predetermined key words. The results
identify pre-intention and post-intention gaps within the TPB model
that need to be filled. Additional psychosocial factors are proposed
for inclusion in the TPB model to generate new models, and the
efficacy of these models is to be tested using a mixed-methods design.
Abstract: Locality Sensitive Hashing (LSH) is one of the most
promising techniques for solving the nearest neighbour search
problem in high-dimensional spaces. Euclidean LSH is the most
popular variant of LSH and has been successfully applied in many
multimedia applications. However, Euclidean LSH presents
limitations that affect its structure and query performance. Its main
limitation is large memory consumption: to achieve good accuracy, a
large number of hash tables is required. In this paper, we propose a
new hashing algorithm that overcomes the storage space problem and
improves query time, while keeping accuracy similar to that achieved
by the original Euclidean LSH. Experimental results on a real
large-scale dataset show that the proposed approach achieves good
performance and consumes less memory than Euclidean LSH.
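For reference, the standard Euclidean (p-stable) LSH scheme that the paper improves upon can be sketched as follows; the bucket width and hash count are illustrative parameters, and this is the baseline scheme, not the authors' new algorithm.

```python
import numpy as np

def make_e2lsh(dim, n_hashes, w=4.0, seed=0):
    """Build an E2LSH-style hash: each function computes
    h(v) = floor((a . v + b) / w) with Gaussian a (2-stable) and uniform b,
    and the n_hashes values are concatenated into one bucket key."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n_hashes, dim))   # random Gaussian projections
    b = rng.uniform(0, w, n_hashes)            # random offsets in [0, w)
    def hash_vec(v):
        return tuple(np.floor((A @ v + b) / w).astype(int))
    return hash_vec
```

Nearby points fall into the same bucket with high probability while distant points rarely do; accuracy grows with the number of such tables, which is exactly the memory cost the paper targets.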