Abstract: Despite the strong and consistent increase in the use of
electronic payment methods worldwide, the diffusion of electronic
wallets is still far from widespread. Analysis of the failure of
electronic wallet uptake has either focused on technical issues or
chosen to analyse a specific scheme. This article proposes a joint
approach to analysing key factors affecting the adoption of e-wallets
by using the ‘Technology Acceptance Model” [1] which we have
expanded to take into account the cost of using e-wallets. We use this
model to analyse Monéo, the only French electronic wallet still in
operation.
Abstract: The goals of the present research are to assess Six Sigma implementation in Latvian commercial banks and to identify the perceived benefits of its implementation. To achieve these goals, the authors used a sequential explanatory method. To obtain empirical data, the authors developed a questionnaire and adapted it for the employees of Latvian commercial banks. The questions relate to Six Sigma implementation and its perceived benefits. The questionnaire mainly consists of closed questions evaluated on a 5-point Likert scale. The empirical data obtained have shown that, of the two hypotheses put forward in the present research, Hypothesis 1 has to be rejected, while Hypothesis 2 has been partially confirmed. The authors also faced some research limitations related to the fact that the questionnaire participants belong to different ranks of the organizational hierarchy.
Abstract: This paper presents the analysis of duct design using
static and dynamic approaches. The static approach is used to
assess the compatibility of the design with the applied material.
The material used in this paper is Thermoplastic Olefins (TPO). The
dynamic approach focuses solely on the CFD simulations. The
fatigue life of the design with the applied material is also covered.
Abstract: Evaluation of educational portals is an important
subject area that needs more attention from researchers. A university
that has an educational portal which is difficult for teachers,
students, or management staff to use and interact with can see its
standing and reputation diminished. Therefore, it is important to be
able to evaluate the quality of the e-services the university
provides in order to improve them over time.
The present study evaluates the usability of the Information
Technology Faculty portal at University of Benghazi. Two evaluation
methods were used: a questionnaire-based method and an online
automated tool-based method. The first method was used to measure
the portal's external attributes of usability (Information, Content and
Organization of the portal, Navigation, Links and Accessibility,
Aesthetic and Visual Appeal, Performance and Effectiveness and
educational purpose) from users' perspectives, while the second
method was used to measure the portal's internal attributes of
usability (number and size of HTML files, number and size of images,
load time, HTML check errors, browser compatibility problems,
number of bad and broken links), which cannot be perceived by the
users. The study showed that some usability aspects were found to
be at an acceptable level of performance and quality, while others
were not. Overall, it was concluded that the usability of the IT
Faculty educational portal is generally acceptable.
Recommendations and suggestions to address the weaknesses and
improve the quality of the portal's usability are presented in this
study.
Abstract: This paper is concerned with the role strategic
management plays in higher education and the methods it entails.
Using the University of West Bohemia and the Czech Republic as
examples, the paper describes the methods used in furthering
strategic objectives within institutions and their different parts
(faculties, institutes). The nature of the demands faced by the
university dictates the need for a strategic framework which defines
the basic objectives and parameters of tertiary education and research
in a local, regional and national context. Sharing strategies with a
wider range of actors (universities, cities, regions, the practical
sphere) is key to laying the foundations for more efficient
cooperation.
Abstract: A new numerical scheme based on the H1-Galerkin mixed finite element method for a class of second-order pseudohyperbolic equations is constructed. The proposed procedure can be split into three independent differential sub-schemes and does not require solving a coupled system of equations. Optimal error estimates are derived for both semidiscrete and fully discrete schemes for problems in one space dimension. Moreover, the proposed method does not require the LBB consistency condition. Finally, some numerical results are provided to illustrate the efficacy of our method.
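The abstract does not write out the equation class; a representative second-order pseudohyperbolic (Sobolev-type) equation in one space dimension, given here only as an assumed example, reads:

```latex
u_{tt} - u_{xxt} - u_{xx} = f(x,t), \qquad (x,t) \in (0,1) \times (0,T],
```

with suitable initial and boundary conditions; an H1-Galerkin mixed method then typically treats $u$ and a flux variable such as $q = u_x$ in separate sub-schemes, which is what allows the splitting described above.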
Abstract: The paper presents a study of the synthetic transmit
aperture method applying Golay coded transmission for medical
ultrasound imaging. A longer coded excitation allows the total
energy of the transmitted signal to be increased without increasing
the peak pressure. The signal-to-noise ratio and penetration depth
are improved while maintaining high ultrasound image resolution.
In this work, a 128-element linear transducer array with 0.3 mm
inter-element spacing was used, excited by a one-cycle burst and by
8- and 16-bit Golay coded sequences at a nominal frequency of
4 MHz. A single-element transmit aperture was used to generate a
spherical wave covering the full image region, and all the elements received the
echo signals. The comparison of 2D ultrasound images of the wire
phantom as well as of the tissue mimicking phantom is presented to
demonstrate the benefits of the coded transmission. The results were
obtained using the synthetic aperture algorithm with transmit and
receive signals correction based on a single element directivity
function.
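The benefit of Golay coded transmission rests on the complementary-pair property: the aperiodic autocorrelations of the two codes sum to a single peak with zero sidelobes. A minimal pure-Python sketch of this property (the recursive construction and the length-8 pair are our illustrative choices, not the paper's implementation):

```python
def golay_pair(length):
    # Standard recursive Golay construction: (a, b) -> (a|b, a|-b),
    # doubling the length each step; `length` must be a power of two.
    a, b = [1], [1]
    while len(a) < length:
        a, b = a + b, a + [-x for x in b]
    return a, b

def autocorr(x):
    # One-sided aperiodic autocorrelation at lags 0..len(x)-1.
    return [sum(x[i] * x[i + k] for i in range(len(x) - k))
            for k in range(len(x))]

a, b = golay_pair(8)
combined = [ra + rb for ra, rb in zip(autocorr(a), autocorr(b))]
print(combined)  # [16, 0, 0, 0, 0, 0, 0, 0]: a single peak of 2*N, no sidelobes
```

In imaging terms, transmitting both codes and summing the matched-filter outputs yields a sidelobe-free compressed pulse, which is what permits the SNR gain without raising peak pressure.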
Abstract: In this paper we present a modification to an existing threshold model for shot cut detection, which is able to adapt itself to the sequence statistics and to operate in real time, because it uses only previously evaluated frames for its calculation. The efficiency of the proposed modified adaptive threshold scheme was verified through extensive experiments with several similarity metrics, and the achieved results were compared to those of the original model. According to the results, the proposed threshold scheme reaches higher accuracy than the original model.
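A hedged sketch of the general idea, not the paper's exact model: an adaptive threshold can be computed from only previously evaluated frame differences, flagging a cut when the current difference exceeds a statistic of the recent past. The window size and multiplier below are illustrative assumptions.

```python
from collections import deque

def shot_cut_detector(frame_diffs, window=20, k=3.0):
    # Flag a cut when the current inter-frame difference exceeds
    # mean + k * std of a sliding window of PAST differences only,
    # so the threshold adapts in real time to the sequence statistics.
    # `window` and `k` are illustrative values, not the paper's.
    history = deque(maxlen=window)
    cuts = []
    for i, d in enumerate(frame_diffs):
        if len(history) == window:
            mean = sum(history) / window
            std = (sum((x - mean) ** 2 for x in history) / window) ** 0.5
            if d > mean + k * std:
                cuts.append(i)
        history.append(d)
    return cuts

diffs = [1.0] * 30 + [9.0] + [1.0] * 30  # one abrupt shot change at frame 30
print(shot_cut_detector(diffs))  # → [30]
```

Because only past frames enter the window, the detector never needs to look ahead, which is what makes real-time operation possible.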
Abstract: The setting up of rural telecentres, popularly referred
to as Common Service Centres (CSCs), is considered one of the
initial forerunners of rural e-Governance initiatives under the
Government of India's National e-Governance Plan (NeGP). CSCs are
implemented on a public-private partnership (PPP) basis, where State
governments play a major role in facilitating the establishment of
CSCs and investments are made by private companies referred to as
Service Centre Agencies (SCAs). CSC implementation is expected to
help in improving public service delivery in a transparent and
efficient manner. However, very little research has been undertaken
to study the actual impact of CSC implementation at the grassroots
level. This paper addresses this gap by identifying the circumstances,
concerns and expectations from the point-of-view of citizens and
examining the finer aspects of social processes in the context of rural
e-Governance.
Abstract: A computational platform is presented in this
contribution. It has been designed as a virtual laboratory to be used
for exploring optimization algorithms in biological problems. This
platform is built on a blackboard-based agent architecture. As a test
case, the version of the platform presented here is devoted to the
study of protein folding, initially with a bead-like description of the
chain and with the widely used model of hydrophobic and polar
residues (HP model). Some details of the platform design are
presented along with its capabilities, and some explorations of the
protein folding problem with different types of discrete space are
reviewed. The platform's capability to incorporate specific tools for
the structural analysis of the runs, in order to understand and
improve the optimization process, is also shown.
Accordingly, the results obtained demonstrate that assembling the
computational tools into a single platform is worthwhile in itself,
since experiments developed on it can be designed to fulfill different
levels of information in a self-consistent fashion. We are currently
exploring how an experiment design can be used to create a
computational agent to be included within the platform. The inclusion
of such designed agents, or software pieces, helps the platform
better accomplish its tasks. As the number of agents increases, the
virtual laboratory gains robustness and functionality.
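As a hedged illustration of the energy function such HP-model studies optimize (our sketch, not the platform's code): on a 2D square lattice, each pair of hydrophobic residues that are lattice-adjacent but not consecutive along the chain contributes -1.

```python
def hp_energy(sequence, coords):
    # 2D lattice HP model energy: each pair of hydrophobic ('H') residues
    # that are lattice-adjacent but not consecutive along the chain
    # contributes -1; polar ('P') residues contribute nothing.
    energy = 0
    for i in range(len(sequence)):
        for j in range(i + 2, len(sequence)):  # skip chain neighbours
            if sequence[i] == 'H' and sequence[j] == 'H':
                dx = abs(coords[i][0] - coords[j][0])
                dy = abs(coords[i][1] - coords[j][1])
                if dx + dy == 1:  # unit Manhattan distance = adjacent
                    energy -= 1
    return energy

# A 4-residue chain folded into a unit square: the two H termini touch.
print(hp_energy("HPPH", [(0, 0), (1, 0), (1, 1), (0, 1)]))  # → -1
```

An optimizer on such a platform searches over self-avoiding lattice conformations for the one minimizing this energy.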
Abstract: National Biodiversity Database System (NBIDS) has
been developed for collecting Thai biodiversity data. The goal of this
project is to provide advanced tools for querying, analyzing,
modeling, and visualizing patterns of species distribution for
researchers and scientists. NBIDS records two types of datasets:
biodiversity data and environmental data. Biodiversity data comprise
species presence data and species status. The attributes of biodiversity
data can be further classified into two groups: universal and
project-specific attributes. Universal attributes are attributes that are common
to all of the records, e.g. X/Y coordinates, year, and collector name.
Project-specific attributes are attributes that are unique to one or a
few projects, e.g., flowering stage. Environmental data include
atmospheric data, hydrology data, soil data, and land cover data
collected using GLOBE protocols. We have developed web-based
tools for data entry. Google Earth KML and ArcGIS were used
as tools for map visualization. webMathematica was used for simple
data visualization and also for advanced data analysis and
visualization, e.g., spatial interpolation, and statistical analysis.
NBIDS will be used by park rangers at Khao Nan National Park and by
researchers.
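The abstract mentions spatial interpolation among the analysis tools but does not name the method; as an assumed, illustrative example, inverse-distance weighting is one of the simplest schemes for scattered environmental point data:

```python
def idw(points, values, query, power=2.0):
    # Inverse-distance-weighted interpolation of scattered point data;
    # `power=2` is a common default, chosen here purely for illustration.
    num, den = 0.0, 0.0
    for (x, y), v in zip(points, values):
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return v  # query coincides with a sample point
        w = 1.0 / d2 ** (power / 2.0)
        num += w * v
        den += w
    return num / den

# Midway between two samples, equal weights average the values.
print(idw([(0.0, 0.0), (2.0, 0.0)], [10.0, 30.0], (1.0, 0.0)))  # → 20.0
```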
Abstract: Many applications of speech communication and speaker
identification suffer from the problem of co-channel speech. This
paper deals with a multi-resolution dyadic wavelet transform method
for detecting usable segments of co-channel speech that could be
processed by a speaker identification system. The method is evaluated
on the TIMIT database with reference to the Target-to-Interferer
Ratio measure. Co-channel speech is constructed by mixing speakers of
all possible gender combinations. The results show little difference
across the different mixtures. Over all mixtures, 95.76% of usable
speech is correctly detected, with a false alarm rate of 29.65%.
Abstract: The article deals with the classification of alternative water resources in terms of potential risks, which is a prerequisite for incorporating these water resources into emergency plans. The classification is based on the quantification of risks resulting from possible damage, disruption or total destruction of a water resource caused by natural and anthropogenic hazards, on the assessment of water quality and availability, on the traffic accessibility of the assessed resource and finally on its water yield. The aim is to support the development of an integrated rescue system capable of supplying the population with drinking water throughout the stricken territory during states of emergency.
Abstract: The System Identification problem looks for a
suitably parameterized model, representing a given process. The
parameters of the model are adjusted to optimize a performance
function based on error between the given process output and
identified process output. The linear system identification field is
well established with many classical approaches whereas most of
those methods cannot be applied to nonlinear systems. The problem
becomes tougher if the system is completely unknown and only the
output time series is available. It has been reported that the
capability of Artificial Neural Networks to approximate all linear
and nonlinear input-output maps makes them predominantly suitable
for the identification of nonlinear systems where only the output
time series is available [1], [2], [4], [5]. The work reported here
is an attempt to implement a few of the well-known algorithms in the
context of modeling nonlinear systems, and to make a performance
comparison to establish their relative merits and demerits.
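As a hedged sketch of the general approach (not any specific algorithm from the paper): a small neural network can be trained to predict the next output sample from past outputs alone. The generating system, architecture, and learning rate below are all illustrative assumptions.

```python
import math
import random

# "Unknown" nonlinear system: used ONLY to generate the observed output
# series; the identifier sees the series, never the equations.
random.seed(0)
y = [0.1, 0.2]
for t in range(2, 300):
    y.append(0.5 * y[-1] - 0.3 * y[-2]
             + 0.2 * math.tanh(y[-1]) + 0.05 * math.sin(t))

# One-hidden-layer network predicting y[t] from (y[t-1], y[t-2]).
hidden, lr = 6, 0.02
w1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(hidden)]
b1 = [0.0] * hidden
w2 = [random.uniform(-0.5, 0.5) for _ in range(hidden)]
b2 = 0.0

def forward(x):
    h = [math.tanh(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j])
         for j in range(hidden)]
    return h, sum(w2[j] * h[j] for j in range(hidden)) + b2

def total_error():
    return sum((forward((y[t - 1], y[t - 2]))[1] - y[t]) ** 2
               for t in range(2, len(y)))

before = total_error()
for _ in range(50):  # online gradient descent over the output series
    for t in range(2, len(y)):
        x = (y[t - 1], y[t - 2])
        h, pred = forward(x)
        err = pred - y[t]
        for j in range(hidden):
            grad_h = err * w2[j] * (1 - h[j] ** 2)  # backprop through tanh
            w2[j] -= lr * err * h[j]
            w1[j][0] -= lr * grad_h * x[0]
            w1[j][1] -= lr * grad_h * x[1]
            b1[j] -= lr * grad_h
        b2 -= lr * err
after = total_error()
print(before, after)  # the squared-error sum should drop after training
```

The performance function mentioned in the abstract corresponds to `total_error` here: the squared difference between the given process output and the identified model's output.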
Abstract: Six Sigma is a well known discipline that reduces
variation using complex statistical tools and the DMAIC model. By
integrating Goldratt's Theory of Constraints, the Five Focusing
Steps and Systems Thinking tools, Six Sigma projects can be selected
where they can have the most impact on the company. This research
defines an integrated model of Six Sigma and constraint management
that provides a step-by-step guide using the original methodologies
of each discipline. The model is evaluated in a case study from the
production line of an automobile V8 engine monoblock, resulting in
an increase in line capacity from 18.7 pieces per hour to 22.4
pieces per hour, a 60% reduction in Work-In-Process and a variation
decrease of 0.73%.
Abstract: A personal estimate of a health risk may not
correspond to a scientific assessment of the health risk. Hence, there
is a need to investigate perceived health risks in the public. In this
study, a young, educated and healthy group of people from a tertiary
institute were questioned about their health concerns. Ethics
clearance was obtained and data was collected by means of a
questionnaire. 362 students participated in the study. Tobacco use,
heavy alcohol drinking, illicit drugs, unsafe sex and potential
carcinogens were perceived to be the five greatest threats to health in
this cohort. On the other hand, natural health products,
unemployment, unmet contraceptive needs, family violence and
homelessness were perceived as the smallest health risks.
Nutrition-related health risks as well as health risks due to physical
inactivity and obesity were not perceived as major health threats.
Such a study of health perceptions may guide health promotion
campaigns.
Abstract: Negation is useful in the majority of real-world applications. However, its introduction leads to semantic and canonical problems. We propose in this paper an approach based on stratification to deal with negation problems. This approach is based on an extension of predicate nets. It is characterized by two main contributions. The first concerns the management of the whole class of stratified programs. The second contribution relates to the optimization of usual operations on stratified programs (maximal stratification, incremental updates, etc.).
Abstract: A novel file splitting technique for the reduction of the nth-order entropy of text files is proposed. The technique is based on mapping the original text file into a non-ASCII binary file using a new codeword assignment method; the resulting binary file is then split into several subfiles, each containing one or more bits from each codeword of the mapped binary file. The statistical properties of the subfiles are studied, and it is found that they reflect the statistical properties of the original text file, which is not the case when the ASCII code is used as the mapper. The nth-order entropies of these subfiles are determined, and it is found that their sum is less than the entropy of the original text file for the same extension values. These interesting statistical properties of the resulting subfiles can be used to achieve better compression ratios when conventional compression techniques are applied to the subfiles individually, on a bit-wise rather than character-wise basis.
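As a hedged illustration of the quantity the abstract refers to (our sketch, not the paper's procedure): the per-symbol nth-order entropy of a sequence can be estimated from its n-gram frequencies.

```python
import math
from collections import Counter

def nth_order_entropy(data, n):
    # Per-symbol nth-order entropy estimated from overlapping n-grams:
    # H_n = -(1/n) * sum over distinct n-grams of p * log2(p).
    grams = [data[i:i + n] for i in range(len(data) - n + 1)]
    total = len(grams)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(grams).values()) / n

text = "abababababababab"
print(nth_order_entropy(text, 1))  # 1.0: 'a' and 'b' are equiprobable
print(nth_order_entropy(text, 2))  # < 1.0: digram structure is regular
```

The drop from first- to second-order entropy on structured data is the effect the splitting technique exploits: subfiles with stronger statistical regularity compress better.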
Abstract: Pattern matching based on regular tree grammars has been widely used in many areas of computer science. In this paper, we propose a pattern matcher within the framework of code generation, based on a generic, formalized approach. According to this approach, parsers for regular tree grammars are adapted to a general pattern matching solution, rather than adapting the pattern matching to their parsing behavior. Hence, we first formalize the construction of the pattern matches corresponding to input trees drawn from a regular tree grammar in the form of so-called match trees. Then, we adopt a recently developed generic parser and tightly couple its parsing behavior with this construction. In addition to its generality, the resulting pattern matcher is characterized by its soundness and efficient implementation. This is demonstrated by the proposed theory and by the algorithms derived for its implementation. A comparison with similar and well-known approaches, such as those based on tree automata and LR parsers, has shown that our pattern matcher can be applied to a broader class of grammars and achieves a better approximation of pattern matches in one pass. Furthermore, its use as a machine code selector incurs minimal overhead, due to the balanced distribution of cost computations into static ones, during parser generation, and dynamic ones, during parsing.
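To illustrate what pattern matches on trees mean here, a naive recursive matcher with a wildcard is sketched below; this toy stand-in is our own simplification, not the grammar-driven one-pass matcher the paper develops.

```python
# Naive recursive tree pattern matcher. Trees and patterns are
# (label, [children]) tuples; the wildcard '_' matches any subtree.
def matches(pattern, tree):
    if pattern == '_':
        return True
    plabel, pkids = pattern
    tlabel, tkids = tree
    return (plabel == tlabel and len(pkids) == len(tkids)
            and all(matches(p, t) for p, t in zip(pkids, tkids)))

# An expression tree such as a code selector might consume.
expr = ('add', [('mul', [('x', []), ('y', [])]), ('const', [])])
print(matches(('add', [('mul', ['_', '_']), '_']), expr))  # → True
print(matches(('add', [('const', []), '_']), expr))        # → False
```

A grammar-driven matcher improves on this naive scheme by finding all matches over a whole input tree in a single parsing pass rather than re-traversing subtrees per pattern.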
Abstract: When earthquakes strike a city, they cause great loss of life. The present paper describes a new, innovative design system (MegEifel) for buildings with a mechanism to mitigate deaths in case an earthquake strikes the city. If buildings are designed according to the MegEifel design, their occupants will be safe even when they are asleep or going about their daily activities at the time the earthquake strikes. The core structure is suggested to be designed on the principle that the deeper the foundations are, the harder it is to uproot the structure. The buildings will have an Eifel rod dug deep into the earth, which will help save lives in tall buildings when an earthquake strikes. This design leverages protective shells to save lives.