Abstract: Perceptions of quality from both the designers' and the
users' perspectives have now stretched beyond traditional usability,
incorporating abstract and subjective concepts. This has led to a shift
in the focus of human-computer interaction research communities: a
shift toward achieving user experience (UX) by fulfilling not only
conventional usability needs but also those that go beyond them. The
term UX, although widespread and given significant importance,
lacks a unified, consensus definition. In this paper, we survey
various UX definitions and modeling frameworks and examine them
as the foundation for proposing a UX evolution lifecycle framework
for understanding UX in detail. In the proposed framework we identify
the building blocks of UX and discuss how UX evolves in various
phases. The framework can be used as a tool to understand experience
requirements and evaluate them, resulting in better UX design and
hence improved user satisfaction.
Abstract: Covering-based rough set theory is an extension of rough
set theory based on a covering rather than a partition of the
universe. It is therefore more powerful than classical rough sets in
describing some practical problems. However, in extending rough sets,
covering-based rough set models can increase the roughness with
which objects are recognized. How to obtain better approximations
from covering-based rough set models is thus an important issue.
In this paper, two concepts, determinate elements and indeterminate
elements of a universe, are proposed and given precise definitions.
This research makes a reasonable refinement of the covering
elements from a new viewpoint, and the refinement may
generate better approximations in covering-based rough set models.
To verify the theory, it is applied to eight major covering-based
rough set models adapted from the literature. In all of these models
the lower approximation increases effectively; correspondingly, in
all models the upper approximation decreases, with the exception of
two models in some special situations. Therefore, the roughness of
recognizing objects is reduced. This research provides a new
approach to the study and application of covering-based rough sets.
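The basic pair of covering-based approximation operators discussed in this literature can be sketched in a few lines. This is a generic illustration of one common model, not the paper's refinement procedure; the universe, covering, and target set are made up for the example:

```python
U = {1, 2, 3, 4, 5}                   # universe
C = [{1, 2}, {2, 3, 4}, {4, 5}, {5}]  # a covering of U (blocks may overlap)
X = {1, 2, 4}                         # target set to approximate

# Lower approximation: union of covering blocks entirely inside X.
lower = set().union(*(K for K in C if K <= X))
# Upper approximation: union of covering blocks that intersect X.
upper = set().union(*(K for K in C if K & X))

print(lower, upper)
```

A refinement of the covering elements, as the abstract proposes, aims to enlarge `lower` and shrink `upper`, narrowing the boundary region between them.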
Abstract: Little research has examined working memory
capacity (WMC) in signed language interpreters and deaf signers.
This paper presents the findings of a study that investigated WMC in
professional Australian Sign Language (Auslan)/English interpreters
and deaf signers. Thirty-one professional Auslan/English interpreters
(14 hearing native signers and 17 hearing non-native signers)
completed an English listening span task and then an Auslan working
memory span task, which tested their English WMC and their Auslan
WMC, respectively. Moreover, 26 deaf signers (6 deaf native signers
and 20 deaf non-native signers) completed the Auslan working
memory span task. The results revealed a non-significant difference
between the hearing native signers and the hearing non-native signers
in their English WMC, and a non-significant difference between the
hearing native signers and the hearing non-native signers in their
Auslan WMC. Moreover, the results yielded a non-significant
difference between the hearing native signers' English WMC and
their Auslan WMC, and a non-significant difference between the
hearing non-native signers' English WMC and their Auslan WMC.
Furthermore, a non-significant difference was found between the deaf
native signers and the deaf non-native signers in their Auslan WMC.
Abstract: Microarray data profile gene expression on a whole-genome
scale and therefore provide a good way to study associations
between gene expression and the occurrence or progression of cancer.
More and more researchers have realized that microarray data are
helpful for predicting cancer samples. However, the dimensionality
of the gene expression data is much larger than the sample size,
which makes this task very difficult. Identifying the significant
genes causing cancer has therefore become an urgent, as well as a
hot and hard, research topic. Many feature selection algorithms
proposed in the past focus on improving cancer predictive accuracy
at the expense of ignoring the correlations between features. In this
work, a novel framework (named SGS) is presented for stable gene
selection and efficient cancer prediction. The proposed framework
first applies a clustering algorithm to find gene groups in which the
genes are highly correlated, then selects the significant genes in
each group with the Bayesian Lasso and the important gene groups
with the group Lasso, and finally builds a prediction model on the
shrunken gene space with an efficient classification algorithm (such
as SVM, 1NN, or regression). Experimental results on real-world data
show that the proposed framework often outperforms existing feature
selection and prediction methods such as SAM, IG, and Lasso-type
prediction models.
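A toy sketch of the three-stage pipeline the abstract describes: correlation-based grouping, per-group selection, and classification on the shrunken space. The greedy threshold grouping and the variance-based pick below are simplified stand-ins for the paper's clustering and Bayesian/group Lasso steps, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy expression matrix: 20 samples x 50 genes, binary class labels.
X = rng.normal(size=(20, 50))
y = np.array([0] * 10 + [1] * 10)
X[y == 1, :5] += 2.0                  # make the first 5 genes informative

# Step 1: group highly correlated genes (greedy threshold grouping,
# a stand-in for the paper's clustering step).
corr = np.corrcoef(X.T)
groups, assigned = [], set()
for g in range(X.shape[1]):
    if g in assigned:
        continue
    members = [h for h in range(X.shape[1])
               if h not in assigned and abs(corr[g, h]) > 0.6]
    groups.append(members)
    assigned.update(members)

# Step 2: keep one representative gene per group (highest variance,
# a stand-in for the Bayesian Lasso / group Lasso selection).
selected = [max(ms, key=lambda h: X[:, h].var()) for ms in groups]

# Step 3: 1NN prediction on the shrunken gene space (leave-one-out).
Z = X[:, selected]
correct = 0
for i in range(len(y)):
    d = np.linalg.norm(Z - Z[i], axis=1)
    d[i] = np.inf                     # exclude the held-out sample itself
    correct += y[d.argmin()] == y[i]
print(f"LOO 1NN accuracy: {correct / len(y):.2f}")
```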
Abstract: Traditionally, VLSI implementations of spiking
neural nets have featured either large neuron counts for fixed
computations or small, exploratory, configurable nets. This paper presents the
system architecture of a large configurable neural net system
employing a dedicated mapping algorithm for projecting the targeted
biology-analog nets and dynamics onto the hardware with its
attendant constraints.
Abstract: The paper presents an investigation into the effect of neural network predictive control of a UPFC on the transient stability performance of a multimachine power system. The proposed controller consists of a neural network model of the test system. This model is used to predict the future control inputs using the damped Gauss-Newton method, which employs backtracking as the line-search method for step selection. The benchmark two-area, four-machine system that mimics the behavior of large power systems is taken as the test system for the study and is subjected to three-phase short-circuit faults at different locations over a wide range of operating conditions. The simulation results clearly establish the robustness of the proposed controller to the fault location, an increase in the critical clearing time for the circuit breakers, and improved damping of the power oscillations as compared to the conventional PI controller.
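The damped Gauss-Newton step with backtracking line search named in the abstract can be illustrated on a generic least-squares problem. The exponential model and data below are hypothetical and unrelated to the paper's neural predictive controller:

```python
import numpy as np

# Toy problem: fit y = a * exp(b * t) by least squares.
def residuals(p, t, y):
    return y - p[0] * np.exp(p[1] * t)

def jacobian(p, t):                    # derivatives of the residuals
    return np.column_stack([-np.exp(p[1] * t),
                            -p[0] * t * np.exp(p[1] * t)])

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 30)
y = 2.0 * np.exp(-1.5 * t) + 0.01 * rng.normal(size=t.size)

p = np.array([1.0, 0.0])               # deliberately poor initial guess
for _ in range(50):
    r = residuals(p, t, y)
    J = jacobian(p, t)
    step = np.linalg.solve(J.T @ J, -J.T @ r)   # Gauss-Newton direction
    alpha = 1.0
    while np.sum(residuals(p + alpha * step, t, y) ** 2) > np.sum(r ** 2):
        alpha *= 0.5                   # backtracking: damp the step
        if alpha < 1e-8:
            break
    p = p + alpha * step
print(p)                               # should approach [2.0, -1.5]
```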
Abstract: Names are important in many societies, even technologically oriented ones that use, for example, ID systems to identify individual people. Names such as surnames are the most important, as they are used in many processes, such as identifying people and genealogical research. On the other hand, name variation can be a major problem for the identification of and search for people, e.g. in web search or for security reasons. Name matching presumes a priori that a recorded name written in one alphabet reflects the phonetic identity of two samples, or some transcription error in copying a previously recorded name; we add to this the assumption that the two names refer to the same person. This paper describes name variations and gives a basic description of various name matching algorithms developed to overcome name variation and to find reasonable variants of names, which can be used to reduce mismatches in record linkage and name search. The implementation contains algorithms for computing a range of fuzzy matches based on different types of algorithms, e.g. composite and hybrid methods, and allows us to test and measure the algorithms for accuracy. NYSIIS, LIG2 and Phonex have been shown to perform well and provide sufficient flexibility to be included in the linkage/matching process for optimising name searching.
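As one concrete instance of the phonetic family these algorithms belong to, here is a minimal encoder for classic American Soundex (a simpler relative of the NYSIIS and Phonex variants named above, included only as an illustration of phonetic matching):

```python
def soundex(name: str) -> str:
    """Classic American Soundex: kept first letter + three digit codes."""
    codes = {c: d for d, group in enumerate(
        ("BFPV", "CGJKQSXZ", "DT", "L", "MN", "R"), start=1) for c in group}
    name = name.upper()
    out = []
    prev = codes.get(name[0], 0)       # first letter is kept verbatim
    for c in name[1:]:
        if c in "HW":                  # H and W do not break a run of codes
            continue
        d = codes.get(c, 0)
        if d and d != prev:            # vowels (code 0) break runs
            out.append(str(d))
        prev = d
    return (name[0] + "".join(out) + "000")[:4]

print(soundex("Robert"), soundex("Ashcraft"))  # R163 A261
```

Two spellings match when their codes are equal, e.g. `soundex("Rupert") == soundex("Robert")`, which is exactly the kind of variant-tolerant comparison used in record linkage.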
Abstract: The curriculum of the primary school science course was redesigned on the basis of constructivism in the 2005-2006 academic year in Turkey. In this context, the name of the course was changed to “Science and Technology”, and the content, course books, and student workbooks for the course were redesigned in light of constructivism. The aim of this study is to determine whether the Science and Technology course book and student workbook for the 5th grade of primary school are appropriate for constructivism by evaluating them in terms of its fundamental principles. Among qualitative research methods, the documentation technique (i.e. document analysis) is applied; in selecting samples, criterion sampling is used from among the purposeful sampling techniques. When the Science and Technology course book and workbook for the 5th grade in primary education are examined, it is seen that the two books complement each other in certain areas. Consequently, it can be claimed that, in spite of some inadequate and missing points, an attempt has been made to design the course book and workbook of the primary school Science and Technology course for 5th grade students in terms of the principles of constructivism. To overcome the inadequacies in the books, it can be suggested that they be redesigned. In addition, so as not to ignore the technology dimension of the course, activities that encourage students to prepare projects using the technology cycle should be included.
Abstract: Decision support based upon risk analysis of a
comparison of electricity generation from different renewable
energy technologies can provide information about their effects on
the environment and society. The aim of this paper is to develop an
assessment framework covering the risks to health and the
environment, and the benefits to society, of electric power
generation from different renewable sources. A multicriteria
framework combining the multiattribute risk analysis technique and
the decision analysis interview technique is applied in order to
support the decision-making process for implementing renewable
energy projects in a Bangkok case study. Having analyzed the local
conditions and appropriate technologies, five renewable power plants
are postulated as options. As this work demonstrates, the analysis
can provide a tool to aid decision-makers in achieving targets
related to promoting a sustainable energy system.
Abstract: Web-based cooperative learning focuses on (1) the interaction and collaboration of community members, and (2) the sharing and distribution of knowledge and expertise through network technology to enhance learning performance. Numerous studies of web-based cooperative learning have demonstrated that cooperative scripts have a positive impact on specifying, sequencing, and assigning cooperative learning activities. In addition, the literature indicates that role-play in web-based cooperative learning environments enables two or more students to work together toward the completion of a common goal. Since students generally do not know each other and lack the face-to-face contact necessary for negotiating group roles in web-based cooperative learning environments, this paper extends the application of the genetic algorithm (GA) and proposes a GA-based algorithm to tackle the problem of role assignment in web-based cooperative learning environments, one that not only saves communication costs but also reduces conflict between group members in negotiating role assignments.
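A role-assignment GA of the kind proposed can be sketched as follows. The preference matrix, population size, and operators are illustrative assumptions, not the paper's algorithm: the chromosome is a permutation assigning one role per student, and fitness rewards matching students to roles they are suited for (a proxy for low conflict):

```python
import random

random.seed(1)
# Toy setting: 5 students, 5 group roles; pref[i][r] is a hypothetical
# suitability score of student i for role r. Higher total is better.
pref = [[random.randint(0, 9) for _ in range(5)] for _ in range(5)]

def fitness(perm):                 # perm[i] = role assigned to student i
    return sum(pref[i][r] for i, r in enumerate(perm))

def mutate(perm):                  # swap the roles of two students
    a, b = random.sample(range(len(perm)), 2)
    child = perm[:]
    child[a], child[b] = child[b], child[a]
    return child

pop = [random.sample(range(5), 5) for _ in range(20)]
for _ in range(100):               # generations: elitist select + mutate
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(10)]

best = max(pop, key=fitness)
print(best, fitness(best))
```

Swap mutation keeps every chromosome a valid one-role-per-student permutation, which is why it is used here instead of a generic crossover.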
Abstract: Ireland developed a National Strategy 2030 that
argued for the creation of a new form of higher education institution,
a Technological University. The research reported here reviews the
first stage of this partnership development. The study found that
national policy can create system capacity and change, but that
individual partners may have more to gain or lose in collaborating.
When collaboration is presented as a zero-sum activity, fear among partners is high.
The level of knowledge and networking within the higher education
system possessed by each partner contributed to decisions to
participate or not in a joint proposal for collaboration. Greater
success resulted when there were gains for all partners. This research
concludes that policy mandates can provide the motivation to
collaborate, but that the partnership needs to be built on shared
values rather than coercion by mandate.
Abstract: In order to increase chickpea quality and
agroecosystem sustainability, field experiments were carried out in
the 2007 and 2008 growing seasons. In this research, the effects of
different organic, chemical, and biological fertilizers on the grain
yield and quality of chickpea were investigated. Experimental
units were arranged in split-split plots based on randomized complete
blocks with three replications. The highest yield and yield
components were obtained with the G1×N5 interaction. The significant
increase in N, P, K, Fe, and Mg content in leaves and grains
underlined the superiority of this treatment, because each of these
nutrients has an established role in chlorophyll synthesis and the
photosynthetic ability of the crop. The combined application of
compost, farmyard manure, and chemical phosphorus (N5) gave the
best grain quality, owing to high protein, starch, and total sugar
contents, low crude fiber, and reduced cooking time.
Abstract: The main goal of this article is to present a new model of
the application architecture of a banking IT solution providing
Internet banking services that is partially outsourced. First, we
present the business rationale and a SWOT analysis to explain the
reasons for the model. The most important factor for our model is
today's boom in smart phones and tablet devices. Next, we focus on
the IT architecture viewpoint, where we design the application,
integration, and security models. Finally, we propose a generic
governance model that serves as a basis for a specialized governance
model. The specialized instance of the governance model is designed
to ensure that the development and maintenance of the different
parts of the IT solution are well governed over time.
Abstract: Laser beam forming is a novel technique developed for the joining of metallic components. In this study, an overview of the laser beam forming process, areas of application, the basic mechanisms of the laser beam forming process, some recent research
studies and the need to focus more research effort on improving the
laser-material interaction of laser beam forming of titanium and its
alloys are presented.
Abstract: Systems Analysis and Design is a key subject in
Information Technology courses, but students do not find it easy to
cope with, since it is not “precise” like programming and not exact
like Mathematics. It is a subject that works with many concepts,
modeling ideas into visual representations and then translating the
pictures into a real-life system. To complicate matters, users who
are not necessarily familiar with computers need to give their input
to ensure that they get the system they need. Systems Analysis and
Design also covers two fields, namely Analysis, focusing on the
analysis of the existing system and Design, focusing on the design of
the new system. To be able to test the analysis and design of a
system, it is necessary to develop a system or at least a prototype of
the system to test the validity of the analysis and design. The
skills necessary in each aspect differ vastly: project management
skills, database knowledge, and object-oriented principles are all
necessary. In the context of a developing country where students
enter tertiary education underprepared and the digital divide is alive
and well, students need to be motivated to learn the necessary skills,
get an opportunity to test them in a “live” but protected environment –
within the framework of a university. The purpose of this article is to
improve the learning experience in Systems Analysis and Design
through reviewing the underlying teaching principles used, the
teaching tools implemented, the observations made and the
reflections that will influence future developments in Systems
Analysis and Design. Action research principles allow the focus to
be placed on a few problematic aspects during a particular semester.
Abstract: A separation-kernel-based operating system (OS) has been designed for use in secure embedded systems by applying formal methods to the design of the separation-kernel part. The separation kernel is a small OS kernel that provides an abstract distributed environment on a single CPU. The design of the separation kernel was verified using two formal methods, the B method and the Spin model checker. A newly designed semi-formal method, the extended state transition method, was also applied. An OS comprising the separation-kernel part and additional OS services on top of the separation kernel was prototyped on the Intel IA-32 architecture. The development and testing of a prototype embedded application, a point-of-sale application, on the prototype OS demonstrated that the proposed architecture, and the use of formal methods to design its kernel part, are effective for achieving a secure embedded system with a high-assurance separation kernel.
Abstract: The objective of this research was to study the foot
anthropometry of children aged 7-12 years in the south of Thailand.
Thirty-three dimensions were measured on 305 male and 295 female
subjects in three age ranges (7-12 years old). The instrumentation
consisted of four types of anthropometer, a digital vernier caliper,
a digital height gauge, and a measuring tape. The means and standard
deviations of the age, height, and weight of the male subjects were
9.52 (±1.70) years, 137.80 (±11.55) cm, and 37.57 (±11.65) kg; those
of the female subjects were 9.53 (±1.70) years, 137.88 (±11.55) cm,
and 34.90 (±11.57) kg, respectively. A comparison of the 33 measured
anthropometric dimensions between male and female subjects revealed
significant sex differences in size in almost all areas (p
Abstract: Product Lead Time (PLT) is the period of time from
receiving a customer's order to delivering the final product. PLT is an
indicator of the manufacturing controllability, efficiency and
performance. Due to the explosion in the rate of technological
innovations and the rapid changes in the nature of manufacturing
processes, manufacturing firms can bring the new products to market
quicker only if they can reduce their PLT and speed up the rate at
which they can design, plan, control, and manufacture. Although
there is a substantial body of research on manufacturing relating to
cost and quality issues, there is little specific research on the
formulation of PLT, despite its significance and importance. This
paper analyzes and formulates PLT in a way that can be used as a
guideline for achieving a shorter PLT. Furthermore, this paper
identifies the causes of delay and the factors that contribute to
increased product lead time.
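One simple way to formulate PLT, consistent with its definition as the period from order receipt to delivery, is as a sum of stage times. The stages and values below are purely illustrative assumptions, not the paper's formulation:

```python
# Hypothetical decomposition: PLT as the sum of the stage times
# between receiving the order and delivering the final product.
stages = {
    "order processing": 2.0,   # all values in days, illustrative only
    "design": 5.0,
    "planning": 1.5,
    "manufacturing": 7.0,
    "queue/wait": 4.0,
    "delivery": 1.0,
}
plt_days = sum(stages.values())
print(f"PLT = {plt_days} days")   # 20.5
```

Framed this way, shortening PLT means identifying which stage (often the queue/wait component) dominates the sum and attacking it first.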
Abstract: This report aims to utilize the characteristics of existing and future Multiple-Input Multiple-Output Orthogonal Frequency Division Multiplexing Wireless Local Area Network (MIMO-OFDM WLAN) systems – such as multiple subcarriers, multiple antennas, and channel estimation – for indoor location estimation systems based on the Direction of Arrival (DOA) and Radio Signal Strength Indication (RSSI) methods. A hybrid of the DOA and RSSI methods is also evaluated. The experimental results show that location estimation accuracy can be increased by minimizing the multipath fading effect. This is done by using multiple subcarrier frequencies over wideband frequencies to estimate one location. The proposed methods are analyzed in both a wide indoor environment and a typical room-sized office. In the experiments, WLAN terminal locations are estimated by measuring multiple subcarriers from arrays of three dipole antennas at the access points (AP). This research demonstrates highly accurate, robust, hardware-free add-on software for indoor location estimation based on a MIMO-OFDM WLAN system.
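The RSSI side of such a system can be sketched with the standard log-distance path-loss model, averaging per-subcarrier distance estimates to damp frequency-selective multipath. All constants and readings below are hypothetical, not the paper's measured values:

```python
# Log-distance path-loss model (a generic textbook model, not the
# paper's calibration): distance from a received signal strength.
def rssi_to_distance(rssi_dbm, rssi_at_1m=-40.0, path_loss_exp=3.0):
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exp))

# Hypothetical RSSI readings (dBm) for one AP on four OFDM subcarriers.
subcarrier_rssi = [-61.2, -58.7, -63.5, -60.1]
estimates = [rssi_to_distance(r) for r in subcarrier_rssi]
d = sum(estimates) / len(estimates)   # average over subcarriers
print(f"averaged distance estimate: {d:.2f} m")
```

Averaging across subcarriers exploits the fact that deep multipath fades are frequency-selective: a fade that corrupts one subcarrier's RSSI is unlikely to hit all of them at once.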
Abstract: Professions are concerned about the public image they
have, and this public image is represented by stereotypes. Research is
needed to understand how accountants are perceived by different
actors in the society in different contexts, which would allow
universities, professional bodies and employers to adjust their
strategies to attract the right people to the profession and their
organizations. In this paper we aim to develop a framework to be
used in empirical testing in different environments to determine and
analyze the accountant's stereotype. This framework will be useful in
analyzing the nuances associated with the accountant's image and in
understanding the factors that may lead to uniformity in the
profession and those leading to diversity from one context
(country, type of country, region) to another.