Abstract: Traditional higher-education classrooms allow lecturers to observe students' behaviours and responses to a particular pedagogy during learning in a way that can influence changes to the pedagogical approach. Within current e-learning systems it is difficult to perform continuous analysis of the cohort's behavioural tendency, making real-time pedagogical decisions difficult. This paper presents a Virtual Learning Process Environment (VLPE) based on the Business Process Management (BPM) conceptual framework. Within the VLPE, course designers can model various education pedagogies in the form of learning-process workflows using an intuitive flow-diagram interface. These diagrams are used to visually track the learning progress of a cohort of students. This helps assess the effectiveness of the chosen pedagogy and provides the information required to improve course design. A case scenario of a cohort of students is presented, and quantitative statistical analysis of their learning-process performance is gathered and displayed in real time using dashboards.
Abstract: This paper introduces a new signal denoising method based on the Empirical Mode Decomposition (EMD) framework. The method is fully data-driven. The noisy signal is decomposed adaptively into oscillatory components called Intrinsic Mode Functions (IMFs) by means of a process called sifting. EMD denoising involves filtering or thresholding each IMF and reconstructing the estimated signal from the processed IMFs. EMD can be combined with a filtering approach or with a nonlinear transformation; in this work, the Savitzky-Golay filter and soft thresholding are investigated. For thresholding, IMF samples below a threshold value are shrunk or scaled. The standard deviation of the noise is estimated for every IMF, and the threshold is derived for Gaussian white noise. The method is tested on simulated and real data and compared with averaging, median, and wavelet approaches.
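A minimal sketch of the thresholding stage described above, assuming the IMFs have already been produced by a sifting routine (not shown); the function names, the MAD-based noise estimate, and the universal threshold are illustrative choices consistent with the abstract, not the paper's exact implementation:

```python
import numpy as np

def soft_threshold(x, t):
    """Shrink samples toward zero; values below t in magnitude become zero."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def emd_denoise(imfs):
    """Denoise by soft-thresholding each IMF and summing the results.

    `imfs` is a list of 1-D arrays produced by a sifting routine.
    The noise level of each IMF is estimated with the median absolute
    deviation, and the universal threshold sigma * sqrt(2 ln N) for
    Gaussian white noise is applied.
    """
    n = len(imfs[0])
    out = np.zeros(n)
    for imf in imfs:
        sigma = np.median(np.abs(imf)) / 0.6745   # robust noise estimate
        t = sigma * np.sqrt(2.0 * np.log(n))      # universal threshold
        out += soft_threshold(imf, t)
    return out
```

In a filtering variant, `soft_threshold` would be replaced by a Savitzky-Golay smoothing pass over each IMF before reconstruction.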
Abstract: In this paper, we propose a robust face relighting technique based on spherical-space properties. The proposed method reduces the effects of illumination on face recognition. Given a single 2D face image, we relight the face object by extracting the nine spherical harmonic bases and the spherical illumination coefficients of the face. First, an internal training illumination database is generated by computing face albedo and face normals from 2D images under different lighting conditions. Based on the generated database, we analyze the target face pixels and compare them with the training bootstrap using pre-generated tiles. In this work, practical real-time processing speed and small image size were considered when designing the framework. In contrast to other works, our technique requires no 3D face models for the training process and takes a single 2D image as input. Experimental results on publicly available databases show that the proposed technique works well under severe lighting conditions, with significant improvements in face recognition rates.
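The nine spherical harmonic bases referred to above have standard closed-form expressions in the components of the unit surface normal; a sketch of their evaluation (the function name and the least-squares note are illustrative assumptions, not the paper's code) might be:

```python
import numpy as np

def sh_basis_9(normals):
    """First nine real spherical harmonic bases evaluated at unit normals.

    `normals` is an (N, 3) array of unit surface normals (x, y, z).
    Returns an (N, 9) matrix; an image under low-frequency lighting is
    then approximated as albedo * (B @ coeffs) for a 9-vector of
    lighting coefficients.
    """
    x, y, z = normals[:, 0], normals[:, 1], normals[:, 2]
    return np.stack([
        0.282095 * np.ones_like(x),      # Y00  (constant term)
        0.488603 * y,                    # Y1,-1
        0.488603 * z,                    # Y1,0
        0.488603 * x,                    # Y1,1
        1.092548 * x * y,                # Y2,-2
        1.092548 * y * z,                # Y2,-1
        0.315392 * (3.0 * z**2 - 1.0),   # Y2,0
        1.092548 * x * z,                # Y2,1
        0.546274 * (x**2 - y**2),        # Y2,2
    ], axis=1)
```

Given face pixels and albedo, the illumination coefficients can then be recovered by least squares over the basis matrix, which is the kind of coefficient extraction the abstract describes.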
Abstract: An application framework provides a reusable design
and implementation for a family of software systems. Frameworks
are introduced to reduce the cost of a product line (i.e., a family of products that share common features). Software testing is a time-consuming and costly ongoing activity during the application
software development process. Generating reusable test cases for the
framework applications during the framework development stage,
and providing and using the test cases to test part of the framework
application whenever the framework is used reduces the application
development time and cost considerably. This paper introduces the
Framework Interface State Transition Tester (FIST2), a tool for
automated unit testing of Java framework applications. During the
framework development stage, given the formal descriptions of the
framework hooks, the specifications of the methods of the
framework's extensible classes, and the illegal behavior description of the Framework Interface Classes (FICs), FIST2 generates unit-level test cases for the classes. At the framework application
development stage, given the customized method specifications of
the implemented FICs, FIST2 automates the use, execution, and
evaluation of the already generated test cases to test the implemented
FICs. The paper illustrates the use of the FIST2 tool for testing
several applications that use the SalesPoint framework.
Abstract: A transient heat transfer mathematical model for the
prediction of temperature distribution in the car body during primer
baking has been developed by considering the thermal radiation and
convection in the furnace chamber and transient heat conduction
governing equations in the car framework. The car cockpit is modeled as a structure of six flat plates: four vertical plates representing the car doors and the rear and front panels, and two horizontal plates representing the car roof and floor. The transient heat conduction in each flat plate is modeled by the lumped capacitance
method. Comparison with the experimental data shows that the heat
transfer model works well for the prediction of thermal behavior of
the car body in the curing furnace, with deviations below 5%.
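The lumped-capacitance approach can be illustrated with a small time-stepping sketch; the material properties, heat transfer coefficients, and furnace temperature below are assumed values for a thin steel plate, not the paper's data:

```python
import numpy as np

# Illustrative properties for one thin steel plate (assumed values)
RHO, C, THICK = 7850.0, 490.0, 8e-4      # density kg/m^3, heat capacity J/(kg K), thickness m
H, EPS, SIGMA = 25.0, 0.8, 5.67e-8       # convection W/(m^2 K), emissivity, Stefan-Boltzmann
T_FURNACE = 160.0 + 273.15               # furnace air/wall temperature, K

def plate_temperature(t_end, dt=1.0, t0=293.15):
    """Lumped-capacitance temperature history of a thin plate.

    Valid when the Biot number h*L/k is small, so the plate is
    spatially isothermal; both convection and radiation from the
    furnace are included, integrated with explicit Euler steps.
    """
    cap = RHO * C * THICK                # heat capacity per unit area, J/(m^2 K)
    temps = [t0]
    for _ in range(int(t_end / dt)):
        T = temps[-1]
        q = H * (T_FURNACE - T) + EPS * SIGMA * (T_FURNACE**4 - T**4)
        temps.append(T + dt * q / cap)
    return np.array(temps)
```

The full model in the paper couples six such plates through the shared furnace chamber; this sketch shows only the per-plate energy balance.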
Abstract: This research investigates risk factors for defective products in auto-parts factories. Under a Bayesian framework, a generalized linear mixed model (GLMM) in which the dependent variable, the number of defective products, has a Poisson distribution is adopted. Its performance is compared with that of a Poisson GLM under a Bayesian framework. The factors considered are production process, machine, and worker. The products coded RT50 are observed. The study found that the Poisson GLMM is more appropriate than the Poisson GLM. For the production process factor, the highest risk of producing defective products is Process 1; for the machine factor, the highest risk is Machine 5; and for the worker factor, the highest risk is Worker 6.
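For illustration, a frequentist Poisson GLM with a log link can be fitted by Newton/IRLS in a few lines; this sketch covers only the fixed-effects part, whereas the paper's Bayesian GLMM additionally places priors on the coefficients and random effects for machines and workers. The data and coefficients below are synthetic:

```python
import numpy as np

def poisson_glm_fit(X, y, n_iter=25):
    """Fit a Poisson GLM with log link by Newton/IRLS iterations."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)                    # mean defect counts
        grad = X.T @ (y - mu)                    # score vector
        hess = X.T @ (X * mu[:, None])           # Fisher information
        beta = beta + np.linalg.solve(hess, grad)
    return beta

# Synthetic data: defect counts that are higher for "Process 1" rows
rng = np.random.default_rng(0)
process1 = rng.integers(0, 2, 200)
X = np.column_stack([np.ones(200), process1])
y = rng.poisson(np.exp(0.5 + 0.7 * process1))
beta = poisson_glm_fit(X, y)    # estimates close to the true (0.5, 0.7)
```

A positive fitted coefficient on the process indicator corresponds to the kind of elevated defect risk the study reports for Process 1.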
Abstract: Recently, there have been considerable efforts towards the convergence of P2P and Grid computing in order to reach a solution that takes the best of both worlds by exploiting the advantages each offers. Applying the peer-to-peer model to Grid services promises to eliminate bottlenecks and ensure greater scalability, availability, and fault tolerance. The Grid Information Service (GIS) directly influences quality of service for Grid platforms. Most of the proposed solutions for decentralizing the GIS are based on completely flat overlays. The main contributions of this paper are the investigation of a novel resource discovery framework for Grid implementations based on a hierarchy of structured peer-to-peer overlay networks, and the introduction of a discovery algorithm utilizing the proposed framework. The framework's performance is validated via simulation. Experimental results show that the proposed organization has the advantage of being scalable while providing fault isolation, effective bandwidth utilization, and hierarchical access control. In addition, it leads to a reliable, guaranteed sub-linear search that returns results within a bounded interval of time and with a smaller amount of generated traffic within each domain.
Abstract: Low-silica type X (LSX) zeolite is a useful material in many manufacturing applications owing to advantageous properties including high surface area, stability, a microporous crystalline aluminosilicate structure, and positive ions in extra-framework positions. The LSX was synthesized using rice husk silica as the silica source, obtained by leaching with hydrochloric acid and calcination at 500 °C. To improve the synthesis method, the LSX was crystallized in a Teflon-lined autoclave to expedite the decrease of amorphous particles. A mixed gel with composition 5.5 Na2O : 1.65 K2O : Al2O3 : 2.2 SiO2 : 122 H2O was crystallized in different containers (a polypropylene bottle and a Teflon-lined autoclave). The obtained powder was characterized by X-ray diffraction (XRD), X-ray fluorescence spectrometry, N2 adsorption-desorption analysis (BET surface area), scanning electron microscopy (SEM), and Fourier transform infrared spectroscopy to assess the quality of the zeolite. The results showed that the zeolite crystallized in the Teflon-lined autoclave has a crystal size of 102.8 nm, a surface area of 286 m2/g, and a smaller amount of round amorphous particles compared with the zeolite crystallized in polypropylene.
Abstract: Transportation is one of the main activities that create value for tourists. Transport management in tourism mainly focuses on managing transfer points and vehicle capacity. However, transport service level must also be ensured, as it now relates to tourists' experiences. This paper emphasizes responsiveness as one of the key service performance measures. An evaluation framework is developed and illustrated using the case of the small bus service in Pattaya city. There is great potential for the city to utilize small bus transportation to meet the needs of a more diverse group of passengers and to support the expansion of tourist areas. The framework integrates service operations management, logistics, and tourism behavior perspectives. The findings from the investigation of the existing small bus service are presented and preliminarily validate the usability of the framework.
Abstract: There are several existing Java benchmarks, application benchmarks as well as micro benchmarks or mixtures of both, such as Java Grande, Spec98, CaffeMark, HBech, etc. However, none of them deals with the behavior of multitasking operating systems, and as a result the outputs they produce are not satisfactory for performance evaluation engineers. The behavior of a multitasking operating system is determined by the scheduling policy it employs: different processes can have different priorities when sharing the same resources. The time measured from when an application starts to when it finishes does not reflect the actual CPU time the system needs to run the program, so a new approach to this problem is needed. In this paper we present a new Java benchmark, named the FHOJ benchmark, which directly addresses the multitasking behavior of a system. Our study shows that in some cases, results from the FHOJ benchmark are far more reliable than those from some existing Java benchmarks.
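The distinction the abstract draws, between elapsed wall-clock time and the CPU time actually charged to the process by the scheduler, can be illustrated with a short sketch (the function name is ours, not part of FHOJ):

```python
import time

def measure(fn):
    """Return (wall_seconds, cpu_seconds) for one call of fn.

    Wall-clock time includes periods when the scheduler runs other
    processes; process_time counts only CPU time charged to this
    process, which is the quantity a multitasking-aware benchmark
    should report.
    """
    w0, c0 = time.perf_counter(), time.process_time()
    fn()
    return time.perf_counter() - w0, time.process_time() - c0

wall, cpu = measure(lambda: sum(i * i for i in range(2_000_000)))
```

On a loaded machine `wall` can greatly exceed `cpu`, which is exactly the discrepancy the abstract argues makes start-to-finish timing unreliable.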
Abstract: Optimization is often a critical issue for most system
design problems. Evolutionary Algorithms are population-based,
stochastic search techniques, widely used as efficient global
optimizers. However, finding optimal solutions to complex, high-dimensional, multimodal problems often requires highly computationally expensive function evaluations and can hence be practically prohibitive. The Dynamic Approximate Fitness based
Hybrid EA (DAFHEA) model presented in our earlier work [14]
reduced computation time by controlled use of meta-models to
partially replace the actual function evaluation by approximate
function evaluation. However, the underlying assumption in
DAFHEA is that the training samples for the meta-model are
generated from a single uniform model. Situations like model
formation involving variable input dimensions and noisy data
certainly cannot be covered by this assumption. In this paper we
present an enhanced version of DAFHEA that incorporates a
multiple-model based learning approach for the SVM approximator.
DAFHEA-II (the enhanced version of the DAFHEA framework) also
overcomes the high computational expense involved with additional
clustering requirements of the original DAFHEA framework. The
proposed framework has been tested on several benchmark functions
and the empirical results illustrate the advantages of the proposed
technique.
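The controlled use of a meta-model to replace some true fitness evaluations can be sketched as below; to keep the example self-contained, a k-nearest-neighbour surrogate stands in for the SVM approximator used in DAFHEA, and all parameters are illustrative:

```python
import numpy as np

def sphere(x):
    """Stand-in for an expensive objective function."""
    return float(np.sum(x ** 2))

def surrogate_predict(X_seen, y_seen, x, k=3):
    """k-nearest-neighbour surrogate used here in place of an SVM model."""
    d = np.linalg.norm(X_seen - x, axis=1)
    idx = np.argsort(d)[:k]
    return float(np.mean(y_seen[idx]))

def approx_ea(dim=5, pop=20, gens=30, true_every=5, seed=1):
    """EA that calls the true objective only every `true_every` generations
    and relies on the cheap surrogate in between (controlled meta-model use)."""
    rng = np.random.default_rng(seed)
    P = rng.normal(0.0, 1.0, (pop, dim))
    X_seen = P.copy()
    y_seen = np.array([sphere(x) for x in P])
    for g in range(gens):
        children = np.repeat(P, 2, axis=0) + rng.normal(0.0, 0.3, (2 * pop, dim))
        if g % true_every == 0:                 # periodic true evaluations
            f = np.array([sphere(x) for x in children])
            X_seen = np.vstack([X_seen, children])
            y_seen = np.concatenate([y_seen, f])
        else:                                   # cheap approximate fitness
            f = np.array([surrogate_predict(X_seen, y_seen, x) for x in children])
        P = children[np.argsort(f)[:pop]]       # keep the best half of the children
    return min(sphere(x) for x in P)
```

The multiple-model idea in DAFHEA-II would partition `X_seen` and train one approximator per partition rather than a single global model as here.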
Abstract: This research paper presents a framework for building up a malware dataset. Many researchers take a long time to clean a dataset of noise or to transform it into a format that can be used straight away for testing. Therefore, this research proposes a framework to help researchers speed up the malware dataset cleaning processes, after which the dataset can be used for testing. It is believed that efficient malware dataset cleaning processes can improve the quality of the data and thus help to improve the accuracy and efficiency of the subsequent analysis. Apart from that, an in-depth understanding of the malware taxonomy is also important prior to and during the dataset cleaning processes. A new Trojan classification has been proposed to complement this framework. The experiment has been conducted in a controlled lab environment using the VxHeavens dataset. The framework is built on the integration of static and dynamic analyses, the incident response method, and knowledge discovery in databases (KDD) processes. It can be used as a basic guideline for malware researchers in building malware datasets.
Abstract: In Thailand, the practice of pre-hospital Emergency Medical Service (EMS) shows different growth rates and effectiveness across areas, visible as diverse quality and quantity of service. To shorten the learning curve and speed up the practice in other areas, storytelling and lessons learnt from the effective practices are valued as meaningful knowledge. This paper ascertains the factors, lessons learnt, and best practices that contribute to the success of a pre-hospital EMS system; these were formalized as a model to speed up the practice in other areas. To develop the model, the Malcolm Baldrige National Quality Award (MBNQA), which is widely recognized as a framework for organizational quality assessment and improvement, was chosen as the discussion framework. Notably, this study was based on knowledge capture; it was not intended to complete the full loop of knowledge activities, but rather to highlight the recognition of knowledge capture, which is the initiation of knowledge management.
Abstract: Nowadays, people are going more and more mobile, both in terms of devices and associated applications, and the services these devices offer are becoming broader and much more complex. Even though current handheld devices have considerable computing power, their contexts of utilization are different: these contexts are affected by the availability of connectivity, the high latency of wireless networks, battery life, screen size, on-screen or hard keyboards, etc. Consequently, the development of mobile applications and their associated mobile Web services, if any, should follow a concise methodology so that they provide a high quality of service. The aim of this paper is to highlight and discuss the main issues to consider when developing mobile applications and mobile Web services, and then to propose a framework that leads developers through different steps and modules toward the development of efficient and secure mobile applications. First, different challenges in developing such applications are elicited and discussed in depth. Second, a development framework is presented with different modules addressing each of these challenges. Third, the paper presents an example of a mobile application, Eivom Cinema Guide, which benefits from following our development framework.
Abstract: At present, intelligent planning in the Graphplan framework is a focus of artificial intelligence research, and Creating or Destroying Objects Planning (CDOP) remains one of its unsolved and difficult problems. In this paper, we study this planning problem and put forward the idea of transforming objects into propositions, based on which we offer an algorithm for Creating or Destroying Objects in the Graphplan framework (CDOGP). Compared to Graphplan, the new algorithm can solve not only all the problems that Graphplan solves, but also a subset of CDOP. We introduce the idea of object-as-proposition for the first time, and we focus the discussion on the representations of operators that create or destroy objects and on an algorithm in the Graphplan framework. In addition, we analyze the complexity of this algorithm.
Abstract: This paper argues that a product development exercise involves, in addition to the conventional stages, several decisions regarding other aspects. These aspects should be addressed
simultaneously in order to develop a product that responds to the
customer needs and that helps realize objectives of the stakeholders
in terms of profitability, market share and the like. We present a
framework that encompasses these different development
dimensions. The framework shows that a product development
methodology such as the Quality Function Deployment (QFD) is the
basic tool which allows definition of the target specifications of a
new product. Creativity is the first dimension that enables the
development exercise to come alive and end successfully. A number of
group processes need to be followed by the development team in
order to ensure enough creativity and innovation. Secondly,
packaging is considered to be an important extension of the product.
Branding strategies, quality and standardization requirements,
identification technologies, design technologies, production
technologies and costing and pricing are also integral parts to the
development exercise. These dimensions constitute the proposed
framework. The paper also presents a mathematical model used to
calculate the design targets based on the target costing principle. The
framework is used to study a case of a new product development in
the telecommunications services sector.
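The target-costing principle mentioned above, deriving an allowable cost from a target price and margin and allocating it across dimensions by QFD importance weights, can be illustrated with assumed numbers (these are not the paper's figures):

```python
# Target-costing sketch: allocate an allowable cost to product
# dimensions in proportion to QFD importance weights.

target_price = 120.0          # what the market will pay (assumed)
target_margin = 0.25          # required profit ratio (assumed)
allowable_cost = target_price * (1.0 - target_margin)   # 90.0

# Relative importance of product dimensions from a QFD exercise (assumed)
weights = {"handset": 0.40, "packaging": 0.15, "branding": 0.20, "service": 0.25}

component_targets = {k: round(allowable_cost * w, 2) for k, w in weights.items()}
# e.g. handset 36.0, packaging 13.5, branding 18.0, service 22.5
```

Each component's design target then becomes a cost ceiling that the corresponding development dimension (packaging, branding, etc.) must respect.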
Abstract: The purpose of this paper is to develop a typology
based on market orientation (MO) and innovation orientation (IO),
and to illustrate to what extent housing companies in Sweden fit
within this framework. A qualitative study on 11 public housing
companies in the central part of Sweden has been conducted with the help of open and semi-structured questions for data collection. Four public housing company types, i.e. reactive prospector, proactive prospector, reactive defender, and proactive defender, have been identified by combining the MO and IO dimensions. Future research
can include other dimensions, like entrepreneurship and networks, to observe how they affect MO in particular. An empirical study could compare public and private housing companies on the basis of the MO and IO dimensions. One major contribution of the paper is the proposed typology, which can be used to describe public housing companies and decide their future course of action.
Abstract: In this paper we investigate how wide-ranging
organizational support and the more specific form of support,
namely management support, may influence tourism workers' satisfaction with a cash transaction system. The IS continuance
theory, proposed by Bhattacherjee in 2001, is utilized as a
theoretical framework. This implies that both perceived usefulness
and ease of use are included in the research model, in addition to
organizational and management support. The sample consists of
500 workers from 10 cruise and tourist ferries in Scandinavia that
use a cash transaction system to perform their work tasks. Using
structural equation modelling, results indicate that organizational
support and ease-of-use perceptions are critical for the users' level of satisfaction with the cash transaction system. The findings have
implications for business managers and IS practitioners that want
to increase the quality of IT-based business processes within the
tourism industry.
Abstract: Segmentation techniques based on Active Contour
Models have benefited strongly from the use of prior information
during their evolution. Shape prior information is captured from
a training set and is introduced in the optimization procedure to
restrict the evolution into allowable shapes. In this way, the evolution
converges onto regions even with weak boundaries. Although
significant effort has been devoted to different ways of capturing and analyzing prior information, very little thought has been given to the way image information is combined with prior information.
This paper focuses on a more natural way of incorporating the
prior information in the level set framework. For proof of concept
the method is applied on hippocampus segmentation in T1-MR
images. Hippocampus segmentation is a very challenging task, due
to the multivariate surrounding region and the missing boundary
with the neighboring amygdala, whose intensities are identical. The
proposed method mimics the human way of segmenting and thus
shows enhancements in the segmentation accuracy.
Abstract: In this work, we study the impact of dynamically changing link slowdowns on the stability properties of packet-switched networks under the Adversarial Queueing Theory framework. In particular, we consider the Adversarial, Quasi-Static Slowdown Queueing Theory model, where each link slowdown may take on values in the two-valued set of integers {1, D} with D > 1, values which remain fixed for a long time, under a (w, p)-adversary. In this framework, we present an innovative systematic construction for the estimation of adversarial injection-rate lower bounds which, if exceeded, cause instability in networks that use the LIS (Longest-In-System) protocol for contention resolution. In addition, we show that a network that uses the LIS protocol for contention resolution may see its instability bound drop at injection rates p > 0 when the network size and the high slowdown D take large values. This is the best instability lower bound known to date for LIS networks.
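The intuition behind instability under slowdowns can be illustrated with a toy single-link simulation; the LIS rule is modelled as serving the oldest (longest-in-system) packet first, and the parameters are illustrative, not the paper's adversarial construction:

```python
import heapq

def simulate_lis(inject_every, slowdown, steps):
    """Single-link sketch: one packet injected every `inject_every`
    steps; the link needs `slowdown` steps to forward one packet;
    LIS serves the longest-in-system (oldest) packet first.

    Returns the final queue length. The queue stays bounded when the
    injection rate 1/inject_every is below the service rate
    1/slowdown, and grows without bound otherwise.
    """
    queue = []            # min-heap on injection time = LIS priority
    busy_until = 0
    for t in range(steps):
        if t % inject_every == 0:
            heapq.heappush(queue, t)          # new packet enters the system
        if queue and t >= busy_until:
            heapq.heappop(queue)              # oldest packet departs
            busy_until = t + slowdown         # link busy for `slowdown` steps
    return len(queue)
```

The paper's adversary instead coordinates injections and slowdown changes across a whole network so that instability appears even at injection rates well below the single-link saturation point.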