Abstract: Sensor networks are emerging as a new tool for important applications in diverse fields such as military surveillance, habitat monitoring, weather sensing, and home electrical appliances. Technically, sensor network nodes are limited with respect to energy supply, computational capacity, and communication bandwidth. In order to prolong the lifetime of the sensor nodes, designing an efficient routing protocol is critical. In this paper, we review the existing routing protocols for wireless sensor networks that use the data-centric approach and present a performance analysis of these protocols. The paper focuses on the performance analysis of two specific protocols, namely Directed Diffusion and SPIN. This analysis reveals that energy usage is an important feature that must be taken into consideration when designing routing protocols for wireless sensor networks.
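As a minimal illustration of the energy accounting that underlies such an analysis, the following Python sketch implements the widely used first-order radio energy model for wireless sensor nodes; the constants are typical illustrative values and the function names are assumptions, not details taken from Directed Diffusion or SPIN.

# First-order radio energy model often used in WSN routing studies.
E_ELEC = 50e-9      # energy per bit for transceiver electronics (J/bit), illustrative
EPS_AMP = 100e-12   # transmit amplifier energy (J/bit/m^2), illustrative

def tx_energy(k_bits, distance_m):
    """Energy to transmit k bits over a given distance (free-space model)."""
    return E_ELEC * k_bits + EPS_AMP * k_bits * distance_m ** 2

def rx_energy(k_bits):
    """Energy to receive k bits."""
    return E_ELEC * k_bits

# Example: relaying a 2000-bit packet over two 50 m hops vs one 100 m hop.
two_hops = 2 * tx_energy(2000, 50) + rx_energy(2000)
one_hop = tx_energy(2000, 100)
print(two_hops, one_hop)

Comparisons like this one show why multi-hop, data-centric routing can save transmit energy at the cost of extra receive energy at relay nodes.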
Abstract: The purpose of this research was to study five vital factors related to employees' job performance. A total of 250 respondents were sampled from employees of a public warehouse organization in Bangkok, Thailand. The sample was divided into two groups according to work experience: the average working experience was about 9 years for group one and 28 years for group two. A questionnaire was used to collect the data. The statistics used in this research included frequency, percentage, mean, standard deviation, t-test analysis, one-way ANOVA, and the Pearson product-moment correlation coefficient. Data were analyzed using the Statistical Package for the Social Sciences (SPSS). The findings disclosed that the majority of respondents were female, between 23 and 31 years old, single, and held an undergraduate degree. The average income of respondents was less than 30,900 baht. The findings also revealed that the factors of organization chart awareness, job process and technology, internal environment, employee loyalty, and policy and management were all ranked at a medium level. Hypothesis testing revealed that differences in gender, age, and position were associated with differences in the awareness of the organization chart, job process and technology, internal environment, employee loyalty, and policy and management, in the same direction and at a low level.
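For readers unfamiliar with the statistics listed above, a minimal Python sketch using SciPy on synthetic scores (invented for illustration, not the study's data) shows how the t-test, one-way ANOVA, and Pearson correlation are computed:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic job-performance scores for the two experience groups (illustrative only).
group1 = rng.normal(3.2, 0.5, 125)   # ~9 years' experience
group2 = rng.normal(3.4, 0.5, 125)   # ~28 years' experience

t_stat, t_p = stats.ttest_ind(group1, group2)        # independent-samples t-test
f_stat, f_p = stats.f_oneway(group1, group2)         # one-way ANOVA
r, r_p = stats.pearsonr(group1[:100], group2[:100])  # Pearson correlation

print(f"t = {t_stat:.3f} (p = {t_p:.3f}), F = {f_stat:.3f} (p = {f_p:.3f}), r = {r:.3f}")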
Abstract: Aspheric optical components are an alternative to conventional lenses in the implementation of imaging systems for the visible range. Spherical lenses inherently produce aberrations and are therefore unable to focus all the light into a single point. Aspherical lenses, by contrast, correct these aberrations and provide better resolution even in compact systems incorporating a small number of lenses.
Metrology of these components is very difficult, especially as resolution requirements increase, and the insufficiency or complexity of conventional tools requires the development of specific characterization approaches.
This work addresses that problem: its objectives are the study and comparison of the different methods used to measure the surface radii of hybrid aspherical lenses.
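For reference, the rotationally symmetric aspheric surface measured in such work is usually described by the standard sag equation (the general textbook form, not a formula specific to this study):

z(r) = \frac{c\,r^{2}}{1 + \sqrt{1 - (1+k)\,c^{2}r^{2}}} + \sum_{i} a_{2i}\, r^{2i}

where c = 1/R is the vertex curvature (R the vertex radius), k the conic constant, and a_{2i} the higher-order aspheric coefficients; measuring the surface radius amounts to recovering c (and the departure terms) from the measured profile.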
Abstract: A key aspect of the design of any software system is
its architecture. An architecture description provides a formal model
of the architecture in terms of components and connectors and how
they are composed. COSA (Component-Object based Software Structures) is based on object-oriented modeling and component-based modeling. The model improves reusability by increasing the extensibility, evolvability, and compositionality of software systems. This paper presents the COSA modelling tool, which gives architects the ability to verify the structural coherence of a given system and to validate its semantics with the COSA approach.
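As a rough illustration of the component-and-connector style of description that such a tool manipulates, consider the Python sketch below; the class names, attributes, and coherence check are hypothetical simplifications, not COSA's actual metamodel.

from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    ports: set = field(default_factory=set)

@dataclass
class Connector:
    source: tuple  # (component, port)
    target: tuple  # (component, port)

@dataclass
class Configuration:
    components: list
    connectors: list

    def check_coherence(self):
        """Structural check: every connector end must name a declared port."""
        for c in self.connectors:
            for comp, port in (c.source, c.target):
                if port not in comp.ports:
                    raise ValueError(f"{comp.name} has no port '{port}'")

client = Component("Client", {"request"})
server = Component("Server", {"serve"})
cfg = Configuration([client, server],
                    [Connector((client, "request"), (server, "serve"))])
cfg.check_coherence()  # passes: the configuration is structurally coherent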
Abstract: This article first summarizes the reasons why current approaches supporting Open Learning and Distance Education need to be complemented by tools permitting lecturers, researchers, and students to cooperatively organize the semantic content of learning-related materials (courses, discussions, etc.) into a fine-grained shared semantic network. This first part of the article also briefly describes the approach adopted to permit such collaborative work. Then, examples of such semantic networks are presented. Finally, an evaluation of the approach by students is provided and analyzed.
Abstract: When binary decision diagrams are formed from uniformly distributed Monte Carlo data for a large number of variables, the complexity of the decision diagrams exhibits a predictable relationship to the number of variables and minterms. In the present work, a neural network model has been used to analyze the pattern of shortest path length for larger numbers of Monte Carlo data points. The neural model shows strong descriptive power for the ISCAS benchmark data, with an RMS error of 0.102 for the shortest path length complexity. The model can therefore be considered a method of predicting path length complexities; this is expected to help minimize the time complexity of very large-scale integrated circuits and of related computer-aided design tools that use binary decision diagrams.
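A minimal sketch of this kind of regression, assuming (hypothetically) that the model maps variable and minterm counts to a path-length complexity, might look like the following in Python with scikit-learn; the synthetic data and the logarithmic trend are illustrative assumptions, not the paper's dataset.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
# Synthetic stand-in for (num_variables, num_minterms) -> shortest path length.
X = rng.integers(4, 64, size=(200, 2)).astype(float)
y = np.log2(X[:, 0]) + 0.1 * np.log2(X[:, 1] + 1) + rng.normal(0, 0.05, 200)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(X[:150], y[:150])
rms = mean_squared_error(y[150:], model.predict(X[150:])) ** 0.5
print(f"RMS error on held-out points: {rms:.3f}")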
Abstract: The purposes of this study are 1) to study the frequent English writing errors of students registered in the course Reading and Writing English for Academic Purposes II, and 2) to find out the results of writing error correction using coded indirect corrective feedback and writing error treatments. The sample comprises 28 second-year English major students of the Faculty of Education, Suan Sunandha Rajabhat University. The tool for the experimental study is the lesson plan of the course Reading and Writing English for Academic Purposes II, and the tool for data collection is four writing tests of short texts. The research findings disclose that the frequent English writing errors found in this course comprise 7 types of grammatical errors, namely sentence fragments, subject-verb agreement, wrong verb tense forms, singular or plural noun endings, run-on sentences, wrong verb pattern forms, and lack of parallel structure. Moreover, the results of writing error correction using coded indirect corrective feedback and error treatment reveal an overall reduction of the frequent English writing errors and an increase in students' achievement in the writing of short texts, significant at the .05 level.
Abstract: This paper discusses the causal explanation capability
of QRIOM, a tool aimed at supporting learning of organic chemistry
reactions. The development of the tool is based on the hybrid use of
Qualitative Reasoning (QR) technique and Qualitative Process
Theory (QPT) ontology. Our simulation combines symbolic,
qualitative description of relations with quantity analysis to generate
causal graphs. The pedagogy embedded in the simulator is to both simulate and explain organic reactions. Qualitative reasoning through a causal chain is presented to explain the overall changes made to the substrate, from the initial substrate until the production of the final outputs. Several uses of the QPT modeling constructs in supporting behavioral and causal explanation at run-time are also demonstrated. Explaining organic reactions through a causal graph trace can help improve learners' reasoning ability, in that it nurtures their conceptual understanding of the subject.
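To make the idea of a causal chain concrete, here is a toy Python sketch of propagating qualitative influences (+/-) through a directed causal graph; the quantities and edges are invented for illustration and are not QRIOM's actual model.

# Toy causal graph: each edge carries a qualitative sign (+1 or -1).
causal_edges = {
    "nucleophile_concentration": [("reaction_rate", +1)],
    "reaction_rate": [("substrate_amount", -1), ("product_amount", +1)],
}

def propagate(quantity, direction, seen=None):
    """Walk the causal chain and report the qualitative effect on each quantity."""
    seen = seen if seen is not None else set()
    for target, sign in causal_edges.get(quantity, []):
        if target in seen:
            continue
        seen.add(target)
        effect = direction * sign
        print(f"{quantity} -> {target}: {'increases' if effect > 0 else 'decreases'}")
        propagate(target, effect, seen)

# If the nucleophile concentration increases, trace the downstream effects.
propagate("nucleophile_concentration", +1)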
Abstract: Phylogenies, the evolutionary histories of groups of species, are among the most widely used tools throughout the life sciences, as well as objects of research within systematics and evolutionary biology. Phylogenetic analysis reconstructs trees, and these trees represent the evolutionary histories of many groups of organisms. For bacteria, however, horizontal gene transfer, and for plants, hybridization, give rise to reticulate evolution: gene transfer in bacteria and hybridization in plants produce reticulate networks, and methods for constructing trees therefore fail to reconstruct them. In this paper, a model has been employed to reconstruct a phylogenetic network in the honey bee. This network represents reticulate evolution in the honey bee. The maximum parsimony approach has been used to obtain this reticulate network.
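As background on the parsimony principle invoked above, the following Python sketch implements Fitch's small-parsimony count for a single character on a fixed binary tree; this is the standard textbook algorithm for trees, not the network method of the paper, and the taxa and states are invented.

def fitch(node, states):
    """Return (state_set, cost) for a node.
    Leaves are taxon names; internal nodes are (left, right) tuples."""
    if isinstance(node, str):                 # leaf: one observed state
        return {states[node]}, 0
    left_set, left_cost = fitch(node[0], states)
    right_set, right_cost = fitch(node[1], states)
    common = left_set & right_set
    if common:                                # intersection: no extra change needed
        return common, left_cost + right_cost
    return left_set | right_set, left_cost + right_cost + 1

# Toy character states for four taxa on the fixed tree ((A,B),(C,D)).
states = {"A": "G", "B": "G", "C": "T", "D": "G"}
tree = (("A", "B"), ("C", "D"))
_, changes = fitch(tree, states)
print(f"Minimum number of state changes (parsimony score): {changes}")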
Abstract: User-Centered Design (UCD), Usability Engineering (UE), and Participatory Design (PD) are the common Human-Computer Interaction (HCI) approaches practiced in the software development process, focusing on issues and matters concerning user involvement. These approaches, however, overlook the organizational perspective of HCI integration within the software development organization. The Management Information Systems (MIS) perspective of HCI takes a managerial and organizational context to view the effectiveness of integrating HCI in the software development process. Human-Centered Design (HCD), which encompasses all of the human aspects including the aesthetic and the ergonomic, is claimed to provide a better way of strengthening the HCI approaches and thereby the software development process. To determine the effectiveness of HCD in the software development process, this paper presents the findings of a content analysis of HCI approaches, viewing those approaches as a technology that integrates user requirements from the top management down to the other stakeholders in the software development process. The findings show that HCD is a technology that emphasizes humans, tools, and knowledge in strengthening the HCI approaches, and thereby the software development process, in the quest to produce sustainable, usable, and useful software products.
Abstract: Recently, RFID (Radio Frequency Identification) technology has attracted worldwide market attention as an essential technology for the ubiquitous environment. The RFID market has focused on transponder and reader development, but that concern has shifted to RFID software such as high-value e-business applications, RFID middleware, and related development tools. However, due to the high sensitivity of the data and service transactions within the RFID network, security must be addressed. In order to guarantee trusted e-business based on RFID technology, we propose a security-enhanced RFID middleware system. Our proposal is compliant with EPCglobal ALE (Application Level Events), the standard interface between middleware and its clients. We show how to provide strengthened security and trust by protecting data transported between the middleware and its clients, as well as data stored in the middleware. Moreover, we provide identification and service access control against illegal service abuse. Our system enables secure RFID middleware service and trusted e-business service.
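As an illustration of the kind of protection described (encrypting reports exchanged between the middleware and a client), here is a minimal Python sketch using AES-GCM from the cryptography package; the key handling, message format, and sample report are simplifying assumptions, not the proposed system's actual design.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # assumed shared between middleware and client
aesgcm = AESGCM(key)

def protect(report: bytes) -> bytes:
    """Encrypt and authenticate an ALE-style report before transport."""
    nonce = os.urandom(12)                  # unique per message
    return nonce + aesgcm.encrypt(nonce, report, b"ALE-report")

def unprotect(blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return aesgcm.decrypt(nonce, ciphertext, b"ALE-report")

msg = b"<ECReport ...>"                     # placeholder report payload
assert unprotect(protect(msg)) == msg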
Abstract: The number of intrusions and attacks against critical infrastructures and other information networks is increasing rapidly. While there is no identified evidence that terrorist organizations are currently planning a coordinated attack against the vulnerabilities of computer systems and networks connected to critical infrastructure, the origins of the indiscriminate cyber attacks that infect computers on the network remain largely unknown. The growing trend toward the use of more automated and menacing attack tools has also overwhelmed some of the current methodologies used for tracking cyber attacks. There is ample possibility that this kind of cyber attack can be transformed into cyberterrorism driven by illegal purposes. Cyberterrorism is a matter of vital importance to national welfare. Therefore, every country and organization has to take proper measures to meet the situation and to consider effective legislation on cyberterrorism.
Abstract: This paper describes a segmentation algorithm based
on the cooperation of an optical flow estimation method with edge
detection and region growing procedures.
The proposed method has been developed as a pre-processing
stage to be used in methodologies and tools for video/image indexing
and retrieval by content. The addressed problem consists of extracting whole objects from the background in order to produce images of single, complete objects from videos or photos. The extracted images are used to calculate the visual features of the objects, which are necessary for both the indexing and the retrieval processes.
The first task of the algorithm exploits cues from motion analysis for moving-area detection. Objects and background are then refined using edge detection and region growing procedures, respectively. These tasks are performed iteratively until objects and background are completely resolved.
The developed method has been applied to a variety of indoor and outdoor scenes in which objects of different types and shapes are represented on variously textured backgrounds.
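A minimal OpenCV sketch of the first task (using motion cues to detect moving areas, then edges for refinement) could look like the following; the thresholds and the Farneback flow estimator are illustrative choices, not necessarily those of the paper.

import cv2
import numpy as np

def moving_area_mask(prev_gray, next_gray, mag_thresh=1.0):
    """Dense optical flow -> binary mask of pixels with significant motion."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)
    return (magnitude > mag_thresh).astype(np.uint8) * 255

def refine_with_edges(frame_gray, mask):
    """Intersect the motion mask with Canny edges to sharpen object boundaries."""
    edges = cv2.Canny(frame_gray, 100, 200)
    return cv2.bitwise_and(mask, edges)

# Usage (assuming two consecutive grayscale frames f0 and f1):
# mask = moving_area_mask(f0, f1)
# boundary = refine_with_edges(f1, mask)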
Abstract: The plastic forming of sheet metal occupies an important place in metal forming. The traditional techniques of tool design for sheet forming operations used in industry are experimental and expensive. Predicting the forming results and determining the punch force, the blank holder forces, and the thickness distribution of the sheet metal decrease the production cost and time. In this paper, a multi-stage deep drawing simulation of an industrial part is presented using the finite element method. All production steps, together with additional operations such as intermediate annealing and springback, have been simulated with the ABAQUS software under axisymmetric conditions. Simulation results such as sheet thickness distribution, punch force, and residual stresses have been extracted at every stage, and the sheet thickness distribution has been compared with experimental results. The comparison shows that the FE model is in close agreement with the experiment.
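As context for the punch force mentioned above, a common handbook first approximation for the maximum deep drawing force of a cylindrical cup (a standard textbook estimate, not the FE result of this paper) is:

F_{max} \approx \pi\, d\, t\, \sigma_{UTS} \left( \frac{D}{d} - 0.7 \right)

where d is the punch diameter, t the sheet thickness, D the blank diameter, and \sigma_{UTS} the ultimate tensile strength of the sheet material; FE simulation refines such estimates by accounting for friction, blank holder force, and multi-stage effects.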
Abstract: Nowadays, computer worms, viruses, and Trojan horses have become widespread; they are collectively called malware. A decade ago, malware merely spoiled computers by deleting or rewriting important files. Recent malware, however, seems to be born to earn money. Some malware works to collect personal information so that malicious people can find secrets such as passwords for online banking, evidence of a scandal, or contact addresses related to a target. Moreover, the relation between money and malware has become more complex: many kinds of malware spawn bots to obtain springboards. Meanwhile, for ordinary Internet users, countermeasures against malware have come up against a blank wall. Pattern matching wastes too many computer resources, since matching tools have to deal with a great number of patterns derived from subspecies, and virus-making tools can automatically generate such subspecies of malware. Moreover, metamorphic and polymorphic malware are no longer special. Recently, malware-checking sites have appeared that check content in place of the users' PCs. However, a new type of malicious site has appeared that evades checking by these malware-checking sites. In this paper, existing protocols and methods related to the web are reconsidered in terms of protection from current attacks, and a new protocol and method are proposed for the security of the web.
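To see why naive pattern matching scales poorly as subspecies multiply, consider this minimal Python sketch of signature scanning; the signatures are invented placeholders, not real malware patterns.

# Naive signature scan: cost grows with (data size x number of signatures),
# so every subspecies generated by a virus-making kit inflates scan time.
SIGNATURES = [b"\xde\xad\xbe\xef", b"\x90\x90\x90\xcc", b"EVIL_MARKER"]  # placeholders

def scan(data: bytes) -> list:
    """Return the signatures found in the data."""
    return [sig for sig in SIGNATURES if sig in data]

sample = b"...\x90\x90\x90\xcc..."
print(scan(sample))   # [b'\x90\x90\x90\xcc']

# Polymorphic malware defeats this approach entirely: re-encoding the body
# changes the byte pattern, so no fixed signature matches.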
Abstract: The vast amount of information on the World Wide
Web is created and published by many different types of providers.
Unlike books and journals, most of this information is not subject to
editing or peer review by experts. This lack of quality control and the
explosion of web sites make the task of finding quality information
on the web especially critical. Meanwhile, new facilities for producing web pages, such as blogs, make this issue more significant, because blogs have simple content management tools enabling non-experts to build easily updatable web diaries or online journals. On the other hand, despite a decade of active research in information quality (IQ), there is as yet no framework for measuring information quality on blogs. This paper presents a novel experimental framework for ranking the quality of information on weblogs. The results of the data analysis revealed seven IQ dimensions for the weblog. For each dimension, variables and related coefficients were calculated, so that the presented framework is able to assess the IQ of weblogs automatically.
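A minimal sketch of such automatic assessment might combine dimension scores with coefficients as a weighted sum; the dimension names and weights below are placeholders for illustration, not the dimensions or coefficients reported by the study.

# Hypothetical IQ dimensions and coefficients (placeholders).
WEIGHTS = {
    "accuracy": 0.25, "timeliness": 0.20, "completeness": 0.15,
    "objectivity": 0.15, "readability": 0.10, "authority": 0.10,
    "consistency": 0.05,
}

def iq_score(dimension_scores: dict) -> float:
    """Weighted-sum IQ score for a weblog, each dimension scored in [0, 1]."""
    return sum(WEIGHTS[d] * dimension_scores.get(d, 0.0) for d in WEIGHTS)

blog = {"accuracy": 0.8, "timeliness": 0.9, "completeness": 0.6,
        "objectivity": 0.7, "readability": 0.9, "authority": 0.4,
        "consistency": 0.8}
print(f"IQ score: {iq_score(blog):.2f}")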
Abstract: This research paper is based upon the simulation of the gradients of mathematical functions and scalar fields using MATLAB. Scalar fields, their gradients, contours, and meshes/surfaces are simulated using the related MATLAB tools and commands for convenient presentation and understanding. Different mathematical functions and scalar fields are examined by taking their gradients, visualizing the results in 3D with different color shadings, and using other relevant commands. In this way, the outputs of the required functions help us analyze and understand the gradient better than a purely theoretical study would.
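An equivalent sketch in Python (NumPy/Matplotlib rather than MATLAB, with a classic demonstration function chosen for illustration) shows the same workflow of evaluating a scalar field, taking its gradient, and plotting contours together with the gradient field:

import numpy as np
import matplotlib.pyplot as plt

# Scalar field f(x, y) = x * exp(-x^2 - y^2) on a grid.
x = np.linspace(-2, 2, 41)
y = np.linspace(-2, 2, 41)
X, Y = np.meshgrid(x, y)
F = X * np.exp(-X**2 - Y**2)

# Numerical gradient: np.gradient differentiates along rows (y) then columns (x).
dF_dy, dF_dx = np.gradient(F, y, x)

plt.contour(X, Y, F, levels=15)   # contours of the scalar field
plt.quiver(X, Y, dF_dx, dF_dy)    # gradient vectors point uphill
plt.title("Scalar field contours with gradient vectors")
plt.show()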
Abstract: The Petri net being one of the most useful graphical tools for modelling complex asynchronous systems, we have used Petri nets to model a multi-track railway level crossing system. The roadway has been augmented with four half-size barriers. For better control, a three-stage control mechanism has been introduced to ensure that no road vehicle is trapped on the level crossing. A timed Petri net is used to capture the temporal nature of the signalling system. A safeness analysis is also included in the discussion section.
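As a minimal illustration of the Petri net formalism used here (a toy net with invented places and transitions, not the actual multi-track model), the following Python sketch fires transitions of a simple approach/lower-barrier/cross sequence:

# Toy Petri net: places hold token counts; transitions consume and produce tokens.
marking = {"train_approaching": 1, "barrier_up": 1, "barrier_down": 0,
           "train_in_crossing": 0}

TRANSITIONS = {
    "lower_barrier": ({"train_approaching": 1, "barrier_up": 1},    # input places
                      {"train_approaching": 1, "barrier_down": 1}), # output places
    "enter_crossing": ({"train_approaching": 1, "barrier_down": 1},
                       {"train_in_crossing": 1, "barrier_down": 1}),
}

def fire(name):
    """Fire a transition if it is enabled (all input places sufficiently marked)."""
    inputs, outputs = TRANSITIONS[name]
    if all(marking[p] >= n for p, n in inputs.items()):
        for p, n in inputs.items():
            marking[p] -= n
        for p, n in outputs.items():
            marking[p] += n
        return True
    return False

fire("lower_barrier")    # barrier goes down while the train approaches
fire("enter_crossing")   # the train may enter only after the barrier is down
print(marking)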
Abstract: Simulation is a very powerful method for high-performance, high-quality design in distributed systems, and is currently perhaps the only feasible one, considering the heterogeneity, complexity, and cost of distributed systems. In Grid environments, for example, it is hard and even impossible to perform scheduler performance evaluation in a repeatable and controllable manner, as resources and users are distributed across multiple organizations with their own policies. In addition, Grid test-beds are limited, and creating an adequately sized test-bed is expensive and time-consuming. Scalability, reliability, and fault tolerance are important requirements for distributed systems that support distributed computation; a distributed system with such characteristics is called dependable. Large environments, like the Cloud, offer unique advantages, such as low cost and dependability, and can satisfy QoS for all users. Resource management in large environments requires efficient scheduling algorithms guided by QoS constraints. This paper presents the performance evaluation of scheduling heuristics guided by different optimization criteria. The algorithms for distributed scheduling are analyzed with a view to satisfying user constraints while at the same time considering the independent capabilities of resources. This analysis acts as a profiling step for algorithm calibration. The performance evaluation is based on simulation; the simulator is MONARC, a powerful tool for the simulation of large-scale distributed systems. The novelty of this paper consists of synthesized analysis results that offer guidelines for scheduler service configuration and support empirically based decisions. The results can be used in decisions regarding optimizations to existing Grid DAG scheduling and for selecting the proper algorithm for DAG scheduling in various practical situations.
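As a concrete example of the kind of heuristic such evaluations cover, here is a Python sketch of the classic min-min heuristic for independent tasks; it is a generic textbook implementation with invented costs, not MONARC's scheduler or the paper's calibrated algorithms.

def min_min(task_costs):
    """Min-min scheduling of independent tasks.
    task_costs[t][r] = execution time of task t on resource r."""
    n_resources = len(next(iter(task_costs.values())))
    ready_time = [0.0] * n_resources
    schedule = {}
    unassigned = set(task_costs)
    while unassigned:
        # For each task, its minimum completion time over all resources;
        # then pick the task whose minimum is smallest (hence "min-min").
        finish, task, res = min((ready_time[r] + task_costs[t][r], t, r)
                                for t in unassigned
                                for r in range(n_resources))
        schedule[task] = res
        ready_time[res] = finish
        unassigned.remove(task)
    return schedule, max(ready_time)   # assignment and makespan

costs = {"t1": [3, 5], "t2": [2, 4], "t3": [6, 1]}
print(min_min(costs))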
Abstract: It has become crucial over the years for nations to improve their credit scoring methods and techniques in light of the increasing volatility of the global economy. Statistical methods and tools have been the favoured means for this; however, artificial intelligence and soft computing based techniques are becoming increasingly preferred due to their proficiency, precision, and relative simplicity. This work presents a comparison between Support Vector Machines and Artificial Neural Networks, two popular soft computing models, when applied to credit scoring. Among the different criteria that can be used for comparison, accuracy, computational complexity, and processing time were selected to evaluate both models. Furthermore, the German credit scoring dataset, a real-world dataset, is used to train and test both models. Experimental results obtained from our study suggest that although both soft computing models can be used with a high degree of accuracy, Artificial Neural Networks deliver better results than Support Vector Machines.
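A minimal version of such a comparison in Python with scikit-learn might look as follows; it uses a synthetic dataset as a stand-in for the German credit data, and the hyperparameters are illustrative defaults rather than the paper's settings.

import time
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for the German credit dataset (1000 samples, 20 features).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X = StandardScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("SVM", SVC(kernel="rbf")),
                    ("ANN", MLPClassifier(hidden_layer_sizes=(32,),
                                          max_iter=1000, random_state=0))]:
    start = time.perf_counter()
    model.fit(X_tr, y_tr)          # training time is one of the compared criteria
    elapsed = time.perf_counter() - start
    print(f"{name}: accuracy = {model.score(X_te, y_te):.3f}, "
          f"training time = {elapsed:.2f}s")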