Abstract: Automatic reusability appraisal helps in evaluating the quality of developed or developing reusable software components and in identifying reusable components in existing legacy systems, which can save the cost of developing software from scratch. However, the question of how to identify reusable components in existing systems has remained relatively unexplored. In this research work, the structural attributes of software components are explored using software metrics, and the quality of the software is inferred by different neural-network-based approaches that take the metric values as input. The calculated reusability value makes it possible to identify good-quality code automatically. The reusability values obtained are found to be close to the manual analysis traditionally performed by programmers or repository managers, so the developed system can be used to enhance the productivity and quality of software development.
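The pipeline of feeding metric values into a trained network to obtain a reusability score can be sketched as follows. This is a minimal illustration, not the paper's model: the network shape, the three input metrics, and all weights are assumptions (a real system would train the weights against manually rated components).

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def reusability_score(metrics, w_hidden, w_out):
    # Forward pass of a tiny feedforward network:
    # hidden_j = sigmoid(W_h[j] . metrics), score = sigmoid(w_o . hidden)
    hidden = [sigmoid(sum(w * m for w, m in zip(row, metrics))) for row in w_hidden]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)))

# Illustrative (untrained) weights; the sigmoid keeps the score in (0, 1).
w_hidden = [[0.8, -0.5, 0.3], [-0.2, 0.9, -0.4]]
w_out = [1.2, -0.7]
metrics = [0.4, 0.6, 0.2]   # e.g. normalized complexity, coupling, size
score = reusability_score(metrics, w_hidden, w_out)
```

A score near 1 would flag a component as a good reuse candidate; thresholding it replaces the manual repository-manager judgment described above.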
Abstract: The Taiwan government has promoted the "Plain Landscape Afforestation and Greening Program" since 2002. A key task of the program was a payment for environmental services (PES) scheme, the "Plain Landscape Afforestation Policy" (PLAP), which was certificated by the Executive Yuan on August 31, 2001 and enacted on January 1, 2002. Under the policy, the total afforestation area was estimated to reach 25,100 hectares by December 31, 2007. By the end of 2007, after six years in force, the actual afforested area was 8,919.18 hectares. Of this, Taiwan Sugar Corporation (TSC) accounted for 7,960 hectares (including 2,450.83 hectares of public service area), or 86.22% of the total afforestation area, while private farmland promoted by local governments accounted for 869.18 hectares, or 9.75%. These figures show that most of the afforestation under this policy is carried out by TSC, and that TSC's achievement ratio is better than that of the other participants; the success of the PLAP is therefore closely tied to TSC's execution. The objective of this study is to analyze the policy planning relevant to TSC's participation in the PLAP, suggest complementary measures, and draw up effective adjustment mechanisms, so as to improve the effectiveness of executing the policy. Our main conclusions and suggestions are summarized as follows: 1. TSC's participation in the PLAP stems mainly from passive cooperation with the central government or from company policy; before participating in the PLAP, its lands were mainly used for growing sugarcane. 2. TSC's selection of tree species is based mainly on the suitability of the land and the species. The largest proportion of tree species is allocated to economic forests, and a lack of technical instruction was the main problem during afforestation; moreover, how to develop TSC's future leisure agriculture and landscape business becomes a key topic. 3. TSC has developed short- and long-term plans for future participation in the PLAP; however, it shows little willingness or incentive to budget for such detailed planning. 4. Most of the TSC interviewees consider the PLAP requirements unreasonable, with the requirement on the number of trees cited most often; furthermore, most interviewees suggested that the government should continue to provide incentives even after 20 years. 5. Since the government shares the same goals as TSC, there should be sufficient cooperation and communication to support technical instruction and reduce afforestation costs, which will also help to improve the effectiveness of the policy.
Abstract: This paper compares six approaches to object serialization from qualitative and quantitative aspects: object serialization in Java, IDL, XStream, Protocol Buffers, Apache Avro, and MessagePack. With each approach, a common example is serialized to a file and the size of the file is measured. The qualitative comparison investigates whether a schema definition is required, whether a schema compiler is required, whether serialization is ASCII- or binary-based, and which programming languages are supported. It becomes clear that there is no single best solution; each performs well in the context for which it was developed.
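The quantitative procedure, serializing one common record with each approach and comparing the resulting sizes, can be sketched with two Python stdlib serializers standing in for the six approaches the paper measures (the record and field names are illustrative):

```python
import json
import pickle

# A common example record, serialized with one text-based and one binary
# stdlib approach; the paper measures Java serialization, IDL, XStream,
# Protocol Buffers, Apache Avro, and MessagePack the same way.
record = {"id": 1234, "name": "sensor-7", "values": [1.5, 2.25, 3.0]}

text_bytes = json.dumps(record).encode("utf-8")   # ASCII-based, schema-free
binary_bytes = pickle.dumps(record)               # binary, schema-free

sizes = {"json": len(text_bytes), "pickle": len(binary_bytes)}
```

Measuring `sizes` for the same record across formats gives exactly the kind of size table the quantitative comparison reports; the qualitative axes (schema definition, schema compiler, text vs. binary, language support) are read off each format's documentation.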
Abstract: In the Equivalent Transformation (ET) computation model, a program is constructed by the successive accumulation of ET rules. A meta-computation method that generates correct ET rules has been proposed. Although the method covers a broad range of ET-rule generation, not all important ET rules are necessarily generated. More ET rules can be generated by supplementing the method with generation methods specialized for important ET rules. A Specialization-by-Equation (Speq) rule is one such important rule. A Speq rule describes a procedure in which two variables included in an atom conjunction are equalized due to predicate constraints. In this paper, we propose an algorithm that systematically and recursively generates Speq rules and discuss its effectiveness in the synthesis of ET programs. A Speq rule is generated based on the proof of a logical formula consisting of a given atom set and a disequality. The proof is carried out by utilizing some ET rules, and the rules ultimately obtained are used in generating Speq rules.
Abstract: The purpose of this study is to investigate the use of e-commerce websites in Indonesia as a developing country. E-commerce websites have been identified as having a significant impact on business activities, in particular by solving the geographical problem faced by island countries like Indonesia; the website is also identified as a crucial marketing tool. This study examines the effect of quality and features on the use of, and user satisfaction with, e-commerce websites. A survey of 115 undergraduate students of the Management Department at Andalas University attending the Management Information Systems (SIM) class was undertaken. The data obtained were analyzed by Structural Equation Modeling (SEM) using the SmartPLS program. The results show that system and information quality, features, and satisfaction influence the use of e-commerce websites in the Indonesian context.
Abstract: This paper describes the evolution of strategies to
evaluate ePortfolios in an online Master of Education (M.Ed.)
degree in Instructional Technology. The ePortfolios are required as a
culminating activity for students in the program. By using Web 2.0
tools to develop the ePortfolios, students are able to showcase their
technical skills, integrate national standards, demonstrate their
professional understandings, and reflect on their individual learning.
Faculty have created assessment strategies to evaluate student
achievement of these skills. To further develop ePortfolios as a tool
promoting authentic learning, faculty are moving toward integrating
transparency as part of the evaluation process.
Abstract: In this paper, a functional interpretation of quantum theory (QT), with emphasis on quantum field theory (QFT), is proposed. Besides the usual statements on the relations between a function's initial state and final state, a functional interpretation also contains a description of the dynamic evolution of the function; that is, it describes how things function. The proposed functional interpretation of QT/QFT has been developed in the context of the author's work towards a computer model of QT with the goal of supporting the largest possible scope of QT concepts. In the course of this work, the author encountered a number of problems inherent in the translation of quantum physics into a computer program. He came to the conclusion that the goal of supporting the major QT concepts can only be satisfied if the present model of QT is supplemented by a "functional interpretation" of QT/QFT. The paper describes a proposal for such an interpretation.
Abstract: Postgraduate education is generally aimed at providing in-depth knowledge and understanding, including the general philosophy of the world sciences, management, technologies, applications, and other elements closely related to specific areas. In most universities, besides core and non-core subjects, a thesis is one of the requirements a postgraduate student must accomplish before graduating. This paper reports an empirical investigation into the attributes associated with the obstacles to thesis accomplishment among postgraduate students. Using a quantitative approach, the experiences of postgraduate students were tapped. The findings clearly revealed that information seeking, writing skills, and other factors, relating in particular to the supervisor and to time management, are contributory factors that positively or negatively influence postgraduates' thesis accomplishment. Among these, the writing-skills dimensions were found to be the most difficult part of thesis accomplishment, compared with information seeking and the other factors. This pessimistic indication has implications not only for students but also for supervisors and institutions as a whole.
Abstract: Persuasive technology has been applied in marketing, health, environmental conservation, safety, and other domains, and has been found to be quite effective in changing people's attitudes and behaviours. This research extends the application domains of persuasive technology to information security awareness and uses a theory-driven approach to evaluate the effectiveness of a web-based program, developed on the principles of persuasive technology, in improving the information security awareness of end users. The findings confirm a very strong effect of the web-based program in raising users' attitudes towards information-security-aware behaviour. This finding is useful to IT researchers and practitioners in developing appropriate and effective education strategies for improving the information security attitudes of end users.
Abstract: Software development has experienced remarkable progress in the past decade. However, due to the rising complexity and magnitude of projects, development productivity has not improved consistently. By analyzing the latest ISBSG data repository, which contains 4,106 projects, we discovered that software development productivity actually underwent irregular variations between 1995 and 2005. Considering the factors significant to productivity, we found that these variations are caused primarily by variations in average team size and by the disproportionate use of the less productive 3GL languages.
Abstract: Careful design and selection of daylighting systems can greatly help in reducing not only artificial lighting use but also cooling energy consumption, and therefore offers potential for downsizing air-conditioning systems. This paper evaluates the energy performance of two types of top-light daylighting systems, integrating daylight with artificial lighting, in an existing examination hall at University Kebangsaan Malaysia under a hot and humid climate. Computer simulation models were created for the building case study (base case) and for the two top-light daylighting designs, and their energy performance was evaluated using the VisualDOE 4.0 building energy simulation program. The findings reveal that daylighting through top-light systems is a very beneficial design strategy for reducing both annual lighting energy consumption and overall total annual energy consumption.
Abstract: In this paper, an automatic QRS-complex detection algorithm was applied to analyze ECG recordings, and five criteria for diagnosing dangerous arrhythmias were applied in a prototype automatic arrhythmia diagnosis system. The detection algorithm locates the distribution of QRS complexes in the ECG recordings and derives related information such as heart rate and RR interval. In this investigation, twenty sampled ECG recordings of patients with different pathologic conditions were collected for off-line analysis. As pre-processing, a combination of four digital filters was proposed to improve the ECG signals and raise the QRS detection rate; both hardware and digital filters were applied to eliminate the different types of noise mixed with the ECG recordings. The automatic detection algorithm was then applied to verify the distribution of QRS complexes. Finally, quantitative clinical criteria for diagnosing arrhythmia were programmed into a practical application for automatic arrhythmia diagnosis as a post-processor. The diagnoses produced by the automatic dangerous-arrhythmia system were compared with off-line diagnoses by experienced clinical physicians; the comparison showed a matching rate of 95% against an experienced physician's diagnoses.
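The chain from QRS detection to RR intervals and heart rate can be sketched as follows. This is a toy stand-in, not the paper's algorithm: the detector here is a bare threshold-plus-local-maximum rule, the sampling rate and threshold are assumptions, and the four-filter pre-processing stage is omitted.

```python
def detect_qrs(signal, threshold):
    """Return sample indices of QRS peaks: local maxima above a fixed
    threshold (a simplification of the paper's filtered detector)."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > threshold and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]:
            peaks.append(i)
    return peaks

def rr_and_heart_rate(peaks, fs):
    """RR intervals in seconds between successive peaks, and mean heart
    rate in beats per minute."""
    rr = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    if not rr:
        return rr, None
    return rr, 60.0 / (sum(rr) / len(rr))

fs = 250                                   # sampling rate in Hz (assumed)
signal = [0.0] * 1000
for k in (100, 350, 600, 850):             # synthetic R peaks, 1 s apart
    signal[k] = 1.0
peaks = detect_qrs(signal, threshold=0.5)
rr, bpm = rr_and_heart_rate(peaks, fs)     # rr == [1.0, 1.0, 1.0], bpm == 60.0
```

The diagnostic post-processor described above would then apply the five clinical criteria to quantities like `rr` and `bpm`.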
Abstract: The objective of the current study is to investigate the differences between winning and losing teams in terms of goal scoring and passing sequences. A total of 31 matches from UEFA EURO 2012 were analyzed; 5 matches were excluded from the analysis because they ended in a draw. Two groups of variables were used in the study: i. goal-scoring variables and ii. passing-sequence variables. Data were analyzed using the Wilcoxon matched-pairs rank test with significance set at p < 0.05. The study found that the timing of goals scored was significantly higher for the winning team in both the 1st half (Z=-3.416, p=.001) and the 2nd half (Z=-3.252, p=.001). Scoring frequency was also found to increase as time progressed, with the last 15 minutes of the game being the interval in which the most goals were scored. The indicators that differed significantly between winning and losing teams were goals scored (Z=-4.578, p=.000), headers (Z=-2.500, p=.012), right-foot goals (Z=-3.788, p=.000), corners (Z=-2.126, p=.033), open play (Z=-3.744, p=.000), goals inside the penalty box (Z=-4.174, p=.000), attackers (Z=-2.976, p=.003), and midfielders (Z=-3.400, p=.001). Regarding passing sequences, there was a significant difference between the teams in short passing sequences (Z=-4.141, p=.000), while for long passing there was no significant difference (Z=-1.795, p=.073). The data gathered in the present study can be used by coaches to construct detailed training programs based on their objectives.
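The Z values above come from the Wilcoxon matched-pairs signed-rank test; a textbook sketch of its normal-approximation statistic is below. This is an illustration, not the study's software: it drops zero differences and averages tied ranks but applies no further corrections, and the sample data are made up.

```python
import math

def wilcoxon_z(x, y):
    """Normal-approximation Z for the Wilcoxon matched-pairs signed-rank test."""
    d = [a - b for a, b in zip(x, y) if a != b]   # drop zero differences
    n = len(d)
    order = sorted(range(n), key=lambda i: abs(d[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:                                  # average ranks over ties in |d|
        j = i
        while j + 1 < n and abs(d[order[j + 1]]) == abs(d[order[i]]):
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for r, v in zip(ranks, d) if v > 0)
    mu = n * (n + 1) / 4.0
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24.0)
    return (w_plus - mu) / sigma

z = wilcoxon_z([5, 6, 7, 8, 9], [1, 2, 3, 4, 4])   # all differences positive
```

With every paired difference positive, `w_plus` is maximal and Z is strongly positive; symmetric differences give Z near zero, which is how the reported values discriminate winning from losing teams.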
Abstract: The BRI-STARS (BRIdge Stream Tube model for Alluvial River Simulation) program was used to investigate the scour depth around bridge piers in some of the major river systems in Iran. Model calibration was performed by collecting different sets of field data. The field data are catalogued into three categories: a first group of bridges whose river beds are formed of fine material, a second group whose river beds are formed of sand, and finally bridges whose river beds are formed of gravel or cobble material. Verification was performed with field data from Fars Province. The results show that for wide piers the computed scour depth exceeds the measured one. In gravel-bed streams the computed scour depth is also greater than the measured scour depth, owing to the formation of an armor layer on the channel bed; once this layer is eroded, the computed scour depth is close to the measured one.
Abstract: The aim of the current work is to present a comparison among three popular optimization methods for the inverse elastostatics problem (IESP) of flaw detection within a solid. In more detail, the performance of a simulated annealing, a Hooke & Jeeves, and a sequential quadratic programming algorithm was studied in the test case of one circular flaw in a plate, solved by both the boundary element method (BEM) and the finite element method (FEM). The optimization methods use a cost function based on the displacements of the static response. The methods were ranked according to the number of iterations required to converge and their ability to locate the global optimum, giving a clear impression of the performance of these algorithms in flaw identification problems. Furthermore, the coupling of BEM or FEM with these optimization methods was investigated in order to track differences in their performance.
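Of the three optimizers compared, Hooke & Jeeves is the simplest to sketch. The version below is a simplified direct search (exploratory moves plus a pattern move, with mesh shrinking), and the quadratic objective is an illustrative stand-in; in the paper the cost function is built from the measured static displacements of the flawed plate.

```python
def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6):
    """Simplified Hooke & Jeeves direct search (minimization)."""
    def explore(base):
        # exploratory moves: try +/- step along each coordinate
        x = list(base)
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                if f(trial) < f(x):
                    x = trial
                    break
        return x

    x = list(x0)
    while step > tol:
        new = explore(x)
        if f(new) < f(x):
            # pattern move: jump further along the improving direction
            pattern = [2.0 * n - o for n, o in zip(new, x)]
            x = pattern if f(pattern) < f(new) else new
        else:
            step *= shrink   # no improvement: refine the mesh
    return x

# Illustrative objective with minimum at (1, -2), standing in for the
# displacement-based cost function of the flaw-identification problem.
best = hooke_jeeves(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2, [0.0, 0.0])
```

Derivative-free searches like this are attractive for IESP because the cost function is evaluated through a BEM or FEM solve and gradients are expensive; the trade-off, which the ranking in the paper quantifies, is the number of iterations needed to converge.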
Abstract: In this paper, we describe a rule-based message passing method to support the development of collaborative applications in which multiple users share resources in distributed environments. Message communications in collaborative environments tend to be very complex because of the need to manage context situations such as sharing events, user access control, and network places. We propose a message communication method, based on unification as used in artificial intelligence and logic programming, for defining rules over such context information in a procedural object-oriented programming language. We also present an implementation of the method as Java classes.
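A rule-based dispatch of this kind can be pictured as matching incoming messages against rule patterns containing logic variables. The sketch below is Python rather than the paper's Java classes, and the `?var` convention, the rule tuples, and the sharing example are all invented for illustration.

```python
def match(pattern, message, bindings=None):
    """Match a message tuple against a pattern tuple; strings beginning
    with '?' are variables that bind to the corresponding message term."""
    bindings = dict(bindings or {})
    if len(pattern) != len(message):
        return None
    for p, m in zip(pattern, message):
        if isinstance(p, str) and p.startswith("?"):
            if p in bindings and bindings[p] != m:
                return None                # same variable, conflicting value
            bindings[p] = m
        elif p != m:
            return None
    return bindings

def dispatch(rules, message):
    """Fire the action of the first rule whose pattern matches the message."""
    for pattern, action in rules:
        b = match(pattern, message)
        if b is not None:
            return action(b)
    return None

rules = [
    (("share", "?user", "?resource"),
     lambda b: f"grant {b['?user']} access to {b['?resource']}"),
]
result = dispatch(rules, ("share", "alice", "whiteboard"))
# result == "grant alice access to whiteboard"
```

Rules for access control or network places would be further pattern/action pairs appended to `rules`, keeping the context-handling logic declarative.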
Abstract: This paper presents an efficient VLSI architecture design to achieve real-time video processing using the Full-Search Block Matching (FSBM) algorithm. The design employs a parallel bank architecture with minimum latency, maximum throughput, and full hardware utilization. We use nine parallel processors in our architecture, each controlled by a state machine. The state-machine control implementation makes the design very simple and cost-effective. The design is implemented in VHDL, and the programming techniques we incorporated make the design completely programmable, in the sense that the search ranges and block sizes can be varied to suit any given requirements. The design can operate at frequencies up to 36 MHz, and it can process QCIF and CIF video resolutions at 1.46 MHz and 5.86 MHz, respectively.
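The behavior that the nine-processor hardware parallelizes is the exhaustive search itself, sketched here as reference software: minimize the sum of absolute differences (SAD) over all displacements within a search range. Frame contents, block position, and ranges are illustrative, and valid frame bounds are assumed.

```python
def sad(block_a, block_b):
    # sum of absolute differences between two equally sized blocks
    return sum(abs(a - b) for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def full_search(ref, cur, bx, by, n, p):
    """Exhaustive FSBM: best (cost, dx, dy) within +/-p for the n x n
    block of the current frame at (bx, by), matched against the
    reference frame."""
    def block(frame, x, y):
        return [row[x:x + n] for row in frame[y:y + n]]
    target = block(cur, bx, by)
    best = None
    for dy in range(-p, p + 1):
        for dx in range(-p, p + 1):
            cost = sad(block(ref, bx + dx, by + dy), target)
            if best is None or cost < best[0]:
                best = (cost, dx, dy)
    return best

# Identical frames: the best motion vector is (0, 0) with zero cost.
ref = [[10 * y + x for x in range(10)] for y in range(10)]
cost, dx, dy = full_search(ref, ref, 4, 4, 2, 2)
```

Because every displacement is evaluated independently, the (2p+1)^2 SAD computations partition naturally across parallel processors, which is what the programmable search range and block size in the VHDL design expose.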
Abstract: IPsec has now become a standard information security technology throughout the Internet society. It provides a well-defined architecture that takes into account confidentiality, authentication, integrity, secure key exchange, and protection against replay attacks. For connectionless security services on a per-packet basis, the IETF IPsec Working Group has standardized two extension headers (AH and ESP) together with key exchange and authentication protocols. It is also working on a lightweight key exchange protocol and MIBs for security management. IPsec has been implemented on various platforms in IPv4 and IPv6, gradually replacing old application-specific security mechanisms. IPv4 and IPv6 are not directly compatible, so programs and systems designed for one standard cannot communicate with those designed for the other. We propose the design and implementation of a controlled Internet security system, an IPsec-based Internet information security system for IPv4/IPv6 networks, and we present performance measurement data. With IPv6 features such as improved scalability and routing, security, ease of configuration, and higher performance, the controlled Internet security system provides a consistent security policy and integrated security management on an IPsec-based Internet security system.
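The per-packet services mentioned above hang off small fixed headers. For ESP (RFC 4303), each protected payload is preceded by a 32-bit Security Parameters Index identifying the security association and a 32-bit sequence number used by the anti-replay service; a sketch of packing that header is below (the SPI value is illustrative).

```python
import struct

def pack_esp_header(spi, seq):
    """Pack the ESP header of RFC 4303: SPI (32 bits) followed by the
    anti-replay sequence number (32 bits), in network byte order."""
    return struct.pack("!II", spi, seq)

hdr = pack_esp_header(0x00001234, 1)       # first packet on this SA
spi, seq = struct.unpack("!II", hdr)       # receiver's view of the header
```

On receipt, the SPI selects the security association (keys, algorithms) and the sequence number is checked against a sliding anti-replay window before the payload is authenticated and decrypted.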
Abstract: As future industries pursue higher speed and resolution through various developments in robotics and precise control systems, the concept of control feedback is becoming more important; across a range of industrial developments, it is largely responsible for the high reliability of a device. We explain an efficient method for analyzing rotary encoders, both incremental-type and absolute-type, using the LabVIEW program.
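An incremental encoder emits two quadrature signals, A and B, 90 degrees out of phase; counting valid state transitions yields position and direction. A software sketch of that decoding is below; the transition table's sign convention and the synthetic samples are assumptions, and a LabVIEW analysis would perform the equivalent counting in hardware-timed loops.

```python
# Map (previous AB state, new AB state) -> count delta (+1 one direction,
# -1 the other); pairs not listed are repeats or invalid (glitch) jumps.
TRANSITIONS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def decode(samples):
    """Return the net quadrature count for a sequence of (A, B) samples."""
    count = 0
    prev = samples[0][0] << 1 | samples[0][1]
    for a, b in samples[1:]:
        state = a << 1 | b
        count += TRANSITIONS.get((prev, state), 0)   # ignore repeats/glitches
        prev = state
    return count

# One full quadrature cycle forward (4 counts), then the same cycle reversed.
forward = [(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]
backward = list(reversed(forward))
```

Each full cycle contributes 4 counts (so-called x4 decoding), which is why an N-line incremental encoder resolves 4N positions per revolution.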
Abstract: By systematically applying different engineering methods, difficult financial problems become approachable. Using a combination of theory and techniques such as the wavelet transform, time-series data mining, Markov-chain-based discrete stochastic optimization, and evolutionary algorithms, this work formulated a strategy to characterize and forecast non-linear time series. It extracted typical features from the volatility data sets of the S&P 100 and S&P 500 indices, which include abrupt drops, jumps, and other non-linearities. As a result, forecasting accuracy reached an average of over 75%, surpassing any other publicly available results on the forecasting of financial indices.
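The role of the wavelet transform in this strategy, separating a smooth trend from abrupt drops and jumps, can be illustrated with one level of the Haar transform. This is a generic sketch, not the study's feature extractor; the input series is synthetic.

```python
import math

def haar_level(x):
    """One level of the Haar wavelet transform for an even-length series:
    approximation coefficients carry the smooth trend, detail coefficients
    carry local differences such as abrupt drops and jumps."""
    s = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return approx, detail

series = [2.0, 2.0, 2.0, 10.0]            # a jump between samples 3 and 4
approx, detail = haar_level(series)
# detail[0] == 0.0 (flat segment); the large |detail[1]| flags the jump
```

Thresholding the detail coefficients is one standard way to turn such a decomposition into the "abrupt drop / jump" features that downstream models (here, Markov-chain optimization and evolutionary algorithms) consume.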