Abstract: As our information society advances, various risks are becoming increasingly common and are causing multiple social problems. For this reason, risk communication for establishing consensus among stakeholders who have different priorities has become important. However, it is not always easy for decision makers to agree on measures to reduce risks when those measures rest on opposing concerns, such as security, privacy, and cost. Therefore, we previously developed and proposed the "Multiple Risk Communicator" (MRC), with the following functions: (1) modeling support for the role of the risk specialist, (2) an optimization engine, and (3) display of the computed results. In this paper, MRC program version 1.0 is applied to the personal information leakage problem, and the application process and validation of the results are discussed.
Abstract: Climate change has severe effects on natural habitats, especially wetlands, and these challenges require that habitat management be adapted to the probable effects of climate change. A compilation of the necessary changes in land management was made for a Hungarian area that is both a national park and a Natura 2000 SAC and SCI site, with the aim of increasing resilience and reducing vulnerability. Ecological aspects, nature conservation, and climatic adaptation should be combined with social and economic factors when developing climate-change-adapted management for vulnerable wetlands. The planning of adaptive management should be guided by a priority order of conservation aims and by an evaluation of factors at the chosen planning unit. Mowing techniques, frequency, and exact dates should be considered, as should the grazing species and their breeds, owing to their different grazing, group-forming, and trampling habits. Integrating landscape history and historical land development into the planning process is essential.
Abstract: For a spatiotemporal database management system, the I/O cost of queries and other operations is an important performance criterion. To optimize this cost, intense research on designing robust index structures has been conducted over the past decade. Beyond these major considerations, there are still other design issues that deserve attention because of their direct impact on I/O cost; in particular, an efficient buffer management strategy plays a key role in reducing redundant disk accesses. In this paper, we propose an efficient buffer strategy for a spatiotemporal database index structure, specifically one indexing objects moving over a road network. The proposed strategy, named MONPAR, is based on the nature of the data (i.e., spatiotemporal data) and the structure of the index. For the experimental evaluation, we set up a simulation environment that counts the number of disk accesses while executing a number of spatiotemporal range queries over the index. We repeated the simulations with query sets of different distributions, such as a uniform query distribution and a skewed query distribution. Based on a comparison of our strategy with well-known page-replacement techniques, such as LRU-based and Priority-based buffers, we conclude that MONPAR outperforms its competitors for small and medium-sized buffers under all query distributions used.
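The abstract does not give MONPAR's replacement rule, but the LRU baseline it is compared against can be sketched in a few lines; the page IDs, capacity, and miss counting below are illustrative assumptions only.

```python
from collections import OrderedDict

class LRUBuffer:
    """Minimal LRU page buffer: on a miss the page is fetched from
    disk, and the least recently used page is evicted when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()  # page_id -> cached page (payload omitted)
        self.disk_accesses = 0

    def access(self, page_id):
        if page_id in self.pages:
            self.pages.move_to_end(page_id)      # hit: mark most recent
        else:
            self.disk_accesses += 1              # miss: one disk access
            if len(self.pages) >= self.capacity:
                self.pages.popitem(last=False)   # evict the LRU page
            self.pages[page_id] = None

buf = LRUBuffer(capacity=3)
for page in [1, 2, 3, 1, 4, 2]:
    buf.access(page)
print(buf.disk_accesses)  # → 5 (three cold misses, then 4 and 2 miss)
```

A strategy like MONPAR would replace the eviction choice in `access` with one informed by the spatiotemporal index structure rather than pure recency.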
Abstract: In the current context of globalization, accountability has become a key subject of real interest in both national and international business, owing to the need for comparability and transparency of the economic situation; we can therefore speak of the harmonization and convergence of international accounting. The paper presents qualitative research based on a content analysis of several reports concerning the roadmap for convergence. First, we develop a conceptual framework for the evolution of standards convergence, and then we discuss the degree of harmonization and convergence between US GAAP and IAS/IFRS as of October 2012. We find that most topics did not follow the expected progress. Furthermore, some differences remain in long-term projects that are still in the process of being completed, while others were reassessed as lower-priority projects.
Abstract: The Taiwan government has promoted the "Plain Landscape Afforestation and Greening Program" since 2002. A key task of the program was the payment for environmental services (PES) scheme entitled the "Plain Landscape Afforestation Policy" (PLAP), which was approved by the Executive Yuan on August 31, 2001 and enacted on January 1, 2002. According to the policy, the total area of afforestation was projected to reach 25,100 hectares by December 31, 2007. By the end of 2007, the policy had been in force for six years and the actual afforested area was 8,919.18 hectares. Of this, Taiwan Sugar Corporation (TSC) accounted for 7,960 hectares (including 2,450.83 hectares of public service area), or 86.22% of the total afforestation area; private farmland promoted by local governments accounted for 869.18 hectares, or 9.75% of the total. From the above we observe that most of the afforestation under this policy is carried out by TSC, and that TSC's achievement ratio is better than that of the others, which implies that the success of the PLAP is closely tied to TSC's execution. The objective of this study is to analyze the policy planning relevant to TSC's participation in the PLAP, suggest complementary measures, and draw up effective adjustment mechanisms so as to improve the effectiveness of policy execution. Our main conclusions and suggestions are summarized as follows: 1. The main reason for TSC's participation in the PLAP is passive cooperation with the central government or with company policy; prior to its participation, TSC's lands were mainly used for growing sugarcane. 2. The main factors in TSC's selection of tree species are the suitability of the land and of the species. The largest proportion of tree species is allocated to economic forests, and a lack of technical instruction was the main problem during afforestation; moreover, how to improve TSC's future development in leisure agriculture and the landscape business becomes a key topic. 3. TSC has developed short- and long-term plans for future participation in the PLAP; however, there is little willingness or incentive to budget for such detailed planning. 4. Most TSC interviewees consider the requirements of the PLAP unreasonable; among these, the requirement on the number of trees was cited most often, and most interviewees suggested that the government should continue to provide incentives even after 20 years. 5. Since the government shares the same goals as TSC, there should be sufficient cooperation and communication, supporting technical instruction and reducing afforestation cost, which will also help to improve the effectiveness of the policy.
Abstract: In this paper, a fast high-resolution range profile (HRRP) synthesis algorithm called orthogonal matching pursuit with a sensing dictionary (OMP-SD) is proposed. It formulates traditional HRRP synthesis as a sparse approximation problem over a redundant dictionary. Because it exploits the prior knowledge that the synthetic range profile (SRP) of a target is sparse, the SRP can be recovered even in the presence of data loss. Moreover, introducing the sensing dictionary (SD) reduces the computational complexity from O(MNDK) flops for OMP to O(M(N + D)K) flops for OMP-SD. Simulation experiments illustrate its advantages in both additive white Gaussian noise (AWGN) and noiseless settings.
Abstract: Neural networks are well known for their ability to model nonlinear functions but, as statistical methods usually do, they take a nonparametric approach; as a result, neither a priori nor a posteriori knowledge is easy to take into account. To deal with these problems, an original way of encoding knowledge inside the network architecture is proposed. The method is applied to the problem of evapotranspiration in a karstic aquifer, a problem of great practical importance for managing water resources.
Abstract: In this paper, a new learning algorithm based on a hybrid metaheuristic integrating Differential Evolution (DE) and Reduced Variable Neighborhood Search (RVNS) is introduced to train the classification method PROAFTN. To apply PROAFTN, the values of several parameters need to be determined prior to classification; these parameters include the interval boundaries and relative weights for each attribute. Based on these requirements, the hybrid approach, named DEPRO-RVNS, is presented in this study. A major problem when applying DE to some classification problems is the premature convergence of some individuals to local optima. To eliminate this shortcoming and to improve the exploration and exploitation capabilities of DE, such individuals are iteratively re-explored using RVNS. Based on the results generated on both training and testing data, it is shown that the performance of PROAFTN is significantly improved. Furthermore, the experimental study shows that DEPRO-RVNS outperforms well-known machine learning classifiers on a variety of problems.
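The abstract does not give DEPRO-RVNS's exact update rules, so the following is only a minimal DE/rand/1/bin sketch with a crude uniform re-sampling step standing in for the RVNS re-exploration of stagnant individuals; the sphere objective, bounds, and control parameters are assumptions for illustration.

```python
import numpy as np

def de_with_restart(f, lo, hi, pop_size=20, F=0.8, CR=0.9,
                    gens=200, stall=30, seed=0):
    """DE/rand/1/bin with a crude anti-stagnation step: any individual
    (other than the current best) that has not improved for `stall`
    generations is re-sampled uniformly from the search box."""
    rng = np.random.default_rng(seed)
    dim = len(lo)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([f(x) for x in pop])
    stale = np.zeros(pop_size, dtype=int)
    for _ in range(gens):
        for i in range(pop_size):
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)   # mutation
            mask = rng.random(dim) < CR                 # binomial crossover
            mask[rng.integers(dim)] = True              # keep >= 1 mutant gene
            trial = np.where(mask, mutant, pop[i])
            f_trial = f(trial)
            if f_trial < fit[i]:
                pop[i], fit[i], stale[i] = trial, f_trial, 0
            else:
                stale[i] += 1
                if stale[i] >= stall and fit[i] > fit.min():
                    pop[i] = rng.uniform(lo, hi, dim)   # re-explore
                    fit[i], stale[i] = f(pop[i]), 0
    best = int(np.argmin(fit))
    return pop[best], fit[best]

# Minimize the 3-D sphere function (optimum 0 at the origin).
x_best, f_best = de_with_restart(lambda v: float(np.sum(v * v)),
                                 lo=np.full(3, -5.0), hi=np.full(3, 5.0))
print(f_best)
```

The real algorithm applies a structured neighborhood search (RVNS) to stagnant individuals instead of the uniform re-sampling shown here.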
Abstract: Group contribution methods such as UNIFAC are very useful to researchers and engineers involved in the synthesis, feasibility studies, design, and optimization of separation processes. They can be applied successfully to predict phase equilibrium and excess properties in the development of chemical and separation processes. The main focus of this work was to investigate the possibility of absorbing selected volatile organic compounds (VOCs) into polydimethylsiloxane (PDMS) using three selected UNIFAC group contribution methods. Absorption followed by subsequent stripping is the predominant available technology for abating VOCs in flue gases prior to their release into the atmosphere. The original, modified, and effective UNIFAC models were used in this work. The thirteen VOCs considered in this research are pentane, hexane, heptane, trimethylamine, toluene, xylene, cyclohexane, butyl acetate, diethyl acetate, chloroform, acetone, ethyl methyl ketone, and isobutyl methyl ketone. The computation was done for a solute VOC concentration of 8.55x10^-8, which is well within the infinite dilution region. The results obtained in this study compare very well with those published in the literature, obtained through both measurements and predictions. The phase equilibria obtained in this study show that PDMS is a good absorbent for the removal of VOCs from contaminated air streams through physical absorption.
Abstract: Oilsands bitumen is an extremely important source of
energy for North America. However, due to the presence of large
molecules such as asphaltenes, the density and viscosity of the
bitumen recovered from these sands are much higher than those of
conventional crude oil. As a result, the extracted bitumen has to be
diluted with expensive solvents, or thermochemically upgraded in
large, capital-intensive conventional upgrading facilities prior to
pipeline transport. This study demonstrates that globally abundant
natural zeolites such as clinoptilolite from Saint Clouds, New Mexico
and Ca-chabazite from Bowie, Arizona can be used as very effective
reagents for cracking and visbreaking of oilsands bitumen. Natural
zeolite cracked oilsands bitumen products are highly recoverable (up
to ~ 83%) using light hydrocarbons such as pentane, which indicates
substantial conversion of heavier fractions to lighter components.
The resultant liquid products are much less viscous and have a lighter product distribution than those produced by purely thermal treatment. These natural minerals have a similar effect on industrially extracted Athabasca bitumen.
Abstract: Technology transfer is a common method for
companies to acquire new technology and presents both challenges
and substantial benefits. In some cases, especially in developing
countries, the mere possession of technology does not guarantee a
competitive advantage if the appropriate infrastructure is not in place.
In this paper, we identify the localization factors needed to provide a
better understanding of the conditions necessary for localization in
order to benefit from future technology developments. Our
theoretical and empirical analyses allow us to identify several factors in the technology transfer process that affect localization and provide leverage in enhancing capabilities and absorptive capacity. The impact factors are categorized into groups covering government, firms, institutes, and the market, and are verified through an empirical survey of a technology transfer experience. Moreover, statistical analysis has allowed a deeper understanding of the importance of each factor and has enabled each group to prioritize its organizational policies so as to localize its technology effectively.
Abstract: Anti-money laundering is commonly recognized as a
set of procedures, laws or regulations designed to reduce the practice
of generating income through illegal actions. In Malaysia, the
government and law enforcement agencies have stepped up their
capacities and efforts to curb money laundering since 2001. One of
these measures was the enactment of the Anti-Money Laundering
Act (AMLA) in 2001. The costs of implementing anti-money laundering requirements (AMLR) can be burdensome for those involved in enforcing them. The objective of this paper is to explore the perceived effectiveness of the AMLR from the enforcement agencies' perspective. This is a preliminary study whose findings will help to give direction to further AML research in Malaysia. In addition, the results of this study provide empirical evidence on the
perceived effectiveness of AMLR prior to further investigations on
barriers and improvements of the implementation of the anti-money
laundering regime in Malaysia.
Abstract: Prior research has shown that unimodal biometric systems suffer from several drawbacks, such as noisy data, intra-class variations, restricted degrees of freedom, non-universality, spoof attacks, and unacceptable error rates. For a biometric system to be more secure and to achieve high accuracy, more than one form of biometric is required; hence the need arises for multimodal biometrics, using combinations of different biometric modalities. This paper introduces a multimodal biometric system (MMBS) based on the fusion of whole dorsal hand geometry and fingerprints, which acquires right and left (Rt/Lt) near-infrared (NIR) dorsal hand geometry (HG) shapes and (Rt/Lt) index and ring fingerprints (FP). A database of 100 volunteers was acquired using the designed prototype. The acquired images were of good quality for feature and pattern extraction across all modalities. HG features based on the anatomical landmarks of the hand shape were extracted, and robust, fast algorithms were used for FP minutia-point feature extraction and matching. Feature vectors belonging to similar biometric traits were fused using feature-fusion methodologies, and scores obtained from the different biometric-trait matchers were fused using the Min-Max transformation-based score-fusion technique. The final normalized scores were merged using the sum-of-scores method to obtain a single decision about the personal identity based on multiple independent sources. The high individuality of the fused traits and the user acceptability of the designed system, along with its high performance on experimental biometric measures, show that this MMBS can be considered for medium-to-high-security biometric identification purposes.
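The Min-Max normalization and sum-of-scores fusion steps named above are standard and can be sketched as follows; the raw matcher scores are hypothetical.

```python
def min_max_normalize(scores):
    """Min-Max transformation: map raw matcher scores onto [0, 1]
    (assumes the scores are not all identical)."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def fuse_sum(score_lists):
    """Normalize each matcher's scores, then sum them per candidate."""
    normalized = [min_max_normalize(s) for s in score_lists]
    return [sum(column) for column in zip(*normalized)]

# Hypothetical raw scores from two matchers over three candidate identities.
hand_geometry_scores = [120.0, 80.0, 100.0]
fingerprint_scores = [0.9, 0.2, 0.6]
fused = fuse_sum([hand_geometry_scores, fingerprint_scores])
print(fused.index(max(fused)))  # → 0: identity 0 wins on both matchers
```

Normalizing first matters because the two matchers report scores on very different scales (here, roughly 80-120 versus 0-1); summing raw scores would let one matcher dominate.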
Abstract: The internet is constantly expanding. Identifying web links of interest in a web browser requires users to visit each listed link individually until a satisfactory one is found; users therefore have to evaluate a considerable number of links before finding their link of interest, which can be tedious and even unproductive. By incorporating web assistance, web users could benefit from reduced time spent searching relevant websites. In this paper, a rough set approach is presented that facilitates classification of the unlimited available e-vocabulary to assist web users in reducing the time spent searching for relevant web sites. The approach includes two methods for identifying relevant data on web links, based on priority and on percentage of relevance. These methods generate a list of web sites in priority sequence, with an emphasis on the search criteria.
Abstract: Our goal is to effectively increase the number of boats on the river during a six-month period. The main factors determining the number of boats are trip duration and the "select the priority trip" rule. In the microcosmic simulation model, the best result is obtained for 4 to 24 nights with DSCF: 812 boats, 9.0% more than the second-best result. By contrast, for 6 to 18 nights with FCFS, the number of boats is 31.6% less than the best result. In the discrete duration model, for 6 to 18 nights the number of boats increases to 848, 29.7% more than the best result of model I for the same time range. Moreover, for 4 to 24 nights the number of boats increases to 1194, 47.0% more than the best result of model I for the same time range.
Abstract: In this study, a novel approach to image embedding is introduced. The proposed method consists of three main steps. First, the edges of the image are detected using Sobel mask filters. Second, the least significant bit (LSB) of each pixel is used. Finally, gray-level connectivity is applied using a fuzzy approach, and the ASCII code is used for information hiding. The bit adjacent to the LSB represents the edge image after gray-level connectivity, and the remaining six bits represent the original image with very little difference in contrast. The proposed method embeds three images in one image and includes, as a special case of data embedding, the hiding, identification, and authentication of text embedded within digital images. Image embedding can also be regarded as a good compression method in terms of conserving memory space. Moreover, information hiding within a digital image can be used for secure information transfer. The creation and extraction of the three embedded images, and the hiding of text information, are discussed and illustrated in the following sections.
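The LSB step of the scheme above can be illustrated in isolation; the Sobel edge detection and fuzzy gray-level connectivity stages are omitted, and the toy flattened 8x8 cover image is an assumption.

```python
def embed_text_lsb(pixels, text):
    """Hide ASCII text in the least significant bits of a flattened
    grayscale image: one bit per pixel, 8 pixels per character."""
    bits = [(ord(c) >> i) & 1 for c in text for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for message")
    # Clear each carrier pixel's LSB and write one message bit into it.
    return [(p & ~1) | b for p, b in zip(pixels, bits)] + pixels[len(bits):]

def extract_text_lsb(pixels, n_chars):
    """Read n_chars ASCII characters back out of the LSBs."""
    chars = []
    for i in range(n_chars):
        byte = 0
        for p in pixels[i * 8:(i + 1) * 8]:
            byte = (byte << 1) | (p & 1)
        chars.append(chr(byte))
    return "".join(chars)

cover = list(range(64))             # toy 8x8 grayscale image, flattened
stego = embed_text_lsb(cover, "Hi")
print(extract_text_lsb(stego, 2))                     # → Hi
print(max(abs(a - b) for a, b in zip(cover, stego)))  # → 1
```

Because only the lowest bit of each carrier pixel changes, no pixel value moves by more than one gray level, which is why the distortion is visually negligible.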
Abstract: A Simultaneous Multithreading (SMT) processor is capable of executing instructions from multiple threads in the same cycle. SMT was in fact introduced as a powerful extension of superscalar architectures to increase processor throughput. Simultaneous multithreading is a technique that permits multiple instructions from multiple independent applications or threads to compete for limited resources each cycle. Since the fetch unit has been identified as one of the major bottlenecks of the SMT architecture, several fetch schemes have been proposed in prior work to enhance fetch efficiency and overall performance.
In this paper, we propose a novel fetch policy called the queue situation identifier (QSI), which counts certain long-latency instructions for each thread every cycle and then selects which threads to fetch from in the next cycle. Simulation results show that in the best case our fetch policy achieves a 30% speedup and can also reduce the level-1 data cache miss rate.
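The selection rule described above can be sketched as follows; which instruction classes count as "long latency" is not specified in the abstract, so treating outstanding cache misses as the counted class is an assumption, and the paper's QSI bookkeeping is certainly more detailed.

```python
def pick_fetch_thread(long_latency_counts):
    """Each cycle, fetch next from the thread with the fewest
    outstanding long-latency instructions, so threads stalled on
    memory do not clog the front end."""
    return min(range(len(long_latency_counts)),
               key=long_latency_counts.__getitem__)

# Thread 2 has no pending long-latency instructions, so it fetches next.
print(pick_fetch_thread([3, 1, 0, 2]))  # → 2
```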
Abstract: This paper contributes to our knowledge about buyer-seller relations by identifying barriers and conflict situations associated with maintaining and developing durable business relationships at small companies. The contribution of prior studies on the negative aspects of marketing relationships is presented in the first section. The international research results are then discussed with regard to existing conceptualizations, and the main research implications are identified at the end.
Abstract: Grid environments consist of the volatile integration
of discrete heterogeneous resources. The notion of the Grid is to
unite different users and organisations and pool their resources into
one large computing platform where they can harness, inter-operate,
collaborate and interact. If the Grid Community is to achieve this
objective, then participants (Users and Organisations) need to be
willing to donate or share their resources and permit other
participants to use their resources. Resources do not have to be shared at all times, since constant sharing may result in users not having access to their own resources. The idea of reward-based computing was
developed to address the sharing problem in a pragmatic manner.
Participants are offered a reward to donate their resources to the
Grid. A reward may include monetary recompense or a pro rata share
of available resources when constrained. This latter point may imply
a quality of service, which in turn may require some globally agreed
reservation mechanism. This paper presents a platform for economy-based computing using the WebCom Grid middleware. Using this middleware, participants can configure their resources at times and priority levels that suit their local usage policy. The WebCom system accounts for processing done on individual participants' resources and rewards them accordingly.
Abstract: Dynamic bandwidth allocation in EPONs can generally be separated into inter-ONU scheduling and intra-ONU scheduling. In our previous work, active intra-ONU scheduling (AS) utilizes multiple queue reports (QRs) in each report message to cooperate with the inter-ONU scheduling, so that the granted bandwidth is fully utilized without leaving an unused slot remainder (USR). This scheme successfully solves the USR problem, which originates from the inseparability of Ethernet frames. However, without a properly set threshold value in AS, the number of QRs permitted by the IEEE 802.3ah standard is not sufficient, especially in unbalanced traffic environments. This limitation can be addressed by enlarging the threshold value; a large threshold, however, implies a large gap between adjacent QRs, resulting in a large difference between the ideal granted bandwidth and the actual granted bandwidth. In this paper, we integrate AS with a cooperative prediction mechanism and distribute multiple QRs to reduce the penalty caused by prediction error. Furthermore, to improve QoS and economize on queue reports, the highest-priority (EF) traffic that arrives during the waiting time is granted automatically by the OLT and is not counted in the requested bandwidth of the ONU. Simulation results show that the proposed scheme achieves better performance in terms of bandwidth utilization and average delay for different classes of packets.