Abstract: This research proposes a Preemptive Possibilistic Linear Programming (PPLP) approach for solving multiobjective Aggregate Production Planning (APP) problems with interval demand and imprecise unit price and related operating costs. The proposed approach attempts to maximize profit and minimize changes in workforce level. It transforms the total profit objective, which carries imprecise information, into three crisp objective functions: maximizing the most possible value of profit, minimizing the risk of obtaining a lower profit, and maximizing the opportunity of obtaining a higher profit. The workforce-change objective is converted likewise. The problem is then solved according to objective priorities, which is simpler than solving the multiobjective problem simultaneously, as in the existing approach. The possible range of the interval demand is also exploited to increase the flexibility of obtaining a better production plan. A practical application to an electronics company illustrates the effectiveness of the proposed model.
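The transformation of an imprecise profit coefficient into the three crisp objectives described above can be sketched as follows; the triangular representation (pessimistic, most possible, optimistic) is a standard possibilistic-programming device, and the numerical values are purely illustrative.

```python
# Sketch: converting triangular-fuzzy profit coefficients into three
# crisp objective vectors, as in possibilistic linear programming.
# Coefficient values below are hypothetical.

def crisp_objectives(triangular_coeffs):
    """Each coefficient is (pessimistic, most_possible, optimistic).
    Returns coefficient vectors for the three crisp objectives:
      z1: maximize the most possible profit     -> m
      z2: minimize the risk of lower profit     -> m - p (spread below m)
      z3: maximize the chance of higher profit  -> o - m (spread above m)
    """
    z1 = [m for (p, m, o) in triangular_coeffs]
    z2 = [m - p for (p, m, o) in triangular_coeffs]
    z3 = [o - m for (p, m, o) in triangular_coeffs]
    return z1, z2, z3

# Two products with imprecise unit profits (hypothetical values):
coeffs = [(8, 10, 13), (4, 5, 7)]
z1, z2, z3 = crisp_objectives(coeffs)
print(z1, z2, z3)   # [10, 5] [2, 1] [3, 2]
```

In the preemptive scheme, these three objectives (plus the workforce-change objective) would then be optimized lexicographically in priority order rather than simultaneously.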
Abstract: This paper presents a protocol aiming at proving that an encryption system contains structural weaknesses, without disclosing any information about those weaknesses. A verifier can check in polynomial time that a given property of the cipher system's output has been effectively realized. This property has been chosen by the prover in such a way that it cannot be achieved by known attacks or exhaustive search, but only if the prover indeed knows some undisclosed weaknesses that may effectively endanger the cryptosystem's security. This protocol has been termed a zero-knowledge-like proof of cryptanalysis. In this paper, we apply this protocol to the Bluetooth core encryption algorithm E0, used in many mobile environments, and thus suggest that its security can seriously be put into question.
Abstract: This paper presents the design and prototype implementation of an intelligent data processing framework for ubiquitous sensor networks. Much focus is put on how to handle the sensor data stream as well as on the interoperability between low-level sensor data and application clients. Our framework first provides systematic middleware that mediates between the application layer and the low-level sensors, filtering and integrating a great volume of sensor data to create value-added context information. Then, an agent-based architecture is proposed for real-time data distribution, efficiently forwarding a specific event to the appropriate application registered in the directory service via the open interface. The prototype implementation demonstrates that our framework can host sophisticated applications on a ubiquitous sensor network and can autonomously evolve into new middleware, taking advantage of promising technologies such as software agents, XML, cloud computing, and the like.
Abstract: This paper proposes a framework for the development of products comprising hardware and software components. It provides separation of hardware-dependent software, modifications of the current product development process, and integration of software modules with existing product configuration models and assembly product structures. To identify the dependent software, the framework considers product configuration modules and engineering changes of the associated software and hardware components. To support efficient integration of the two different development streams, a modified product development process is proposed that integrates dependent software development into product development through exchanges of specific product information. Using existing product data models in Product Data Management (PDM), the framework represents software as modules for product configurations and as software parts for the product structure. The framework is applied to the development of a robot system to show its effectiveness.
Abstract: Neighborhood Rough Sets (NRS) have been proven to be an efficient tool for heterogeneous attribute reduction. However, most research has focused on complete and noiseless data, whereas most real-world information systems are noisy, i.e., filled with incomplete and inconsistent data. In this paper, we introduce a generalized neighborhood rough set model, called VPTNRS, to deal with heterogeneous attribute reduction in noisy systems. We generalize the classical NRS model with a tolerance neighborhood relation and probability theory. Furthermore, we use neighborhood dependency to evaluate the significance of a subset of heterogeneous attributes and construct a forward greedy algorithm for attribute reduction based on it. Experimental results show that the model deals with noisy data effectively.
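The dependency-driven forward greedy reduction described above can be sketched as follows. This illustrates only the classical NRS criterion (positive-region size over |U|); the VPTNRS variant's tolerance relation and probabilistic threshold are omitted, and the dataset and delta value are assumed for illustration.

```python
# Sketch of forward greedy attribute reduction driven by neighborhood
# dependency (classical NRS criterion; VPTNRS refinements omitted).

def neighbors(data, i, attrs, delta):
    # Samples within Chebyshev distance delta of sample i on attrs.
    xi = data[i]
    return [j for j, xj in enumerate(data)
            if max(abs(xi[a] - xj[a]) for a in attrs) <= delta]

def dependency(data, labels, attrs, delta):
    # Fraction of samples whose whole neighborhood shares their label
    # (size of the positive region over |U|).
    if not attrs:
        return 0.0
    pos = sum(1 for i in range(len(data))
              if all(labels[j] == labels[i]
                     for j in neighbors(data, i, attrs, delta)))
    return pos / len(data)

def greedy_reduct(data, labels, n_attrs, delta=0.15):
    red, best = [], 0.0
    while True:
        cands = [a for a in range(n_attrs) if a not in red]
        if not cands:
            return red
        g, a = max((dependency(data, labels, red + [a], delta), a)
                   for a in cands)
        if g <= best:          # no remaining attribute improves dependency
            return red
        red.append(a)
        best = g

data = [(0.1, 0.5), (0.2, 0.5), (0.9, 0.5), (0.8, 0.5)]
labels = [0, 0, 1, 1]
print(greedy_reduct(data, labels, n_attrs=2))   # [0]: only attribute 0 separates the classes
```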
Abstract: Information technology managers nowadays face tremendous pressure to plan, implement, and adopt new technology solutions due to the rapid pace of technological change. Given the lack of studies on this topic, the aim of this paper is to provide a comparative review of the tools currently being used to respond to technological change. The study is based on an extensive review of published works, the majority of them ranging from 2000 to the first part of 2011. The works were gathered from journals, books, and other information sources available on the Web. The findings show that each tool has a different focus and that none of the tools provides a holistic framework covering the technical, people, process, and business environment aspects. Hence, this result provides useful information about the currently available tools that IT managers could use to manage changes in technology. Further, the result reveals a research gap: industry lacks such a framework.
Abstract: A challenging problem in radar signal processing is to
achieve reliable target detection in the presence of interferences. In
this paper, we propose a novel algorithm for automatic censoring of
radar interfering targets in log-normal clutter. The proposed
algorithm, termed the forward automatic censored cell averaging
detector (F-ACCAD), consists of two steps: removing the corrupted
reference cells (censoring) and the actual detection. Both steps are
performed dynamically by using a suitable set of ranked cells to
estimate the unknown background level and set the adaptive
thresholds accordingly. The F-ACCAD algorithm requires neither prior information about the clutter parameters nor knowledge of the number of interfering targets. The effectiveness of the F-ACCAD
algorithm is assessed by computing, using Monte Carlo simulations,
the probability of censoring and the probability of detection in
different background environments.
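The two-step scheme described above, censoring followed by adaptive detection from ranked cells, can be illustrated with the following sketch. This is not the exact F-ACCAD procedure (the paper's ranking rules and threshold factors differ); it is a generic ordered-statistic illustration that works in the log domain, where log-normal clutter becomes Gaussian, with assumed threshold values.

```python
# Illustrative two-step censoring/detection sketch (not the exact
# F-ACCAD): estimate the background from the lowest-ranked reference
# cells, censor cells exceeding an adaptive censoring threshold, then
# test the cell under test. Log-normal clutter is Gaussian in log scale.
import math

def censor_and_detect(ref_cells, cut, k=8, t_censor=2.5, t_detect=3.0):
    logs = sorted(math.log(x) for x in ref_cells)
    trusted = logs[:k]                  # lowest-ranked cells: least likely
    mu = sum(trusted) / k               # to contain interfering targets
    sigma = (sum((v - mu) ** 2 for v in trusted) / k) ** 0.5 or 1e-9
    # Step 1: censor corrupted reference cells.
    clean = [v for v in logs if (v - mu) / sigma <= t_censor]
    mu2 = sum(clean) / len(clean)
    sigma2 = (sum((v - mu2) ** 2 for v in clean) / len(clean)) ** 0.5 or 1e-9
    # Step 2: adaptive detection on the cell under test (CUT).
    return (math.log(cut) - mu2) / sigma2 > t_detect

# Homogeneous clutter around 1.0 with two interfering targets (50, 60)
# in the reference window; a strong target in the CUT is still declared.
ref = [1.1, 0.9, 1.0, 1.2, 0.8, 1.05, 0.95, 1.0, 50.0, 60.0]
print(censor_and_detect(ref, cut=80.0))   # True
print(censor_and_detect(ref, cut=1.0))    # False
```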
Abstract: Lighting upgrades involve relatively low costs, which allows their benefits to be spread more widely than is possible with any other energy efficiency measure. In order to popularize the adoption of CFLs in Taiwan, the authority proposes to implement a new energy-efficient lamp comparative labeling system. The current study was accordingly undertaken to investigate the factors affecting the performance of commercially available integrated CFLs and the deviation between their actual and labeled performance. In this paper, standard test methods to determine the electrical and photometric performance of CFLs were developed based on CIE 84-1989 and CIE 60901-1987, and 55 CFLs selected from the market were tested. The results show that CFLs with higher color temperatures achieve lower efficacy. It was noticed that most CFL packaging lacks Color Rendering Index information. Moreover, no correlation between the price and performance of the CFLs was found in this work. The results of this paper might help consumers make more informed CFL-purchasing decisions.
Abstract: The purpose of this study is to analyze island tourists' travel information sources as well as their satisfaction with destination services. Questionnaires were administered, using convenience sampling, to tourists traveling from the main island of Taiwan to the Penghu Islands for tourism activities; a total of 889 valid questionnaires were collected. Statistical analysis yielded the following findings: 1. The main travel information source for tourists to the Penghu Islands is "friends and family who have visited Penghu". 2. Of the services of the outlying islands of Penghu, tourists rate "friendly local residents" the highest. 3. Demographic variables affect tourists' travel information sources and service satisfaction. Based on these findings, the study offers operating suggestions for Penghu's tourism industry and the authority in charge, as well as suggestions for future research by other researchers.
Abstract: With the fast evolution of digital data exchange, information security becomes much more important in data storage and transmission. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. In this paper, we analyze the Advanced Encryption Standard (AES), and we add a key stream generator (A5/1, W7) to AES to improve the encryption performance, mainly for images characterised by reduced entropy. Both techniques have been implemented for experimental purposes. Detailed results in terms of security analysis and implementation are given. A comparative study with traditional encryption algorithms shows the superiority of the modified algorithm.
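The idea of combining a keystream with low-entropy image data before block encryption can be sketched as follows. This is a minimal illustration only: the 16-bit LFSR below is a toy stand-in for A5/1 or W7, and the AES stage itself (e.g., from a cryptographic library) is merely indicated as a comment, not implemented.

```python
# Sketch of the pre-whitening idea: a keystream is XORed with the
# (low-entropy) image bytes before block encryption. The toy 16-bit
# Fibonacci LFSR below is NOT A5/1 or W7; it only shows the pattern.

def lfsr_keystream(seed, n):
    # 16-bit maximal-length LFSR, taps x^16 + x^14 + x^13 + x^11 + 1.
    state, out = seed & 0xFFFF, []
    for _ in range(n):
        byte = 0
        for _ in range(8):
            bit = (state ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
            state = (state >> 1) | (bit << 15)
            byte = (byte << 1) | (state & 1)
        out.append(byte)
    return bytes(out)

def prewhiten(data, seed):
    ks = lfsr_keystream(seed, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

image = bytes([200] * 32)                          # low-entropy pixel block
whitened = prewhiten(image, seed=0xACE1)
assert prewhiten(whitened, seed=0xACE1) == image   # XOR is an involution
assert len(set(whitened)) > 1                      # no longer constant
# ciphertext = AES_encrypt(key, whitened)          # AES stage assumed here
```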
Abstract: The paper outlines the relevance of computational geometry within the design and production process of architecture. Based on two case studies, the digital chain, from initial form-finding to the final realization of spatial concepts, is discussed in relation to geometric principles. The association with the fascinating complexity found in nature and its underlying geometry was the starting point for both projects presented in the paper. The translation of abstract geometric principles into a three-dimensional digital design model, realized in Rhinoceros, was followed by a process of transformation and optimization of the initial shape that integrated aesthetic, spatial, and structural qualities as well as aspects of material properties and conditions of production.
Abstract: Most image fusion algorithms treat the pixels in an image more or less independently, ignoring the relationships between them. In addition, their parameters must be adjusted for different times of day or weather conditions. In this paper, we propose a region-based image fusion method that combines aspects of feature-level and pixel-level fusion rather than operating on pixels alone. The basic idea is to segment only the far-infrared image and to add the information of each segmented region to the visible image. Fusion parameters are then determined separately for each region. Finally, we adopt an artificial neural network to handle varying time and weather conditions, because the relationship between the fusion parameters and the image features is nonlinear. This enables the fusion parameters to be produced automatically according to the current conditions. The experimental results show that the proposed method indeed has good adaptive capacity, with automatically determined fusion parameters, and that the architecture can be used for many applications.
Abstract: Throughput is an important measure of the performance of a production system. Analyzing and modeling production throughput is complex in today's dynamic production systems because of their uncertainties: uncertainty materializes when the production line faces changes in setup time, machinery breakdowns, manufacturing lead time, and scrap. Besides, demand fluctuates from time to time for each product type. These uncertainties affect production performance. This paper proposes Bayesian inference for throughput modeling under five production uncertainties. The Bayesian model uses prior distributions encoding previous information about the uncertainties, while likelihood distributions are associated with the observed data. The Gibbs sampling algorithm, a robust Markov chain Monte Carlo procedure, was employed to sample the unknown parameters and estimate the posterior means of the uncertainties. The Bayesian model was validated with respect to the convergence and efficiency of its outputs. The results showed that the proposed Bayesian model predicts production throughput with an accuracy of 98.3%.
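The Gibbs sampling pattern described above can be sketched for a single uncertainty. This is not the paper's five-uncertainty throughput model: it treats one quantity (say, setup time) as Normal data with conjugate priors, and all numerical values, priors, and variable names are assumed for illustration.

```python
# Minimal Gibbs sampler sketch for one Normal(mu, sigma^2) uncertainty
# with conjugate priors: Normal prior on mu, Inverse-Gamma on sigma^2.
import random

random.seed(0)
data = [random.gauss(100.0, 5.0) for _ in range(200)]   # observed values
n, ybar = len(data), sum(data) / len(data)

mu0, tau2 = 90.0, 100.0   # prior: mu ~ N(mu0, tau2)        (assumed)
a0, b0 = 2.0, 10.0        # prior: sigma2 ~ InvGamma(a0, b0) (assumed)

mu, sigma2 = ybar, 25.0
mus = []
for it in range(2000):
    # mu | sigma2, y  ~  Normal with precision-weighted mean
    prec = 1.0 / tau2 + n / sigma2
    mean = (mu0 / tau2 + n * ybar / sigma2) / prec
    mu = random.gauss(mean, (1.0 / prec) ** 0.5)
    # sigma2 | mu, y  ~  InvGamma(a0 + n/2, b0 + 0.5 * sum((y - mu)^2)),
    # sampled as scale / Gamma(shape, 1)
    a = a0 + n / 2.0
    b = b0 + 0.5 * sum((y - mu) ** 2 for y in data)
    sigma2 = b / random.gammavariate(a, 1.0)
    if it >= 500:                       # discard burn-in
        mus.append(mu)

post_mean = sum(mus) / len(mus)
print(round(post_mean, 1))   # posterior mean of mu, close to the sample mean
```

The same conditional-sampling loop extends to several coupled uncertainties by drawing each parameter in turn from its full conditional, which is the structure the abstract describes.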
Abstract: Creating shared value (CSV) is a newly introduced concept whose essence and expressions, relationship to Corporate Social Responsibility (CSR), and implications for business and society are now at the core of the management and social responsibility debates of the scientific world. The aim of the paper is to gain a clearer understanding of the CSR and CSV concepts and of their implementation and role in the sustainable development of organizations in Latvia. In this paper the authors discuss and compare the two concepts and, based on the results of the Sustainability Index (SI) initiative and an analysis of publicly available company information, evaluate their implementation in Latvia and draw conclusions on the development trends and potential of these approaches in the Latvian market.
Abstract: Intensive changes in the environment and strong market competition have raised the management of information and knowledge to the strategic level of companies. In a knowledge-based economy, only those organizations that have up-to-date, specialized knowledge and are able to exploit and develop it are capable of surviving. Companies have to know what knowledge they possess by taking a survey of their organizational knowledge, and they have to fix current and additional knowledge in organizational memory. The question is how to identify, acquire, fix, and use knowledge effectively. The paper shows that, over and above the information technology tools supporting the acquisition, storage, and use of information, organizational learning, the knowledge that comes into being as a result of it, and the fixing and storage of that knowledge in a company's memory play an important role in the intelligence of organizations and the competitiveness of a company.
Abstract: This paper presents a flattening algorithm for three-dimensional triangular surfaces based on the linear-elastic finite element method. First, an intrinsic characteristic preserving method is used to obtain the initial development graph, which preserves the angles and length ratios between adjacent edges. Then, an iterative equation is established based on the linear-elastic finite element method, and the flattening result with an equilibrium state of internal forces is obtained by solving this iterative equation. The results show that complex surfaces can be handled by the proposed method, which is an efficient tool for computer-aided design applications such as mould design.
Abstract: This paper proposes a new parameter identification
method based on Linear Fractional Transformation (LFT). It is
assumed that the target linear system includes unknown parameters.
The parameter deviations are separated from a nominal system via
LFT, and identified by organizing I/O signals around the separated
deviations of the real system. The purpose of this paper is to apply LFT
to simultaneously identify the parameter deviations in systems with
fewer outputs than unknown parameters. As a fundamental example,
this method is applied to a one-degree-of-freedom vibratory system.
Via LFT, all physical parameters were simultaneously identified in this
system. Then, numerical simulations were conducted for this system to
verify the results. This study shows that all the physical parameters of a
system with fewer outputs than unknown parameters can be effectively
identified simultaneously using LFT.
Abstract: In this paper, we propose a new model of an English-Vietnamese bilingual Information Retrieval system. Although many CLIR systems have been researched and built, the accuracy of search results in the different languages a CLIR system supports still needs to improve, especially in finding bilingual documents. The problems identified in this paper are the limitations of machine translation results and the extremely large collections of documents to be searched. We therefore establish a different model to overcome these problems.
Abstract: The development of Internet technology in recent years has led to a more active role of users in creating Web content. This has significant effects both on individual learning and on collaborative knowledge building. This paper presents an integrative framework model to describe and explain learning and knowledge building with shared digital artifacts on the basis of Luhmann's systems theory and Piaget's model of equilibration. In this model, knowledge progress is based on cognitive conflicts resulting from incongruities between an individual's prior knowledge and the information contained in a digital artifact. Empirical support for the model is provided by 1) applying it descriptively to texts from Wikipedia, 2) examining knowledge-building processes using a social network analysis, and 3) presenting a survey of a series of experimental laboratory studies.
Abstract: How to coordinate the behaviors of the agents through
learning is a challenging problem within multi-agent domains.
Because of its complexity, recent work has focused on how
coordinated strategies can be learned. Here we are interested in using
reinforcement learning techniques to learn the coordinated actions of a
group of agents, without requiring explicit communication among
them. However, traditional reinforcement learning methods are based
on the assumption that the environment can be modeled as Markov
Decision Process, which usually cannot be satisfied when multiple
agents coexist in the same environment. Moreover, to effectively coordinate each agent's behavior so as to achieve the goal, it is necessary to augment the state of each agent with information about the other agents. However, as the number of agents in a multiagent environment increases, the state space of each agent grows exponentially, causing a combinatorial explosion.
Profit sharing is one of the reinforcement learning methods that allow
agents to learn effective behaviors from their experiences even within
non-Markovian environments. In this paper, to remedy the drawback of the original profit sharing approach, which needs much memory to store each state-action pair during the learning process, we first present an on-line rational profit sharing algorithm. Then, we
integrate the advantages of modular learning architecture with on-line
rational profit sharing algorithm, and propose a new modular
reinforcement learning model. The effectiveness of the technique is
demonstrated using the pursuit problem.
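The basic profit sharing scheme that the abstract builds on can be sketched as follows. This shows only tabular profit sharing with a geometrically decreasing credit function, a common choice whose decay is made fast enough to satisfy the rationality condition; the paper's on-line and modular variants refine this basic pattern, and the episode below is hypothetical.

```python
# Sketch of tabular profit sharing: the reward received at the end of
# an episode is propagated backward along the visited state-action
# pairs with geometrically decreasing credit.
from collections import defaultdict

def reinforce(weights, episode, reward, decay=0.25):
    credit = reward
    for (state, action) in reversed(episode):   # last step gets full credit
        weights[(state, action)] += credit
        credit *= decay                         # earlier steps get less
    return weights

w = defaultdict(float)
# Hypothetical episode in a pursuit-style grid: s0 -> s1 -> s2 -> goal.
episode = [("s0", "right"), ("s1", "up"), ("s2", "right")]
reinforce(w, episode, reward=1.0)
print(w[("s2", "right")], w[("s1", "up")], w[("s0", "right")])
# → 1.0 0.25 0.0625
```

Because credit assignment uses only the observed episode, no Markovian model of the environment is required, which is why profit sharing remains applicable in the multi-agent, non-Markovian settings discussed above.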