Abstract: We investigated the statistical performance of Bayesian inference using the maximum entropy method and MAP estimation for several models that approximate wave-fronts in remote sensing by SAR interferometry. Using Monte Carlo simulation over a set of wave-fronts generated from an assumed true prior, we found that the maximum entropy method attains the optimal performance around the Bayes-optimal condition, provided the model matches the true prior and the likelihood represents the optical measurement of the interferometer. We also found that MAP estimation, regarded as the deterministic limit of maximum entropy, achieves almost the same performance as the Bayes-optimal solution on this set of wave-fronts. We further clarified that MAP estimation carries out phase unwrapping perfectly without using prior information, and that it realizes accurate phase unwrapping with the conjugate gradient (CG) method when the model of the true prior is chosen appropriately.
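The abstract does not spell out the CG-based unwrapping, so here is a minimal illustrative sketch of the general idea, not the paper's actual model: 1-D least-squares phase unwrapping, where the normal equations over wrapped phase gradients are solved with a hand-rolled conjugate gradient. The 1-D setting, quadratic cost and function names are our assumptions.

```python
import numpy as np

def wrap(p):
    """Wrap phase values into [-pi, pi)."""
    return (p + np.pi) % (2 * np.pi) - np.pi

def unwrap_cg(psi, iters=2000, tol=1e-12):
    """Least-squares 1-D phase unwrapping via conjugate gradient.

    Solves D^T D phi = D^T d, where D is the first-difference operator
    and d holds the wrapped phase gradients of the measurement psi.
    """
    n = len(psi)
    d = wrap(np.diff(psi))

    def A(v):                        # v -> D^T D v (1-D Laplacian, matrix-free)
        Dv = np.diff(v)
        out = np.zeros(n)
        out[:-1] -= Dv
        out[1:] += Dv
        return out

    b = np.zeros(n)                  # b = D^T d
    b[:-1] -= d
    b[1:] += d

    phi = np.zeros(n)                # standard CG iteration
    r = b - A(phi)
    p = r.copy()
    rs = r @ r
    for _ in range(iters):
        Ap = A(p)
        alpha = rs / (p @ Ap)
        phi += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if rs_new < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return phi - phi[0] + psi[0]     # fix the unobservable constant offset
```

When the true phase gradients stay below pi in magnitude, the least-squares solution coincides with the true wave-front up to a constant, which the last line anchors to the first sample.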
Abstract: In this paper, a novel scheme is proposed for ownership identification and color image authentication by deploying cryptography and digital watermarking. The color image is first transformed from RGB to the YST color space, designed exclusively for watermarking. After the color space transformation, each channel is divided into 4×4 non-overlapping blocks, and the central 2×2 sub-block of each is selected. Depending upon the channel, two to three LSBs of each central 2×2 sub-block are set to zero to hold the ownership, authentication and recovery information. The size and position of the sub-block are important for correct localization, enhanced security and fast computation. Since YS ⊥ T, the T channel is suitable for embedding the recovery information apart from the ownership and authentication information; each 4×4 block of the T channel, together with the ownership information, is therefore hashed with SHA-160 to compute a content-based hash that is unique and invulnerable to birthday attacks or hash collisions, instead of using MD5, which may produce the collision condition H(m) = H(m'). For recovery, the intensity mean of each 4×4 block of each channel is computed and encoded in up to eight bits. For watermark embedding, key-based mapping of blocks is performed using the 2D torus automorphism. Our scheme is oblivious, generates highly imperceptible images with correct localization of tampering within reasonable time, and has the ability to recover the original work with probability near one.
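As a hedged illustration of the key-based block-mapping step: a common 2D torus automorphism uses the unit-determinant matrix [[1, 1], [k, k+1]], so the map (x, y) → (x + y, kx + (k+1)y) mod N is a bijection on an N×N grid of block indices for any key k. The specific matrix and iteration count below are illustrative choices, not necessarily the paper's.

```python
def torus_map(x, y, k, N):
    """One iteration of the 2-D torus automorphism with key k on an N x N grid.
    The transform matrix [[1, 1], [k, k + 1]] has determinant 1, so the map
    is invertible mod N and permutes the block indices."""
    return (x + y) % N, (k * x + (k + 1) * y) % N

def block_mapping(N, k, iterations=1):
    """Key-based permutation of N x N block indices (key = (k, iterations))."""
    mapping = {}
    for x in range(N):
        for y in range(N):
            u, v = x, y
            for _ in range(iterations):
                u, v = torus_map(u, v, k, N)
            mapping[(x, y)] = (u, v)
    return mapping
```

Because the map is a permutation, recovery information for block (x, y) can be hidden in block mapping[(x, y)] and located again by anyone holding the key.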
Abstract: Producing IT products and services requires careful design. The IT development process is intangible and labour intensive. Making optimal use of available resources, both soft (knowledge, skill-sets, etc.) and hard (computer systems, ancillary equipment, etc.), is vital if IT development is to achieve sensible economic advantages. Apart from the norms of the Project Life Cycle and the System Development Life Cycle (SDLC), there is an urgent need to establish a general yet widely acceptable guideline on the most effective and efficient way to proceed with an IT project in the broader view of the Product Life Cycle. This paper proposes such a framework with two major areas of concern: (1) the integration of IT products and IT services within an existing IT process architecture; and (2) how IT products and IT services are built into the framework of the Product Life Cycle, the Project Life Cycle and the SDLC.
Abstract: The main objective of this work is to provide fault detection and isolation using Markov parameters for residual generation and a neural network for fault classification. The diagnostic approach proceeds in two steps. In step 1, the system is identified from a series of input/output variables through an identification algorithm: a Hankel matrix is first formed from the input/output data and then decomposed via the singular value decomposition. For identifying the system online, a sliding-window approach is adopted in which a window slides over a subset of n input/output variables. In step 2, the fault is diagnosed by comparing the Markov parameters of the faulty and non-faulty systems; fault residues are extracted by comparing the first five Markov parameters of the two. An artificial neural network trained on predetermined faulty conditions serves to classify the unknown fault. Faults are introduced at arbitrary instances and the identification is carried out online. The proposed diagnostic approach is illustrated on benchmark problems with encouraging results.
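A minimal sketch of the Hankel/SVD identification and Markov-parameter residue idea, using a toy first-order system. The pole values, matrix sizes and rank tolerance are illustrative assumptions, not the paper's benchmark settings.

```python
import numpy as np

def hankel(h, rows, cols):
    """Hankel matrix H[i, j] = h[i + j] built from Markov parameters."""
    return np.array([[h[i + j] for j in range(cols)] for i in range(rows)])

# Markov parameters CB, CAB, CA^2B, ... of a toy first-order system
# x[k+1] = a*x[k] + u[k], y[k] = x[k], so the k-th parameter is a**k.
a_nom, a_fault = 0.5, 0.7                # nominal and (assumed) faulty pole
h_nom = [a_nom ** k for k in range(10)]
h_fault = [a_fault ** k for k in range(10)]

# The SVD of the Hankel matrix reveals the system order:
# a first-order system leaves exactly one dominant singular value.
s = np.linalg.svd(hankel(h_nom, 4, 4), compute_uv=False)
order = int(np.sum(s > 1e-8 * s[0]))

# Fault residue: difference of the first five Markov parameters,
# as in the abstract's residual-generation step.
residue = np.array(h_fault[:5]) - np.array(h_nom[:5])
```

A nonzero residue vector flags a fault; in the paper, such residues would then be fed to the trained neural network for classification.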
Abstract: In this work, we present a novel active learning approach for learning a visual object detection system. Our system is composed of an active learning mechanism wrapped around a sub-algorithm that implements an online boosting-based object detector. At the core is a combination of a bootstrap procedure and a semi-automatic learning process based on online boosting. The idea is to exploit the availability of the classifier during learning to automatically label training samples and incrementally improve the classifier. This addresses the issue of reducing labeling effort while obtaining better performance. In addition, we propose a verification process for further improvement of the classifier: allowing re-updates on already-seen data during learning stabilizes the detector. The main contribution of this empirical study is a demonstration that active learning based on online boosting, trained in this manner, can achieve results comparable to, or even better than, a framework trained in the conventional manner with much more labeling effort. Experiments on challenging data sets for specific object detection problems show the effectiveness of our approach.
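The paper's online-boosting detector is beyond a short snippet, but the active-learning wrapper itself can be sketched. The sketch below substitutes a tiny logistic-regression SGD learner for the boosting detector, on synthetic 2-D data; the confidence thresholds (self-label above 0.95, query the oracle below 0.6) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic two-class pool: well-separated Gaussian blobs.
X = np.vstack([rng.normal(-2, 1, (300, 2)), rng.normal(2, 1, (300, 2))])
y = np.array([0] * 300 + [1] * 300)
perm = rng.permutation(600)
X, y = X[perm], y[perm]
Xb = np.hstack([X, np.ones((600, 1))])          # append a bias column

w = np.zeros(3)
lr = 0.1

def prob(xi):
    """Predicted probability of class 1 under the logistic model."""
    return 1.0 / (1.0 + np.exp(-(xi @ w)))

def sgd_step(xi, yi):
    """One online logistic-regression update (stand-in for online boosting)."""
    global w
    w += lr * (yi - prob(xi)) * xi

queried = 0
for i in range(10):                             # small oracle-labeled seed set
    sgd_step(Xb[i], y[i])
    queried += 1

for i in range(10, 600):                        # active-learning wrapper
    p = prob(Xb[i])
    conf = max(p, 1 - p)
    if conf > 0.95:                             # confident: self-label, no human effort
        sgd_step(Xb[i], round(p))
    elif conf < 0.6:                            # uncertain: query the oracle
        sgd_step(Xb[i], y[i])
        queried += 1
    # middling confidence: skip the sample

acc = ((prob(Xb) > 0.5).astype(int) == y).mean()
```

The point of the sketch is the labeling economics: only the `queried` samples cost human effort, while the confident self-labeled samples still improve the classifier.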
Abstract: Reverse engineering is an important process in software engineering. It can be performed backwards from the system development life cycle (SDLC) in order to recover the source data or representations of a system through analysis of its structure, function and operation. We use reverse engineering to introduce an automatic tool that generates system requirements from program source code. The tool accepts Cµ program source code, scans the source code line by line, and passes the code to a parser. The engine of the tool then generates system requirements for that specific program to facilitate reuse and enhancement of the program. The purpose of producing the tool is to help recover the system requirements of a system whose system requirements document (SRD) does not exist because the system is undocumented.
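As a toy illustration of the scan-and-parse idea, not the paper's actual Cµ parser, a single regex pass over C-like source can turn function definitions into requirement stubs. The return-type list and the requirement sentence template are assumptions for the sketch.

```python
import re

# Matches simple C-like function definitions, e.g. "int add(int a, int b) {".
FUNC_RE = re.compile(r'^\s*(?:int|void|float|double|char)\s+(\w+)\s*\(([^)]*)\)\s*\{?')

def extract_requirements(source):
    """Scan source line by line and emit one requirement stub per function."""
    reqs = []
    for line in source.splitlines():
        m = FUNC_RE.match(line)
        if m:
            name, params = m.group(1), m.group(2).strip()
            detail = f" taking ({params})." if params else " taking no parameters."
            reqs.append(f"The system shall provide an operation '{name}'" + detail)
    return reqs
```

A real tool would of course use a full grammar rather than a regex, and would mine comments, control flow and data structures as well, but the pipeline shape (scan, parse, generate requirement text) is the same.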
Abstract: Wind catchers are traditional natural ventilation systems attached to buildings in order to ventilate the indoor air. The most common type of wind catcher is the four-sided one, which can catch wind from all directions. CFD simulation is an effective way to evaluate wind catcher performance, but the accuracy of CFD results is a concern, so sensitivity analysis is crucial to find out the effect of different CFD settings on the results. This paper presents a series of 3D steady RANS simulations for a generic isolated four-sided wind catcher attached to a room, subjected to wind directions ranging from 0° to 180° at 45° intervals. The CFD simulations are validated against detailed wind tunnel experiments. The influence of an extensive range of computational parameters is explored, including the resolution of the computational grid, the size of the computational domain and the turbulence model. This study found that CFD simulation is a reliable method for wind catcher studies, but it is less accurate in predicting models with non-perpendicular wind directions.
Abstract: In this paper, a novel method for a biometric system based on the ECG signal is proposed, using spectral coefficients computed through linear predictive coding (LPC). ECG biometric systems have traditionally used characteristics of fiducial points of the ECG signal as the feature set. Such systems have been shown to contain loopholes, so a non-fiducial system allows for tighter security. In the proposed system, incorporating non-fiducial features from the LPC spectrum produced segment and subject recognition rates of 99.52% and 100%, respectively. The proposed system outperformed a biometric system based on the wavelet packet decomposition (WPD) algorithm in both recognition rate and computation time. This allows LPC to be used in a practical ECG biometric system that requires fast, stringent and accurate recognition.
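The abstract does not reproduce the LPC computation; below is a standard autocorrelation/Levinson-Durbin sketch of the kind of coefficient extraction it describes. The order and the decaying test signal are illustrative, not the paper's ECG settings.

```python
import numpy as np

def lpc(signal, order):
    """Linear predictive coding via the autocorrelation method and
    Levinson-Durbin recursion. Returns predictor coefficients a[1..order]
    of the model x[n] ~ sum_k a[k] * x[n - k]."""
    x = np.asarray(signal, float)
    n = len(x)
    # Biased autocorrelation r[0..order].
    r = np.array([x[:n - k] @ x[k:] for k in range(order + 1)])
    a = np.zeros(order + 1)          # a[0] unused; predictor coeffs in a[1:]
    e = r[0]                         # prediction-error power
    for i in range(1, order + 1):
        # Reflection coefficient for stage i.
        k = (r[i] - a[1:i] @ r[1:i][::-1]) / e
        a[1:i] = a[1:i] - k * a[1:i][::-1]
        a[i] = k
        e *= (1.0 - k * k)
    return a[1:]
```

For a biometric front end, each ECG segment would be reduced to its LPC (or LPC-spectrum) coefficients, and those vectors fed to the classifier.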
Abstract: Small interfering RNA (siRNA) alters the regulatory role of mRNA during gene expression through translational inhibition. Recent studies show that upregulation of mRNA causes serious diseases such as cancer, so designing effective siRNAs with good knockdown effects plays an important role in gene silencing. Various siRNA design tools have been developed previously. In this work, we analyze existing well-scoring second-generation siRNA prediction tools and optimize the efficiency of siRNA prediction by designing a computational model using an artificial neural network and the whole stacking energy (ΔG), which may help in gene silencing and drug design for cancer therapy. Our model is trained and tested against a large data set of siRNA sequences. Our results are validated by computing the correlation coefficient between experimental and predicted inhibition efficacy of the siRNAs. We achieved a correlation coefficient of 0.727 with our previous computational model, and improved it to 0.753 when the threshold of the whole stacking energy is greater than or equal to -32.5 kcal/mol.
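The validation step, correlating experimental and predicted inhibition efficacy after applying the stacking-energy threshold, can be sketched as follows. All numeric values below are made-up placeholders, not data from the paper; only the -32.5 kcal/mol threshold comes from the abstract.

```python
import numpy as np

def pearson(x, y):
    """Pearson correlation between experimental and predicted efficacy."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

# Placeholder whole stacking energies (kcal/mol) and efficacies.
dG = np.array([-30.1, -33.4, -28.7, -35.0, -31.2])
eff_exp = np.array([0.82, 0.40, 0.90, 0.35, 0.71])
eff_pred = np.array([0.78, 0.45, 0.88, 0.30, 0.69])

mask = dG >= -32.5                   # stacking-energy threshold from the abstract
r = pearson(eff_exp[mask], eff_pred[mask])
```

The abstract's reported improvement (0.727 → 0.753) corresponds to computing `r` with and without the threshold mask on the real data set.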
Abstract: One of the common problems encountered in software engineering is addressing and responding to the changing nature of requirements. Several approaches have been devised to address this issue, ranging from instilling resistance to changing requirements in order to mitigate the impact on project schedules, to developing an agile mindset towards requirements. The approach discussed in this paper is to conceptualize the delta in a requirement and model it, in order to plan a response to it. To provide some context, change is first formally identified and categorized as either formal change or informal change. While agile methodology facilitates informal change, the approach discussed in this paper seeks to develop the idea of facilitating formal change. Collecting and documenting meta-requirements that represent the phenomenon of change would be a proactive measure towards building a realistic cognition of the requirements entity, which can further be harnessed in the software engineering process.
Abstract: Determining the presence and location of faults on a transmission line with protective distance relays based on fixed settings such as line impedance is achieved by several different techniques. Moreover, a fast, accurate and robust technique is required for real-time operation of modern power systems. The application of a radial basis function neural network to transmission line protection is demonstrated in this paper. The method uses the power system's voltage and current signals to learn the hidden relationships present in the input patterns. It is observed that the proposed technique can identify the fault direction rapidly. Simulation studies show that the proposed approach is able to distinguish the direction of a fault on a transmission line swiftly and correctly, and is therefore suitable for real-time purposes.
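A minimal sketch of a radial basis function network classifying fault direction. The two-feature toy data, the center count and the kernel width are illustrative assumptions; real inputs would be voltage and current measurements from the relay.

```python
import numpy as np

def rbf_design(X, centers, sigma):
    """Gaussian RBF hidden-layer activations for each sample."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

rng = np.random.default_rng(1)
# Toy features standing in for voltage/current phasor quantities.
Xf = rng.normal([1, 1], 0.2, (50, 2))     # forward-fault pattern
Xr = rng.normal([-1, -1], 0.2, (50, 2))   # reverse-fault pattern
X = np.vstack([Xf, Xr])
y = np.array([1] * 50 + [-1] * 50)        # +1 forward, -1 reverse

# Centers picked from the data; output weights by linear least squares.
centers = X[rng.choice(len(X), 10, replace=False)]
H = rbf_design(X, centers, sigma=0.5)
w, *_ = np.linalg.lstsq(H, y, rcond=None)

pred = np.sign(rbf_design(X, centers, 0.5) @ w)
acc = (pred == y).mean()
```

Because only the linear output weights are trained, fitting is a single least-squares solve, which is part of why RBF networks suit the fast, real-time setting the abstract targets.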
Abstract: Global sustainability has become a subject of international attention, the key reason being global climate change. Abnormal climate events and disasters continue to occur with increasing frequency throughout the world, affecting countries everywhere. Many important international conferences and policies now take "global environmental sustainability" and "human health" as development goals, including the APEC 2007 Sydney Declaration on climate and clean energy; the 2008 World Economic Forum's Cool Earth energy-efficiency initiative; the EU's "Green Idea" programme; Japan's annual policy of a low-carbon society and sustainable eco-cities (Eco City), together with the 2009-2010 "Eco-Point" programme promoting green-energy and carbon-reduction products; and the 2010 United Nations Climate Change Conference (COP16). The global agenda has shifted from a defensive, conservative stance of "environmental protection" and "saving energy" to a positive pursuit of "sustainability" and "LOHAS". Taiwan has actively put forward eco-cities, green buildings, green building materials and other related environmental measures. Green building materials in particular form the basis of the built environment, have the widest application, are in direct contact with people, and are key to human health and a sustainable planet.
"Sustainable development" is a necessary condition for the continuation of the Earth, "health and comfort" is a necessary condition for the continuation of life, and improved "quality" is a necessary condition for economic development; the balance among the three is "enhanced efficiency". The World Business Council for Sustainable Development (WBCSD) defines "eco-efficiency" as the delivery of competitively priced goods and services that satisfy human needs and bring quality of life, while progressively reducing ecological impacts and resource use throughout the life cycle to a level in line with the Earth's carrying capacity; the term combines "economic" and "ecologic". This research takes products certified under Taiwan's Green Building Material Label as its scope and, through investigation and weight analysis, explores green building material environmental load (Ln) factors and green building material quality (Qn) factors to establish a green building material eco-efficiency assessment model (GBM Eco-Efficiency). Healthy green building material label products are the priority assessment objects, with flooring-class materials, which respond directly to environmental load, as the initial set, providing explicit feedback to correct the assessment model. Taking "efficiency" as the starting point, the aim is a win-win strategy balancing human "health" with the Earth's "sustainable development". The study is expected to: (1) establish an assessment system for the quality and environmental impact of green building materials; (2) establish the GBM Eco-Efficiency model; and (3) establish feedback mechanisms for applying the GBM Eco-Efficiency model to green building materials.
Abstract: Information sharing and gathering are important in this era of rapid technological advancement. The existence of the WWW has caused an explosive growth of information. Readers are overloaded with too many lengthy text documents and are more interested in shorter versions. The oil and gas industry cannot escape this predicament. In this paper, we develop an automated text summarization system, AutoTextSumm, to extract the salient points of oil and gas drilling articles by incorporating a statistical approach, keyword identification, synonym words and sentence position. In this study, we conducted interviews with petroleum engineering experts and English language experts to identify the most commonly used keywords in the oil and gas drilling domain. The performance of AutoTextSumm is evaluated using precision, recall and F-score. Based on the experimental results, AutoTextSumm produces satisfactory performance with an F-score of 0.81.
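The sentence-scoring idea, keyword hits combined with a position weight, can be sketched as below. The keyword list, the 1/(i+1) position weight and the additive scoring formula are illustrative assumptions, not AutoTextSumm's actual parameters.

```python
import re

KEYWORDS = {"drilling", "well", "pressure", "mud"}   # illustrative domain keywords

def summarize(text, n=2):
    """Score sentences by keyword hits plus position; keep the top n in order."""
    sents = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]
    scores = []
    for i, s in enumerate(sents):
        words = set(re.findall(r'\w+', s.lower()))
        kw = len(words & KEYWORDS)        # statistical keyword component
        pos = 1.0 / (i + 1)               # earlier sentences weigh more
        scores.append((kw + pos, i, s))
    top = sorted(scores, key=lambda t: -t[0])[:n]
    return " ".join(s for _, _, s in sorted(top, key=lambda t: t[1]))
```

A synonym table, as used in the paper, would simply expand each sentence's word set before intersecting it with the keyword list.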
Abstract: Dredging activities inevitably cause sediment
dispersion. In certain locations, where there are important ecological
areas such as mangroves or coral reefs, carefully planning the
dredging can significantly reduce negative impacts. This article
utilizes the dredging at Phuket port, Thailand, as a case study to
demonstrate how computer simulations can be helpful to protect
existing coral reefs. A software package named MIKE21 was
applied. Necessary information required by the simulations was
gathered. After calibrating and verifying the model, various dredging
scenarios were simulated to predict spoil movement. The simulation
results were used as guidance for setting up environmental
measures. Finally, the recommendation was made to dredge during
flood tide with silt curtains installed.
Abstract: While service quality is arguably the most valued aspect of the tourism industry, the issue of safety and security plays a key role in sustaining the industry's success. This issue has been part of Thailand's tourism development and promotion for several years; evidently, the Tourist Police Department was set up for this purpose. Its main responsibility is to deal with international tourists' safety and confidence in travelling within Thai territory. However, to strengthen the tourism safety of the country, it is important to better understand international tourists' safety concerns about Thailand. This article compares international tourists' safety needs with Thai tourist police officers' perception of those concerns, to determine what measures should be taken to assure tourists of Thailand's secure environment. Through quantitative and qualitative methodological approaches, the tourism safety needs of international tourists from Europe, North America and Asia were elicited; how Thai tourist police and local police perceive international tourists' safety concerns was investigated; and opinions and experiences about how the police deal with international tourists' problems in eight tourist areas were explored. The comparative results reveal certain differences between international tourists' safety needs and Thai police officers' perception of those needs. Tourism safety prevention and protection measures and practices are also suggested.
Abstract: Developing techniques for mobile robot navigation is one of the major trends in current research on mobile robotics. This paper develops a local model network (LMN) for mobile robot navigation. The LMN represents the mobile robot by a set of locally valid submodels that are multi-layer perceptrons (MLPs), trained with the back-propagation (BP) algorithm. The paper proposes fuzzy C-means (FCM) clustering to divide the input space into sub-regions; a submodel (MLP) is then identified to represent each region, and the submodels are combined in a unified structure. In the run-time phase, radial basis functions (RBFs) are employed as validity windows for the activated submodels. The proposed structure overcomes the problem of the changing operating regions of mobile robots. Real data are used in all experiments. Results for mobile robot navigation using the proposed LMN reflect the soundness of the proposed scheme.
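A minimal numeric sketch of the LMN idea, local submodels blended by normalized RBF validity windows. Here fixed centers and linear submodels stand in for the paper's FCM-partitioned MLPs, purely to keep the sketch short; the target function and all constants are illustrative.

```python
import numpy as np

# Operating-space centers (in a real LMN these come from fuzzy C-means).
centers = np.array([-1.0, 1.0])
sigma = 0.7

def validity(x):
    """Normalized RBF windows weighting the active submodels."""
    w = np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * sigma ** 2))
    return w / w.sum(axis=1, keepdims=True)

rng = np.random.default_rng(2)
x = rng.uniform(-2, 2, 400)
y = np.abs(x)                                   # nonlinear target to approximate

V = validity(x)
Phi = np.column_stack([x, np.ones_like(x)])     # [x, 1] regressors
models = []
for j in range(len(centers)):                   # one weighted LS fit per region
    W = np.diag(V[:, j])
    theta, *_ = np.linalg.lstsq(Phi.T @ W @ Phi, Phi.T @ W @ y, rcond=None)
    models.append(theta)

def lmn(xq):
    """Run-time phase: blend local submodels with their validity windows."""
    Vq = validity(xq)
    Phiq = np.column_stack([xq, np.ones_like(xq)])
    return sum(Vq[:, j] * (Phiq @ models[j]) for j in range(len(centers)))

grid = np.linspace(-2, 2, 50)
err = np.max(np.abs(lmn(grid) - np.abs(grid)))
```

The blending is what handles changing operating regions: as the robot's state moves between regions, the RBF windows smoothly hand over from one submodel to the next.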
Abstract: A censored production rule (CPR) is an extension of the standard production rule that addresses the problems of reasoning with incomplete information subject to resource constraints, and of reasoning efficiently with exceptions. A CPR has the form: IF A (condition) THEN B (action) UNLESS C (censor), where C is the exception condition. Fuzzy CPRs are obtained by augmenting an ordinary fuzzy production rule "If X is A then Y is B" with an exception condition, and are written in the form "If X is A then Y is B unless Z is C". Such rules are employed in situations in which the fuzzy conditional statement "If X is A then Y is B" holds frequently and the exception condition "Z is C" holds rarely. Thus the "If X is A then Y is B" part of the fuzzy CPR expresses important information, while the unless part acts only as a switch that changes the polarity of "Y is B" to "Y is not B" when the assertion "Z is C" holds. The proposed approach attempts to discover fuzzy censored production rules from a set of discovered fuzzy if-then rules, in the form:
A(X) → B(Y) || C(Z).
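A hedged sketch of firing one fuzzy censored rule follows. The min t-norm and the 0.5 censor threshold are our assumptions; the abstract only fixes the switch semantics ("Y is B" flips to "Y is not B" when the censor holds).

```python
def fuzzy_cpr(mu_A, mu_C, threshold=0.5):
    """Fire the fuzzy censored rule  A(X) -> B(Y) || C(Z).

    mu_A: membership degree of 'X is A'; mu_C: degree of the censor 'Z is C'.
    Returns (conclusion, firing degree). The censor acts only as a switch
    that flips the polarity of the conclusion when 'Z is C' holds.
    """
    if mu_C >= threshold:                       # exception condition holds
        return "Y is not B", min(mu_A, mu_C)
    return "Y is B", min(mu_A, 1.0 - mu_C)      # normal case: censor rarely fires
```

Because the censor is checked only as a switch, the expensive part of inference stays in the frequently-true "If A then B" part, which is the resource-bounded reasoning motivation behind CPRs.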
Abstract: Biclustering aims at identifying several biclusters that reveal potential local patterns in a microarray matrix. A bicluster is a sub-matrix of the microarray consisting of a subset of genes co-regulated under a subset of conditions. In this study, we extend the idea of subspace clustering to present a K-biclusters clustering (KBC) algorithm for the microarray biclustering problem. Besides minimizing the dissimilarities between genes and bicluster centers within all biclusters, the objective function of the KBC algorithm additionally minimizes the residues within all biclusters based on the mean squared residue model. The objective function also maximizes the entropy of the conditions, to stimulate more conditions to contribute to the identification of biclusters. The KBC algorithm adopts a K-means-type clustering process to efficiently optimize the partition into K biclusters. A set of experiments on a practical microarray dataset demonstrates the performance of the proposed KBC algorithm.
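The mean squared residue minimized in the KBC objective is, under the usual Cheng-Church formulation the abstract alludes to, computed per bicluster sub-matrix as follows:

```python
import numpy as np

def mean_square_residue(B):
    """Cheng-Church mean squared residue of a bicluster sub-matrix B.

    Each residue r_ij = b_ij - rowmean_i - colmean_j + overall mean;
    an ideal additive bicluster (row effect + column effect) scores 0.
    """
    row = B.mean(axis=1, keepdims=True)
    col = B.mean(axis=0, keepdims=True)
    overall = B.mean()
    R = B - row - col + overall
    return (R ** 2).mean()
```

A residue near zero means the selected genes shift together across the selected conditions, which is exactly the co-regulation pattern a bicluster is meant to capture.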
Abstract: We propose an enhanced collaborative filtering method using Hofstede's cultural dimensions, calculated for 111 countries. We employ four of these dimensions, which are correlated with customers' buying behavior, in order to detect users' preferences for items. In addition, several advantages of this method are demonstrated for data sparseness and cold-start users, which are important challenges in collaborative filtering. We present experiments using a real dataset, the Book-Crossing dataset. Experimental results show that the proposed algorithm provides significant advantages in terms of improving recommendation quality.
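A hedged sketch of one way to blend rating similarity with cultural similarity, which is the general idea the abstract describes. The Hofstede scores below are illustrative placeholders (not the real country values) and the blend weight `alpha` is an assumption.

```python
import numpy as np

# Illustrative placeholder cultural profiles (4 dimensions, scaled to [0, 1]).
culture = {"US": [0.91, 0.62, 0.46, 0.26], "JP": [0.46, 0.95, 0.92, 0.88]}

def similarity(ratings_u, ratings_v, country_u, country_v, alpha=0.5):
    """Blend rating-vector cosine with cultural-profile cosine."""
    def cos(a, b):
        a, b = np.asarray(a, float), np.asarray(b, float)
        return (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    return alpha * cos(ratings_u, ratings_v) + (1 - alpha) * cos(
        culture[country_u], culture[country_v])
```

For a cold-start user with no ratings, the rating term vanishes and the cultural term alone still yields a nonzero similarity, which is how this kind of blend mitigates sparseness and the cold-start problem.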
Abstract: In this paper, a fuzzy algorithm and a fuzzy multi-criteria decision framework are developed and applied to the practical question of optimizing biofuels policy making. The methodological framework shows how to incorporate fuzzy set theory into the process of finding a sustainable biofuels policy among several policy options. Fuzzy set theory is used here as a tool to deal with the uncertainties of the decision environment, the vagueness and ambiguity of policy objectives, the subjectivity of human assessments, and imprecise and incomplete information about the evaluated policy instruments.
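A minimal sketch of a fuzzy multi-criteria ranking of the kind the framework describes, using triangular fuzzy scores and centroid defuzzification. The policies, scores and criterion weights are invented placeholders, and the specific aggregation is an assumption, not the paper's method.

```python
def defuzzify(tfn):
    """Centroid of a triangular fuzzy number (low, mode, high)."""
    low, mode, high = tfn
    return (low + mode + high) / 3.0

def rank_policies(scores, weights):
    """scores[p][c] = triangular fuzzy score of policy p on criterion c;
    weights[c] = crisp criterion weight. Returns (best-first order, crisp map)."""
    crisp = {p: sum(w * defuzzify(s) for s, w in zip(cs, weights))
             for p, cs in scores.items()}
    return sorted(crisp, key=crisp.get, reverse=True), crisp

scores = {  # hypothetical linguistic assessments mapped to fuzzy numbers
    "ethanol":   [(0.5, 0.7, 0.9), (0.3, 0.5, 0.7)],
    "biodiesel": [(0.2, 0.4, 0.6), (0.6, 0.8, 1.0)],
}
weights = [0.6, 0.4]  # e.g. sustainability vs. cost
order, crisp = rank_policies(scores, weights)
```

Encoding each expert assessment as a fuzzy number (rather than a single value) is what lets the framework carry the vagueness of policy objectives through to the final ranking instead of discarding it up front.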