Abstract: In this study, a classification-based video
super-resolution method using artificial neural network (ANN) is
proposed to enhance low-resolution (LR) to high-resolution (HR)
frames. The proposed method consists of four main steps:
classification, motion-trace volume collection, temporal adjustment,
and ANN prediction. A classifier is designed based on the edge
properties of a pixel in the LR frame to identify the spatial information.
To exploit the spatio-temporal information, a motion-trace volume is
collected using motion estimation, which can handle complex
object motion in the LR frames. In addition, a temporal lateral process is
employed for volume adjustment to reduce unnecessary temporal
features. Finally, an ANN is applied to each class to learn the complicated
spatio-temporal relationship between LR and HR frames. Simulation
results show that the proposed method successfully improves both
peak signal-to-noise ratio and perceptual quality.
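As a rough illustration of the classification step, a pixel class might be derived from the local gradient strength and orientation of its surrounding patch; the classifier below is a hypothetical sketch with illustrative thresholds, not the paper's design:

    import numpy as np

    def classify_pixel(patch):
        # Hypothetical edge-based classifier for one LR pixel: bin the
        # pixel by the dominant gradient orientation of its local patch,
        # with a separate class for flat regions.
        gy, gx = np.gradient(patch.astype(float))
        if np.hypot(gx, gy).mean() < 8.0:
            return 0                               # flat region
        angle = np.arctan2(gy.mean(), gx.mean())   # dominant orientation
        return 1 + int(((angle + np.pi) / (np.pi / 4)) % 8)  # 8 edge classes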
Abstract: A fault detection and identification (FDI) technique is
presented to create a fault-tolerant control (FTC) system. Fault
detection is achieved by monitoring the position of the light source
using an array of light sensors. When a decision is made about the
presence of a fault, an identification process is initiated to locate the
faulty component and reconfigure the controller signals. The signals
provided by the sensors are predictable; therefore, the existence of a
fault is easily identified. Identification of the faulty sensor is based on
the dynamics of the frame. The technique is not restricted to a
particular type of controller, and the results show consistency.
Abstract: In this paper, a framework is presented that aims to build
the most secure web system possible from available generic and web
security technologies, and that can serve as a guideline for
organizations building their web sites. The framework is designed to
provide the necessary security services, to address known security
threats, and to provide some protection against other security problems,
especially unknown threats. The requirements that guided the design
of the secure web system are discussed. The designed security
framework is then simulated, and various quality of service (QoS)
metrics are calculated to measure the performance of the system.
Abstract: Reverse engineering is a very important process in
software engineering. It works backwards through the system
development life cycle (SDLC) to recover the source data or
representations of a system through analysis of its structure,
function, and operation. We use reverse engineering to introduce an
automatic tool that generates system requirements from program
source code. The tool accepts Cµ source code, scans it
line by line, and parses it with a parser. The engine of the tool then
generates the system requirements for that specific program to
facilitate its reuse and enhancement. The purpose of the tool is to
help recover the system requirements of a system when the
system requirements document (SRD) does not exist because the
system is undocumented.
Abstract: Many attempts have been made to strengthen Feistel-based block ciphers. Among the successful proposals is the key-dependent S-box, which was implemented in some high-profile ciphers. In this paper, a key-dependent permutation box is proposed and implemented on DES as a case study. The modified DES, MDES, was tested against the Diehard tests, an avalanche test, and a performance test. The results showed that, in general, MDES is more resistant to attacks than DES, with negligible overhead. Therefore, it is believed that the proposed key-dependent permutation should be considered a valuable primitive that can help strengthen the security of the substitution-permutation network, which is a core design element in many Feistel-based block ciphers.
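To illustrate the general idea (this is a hypothetical sketch, not the MDES construction itself), a key-dependent permutation box can be derived by seeding a Fisher-Yates shuffle with material extracted from the key:

    import hashlib, random

    def key_dependent_pbox(key: bytes, width: int = 32) -> list:
        # Derive a deterministic permutation of bit positions from the key.
        seed = int.from_bytes(hashlib.sha256(key).digest(), "big")
        perm = list(range(width))
        random.Random(seed).shuffle(perm)  # key-seeded Fisher-Yates shuffle
        return perm

    def permute_bits(block: int, perm: list) -> int:
        # Apply the permutation to a `width`-bit block.
        out = 0
        for dst, src in enumerate(perm):
            out |= ((block >> src) & 1) << dst
        return out

Because the permutation depends on the key, an attacker cannot precompute it, which is the property such proposals rely on.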
Abstract: The world wide web coupled with the ever-increasing
sophistication of online technologies and software applications puts
greater emphasis on the need for even more sophisticated and
consistent quality requirements modeling than traditional software
applications. Web sites and Web applications (WebApps) are
becoming more information-driven and content-oriented, raising
concerns about their information quality (InQ). The consistent and
consolidated modeling of InQ requirements for WebApps at different
stages of the life cycle still poses a challenge. This paper proposes an
approach to specify InQ requirements for WebApps by reusing and
extending the ISO 25012:2008(E) data quality model. We also
discuss the learnability aspect of information quality for WebApps.
The proposed ISO 25012 based InQ framework is a step towards a
standardized approach to evaluate WebApps InQ.
Abstract: As the performance of a filtering system depends
upon the accuracy of its noise detection scheme, in this paper we
present a new scheme for impulse noise detection based on two
levels of decision. In the first stage, we coarsely identify candidate
corrupted pixels; in the second stage, we decide whether each
candidate pixel is truly corrupted.
The efficacy of the proposed filter has been confirmed by extensive
simulations.
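A minimal sketch of such a two-level decision, assuming median-based deviation tests with illustrative window sizes and thresholds (the paper's actual decision rules may differ):

    import numpy as np
    from scipy.ndimage import median_filter

    def detect_impulses(img, coarse_thr=60.0, fine_thr=25.0):
        # Level 1: coarse candidates deviate strongly from the 3x3 median.
        img = img.astype(float)
        candidates = np.abs(img - median_filter(img, size=3)) > coarse_thr
        # Level 2: confirm candidates against a larger 5x5 neighbourhood,
        # so isolated fine texture is not flagged as impulse noise.
        confirmed = candidates & (
            np.abs(img - median_filter(img, size=5)) > fine_thr)
        return confirmed    # boolean map of pixels judged corrupted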
Abstract: Location-aware computing is a type of pervasive
computing that utilizes the user's location as a dominant factor for
providing urban services and related applications. One of the
important urban services is navigation instruction for wayfinders in a
city, especially when the user is a tourist. The services presented to
tourists should provide adapted, location-aware instructions. To
achieve this goal, the main challenge is to find spatially relevant
objects and location-dependent information. The aim of this paper is
the development of a reusable location-aware model to handle spatial
relevancy parameters in urban location-aware systems. To this end,
we use an ontology to manage spatial relevancy by defining a
generic model. Our
contribution is the introduction of an ontological model based on the
directed interval algebra principles. Indeed, it is assumed that the
basic elements of our ontology are the spatial intervals for the user
and his/her related contexts. The relationships between them would
model the spatial relevancy parameters. The implementation language
for the model is OWL, the Web Ontology Language. The achieved
results show that our proposed location-aware model and the
application adaptation strategies provide appropriate services for the
user.
Abstract: In order to accelerate similarity search in high-dimensional databases, we propose a new hierarchical indexing method. It is composed of an offline phase and an online phase, and our contribution concerns both. In the offline phase, after gathering the data into clusters and constructing a hierarchical index, the main originality of our contribution is a method for constructing bounding forms of clusters that avoid overlapping. For the online phase, we have developed an adapted search algorithm that considerably improves the performance of similarity search. Our method, named NOHIS (Non-Overlapping Hierarchical Index Structure), uses Principal Direction Divisive Partitioning (PDDP) as its clustering algorithm. The principle of PDDP is to divide the data recursively into two sub-clusters; the division uses the hyperplane that is orthogonal to the principal direction derived from the covariance matrix and that passes through the centroid of the cluster to be divided. The data of each of the two resulting sub-clusters are enclosed in a minimum bounding rectangle (MBR), and the two MBRs are oriented along the principal direction; consequently, non-overlapping between the two forms is assured. Experiments use databases containing image descriptors. Results show that the proposed method outperforms sequential scan and the SR-tree in processing k-nearest-neighbour queries.
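A minimal sketch of one PDDP split as described above (NOHIS orients the MBRs along the principal direction; this sketch simplifies them to axis-aligned boxes):

    import numpy as np

    def pddp_split(X):
        # Split cluster X by the hyperplane orthogonal to the principal
        # direction (leading eigenvector of the covariance matrix) and
        # passing through the centroid.
        Xc = X - X.mean(axis=0)
        _, _, vt = np.linalg.svd(Xc, full_matrices=False)
        side = Xc @ vt[0]              # signed offset along principal direction
        left, right = X[side <= 0], X[side > 0]
        mbr = lambda A: (A.min(axis=0), A.max(axis=0))  # bounding box corners
        return (left, mbr(left)), (right, mbr(right))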
Abstract: The identification and classification of weeds are of
major technical and economic importance in the agricultural
industry. To automate these activities using features such as shape,
color, and texture, a weed control system is feasible. The goal of this paper is to
build a real-time, machine vision weed control system that can detect
weed locations. In order to accomplish this objective, a real-time
robotic system is developed to identify and locate outdoor plants
using machine vision technology and pattern recognition. The
algorithm is developed to classify images into broad and narrow classes
for real-time selective herbicide application. The developed
algorithm has been tested on weeds at various locations, and the tests
have shown the algorithm to be very effective in weed
identification. Furthermore, the results show very reliable performance
on weeds under varying field conditions. The analysis of the results
shows over 90 percent classification accuracy over 140 sample
images (broad and narrow) with 70 samples from each category of
weeds.
Abstract: Distributed denial-of-service (DDoS) attacks pose a
serious threat to network security. Many methodologies and tools
have been devised to detect DDoS attacks and reduce
the damage they cause. Still, most methods cannot
simultaneously achieve (1) efficient detection with a small number of
false alarms and (2) real-time transfer of packets. Here, we introduce
a method for proactive detection of DDoS attacks, by classifying the
network status, to be utilized in the detection stage of the proposed
anti-DDoS framework. Initially, we analyse the DDoS architecture
and obtain details of its phases. Then, we investigate the procedures
of DDoS attacks and select variables based on these features. Finally,
we apply the k-nearest neighbour (k-NN) method to classify the
network status into each phase of a DDoS attack. The simulation
results showed that each phase of the attack scenario is classified well
and that DDoS attacks can be detected at an early stage.
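A minimal k-NN classification sketch for the detection stage; the feature vectors and the phase labels below are illustrative placeholders, not the paper's selected variables:

    import numpy as np
    from collections import Counter

    def knn_classify(x, train_X, train_y, k=5):
        # Majority vote among the k nearest training samples (Euclidean).
        dists = np.linalg.norm(train_X - x, axis=1)
        nearest = np.argsort(dists)[:k]
        return Counter(train_y[i] for i in nearest).most_common(1)[0][0]

    # Hypothetical phase labels for the network status:
    # 0 = normal, 1 = agent recruitment/scanning, 2 = flooding attack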
Abstract: The authors have been developing several models
based on artificial neural networks, linear regression models, Box-
Jenkins methodology and ARIMA models to predict the time series
of tourism. The time series consists of the “Monthly Number of Guest
Nights in Hotels” of one region. Several comparisons between the
different types of models have been carried out, as have comparisons
between the features used as model inputs. The Artificial Neural
Network (ANN) models have consistently performed among the best.
The feed-forward architecture is usually chosen because of its wide
applicability and good results. In this paper, the authors compare
different ANN architectures using exactly the same inputs: the
traditional feed-forward architecture, a cascade-forward architecture,
a recurrent Elman architecture, and a radial basis function
architecture are discussed and compared on the task of predicting the
mentioned time series.
Abstract: This paper takes the actual scene of the Aletheia
University campus – a Class 2 national monument and the first
educational institute in northern Taiwan – as an example to present a
3D virtual navigation system that supports user positioning and a
pre-download mechanism. The proposed system is designed on the
principle of the Voronoi diagram to divide the virtual scenes and
their multimedia information, combining outdoor GPS positioning
with an indoor RFID location-detection function. When users
carrying mobile equipment such as notebook computers, UMPCs, or
EeePCs walk around the actual indoor and outdoor areas of the
campus, the system automatically detects their moving path and
pre-downloads the needed data, so that users enjoy smooth, seamless
navigation without waiting.
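Since membership in a Voronoi cell reduces to a nearest-seed query, the scene selection can be sketched as below (the anchor coordinates and the pre-fetch policy are hypothetical, not the system's actual data):

    import numpy as np
    from scipy.spatial import cKDTree

    anchors = np.array([[0.0, 0.0], [10.0, 2.0], [4.0, 8.0]])  # scene seeds
    tree = cKDTree(anchors)

    def scene_to_predownload(user_pos):
        # The Voronoi cell containing the user is the cell of the nearest
        # anchor, so a k-d tree nearest-neighbour query suffices.
        _, idx = tree.query(user_pos)
        return idx        # index of the scene whose media to pre-fetch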
Abstract: Information sharing and gathering are important in this era of rapid technological advancement. The existence of the WWW has caused a rapid explosion of information. Readers are overloaded with lengthy text documents when they are more interested in shorter versions. The oil and gas industry cannot escape this predicament. In this paper, we develop an automated text summarization system, known as AutoTextSumm, to extract the salient points of oil and gas drilling articles by incorporating a statistical approach, keyword identification, synonym words, and sentence position. In this study, we conducted interviews with petroleum engineering experts and English language experts to identify the most commonly used keywords in the oil and gas drilling domain. The performance of AutoTextSumm is evaluated using precision, recall, and F-score. Based on the experimental results, AutoTextSumm produces satisfactory performance, with an F-score of 0.81.
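For reference, the standard precision/recall/F-score evaluation mentioned above can be sketched over sentence sets as follows (this mirrors the usual definitions, not AutoTextSumm's internals):

    def evaluate_summary(extracted, reference):
        # extracted, reference: sets of sentences (or sentence ids)
        tp = len(extracted & reference)
        precision = tp / len(extracted) if extracted else 0.0
        recall = tp / len(reference) if reference else 0.0
        f_score = (2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
        return precision, recall, f_score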
Abstract: In this paper, we propose a robust water-level detection method based on an accumulated histogram of images, with only small frame-to-frame changes, acquired from a water-level surveillance camera. A general surveillance system detects and recognizes intrusion by searching for large changes between sequential images. In a water-level detection system, however, these general surveillance techniques are not suitable because the changes on the water surface are small. The algorithm therefore introduces an accumulated histogram that emphasizes changes of the water surface across sequential images. The accumulated histogram is based on the current image frame: it accumulates the differences between previous images and the current image. However, these differences also appear in the land region; a band-pass filter removes such noise from the accumulated histogram. Finally, the algorithm clearly separates the water and land regions. After these steps, the algorithm converts the water level in image space to the real water level in real space using a calibration table. The detected water level is sent to the host computer together with the current image. To evaluate the proposed algorithm, we use test images from various situations.
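A rough sketch of the accumulation step, assuming grayscale frames and reading the water line from a row-wise change profile (the band-pass filter and the calibration table are omitted here):

    import numpy as np

    def accumulated_change_profile(frames):
        # Accumulate absolute differences between each previous frame and
        # the current (last) frame; the moving water surface builds up
        # change while the static land region stays comparatively quiet.
        current = frames[-1].astype(float)
        acc = np.zeros_like(current)
        for prev in frames[:-1]:
            acc += np.abs(current - prev.astype(float))
        return acc.sum(axis=1)  # per-row change; water line appears as a peak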
Abstract: This paper deals with the application of Principal Component Analysis (PCA) and Hotelling's T² chart, using data collected from a drinking water treatment process. PCA is applied primarily for dimensional reduction of the collected data, while Hotelling's T² control chart is used for fault detection in the process. The data were taken from a United Utilities multistage water treatment works, downloaded from an Integrated Program Management (IPM) dashboard system. The analysis of the results shows that Multivariate Statistical Process Control (MSPC) techniques such as PCA, and control charts such as Hotelling's T², can be effectively applied for the early fault detection of continuous multivariable processes such as drinking water treatment. The software package SIMCA-P was used to develop the MSPC models and Hotelling's T² chart from the collected data.
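For reference, in the textbook MSPC formulation (not reproduced from the paper), the T² statistic on the $A$ retained principal component scores $t_i$ with variances $\lambda_i$ is

$$T^2 = \sum_{i=1}^{A} \frac{t_i^2}{\lambda_i},$$

with an upper control limit derived from the F-distribution, $\mathrm{UCL} = \frac{A(n-1)(n+1)}{n(n-A)}\, F_{\alpha}(A,\, n-A)$ for $n$ training observations; a sample whose $T^2$ exceeds the limit is flagged as a potential fault.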
Abstract: Fractional Fourier Transform is a powerful tool,
which is a generalization of the classical Fourier Transform. This
paper provides a mathematical relation between the span in the
Fractional Fourier domain and the amplitude and phase functions of
the signal, which is further used to study the variation of the quality factor with
different values of the transform order. It is seen that with the
increase in the number of transients in the signal, the deviation of
average Fractional Fourier span from the frequency bandwidth
increases. Also, with the increase in the transient nature of the signal,
the optimum value of transform order can be estimated based on the
quality factor variation, and this value is found to be very close to
that for which one can obtain the most compact representation.
Through the mathematical analysis and experimentation, we
consolidate the fact that the Fractional Fourier Transform gives more
compact representations than the Fourier transform for a range of
transform orders.
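For reference, the standard definition of the Fractional Fourier Transform of order $a$ (rotation angle $\alpha = a\pi/2$) is

$$X_\alpha(u) = \int_{-\infty}^{\infty} x(t)\, K_\alpha(t,u)\, dt, \qquad K_\alpha(t,u) = \sqrt{\frac{1 - j\cot\alpha}{2\pi}}\; \exp\!\left( j\,\frac{t^2 + u^2}{2}\cot\alpha - j\,t u \csc\alpha \right),$$

which reduces to the ordinary Fourier transform at $a = 1$ ($\alpha = \pi/2$), where $\cot\alpha = 0$ and $\csc\alpha = 1$.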
Abstract: Cloud computing is an approach that provides computation and storage services on demand to clients over the network, independent of device and location. In the last few years, cloud computing has become a trend in information technology, with many companies transferring their business processes and applications to the cloud. Cloud computing with service-oriented architecture has contributed to the rapid development of Geographic Information Systems (GIS). The Open Geospatial Consortium (OGC), with its standards, provides the interfaces through which hosted spatial data and GIS functionality can be integrated into GIS applications. Furthermore, with their enormous processing power, clouds provide an efficient environment in which data-intensive applications can be performed efficiently, with higher precision and greater reliability. This paper presents our work on geospatial data services within the cloud computing environment and its technology. A cloud computing environment, together with the strengths and weaknesses of the geographic information system, is introduced. The OGC standards that solve our application interoperability are highlighted. Finally, we outline our system architecture, with utilities for requesting and invoking our developed data-intensive applications as a web service.
Abstract: A censored production rule (CPR) is an extension of the
standard production rule that addresses the problems of reasoning
with incomplete information, subject to resource constraints, and of
reasoning efficiently with exceptions. A CPR has the form IF A
(Condition) THEN B (Action) UNLESS C (Censor), where C is the
exception condition. Fuzzy CPRs are obtained by augmenting an
ordinary fuzzy production rule “If X is A then Y is B” with an
exception condition, and are written in the form “If X is A then Y is B
unless Z is C”. Such rules are employed in situations in which the
fuzzy conditional statement “If X is A then Y is B” holds frequently
and the exception condition “Z is C” holds rarely. Thus the “If X is A
then Y is B” part of the fuzzy CPR expresses important information,
while the unless part acts only as a switch that changes the polarity of
“Y is B” to “Y is not B” when the assertion “Z is C” holds. The
proposed approach is an attempt to discover fuzzy censored
production rules from a set of discovered fuzzy if-then rules of the
form:
A(X) → B(Y) || C(Z).
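A toy sketch of the switching semantics described above, using min as the fuzzy conjunction and an illustrative censor threshold (this illustrates the rule semantics only, not the paper's discovery algorithm):

    def fire_fuzzy_cpr(mu_A, mu_B, mu_C, censor_threshold=0.5):
        # 'If X is A then Y is B unless Z is C': when the censor holds,
        # the polarity of the conclusion flips from 'Y is B' to 'Y is not B'.
        if mu_C >= censor_threshold:
            return 1.0 - min(mu_A, mu_B)   # degree of 'Y is not B'
        return min(mu_A, mu_B)             # degree of 'Y is B'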
Abstract: In this paper, we propose a dual version of the first
threshold ring signature scheme based on error-correcting codes,
proposed by Aguilar et al. in [1]. Our scheme uses an improvement of
Véron's zero-knowledge identification scheme, which provides smaller
public and private key sizes and better computational complexity than
Stern's scheme. This scheme is secure in the random oracle model.