Abstract: There are many problems associated with the World Wide
Web: getting lost in hyperspace, web content that is still accessible
only to humans, and the difficulty of web administration. The solution
to these problems is the Semantic Web, which is considered an
extension of the current web that presents information in both
human-readable and machine-processable form. The aim of this study is
to arrive at a new generic foundation architecture for the Semantic
Web, because it currently has no clear architecture: four versions
exist, but there is still no agreement on any of them, nor is there a
clear picture of the relation between the different layers and
technologies inside the architecture. This can be done by building on
the ideas of the previous versions as well as Gerber's evaluation
method, as a step toward agreement on a single Semantic Web
architecture.
Abstract: This study focuses on the development of triangular fuzzy numbers, the revision of triangular fuzzy numbers, and the construction of a half-circle fuzzy number (HCFN) model that can be used to perform a wider range of operations. These numbers are further expressed in terms of trigonometric functions and polar coordinates. From half-circle fuzzy numbers we can derive cylindrical fuzzy numbers, which behave better in algebraic operations. An example of fuzzy control is given in a simulation to show the applicability of the proposed half-circle fuzzy numbers.
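The abstract does not reproduce the HCFN membership definition; as a rough illustration only, the sketch below contrasts a standard triangular membership function with a hypothetical half-circle profile (the upper half of a circle, scaled so peak membership is 1):

```python
import math

def triangular(x, a, b, c):
    """Membership of x in a triangular fuzzy number with support
    [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def half_circle(x, center, radius):
    """Membership of x in an assumed half-circle fuzzy number: the
    upper half of a circle around `center`, scaled to peak at 1.
    This profile is an illustrative assumption, not the paper's
    definition."""
    if abs(x - center) > radius:
        return 0.0
    return math.sqrt(1.0 - ((x - center) / radius) ** 2)
```

Near the edges of the support the half-circle profile stays higher than the triangular one, which hints at why circular shapes can behave differently under algebraic operations.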
Abstract: Since dealing with high-dimensional data is
computationally complex and sometimes even intractable, several
feature reduction methods have recently been developed to reduce the
dimensionality of the data in order to simplify analysis in
applications such as text categorization, signal processing, image
retrieval, and gene expression analysis. Among feature reduction
techniques, feature selection is one of the most popular because it
preserves the original features. In this paper, we propose a new
unsupervised feature selection method that removes redundant features
from the original feature space by using the probability density
functions of the features. To show the effectiveness of the proposed
method, popular feature selection methods have been implemented and
compared. Experimental results on several datasets from the UCI
repository illustrate the effectiveness of our proposed method in
comparison with the other methods in terms of both classification
accuracy and the number of selected features.
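The paper's exact redundancy criterion is not given in the abstract; the sketch below illustrates the general idea with a hypothetical rule: estimate each feature's probability density by histogram binning, and drop a feature whose density is within a total-variation threshold of an already-kept feature's density.

```python
from collections import Counter

def histogram_pdf(values, bins=5):
    """Estimate a discrete PDF of one feature by histogram binning."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0
    counts = Counter(min(int((v - lo) / width), bins - 1) for v in values)
    n = len(values)
    return [counts.get(b, 0) / n for b in range(bins)]

def select_features(data, threshold=0.2):
    """Keep a feature only if its estimated PDF differs from every
    already-kept feature's PDF by more than `threshold` in total
    variation distance (an illustrative criterion, not the paper's)."""
    kept = []
    for j in range(len(data[0])):
        col = [row[j] for row in data]
        pdf = histogram_pdf(col)
        redundant = any(
            0.5 * sum(abs(p - q) for p, q in zip(pdf, kept_pdf)) <= threshold
            for _, kept_pdf in kept)
        if not redundant:
            kept.append((j, pdf))
    return [j for j, _ in kept]
```

On a dataset where one column duplicates another, only one of the pair survives, while a column with a clearly different distribution is kept.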
Abstract: This paper aims to extend Jon Kleinberg's research. He introduced the small-world structure in a grid and showed that a greedy algorithm using only local information is able to find a route between source and target in expected delivery time O(log^2 n). His fundamental model for distributed systems uses a two-dimensional grid with long-range random links added between nodes u and v with probability proportional to d(u,v)^-2. We propose that, with additional information about nearby long links, we can find a shorter path. We apply an ant colony system whose messengers distribute their pheromone, carrying the long-link details, in the surrounding area. Subsequent forwarding decisions then have more options: move to a local neighbor, or send to a node whose long link is closer to the target. Our experimental results support our approach: the average routing time with Color Pheromone is faster than the greedy method.
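Kleinberg's grid model and the baseline greedy rule can be sketched as follows (the ant-colony pheromone extension is omitted; the construction and parameters here are illustrative):

```python
import random

def dist(u, v):
    """Manhattan (lattice) distance between grid nodes."""
    return abs(u[0] - v[0]) + abs(u[1] - v[1])

def build_long_links(n, seed=0):
    """One long-range contact per node, chosen with probability
    proportional to d(u, v)^-2, as in Kleinberg's model."""
    rng = random.Random(seed)
    nodes = [(x, y) for x in range(n) for y in range(n)]
    links = {}
    for u in nodes:
        others = [v for v in nodes if v != u]
        weights = [dist(u, v) ** -2 for v in others]
        links[u] = rng.choices(others, weights=weights)[0]
    return links

def greedy_route(n, links, source, target):
    """Greedy routing: always forward to the neighbor (lattice or
    long link) that is closest to the target."""
    cur, steps = source, 0
    while cur != target:
        x, y = cur
        neighbors = [(x + dx, y + dy)
                     for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                     if 0 <= x + dx < n and 0 <= y + dy < n]
        neighbors.append(links[cur])
        cur = min(neighbors, key=lambda v: dist(v, target))
        steps += 1
    return steps
```

Since a lattice neighbor always reduces the distance by one, greedy routing never takes more steps than the initial Manhattan distance; long links only shorten the route further.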
Abstract: This paper investigates the use of mobile phones and
tablets for learning purposes among university students in Saudi
Arabia. For this purpose, an extended Technology Acceptance Model
(TAM) is proposed to analyze the adoption of mobile devices and
smartphones by Saudi university students for accessing course
materials, searching the web for information related to their
discipline, sharing knowledge, and conducting assignments.
Abstract: The development of Artificial Neural Networks
(ANNs) is usually a slow process in which a human expert has to test
several architectures until finding the one that achieves the best
results for a given problem. This work presents a new technique that
uses Genetic Programming (GP) for automatically generating ANNs. To
do this, the GP algorithm had to be changed to work with graph
structures, so that ANNs can be developed. This technique also allows
obtaining simplified networks that solve the problem with a small
number of neurons. In order to measure the performance of the system
and to compare the results with other ANN development methods based
on Evolutionary Computation (EC) techniques, several tests were
performed on problems drawn from some of the most widely used test
databases. The results of those comparisons show that the system
achieves results comparable with existing techniques and, in most
cases, better than those techniques.
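The paper evolves network topologies with a graph-based GP; as a much simpler illustration of developing ANNs by evolutionary computation, the sketch below uses a plain genetic algorithm with elitism to evolve the weights of a fixed 2-2-1 network on XOR (all names and parameters are illustrative, not the paper's method):

```python
import math, random

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(w, x):
    """Fixed 2-2-1 network: two tanh hidden units, linear output."""
    h1 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return w[6] * h1 + w[7] * h2 + w[8]

def mse(w):
    """Fitness: mean squared error over the XOR cases (lower is better)."""
    return sum((forward(w, x) - y) ** 2 for x, y in XOR) / len(XOR)

def evolve(pop_size=40, generations=100, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(9)] for _ in range(pop_size)]
    history = []
    for _ in range(generations):
        pop.sort(key=mse)
        history.append(mse(pop[0]))
        nxt = pop[:2]                                   # elitism: keep the two best
        while len(nxt) < pop_size:
            parent = min(rng.sample(pop, 3), key=mse)   # tournament selection
            nxt.append([g + rng.gauss(0, 0.3) for g in parent])  # mutation
        pop = nxt
    pop.sort(key=mse)
    return pop[0], history
```

With elitism, the best fitness in each generation never gets worse, which is the property the paper relies on when injecting the best solutions of previous runs into later ones.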
Abstract: In this paper, the optimality of the solution of an
existing real-world assignment problem, known as the seat assignment
problem, using the Seat Assignment Method (SAM) is discussed. SAM is
a method newly derived from three existing methods, the Hungarian
Method, the Northwest Corner Method and the Least Cost Method, in a
special way that combines the simplicity and fairness of the methods
that solve the seat assignment problem.
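SAM itself is not specified in the abstract; for small instances, the underlying assignment problem can be solved exactly by exhaustive search, which is a useful baseline for checking any method's optimality:

```python
from itertools import permutations

def optimal_assignment(cost):
    """Exhaustive search over all row-to-column assignments of a
    square cost matrix; feasible only for small instances, but it
    returns the provably optimal assignment."""
    n = len(cost)
    best_cols, best_cost = None, float("inf")
    for cols in permutations(range(n)):
        total = sum(cost[r][c] for r, c in enumerate(cols))
        if total < best_cost:
            best_cols, best_cost = cols, total
    return best_cols, best_cost
```

For the 3x3 cost matrix [[4, 1, 3], [2, 0, 5], [3, 2, 2]], the optimal assignment sends rows 0, 1, 2 to columns 1, 0, 2 at total cost 5, which is what the Hungarian Method would also report.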
Abstract: As the latest technological improvements show, digital systems have become more popular than in the past. Despite this growing demand for digital systems, content copying and attacks against digital cinema content have become a serious problem. To solve this security problem, we propose traceable watermarking using hash functions for a digital cinema system. Digital cinema is a natural application for traceable watermarking, since it uses watermarking technology during content playback as well as content transmission. The watermark is embedded into randomly selected movie frames using the CRC-32 technique. CRC-32 is a hash function; using it, the embedding positions are distributed so that no party can remove or alter the watermark. Finally, our experimental results show that the proposed DWT watermarking method using CRC-32 outperforms conventional watermarking techniques in terms of robustness, image quality, and its simple but unbreakable algorithm.
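The abstract does not detail how CRC-32 selects the frames; one plausible sketch, using an illustrative key-plus-counter scheme (not the paper's exact construction), derives key-dependent embedding positions like this:

```python
import zlib

def embedding_frames(key, n_frames, n_marks):
    """Derive pseudo-random, key-dependent frame indices by hashing
    the key concatenated with a counter through CRC-32. Anyone who
    knows the key can recompute the positions; others cannot predict
    them. Requires n_marks <= n_frames."""
    frames = []
    counter = 0
    while len(frames) < n_marks:
        h = zlib.crc32(f"{key}:{counter}".encode())
        idx = h % n_frames
        if idx not in frames:        # avoid embedding twice in one frame
            frames.append(idx)
        counter += 1
    return frames
```

The selection is deterministic given the key, so a detector holding the same key recovers exactly the frames carrying the watermark.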
Abstract: This paper proposes a novel methodology for enabling
debugging and tracing of production web applications without
affecting their normal flow and functionality. This method of
debugging enables developers and maintenance engineers to replace a
set of existing resources, such as images, server-side scripts and
cascading style sheets, with another set of resources per web
session. The new resources are only active in the debug session;
other sessions are not affected. This methodology will help
developers trace defects, especially those that appear only in
production environments, and explore the behaviour of the system. A
realization of the proposed methodology has been implemented in Java.
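A minimal sketch of the session-scoped override idea (the class and method names are hypothetical, not the paper's Java API):

```python
class DebugResourceResolver:
    """Per-session resource overrides: only the debug session sees
    the replacement resources; every other session gets the
    originals, so production traffic is unaffected."""

    def __init__(self):
        self.overrides = {}  # session_id -> {path: replacement_path}

    def replace(self, session_id, path, replacement):
        """Register a replacement resource for one session only."""
        self.overrides.setdefault(session_id, {})[path] = replacement

    def resolve(self, session_id, path):
        """Return the session's override if one exists, else the
        original resource path."""
        return self.overrides.get(session_id, {}).get(path, path)
```

A request routed through the resolver with the debug session's id transparently receives the instrumented script, while all other sessions keep serving the original file.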
Abstract: This paper presents an approach for repairing word order errors in English text by reordering the words in a sentence and choosing the version that maximizes the number of trigram hits according to a language model. A possible way of reordering the words is to generate all permutations; the problem is that for a sentence of N words the number of permutations is N!. The novelty of this method is the use of an efficient confusion matrix technique for reordering the words, designed to reduce the search space of permuted sentences. The limitation of the search space is achieved using the statistical inference of N-grams. The results of this technique are very promising and show that the number of permuted sentences can be reduced by 98.16%. For experimental purposes a test set of TOEFL sentences was used, and the results show that more than 95% of the sentences can be repaired using the proposed method.
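The confusion matrix pruning itself is not specified in the abstract; the toy sketch below shows only the underlying scoring step, exhaustively reordering a short sentence and picking the ordering with the most trigram hits (the paper avoids this exhaustive search for longer sentences):

```python
from itertools import permutations

def trigrams(tokens):
    """Set of word trigrams occurring in a token sequence."""
    return {tuple(tokens[i:i + 3]) for i in range(len(tokens) - 2)}

def repair_order(scrambled, corpus_sentences):
    """Score every permutation of the scrambled words by the number
    of its trigrams found in the corpus, and return the
    highest-scoring ordering."""
    model = set()
    for sent in corpus_sentences:
        model |= trigrams(sent.split())

    def score(candidate):
        return sum(1 for t in trigrams(list(candidate)) if t in model)

    return list(max(permutations(scrambled), key=score))
```

Because consecutive trigrams overlap by two words, a permutation whose trigrams all hit the model must reproduce the corpus word order exactly, so the scrambled sentence is repaired.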
Abstract: This paper presents an alternative approach that uses an
artificial neural network to simulate flood level dynamics in a
river basin. The algorithm was developed in a decision support
system environment in order to enable users to process the data. The
decision support system is found to be useful due to its interactive
nature, flexibility of approach and evolving graphical features, and
can be adopted for any similar situation to predict the flood level.
The main data processing includes gauging station selection, input
generation, lead-time selection/generation, and length of prediction.
The program enables users to process the flood level data, to
train/test the model using various inputs and to visualize results.
The program code consists of a set of files, which can be modified to
match other purposes. The running results indicate that the decision
support system applied to the flood level has reached encouraging
results for the river basin under examination. The comparison of the
model predictions with the observed data was satisfactory; the model
is able to forecast the flood level up to 5 hours in advance with
reasonable accuracy. Finally, this program may also serve as a tool
for real-time flood monitoring and process control.
Abstract: Graph partitioning is an NP-hard problem with multiple
conflicting objectives: the partitioning should minimize the
inter-partition relationships while maximizing the intra-partition
relationships, and the load should be evenly distributed over the
partitions. It is therefore a multi-objective optimization (MOO)
problem. One approach to MOO is Pareto optimization, which is used in
this paper. The methods proposed here to improve performance are
injecting the best solutions of previous runs into the first
generation of subsequent runs, and storing the non-dominated set of
previous generations to combine with later generations'
non-dominated sets. These improvements prevent the GA from getting
stuck in local optima and increase the probability of finding more
optimal solutions. Finally, a simulation study is carried out to
investigate the effectiveness of the proposed algorithm; the
simulation results confirm the effectiveness of the proposed method.
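The core Pareto machinery such a GA relies on can be sketched as follows (minimization of all objectives is assumed; `merge_fronts` mirrors the idea of combining a stored non-dominated set with a later generation's):

```python
def dominates(a, b):
    """a Pareto-dominates b (minimization): no worse in every
    objective and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(solutions):
    """Filter a list of objective vectors down to its non-dominated
    set."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

def merge_fronts(previous, current):
    """Combine a stored non-dominated set with the current
    generation's, keeping only what survives dominance checks."""
    return pareto_front(previous + current)
```

Carrying the merged front across generations means a good trade-off discovered early can never be silently lost, which is how the described improvements help the GA escape local optima.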
Abstract: This paper presents a novel approach to assessing textile porosity by applying image analysis techniques. Images of different types of sample fabrics, taken through a microscope while the fabric is placed over a constant light source, transfer the problem into the image analysis domain. Porosity can thus be expressed in terms of a brightness percentage index calculated on the digital microscope image. Furthermore, it is meaningful to compare the brightness percentage index with the air permeability and tightness indices of each fabric type. We have experimentally shown that there exists an approximately linear relation between the brightness percentage and air permeability indices.
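Assuming the index is the percentage of above-threshold (light-passing) pixels, which the abstract does not confirm, a minimal sketch:

```python
def brightness_index(image, threshold=128):
    """Percentage of pixels brighter than `threshold` in a grayscale
    image given as a list of rows of 0-255 values. The thresholding
    rule is a hypothetical stand-in for the paper's brightness
    percentage index."""
    pixels = [p for row in image for p in row]
    bright = sum(1 for p in pixels if p >= threshold)
    return 100.0 * bright / len(pixels)
```

Light shining through the pores produces the bright pixels, so a more porous fabric yields a higher index, which is what makes the comparison with air permeability meaningful.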
Abstract: The lack of security obstructs large-scale deployment of the multicast communication model. Therefore, a host of research works have been carried out to deal with issues related to securing multicast, such as confidentiality, authentication, non-repudiation, integrity and access control. Many applications, such as broadcasting stock quotes and videoconferencing, require authenticating the source of the received traffic, and hence source authentication is a required component of the whole multicast security architecture. In this paper, we propose a new and efficient source authentication protocol that guarantees non-repudiation for multicast flows and tolerates packet loss. We have simulated our protocol using NS-2, and the simulation results show that the protocol achieves improvements over protocols in the same category.
Abstract: A new method, based on NormalShrink and a modified version
of the method of Katsaggelos and Lay, is proposed for multiscale
blind image restoration. The method deals with both noise and blur in
images. It is shown that NormalShrink gives the highest S/N
(signal-to-noise ratio) in the image denoising process. The
multiscale blind image restoration is divided into two parts: the
first part of this paper proposes NormalShrink for image denoising,
and the second part proposes a modified version of the method of
Katsaggelos and Lay for blur estimation, combining both methods to
reach a multiscale blind image restoration.
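A minimal sketch of the denoising step: soft thresholding of wavelet coefficients with a NormalShrink-style threshold lambda = sigma_n^2 / sigma_y (the scale factor beta = sqrt(ln(L/J)) used by full NormalShrink is omitted here for simplicity, so this is only an approximation of the method):

```python
import math

def soft_threshold(coeffs, lam):
    """Soft thresholding: shrink every coefficient toward zero by
    lam, zeroing those with magnitude below lam."""
    return [math.copysign(max(abs(c) - lam, 0.0), c) for c in coeffs]

def normal_shrink_lambda(subband, noise_band):
    """Simplified NormalShrink-style threshold: sigma_n^2 / sigma_y,
    with the noise level sigma_n estimated from a diagonal-detail
    band via the robust median estimator median(|c|) / 0.6745."""
    median_abs = sorted(abs(c) for c in noise_band)[len(noise_band) // 2]
    sigma_n = median_abs / 0.6745
    sigma_y = math.sqrt(sum(c * c for c in subband) / len(subband))
    return sigma_n ** 2 / max(sigma_y, 1e-12)
```

Applying `soft_threshold` subband by subband with a per-subband lambda is what makes the shrinkage adaptive across scales.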
Abstract: The increase in the demand for IT resources is driving
enterprises to use the cloud as a cheap and scalable solution. Cloud
computing's promises are achieved by using the virtual machine as the
basic unit of computation. However, a virtual machine's predefined
settings might not be enough to handle jobs' QoS requirements. This
paper addresses the problem of mapping jobs that have critical start
deadlines to virtual machines that have predefined specifications.
These virtual machines are hosted by physical machines and share a
fixed amount of bandwidth. This paper proposes an algorithm that uses
idle virtual machines' bandwidth to increase the quota of other
virtual machines nominated as executors of urgent jobs. The algorithm
and an empirical study are given to evaluate the impact of the
proposed model on impatient jobs. The results show the importance of
dynamic bandwidth allocation in a virtualized environment and its
effect on the throughput metric.
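A toy sketch of the reallocation idea, assuming idle VMs' bandwidth is pooled and split equally among the urgent executors (the paper's actual policy may differ):

```python
def reallocate(vms, urgent_ids):
    """Redistribute idle VMs' bandwidth equally among the VMs
    nominated as executors of urgent jobs. The host's total
    bandwidth is conserved. `vms` is a list of dicts with keys
    'id', 'bw' and 'idle' (an illustrative representation)."""
    idle = [v for v in vms if v["idle"] and v["id"] not in urgent_ids]
    urgent = [v for v in vms if v["id"] in urgent_ids]
    if not urgent:
        return vms
    pool = sum(v["bw"] for v in idle)
    for v in idle:
        v["bw"] = 0          # borrow the idle quota
    share = pool / len(urgent)
    for v in urgent:
        v["bw"] += share     # boost the urgent executors
    return vms
```

Since the pooled bandwidth is exactly redistributed, the physical machine's fixed bandwidth budget is respected while urgent jobs get a larger quota.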
Abstract: This paper presents an ESN-based Arabic phoneme
recognition system trained with supervised, forced, and combined
supervised/forced learning algorithms. Mel-Frequency Cepstrum
Coefficients (MFCCs) and Linear Predictive Coding (LPC) techniques
are used and compared as the input feature extraction techniques.
The system is evaluated using 6 speakers from the King Abdulaziz
Arabic Phonetics Database (KAPD) for the Saudi Arabian dialect and 34
speakers from the Center for Spoken Language Understanding
(CSLU2002) database, covering speakers of different dialects from 12
Arabic countries. Results for the KAPD and CSLU2002 Arabic databases
show phoneme recognition performances of 72.31% and 38.20%,
respectively.
Abstract: Segmenting the lungs in medical images is a
challenging and important task for many applications. In particular,
automatic segmentation of lung cavities from multiple magnetic
resonance (MR) images is very useful for oncological applications
such as radiotherapy treatment planning. However, distinguishing the
lung areas is not trivial due to largely varying lung shapes, low
contrast and poorly defined boundaries. In this paper, we address the
lung segmentation problem for pulmonary magnetic resonance images
and propose an automated method based on a robust region-aided
geometric snake that incorporates a modified diffused region force
into the standard geometric model definition. The extra region force
gives the snake a global, complementary view of the lung boundary
information within the image, which, along with the local gradient
flow, helps detect fuzzy boundaries. The proposed method has been
successful in segmenting the lungs in every slice of 30 magnetic
resonance images with 80 consecutive slices in each image. We present
results comparing our automatic method to manually segmented lung
cavities provided by an expert radiologist and to those of previous
works, showing encouraging results and the high robustness of our
approach.
Abstract: In this paper we present a detailed study of biomedical
images and tag them with some basic extracted features (e.g., color,
pixel value). The classification is done using a nearest neighbor
classifier with various distance measures as well as an automatic
combination of the classifier results. This process selects a subset
of relevant features from a group of features of the image. It also
helps to acquire a better understanding of the image by describing
which features are important. The accuracy can be improved by
increasing the number of selected features. Various types of
classifiers have been developed for medical images, such as the
Support Vector Machine (SVM), which is used for classifying bacterial
types. The Ant Colony Optimization method is used to obtain optimal
results; it has high approximation capability and much faster
convergence. Texture feature extraction is based on methods such as
Gabor wavelets.
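A minimal sketch of a nearest neighbor classifier with two distance measures and a majority-vote combination of their outputs (the dataset and parameters are illustrative):

```python
from collections import Counter

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def manhattan(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

def knn_predict(train, query, k=3, metric=euclidean):
    """k-nearest-neighbor vote under one distance measure; `train`
    is a list of (feature_vector, label) pairs."""
    neighbors = sorted(train, key=lambda item: metric(item[0], query))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

def combined_predict(train, query, k=3):
    """Combine the per-metric classifier outputs by majority vote,
    mirroring the automatic combination of classifier results."""
    votes = [knn_predict(train, query, k, m)
             for m in (euclidean, manhattan)]
    return Counter(votes).most_common(1)[0][0]
```

Combining metrics this way hedges against any single distance measure being a poor fit for a particular feature subset.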
Abstract: In this paper, a mathematical model for data object replication in ad hoc networks is formulated. The derived model is general, flexible and adaptable, catering for various applications in ad hoc networks. We propose a game-theoretic technique in which players (mobile hosts) continuously compete in a non-cooperative environment to improve data accessibility by replicating data objects. The technique incorporates the access frequency from mobile hosts to each data object, the status of the network connectivity, and communication costs. The proposed technique is extensively evaluated against four well-known ad hoc network replica allocation methods. The experimental results reveal that the proposed approach outperforms the four techniques in both execution time and solution quality.
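The abstract does not give the players' payoff functions; as a toy stand-in for one move of such a game, the sketch below shows a single host's best response: replicate the objects whose access frequency most exceeds their replication cost, up to the host's capacity (all names and the benefit formula are illustrative):

```python
def best_response(host_freqs, costs, capacity):
    """One mobile host's best response in a hypothetical replication
    game: rank objects by benefit (access frequency saved minus
    replication/communication cost) and replicate the best ones that
    fit within capacity, ignoring objects with no positive benefit."""
    benefit = {o: host_freqs[o] - costs[o] for o in host_freqs}
    ranked = sorted(benefit, key=benefit.get, reverse=True)
    chosen = [o for o in ranked if benefit[o] > 0][:capacity]
    return sorted(chosen)
```

In a non-cooperative setting, each host would replay this step as neighbors' replicas change the effective costs, until the allocation settles.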