Abstract: Video watermarking is usually treated as the watermarking of a set of still images. In the frame-by-frame watermarking approach, each video frame is seen as a single watermarked image, so collusion attacks are more critical in video watermarking. If the same or a redundant watermark is embedded in every frame of a video, the watermark can be estimated and then removed by the watermark estimate remodulation (WER) attack. Conversely, if uncorrelated watermarks are used in every frame, these watermarks can be washed out by frame temporal filtering (FTF). The switching-watermark system, the so-called SS-N system, performs better against WER and FTF attacks: for each frame, the watermark is randomly picked from a finite pool of watermark patterns. In this paper, the SS-N system is first surveyed, and then a new collusion attack on the SS-N system is proposed, based on a new algorithm for separating video frames according to their watermark pattern. Thus N sets are built, each containing the frames that carry the same watermark. Applying the WER attack within each set, the N different watermark patterns are then estimated and subsequently removed.
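A minimal sketch of this attack pipeline, under assumptions not stated in the abstract: additive spread-spectrum watermarks, a known pool size N, a median-filter residual as the per-frame WER-style estimator, and k-means as the frame-separation step; `frames` is a hypothetical NumPy array of shape (num_frames, H, W).

```python
import numpy as np
from scipy.ndimage import median_filter
from sklearn.cluster import KMeans

def ssn_collusion_attack(frames, N, strength=1.0):
    frames = frames.astype(float)
    # 1. Per-frame watermark estimate: high-frequency residual between each
    #    frame and a denoised version of itself (a basic WER-style estimator).
    residuals = np.stack([f - median_filter(f, size=3) for f in frames])

    # 2. Separate frames by watermark pattern: cluster the residuals into
    #    N groups, so each group gathers frames carrying the same watermark.
    flat = residuals.reshape(len(frames), -1)
    labels = KMeans(n_clusters=N, n_init=10).fit_predict(flat)

    # 3. WER attack inside each group: average the residuals to refine the
    #    watermark estimate, then remodulate (subtract) it from every frame.
    attacked = frames.copy()
    for k in range(N):
        idx = labels == k
        if idx.any():
            attacked[idx] -= strength * residuals[idx].mean(axis=0)
    return attacked
```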
Abstract: In this paper, we consider the design of a pulse-shaping
filter using orthogonal Hermite-Rodriguez basis functions. The
design problem is formulated and solved as a quadratic
programming problem with linear inequality constraints.
Compared with the existing approaches reported in the literature, the
use of Hermite-Rodriguez functions offers an effective alternative to
solve the constrained filter synthesis problem. This is demonstrated
through a numerical example which is concerned with the design of
an equalization filter for a digital transmission channel.
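The following sketch illustrates the constrained synthesis idea: expand the pulse in an orthogonal Hermite-type basis and fit the coefficients by quadratic programming under linear inequality constraints. The target pulse, the amplitude mask, and the use of orthonormal Hermite functions (in place of the paper's exact Hermite-Rodriguez formulation) are illustrative assumptions.

```python
import math
import numpy as np
from numpy.polynomial.hermite import hermval
from scipy.optimize import minimize

t = np.linspace(-5, 5, 201)
N = 8  # number of basis functions

def hermite_fn(n, t):
    # Orthonormal Hermite function: H_n(t) exp(-t^2/2) / sqrt(2^n n! sqrt(pi))
    coeffs = np.zeros(n + 1); coeffs[n] = 1.0
    norm = math.sqrt(2.0**n * math.factorial(n) * math.sqrt(math.pi))
    return hermval(t, coeffs) * np.exp(-t**2 / 2) / norm

B = np.column_stack([hermite_fn(n, t) for n in range(N)])
target = np.sinc(t)              # assumed target pulse shape
mask = np.abs(target) + 0.05     # assumed amplitude envelope constraint

# QP: minimize ||B c - target||^2  subject to  -mask <= B c <= mask,
# expressed as two sets of linear inequality constraints on c.
obj = lambda c: np.sum((B @ c - target) ** 2)
cons = [{'type': 'ineq', 'fun': lambda c: mask - B @ c},
        {'type': 'ineq', 'fun': lambda c: mask + B @ c}]
c0 = np.linalg.lstsq(B, target, rcond=None)[0]
res = minimize(obj, c0, constraints=cons, method='SLSQP')
pulse = B @ res.x                # synthesized pulse-shaping filter
```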
Abstract: Automatic reading of handwritten cheques is a computationally
complex process that plays an important role in financial
risk management. Machine vision and learning provide a viable
solution to this problem. Research effort has mostly been focused
on recognizing diverse pitches of cheques and demand drafts with an
identical outline. However, most of these methods employ template
matching to localize the pitches, and such schemes can potentially
fail when applied to the different types of outline maintained by each
bank. In this paper, the so-called outline problem is resolved by
a cheque information tree (CIT), which generalizes the localizing
method to extract active-region-of-entities. In addition, a weight-based
density plot (WBDP) is used to isolate text entities and
read complete pitches. Recognition is based on texture features using
neural classifiers. The legal amount is subsequently recognized by both
texture and perceptual features. A post-processing phase is invoked
to detect incorrect readings with a Type-2 grammar using a Turing
machine. The performance of the proposed system was evaluated
using cheques and demand drafts of 22 different banks. The test data
consist of a collection of 1,540 leaves obtained from 10 different
account holders from each bank. Results show that this approach
can easily be deployed without significant design amendments.
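The following sketch shows a simple projection-profile text isolation step, standing in for the paper's weight-based density plot; the binarization threshold and density cutoff are illustrative assumptions, and `img` is a hypothetical grayscale cheque region as a 2-D NumPy array.

```python
import numpy as np

def text_bands(img, thresh=128, min_density=2):
    ink = (img < thresh).astype(int)       # dark pixels count as "ink"
    row_density = ink.sum(axis=1)          # per-row ink density profile
    rows = row_density >= min_density      # rows dense enough to hold text
    # Group consecutive dense rows into candidate text bands (start, end).
    bands, start = [], None
    for i, dense in enumerate(rows):
        if dense and start is None:
            start = i
        elif not dense and start is not None:
            bands.append((start, i)); start = None
    if start is not None:
        bands.append((start, len(rows)))
    return bands
```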
Abstract: Mobile devices, which are increasingly pervasive
in our everyday lives, have created a new paradigm in which they
interconnect, interact and collaborate with each other. This network
can be used for flexible and secure coordinated sharing. Grid
computing, on the other hand, provides dependable, consistent,
pervasive and inexpensive access to high-end computational
capabilities. In this paper, efforts are made to map the concepts of
the Grid onto ad-hoc networks, because both exhibit similar
characteristics such as scalability, dynamism and heterogeneity. In
this context we propose the "Mobile Ad-Hoc Services Grid – MASGRID".
Abstract: In this study, a fuzzy-logic based control system was
designed to save time and energy during the operation
of the load elevators used in the construction of tall
buildings. In the devised control system, to make the load
elevators work more efficiently, the energy interval in which the
motor operates was taken as the output variable, whereas the amount
of load and the building height were taken as input variables. The
most appropriate working intervals, depending on the characteristics
of these variables, were defined with the help of an expert. The fuzzy
expert system software was written in the Delphi programming language.
The design uses the Mamdani max-min inference mechanism, and
the centroid method was employed for defuzzification. In
conclusion, the designed system is observed to be
feasible, and this is supported by statistical analyses.
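A minimal Mamdani sketch of such a controller using scikit-fuzzy, whose defaults match the abstract (max-min inference, centroid defuzzification). The universes, membership functions and rules below are illustrative assumptions; the paper's expert-defined intervals and Delphi implementation are not reproduced.

```python
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

load = ctrl.Antecedent(np.arange(0, 1001, 1), 'load')     # kg (assumed)
height = ctrl.Antecedent(np.arange(0, 51, 1), 'height')   # floors (assumed)
energy = ctrl.Consequent(np.arange(0, 101, 1), 'energy')  # % of rated power

# Triangular low/medium/high membership functions on each universe.
for var, hi in ((load, 1000), (height, 50), (energy, 100)):
    var['low'] = fuzz.trimf(var.universe, [0, 0, hi / 2])
    var['medium'] = fuzz.trimf(var.universe, [0, hi / 2, hi])
    var['high'] = fuzz.trimf(var.universe, [hi / 2, hi, hi])

rules = [
    ctrl.Rule(load['low'] & height['low'], energy['low']),
    ctrl.Rule(load['medium'] | height['medium'], energy['medium']),
    ctrl.Rule(load['high'] & height['high'], energy['high']),
]

sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input['load'] = 650
sim.input['height'] = 30
sim.compute()
print(sim.output['energy'])  # crisp motor energy setting via centroid
```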
Abstract: The IEEE 802.11e standard, an enhanced version of the 802.11 WLAN standards, incorporates Quality of Service (QoS), which makes it a better choice for multimedia and real-time applications. In this paper we study various aspects of the 802.11e standard, and compare its analysis results with those of the legacy 802.11 standard. Simulation results show that IEEE 802.11e outperforms legacy IEEE 802.11 in terms of quality of service, owing to its flow-differentiated channel allocation and better queue-management architecture. We also propose a method to reduce the unfair allocation of bandwidth between downlink and uplink channels by varying the medium access priority level.
Abstract: Global Software Development (GSD) projects cross the
boundaries of companies, countries and even continents, so that the
time zones of the collaborating sites differ. Besides the many
benefits of such development, research has reported plenty
of negative impacts on GSD projects. It is important to
understand the problems that may arise during the execution of a GSD
project spanning different time zones. This research discusses
various issues related to time delays in GSD projects. In
this paper, the authors investigate some of the time-delay factors that
commonly arise in GSD projects with different time zones. The
investigation was done through a systematic review of the literature.
Furthermore, the practices for overcoming these delay factors that
have already been reported in the literature and by GSD organizations
are also explored through a literature survey and case studies.
Abstract: Hydrogen is an important chemical in many industries
and it is expected to become one of the major fuels for energy
generation in the future. Unfortunately, hydrogen does not exist in its
elemental form in nature and therefore has to be produced from
hydrocarbons, hydrogen-containing compounds or water.
Above its critical point (374.8 °C and 22.1 MPa), water has a lower
density and viscosity, and a higher heat capacity, than
ambient water. Mass transfer in supercritical water (SCW) is
enhanced due to its increased diffusivity and transport ability. The
reduced dielectric constant makes supercritical water a better solvent
for organic compounds and gases. Hence, due to the aforementioned
desirable properties, there is a growing interest toward studies
regarding the gasification of organic matter containing biomass or
model biomass solutions in supercritical water.
In this study, hydrogen and biofuel production by the catalytic
gasification of 2-propanol in supercritical conditions of water was
investigated. Pt/Al2O3 and Ni/Al2O3 were the catalysts used in the
gasification reactions. All of the experiments were performed under a
constant pressure of 25 MPa. The effects of five reaction temperatures
(400, 450, 500, 550 and 600°C) and five reaction times (10, 15, 20,
25 and 30 s) on the gasification yield and flammable component
content were investigated.
Abstract: LabVIEW is a graphical programming language that has its roots in automation control and data acquisition. In this paper we have utilized this platform to provide a powerful toolset for process identification and control of nonlinear systems based on artificial neural networks (ANN). This tool has been applied to the monitoring and control of a lab-scale distillation column, the DELTALAB DC-SP. The proposed control scheme offers a fast response to set-point changes and zero steady-state error for dual composition control, and shows robustness in the presence of externally imposed disturbances.
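A hedged sketch of ANN-based process identification in NARX style: predict the next output from lagged outputs and inputs. The use of scikit-learn's MLPRegressor, the lag orders, and the toy nonlinear process are assumptions standing in for the paper's LabVIEW/ANN toolset and the actual distillation column.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def narx_dataset(u, y, nu=2, ny=2):
    # Regressor rows: [y[k-1..k-ny], u[k-1..k-nu]] -> target y[k]
    k0 = max(nu, ny)
    X = [np.r_[y[k-ny:k][::-1], u[k-nu:k][::-1]] for k in range(k0, len(y))]
    return np.array(X), y[k0:]

# Toy data: a nonlinear first-order process driven by a random input.
rng = np.random.default_rng(0)
u = rng.uniform(-1, 1, 2000)
y = np.zeros_like(u)
for k in range(1, len(u)):
    y[k] = 0.8 * y[k-1] + 0.4 * np.tanh(u[k-1])

X, t = narx_dataset(u, y)
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(X, t)
print("one-step-ahead fit R^2:", model.score(X, t))
```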
Abstract: Water hyacinth has been used in aquatic systems for
wastewater purification for many years worldwide. The role of the water
hyacinth (Eichhornia crassipes) species in polishing nitrate and
phosphorus concentrations from municipal wastewater treatment plant
effluent by the phytoremediation method was evaluated. The objective
of this project is to determine the removal efficiency of water
hyacinth in polishing nitrate and phosphorus, as well as chemical
oxygen demand (COD) and ammonia. Water hyacinth is considered
the most efficient aquatic plant for removing a vast range of
pollutants such as organic matter, nutrients and heavy metals. The water
hyacinths, also referred to as macrophytes, were cultivated in the
treatment house in a reactor tank of approximately 90 (L) x 40 (W) x
25 (H) in dimension, built with three compartments. Three water
hyacinths were placed in each compartment, and water samples from
each compartment were collected every two days. Plant
observation was conducted by weight measurement, plant uptake and
new young shoot development. Water hyacinth effectively removed
approximately 49% of COD, 81% of ammonia, 67% of phosphorus
and 92% of nitrate. It also showed a significant growth rate, starting
from day 6 at 0.33 shoots/day and developing up to 0.38
shoots/day by the end of day 24. The studies conducted showed
that water hyacinth is capable of polishing the effluent of
municipal wastewater containing undesirable amounts of nitrate
and phosphorus.
Abstract: In this paper we present a hybrid search algorithm for
solving constraint satisfaction and optimization problems. The
algorithm combines ideas from two basic approaches: complete and
incomplete algorithms, also known as systematic search and
local search algorithms. The characteristics of systematic search
and local search methods are complementary, so we have
tried to capture the advantages of both approaches in the presented
algorithm. The major advantage of the presented algorithm is finding
a sound partial solution for complicated problems whose complete
solution cannot be found in a reasonable time. The algorithm's
results are compared with those of other algorithms using the well-known
n-queens problem.
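A hedged sketch of one way to hybridize the two paradigms on n-queens: min-conflicts local search tries to repair a random assignment, and on failure a systematic backtracking pass extends the conflict-free prefix of its result. This illustrates the combination, not the paper's exact algorithm.

```python
import random

def attacks(board, col, row):
    # Number of queens in other columns attacking square (col, row).
    return sum(r == row or abs(r - row) == abs(c - col)
               for c, r in enumerate(board) if c != col)

def min_conflicts(n, max_steps=20000):
    board = [random.randrange(n) for _ in range(n)]
    for _ in range(max_steps):
        bad = [c for c in range(n) if attacks(board, c, board[c]) > 0]
        if not bad:
            return board, True            # complete, sound solution
        c = random.choice(bad)            # repair a conflicting column
        board[c] = min(range(n), key=lambda r: attacks(board, c, r))
    return board, False                   # local search got stuck

def backtrack(n, prefix):
    # Systematic search that extends a consistent partial assignment.
    if len(prefix) == n:
        return prefix
    for r in range(n):
        if attacks(prefix + [r], len(prefix), r) == 0:
            result = backtrack(n, prefix + [r])
            if result:
                return result
    return None

def hybrid_queens(n):
    board, solved = min_conflicts(n)
    if solved:
        return board
    # Keep the longest conflict-free prefix from local search; it is a
    # sound partial solution even if the systematic pass cannot extend it.
    prefix = []
    for r in board:
        if attacks(prefix + [r], len(prefix), r) > 0:
            break
        prefix.append(r)
    return backtrack(n, prefix) or prefix

print(hybrid_queens(8))
```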
Abstract: As the Internet continues to grow at a rapid pace as
the primary medium for communications and commerce, and as
telecommunication networks and systems continue to expand their
global reach, digital information has become the most popular and
important information resource, and our dependence upon the
underlying cyber infrastructure has increased significantly.
Unfortunately, as our dependency has grown, so has the threat to the
cyber infrastructure from spammers, attackers and criminal
enterprises. In this paper, we propose a new machine-learning based
network intrusion detection framework for cyber security. The
detection process of the framework consists of two stages: model
construction and intrusion detection. In the model construction stage,
a semi-supervised machine learning algorithm is applied to a
collected set of network audit data to generate a profile of normal
network behavior. In the intrusion detection stage, input network
events are analyzed and compared with the patterns gathered in the
profile, and are flagged as anomalies if they are sufficiently far
from the expected normal behavior. The
proposed framework is particularly applicable to situations where
only a small amount of labeled network training data is
available, which is very typical of real-world network environments.
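The two-stage idea can be sketched as follows, with scikit-learn's OneClassSVM standing in for the paper's semi-supervised algorithm; `normal_events` and `new_events` are assumed numeric feature matrices derived from audit records.

```python
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

def build_profile(normal_events):
    # Stage 1: learn a profile of normal behavior from audit data.
    scaler = StandardScaler().fit(normal_events)
    model = OneClassSVM(nu=0.05, kernel='rbf')
    model.fit(scaler.transform(normal_events))
    return scaler, model

def detect(profile, new_events):
    # Stage 2: flag events that fall far from the normal profile.
    scaler, model = profile
    scores = model.decision_function(scaler.transform(new_events))
    return scores < 0    # True = anomaly
```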
Abstract: As the enormous amount of on-line text on the
World-Wide Web grows, the development of methods for automatically
summarizing this text becomes more important. The primary goal of
this research is to create an efficient tool that is able to summarize
large documents automatically. We propose an Evolving
Connectionist System: an adaptive, incremental learning and
knowledge-representation system that evolves its structure and
functionality. In this paper, we propose a novel approach to part-of-speech
disambiguation using a recurrent neural network, a paradigm
capable of dealing with sequential data. We observed that the
connectionist approach to text summarization has a natural way of
learning grammatical structures through experience. Experimental
results show that our approach achieves acceptable performance.
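A toy sketch of recurrent-network part-of-speech disambiguation, with PyTorch as an assumed stand-in; the tiny vocabulary, tagset and training sentences are illustrative, not the paper's corpus or architecture. The ambiguous word "can" gets its tag from sequential context.

```python
import torch
import torch.nn as nn

vocab = {'the': 0, 'can': 1, 'rusts': 2, 'dog': 3, 'run': 4}
tags = {'DET': 0, 'NOUN': 1, 'VERB': 2}

class RNNTagger(nn.Module):
    def __init__(self, n_words, n_tags, dim=16):
        super().__init__()
        self.emb = nn.Embedding(n_words, dim)
        self.rnn = nn.RNN(dim, dim, batch_first=True)  # sequential context
        self.out = nn.Linear(dim, n_tags)
    def forward(self, x):
        h, _ = self.rnn(self.emb(x))
        return self.out(h)                             # tag scores per token

# 'can' is NOUN in one sentence and VERB in the other.
sents = [(['the', 'can', 'rusts'], ['DET', 'NOUN', 'VERB']),
         (['the', 'dog', 'can', 'run'], ['DET', 'NOUN', 'VERB', 'VERB'])]

model = RNNTagger(len(vocab), len(tags))
opt = torch.optim.Adam(model.parameters(), lr=0.05)
loss_fn = nn.CrossEntropyLoss()
for _ in range(200):
    for words, gold in sents:
        x = torch.tensor([[vocab[w] for w in words]])
        y = torch.tensor([tags[t] for t in gold])
        opt.zero_grad()
        loss = loss_fn(model(x)[0], y)
        loss.backward(); opt.step()
```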
Abstract: The third phase of the web, the Semantic Web, requires many web pages annotated with metadata; a crucial question, then, is where to acquire these metadata. In this paper we propose a semi-automatic method that annotates the texts of documents and web pages, employing a fairly comprehensive knowledge base to categorize instances with respect to an ontology. The approach is evaluated against manual annotations and against one of the most popular annotation tools that works in the same way as ours. The approach is implemented in the .NET Framework as an annotation tool for the Semantic Web, and uses WordNet as its knowledge base.
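An illustrative sketch of WordNet-backed categorization, here via NLTK as a stand-in for the paper's .NET implementation: a candidate term is mapped to coarser categories by walking its hypernym chain, which is one simple way to place instances with respect to an ontology.

```python
from nltk.corpus import wordnet as wn   # requires: nltk.download('wordnet')

def categories(term, depth=3):
    cats = set()
    for synset in wn.synsets(term, pos=wn.NOUN):
        chain = synset
        for _ in range(depth):
            hypernyms = chain.hypernyms()
            if not hypernyms:
                break
            chain = hypernyms[0]
            cats.add(chain.name())      # e.g. 'canine.n.02' for 'dog'
    return cats

print(categories('jaguar'))            # ambiguous: animal vs. vehicle senses
```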
Abstract: The complexity of today's software systems makes
collaborative development necessary to accomplish tasks.
Frameworks are necessary to allow developers to perform their tasks
independently yet collaboratively. Similarity detection is one of the
major issues to consider when developing such frameworks: it allows
developers to mine existing repositories when developing their own
views of a software artifact, and it is necessary for identifying the
correspondences between views in order to merge them and
check their consistency. Given the importance of the
requirements specification stage in software development, this paper
proposes a framework for the collaborative development of object-oriented
formal specifications, along with a similarity detection
approach to support the creation, merging and consistency checking
of specifications. The paper also explores the impact of using
additional concepts on improving the matching results. Finally, the
proposed approach is empirically evaluated.
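A hedged sketch of similarity detection between two specification views: classes are matched by Jaccard similarity over their feature names. The view representation and threshold are illustrative assumptions, not the paper's approach.

```python
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def match_views(view1, view2, threshold=0.5):
    # Views: dict mapping class name -> set of attribute/operation names.
    matches = []
    for n1, f1 in view1.items():
        best = max(view2, key=lambda n2: jaccard(f1, view2[n2]), default=None)
        if best is not None and jaccard(f1, view2[best]) >= threshold:
            matches.append((n1, best))   # correspondence candidate for merging
    return matches

v1 = {'Account': {'balance', 'deposit', 'withdraw'}}
v2 = {'BankAccount': {'balance', 'deposit', 'owner'}}
print(match_views(v1, v2))               # [('Account', 'BankAccount')]
```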
Abstract: XML has become a popular standard for information exchange via the web. Each XML document can be represented as a rooted, ordered, labeled tree, where a node's label shows its exact position in the original document. Region and Dewey encoding are two well-known tree-labeling methods. In this paper, we propose a new insert-friendly labeling method named IFDewey, based on the recently proposed scheme called Extended Dewey. In Extended Dewey, many labels must be modified when a new node is inserted into the XML tree. Our method eliminates this problem by reserving even numbers for future insertions: whereas the numbers generated by Extended Dewey may be even or odd, IFDewey generates only odd numbers, so that even numbers remain available for much easier insertion of nodes.
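A minimal sketch of the reservation idea: assign only odd component values to siblings, so the even value between two adjacent siblings is free for a later insertion without relabeling. The dotted-string label syntax is an assumption, and a full scheme would need to handle repeated insertions into the same gap, which is omitted here.

```python
def label_children(parent_label, n):
    # The i-th child gets odd component 2*i + 1: labels 1, 3, 5, ...
    sep = '.' if parent_label else ''
    return [f"{parent_label}{sep}{2*i + 1}" for i in range(n)]

def insert_between(left, right):
    # New sibling between two adjacent odd labels: take the even midpoint.
    head, l = left.rsplit('.', 1) if '.' in left else ('', left)
    r = right.rsplit('.', 1)[-1]
    mid = (int(l) + int(r)) // 2   # even, since both neighbors are odd
    sep = '.' if head else ''
    return f"{head}{sep}{mid}"

print(label_children('1', 3))        # ['1.1', '1.3', '1.5']
print(insert_between('1.1', '1.3'))  # '1.2' -- no existing label changes
```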
Abstract: Personal computers draw non-sinusoidal current,
with the odd harmonics being the most significant. The power quality
of distribution networks is severely affected by the flow of these
generated harmonics during the operation of electronic loads. In
this paper, a mathematical model of the odd current harmonics
(3rd, 5th, 7th and 9th) that influence power quality is presented.
Live signals were captured with the help of a power quality
analyzer for analysis purposes. An interesting feature, verified
theoretically, is that the Total Harmonic Distortion (THD) of the
current decreases as the number of nonlinear loads increases. The results
obtained using the mathematical expressions have been compared with
the practical results, and encouraging agreement has been found.
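For reference, current THD from the odd-harmonic magnitudes is THD_I = sqrt(I3^2 + I5^2 + I7^2 + I9^2) / I1, as in this worked sketch; the example magnitudes are illustrative, not the paper's measured values.

```python
import math

def thd(fundamental, harmonics):
    # Ratio of the RMS of the harmonic content to the fundamental.
    return math.sqrt(sum(h**2 for h in harmonics)) / fundamental

I1 = 1.0                         # fundamental current (per-unit)
odd = [0.65, 0.40, 0.20, 0.08]   # 3rd, 5th, 7th, 9th harmonic magnitudes
print(f"THD = {100 * thd(I1, odd):.1f}%")
```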
Abstract: Subjective loneliness describes people who feel a
disagreeable or unacceptable lack of meaningful social relationships,
at both the quantitative and the qualitative level. The studies
presented here tested an Italian 18-item self-report loneliness measure
that included items adapted from previously developed scales,
namely a short version of the UCLA scale (Russell, Peplau and Cutrona,
1980) and the 11-item Loneliness Scale of De Jong-Gierveld and
Kamphuis (JGLS; 1985). The studies aimed at testing the developed
scale and at verifying whether loneliness is better conceptualized as a
unidimensional construct (so-called 'general loneliness') or as a
bidimensional construct comprising the distinct facets of social and
emotional loneliness. The loneliness questionnaire included two single-item
criterion measures of sad mood and social contact, and asked
participants to supply information on a number of socio-demographic
variables. Factorial analyses of the responses obtained in two
preliminary studies, with 59 and 143 Italian participants respectively,
showed good factor loadings and subscale reliability, and confirmed
that perceived loneliness clearly has two components, a social and an
emotional one, the latter measured by two subscales: a 7-item
'general' loneliness subscale derived from the UCLA scale, and a 6-item
'emotional' scale included in the JGLS. Results further showed that
the type and amount of loneliness are related negatively to the frequency
of social contacts and positively to sad mood. In a third study, data
were obtained from a nationwide sample of 9,097 Italian subjects,
aged 12 to about 70, who completed the test online on the Italian
web site of the large-audience magazine Focus. The results again
confirmed the reliability of the component subscales, namely social,
emotional, and 'general' loneliness, and showed that they were
highly correlated with each other, especially the latter two.
Loneliness scores were significantly predicted by sex, age, education
level, sad mood and social contact, and, less so, by other variables,
e.g., geographical area and profession. The scale's validity was
confirmed by the results of a fourth study, with elderly men and
women (N = 105) living at home or in residential care units. The three
subscales were significantly related, among others, to depression and
to various measures of the extent of, and satisfaction with, social
contacts with relatives and friends. Finally, a fifth study with 315
career-starters showed that social and emotional loneliness correlate
with life satisfaction and with measures of emotional intelligence.
Altogether, the results showed good validity and reliability, in the
tested samples, of the entire scale and of its components.
Abstract: The purpose of this research is to disentangle and
validate the underlying factorial structure of the Ecotourism Experiential
Value (EEV) measurement scale and subsequently investigate its
psychometric properties. The analysis was based on a sample of 225
eco-tourists, surveyed in the vicinity of Taman Negara National Park
(TNNP) via an interviewer-administered questionnaire. Exploratory
factor analysis (EFA) was performed to determine the factorial
structure of EEV. Subsequently, to confirm and validate this factorial
structure and assess the psychometric properties of EEV,
confirmatory factor analysis (CFA) was executed. In addition, to
establish the nomological validity of EEV, a structural model was
developed to examine the effect of EEV on Total Eco-tourist
Experience Quality (TEEQ). The analysis reveals that EEV is a second-order
construct with a six-factor structure, and that its scale adequately meets
the psychometric criteria, thus permitting confident interpretation of the
results. The findings have important implications for future
research directions and for the management of ecotourism destinations.
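The EFA step of such a pipeline can be sketched with scikit-learn's FactorAnalysis as a generic stand-in for the paper's EFA/CFA software; the data matrix and the 24-item count below are placeholders, with only the six-factor choice taken from the abstract.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(225, 24))   # placeholder: 225 respondents x 24 items

efa = FactorAnalysis(n_components=6).fit(X)
loadings = efa.components_.T     # item-by-factor loading matrix
print(loadings.shape)            # (24, 6): inspect for a clean structure
```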
Abstract: Diabetes is one of the most prevalent diseases
worldwide, with an increasing number of complications, retinopathy
being one of the most common. This paper describes how data
mining and case-based reasoning were integrated to predict
retinopathy prevalence among diabetes patients in Malaysia. The
required knowledge base was built from literature reviews and
interviews with medical experts. Data from a total of 140 diabetes
patients were used to train the prediction system. A voting mechanism
selects the best prediction result from the two techniques used. It has
been successfully demonstrated that both data mining and case-based
reasoning can be used for retinopathy prediction, with an improved
accuracy of 85%.
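A hedged sketch of the voting step: a data-mining classifier and a nearest-case predictor (a crude stand-in for case-based reasoning) each make a prediction, and the more confident result wins. Both component models and the confidence rule are illustrative assumptions.

```python
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

def vote(X_train, y_train, x):
    # Data-mining arm: a decision tree learned from patient records.
    dm = DecisionTreeClassifier(max_depth=4).fit(X_train, y_train)
    # CBR-style arm: retrieve the most similar past cases.
    cbr = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
    p_dm = dm.predict_proba([x])[0]
    p_cbr = cbr.predict_proba([x])[0]
    # Keep the prediction of whichever technique is more confident.
    winner = dm if p_dm.max() >= p_cbr.max() else cbr
    return winner.predict([x])[0]
```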