Abstract: In recent years image watermarking has become an
important research area in data security, confidentiality and image
integrity. Many watermarking techniques were proposed for medical
images. However, medical images, unlike most other images, require
extreme care when embedding additional data, because the added
information must not affect image quality and readability. Moreover,
medical records, electronic or not, are bound by medical secrecy and
must therefore remain confidential.
To fulfill these requirements, this paper presents a lossless
watermarking scheme for DICOM images. The proposed fragile
scheme combines two reversible techniques based on difference
expansion for hiding patient data and for protecting the region of
interest (ROI), with tamper detection and recovery capability.
Patient data are embedded into the ROI, while recovery data are
embedded into the region of non-interest (RONI). The experimental
results show that the original image can be exactly recovered from
the watermarked one when no tampering has occurred. If the ROI is
tampered with, the tampered area can be localized and recovered with
a high-quality version of the original area.
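The reversible difference-expansion embedding this abstract builds on can be sketched as follows. This is a minimal illustration of Tian's classic scheme for a single grayscale pixel pair; the function names are mine, and overflow checking and the location map that a full implementation needs are omitted, so it is not the paper's exact method:

```python
def de_embed(x, y, bit):
    """Embed one bit into a pixel pair via difference expansion."""
    l = (x + y) // 2          # integer average of the pair
    h = x - y                 # difference of the pair
    h2 = 2 * h + bit          # expanded difference carrying the bit
    x2 = l + (h2 + 1) // 2    # reconstruct the watermarked pair
    y2 = l - h2 // 2
    return x2, y2

def de_extract(x2, y2):
    """Recover the embedded bit and the original pixel pair exactly."""
    l = (x2 + y2) // 2
    h2 = x2 - y2
    bit = h2 & 1              # the bit sits in the LSB of the difference
    h = h2 >> 1               # arithmetic shift restores the difference
    x = l + (h + 1) // 2
    y = l - h // 2
    return x, y, bit
```

Because the integer average is preserved by the expansion, extraction inverts embedding exactly, which is what makes the scheme lossless.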
Abstract: Text processing systems allow their users to
search for a string pattern in a given text. String matching is
fundamental to database and text processing applications. Every text
editor must contain a mechanism to search the current document for
arbitrary strings. Spelling checkers scan an input text for words in the
dictionary and reject any strings that do not match. We store
information in databases so that it can later be retrieved, and this
retrieval can be performed using various string matching algorithms.
This paper describes a new string matching algorithm for various
applications, designed with the help of the Rabin-Karp matcher to
improve the string matching process.
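As a reference point, the classic Rabin-Karp matcher that the new algorithm builds on can be sketched as follows. This is a minimal textbook version with an assumed base and modulus, not the paper's modified algorithm:

```python
def rabin_karp(text, pattern, base=256, mod=101):
    """Return the starting index of every occurrence of pattern in text."""
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return []
    high = pow(base, m - 1, mod)          # weight of the window's leading char
    p_hash = t_hash = 0
    for i in range(m):                    # hashes of pattern and first window
        p_hash = (p_hash * base + ord(pattern[i])) % mod
        t_hash = (t_hash * base + ord(text[i])) % mod
    hits = []
    for s in range(n - m + 1):
        # a hash match may be a collision, so verify character by character
        if p_hash == t_hash and text[s:s + m] == pattern:
            hits.append(s)
        if s < n - m:                     # roll the window one character right
            t_hash = ((t_hash - ord(text[s]) * high) * base
                      + ord(text[s + m])) % mod
    return hits
```

The rolling hash lets each window be checked in O(1) expected time, which is the property any Rabin-Karp variant inherits.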
Abstract: Since the water resources of the desert city of Naein are
very limited, an approach which saves water resources and meanwhile
meets the greenspace's need for water is to use the city's sewage
wastewater. Proper treatment of Naein's sewage up to the standards
required for green space uses may solve some of the problems of
green space development of the city. The present paper closely
examines available statistics and information associated with the city's
sewage system, and determines complementary stages of sewage
treatment facilities of the city. In the present paper, population, per
capita water use, and required discharge for various greenspace
pieces including different plants are calculated. Moreover, in order to
facilitate the use of water resources, a crude (non-potable) water
distribution network separate from the drinking water distribution
network is designed, and a plan for mixing the municipal wells' water
with sewage wastewater in proposed mixing tanks is suggested. Hence, following
greenspace irrigation reform and the complementary plan, the per
capita greenspace of the city will increase from the current 13.2
square meters to 32 square meters.
Abstract: As the web continues to grow exponentially, the idea
of crawling the entire web on a regular basis becomes less and less
feasible; hence, to cover information on specific domains,
domain-specific search engines have been proposed. As more information
becomes available on the World Wide Web, it becomes more difficult
to provide effective search tools for information access. Today,
people access web information through two main kinds of search
interfaces: Browsers (clicking and following hyperlinks) and Query
Engines (queries in the form of a set of keywords showing the topic
of interest) [2]. Better support is needed for expressing one's
information need and returning high quality search results by web
search tools. There appears to be a need for systems that do reasoning
under uncertainty and are flexible enough to recover from the
contradictions, inconsistencies, and irregularities that such reasoning
involves. In a multi-view problem, the features of the domain can be
partitioned into disjoint subsets (views) that are sufficient to learn the
target concept. Semi-supervised, multi-view algorithms, which
reduce the amount of labeled data required for learning, rely on the
assumptions that the views are compatible and uncorrelated. This
paper describes the use of a semi-supervised machine learning approach
with active learning for domain-specific search engines. A
domain-specific search engine is "an information access system that
allows access to all the information on the web that is relevant to a
particular domain." The proposed work shows that, with the help of
this approach, relevant data can be extracted with a minimum of
queries fired by the user. It requires a small amount of labeled data
and a pool of unlabeled data to which the learning algorithm is applied
to extract the required data.
Abstract: Along with advances in medicine, providing medical information to individual patients is becoming more important. In Japan, such information is hardly provided in braille to blind and partially sighted people. We are therefore researching and developing a Web-based automatic translation program, "eBraille", to translate Japanese text into Japanese Braille. First, we analyzed the Japanese braille transcription rules in order to implement them in our program. We then added medical words to the program's dictionary to improve its translation accuracy for medical text. Finally, we examined the efficacy of statistical learning models (SLMs) for further increasing word segmentation accuracy in braille translation. As a result, eBraille achieved the highest translation accuracy in a comparison with other translation programs, improved its accuracy for medical text, and is being used to produce hospital brochures in braille for outpatients and inpatients.
Abstract: This paper deals with wireless relay communication
systems in which multiple sources transmit information to the
destination node with the help of multiple relays. We consider a
signal forwarding technique based on the minimum mean-square
error (MMSE) approach with multiple antennas for each relay. A
source-relay-destination joint design strategy is proposed with power
constraints at the destination and the source nodes. Simulation results
confirm that the proposed joint design method improves the average
MSE performance compared with that of conventional MMSE relaying
schemes.
Abstract: A minimal complexity version of component mode
synthesis is presented that requires simplified computer
programming, but still provides adequate accuracy for modeling
lower eigenproperties of large structures and their transient
responses. The novelty is that the structure is separated into components
along a plane/surface that exhibits rigid-like behavior, so that
the normal modes of each component are sufficient, without
computing any constraint, attachment, or residual-attachment modes.
The approach requires only such input information as a few (lower)
natural frequencies and corresponding undamped normal modes of
each component. A novel technique is shown for formulating the
equations of motion, in which a double transformation to generalized
coordinates is employed, and the formulation of a nonproportional
damping matrix in generalized coordinates is presented.
Abstract: In this paper, we develop an efficient numerical method for the finite-element model updating of damped gyroscopic systems based on incomplete complex modal measured data. It is assumed that the analytical mass and stiffness matrices are correct and that only the damping and gyroscopic matrices need to be updated. By solving a constrained optimization problem, the optimal corrected symmetric damping matrix and skew-symmetric gyroscopic matrix complying with the required eigenvalue equation are found in a weighted Frobenius norm sense.
Abstract: Various methods based on regression ideas have been created to resolve the problem of data sets containing censored observations, e.g. the Buckley-James method, Miller's method, the Cox method, and the Koul-Susarla-Van Ryzin estimators. Even though comparison studies show that the Buckley-James method performs better than some other methods, it is still rarely used by researchers, mainly because of the limited diagnostic analysis developed for it thus far. Therefore, a diagnostic tool for the Buckley-James method is proposed in this paper. It is called the renovated Cook's distance, RD*_i, and has been developed based on Cook's idea. The renovated Cook's distance RD*_i has advantages (depending on the analyst's demands) over (i) the change in the fitted value for a single case, DFIT*_i, as it measures the influence of case i on all n fitted values Ŷ* (not just the fitted value for case i, as DFIT*_i does), and (ii) the change in the coefficient estimate when the ith case is deleted, DBETA*_i, since DBETA*_i corresponds to the number of variables p, so it is usually easier to look at a single diagnostic measure such as RD*_i, in which information from the p variables is considered simultaneously. Finally, an example using the Stanford Heart Transplant data is provided to illustrate the proposed diagnostic tool.
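For orientation, the classical Cook's distance that the renovated RD*_i adapts can be sketched for an ordinary least-squares fit as follows. This is a minimal uncensored-data illustration of Cook's original idea; the paper's RD*_i for Buckley-James estimates on censored data is not implemented here:

```python
import numpy as np

def cooks_distance(X, y):
    """Classical Cook's distance D_i = e_i^2 h_ii / (p s^2 (1 - h_ii)^2)
    for each observation of an ordinary least-squares fit."""
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS coefficients
    resid = y - X @ beta                           # residuals e_i
    H = X @ np.linalg.inv(X.T @ X) @ X.T           # hat matrix
    h = np.diag(H)                                 # leverages h_ii
    s2 = resid @ resid / (n - p)                   # residual variance
    return resid**2 * h / (p * s2 * (1 - h)**2)
```

A large D_i flags a case whose deletion would noticeably shift all n fitted values at once, which is exactly the single-number-per-case convenience the abstract claims for RD*_i.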
Abstract: In this paper we propose an intelligent agent approach
to control the electric power grid at a smaller granularity in order to
give it self-healing capabilities. We develop a method using the
influence model to transform transmission substations into
information processing, analyzing and decision making (intelligent
behavior) units. We also develop a wireless communication method
to deliver real-time uncorrupted information to an intelligent
controller in a power system environment. A combined networking
and information theoretic approach is adopted in meeting both the
delay and error probability requirements. We use a mobile agent
approach in optimizing the achievable information rate vector and in
the distribution of rates to users (sensors). We develop the concept
and the quantitative tools required to create cooperating semi-autonomous
subsystems, which puts the electric grid on the path
towards an intelligent and self-healing system.
Abstract: The purpose of semantic web research is to transform
the Web from a linked document repository into a distributed knowledge base and application platform, thus allowing the vast range of available information and services to be more efficiently
exploited. As a first step in this transformation, languages such as
OWL have been developed. Although fully realizing the Semantic Web still seems some way off, OWL has already been very
successful and has rapidly become a de facto standard for ontology
development in fields as diverse as geography, geology, astronomy,
agriculture, defence and the life sciences. The aim of this paper is to classify the key concepts of the Semantic Web and to introduce a new
practical approach which uses these concepts to outperform the World Wide Web.
Abstract: We propose an enhanced key management scheme
based on Key Infection, a lightweight scheme for tiny sensors.
The basic scheme, Key Infection, is perfectly secure against node
capture and eavesdropping if the initial communications after node
deployment are secure. If, however, an attacker can eavesdrop on
the initial communications, they can obtain the session key. We use
the common neighbors of each node to generate the session key. Each
node has its own secret key and shares it with its neighbor nodes. Then
each node can establish the session key using the common neighbors'
secret keys and a random number. Our scheme needs only a few
communications even though it uses neighbor nodes' information. Without
losing the lightness of the basic scheme, it improves resistance against
eavesdropping on the initial communications by more than 30%.
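One plausible way to realize the shared-key derivation described above can be sketched as follows. This is an illustrative construction (SHA-256 over the sorted shared secrets plus a nonce), not the exact key-establishment protocol of the paper:

```python
import hashlib

def session_key(key_a, key_b, common_neighbor_keys, nonce):
    """Derive a session key from the two nodes' secret keys, the secret
    keys of their common neighbors, and a random nonce. Sorting the keys
    makes the derivation order-independent, so both endpoints (which
    each hold all of these shared secrets) compute the same key."""
    h = hashlib.sha256()
    for k in sorted([key_a, key_b] + list(common_neighbor_keys)):
        h.update(k)
    h.update(nonce)
    return h.digest()
```

An eavesdropper on the initial deployment traffic would need every contributed neighbor secret, not just one link's key, which illustrates why adding common-neighbor material raises the bar over basic Key Infection.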
Abstract: Evaluation of educational portals is an important
subject area that needs more attention from researchers. A university
that has an educational portal which is difficult for teachers, students,
or management staff to use and interact with can see its position and
reputation reduced. Therefore, it is important to be able to
evaluate the quality of the e-services the university provides in order
to improve them over time.
The present study evaluates the usability of the Information
Technology Faculty portal at University of Benghazi. Two evaluation
methods were used: a questionnaire-based method and an online
automated tool-based method. The first method was used to measure
the portal's external attributes of usability (Information, Content and
Organization of the portal, Navigation, Links and Accessibility,
Aesthetic and Visual Appeal, Performance and Effectiveness and
educational purpose) from users' perspectives, while the second
method was used to measure the portal's internal attributes of
usability (number and size of HTML files, number and size of images,
load time, HTML check errors, browsers compatibility problems,
number of bad and broken links), which cannot be perceived by the
users. The study showed that some of the usability aspects have been
found at the acceptable level of performance and quality, and some
others have been found otherwise. In general, it was concluded that
the usability of the IT faculty educational portal is generally acceptable.
Recommendations and suggestions to address the weaknesses and
improve the quality of the portal's usability are presented in this study.
Abstract: Recognizing the increasing importance of using the
Internet to conduct business, this paper looks at some related matters
associated with small businesses making a decision of whether or not
to have a Website and go online. Small businesses in Saudi Arabia
struggle with this decision. For organizations to fully go online,
conduct business and provide online information services, they need
to connect their database to the Web. Some issues related to doing
that might be beyond the capabilities of most small businesses in
Saudi Arabia, such as Website management, technical issues and
security concerns. Here we focus on a small business firm in Saudi
Arabia (Case Study), discussing the issues related to going online
decision and the firm's options of what to do and how to do it. The
paper suggests some valuable solutions for connecting databases to
the Web. It also discusses some of the important issues related to
online information services and e-commerce, mainly Web hosting
options and security issues.
Abstract: Computerized lip reading has been one of the most
actively researched areas of computer vision in the recent past because
of its crime-fighting potential and invariance to the acoustic environment.
However, several factors like fast speech, bad pronunciation,
poor illumination, movement of face, moustaches and beards make
lip reading difficult. In the present work, we propose a solution for
automatic lip contour tracking and for recognizing letters of the English
language spoken by speakers, using the information available from
lip movements. A level set method is used for tracking the lip contour
using a contour velocity model, and a feature vector of lip movements
is then obtained. Character recognition is performed using a modified
k-nearest-neighbor algorithm which assigns more weight to nearer
neighbors. The proposed system has been found to have an accuracy
of 73.3% for character recognition with speaker lip movements as
the only input and without using any speech recognition system in
parallel. The approach used in this work is found to serve
the purpose of lip reading well when the database is small.
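The distance-weighted voting described above can be sketched as follows. This is a minimal illustration using inverse-distance weights, one common choice; the paper's exact weighting function and its lip-movement feature vectors are not specified here:

```python
import numpy as np

def weighted_knn_predict(train_X, train_y, x, k=5):
    """k-nearest-neighbor classifier in which nearer neighbors get
    larger votes (here 1/distance), so close matches dominate."""
    d = np.linalg.norm(train_X - x, axis=1)      # distances to all samples
    idx = np.argsort(d)[:k]                      # indices of k nearest
    w = 1.0 / (d[idx] + 1e-9)                    # closer -> heavier weight
    votes = {}
    for label, weight in zip(train_y[idx], w):
        votes[label] = votes.get(label, 0.0) + weight
    return max(votes, key=votes.get)             # label with largest vote
```

With small databases this weighting matters most, since a single very close neighbor can correctly outvote a majority of farther ones.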
Abstract: A computational platform is presented in this
contribution. It has been designed as a virtual laboratory to be used
for exploring optimization algorithms in biological problems. This
platform is built on a blackboard-based agent architecture. As a test
case, the version of the platform presented here is devoted to the
study of protein folding, initially with a bead-like description of the
chain and with the widely used model of hydrophobic and polar
residues (HP model). Some details of the platform design are
presented along with its capabilities, and some explorations of the
protein folding problem in different types of discrete space are also
reviewed. The capability of the platform to incorporate specific tools
for the structural analysis of the runs, in order to understand and
improve the optimization process, is also shown. Accordingly, the
results obtained demonstrate that assembling these computational
tools into a single platform is worthwhile in itself, since experiments
developed on it can be designed to fulfill different levels of
information in a self-consistent fashion. We are currently exploring
how an experiment design can be used to create a computational
agent to be included within the platform. Such inclusions of designed
agents (or software pieces) are useful for better accomplishing the
tasks to be carried out by the platform. Clearly, as the number of
agents increases, the new version of the virtual laboratory is enhanced
in robustness and functionality.
Abstract: Performance measurement is still a difficult task for forwarding companies. This is caused on the one hand by missing resources and on the other hand by missing tools. The research project "Management Information System for Logistics Service Providers" aims to close the gap between needed and available solutions. The core of the project is the development
Abstract: The avalanche release of snow has been modeled in the present studies. Snow is assumed to be represented by a semi-solid, and the governing equations have been studied using a continuum approach. The dynamical equations have been solved for two different zones [starting zone and track zone] using appropriate initial and boundary conditions. The effects of density (ρ), eddy viscosity (η), slope angle (θ), and slab depth (R) on the flow parameters have been observed in the present studies. Numerical methods have been employed for computing the nonlinear differential equations. One of the most interesting and fundamental innovations in the present studies is obtaining the initial condition for the computation of velocity by a numerical approach. This information on the velocity has been obtained through the concepts of fracture mechanics applicable to snow. The results on the flow parameters have been found to be in qualitative agreement with published results.
Abstract: Agricultural products are in increasing demand in
the market today. To increase productivity, automation of the
production of these products will be very helpful. The purpose of this
work is to measure and determine the ripeness and quality of
watermelons. The textures on the watermelon skin are captured using
a digital camera. These images are filtered using image processing
techniques. All the information gathered is used to train an artificial
neural network (ANN) to determine watermelon ripeness. Initial
results showed that the best model produced an accuracy of 86.51%,
when measured at 32 hidden units with a balanced percentage rate of
the training dataset.
Abstract: Fibers of pure cellulose can be made by some bacteria such as Acetobacter xylinum. Bacterial cellulose fibers are very pure, tens of nm across and about 0.5 micron long. The fibers are very stiff, with a stiffness of up to 70 GPa, although nobody seems to have measured the strength of individual fibers. Their fundamental strengths should be at least greater than those of the best commercial polymers, but the best bulk strength seems to be about the same as that of steel. They can potentially be produced in industrial quantities at greatly lowered cost and water content, and with triple the yield, by a new process. This article presents a critical review of the available information on bacterial cellulose as a biological nonwoven fabric, with special emphasis on its fermentative production and applications. Characteristics of the bacterial cellulose biofabric with respect to its structure and physicochemical properties are discussed. Current and potential applications of bacterial cellulose in textile, nonwoven cloth, paper, films, synthetic fiber coating, food, pharmaceutical and other industries are also presented.