Abstract: In recent years, image watermarking has become an important research area in data security, confidentiality and image integrity. Many watermarking techniques have been proposed for medical images. However, medical images, unlike most images, require extreme care when embedding additional data within them, because the additional information must not affect image quality and readability. Medical records, electronic or not, are also bound by medical secrecy and must therefore remain confidential. To fulfill these requirements, this paper presents a lossless watermarking scheme for DICOM images. The proposed fragile scheme combines two reversible techniques based on difference expansion for hiding patient data and protecting the region of interest (ROI), with tamper detection and recovery capability. Patient data are embedded into the ROI, while recovery data are embedded into the region of non-interest (RONI). The experimental results show that the original image can be exactly extracted from the watermarked one when no tampering has occurred. When the ROI has been tampered with, the tampered area can be localized and recovered with a high-quality version of the original area.
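The difference-expansion primitive that underlies such reversible schemes can be sketched as follows. This is a minimal illustration of classic difference expansion on a single pixel pair, not the authors' combined ROI/RONI scheme; the overflow and expandability checks a real embedder needs are omitted for brevity.

```python
def embed_bit(x, y, b):
    """Embed one bit b into the pixel pair (x, y) by difference expansion."""
    l = (x + y) // 2            # integer average of the pair (preserved)
    h = x - y                   # difference between the pixels
    h2 = 2 * h + b              # expanded difference carries the payload bit
    return l + (h2 + 1) // 2, l - h2 // 2

def extract_bit(x2, y2):
    """Recover the embedded bit and the exact original pixel pair."""
    h2 = x2 - y2                # expanded difference
    b = h2 & 1                  # the payload bit is the least significant bit
    h = h2 // 2                 # original difference
    l = (x2 + y2) // 2          # integer average survives embedding
    return b, (l + (h + 1) // 2, l - h // 2)
```

Because the integer average of the pair is preserved and the expanded difference carries the bit, extraction restores both the payload and the exact original pixels, which is what makes such a watermark lossless.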
Abstract: Case-based reasoning (CBR) methodology provides a foundation for a new technology for building intelligent computer-aided diagnosis systems. This technology directly addresses the problems found in traditional Artificial Intelligence (AI) techniques, e.g. the problems of knowledge acquisition, remembering, robustness and maintenance. This paper discusses the CBR methodology and the research issues and technical aspects of implementing intelligent medical diagnosis systems. Successful applications in cancer and heart disease developed by the Medical Informatics Research Group at Ain Shams University are also discussed.
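The retrieve step at the heart of the CBR cycle is typically a weighted nearest-neighbour search over past cases, which can be sketched as below. The attribute names, weights and diagnoses here are hypothetical illustrations and do not reflect the Ain Shams systems.

```python
def similarity(query, case, weights):
    """Weighted similarity over numeric attributes normalized to [0, 1]."""
    total = sum(weights.values())
    return sum(w * (1 - abs(query[a] - case[a]))
               for a, w in weights.items()) / total

def retrieve(query, case_base, weights):
    """The 'retrieve' step of the CBR cycle: return the most similar past case."""
    return max(case_base, key=lambda c: similarity(query, c["features"], weights))
```

A retrieved case's diagnosis is then reused (and possibly revised) for the new patient, and the solved case can be retained in the case base, completing the retrieve-reuse-revise-retain cycle.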
Abstract: All text processing systems allow their users to search for a pattern string in a given text. String matching is fundamental to database and text processing applications. Every text editor must contain a mechanism to search the current document for arbitrary strings. Spelling checkers scan an input text for words in the dictionary and reject any strings that do not match. We store information in databases so that it can later be retrieved, and this retrieval can be performed using various string matching algorithms. This paper describes a new string matching algorithm for various applications. The new algorithm has been designed with the help of the Rabin-Karp matcher to improve the string matching process.
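For reference, the standard Rabin-Karp matcher that the new algorithm builds on slides a rolling hash over the text and verifies candidate positions character by character; the sketch below is the textbook version, not the paper's improved variant.

```python
def rabin_karp(text, pattern, base=256, mod=1_000_003):
    """Return all start indices at which pattern occurs in text."""
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return []
    # base^(m-1) mod mod, used to roll the leading character out of the hash
    high = pow(base, m - 1, mod)
    p_hash = t_hash = 0
    for i in range(m):
        p_hash = (p_hash * base + ord(pattern[i])) % mod
        t_hash = (t_hash * base + ord(text[i])) % mod
    hits = []
    for i in range(n - m + 1):
        # verify on hash match to rule out spurious collisions
        if t_hash == p_hash and text[i:i + m] == pattern:
            hits.append(i)
        if i < n - m:
            t_hash = ((t_hash - ord(text[i]) * high) * base
                      + ord(text[i + m])) % mod
    return hits
```

The rolling update makes each window's hash an O(1) step, so the expected running time is O(n + m) with only rare character-by-character verifications.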
Abstract: Reinforced concrete (RC) structures strengthened with fiber reinforced polymer (FRP) lack thermal resistance at the elevated temperatures that arise in the event of fire. This has led to the lining of strengthened concrete with thin high performance cementitious composites (THPCC) to protect the substrate against elevated temperature. The effects of elevated temperature on THPCC based on different cementitious materials have been studied in the past, but high-alumina cement (HAC)-based THPCC has not been well characterized. This study focuses on THPCC based on HAC with 60%, 70%, 80% and 85% of the cement replaced by ground granulated blast furnace slag (GGBS). Samples were evaluated by measuring their mechanical strength (after 28 and 56 days of curing) following exposure to 400°C and 600°C, with room temperature (28°C) as a reference, and the results were corroborated by a microstructural study. The results showed that, among all mixtures, the mix containing only HAC had the highest compressive strength after exposure to 600°C. However, the tensile strength of THPCC made of HAC with 60% GGBS content was comparable to that of the HAC-only THPCC after exposure to 600°C. Field emission scanning electron microscopy (FESEM) images of THPCC, together with energy dispersive X-ray (EDX) microanalysis, revealed that the microstructure deteriorated considerably after exposure to elevated temperatures, which led to the decrease in mechanical strength.
Abstract: In this paper, the authors present research on electroconductive textile materials that can be used to construct a sensory textronic shirt for measuring breathing frequency. The full paper will also present the results of measurements carried out on unique measurement stands.
Abstract: As the web continues to grow exponentially, crawling the entire web on a regular basis becomes less and less feasible, so domain-specific search engines, which cover the information of a specific domain, have been proposed. As more information becomes available on the World Wide Web, it becomes more difficult to provide effective search tools for information access. Today, people access web information through two main kinds of search interfaces: browsers (clicking and following hyperlinks) and query engines (queries in the form of a set of keywords indicating the topic of interest) [2]. Better support is needed for expressing one's information need and for returning high-quality search results. There appears to be a need for systems that reason under uncertainty and are flexible enough to recover from the contradictions, inconsistencies, and irregularities that such reasoning involves. In a multi-view problem, the features of the domain can be partitioned into disjoint subsets (views), each of which is sufficient to learn the target concept. Semi-supervised, multi-view algorithms, which reduce the amount of labeled data required for learning, rely on the assumptions that the views are compatible and uncorrelated. This paper describes the use of a semi-supervised machine learning approach with active learning for domain-specific search engines. A domain-specific search engine is "an information access system that allows access to all the information on the web that is relevant to a particular domain". The proposed work shows that, with the help of this approach, relevant data can be extracted with a minimum of queries fired by the user. It requires a small number of labeled examples and a pool of unlabeled data to which the learning algorithm is applied to extract the required data.
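The pool-based active-learning loop described above can be sketched as follows. This is a minimal uncertainty-sampling illustration using a simple nearest-centroid learner on one numeric feature, an assumption made for brevity; the paper's actual system combines semi-supervised multi-view learning with active querying over web documents.

```python
def centroids(labeled):
    """Mean feature value per class, from a list of (x, label) pairs."""
    sums, counts = {}, {}
    for x, y in labeled:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def most_uncertain(pool, cents):
    """Pool point whose distances to the two class centroids are most similar."""
    margin = lambda x: abs(abs(x - cents[0]) - abs(x - cents[1]))
    return min(pool, key=margin)

def active_learn(labeled, pool, oracle, budget):
    """Query the oracle (the user) only for the most uncertain pool points."""
    labeled, pool = list(labeled), list(pool)
    for _ in range(budget):
        cents = centroids(labeled)
        x = most_uncertain(pool, cents)
        pool.remove(x)
        labeled.append((x, oracle(x)))   # one label request per iteration
    return centroids(labeled)
```

The key property is that labels are requested only where the current model is least certain, so few user queries are needed to refine the decision boundary.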
Abstract: One of the essential requirements of human beings is a house to live in. Contemporary houses should be made satisfying places for their residents by paying attention to their culture. This article reviews the relevant theoretical literature on cultural symbols, using architectural semiotics to construct houses as better places for living. Indeed, making a place for everyday life by turning a house into a home is one of the most challenging subjects for architects all around the world. The aim of this article is to identify the cultural symbols of Cypriot houses that can assist architects in designing and building contemporary houses that better satisfy their residents, in accordance with the Cypriot lifestyle and culture. Researching the effect of cultural symbols on housing would require various types of methods; this study, however, focuses on two, quantitative and qualitative. The purpose of the case-specific study is to find the symbols used in contemporary houses, with attention to Cypriot cultural symbols in the houses of Famagusta.
Abstract: Prospective analysis is presented as an important tool to identify the most relevant opportunities and needs in research and development arising from planned interventions in innovation systems. This study chose Phyllanthus niruri, known as "stone breaker", to describe the state of knowledge about the species, using biotechnological forecasting with the Vantage Point software. A considerable increase in studies on Phyllanthus niruri can be seen in recent years, and patents on this plant date back twenty-five years. India is the country that has carried out the most research on the species, showing interest mainly in studies of hepatoprotective, antioxidant and anti-cancer activities. Brazil is in second place, with special interest in anti-tumor studies. Given the identification of the Brazilian groups that exploit the species, it is possible to mediate partnerships and cooperation to help implement the Program of Herbal Medicines (phytotherapics) in Brazil.
Abstract: In this paper a new fast simplification method is presented. The method realizes Karnaugh maps with a large number of variables. In order to accelerate the operation of the proposed method, a new approach for fast detection of groups of ones is presented. This approach is implemented in the frequency domain: the search operation relies on performing cross correlation in the frequency domain rather than in the time domain. It is proved mathematically and practically that the number of computation steps required by the presented method is less than that needed by conventional cross correlation. Simulation results using MATLAB confirm the theoretical computations. Furthermore, a powerful solution for the realization of complex functions is given. The simplified functions are implemented using a new design of neural networks. Neural networks are used because they are fault tolerant and can therefore recognize signals even in the presence of noise or distortion, which is very useful for logic functions used in data and computer communications. Moreover, the implemented functions are realized with a minimum amount of components by using modular neural nets (MNNs) that divide the input space into several homogeneous regions. This approach is applied to implement the XOR function, 16 logic functions on the one-bit level, and a 2-bit digital multiplier. Compared to previous non-modular designs, a clear reduction in the order of computations and hardware requirements is achieved.
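The core idea of detecting groups of ones via frequency-domain cross correlation can be illustrated on a flattened truth table: correlating the function's output vector with a template of a group of ones peaks where the group occurs, and the frequency-domain result matches the direct time-domain computation. The sketch below is a pure-Python illustration (radix-2 FFT, power-of-two length, circular correlation), not the paper's MATLAB implementation.

```python
import cmath

def fft(a, invert=False):
    """Recursive radix-2 Cooley-Tukey FFT (length must be a power of two)."""
    n = len(a)
    if n == 1:
        return a[:]
    even, odd = fft(a[0::2], invert), fft(a[1::2], invert)
    sign = 1 if invert else -1
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + w
        out[k + n // 2] = even[k] - w
    return out

def xcorr_fft(x, h):
    """Circular cross correlation via the frequency domain: IFFT(X * conj(H))."""
    n = len(x)
    X = fft([complex(v) for v in x])
    H = fft([complex(v) for v in h])
    y = fft([a * b.conjugate() for a, b in zip(X, H)], invert=True)
    return [round((v / n).real) for v in y]   # divide by n for the inverse

def xcorr_direct(x, h):
    """Same correlation computed directly in the time domain, for comparison."""
    n = len(x)
    return [sum(x[(k + i) % n] * h[i] for i in range(n)) for k in range(n)]
```

The FFT route costs O(n log n) versus O(n^2) for direct correlation, which is the source of the speed-up the paper exploits for large maps.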
Abstract: Along with advances in medicine, providing medical information to individual patients is becoming more important. In Japan, such information is hardly ever provided in Braille to blind and partially sighted people. We are therefore researching and developing a Web-based automatic translation program, "eBraille", to translate Japanese text into Japanese Braille. First, we analyzed the Japanese transcription rules in order to implement them in our program. We then added medical words to the program's dictionary to improve its translation accuracy for medical text. Finally, we examined the efficacy of statistical learning models (SLMs) for further increasing word-segmentation accuracy in braille translation. As a result, eBraille achieved the highest translation accuracy in comparison with other translation programs, improved its accuracy for medical text, and is used to produce hospital brochures in braille for outpatients and inpatients.
Abstract: This paper investigates the impact of variations in ceiling height and window head height on daylighting inside an architectural teaching studio with a full-width window. In architectural education, the studio is used more than a normal classroom for most credit hours; therefore, the window position and size and the dimensions of the studio have a direct influence on daylighting levels. Daylighting design is a critical factor that improves student learning, concentration and behavior, and it also reduces energy consumption. The analysis methodology involves using Radiance in the IES software under the overcast and cloudy sky conditions of Malaysia. It has been established that the daylighting performance of an architecture studio can be enhanced by changing the ceiling height and window level, because different ceiling heights and window head heights contribute to different ranges of daylight levels.
Abstract: This research evaluated the technical feasibility of making single-layer experimental particleboard panels from the waste of bamboo (Dendrocalamus asper Backer) converted into strips, which are used to make laminated bamboo furniture. The variable factors were density (600, 700 and 800 kg/m3) and conditioning temperature (25, 40 and 55 °C). The experimental panels were tested for their physical and mechanical properties, including modulus of elasticity (MOE), modulus of rupture (MOR), internal bonding strength (IB), screw holding strength (SH) and thickness swelling, according to the procedures defined by the Japanese Industrial Standard (JIS). The test results for the mechanical properties showed that the MOR, MOE and IB values did not meet the set criteria, except for the MOR values at a density of 700 kg/m3 at 25 °C and at a density of 800 kg/m3 at 25 and 40 °C, and the IB values at a density of 600 kg/m3 at 40 °C and at a density of 800 kg/m3 at 55 °C. The SH values met the set standard, except at a density of 600 kg/m3 at 40 and 55 °C. In conclusion, bamboo waste, a valuable renewable biomass, can be used to manufacture boards.
Abstract: The nozzle is the main part of various spinning systems such as the air-jet and Murata air vortex systems. Recently, many researchers have studied the use of nozzles in other spinning systems, such as conventional ring and compact spinning systems. In these applications, the primary purpose is to improve yarn quality. In the present study, yarns were produced with two different nozzle types and the changes in yarn properties were determined. In order to explain the effect of the nozzle, the airflow structure in the nozzle was modelled and the airflow variables were determined. For the numerical simulation, the ANSYS 12.1 package and the Fluid Flow (CFX) analysis method were used. As distinct from the literature, the Shear Stress Transport (SST) turbulence model was preferred. In addition, the air pressure at the nozzle inlet was measured by an electronic mass flow meter and these values were used in the simulation of the airflow. Finally, the yarn was modelled and the area through which the yarn passes was included in the numerical analysis.
Abstract: This paper presents the relations between air velocity values reproduced by a laser Doppler anemometer (LDA) and an ultrasonic anemometer (UA) and values calculated from flow-rate measurements using a gas meter whose calibration uncertainty is ±(0.15–0.30)%. The investigation was performed in a channel installed in the aerodynamic facility used as part of the national standard of air velocity. The relations defined in this research confirm that the LDA and UA are the most advantageous instruments for air velocity reproduction. The results affirm the ultrasonic anemometer to be a reliable and favourable instrument for measuring mean velocity, or for monitoring velocity stability, in the velocity range of 0.05 m/s – 10 (15) m/s when used alongside the LDA. The main aim of this research is to investigate low-velocity regularities, starting from 0.05 m/s and covering the turbulent, laminar and transitional air flow regions. Theoretical and experimental results and a brief analysis of them are given in the paper. Maximum-to-mean velocity relations for transitional air flow, which has a unique distribution, are presented. Transitional flow, whose characteristics are distinctive and different from those of laminar and turbulent flow, has not yet been analysed experimentally.
Abstract: Various methods based on regression ideas have been created to resolve the problem of data sets containing censored observations, e.g. the Buckley-James method, Miller's method, the Cox method, and the Koul-Susarla-Van Ryzin estimators. Even though comparison studies show that the Buckley-James method performs better than some other methods, it is still rarely used by researchers, mainly because of the limited diagnostic analysis developed for it thus far. Therefore, a diagnostic tool for the Buckley-James method is proposed in this paper. It is called the renovated Cook's distance, RD*_i, and has been developed based on Cook's idea. The renovated Cook's distance RD*_i has advantages (depending on the analyst's demand) over (i) the change in the fitted value for a single case, DFIT*_i, as it measures the influence of case i on all n fitted values Ŷ* (not just the fitted value for case i, as DFIT*_i does), and (ii) the change in the coefficient estimate when the ith case is deleted, DBETA*_i, since DBETA*_i corresponds to the number of variables p, so it is usually easier to look at a single diagnostic measure such as RD*_i, in which information from the p variables is considered simultaneously. Finally, an example using the Stanford Heart Transplant data is provided to illustrate the proposed diagnostic tool.
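For orientation, the classical (uncensored) Cook's distance that RD*_i renovates can be sketched for simple linear regression. The censored-data machinery of the Buckley-James estimator is not reproduced here, and the data in the test are invented for illustration.

```python
def cooks_distance(x, y):
    """Classical Cook's distance D_i for simple linear regression y = b0 + b1*x."""
    n = len(x)
    p = 2                                        # number of fitted coefficients
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    b0 = ybar - b1 * xbar
    resid = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
    s2 = sum(r * r for r in resid) / (n - p)     # residual mean square
    d = []
    for xi, ri in zip(x, resid):
        h = 1 / n + (xi - xbar) ** 2 / sxx       # leverage of case i
        d.append(ri * ri / (p * s2) * h / (1 - h) ** 2)
    return d
```

D_i combines the residual of case i with its leverage, so a point that is both far from the bulk of the x-values and poorly fitted dominates the diagnostic, the same intuition the renovated version carries over to censored data.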
Abstract: This paper shows, by presenting a counterexample, that some properties of the decision rules in the literature do not hold. We give necessary and sufficient conditions under which these properties are valid. These results will be helpful when one tries to choose the right decision rules in research on rough set theory.
Abstract: The purpose of Semantic Web research is to transform the Web from a linked document repository into a distributed knowledge base and application platform, thus allowing the vast range of available information and services to be exploited more efficiently. As a first step in this transformation, languages such as OWL have been developed. Although fully realizing the Semantic Web still seems some way off, OWL has already been very successful and has rapidly become a de facto standard for ontology development in fields as diverse as geography, geology, astronomy, agriculture, defence and the life sciences. The aim of this paper is to classify the key concepts of the Semantic Web and to introduce a new practical approach which uses these concepts to outperform the World Wide Web.
Abstract: The Hidden Markov Model (HMM) is a stochastic method which has been used in various signal processing and character recognition applications. This study proposes using HMMs to recognize Javanese characters from a number of different handwritings, optimizing the number of states and the feature extraction. An accuracy of 85.7% is obtained as the best result with a 16-state vertical model using a pure HMM. This initial result is satisfactory enough to prompt further research.
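Decoding in an HMM recognizer rests on the Viterbi algorithm, which finds the most likely hidden-state path for an observation sequence. The sketch below uses the standard textbook formulation with toy parameters; the states and observations are illustrative, not the study's Javanese character features.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state path for an observation sequence."""
    # V[t][s] = (best probability of reaching s at time t, predecessor state)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for o in obs[1:]:
        prev = V[-1]
        V.append({s: max((prev[r][0] * trans_p[r][s] * emit_p[s][o], r)
                         for r in states)
                  for s in states})
    # backtrack from the best final state
    path = [max(states, key=lambda s: V[-1][s][0])]
    for t in range(len(V) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return path[::-1]
```

In a character recognizer, the observations would be the extracted features of each frame or stroke segment, and the decoded state path scores how well a character model explains the input.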
Abstract: We measured the major and trace element contents and Rb-Sr isotopic compositions of 12 tektites from the Maoming area, Guangdong province (south China). All the samples studied are splash-form tektites which show pitted or grooved surfaces, with schlieren structures on some surfaces. The trace element ratios Ba/Rb (avg. 4.33), Th/Sm (avg. 2.31), Sm/Sc (avg. 0.44), Th/Sc (avg. 1.01), La/Sc (avg. 2.86), Th/U (avg. 7.47) and Zr/Hf (avg. 46.01) and the rare earth element (REE) contents of the tektites in this study are similar to those of the average upper continental crust. The chemical compositions suggest that these tektites are derived from a similar parental terrestrial sedimentary deposit, which may be related to post-Archean upper crustal rocks. The tektites from the Maoming area have high positive εSr(0) values, ranging from 176.9 to 190.5, which indicate that the parental material for these tektites had a Sr isotopic composition similar to that of old terrestrial sedimentary rocks and was not dominantly derived from recent young sediments (such as soil or loess). The Sr isotopic data obtained in the present study support the conclusion proposed by Blum et al. (1992) [1] that the depositional age of the sedimentary target materials is close to 170 Ma (Jurassic). Mixing calculations based on the model proposed by Ho and Chen (1996) [2] for various amounts and combinations of target rocks indicate that the best fit for the tektites from the Maoming area is a mixture of 40% shale, 30% greywacke and 30% quartzite.
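A mixing calculation of this kind can be sketched as a constrained search over endmember fractions that minimizes the misfit between the modelled mixture and the measured composition. The sketch below is a generic least-squares illustration with invented oxide values, not Ho and Chen's actual model or the paper's data.

```python
def mix(endmembers, fractions):
    """Composition of a mixture as the weighted average of endmember compositions."""
    keys = endmembers[0].keys()
    return {k: sum(f * e[k] for f, e in zip(fractions, endmembers)) for k in keys}

def best_mixture(target, endmembers, step=0.1):
    """Grid-search three endmember fractions (summing to 1) with least squared misfit."""
    best = None
    steps = round(1 / step)
    for i in range(steps + 1):
        for j in range(steps + 1 - i):
            f = (i * step, j * step, 1 - (i + j) * step)
            m = mix(endmembers, f)
            misfit = sum((m[k] - target[k]) ** 2 for k in target)
            if best is None or misfit < best[0]:
                best = (misfit, f)
    return best[1]
```

Real mixing models weight each component by analytical uncertainty and include isotopic ratios; the grid search here simply makes the best-fit idea concrete.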
Abstract: Automated discovery of hierarchical structures in large data sets has been an active research area in the recent past. This paper focuses on the issue of mining generalized rules with a crisp hierarchical structure using a Genetic Programming (GP) approach to knowledge discovery. The post-processing scheme presented in this work uses flat rules as the initial individuals of GP and discovers a hierarchical structure. Suitable genetic operators are proposed for the suggested encoding. Based on the Subsumption Matrix (SM), an appropriate fitness function is suggested. Finally, Hierarchical Production Rules (HPRs) are generated from the discovered hierarchy. Experimental results are presented to demonstrate the performance of the proposed algorithm.