Abstract: This paper reports on an effort to address inequality in girls' and women's access to science, engineering and technology (SET) education and careers by raising awareness of SET among secondary school girls in South Africa. Girls participated in the hands-on, high-tech rapid prototyping environment of a fabrication laboratory, aimed at stimulating creativity and innovation, as part of a Fab Kids initiative. The Fab Kids intervention is about creating a SET pipeline as part of the Young Engineers and Scientists of Africa Initiative. The methodology was based on a real-world situation and a hands-on approach. In the process, participants
acquired a number of skills including computer-aided design,
research skills, communication skills, teamwork skills, technical
drawing skills, writing skills and problem-solving skills. Exposure to technology enhanced the girls' confidence in being able to handle technology-related tasks.
Abstract: Many natural language expressions are ambiguous and need to draw on other sources of information to be interpreted. Whether the word تعاون is interpreted as a noun or a verb, for example, depends on the presence of contextual cues. To interpret words we need to be able to discriminate between different usages. This paper proposes a hybrid of rule-based and machine learning methods for tagging Arabic words. Because of the particularity of the Arabic word, which may be composed of a stem plus affixes and clitics, a small number of rules dominates the performance (affixes include inflexional markers for tense, gender and number; clitics include some prepositions, conjunctions and others). Tagging is closely related to the notion of word class used in syntax. The method is based firstly on rules (which consider the post-position, the ending of a word, and patterns), after which the anomalies are corrected by adopting a memory-based learning (MBL) method. Memory-based learning is an efficient method for integrating various sources of information and for handling exceptional data in natural language processing tasks. Secondly, the exceptional cases of the rules are checked, and more information is made available to the learner for treating those exceptional cases. To evaluate the proposed method, a number of experiments were run to assess the importance of the various sources of information in learning.
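A rough sketch of such a rules-then-MBL pipeline is shown below: suffix rules fire first, and a memory-based (k-nearest-neighbour) learner handles the words the rules cannot tag. The rule set, features and toy training data are illustrative assumptions, not the paper's actual resources.

    from sklearn.neighbors import KNeighborsClassifier

    # Stage 1: suffix rules (an illustrative rule set, not the paper's).
    SUFFIX_RULES = {"ات": "NOUN", "وا": "VERB"}

    def rule_tag(word):
        for suffix, tag in SUFFIX_RULES.items():
            if word.endswith(suffix):
                return tag
        return None  # no rule fired: defer to the learner

    def features(word):
        # Encode the last three characters as code points (a toy feature set).
        return [ord(c) for c in ("_" * 3 + word)[-3:]]

    # Stage 2: memory-based learning, i.e. k-NN over stored examples.
    train_words = ["تعاون", "كتب", "مدرسة", "قلم"]  # toy training data
    train_tags = ["VERB", "VERB", "NOUN", "NOUN"]
    mbl = KNeighborsClassifier(n_neighbors=1)
    mbl.fit([features(w) for w in train_words], train_tags)

    def tag(word):
        return rule_tag(word) or mbl.predict([features(word)])[0]

    print(tag("تعاون"))  # no suffix rule fires, so the MBL stage decides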
Abstract: The overall objective of this research is a strain improvement technology for efficient pectinase production. A novel cell cultivation technology based on the immobilization of fungal cells was studied in long-term continuous fermentations. Immobilization was achieved using a new carrier material for adsorption of the immobilized cultures, employed here for the first time for the immobilization of microorganisms. The effects of various conditions of nitrogen and carbon nutrition on the biosynthesis of pectolytic enzymes in the Aspergillus awamori 1-8 strain were studied. The proposed cultivation technology, along with optimization of media components for pectinase overproduction, increased pectinase productivity in Aspergillus awamori 1-8 by 7 to 8 times. The proposed technology can be applied successfully to the production of major industrial enzymes such as α-amylase, protease and collagenase.
Abstract: Video-on-demand (VOD) systems are designed using content delivery networks (CDNs) to minimize the overall operational cost and to maximize scalability. Estimation of the viewing pattern (i.e., the relationship between the number of viewings and the ranking of VOD contents) plays an important role in minimizing the total operational cost and maximizing the performance of VOD systems. In this paper, we analyze a large body of commercial VOD viewing data and find that the viewing rank distribution fits well with the parabolic fractal distribution. A weighted linear model fitting function is used to estimate the parameters (coefficients) of the parabolic fractal distribution. This paper presents an analytical basis for designing an optimal hierarchical VOD content distribution system in terms of its cost and performance.
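As an illustration of the fitting step, the parabolic fractal law posits log N(r) = a + b log r + c (log r)^2 for the number of viewings N at rank r. In the sketch below, synthetic counts stand in for the commercial data, and the weighting is an assumption, since the paper's exact weighted linear model is not reproduced here.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for the commercial viewing counts, following
    # log N(r) = a + b*log r + c*(log r)**2 plus noise.
    rank = np.arange(1.0, 101.0)
    views = 1e6 * np.exp(-0.8 * np.log(rank) - 0.05 * np.log(rank) ** 2)
    views *= rng.lognormal(0.0, 0.05, rank.size)  # measurement noise

    x, y = np.log(rank), np.log(views)
    w = np.sqrt(views)  # assumed weighting; the paper's scheme may differ
    c, b, a = np.polyfit(x, y, deg=2, w=w)  # weighted fit in log r, (log r)^2
    print(f"log N(r) = {a:.2f} + {b:.2f} log r + {c:.3f} (log r)^2")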
Abstract: Expression data analysis is based mostly on the
statistical approaches that are indispensable for the study of
biological systems. Large amounts of multidimensional data resulting
from the high-throughput technologies are not completely served by
biostatistical techniques and are usually complemented with visual,
knowledge discovery and other computational tools. In many cases, in biological systems we can only speculate on the processes causing the observed changes, and it is during the visual explorative analysis of the data that a hypothesis is formed. We would like to show the
usability of multidimensional visualization tools and promote their
use in the life sciences. We survey and demonstrate some of the multidimensional visualization tools used in data exploration, such as parallel coordinates and radviz, and we extend them by combining them with the self-organizing map algorithm. We
use a time course data set of transitional cell carcinoma of the bladder
in our examples. Analysis of data with these tools has the potential to
uncover additional relationships and non-trivial structures.
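Both plot types named above are available off the shelf; the minimal sketch below uses the iris data as a stand-in for the carcinoma time course and omits the self-organizing map extension described in the abstract.

    import matplotlib.pyplot as plt
    from pandas.plotting import parallel_coordinates, radviz
    from sklearn.datasets import load_iris

    # Any labelled multidimensional table works here; iris stands in
    # for the bladder-carcinoma time course used in the paper.
    df = load_iris(as_frame=True).frame.rename(columns={"target": "class"})

    fig, axes = plt.subplots(1, 2, figsize=(11, 4))
    parallel_coordinates(df, "class", ax=axes[0])  # one vertical axis per dimension
    radviz(df, "class", ax=axes[1])                # dimensions as anchors on a circle
    plt.show()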
Abstract: Biometric measures of one kind or another have been
used to identify people since ancient times, with handwritten
signatures, facial features, and fingerprints being the traditional
methods. Of late, systems have been built that automate the task of recognition using these methods and newer ones, such as hand geometry, voiceprints and iris patterns. These systems have different strengths and weaknesses. This work is composed of two sections. In the first section, we present an analytical and comparative study of common biometric techniques; the performance of each is reviewed and then tabulated. The second section covers the actual implementation of the techniques under consideration, carried out using the state-of-the-art tool MATLAB, which helps to portray the corresponding results and effects effectively.
Abstract: Starting from a biologically inspired framework, Gabor filters were built up from retinal filters via LMSE algorithms. A subset of retinal filter kernels was chosen to form a particular Gabor filter by using a weighted sum. One-dimensional optimization approaches were shown to be inappropriate for the problem. All model parameters were fixed with biological or image processing constraints. Detailed analysis of the optimization procedure led to the introduction of a minimization constraint. Finally, quantization of the weighting factors was investigated. This resulted in an optimized cascaded structure of a Gabor filter bank implementation with lower computational cost.
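A toy version of the LMSE weighting step might look as follows, with shifted difference-of-Gaussian kernels standing in for the retinal filters and an illustrative Gabor target; the paper's kernel subset selection, constraints and cascaded structure are not reproduced.

    import numpy as np

    SIZE = 21
    ax = np.arange(SIZE) - SIZE // 2
    xx, yy = np.meshgrid(ax, ax)

    def gaussian(sigma):
        return np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))

    # Target Gabor kernel (parameters chosen only for illustration).
    gabor = gaussian(4.0) * np.cos(2 * np.pi * xx / 8.0)

    # Basis: horizontally shifted difference-of-Gaussian (retina-like) kernels.
    dog = gaussian(1.0) - gaussian(2.0)
    basis = [np.roll(dog, shift, axis=1) for shift in range(-8, 9, 2)]

    # LMSE fit of the weighted sum: find w minimizing ||A w - gabor||_2.
    A = np.stack([b.ravel() for b in basis], axis=1)
    w, *_ = np.linalg.lstsq(A, gabor.ravel(), rcond=None)
    approx = (A @ w).reshape(SIZE, SIZE)
    print("relative LMSE:", np.linalg.norm(approx - gabor) / np.linalg.norm(gabor))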
Abstract: Natural organic matter (NOM) is a heterogeneous mixture of organic compounds that enters water bodies from animal and plant remains and from domestic and industrial wastes. Research has shown that NOM is a likely precursor material for disinfection by-products (DBPs). Chlorine is very commonly used for disinfection; when NOM and chlorine react, trihalomethanes (THMs) and haloacetic acids (HAAs), which are carcinogenic to humans, are produced. The aim of this study is to investigate NOM removal by enhanced coagulation from the drinking water source of Eskisehir, which is supplied from Porsuk Dam. Recently, Porsuk Dam water has become highly polluted, and the NOM concentration is therefore increasing. Enhanced coagulation studies were evaluated by measurement of Dissolved Organic Carbon (DOC), UV absorbance at 254 nm (UV254), and different trihalomethane formation potential (THMFP) tests. Results of jar test experiments showed that about 40-50% of NOM can be removed from the water by enhanced coagulation. The optimum coagulant type and dosages were determined using FeCl3 and alum.
Abstract: Steady-state temperature experiments and numerical simulations of a one-dimensional transpiration cooling system were conducted to investigate the heat transfer characteristics of combined convection and radiation. A nickel-chrome (Ni-Cr) open-cellular porous material with a porosity of 0.93 and 21.5 pores per inch (PPI) was examined. The upper surface of the porous plate was heated by an incoming radiative heat flux varying from 7.7 to 16.6 kW/m2, while the air injection velocity fed into the lower surface was varied from 0.36 to 1.27 m/s and subsequently expressed as a Reynolds number (Re). The results are reported in terms of two efficiencies: temperature efficiency and conversion efficiency. Temperature efficiency, which indicates how close the mean temperature of the porous plate is to that of the inlet air, increased rapidly with the air injection velocity (Re) and then saturated at a constant value for Re above 10. The conversion efficiency, regarded as the ability of the porous material to transfer energy by convection after absorbing heat radiation, decreased with increasing heat flux and air injection velocity; it too approached a constant value for Re above 10. The numerical predictions agreed very well with the experimental data.
Abstract: With the beginning of the new century, we still face many challenges in how to form and develop the urban environment. To meet these challenges, many cities have tried to develop their visual image by transforming their urban environment into a branded visual image at the level of squares, main roads, borders, and landmarks.
In this realm, the paper aims at activating the role of branded urban spaces as an approach to developing the visual image of cities, especially in Egypt. It concludes with the need to recognize the importance of developing the visual image in Egypt by directing urban planners to the important role of such spaces in achieving sustainability.
Abstract: A camera on a building site is exposed to different weather conditions. Differences between images of the same scene captured with the same camera also arise due to temperature variations. The influence of temperature changes on the camera parameters was modelled and integrated into an existing analytical camera model. The modified camera model enables quantitative assessment of the influence of temperature variations.
Abstract: In-place sorting algorithms play an important role in many fields such as very large database systems, data warehouses, data mining, etc. Such algorithms maximize the size of data that can be processed in main memory without input/output operations. In this paper, a novel in-place sorting algorithm is presented. The algorithm comprises two phases: the first rearranges the input unsorted array in place, resulting in segments that are ordered relative to each other but whose elements are yet to be sorted. The first phase requires linear time, while in the second phase the elements of each segment are sorted in place in O(z log z) time, where z is the size of the segment, using O(1) auxiliary storage. In the worst case, for an array of size n, the algorithm performs O(n log z) element comparisons and O(n log z) element moves. Further, no auxiliary arithmetic operations with indices are required. Besides these theoretical achievements, the algorithm is of practical interest because of its simplicity. Experimental results also show that it outperforms other in-place sorting algorithms. Finally, the analysis of time and space complexity and the required number of moves are presented, along with the auxiliary storage requirements of the proposed algorithm.
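The abstract does not reproduce the algorithm itself, so the sketch below only illustrates the two-phase shape it describes, with a generic single-pivot in-place partition standing in for phase one and heapsort (in place, O(z log z) time, O(1) extra space) for phase two; it is not the authors' method.

    def partition(a, lo, hi, pivot):
        # Phase 1 (generic stand-in): one linear, in-place pass splitting
        # a[lo:hi] into a segment < pivot and a segment >= pivot.
        i = lo
        for j in range(lo, hi):
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        return i

    def heapsort(a, lo, hi):
        # Phase 2: sort one segment of size z in place with O(z log z)
        # comparisons and O(1) auxiliary storage.
        def sift(root, end):
            while 2 * root + 1 < end:
                child = 2 * root + 1
                if child + 1 < end and a[lo + child] < a[lo + child + 1]:
                    child += 1
                if a[lo + root] >= a[lo + child]:
                    return
                a[lo + root], a[lo + child] = a[lo + child], a[lo + root]
                root = child
        z = hi - lo
        for r in range(z // 2 - 1, -1, -1):
            sift(r, z)
        for end in range(z - 1, 0, -1):
            a[lo], a[lo + end] = a[lo + end], a[lo]
            sift(0, end)

    def two_phase_sort(a):
        mid = partition(a, 0, len(a), a[len(a) // 2])
        heapsort(a, 0, mid)       # each segment is sorted independently,
        heapsort(a, mid, len(a))  # so the whole array ends up sorted

    data = [5, 3, 8, 1, 9, 2, 7]
    two_phase_sort(data)
    print(data)  # [1, 2, 3, 5, 7, 8, 9]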
Abstract: Interoperability is the ability of information systems to operate in conjunction with each other, encompassing communication protocols, hardware, software, application, and data compatibility layers. There has been considerable work in industry on the development of component interoperability models, such as CORBA, (D)COM and JavaBeans. These models are intended to reduce the complexity of software development and to facilitate reuse of off-the-shelf components. The focus of these models is syntactic interface specification, component packaging, inter-component communications, and bindings to a runtime environment. What these models lack is a consideration of architectural concerns: specifying systems of communicating components, explicitly representing loci of component interaction, and exploiting architectural styles that provide well-understood global design solutions. The development of complex business applications is now focused on an assembly of components available on a local area network or on the net. These components must be located and identified in terms of available services and communication protocols before any request is made. The first part of the article introduces the basic concepts of components and middleware, while the following sections describe the different up-to-date models of communication and interaction, and the last section shows how the different models can communicate among themselves.
Abstract: In this experimental investigation, shake table tests were conducted on two reduced-scale models representing a normal single-room building constructed of Compressed Stabilized Earth Blocks (CSEB) made from locally available soil. One model was constructed with earthquake-resisting features (EQRF), namely a sill band, a lintel band and vertical bands to control the building vibration, and the other was without earthquake-resisting features. To examine the seismic capacity of the models, particularly when subjected to long-period, large-amplitude ground motion with many cycles of repeated loading, the test specimens were shaken repeatedly until failure. The test results from a high-end data acquisition system show that the model with EQRF behaves better than the one without. This modified masonry model, combining the new material with the new bands, is used to improve the behavior of masonry buildings.
Abstract: Calibration estimation is a method of adjusting the
original design weights to improve the survey estimates by using
auxiliary information such as the known population total (or mean)
of the auxiliary variables. A calibration estimator uses calibrated
weights that are determined to minimize a given distance measure to
the original design weights while satisfying a set of constraints
related to the auxiliary information. In this paper, we propose a new
multivariate calibration estimator for the population mean in the
stratified sampling design, which incorporates information available
for more than one auxiliary variable. The problem of determining the
optimum calibrated weights is formulated as a Mathematical
Programming Problem (MPP) that is solved using the Lagrange
multiplier technique.
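For concreteness, the generic calibration problem under the common chi-square distance (a sketch; the paper's distance measure and its stratified, multivariate extension may differ) reads

    \[
    \min_{w}\ \sum_{i\in s}\frac{(w_i-d_i)^2}{d_i q_i}
    \qquad\text{subject to}\qquad
    \sum_{i\in s} w_i\,\mathbf{x}_i=\mathbf{t}_x ,
    \]

and the Lagrange multiplier technique gives the closed form

    \[
    w_i = d_i\bigl(1+q_i\,\mathbf{x}_i^{\top}\boldsymbol{\lambda}\bigr),
    \qquad
    \boldsymbol{\lambda}=\Bigl(\sum_{i\in s} d_i q_i\,\mathbf{x}_i\mathbf{x}_i^{\top}\Bigr)^{-1}
    \Bigl(\mathbf{t}_x-\sum_{i\in s} d_i\,\mathbf{x}_i\Bigr),
    \]

where d_i are the original design weights, x_i the auxiliary vectors, q_i known positive constants, and t_x the vector of known auxiliary population totals.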
Abstract: In this paper, we present the information life cycle, and analyze the importance of managing the corporate application portfolio across this life cycle. The approach presented here does not correspond just to the extension of the traditional information system development life cycle. This approach is based in the generic life cycle employed in other contexts like manufacturing or marketing. In this paper it is proposed a model of an information system life cycle, supported in the assumption that a system has a limited life. But, this limited life may be extended. This model is also applied in several cases; being reported here two examples of the framework application in a construction enterprise, and in a manufacturing enterprise.
Abstract: This paper presents a novel genetic algorithm, termed the Optimum Individual Monogenetic Algorithm (OIMGA), and describes its hardware implementation. As the monogenetic strategy
retains only the optimum individual, the memory requirement is
dramatically reduced and no crossover circuitry is needed, thereby
ensuring the requisite silicon area is kept to a minimum.
Consequently, depending on application requirements, OIMGA
allows the investigation of solutions that warrant either larger GA
populations or individuals of greater length. The results given in this
paper demonstrate that both the performance of OIMGA and its
convergence time are superior to those of existing hardware GA
implementations. Local convergence is achieved in OIMGA by
retaining elite individuals, while population diversity is ensured by
continually searching for the best individuals in fresh regions of the
search space.
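As a software sketch of the monogenetic idea described above (retain only the optimum individual, mutate it for local convergence, and sample fresh regions for diversity), the toy search below is illustrative only; the hardware OIMGA's actual control logic is not reproduced.

    import random

    def oim_search(fitness, length, global_iters=200, local_iters=200):
        # No population and no crossover: only the best individual is kept.
        best = [random.randint(0, 1) for _ in range(length)]
        best_fit = fitness(best)

        # Global phase: sample fresh regions of the search space.
        for _ in range(global_iters):
            cand = [random.randint(0, 1) for _ in range(length)]
            f = fitness(cand)
            if f > best_fit:
                best, best_fit = cand, f

        # Local phase: converge by mutating the retained elite individual.
        for _ in range(local_iters):
            cand = best[:]
            cand[random.randrange(length)] ^= 1  # single-bit mutation
            f = fitness(cand)
            if f > best_fit:
                best, best_fit = cand, f
        return best, best_fit

    # Example: maximize the number of ones in a 32-bit string.
    print(oim_search(sum, 32))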
Abstract: In recent years, image watermarking has become an important research area in data security, confidentiality and image integrity. Many watermarking techniques have been proposed for medical images. However, medical images, unlike most images, require extreme care when embedding additional data within them, because the additional information must not affect image quality and readability. Medical records, electronic or not, are also bound by medical secrecy, and for that reason they must remain confidential. To fulfill those requirements, this paper presents a lossless watermarking scheme for DICOM images. The proposed fragile scheme combines two reversible techniques based on difference expansion for hiding the patient's data and for protecting the region of interest (ROI), with tamper detection and recovery capability. The patient's data are embedded into the ROI, while recovery data are embedded into the region of non-interest (RONI). The experimental results show that the original image can be extracted exactly from the watermarked one when no tampering has occurred. If the ROI has been tampered with, the tampered area can be localized and recovered with a high-quality version of the original area.
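The reversible building block named above, difference expansion, works on pixel pairs; a minimal sketch of the classic integer transform follows (the scheme's ROI/RONI bookkeeping, tamper detection and recovery data are not shown).

    def de_embed(x, y, bit):
        # Classic reversible difference expansion on a pixel pair.
        l = (x + y) // 2      # integer average, preserved by the transform
        h = x - y             # difference
        h2 = 2 * h + bit      # expand the difference, append the payload bit
        return l + (h2 + 1) // 2, l - h2 // 2

    def de_extract(x2, y2):
        # Recover the hidden bit and the original pixel pair exactly.
        l = (x2 + y2) // 2
        h2 = x2 - y2
        bit, h = h2 & 1, h2 // 2
        return (l + (h + 1) // 2, l - h // 2), bit

    pair = (100, 97)
    marked = de_embed(*pair, 1)
    print(marked, de_extract(*marked))  # the original pair and bit come back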
Abstract: The case-based reasoning (CBR) methodology provides a foundation for a new technology for building intelligent computer-aided diagnosis systems. This technology directly addresses problems found in traditional Artificial Intelligence (AI) techniques, e.g., the problems of knowledge acquisition, remembering, robustness and maintenance. This paper discusses the CBR methodology and the research issues and technical aspects of implementing intelligent medical diagnosis systems. Successful applications to cancer and heart diseases developed by the Medical Informatics Research Group at Ain Shams University are also discussed.
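As a minimal illustration of the retrieve-and-reuse steps of the CBR cycle (the cases and features below are toy assumptions, not the Ain Shams systems):

    # A tiny case base: (feature vector, confirmed diagnosis).
    cases = [
        ({"age": 63, "chest_pain": 1, "bp": 160}, "heart disease"),
        ({"age": 35, "chest_pain": 0, "bp": 118}, "healthy"),
    ]

    def distance(a, b):
        return sum((a[k] - b[k]) ** 2 for k in a)

    def diagnose(query):
        # Retrieve the most similar stored case and reuse its solution.
        features, solution = min(cases, key=lambda c: distance(query, c[0]))
        return solution

    print(diagnose({"age": 58, "chest_pain": 1, "bp": 150}))
    # Revise/retain step: a confirmed new diagnosis is appended to `cases`.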
Abstract: A given polynomial, possibly with multiple roots, is factored into several lower-degree distinct-root polynomials raised to natural-order integer powers. All the roots, including multiplicities, of the original polynomial may then be obtained by solving these lower-degree distinct-root polynomials, instead of the original high-degree multiple-root polynomial directly.
The approach requires polynomial Greatest Common Divisor (GCD) computation. A very simple and effective process, "monic polynomial subtraction", cleverly adapted from the "longhand polynomial division" of the Euclidean algorithm, is employed. It requires only simple elementary arithmetic operations without any advanced mathematics.
Remarkably, the derived routine gives the expected results for test polynomials of very high degree, such as p(x) = (x+1)^1000.
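A sketch of the GCD-based decomposition in Python using sympy follows; it relies on the library's built-in GCD rather than the monic polynomial subtraction routine the paper introduces.

    import sympy as sp

    x = sp.symbols("x")

    def distinct_root_factors(f):
        # Split f into distinct-root polynomials a_i with f = prod a_i**i
        # (the standard Musser-style GCD decomposition; constant entries
        # mark multiplicities that do not occur).
        factors = []
        g = sp.gcd(f, sp.diff(f, x))   # carries every repeated root of f
        s = sp.quo(f, g, x)            # product of the distinct roots of f
        i = 1
        while sp.degree(g, x) > 0:
            y = sp.gcd(g, s)
            factors.append((sp.quo(s, y, x), i))  # multiplicity exactly i
            s, g, i = y, sp.quo(g, y, x), i + 1
        factors.append((s, i))
        return factors  # sp.sqf_list(f) gives the same result as a check

    # (x+1)**3 * (x+2) yields (x+2) at power 1 and (x+1) at power 3.
    print(distinct_root_factors(sp.expand((x + 1) ** 3 * (x + 2))))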