Abstract: Bond graph models of an electrical transformer including nonlinear saturation are presented. Using the properties of a bond graph, these models determine the relations between the self and mutual inductances and the leakage and magnetizing inductances of power transformers with two and three windings. The methodology can be extended to the modelling and analysis of three-phase power transformers and of transformers with internal incipient faults.
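The relation between self/mutual and leakage/magnetizing inductances mentioned above can be illustrated, for the two-winding case only, with the standard T-model identities; this is a textbook sketch, not the paper's bond-graph derivation, and the numeric values are invented.

```python
# Standard two-winding T-model identities relating the self inductances
# (L1, L2) and mutual inductance M to the leakage (Ll1, Ll2) and
# magnetizing (Lm) inductances; n = N1/N2 is the turns ratio.
# Textbook relations, not the paper's bond-graph derivation.

def t_model(L1, L2, M, n):
    Lm = n * M           # magnetizing inductance referred to winding 1
    Ll1 = L1 - n * M     # primary leakage inductance
    Ll2 = L2 - M / n     # secondary leakage inductance (on side 2)
    return Ll1, Ll2, Lm

# A perfectly coupled transformer (no leakage): L1 = n * M, L2 = M / n.
print(t_model(0.8, 0.2, 0.4, 2.0))
```

For the perfectly coupled example both leakage terms come out zero and the whole self inductance is magnetizing, as expected.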
Abstract: The steady-state temperature of a one-dimensional transpiration cooling system was investigated experimentally and numerically to characterize the heat transfer of combined convection and radiation. A Nickel-Chrome (Ni-Cr) open-cellular porous material with a porosity of 0.93 and 21.5 pores per inch (PPI) was examined. The upper surface of the porous plate was heated by an incoming radiative heat flux varying from 7.7 to 16.6 kW/m2, while the air injection velocity fed into the lower surface was varied from 0.36 to 1.27 m/s and expressed as a Reynolds number (Re). The results are reported in terms of two efficiencies: temperature efficiency and conversion efficiency. Temperature efficiency, which indicates how close the mean temperature of the porous plate is to that of the inlet air, increased rapidly with the air injection velocity (Re) and then saturated at a constant value for Re above 10. Conversion efficiency, regarded as the ability of the porous material to transfer by convection the energy absorbed from heat radiation, decreased with increasing heat flux and air injection velocity, and likewise approached a constant value for Re above 10. The numerical predictions agreed very well with the experimental data.
Abstract: At the beginning of the new century, humanity still faces many challenges in how to form and develop its urban environment. To meet these challenges, many cities have tried to develop their visual image by transforming the urban environment into a branded visual image at the level of squares, main roads, edges, and landmarks.
In this realm, the paper aims at activating the role of branded urban spaces as an approach to developing the visual image of cities, especially in Egypt. It concludes that the importance of developing the visual image in Egypt must be recognized by directing urban planners to the important role such spaces play in achieving sustainability.
Abstract: A camera on a building site is exposed to varying weather conditions. Differences between images of the same scene captured with the same camera also arise from temperature variations. The influence of temperature changes on the camera parameters was modelled and integrated into an existing analytical camera model. The modified camera model enables a quantitative assessment of the influence of temperature variations.
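As a hedged illustration of how temperature-dependent parameters might be integrated into an analytical camera model, the sketch below assumes a simple linear drift of the pinhole intrinsics with temperature; the linear form and every coefficient are invented for the example, not taken from the paper.

```python
import numpy as np

# Assumed linear thermal model (illustrative, not the paper's): each
# pinhole intrinsic drifts as p(T) = p0 * (1 + alpha_p * (T - T_ref)).

def intrinsics(T, T_ref=20.0, f0=1000.0, cx0=640.0, cy0=360.0,
               alpha_f=2e-5, alpha_c=1e-5):
    dT = T - T_ref
    f = f0 * (1.0 + alpha_f * dT)
    cx = cx0 * (1.0 + alpha_c * dT)
    cy = cy0 * (1.0 + alpha_c * dT)
    return np.array([[f, 0.0, cx], [0.0, f, cy], [0.0, 0.0, 1.0]])

def project(K, X):
    """Pinhole projection of a camera-frame point X = (x, y, z)."""
    x = K @ X
    return x[:2] / x[2]

X = np.array([0.5, 0.2, 10.0])
shift = project(intrinsics(45.0), X) - project(intrinsics(20.0), X)
print(shift)   # pixel drift caused by a 25 degC temperature rise
```

The same projection model, evaluated at two temperatures, directly quantifies the image displacement caused by the thermal drift, which is the kind of quantitative assessment the modified model enables.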
Abstract: Interoperability is the ability of information systems to operate in conjunction with each other, encompassing communication protocols, hardware, software, application, and data-compatibility layers. There has been considerable work in industry on the development of component interoperability models, such as CORBA, (D)COM, and JavaBeans. These models are intended to reduce the complexity of software development and to facilitate the reuse of off-the-shelf components. Their focus is syntactic interface specification, component packaging, inter-component communication, and bindings to a runtime environment. What these models lack is a consideration of architectural concerns: specifying systems of communicating components, explicitly representing loci of component interaction, and exploiting architectural styles that provide well-understood global design solutions. The development of complex business applications is now focused on assembling components available on a local area network or on the Internet. These components must be located and identified in terms of available services and communication protocols before any request is made. The first part of the article introduces the basic concepts of components and middleware, the following sections describe the different up-to-date models of communication and interaction, and the last section shows how the different models can communicate among themselves.
Abstract: This paper presents anti-synchronization of chaos between two different chaotic systems using the active control method. The proposed technique is applied to achieve chaos anti-synchronization for the Lü and Rössler dynamical systems. Numerical simulations are carried out to verify the results.
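A minimal sketch of active-control anti-synchronization for these two systems: the controller cancels both vector fields and adds linear feedback on the sum e = x_response + x_drive, so that e' = -k e and e decays to zero. The gain k and the initial conditions below are illustrative; the paper's exact control law may differ.

```python
import numpy as np

# Drive: Lu system; response: Rossler system (classic chaotic parameters).
def lu(s, a=36.0, b=3.0, c=20.0):
    x, y, z = s
    return np.array([a * (y - x), -x * z + c * y, x * y - b * z])

def rossler(s, a=0.2, b=0.2, c=5.7):
    x, y, z = s
    return np.array([-y - z, x + a * y, b + z * (x - c)])

k = 5.0                                           # feedback gain (illustrative)

def coupled(s):
    xd, xr = s[:3], s[3:]
    u = -rossler(xr) - lu(xd) - k * (xr + xd)     # active control law
    return np.concatenate([lu(xd), rossler(xr) + u])

def rk4_step(f, s, dt):
    k1 = f(s); k2 = f(s + dt / 2 * k1)
    k3 = f(s + dt / 2 * k2); k4 = f(s + dt * k3)
    return s + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

s = np.array([1.0, 2.0, 3.0, -2.0, 1.0, 4.0])     # [drive, response]
for _ in range(10000):                            # 10 s with dt = 1 ms
    s = rk4_step(coupled, s, 1e-3)
err = np.abs(s[:3] + s[3:]).max()
print(err)                                        # anti-synchronization error
```

With the nonlinearities cancelled, the error obeys a pure linear decay, so the response state converges to the negative of the drive state regardless of the chaotic motion.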
Abstract: Calibration estimation is a method of adjusting the
original design weights to improve the survey estimates by using
auxiliary information such as the known population total (or mean)
of the auxiliary variables. A calibration estimator uses calibrated
weights that are determined to minimize a given distance measure to
the original design weights while satisfying a set of constraints
related to the auxiliary information. In this paper, we propose a new
multivariate calibration estimator for the population mean in the
stratified sampling design, which incorporates information available
for more than one auxiliary variable. The problem of determining the
optimum calibrated weights is formulated as a Mathematical
Programming Problem (MPP) that is solved using the Lagrange
multiplier technique.
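The Lagrange-multiplier solution has a well-known closed form when the distance measure is the chi-square distance; the sketch below shows that classical single-sample case with synthetic data. The paper's stratified, multivariate MPP generalizes this setting.

```python
import numpy as np

# Chi-square-distance calibration: minimize sum((w_k - d_k)^2 / d_k)
# subject to X'w = t_x. The Lagrange-multiplier solution is closed-form:
# w = d * (1 + X @ lam). Data below are synthetic, for illustration only.

def calibrate(d, X, tx):
    D = np.diag(d)
    lam = np.linalg.solve(X.T @ D @ X, tx - X.T @ d)   # Lagrange multipliers
    return d * (1.0 + X @ lam)

rng = np.random.default_rng(0)
n = 8
d = np.full(n, 5.0)                                      # design weights
X = np.column_stack([np.ones(n), rng.uniform(1, 3, n)])  # auxiliary variables
tx = np.array([50.0, 95.0])                              # known population totals
w = calibrate(d, X, tx)
print(X.T @ w)                                           # reproduces tx
```

The calibrated weights reproduce the known auxiliary totals exactly while staying as close as possible (in the chosen metric) to the original design weights.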
Abstract: In this paper, we present the information life cycle and analyze the importance of managing the corporate application portfolio across this life cycle. The approach presented here is not just an extension of the traditional information system development life cycle; it is based on the generic life cycle employed in other contexts, such as manufacturing or marketing. The paper proposes a model of the information system life cycle, built on the assumption that a system has a limited life which may nevertheless be extended. The model is applied in several cases; two examples of the framework's application, in a construction enterprise and in a manufacturing enterprise, are reported here.
Abstract: This paper presents a novel genetic algorithm, termed
the Optimum Individual Monogenetic Algorithm (OIMGA), and
describes its hardware implementation. As the monogenetic strategy
retains only the optimum individual, the memory requirement is
dramatically reduced and no crossover circuitry is needed, thereby
ensuring the requisite silicon area is kept to a minimum.
Consequently, depending on application requirements, OIMGA
allows the investigation of solutions that warrant either larger GA
populations or individuals of greater length. The results given in this
paper demonstrate that both the performance of OIMGA and its
convergence time are superior to those of existing hardware GA
implementations. Local convergence is achieved in OIMGA by
retaining elite individuals, while population diversity is ensured by
continually searching for the best individuals in fresh regions of the
search space.
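In software terms, keeping only the optimum individual resembles a mutation-only (1+1)-style search. The sketch below is an illustrative software analogue, not the paper's hardware architecture; the mutation rate and the restart policy standing in for "fresh regions" are assumptions.

```python
import random

# Software analogue of the monogenetic idea: a single elite individual
# is kept and mutated, and random restarts stand in for searching fresh
# regions of the search space. Parameters are illustrative.

def monogenetic_search(fitness, n_bits=24, restarts=8, steps=2000, seed=1):
    rng = random.Random(seed)
    best, best_fit = None, float("-inf")
    for _ in range(restarts):
        x = [rng.randint(0, 1) for _ in range(n_bits)]          # fresh region
        fx = fitness(x)
        for _ in range(steps):
            y = [b ^ (rng.random() < 1.0 / n_bits) for b in x]  # bit-flip mutation
            fy = fitness(y)
            if fy >= fx:               # only the optimum individual survives
                x, fx = y, fy
        if fx > best_fit:
            best, best_fit = x, fx
    return best, best_fit

# Toy objective: OneMax (number of ones); the optimum is n_bits.
best, fit = monogenetic_search(sum, n_bits=24)
print(fit)
```

Because only one individual is ever stored, memory use is independent of any population size, which mirrors the hardware saving the abstract describes.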
Abstract: A well-balanced numerical scheme based on stationary waves for shallow water flows with arbitrary topography has been introduced by Thanh et al. [18]. The scheme was constructed so that it maintains equilibrium states, and tests indicate that it is stable and fast. Applying the well-balanced scheme to the one-dimensional shallow water equations, we study the early propagation of shock waves towards the Phuket coast in southern Thailand during a hypothetical tsunami. The initial tsunami wave is generated in the deep ocean with a strength equal to that of the 2004 Indonesian tsunami.
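The scheme of Thanh et al. is not reproduced here; as a minimal illustration of what "well balanced" means, the sketch below uses the classical hydrostatic-reconstruction approach (a different, textbook well-balanced method) and verifies that a lake at rest over a bump stays at rest.

```python
import numpy as np

g = 9.81

def vel(h, q):
    return np.where(h > 1e-12, q / np.maximum(h, 1e-12), 0.0)

def flux(h, q):
    """Physical flux (q, q*u + g*h^2/2) of the 1-D shallow water equations."""
    return np.array([q, q * vel(h, q) + 0.5 * g * h * h])

def rusanov(hl, ql, hr, qr):
    a = np.maximum(np.abs(vel(hl, ql)) + np.sqrt(g * hl),
                   np.abs(vel(hr, qr)) + np.sqrt(g * hr))
    return 0.5 * (flux(hl, ql) + flux(hr, qr)) - 0.5 * a * np.array([hr - hl, qr - ql])

def step(h, q, b, dx, dt):
    """First-order finite-volume step with hydrostatic reconstruction."""
    bs = np.maximum(b[:-1], b[1:])                 # interface bottom level
    hl = np.maximum(0.0, h[:-1] + b[:-1] - bs)     # reconstructed left depth
    hr = np.maximum(0.0, h[1:] + b[1:] - bs)       # reconstructed right depth
    F = rusanov(hl, hl * vel(h[:-1], q[:-1]), hr, hr * vel(h[1:], q[1:]))
    hn, qn = h.copy(), q.copy()
    hn[1:-1] -= dt / dx * (F[0, 1:] - F[0, :-1])
    qn[1:-1] -= dt / dx * (F[1, 1:] - F[1, :-1]    # topography source term
                           + 0.5 * g * (hr[:-1] ** 2 - hl[1:] ** 2))
    return hn, qn

n = 50
xc = (np.arange(n) + 0.5) / n
b = 0.25 * np.exp(-100.0 * (xc - 0.5) ** 2)        # a smooth bump
H = 1.0
h, q = H - b, np.zeros(n)                          # lake at rest
for _ in range(200):
    h, q = step(h, q, b, dx=1.0 / n, dt=1e-3)
print(np.abs(h + b - H).max(), np.abs(q).max())    # stays at rest
```

A naive centered discretization of the topography source would generate spurious waves over the bump; the reconstruction keeps the equilibrium state exact to rounding error, which is the defining property of a well-balanced scheme.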
Abstract: The case based reasoning (CBR) methodology provides a foundation for a new technology for building intelligent computer-aided diagnosis systems. This technology directly addresses problems found in traditional Artificial Intelligence (AI) techniques, e.g. the problems of knowledge acquisition, remembering, robustness, and maintenance. This paper discusses the CBR methodology and the research issues and technical aspects of implementing intelligent medical diagnosis systems. Successful applications to cancer and heart diseases developed by the Medical Informatics Research Group at Ain Shams University are also discussed.
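The core CBR retrieve-and-reuse step can be sketched in a few lines. The features, cases, and similarity measure below are invented for illustration and are not the Ain Shams systems.

```python
import math

# Toy CBR retrieve step: a diagnosis is proposed by reusing the most
# similar past case. Case base and features are invented for the example.

case_base = [
    ({"age": 61, "chest_pain": 1, "bp": 150}, "heart disease"),
    ({"age": 34, "chest_pain": 0, "bp": 118}, "healthy"),
    ({"age": 55, "chest_pain": 1, "bp": 142}, "heart disease"),
]

def similarity(a, b):
    """Inverse-distance similarity over shared numeric features."""
    d = math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))
    return 1.0 / (1.0 + d)

def retrieve(query):
    return max(case_base, key=lambda c: similarity(query, c[0]))

case, diagnosis = retrieve({"age": 58, "chest_pain": 1, "bp": 147})
print(diagnosis)
```

Because knowledge lives in the cases rather than in hand-built rules, adding a solved case to the base is all that "knowledge acquisition" and "maintenance" require, which is the advantage over traditional AI techniques the abstract points to.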
Abstract: All text processing systems allow their users to search a given text for occurrences of a string pattern. String matching is fundamental to database and text processing applications: every text editor must contain a mechanism to search the current document for arbitrary strings, and spelling checkers scan an input text for words in the dictionary and reject any strings that do not match. We store our information in databases so that we can later retrieve it, and this retrieval can be performed using various string matching algorithms. This paper describes a new string matching algorithm for various applications. The new algorithm has been designed with the help of the Rabin-Karp matcher to improve the string matching process.
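For reference, the classical Rabin-Karp matcher that the new algorithm builds on works as follows: a rolling hash filters candidate positions, and each hash hit is verified character by character. The sketch below is the textbook algorithm, not the paper's improved variant.

```python
# Classical Rabin-Karp string matching: a rolling hash filters candidate
# positions; every hash hit is verified to rule out spurious matches.

def rabin_karp(text, pattern, base=256, mod=1_000_000_007):
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return []
    high = pow(base, m - 1, mod)              # weight of the leading char
    ph = th = 0
    for i in range(m):                        # hashes of pattern and window
        ph = (ph * base + ord(pattern[i])) % mod
        th = (th * base + ord(text[i])) % mod
    hits = []
    for i in range(n - m + 1):
        if ph == th and text[i:i + m] == pattern:
            hits.append(i)                    # verified match
        if i < n - m:                         # slide the window right
            th = ((th - ord(text[i]) * high) * base + ord(text[i + m])) % mod
    return hits

print(rabin_karp("abracadabra", "abra"))      # [0, 7]
```

The rolling update makes each window hash O(1), so the expected running time is linear in the text length, with the explicit comparison guarding against hash collisions.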
Abstract: This work is devoted to the calculation of the undulatory parameters and to the study of the influence of the electrical-sheet thickness on the overvoltage with respect to the frame and between turns (sections) of asynchronous motors supplied by PWM converters.
Abstract: A given polynomial, possibly with multiple roots, is factored into several lower-degree distinct-root polynomials raised to natural-order integer powers. All the roots of the original polynomial, including multiplicities, may then be obtained by solving these lower-degree distinct-root polynomials instead of the original high-degree multiple-root polynomial directly.
The approach requires polynomial Greatest Common Divisor (GCD) computation. A very simple and effective process of "monic polynomial subtractions", cleverly converted from the "longhand polynomial divisions" of the Euclidean algorithm, is employed. It requires only simple elementary arithmetic operations and no advanced mathematics.
Remarkably, the derived routine gives the expected results even for test polynomials of very high degree, such as p(x) = (x+1)^1000.
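The paper's "monic polynomial subtraction" GCD is not reproduced here; the sketch below illustrates the overall idea with a textbook Euclidean GCD over exact rationals and Yun's square-free decomposition, which factors f into distinct-root parts raised to natural-order powers.

```python
from fractions import Fraction

# Polynomials are coefficient lists, highest degree first, over exact
# rationals. Textbook Euclidean GCD + Yun's square-free decomposition,
# yielding f ~ p1^1 * p2^2 * ... with each p_i having only distinct roots.

def trim(p):
    i = 0
    while i < len(p) - 1 and p[i] == 0:
        i += 1
    return p[i:]

def is_zero(p):
    return all(c == 0 for c in p)

def deriv(p):
    n = len(p) - 1
    if n == 0:
        return [Fraction(0)]
    return trim([c * (n - i) for i, c in enumerate(p[:-1])])

def sub(p, q):
    n = max(len(p), len(q))
    p = [Fraction(0)] * (n - len(p)) + list(p)
    q = [Fraction(0)] * (n - len(q)) + list(q)
    return trim([a - b for a, b in zip(p, q)])

def divmod_poly(num, den):
    num = [Fraction(c) for c in num]
    quo = []
    while len(num) >= len(den):
        c = num[0] / den[0]
        quo.append(c)
        num = [num[i] - c * (den[i] if i < len(den) else 0)
               for i in range(len(num))][1:]
    return trim(quo or [Fraction(0)]), trim(num or [Fraction(0)])

def gcd_poly(a, b):
    while not is_zero(b):
        a, b = b, divmod_poly(a, b)[1]
    return [c / a[0] for c in a]              # return the monic GCD

def square_free_parts(f):
    """Yun's algorithm: return [p1, p2, ...] with f ~ p1^1 * p2^2 * ..."""
    f = trim([Fraction(c) for c in f])
    a = gcd_poly(f, deriv(f))
    b = divmod_poly(f, a)[0]
    d = sub(divmod_poly(deriv(f), a)[0], deriv(b))
    parts = []
    while len(b) > 1:
        g = gcd_poly(b, d)
        parts.append(g)
        b = divmod_poly(b, g)[0]
        d = sub(divmod_poly(d, g)[0], deriv(b))
    return parts

# (x - 1) * (x + 1)^2 = x^3 + x^2 - x - 1
print(square_free_parts([1, 1, -1, -1]))
```

Each returned factor is square-free, so any standard root finder applied to the factors recovers all roots of the original polynomial together with their multiplicities.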
Abstract: This paper gives an overview of the mapping mechanism of SEAM, a methodology for the automatic generation of knowledge models, and of its mapping onto Java code. It discusses the rules used to automatically map the different components of the knowledge model onto Java classes, properties, and methods. The aim of developing this mechanism is to help create a prototype for validating the automatically generated knowledge model. It will also help link the modelling phase with the implementation phase, as existing knowledge engineering methodologies do not provide proper guidelines for the transition from the knowledge modelling phase to the development phase. This will decrease the development overheads associated with building Knowledge Based Systems.
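A toy illustration of what a concept-to-class mapping rule might look like; the rule and the knowledge-model format below are invented, not SEAM's actual rules. Each concept becomes a Java class and each attribute a private field with a getter, emitted as source text.

```python
# Invented mapping rule for illustration (not SEAM's): a knowledge-model
# concept is emitted as a Java class with private fields and getters.

concept = {"name": "Patient", "attributes": [("name", "String"), ("age", "int")]}

def to_java(c):
    lines = [f"public class {c['name']} {{"]
    for attr, typ in c["attributes"]:
        lines.append(f"    private {typ} {attr};")
    for attr, typ in c["attributes"]:
        getter = "get" + attr.capitalize()
        lines.append(f"    public {typ} {getter}() {{ return {attr}; }}")
    lines.append("}")
    return "\n".join(lines)

print(to_java(concept))
```

Automating even such simple rules removes a manual transcription step between the knowledge model and the implementation, which is the gap in existing methodologies the abstract describes.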
Abstract: Bioleaching of spent catalyst using moderately thermophilic chemolithotrophic acidophiles in a growth medium without an Fe source was investigated at two different pulp densities and three different size fractions. All the experiments were conducted in shake flasks at a temperature of 65 °C. The leaching yield was promising: 92-96% for Ni, followed by 41-76% for Al, which means that the leaching of both Ni and Al was favored by moderately thermophilic bioleaching compared to mesophilic bioleaching. The acid consumption was comparatively higher in the 10% pulp density experiments. The comparatively small differences in leaching yield between the size fractions and pulp densities show that neither grinding nor pulp densities below 10% are required. The process would thus be both economical and eco-friendly for future optimization of the recovery of metal values from spent catalyst.
Abstract: In this paper, the use of sequential machines for recognizing actions taken by the objects detected by a general tracking algorithm is proposed. The system can deal with the uncertainty inherent in medium-level vision data; for this purpose, the input data are fuzzified. This transformation also allows data to be managed independently of the selected tracking application and enables characteristics of the analyzed scenario to be added. The representation of actions by means of an automaton and the generation of the input symbols for the finite automaton, which depend on the object and action being compared, are described. The output of the comparison process between an object and an action is a numerical value representing the membership of the object to the action; this value is computed according to how similar the object and the action are. The work concludes with an application of the proposed technique to identifying the behavior of vehicles in road traffic scenes.
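A hedged sketch of the pipeline: tracker measurements are fuzzified into input symbols, the symbols drive a finite automaton representing the action, and composition of the transition degrees yields the object's membership to the action. The states, features, and membership functions below are invented for the example, not taken from the paper.

```python
# Invented example: tracked speed values are fuzzified into symbols that
# drive an automaton for the action "stopping"; min-composition of the
# firing degrees gives the object's membership to the action.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function with support (a, d), core [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

FUZZY_SETS = {                 # speed (m/s) fuzzified into input symbols
    "stopped": (-1.0, -0.5, 0.5, 1.5),
    "slow":    (0.5, 1.5, 3.0, 5.0),
    "moving":  (3.0, 5.0, 50.0, 60.0),
}
TRANSITIONS = {                # action "stopping": moving -> slow -> stopped
    ("start", "moving"): "s1",
    ("s1", "slow"): "s2",
    ("s2", "stopped"): "accept",
}

def membership_to_action(speeds):
    state, degree = "start", 1.0
    for v in speeds:
        sym, mu = max(((s, trapezoid(v, *p)) for s, p in FUZZY_SETS.items()),
                      key=lambda t: t[1])
        nxt = TRANSITIONS.get((state, sym))
        if nxt is not None:
            state, degree = nxt, min(degree, mu)    # min-composition
    return degree if state == "accept" else 0.0

print(membership_to_action([10.0, 8.0, 2.0, 0.2]))
```

A decelerating vehicle traverses the whole automaton and receives a high membership, while a vehicle that keeps moving never reaches the accepting state and scores zero.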
Abstract: Since the water resources of the desert city of Naein are very limited, an approach that conserves water resources while meeting the water needs of the green space is to use the city's sewage wastewater. Proper treatment of Naein's sewage up to the standards required for green space use may solve some of the problems of the city's green space development. The present paper closely examines available statistics and information on the city's sewage system and determines the complementary stages required for the city's sewage treatment facilities. The population, per capita water use, and discharge required for various green space plots, including different plants, are calculated. Moreover, in order to facilitate the use of water resources, a raw-water distribution network separate from the drinking water distribution network is designed, and a plan for mixing municipal well water with sewage wastewater in proposed mixing tanks is suggested. Following the green space irrigation reform and the complementary plan, the per capita green space of the city will increase from the current 13.2 square meters to 32 square meters.
Abstract: Multiport diffusers are effective engineering devices installed at modern marine outfalls for the steady discharge of effluent streams from coastal plants such as municipal sewage treatment, thermal power generation, and seawater desalination facilities. A mathematical model using a two-dimensional advection-diffusion equation, based on a flat seabed and incorporating the effect of a coastal tidal current, is developed to calculate the compounded concentration following discharges of desalination brine from a sea outfall with multiport diffusers. The analytical solutions are computed graphically to illustrate the merging of multiple brine plumes in shallow coastal waters, and an approximation of the maximum shoreline concentration is made to formulate the dilution of a multiport diffuser discharge.
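A hedged numerical illustration of plume merging by superposition, using the classical slender-plume solution of the steady two-dimensional advection-diffusion equation for each port; the paper's model, with tidal current and shoreline effects, is richer, and all parameter values below are illustrative.

```python
import math

# Each port is modelled with the slender-plume solution of
#   u * dc/dx = D * d2c/dy2   (steady 2-D advection-diffusion),
#   c(x, y) = q / sqrt(4*pi*D*u*x) * exp(-u*y^2 / (4*D*x)),
# and the compounded concentration is the superposition over all ports.
# Current u, diffusivity D, strength q and spacing are illustrative.

def plume(x, y, q, u, D):
    """Concentration at (x, y) from one port at the origin, x > 0."""
    return q / math.sqrt(4 * math.pi * D * u * x) * math.exp(-u * y * y / (4 * D * x))

def diffuser(x, y, ports, q, u=0.1, D=0.1):
    """Superpose the plumes of all ports placed along the y-axis."""
    return sum(plume(x, y - yp, q, u, D) for yp in ports)

ports = [i * 10.0 for i in range(-2, 3)]        # 5 ports, 10 m apart
near = diffuser(5.0, 0.0, ports, q=1.0)         # close: plumes distinct
far = diffuser(500.0, 0.0, ports, q=1.0)        # far: plumes merged
print(near, far)
```

Close to the diffuser the concentration varies strongly across the ports, while far downstream the five plumes have merged into one nearly uniform band; this is the merging behavior the analytical solutions illustrate graphically.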
Abstract: As the web continues to grow exponentially, the idea of crawling the entire web on a regular basis becomes less and less feasible, so domain-specific search engines, which cover the information of a specific domain, have been proposed. As more information becomes available on the World Wide Web, it becomes more difficult to provide effective search tools for information access. Today, people access web information through two main kinds of search interfaces: browsers (clicking and following hyperlinks) and query engines (queries in the form of a set of keywords indicating the topic of interest) [2]. Web search tools need better support for expressing one's information need and for returning high-quality search results. There appears to be a need for systems that reason under uncertainty and are flexible enough to recover from the contradictions, inconsistencies, and irregularities that such reasoning involves. In a multi-view problem, the features of the domain can be partitioned into disjoint subsets (views), each of which is sufficient to learn the target concept. Semi-supervised multi-view algorithms, which reduce the amount of labeled data required for learning, rely on the assumptions that the views are compatible and uncorrelated. This paper describes the use of a semi-supervised machine learning approach with active learning for domain-specific search engines. A domain-specific search engine is "an information access system that allows access to all the information on the web that is relevant to a particular domain". The proposed work shows that, with the help of this approach, relevant data can be extracted with a minimum of queries fired by the user. It requires a small number of labeled data and a pool of unlabelled data to which the learning algorithm is applied in order to extract the required data.
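The semi-supervised multi-view idea can be sketched with a toy co-training loop: a weak learner is trained on each view, and each view's most confident predictions on the unlabelled pool are added to the labelled set. Everything below (data, nearest-centroid learner, labelling schedule) is invented for illustration and is not the paper's system.

```python
import random

# Toy co-training: two views, each a single numeric feature, each with a
# nearest-centroid learner; confident pseudo-labels grow the labelled set.

def centroid_classifier(labeled, view):
    """Return (predict, confidence) functions fitted on one view."""
    means = {}
    for cls in (0, 1):
        vals = [x[view] for x, y in labeled if y == cls]
        means[cls] = sum(vals) / len(vals)
    def predict(x):
        return min((abs(x[view] - m), cls) for cls, m in means.items())[1]
    def confidence(x):
        return abs(abs(x[view] - means[0]) - abs(x[view] - means[1]))
    return predict, confidence

def co_train(labeled, unlabeled, rounds=5, per_round=2):
    unlabeled = list(unlabeled)
    for _ in range(rounds):
        for view in (0, 1):
            if not unlabeled:
                break
            pred, conf = centroid_classifier(labeled, view)
            unlabeled.sort(key=conf, reverse=True)
            for _ in range(min(per_round, len(unlabeled))):
                x = unlabeled.pop(0)             # most confident example
                labeled.append((x, pred(x)))     # pseudo-label it
    return centroid_classifier(labeled, 0)[0]

# Toy data: class 0 near (0, 0), class 1 near (4, 4) in the two views.
random.seed(3)
def sample(cls):
    return (cls * 4 + random.uniform(-1, 1), cls * 4 + random.uniform(-1, 1))
labeled = [(sample(0), 0), (sample(1), 1)]       # one labeled seed per class
unlabeled = [sample(i % 2) for i in range(40)]
predict = co_train(labeled, unlabeled)
print(predict((0.2, -0.1)), predict((3.8, 4.2)))
```

Starting from a single labelled example per class, the pseudo-labels supplied by each view let the other view's learner improve, which is how such algorithms reduce the amount of labelled data required.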