Abstract: This article considers the positional buckling of composite thick plates under thermal loading. For this purpose, the complex finite strip method is used. The complex finite strip analysis employs harmonic complex functions in the longitudinal direction, cubic functions in the transverse direction, and a parabolic distribution of the transverse shear strain through the thickness of the thick plate, based on higher-order shear deformation theory. The given examples consider the effect of lamination angles, number of layers, aspect ratio, and length-to-thickness ratio on the critical temperature.
Abstract: In this experimental investigation, shake table tests were conducted on two reduced-scale models representing a typical single-room building constructed with Compressed Stabilized Earth Blocks (CSEB) made from locally available soil. One model was constructed with earthquake-resisting features (EQRF), namely a sill band, a lintel band, and vertical bands to control the building vibration, while the other was built without such features. To examine the seismic capacity of the models, particularly under long-period ground motion of large amplitude with many cycles of repeated loading, each test specimen was shaken repeatedly until failure. The test results from a high-end data acquisition system show that the model with EQRF behaves better than the one without. This modified masonry model, combining the new material with the new bands, is intended to improve the behavior of masonry buildings.
Abstract: This paper focuses on Land Use and Land Cover Changes (LULCC) that occurred in the urban coastal regions of the Mediterranean basin over the last thirty years. LULCC were assessed diachronically (1975-2006) in two urban areas, Rome (Italy) and Athens (Greece), using CORINE land cover maps. In strictly coastal territories, a persistent growth of built-up areas at the expense of both agricultural and forest land uses was found. On the contrary, a different pattern was observed in the surrounding inland areas, where a high conversion rate of agricultural land uses to both urban and forest land uses was recorded. The impact of city growth on the complex pattern of coastal LULCC is finally discussed.
Abstract: This paper presents anti-synchronization of chaos between two different chaotic systems using the active control method. The proposed technique is applied to achieve chaos anti-synchronization for the Lü and Rössler dynamical systems.
Numerical simulations are implemented to verify the results.
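The mechanism can be sketched numerically. In the following sketch (gains, step size, and initial states are illustrative assumptions, not taken from the paper), the active controller cancels both vector fields and imposes linear error dynamics de/dt = -K e on the anti-synchronization error e = x_slave + x_master, which therefore decays to zero:

```python
# Active-control anti-synchronization sketch for a Lu master and a
# Rossler slave.  The controller u cancels both vector fields and
# imposes de/dt = -K*e on the error e = x_slave + x_master, so e -> 0.
# Gains, step size and initial states are illustrative assumptions.

def lu(s, a=36.0, b=3.0, c=20.0):
    x, y, z = s
    return [a * (y - x), -x * z + c * y, x * y - b * z]

def rossler(s, a=0.2, b=0.2, c=5.7):
    x, y, z = s
    return [-y - z, x + a * y, b + z * (x - c)]

def simulate(t_end=5.0, dt=1e-3, K=5.0):
    m = [1.0, 2.0, 3.0]    # master (Lu) state
    s = [-4.0, 0.5, 1.0]   # slave (Rossler) state
    for _ in range(int(t_end / dt)):
        fm, fs = lu(m), rossler(s)
        e = [s[i] + m[i] for i in range(3)]                # anti-sync error
        u = [-fs[i] - fm[i] - K * e[i] for i in range(3)]  # active control
        m = [m[i] + dt * fm[i] for i in range(3)]          # Euler steps
        s = [s[i] + dt * (fs[i] + u[i]) for i in range(3)]
    return [s[i] + m[i] for i in range(3)]                 # final error
```

Because the controller enforces purely linear error dynamics, the error decays exponentially regardless of the chaotic motion of the master system.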
Abstract: Calibration estimation is a method of adjusting the
original design weights to improve the survey estimates by using
auxiliary information such as the known population total (or mean)
of the auxiliary variables. A calibration estimator uses calibrated
weights that are determined to minimize a given distance measure to
the original design weights while satisfying a set of constraints
related to the auxiliary information. In this paper, we propose a new
multivariate calibration estimator for the population mean in the
stratified sampling design, which incorporates information available
for more than one auxiliary variable. The problem of determining the
optimum calibrated weights is formulated as a Mathematical
Programming Problem (MPP) that is solved using the Lagrange
multiplier technique.
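For the common chi-square distance, the Lagrange-multiplier solution takes a closed linear form. The sketch below (with illustrative data; the paper's stratified MPP formulation is more general) computes calibrated weights satisfying multivariate benchmark constraints:

```python
# Multivariate calibration weights under the chi-square distance,
# solved with Lagrange multipliers: minimize
#   sum_i (w_i - d_i)^2 / (d_i * q_i)
# subject to sum_i w_i * x_i = X_tot (one constraint per auxiliary
# variable).  The numbers in the example are illustrative; the paper's
# stratified formulation applies this idea within each stratum.

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems)."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            M[r] = [M[r][k] - f * M[c][k] for k in range(n + 1)]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k]
                              for k in range(r + 1, n))) / M[r][r]
    return x

def calibrate(d, x, X_tot, q=None):
    """Calibrated weights w_i = d_i + d_i q_i (x_i . lam), closed form."""
    n, p = len(d), len(X_tot)
    q = q or [1.0] * n
    T = [[sum(d[i] * q[i] * x[i][j] * x[i][k] for i in range(n))
          for k in range(p)] for j in range(p)]
    rhs = [X_tot[j] - sum(d[i] * x[i][j] for i in range(n)) for j in range(p)]
    lam = solve(T, rhs)
    return [d[i] + d[i] * q[i] * sum(x[i][j] * lam[j] for j in range(p))
            for i in range(n)]

# Three sampled units, two auxiliary variables (a constant and x2):
weights = calibrate([10.0, 10.0, 10.0],
                    [[1.0, 2.0], [1.0, 5.0], [1.0, 3.0]],
                    [35.0, 120.0])
```

The resulting weights reproduce the known auxiliary totals exactly while staying as close as possible to the design weights under the chosen distance.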
Abstract: In this paper, we present the information life cycle and analyze the importance of managing the corporate application portfolio across this life cycle. The approach presented here is not just an extension of the traditional information system development life cycle; it is based on the generic life cycle employed in other contexts, such as manufacturing or marketing. The paper proposes a model of an information system life cycle, supported by the assumption that a system has a limited life, which may nevertheless be extended. The model is also applied in several cases; two examples of the framework's application, in a construction enterprise and in a manufacturing enterprise, are reported here.
Abstract: This paper presents a novel genetic algorithm, termed the Optimum Individual Monogenetic Algorithm (OIMGA), and describes its hardware implementation. As the monogenetic strategy
retains only the optimum individual, the memory requirement is
dramatically reduced and no crossover circuitry is needed, thereby
ensuring the requisite silicon area is kept to a minimum.
Consequently, depending on application requirements, OIMGA
allows the investigation of solutions that warrant either larger GA
populations or individuals of greater length. The results given in this
paper demonstrate that both the performance of OIMGA and its
convergence time are superior to those of existing hardware GA
implementations. Local convergence is achieved in OIMGA by
retaining elite individuals, while population diversity is ensured by
continually searching for the best individuals in fresh regions of the
search space.
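The monogenetic strategy can be illustrated in software. The sketch below (the OneMax fitness, lengths, and the fresh-sampling step are illustrative assumptions, not the paper's hardware design) stores no population, only the current elite; mutating the elite gives local search, while freshly sampled random individuals probe new regions of the search space:

```python
import random

# Software sketch of the monogenetic strategy: no population is
# stored, only the current optimum individual; mutation of the elite
# gives local search, and freshly sampled random individuals keep
# searching new regions of the space.  The OneMax fitness, lengths
# and rates are illustrative assumptions, not the paper's hardware.

def monogenetic_search(fitness, n_bits=32, iters=2000, n_fresh=4, seed=1):
    rng = random.Random(seed)
    best = [rng.randint(0, 1) for _ in range(n_bits)]
    best_f = fitness(best)
    for _ in range(iters):
        cand = best[:]                       # local search: mutate elite
        cand[rng.randrange(n_bits)] ^= 1
        fresh = [[rng.randint(0, 1) for _ in range(n_bits)]
                 for _ in range(n_fresh)]    # global search: fresh samples
        for t in [cand] + fresh:
            f = fitness(t)
            if f > best_f:                   # retain only the optimum
                best, best_f = t, f
    return best, best_f

# Example: OneMax, i.e. maximize the number of 1-bits.
best, best_f = monogenetic_search(sum)
```

Note that the memory footprint is one individual plus the candidates of the current iteration, which mirrors why no crossover circuitry or population store is needed in hardware.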
Abstract: A well-balanced numerical scheme based on stationary waves for shallow water flows with arbitrary topography has been introduced by Thanh et al. [18]. The scheme was constructed so that it maintains equilibrium states, and tests indicate that it is stable and fast. Applying this well-balanced scheme to the one-dimensional shallow water equations, we study the early propagation of shock waves toward the Phuket coast in Southern Thailand during a hypothetical tsunami. The initial tsunami wave is generated in the deep ocean with a strength equal to that of the 2004 Indonesian tsunami.
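For readers unfamiliar with the well-balanced property, the following sketch illustrates it on the lake-at-rest equilibrium. It uses a standard first-order hydrostatic-reconstruction scheme with a Rusanov flux, not the stationary-wave scheme of Thanh et al. [18]; all parameters are illustrative:

```python
import math

# Lake-at-rest check for a well-balanced 1-D shallow-water scheme.
# This is NOT the scheme of Thanh et al. [18]; it is a standard
# first-order hydrostatic-reconstruction scheme with a Rusanov flux,
# shown only to illustrate what "maintains equilibrium states" means:
# a flat free surface over uneven topography must stay stationary.

g = 9.81

def step(h, hu, b, dx, dt):
    """One explicit step; zero-gradient ghost cells at both ends."""
    H = [h[0]] + h + [h[-1]]
    HU = [hu[0]] + hu + [hu[-1]]
    B = [b[0]] + b + [b[-1]]
    hn, hun = H[:], HU[:]
    for i in range(len(H) - 1):              # interfaces i+1/2
        bs = max(B[i], B[i + 1])             # hydrostatic reconstruction
        hl = max(0.0, H[i] + B[i] - bs)
        hr = max(0.0, H[i + 1] + B[i + 1] - bs)
        ul = HU[i] / H[i] if H[i] > 0 else 0.0
        ur = HU[i + 1] / H[i + 1] if H[i + 1] > 0 else 0.0
        a = max(abs(ul) + math.sqrt(g * hl), abs(ur) + math.sqrt(g * hr))
        # Rusanov (local Lax-Friedrichs) flux on reconstructed states
        f0 = 0.5 * (hl * ul + hr * ur) - 0.5 * a * (hr - hl)
        f1 = 0.5 * (hl * ul ** 2 + 0.5 * g * hl ** 2
                    + hr * ur ** 2 + 0.5 * g * hr ** 2) \
            - 0.5 * a * (hr * ur - hl * ul)
        # source-term corrections that make the scheme well balanced
        cl = 0.5 * g * (H[i] ** 2 - hl ** 2)
        cr = 0.5 * g * (H[i + 1] ** 2 - hr ** 2)
        hn[i] -= dt / dx * f0
        hun[i] -= dt / dx * (f1 + cl)
        hn[i + 1] += dt / dx * f0
        hun[i + 1] += dt / dx * (f1 + cr)
    return hn[1:-1], hun[1:-1]

# Lake at rest over a submerged bump: the surface must stay flat.
dx, dt, n = 0.1, 0.01, 40
b = [0.5 * math.exp(-((i * dx - 2.0) ** 2) / 0.2) for i in range(n)]
h = [1.0 - bi for bi in b]
hu = [0.0] * n
for _ in range(100):
    h, hu = step(h, hu, b, dx, dt)
```

A naive flux-plus-pointwise-source discretization would generate spurious waves over the bump; a well-balanced scheme keeps the surface flat to machine precision, which is essential before trusting the propagation of an actual tsunami wave.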
Abstract: Case-based reasoning (CBR) methodology presents a foundation for a new technology for building intelligent computer-aided diagnosis systems. This technology directly addresses problems found in traditional Artificial Intelligence (AI) techniques, e.g. the problems of knowledge acquisition, remembering, robustness, and maintenance. This paper discusses the CBR methodology and the research issues and technical aspects of implementing intelligent medical diagnosis systems. Successful applications in cancer and heart disease developed by the Medical Informatics Research Group at Ain Shams University are also discussed.
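The core retrieve-and-reuse step of the CBR cycle can be sketched as follows (the toy case base, features, and weights are illustrative only, not the cancer or heart-disease systems discussed in the paper):

```python
# Retrieve-and-reuse step of the CBR cycle: retrieve stored cases
# most similar to the new problem and reuse their solutions by
# majority vote.  The toy case base, features and weights are
# illustrative only, not the cancer or heart-disease systems.

def similarity(a, b, weights):
    d = sum(w * (a[k] - b[k]) ** 2 for k, w in weights.items())
    return 1.0 / (1.0 + d)

def retrieve_and_reuse(case_base, query, weights, k=3):
    ranked = sorted(case_base,
                    key=lambda c: similarity(c["features"], query, weights),
                    reverse=True)
    votes = {}
    for c in ranked[:k]:                 # reuse the k nearest cases
        votes[c["solution"]] = votes.get(c["solution"], 0) + 1
    return max(votes, key=votes.get)

case_base = [
    {"features": {"age": 0.6, "bp": 0.8}, "solution": "high risk"},
    {"features": {"age": 0.6, "bp": 0.7}, "solution": "high risk"},
    {"features": {"age": 0.2, "bp": 0.3}, "solution": "low risk"},
    {"features": {"age": 0.3, "bp": 0.2}, "solution": "low risk"},
]
weights = {"age": 1.0, "bp": 1.0}
```

Knowledge acquisition reduces to collecting solved cases, and maintenance reduces to adding or retiring cases, which is exactly the advantage over rule-based AI that the abstract points to.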
Abstract: Text processing systems allow their users to search for a pattern string in a given text. String matching is fundamental to database and text processing applications. Every text editor must contain a mechanism to search the current document for arbitrary strings. Spelling checkers scan an input text for words in the dictionary and reject any strings that do not match. We store information in databases so that it can be retrieved later, and this retrieval can be performed using various string matching algorithms. This paper describes a new string matching algorithm for various applications, designed with the help of the Rabin-Karp matcher to improve the string matching process.
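For reference, the textbook Rabin-Karp matcher on which the new algorithm builds can be sketched as follows (this is the standard rolling-hash version, not the paper's modified algorithm):

```python
# Textbook Rabin-Karp matcher (rolling hash), the building block the
# paper's new algorithm is designed around; this is the standard
# version, not the paper's modification.  Returns every start index
# at which pattern occurs in text.

def rabin_karp(text, pattern, base=256, mod=1_000_003):
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return []
    lead = pow(base, m - 1, mod)       # weight of the window's lead char
    ph = th = 0
    for i in range(m):                 # hashes of pattern and first window
        ph = (ph * base + ord(pattern[i])) % mod
        th = (th * base + ord(text[i])) % mod
    hits = []
    for s in range(n - m + 1):
        # confirm on hash match to rule out spurious collisions
        if ph == th and text[s:s + m] == pattern:
            hits.append(s)
        if s < n - m:                  # roll the window by one character
            th = ((th - ord(text[s]) * lead) * base + ord(text[s + m])) % mod
    return hits
```

The rolling hash updates each window in O(1), so the expected running time is linear in the text length, with the explicit comparison guarding against hash collisions.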
Abstract: This work is devoted to the calculation of the wave parameters and to the study of the influence of the electrical sheet thickness on the overvoltages relative to the frame and between turns (sections) of asynchronous motors supplied by PWM converters.
Abstract: A given polynomial, possibly with multiple roots, is factored into several lower-degree distinct-root polynomials raised to natural-number integer powers. All the roots of the original polynomial, including their multiplicities, may then be obtained by solving these lower-degree distinct-root polynomials instead of the original high-degree multiple-root polynomial directly.
The approach requires polynomial Greatest Common Divisor (GCD) computation. A very simple and effective process, "monic polynomial subtraction", cleverly converted from the "longhand polynomial division" of the Euclidean algorithm, is employed. It requires only simple elementary arithmetic operations and no advanced mathematics.
Remarkably, the derived routine gives the expected results for test polynomials of very high degree, such as p(x) = (x+1)^1000.
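A minimal version of the GCD-based distinct-root decomposition can be sketched with exact rational arithmetic. This sketch uses Yun's square-free factorization with ordinary long division rather than the paper's monic-subtraction process, so it illustrates the decomposition itself, not the paper's arithmetic trick:

```python
from fractions import Fraction

# Distinct-root (square-free) decomposition via polynomial GCDs.
# Yun's algorithm with ordinary long division over exact rationals --
# this illustrates the decomposition the paper relies on, not the
# paper's "monic polynomial subtraction" GCD process.  Polynomials
# are coefficient lists, highest degree first.

def trim(p):
    i = 0
    while i < len(p) - 1 and p[i] == 0:
        i += 1
    return p[i:]

def deriv(p):
    n = len(p) - 1
    if n == 0:
        return [Fraction(0)]
    return trim([c * (n - i) for i, c in enumerate(p[:-1])])

def divmod_poly(a, b):
    a, b = trim(a), trim(b)
    if len(a) < len(b):
        return [Fraction(0)], a
    q = [Fraction(0)] * (len(a) - len(b) + 1)
    r = a[:]
    while len(r) >= len(b) and any(c != 0 for c in r):
        coef = r[0] / b[0]
        q[len(q) - 1 - (len(r) - len(b))] = coef
        r = [r[i] - coef * (b[i] if i < len(b) else Fraction(0))
             for i in range(len(r))]
        r = trim(r[1:]) if len(r) > 1 else []
    return q, (r if r else [Fraction(0)])

def monic(p):
    p = trim(p)
    return [c / p[0] for c in p] if p[0] != 0 else p

def gcd_poly(a, b):
    a, b = trim(a), trim(b)
    while any(c != 0 for c in b):
        _, r = divmod_poly(a, b)
        a, b = b, r
    return monic(a)

def sub(a, b):
    n = max(len(a), len(b))
    a = [Fraction(0)] * (n - len(a)) + list(a)
    b = [Fraction(0)] * (n - len(b)) + list(b)
    return trim([a[i] - b[i] for i in range(n)])

def square_free(f):
    """Return [(factor, multiplicity), ...] with distinct-root factors."""
    f = monic([Fraction(c) for c in trim(f)])
    fp = deriv(f)
    g = gcd_poly(f, fp)
    c, _ = divmod_poly(f, g)
    d = sub(divmod_poly(fp, g)[0], deriv(c))
    out, m = [], 1
    while len(c) > 1:
        a = gcd_poly(c, d)
        c, _ = divmod_poly(c, a)
        d = sub(divmod_poly(d, a)[0], deriv(c))
        if len(a) > 1:
            out.append((a, m))
        m += 1
    return out
```

For example, x^5 - x^4 - 5x^3 + x^2 + 8x + 4 = (x-2)^2 (x+1)^3 is recovered as the distinct-root factors (x-2) and (x+1) with powers 2 and 3. The same decomposition applies in principle to p(x) = (x+1)^1000, though exact rational arithmetic at that degree is slow.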
Abstract: This paper gives an overview of the mapping mechanism of SEAM, a methodology for the automatic generation of knowledge models, and its mapping onto Java code. It discusses the rules used to automatically map the different components of the knowledge model onto Java classes, properties, and methods. The aim of developing this mechanism is to help create a prototype for validating the automatically generated knowledge model. It will also help to link the modeling phase with the implementation phase, as existing knowledge engineering methodologies do not provide proper guidelines for the transition from the knowledge modeling phase to the development phase. This will decrease the development overheads associated with building Knowledge-Based Systems.
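The flavor of such a mapping can be illustrated with a small code generator. The concept structure and naming rules below are assumptions made for illustration; the actual mapping rules are those defined by SEAM in the paper:

```python
# Illustrative mapping of one knowledge-model concept onto a Java
# class with private properties and accessor methods.  The concept
# structure and the naming rules here are assumptions made for
# illustration; the actual mapping rules are those defined by SEAM.

def to_java(concept):
    name = concept["name"]
    lines = [f"public class {name} {{"]
    for prop, jtype in concept["properties"].items():
        lines.append(f"    private {jtype} {prop};")
    for prop, jtype in concept["properties"].items():
        cap = prop[0].upper() + prop[1:]
        lines.append(f"    public {jtype} get{cap}() {{ return {prop}; }}")
        lines.append(f"    public void set{cap}({jtype} v) "
                     f"{{ this.{prop} = v; }}")
    lines.append("}")
    return "\n".join(lines)

concept = {"name": "Patient", "properties": {"name": "String", "age": "int"}}
java_src = to_java(concept)
```

Automating this step is what bridges the modeling phase and the development phase: each model component deterministically yields a class skeleton, leaving only behavior to be implemented.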
Abstract: Bioleaching of spent catalyst using moderately thermophilic chemolithotrophic acidophiles in a growth medium without an Fe source was investigated at two different pulp densities and three different size fractions. All the experiments were conducted in shake flasks at a temperature of 65 °C. The leaching yields were promising: 92-96% for Ni, followed by 41-76% for Al, which means that the leaching of both Ni and Al was favored by moderately thermophilic bioleaching compared to mesophilic bioleaching. Acid consumption was comparatively higher in the 10% pulp density experiments. The comparatively small differences in leaching yield among the different size fractions and pulp densities show that neither grinding nor pulp densities below 10% are required. This could thus be an economical as well as eco-friendly process for future optimization of the recovery of metal values from spent catalyst.
Abstract: In this paper, the use of sequential machines for recognizing actions taken by the objects detected by a general tracking algorithm is proposed. The system can deal with the uncertainty inherent in medium-level vision data. For this purpose, the input data are fuzzified; this transformation also allows data to be managed independently of the selected tracking application and enables adding characteristics of the analyzed scenario. The representation of actions by means of an automaton and the generation of the input symbols for the finite automaton, depending on the object and action compared, are described. The output of the comparison process between an object and an action is a numerical value that represents the membership of the object to the action; this value is computed according to how similar the object and the action are. The work concludes with the application of the proposed technique to identify the behavior of vehicles in road traffic scenes.
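The recognition scheme can be sketched as follows. The "speeding up" action, the input symbols, and the membership functions below are illustrative assumptions; only the mechanism (fuzzified inputs driving an automaton under max-min composition, yielding a degree of membership rather than a crisp accept/reject) reflects the approach described:

```python
# Recognition sketch: tracker measurements are fuzzified into input
# symbols, and the automaton describing an action is run with fuzzy
# (max-min) transitions, so the output is a degree of membership of
# the object to the action instead of a crisp accept/reject.  The
# "speeding up" action, symbols and membership functions below are
# illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(speed):
    return {"slow": tri(speed, -20, 0, 40), "fast": tri(speed, 20, 60, 100)}

# Action "speeding up": stay in s0 while slow, reach accepting s1 when fast.
TRANS = {("s0", "slow"): "s0", ("s0", "fast"): "s1", ("s1", "fast"): "s1"}

def action_membership(speeds, accept="s1"):
    state = {"s0": 1.0}                          # fuzzy state memberships
    for v in speeds:
        sym = fuzzify(v)
        nxt = {}
        for (s, a), t in TRANS.items():
            if s in state:
                deg = min(state[s], sym[a])          # fuzzy AND
                nxt[t] = max(nxt.get(t, 0.0), deg)   # fuzzy OR
        state = nxt
    return state.get(accept, 0.0)
```

An object whose measurements only weakly match the action's symbol sequence ends up with a low but nonzero membership, which is precisely how the uncertainty of medium-level vision data is propagated.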
Abstract: Since the water resources of the desert city of Naein are very limited, an approach that saves water resources while meeting the water needs of the green space is to use the city's sewage wastewater. Proper treatment of Naein's sewage up to the standards required for green space use may solve some of the problems of the city's green space development. The present paper closely examines the available statistics and information associated with the city's sewage system and determines the complementary stages of the city's sewage treatment facilities. The population, per capita water use, and required discharge for various green space areas, including different plants, are calculated. Moreover, in order to facilitate the use of water resources, a crude-water distribution network separate from the drinking water distribution network is designed, and a plan for mixing municipal well water with sewage wastewater in proposed mixing tanks is suggested. Following the green space irrigation reform and the complementary plan, the city's per capita green space will increase from the current 13.2 square meters to 32 square meters.
Abstract: Multiport diffusers are effective engineering devices installed at modern marine outfalls for the steady discharge of effluent streams from coastal plants, such as municipal sewage treatment, thermal power generation, and seawater desalination. A mathematical model using a two-dimensional advection-diffusion equation, based on a flat seabed and incorporating the effect of a coastal tidal current, is developed to calculate the compounded concentration following discharges of desalination brine from a sea outfall with multiport diffusers. The analytical solutions are presented graphically to illustrate the merging of multiple brine plumes in shallow coastal waters, and a further approximation is made to the maximum shoreline concentration to formulate the dilution of a multiport diffuser discharge.
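The superposition underlying plume merging can be sketched in a simplified setting. The sketch below uses a steady, depth-averaged advection-diffusion solution with a uniform current, neglecting longitudinal diffusion and the tidal time-dependence of the paper's model, so it is only a far-field illustration; all parameter values are assumptions:

```python
import math

# Far-field superposition sketch for a multiport diffuser: steady,
# depth-averaged advection-diffusion with a uniform current U along x
# and lateral dispersion D, neglecting longitudinal diffusion and the
# tidal time-dependence of the paper's model.  Each port at (0, y0)
# is a continuous point source; concentrations add linearly.  All
# parameter values are illustrative assumptions.

def port_conc(x, y, y0, q, U, D, h):
    """Concentration at (x, y) from one port discharging q at (0, y0)."""
    if x <= 0.0:
        return 0.0
    var = 2.0 * D * x / U                      # lateral plume variance
    return (q / (h * U * math.sqrt(2.0 * math.pi * var))
            * math.exp(-(y - y0) ** 2 / (2.0 * var)))

def diffuser_conc(x, y, n_ports=10, spacing=5.0,
                  q=0.05, U=0.1, D=0.5, h=8.0):
    """Compounded concentration from n_ports equally spaced ports."""
    return sum(port_conc(x, y, i * spacing, q, U, D, h)
               for i in range(n_ports))
```

Close to the diffuser the concentration field shows a separate peak above each port; farther downstream the individual plumes merge into a single compounded plume, which is the behavior the paper's graphical solutions illustrate.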
Abstract: Reinforced Concrete (RC) structures strengthened with fiber reinforced polymer (FRP) lack thermal resistance at elevated temperatures in the event of fire. This has led to the lining of strengthened concrete with thin high performance cementitious composites (THPCC) to protect the substrate against elevated temperature. Elevated temperature effects on THPCC based on different cementitious materials have been studied in the past, but high-alumina cement (HAC)-based THPCC has not been well characterized. This study focuses on THPCC based on HAC with 60%, 70%, 80%, and 85% of the HAC replaced by ground granulated blast furnace slag (GGBS). Samples were evaluated by measuring their mechanical strength (after 28 and 56 days of curing) following exposure to 400°C and 600°C, with room temperature (28°C) as a comparison, and the results were corroborated by a microstructure study. Among all mixtures, the mix containing only HAC showed the highest compressive strength after exposure to 600°C. However, the tensile strength of the THPCC made of HAC with 60% GGBS content was comparable to that of the HAC-only THPCC after exposure to 600°C. Field emission scanning electron microscopy (FESEM) images of the THPCC, together with Energy Dispersive X-ray (EDX) microanalysis, revealed that the microstructure deteriorated considerably after exposure to elevated temperatures, which led to the decrease in mechanical strength.
Abstract: We investigate efficient spreading codes for transmitter-based techniques in code division multiple access (CDMA) systems. The channel is considered to be known at the transmitter, which is usual in a time division duplex (TDD) system, where the channel is assumed to be the same on the uplink and downlink. For such a TDD/CDMA system, both bitwise and blockwise multiuser transmission schemes are taken up, where complexity is transferred to the transmitter side so that the receiver has minimum complexity. Different spreading codes are considered at the transmitter to spread the signal efficiently over the entire spectrum. The bit error rate (BER) curves portray the efficiency of the codes in the presence of multiple access interference (MAI) as well as intersymbol interference (ISI).
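As a concrete illustration of why the code choice matters, orthogonal Walsh-Hadamard codes (one family commonly considered for synchronous CDMA links; the abstract does not name the exact code sets compared) have zero cross-correlation, so synchronous despreading removes MAI entirely:

```python
# Orthogonal Walsh-Hadamard spreading codes.  The abstract does not
# name the exact code sets compared, so this sketch only shows the
# mechanism by which orthogonal codes suppress multiple access
# interference (MAI) on a synchronous channel: distinct codes have
# zero cross-correlation, so each user's symbol is recovered exactly.

def hadamard(n):
    """Sylvester construction; n must be a power of two."""
    H = [[1]]
    while len(H) < n:
        H = ([row + row for row in H]
             + [row + [-x for x in row] for row in H])
    return H

def correlate(a, b):
    return sum(x * y for x, y in zip(a, b))

codes = hadamard(8)          # 8 chips -> 8 orthogonal spreading codes

# Synchronous superposition of 3 users, then despreading per user:
symbols = [1, -1, 1]
chips = [sum(symbols[k] * codes[k][c] for k in range(3)) for c in range(8)]
recovered = [correlate(chips, codes[k]) // 8 for k in range(3)]
```

In a dispersive channel the orthogonality is degraded by ISI, which is why the BER comparison across code families in the paper is meaningful.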
Abstract: As the web continues to grow exponentially, the idea of crawling the entire web on a regular basis becomes less and less feasible, so domain-specific search engines, which index information on a specific domain, have been proposed. As more information becomes available on the World Wide Web, it becomes more difficult to provide effective search tools for information access. Today, people access web information through two main kinds of search interfaces: browsers (clicking and following hyperlinks) and query engines (queries in the form of a set of keywords showing the topic of interest) [2]. Better support is needed for expressing one's information need and returning high-quality search results by web search tools. There appears to be a need for systems that reason under uncertainty and are flexible enough to recover from the contradictions, inconsistencies, and irregularities that such reasoning involves. In a multi-view problem, the features of the domain can be partitioned into disjoint subsets (views) that are sufficient to learn the target concept. Semi-supervised, multi-view algorithms, which reduce the amount of labeled data required for learning, rely on the assumptions that the views are compatible and uncorrelated. This paper describes the use of a semi-supervised machine learning approach with active learning for domain-specific search engines. A domain-specific search engine is "an information access system that allows access to all the information on the web that is relevant to a particular domain". The proposed work shows that, with the help of this approach, relevant data can be extracted with a minimum of queries fired by the user. It requires a small number of labeled examples and a pool of unlabeled data to which the learning algorithm is applied in order to extract the required data.
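The active-learning loop at the heart of such an approach can be sketched with uncertainty sampling. The 1-D pool, the nearest-centroid learner, and the relevance oracle below are toy assumptions, not the paper's search-engine setup:

```python
# Pool-based active learning with uncertainty sampling: a learner
# trained on a few labeled examples repeatedly queries the label of
# the most uncertain unlabeled example, so fewer labels are needed.
# The 1-D data and the nearest-centroid learner are toy assumptions,
# not the paper's search-engine setting.

def centroids(labeled):
    sums, counts = {}, {}
    for x, y in labeled:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def uncertainty(x, cent):
    # small margin between the two nearest class centroids = uncertain
    d = sorted(abs(x - c) for c in cent.values())
    return -(d[1] - d[0]) if len(d) > 1 else 0.0

def active_learn(pool, oracle, seed_labeled, budget):
    labeled, pool = list(seed_labeled), list(pool)
    for _ in range(budget):
        cent = centroids(labeled)
        x = max(pool, key=lambda p: uncertainty(p, cent))  # most uncertain
        pool.remove(x)
        labeled.append((x, oracle(x)))                     # query the user
    cent = centroids(labeled)
    predict = lambda v: min(cent, key=lambda y: abs(v - cent[y]))
    return predict, len(labeled)

# Hypothetical relevance oracle for a "relevant vs irrelevant" task:
oracle = lambda v: "rel" if v > 0.5 else "irrel"
predict, n_used = active_learn(
    [0.1, 0.2, 0.45, 0.48, 0.52, 0.55, 0.8, 0.9],
    oracle, [(0.0, "irrel"), (1.0, "rel")], budget=3)
```

Only the examples nearest the current decision boundary are sent to the user for labeling, which is how the number of queries fired by the user is kept to a minimum.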