Abstract: Urban transformation, in both its framework and its general significance, has become a fundamental and vital subject of consideration for developed and developing societies alike. It has become important to regulate the architectural systems adopted by the city, to sustain present development on the one hand and to facilitate its future growth on the other.
This study therefore addresses the phenomenon of urban transformation in Mediterranean cities, and in the city of Alexandria in particular, because of its significant historical and cultural legacy, its historic architecture and its contemporary urbanization. The article investigates cities across the Mediterranean region by analyzing the relationship between the expansion and growth of these cities and the complexity of their urban barriers. We aim to analyze not only the internal transformations but also the external relationships (both imperial and post-colonial) that have shaped Alexandria's growth from the nineteenth century to the present.
Abstract: This article discusses methods for developing theatrical language, tracing the concept of "theatrical language" from its origins to the present day and examining its problems. Speaking meaningfully on the theater stage is a skillful art. Perhaps the most difficult task is to convey the poet's idea, worldview and heartfelt feelings while also delivering the norms of speech, unbroken, to the audience's ear in a fascinating rather than repellent way. Hence "the word is the mirror of the idea". The importance of theatrical language should not be perceived as a mere formality; it is "the yarn from which the carpet of culture is woven", and thereby a tool that transmits our culture and way of life from generation to generation. In the creative moment, the "word" comes from the poet, and the art of "word and feeling" comes from the actor; were it not so, audiences could simply read the text of the work instead of going to the theater to see it performed. Fundamental works by Turkish, Kazakh and English scholars have been taken as the basis for this research.
Abstract: The demand for taller structures is becoming imperative almost everywhere in the world, alongside the challenges of material and labor costs, project timelines, etc. This paper presents a study motivated by the challenging nature of high-rise construction, for which there are no generic rules for deflection minimization and frequency control. The effects of cyclonic wind and the provision of outriggers on 28-storey, 42-storey and 57-storey buildings are examined, and conclusions are drawn that pave the way for further research in this area of civil engineering. The results show that plan dimensions have a vital impact on achievable structural heights: increasing the height while keeping the plan dimensions the same reduces the lateral rigidity. To achieve the required stiffness, larger bracing sizes as well as additional lateral-resisting systems such as belt trusses and outriggers are required.
Abstract: A potentially serious problem with current payment systems is that their underlying hard problems from number theory may be solved by either a quantum computer or unanticipated future advances in algorithms and hardware. A new quantum payment system is proposed in this paper. The suggested system makes use of fundamental principles of quantum mechanics to ensure unconditional security without prior arrangements between customers and vendors. More specifically, the new system uses Greenberger-Horne-Zeilinger (GHZ) states and Quantum Key Distribution to authenticate the vendors and guarantee transaction integrity.
Abstract: The IFS is a scheme for describing and manipulating complex fractal attractors using simple mathematical models. More precisely, the most popular "fractal-based" algorithms for both representation and compression of computer images have involved some implementation of the method of Iterated Function Systems (IFS) on complete metric spaces. In this paper a new generalized space, called the Multi-Fuzzy Fractal Space, is constructed. On these spaces a distance function is defined and its completeness is proved. The completeness of this space ensures the existence of a fixed-point theorem for the family of continuous mappings. This theorem is the fundamental result on which IFS methods are based and from which fractals are built. The defined mappings are proved to satisfy some generalizations of the contraction condition.
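As a concrete illustration of the classical (crisp) IFS machinery this abstract generalizes, the following minimal Python sketch iterates the two contraction maps of the middle-third Cantor set on the complete metric space [0, 1]; the "chaos game" orbit converges to the attractor guaranteed by the fixed-point theorem. The maps and parameters are a standard textbook example, not the paper's multi-fuzzy construction.

```python
import random

# IFS for the middle-third Cantor set: two contractions on [0, 1],
# each with contraction ratio 1/3 (so the Hutchinson operator is a
# contraction on the space of compact subsets, and a unique attractor exists).
def w1(x): return x / 3.0
def w2(x): return x / 3.0 + 2.0 / 3.0

def chaos_game(x0=0.5, steps=1000, seed=0):
    """Randomly apply the IFS maps; the orbit converges to the attractor."""
    rng = random.Random(seed)
    x = x0
    orbit = []
    for _ in range(steps):
        x = rng.choice((w1, w2))(x)
        orbit.append(x)
    return orbit

orbit = chaos_game()
# After one application every point lies in [0, 1/3] or [2/3, 1]:
# the open middle third is never visited again.
assert all(x <= 1/3 + 1e-12 or x >= 2/3 - 1e-12 for x in orbit)
```

The same pattern (finitely many contractions plus random iteration) underlies the image-representation algorithms the abstract mentions.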
Abstract: In this paper, a new adaptive Fourier decomposition (AFD) based time-frequency speech analysis approach is proposed. Given that the fundamental frequency of speech signals often undergoes fluctuation, classical short-time Fourier transform (STFT) based spectrogram analysis suffers from the difficulty of window size selection. AFD is a newly developed signal decomposition theory designed to deal with time-varying, non-stationary signals. Its outstanding characteristic is that it provides an instantaneous frequency for each decomposed component, which makes time-frequency analysis easier. Experiments are conducted on a sample sentence from the TIMIT Acoustic-Phonetic Continuous Speech Corpus. The results show that the AFD based time-frequency distribution outperforms the STFT based one.
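The window-size difficulty the abstract attributes to the STFT can be made concrete: with an analysis window of N samples, frequency can only be resolved to within one DFT bin, fs/N Hz. The minimal pure-Python sketch below (naive DFT, invented test tone; not the paper's AFD method) shows the peak-frequency error shrinking as the window grows, at the cost of time resolution.

```python
import math

def dft_mag(frame):
    """Naive DFT magnitude spectrum (positive frequencies only);
    fine for a small demonstration."""
    N = len(frame)
    return [abs(sum(frame[n] * complex(math.cos(2 * math.pi * k * n / N),
                                       -math.sin(2 * math.pi * k * n / N))
                    for n in range(N)))
            for k in range(N // 2)]

fs = 8000.0          # sampling rate (Hz), chosen for illustration
f0 = 1050.0          # test tone, deliberately off the coarse bin grid
signal = [math.sin(2 * math.pi * f0 * n / fs) for n in range(512)]

for N in (64, 512):  # short vs. long analysis window
    mags = dft_mag(signal[:N])
    peak_hz = max(range(len(mags)), key=mags.__getitem__) * fs / N
    # the peak can only be located to within one bin (fs/N Hz)
    assert abs(peak_hz - f0) <= fs / N
```

With N = 64 the bin width is 125 Hz; with N = 512 it is about 15.6 Hz. AFD sidesteps this trade-off by assigning an instantaneous frequency to each decomposed component instead of fixing a window.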
Abstract: This paper proposes an auto-classification algorithm for Web pages using data mining techniques. We consider the problem of discovering association rules between terms in a set of Web pages belonging to a category in a search engine database, and present an auto-classification algorithm for solving this problem that is fundamentally based on the Apriori algorithm. The proposed technique has two phases. The first is a training phase, in which human experts determine the categories of different Web pages and the supervised data mining algorithm combines these categories with appropriately weighted index terms according to the highest-support rules among the most frequent words. The second is the categorization phase, in which a web crawler crawls the World Wide Web to build a database categorized according to the results of the data mining approach. This database contains URLs and their categories.
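The frequent-term mining step the abstract builds on can be sketched with a minimal Apriori in Python. The toy "documents" (term sets) and the support threshold are invented for illustration; the paper's weighting and crawling stages are not reproduced.

```python
def apriori(transactions, min_support):
    """Return frequent term sets with support >= min_support (a fraction),
    using the Apriori level-wise candidate generation."""
    n = len(transactions)

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t) / n

    # L1: frequent single terms
    items = {i for t in transactions for i in t}
    current = [frozenset([i]) for i in items
               if support(frozenset([i])) >= min_support]
    frequent = {s: support(s) for s in current}

    k = 2
    while current:
        # join step: candidates are unions of frequent (k-1)-sets of size k
        candidates = {a | b for a in current for b in current if len(a | b) == k}
        current = [c for c in candidates if support(c) >= min_support]
        frequent.update({c: support(c) for c in current})
        k += 1
    return frequent

# Toy term sets standing in for indexed Web pages:
docs = [{"python", "code"}, {"python", "web"},
        {"python", "code", "web"}, {"news"}]
freq = apriori(docs, min_support=0.5)
assert frozenset({"python"}) in freq           # support 3/4
assert frozenset({"python", "code"}) in freq   # support 2/4 = 0.5
assert frozenset({"news"}) not in freq         # support 1/4 < 0.5
```

In the paper's setting, the frequent term sets per expert-labeled category would then be turned into weighted index terms for the crawler's categorization phase.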
Abstract: Soft clays are defined as cohesive soils whose water content is higher than their liquid limit. Soil-cement mixing is therefore adopted to improve ground conditions by enhancing the strength and deformation characteristics of soft clays. For these reasons, a series of laboratory tests was carried out to study some fundamental mechanical properties of cement-stabilized soft clay. The test specimens were prepared by varying the proportion of ordinary Portland cement added to soft clay sampled from the test site of RECESS (Research Centre for Soft Soil). Comparisons were made between homogeneous and columnar system specimens for cement contents of 0, 5 and 10% and curing periods of 3, 28 and 56 days. The mechanical properties examined were one-dimensional compressibility and undrained shear strength; for both, homogeneous and columnar specimens were prepared to examine the effect of different cement contents and curing periods on the stabilized soil. The one-dimensional compressibility test was conducted using an oedometer, while a direct shear box was used to measure the undrained shear strength. The higher the cement content, the greater the increase in yield stress and the greater the decrease in compression index; cement content is a more influential parameter than curing period.
Abstract: Measuring the quality of image compression is important for image processing applications. In this paper, we propose an objective image quality assessment for grayscale compressed images that correlates well with subjective quality measurement (MOS) and requires the least computation time. The new objective measurement is developed from a few fundamental objective measurements to evaluate compressed image quality for JPEG and JPEG2000. The reliability between each fundamental objective measurement and the subjective measurement (MOS) is established. The experimental results show that the Maximum Difference measurement (MD) and a newly proposed measurement, Structural Content Laplacian Mean Square Error (SCLMSE), are suitable for evaluating the quality of JPEG2000 and JPEG compressed images, respectively. In addition, the MD and SCLMSE measurements are scaled to make them equivalent to MOS, rating compressed image quality from 1 to 5 (unacceptable to excellent).
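Of the measures the abstract names, Maximum Difference (MD) is the simplest: the largest absolute pixel error between the original and the compressed image. A minimal Python sketch follows (flattened toy pixel lists are invented for illustration; the paper's SCLMSE measure and MOS scaling are not reproduced here).

```python
def max_difference(original, distorted):
    """Maximum Difference (MD): the largest absolute pixel error
    between an original and a distorted (e.g. compressed) image,
    both given as flat sequences of gray levels."""
    if len(original) != len(distorted):
        raise ValueError("images must have the same number of pixels")
    return max(abs(a - b) for a, b in zip(original, distorted))

# Toy 4-pixel "images": one pixel is off by 10 gray levels.
orig = [10, 50, 200, 128]
comp = [12, 48, 190, 128]
assert max_difference(orig, comp) == 10
```

Because MD reduces the whole error image to a single worst-case value, it is cheap to compute, which is consistent with the abstract's emphasis on low computation time.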
Abstract: In recent years, traditional learning approaches have undergone fundamental changes due to the emergence of new technologies such as multimedia, hypermedia and telecommunication. E-learning is a modern phenomenon that has come into existence in the information age and in a knowledge-based society, and it has developed significantly within a short period of time. It is therefore of great significance to secure information, allow confident access and prevent unauthorized access. Making use of individuals' physiological or behavioral (biometric) properties is a reliable way to secure information. Among biometrics, the fingerprint is the most widely accepted, and most countries use it as an efficient method of identification. This article provides a new method for fingerprint comparison using pattern recognition and image processing techniques. To verify fingerprints, the shortest-distance method is used together with a multilayer perceptron neural network operating on minutiae. This method is highly accurate in the extraction of minutiae, accelerates comparison by eliminating false minutiae, and is more reliable than methods that merely use directional images.
Abstract: Fundamental sensor-motor couplings form the backbone
of most mobile robot control tasks, and often need to be implemented
fast, efficiently and nevertheless reliably. Machine learning
techniques are therefore often used to obtain the desired sensor-motor
competences.
In this paper we present an alternative to established machine learning methods such as artificial neural networks, one that is very fast, easy to implement, and has the distinct advantage of generating transparent, analysable sensor-motor couplings: system identification through nonlinear polynomial mapping.
This work, which is part of the RobotMODIC project at the
universities of Essex and Sheffield, aims to develop a theoretical understanding
of the interaction between the robot and its environment.
One of the purposes of this research is to enable the principled design
of robot control programs.
As a first step towards this aim we model the behaviour of the
robot, as this emerges from its interaction with the environment, with
the NARMAX modelling method (Nonlinear, Auto-Regressive, Moving
Average models with eXogenous inputs). This method produces
explicit polynomial functions that can be subsequently analysed using
established mathematical methods.
In this paper we demonstrate the fidelity of the obtained NARMAX
models in the challenging task of robot route learning; we present a
set of experiments in which a Magellan Pro mobile robot was taught
to follow four different routes, always using the same mechanism to
obtain the required control law.
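The identification idea behind NARMAX can be illustrated in miniature with its simplest special case, a linear ARX model fitted by least squares. The sketch below (pure Python; the system coefficients and input sequence are invented for illustration, and the paper's full nonlinear polynomial models are not reproduced) recovers a known model y(t) = a*y(t-1) + b*u(t-1) from input/output data, yielding an explicit, analysable expression, which is the transparency the abstract emphasizes.

```python
# Identify an ARX model y(t) = a*y(t-1) + b*u(t-1) from data
# by solving the 2x2 least-squares normal equations directly.

def simulate(a, b, u):
    """Generate noise-free output data from the true system."""
    y = [0.0]
    for t in range(1, len(u)):
        y.append(a * y[t - 1] + b * u[t - 1])
    return y

u = [((t * 7919) % 13) / 13.0 for t in range(50)]  # deterministic excitation
y = simulate(0.5, 2.0, u)                          # "measured" robot response

# Regressors phi(t) = [y(t-1), u(t-1)]; accumulate the normal equations.
T = range(1, len(u))
S_yy = sum(y[t - 1] ** 2 for t in T)
S_uu = sum(u[t - 1] ** 2 for t in T)
S_yu = sum(y[t - 1] * u[t - 1] for t in T)
b_y = sum(y[t - 1] * y[t] for t in T)
b_u = sum(u[t - 1] * y[t] for t in T)

# Solve the 2x2 system by Cramer's rule.
det = S_yy * S_uu - S_yu ** 2
a_hat = (b_y * S_uu - b_u * S_yu) / det
b_hat = (S_yy * b_u - S_yu * b_y) / det

# Noise-free data: the true coefficients are recovered exactly.
assert abs(a_hat - 0.5) < 1e-6 and abs(b_hat - 2.0) < 1e-6
```

A NARMAX model extends this regressor vector with nonlinear (polynomial) terms in past inputs, outputs and noise, but the fitted result remains an explicit function that can be inspected with standard mathematical tools.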
Abstract: The segmentation of mouth and lips is a fundamental problem in facial image analysis. In this paper we propose a method for lip segmentation based on an rg-color histogram. Statistical analysis shows that the rg color space is optimal for a purely color-based segmentation. Initially, a rough adaptive threshold selects a histogram region that assures that all pixels in the region are skin pixels. From those pixels we build a Gaussian model that represents the skin pixel distribution and is used to obtain a refined, optimal threshold. We do not incorporate shape or edge information. In experiments we show the performance of our lip pixel segmentation method compared to the ground truth of our dataset and to a conventional watershed algorithm.
Abstract: Text processing systems allow their users to search for a string pattern in a given text. String matching is fundamental to database and text processing applications: every text editor must contain a mechanism to search the current document for arbitrary strings, and spelling checkers scan an input text for words in the dictionary and reject any strings that do not match. We store our information in databases so that we can later retrieve it, and this retrieval can be performed using various string matching algorithms. This paper describes a new string matching algorithm for such applications, designed with the help of the Rabin-Karp matcher to improve the string matching process.
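For reference, a minimal Python sketch of the classical Rabin-Karp matcher that the abstract's new algorithm builds on (the base, modulus and sample strings are illustrative choices, and the paper's modifications are not reproduced):

```python
def rabin_karp(text, pattern, base=256, mod=1_000_000_007):
    """Return the starting indices of pattern in text using rolling hashes."""
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return []
    high = pow(base, m - 1, mod)          # weight of the window's first char
    p_hash = t_hash = 0
    for i in range(m):                    # hash pattern and first window
        p_hash = (p_hash * base + ord(pattern[i])) % mod
        t_hash = (t_hash * base + ord(text[i])) % mod
    matches = []
    for i in range(n - m + 1):
        # verify on a hash hit to rule out (rare) collisions
        if p_hash == t_hash and text[i:i + m] == pattern:
            matches.append(i)
        if i < n - m:                     # roll the window one character
            t_hash = ((t_hash - ord(text[i]) * high) * base
                      + ord(text[i + m])) % mod
    return matches

assert rabin_karp("abracadabra", "abra") == [0, 7]
assert rabin_karp("aaaa", "aa") == [0, 1, 2]
```

The rolling hash updates in O(1) per shift, so the expected running time is O(n + m) rather than the O(nm) of naive comparison.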
Abstract: As a tool for human spatial cognition and thinking, the map has long played an important role; maps are perhaps as fundamental to society as language and the written word. Economic and social development requires an extensive and in-depth understanding of our living environment, from the global scale down to urban housing, which has brought unprecedented opportunities and challenges for traditional cartography. This paper first proposes the concept of the scaleless map and its basic characteristics, through analysis of existing multi-scale representation techniques. Strategies are then presented for automated map compilation. Taking into account the demands of automated map compilation, the paper proposes in detail four technical features that the WJ workstation software must have: generalization operators, symbol primitives, dynamic annotation, and mapping process templates. This paper provides a more systematic new idea and solution for improving the intelligence and automation of scaleless cartography.
Abstract: The modeling of sound radiation is of fundamental importance for understanding the propagation of acoustic waves and, consequently, for developing mechanisms to reduce acoustic noise. The propagation of acoustic waves involves various phenomena such as radiation, absorption, transmission and reflection. Radiation is studied through the linear acoustic wave equation, which is obtained from the equations of Conservation of Momentum, State and Continuity. From these equations, the Helmholtz differential equation describing the acoustic radiation problem is derived. In this paper we obtain the solution of the Helmholtz differential equation for an infinite pulsating cylinder in a free, homogeneous medium. The analytical solution is implemented and the results are compared with the literature. A numerical formulation for this problem is then obtained using the Boundary Element Method (BEM). This method is well suited to solving certain acoustical problems in open fields, compared with differential methods: BEM reduces the dimensionality of the problem, thereby simplifying the input data and reducing the computation time.
Abstract: The release of snow avalanches is modeled in the present study. Snow is assumed to behave as a semi-solid, and the governing equations are studied from a continuum approach. The dynamical equations are solved for two different zones (the starting zone and the track zone) using appropriate initial and boundary conditions. The effects of density (ρ), eddy viscosity (η), slope angle (θ) and slab depth (R) on the flow parameters are observed. Numerical methods are employed to compute the nonlinear differential equations. One of the most interesting and fundamental innovations of the present study is obtaining the initial condition for the computation of velocity by a numerical approach; this velocity information is obtained through concepts of fracture mechanics applicable to snow. The results on the flow parameters are found to be in qualitative agreement with published results.
Abstract: Fibers of pure cellulose can be made by some bacteria, such as Acetobacter xylinum. Bacterial cellulose fibers are very pure, tens of nm across and about 0.5 micron long. The fibers are very stiff, with stiffness of up to 70 GPa, although nobody seems to have measured the strength of individual fibers. Their fundamental strengths should be at least greater than those of the best commercial polymers, but the best bulk strength seems to be about the same as that of steel. They can potentially be produced in industrial quantities at greatly lowered cost and water content, and with triple the yield, by a new process. This article presents a critical review of the available information on bacterial cellulose as a biological nonwoven fabric, with special emphasis on its fermentative production and applications. Characteristics of bacterial cellulose biofabric with respect to its structure and physicochemical properties are discussed. Current and potential applications of bacterial cellulose in the textile, nonwoven cloth, paper, film, synthetic fiber coating, food, pharmaceutical and other industries are also presented.
Abstract: Although the internet is the mass medium that has become most common in recent years, the relationship of advertising with Television and Cinema, which have always drawn the attention of researchers as basic media in which visual use is in the foreground, has also become the subject of various studies. Based on the assumption that the known fundamental effects of advertisements on consumers are closely related to the creative process of advertisements as well as to the nature and characteristics of the medium in which they appear, these basic mass media (Television and Cinema) and the consumer motivations for the advertisements they broadcast have become a focus of study.
Given that the viewers of the mass media in question have shifted
from a passive position to a more active one especially in recent years
and approach contents of advertisements, as they do all contents, in a
more critical and “pitiless" manner, it is possible to say that
individuals make more use of advertisements than in the past and
combine their individual goals with the goals of the advertisements.
This study, which aims to find out what the goals of these new individual uses of advertisements are, how they are shaped by the distinct characteristics of Television and Cinema, where visuality takes precedence as basic mass media, and what kind of place they occupy in the minds of consumers, has identified consumers' motivations as: "Entertainment", "Escapism", "Play", "Monitoring/Discovery", "Opposite Sex" and "Aspirations and Role Models".
This study intends to reveal the differences or similarities among
the needs and hence the gratifications of viewers who consume
advertisements on Television or at the Cinema, which are two basic
media where visuality is prioritized.
Abstract: We present an explicit expression to estimate driving voltage attenuation through RC network representations of an ultrahigh-speed image sensor. The Elmore delay metric for a fundamental RC chain is employed as the first-order approximation. By applying dimensional analysis to SPICE simulation data, we found a simple expression that significantly improves the accuracy of the approximation; the estimation error of the resultant expression for uniform RC networks is less than 2%. Similarly, another simple closed-form model to estimate the 50% delay through fundamental RC networks is derived with sufficient accuracy. The framework of this analysis can be extended to address delay or attenuation issues in other VLSI structures.
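The first-order Elmore metric the abstract starts from is easy to state for an RC ladder: the delay at the far end is the sum, over all nodes, of each node capacitance times the resistance on the shared path from the driver to that node. A minimal Python sketch follows (the segment values R = 100 Ω and C = 1 fF are invented for illustration; the paper's improved dimensional-analysis expression is not reproduced).

```python
def elmore_delay(resistances, capacitances):
    """Elmore delay at the far end of an RC ladder: for the last node,
    the shared-path resistance to node i is the total series resistance
    from the driver up to node i, so the delay is sum_i (R_1+...+R_i)*C_i."""
    delay, r_total = 0.0, 0.0
    for r, c in zip(resistances, capacitances):
        r_total += r            # resistance accumulated up to this node
        delay += r_total * c
    return delay

# Uniform 10-segment ladder, R = 100 ohm and C = 1 fF per segment.
# Closed form for the uniform case: R*C*N*(N+1)/2 = 5.5 ps.
N, R, C = 10, 100.0, 1e-15
d = elmore_delay([R] * N, [C] * N)
assert abs(d - R * C * N * (N + 1) / 2) < 1e-24
```

For the uniform chain this reproduces the well-known quadratic growth of delay with chain length, which is the first-order behaviour the paper's refined expressions then correct against SPICE data.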
Abstract: In this paper, sequences of execution of algorithms are presented interactively using multimedia tools. This helps learners grasp fundamental algorithms, such as searching and sorting methods, in a simple manner. Visualization attracts more attention than theoretical study and is an easy way to learn. We propose methods for showing the runtime sequence of each algorithm interactively, aiming to overcome the drawbacks of existing character-based systems. The system illustrates each and every step clearly using text and animation. Comparisons of time complexity have been carried out, and the results show that our approach provides a better understanding of the algorithms.