Abstract: Complexity, as a theoretical background, has made it easier to understand and explain the features and dynamic behavior of various complex systems. As this common theoretical background has confirmed, borrowing design terminology from the natural sciences has helped to control and understand urban complexity. Phenomena like self-organization, evolution and adaptation are appropriate for describing the formerly inaccessible characteristics of the complex environment in unpredictable bottom-up systems. Increased computing capacity has been a key element in capturing the chaotic nature of these systems.
A paradigm shift in urban planning and architectural design has forced us to give up the illusion of total control over the urban environment, and consequently to seek novel methods for steering its development. New methods using dynamic modeling have offered a real option for a more thorough understanding of complexity and urban processes. At best, new approaches may renew the design processes so that we get a better grip on the complex world via more flexible processes, support urban environmental diversity, and respond to our needs beyond basic welfare by liberating ourselves from standardized minimalism.
A complex system and its features are as such beyond human ethics. Self-organization or evolution is neither good nor bad; their mechanisms are by nature devoid of reason. They are common in urban dynamics and in natural processes alike. They are features of a complex system, and they cannot be prevented, yet their dynamics can be studied and supported.
The paradigm of complexity and the new design approaches have been criticized for a lack of humanity and morality, but the ethical implications of scientific or computational design processes have not been much discussed. It is important to distinguish the (unexciting) ethics of the theory and tools from the ethics of computer-aided processes based on ethical decisions. Urban planning and architecture cannot be based on the survival of the fittest; however, the natural dynamics of the system cannot be impeded on the grounds of being "non-human".
In this paper the ethical challenges of using dynamic models are contemplated in light of a few examples of new architecture, dynamic urban models and the literature. It is suggested that the ethical challenges in computational design processes could be reframed under the concepts of responsibility and transparency.
Abstract: The speech signal conveys information about the
identity of the speaker. The area of speaker identification is
concerned with extracting the identity of the person speaking the
utterance. As speech interaction with computers becomes more pervasive in activities such as telephone calls, financial transactions and information retrieval from speech databases, the ability to automatically identify a speaker based solely on vocal characteristics grows in utility. This paper focuses on text-dependent speaker identification, which deals with detecting a particular speaker from a known population. The system prompts the user to provide a speech utterance. It identifies the user by comparing the codebook of the utterance with those stored in the database and produces a list of the speakers most likely to have given that utterance. The speech signal is recorded for N speakers, after which the features are extracted. Feature extraction is done by means of LPC coefficients, the AMDF, and the DFT. The neural network is trained by applying these features as input parameters, and the features are stored in templates for further comparison. The features of the speaker to be identified are extracted and compared with the stored templates using the back-propagation algorithm: the trained network corresponds to the output, and the input is the extracted features of the speaker to be identified. The network adjusts its weights and the best match is found to identify the speaker. The number of epochs required to reach the target determines the network performance.
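As a rough illustration of the AMDF step named above, a minimal Python sketch follows; the abstract gives no formulas, so the frame length, sampling rate and lag range here are assumed values, not the paper's settings.

```python
# Illustrative sketch of the AMDF feature; parameters are assumptions.
import numpy as np

def amdf(frame, max_lag=256):
    """Average Magnitude Difference Function of one speech frame."""
    n = len(frame)
    lags = range(1, min(max_lag, n))
    # For each lag k, average |x[t] - x[t-k]| over the overlapping samples.
    return np.array([np.mean(np.abs(frame[k:] - frame[:-k])) for k in lags])

# Example: a 30 ms frame at 8 kHz; the AMDF dips near the pitch period.
fs = 8000
t = np.arange(int(0.03 * fs)) / fs
frame = np.sin(2 * np.pi * 120 * t)      # synthetic 120 Hz "voiced" frame
d = amdf(frame)
pitch_lag = np.argmin(d[20:]) + 21       # skip very small lags
print("estimated pitch period:", pitch_lag / fs, "s")
```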
Abstract: To improve the classification rate of face recognition, a feature combination and a novel non-linear kernel are proposed. The feature vector concatenates local binary patterns computed at three different radii with Gabor wavelet features; the Gabor features are the mean, standard deviation and skew of the response at each scale and orientation. The aim of the new kernel is to combine the power of kernel methods with an optimal balance between the features. To verify the effectiveness of the proposed method, numerous methods are tested on four datasets, which consist of various emotions, orientations, configurations, expressions and lighting conditions. Empirical results show the superiority of the proposed technique when compared to other methods.
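A minimal sketch of such a feature construction is given below; the radii, Gabor frequencies and orientation count are assumptions, as the abstract does not list the exact settings.

```python
# Hedged sketch: multi-radius LBP histograms concatenated with per-filter
# Gabor statistics (mean, std, skew). All parameter choices are assumed.
import numpy as np
from scipy.stats import skew
from scipy.ndimage import convolve
from skimage.feature import local_binary_pattern
from skimage.filters import gabor_kernel

def face_features(img, radii=(1, 2, 3), freqs=(0.1, 0.2, 0.3), n_orient=4):
    parts = []
    for r in radii:                      # LBP histogram at each radius
        lbp = local_binary_pattern(img, 8 * r, r, method="uniform")
        hist, _ = np.histogram(lbp, bins=int(lbp.max()) + 1, density=True)
        parts.append(hist)
    for f in freqs:                      # Gabor mean / std / skew per filter
        for k in range(n_orient):
            kern = np.real(gabor_kernel(f, theta=k * np.pi / n_orient))
            resp = convolve(img.astype(float), kern)
            parts.append([resp.mean(), resp.std(), skew(resp.ravel())])
    return np.concatenate([np.asarray(p).ravel() for p in parts])
```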
Abstract: The AL-MAJIRI school system is a variant of the private Arabic and Islamic schools which cater for the religious and moral development of Muslims. In the past, the system produced clerics, scholars, judges, religious reformers, eminent teachers and great men worthy of emulation, particularly in northern Nigeria. Gradually, the system lost its glory but continued to discharge its educational responsibilities to a certain extent. This paper takes a look at the activities of the AL-MAJIRI schools. The introduction provides background information about Nigeria, where the schools operate. This is followed by an overview of the Nigerian educational system; the nature and features of the AL-MAJIRI school system; and its weaknesses and the current challenges facing the schools. The paper concludes with an emphasis on the urgent need for a comprehensive reform of the curriculum content of the schools, and the step-by-step procedure required for the reform is discussed.
Abstract: This paper describes a CMOS four-quadrant multiplier intended for use in a front-end receiver, utilizing the square-law characteristic of the MOS transistor in the saturation region. The circuit is based on 0.35 µm CMOS technology and was simulated using HSPICE software. The mixer has a third-order intercept point, and the power consumption is 271 µW from a single 1.2 V power supply. One of the features of the proposed design is the use of a two-MOS-transistor limiting technique to reduce the supply voltage, which in turn reduces the power consumption. This technique provides a GHz bandwidth response and low power consumption.
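The square-law principle the abstract invokes can be made explicit. The identity below is the standard textbook relation behind square-law four-quadrant multipliers, not the paper's exact circuit equations:

```latex
% Saturation-region drain current (standard square-law model):
%   I_D = K (V_{GS} - V_T)^2
% Driving square-law devices with the sum and difference of two signals
% x and y, and differencing the resulting currents, cancels the squared
% terms and leaves a pure product, valid in all four quadrants of (x, y):
\[
  K\left[(x+y)^2 - (x-y)^2\right] = 4Kxy .
\]
```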
Abstract: The need for accurate and timely field data is shared among organizations engaged in fundamentally different activities, from public services to commercial operations. Basically, there are three major components in the process of qualitative research: data collection, interpretation and organization of the data, and the analytic process. Representative technological advancements have been made in mobile devices (mobile phones, PDAs, tablets, laptops, etc.), resources that can potentially be applied to the data collection activity in field research in order to improve this process.
This paper presents and discusses the main features of a mobile-phone-based solution for field data collection, composed of three modules: a survey editor, a server web application and a client mobile application. The data gathering process begins with the survey creation module, which enables the production of tailored questionnaires. The field workforce receives the questionnaire(s) on their mobile phones, collects the interview responses and sends them back to a server for immediate analysis.
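To make the server-side collection step concrete, here is a minimal sketch of such an endpoint; the route name, payload fields and in-memory storage are illustrative assumptions, not the paper's actual API.

```python
# Minimal sketch of the server web application's collection endpoint.
from flask import Flask, request, jsonify

app = Flask(__name__)
responses = []                      # stand-in for the server database

@app.route("/responses", methods=["POST"])
def collect():
    r = request.get_json()          # e.g. {"survey_id": 7, "answers": {...}}
    responses.append(r)             # stored for immediate analysis
    return jsonify(status="ok", received=len(responses))

if __name__ == "__main__":
    app.run()   # field phones POST completed questionnaires here
```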
Abstract: The increasing volume of information on the internet creates a growing need to develop new (semi-)automatic methods for retrieving documents and ranking them according to their relevance to the user query. In this paper, after a brief review of ranking models, a new ontology-based approach for ranking HTML documents is proposed and evaluated in various circumstances. Our approach is a combination of conceptual, statistical and linguistic methods. This combination preserves the precision of ranking without losing speed. Our approach exploits natural language processing techniques to extract phrases from the documents and the query and to stem words. Then an ontology-based conceptual method is used to annotate documents and expand the query. For query expansion, the spread activation algorithm is improved so that the expansion can be done flexibly and along various dimensions. The annotated documents and the expanded query are then processed to compute the relevance degree using statistical methods. The outstanding features of our approach are (1) combining conceptual, statistical and linguistic features of documents, (2) expanding the query with its related concepts before comparing it to documents, (3) extracting and using both words and phrases to compute the relevance degree, (4) improving the spread activation algorithm to perform the expansion based on a weighted combination of different conceptual relationships, and (5) allowing variable document vector dimensions. A ranking system called ORank was developed to implement and test the proposed model. The test results are included at the end of the paper.
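A generic sketch of weighted spreading activation over a concept graph is shown below; the toy ontology, relation weights and decay factor are assumptions, and the paper's improved variant will differ in detail.

```python
# Sketch of weighted spreading activation for query expansion.
def spread_activation(graph, seeds, rel_weights, decay=0.5, steps=2):
    """graph: {concept: [(neighbor, relation), ...]}; returns activation map."""
    act = {c: 1.0 for c in seeds}
    frontier = dict(act)
    for _ in range(steps):
        nxt = {}
        for c, a in frontier.items():
            for nb, rel in graph.get(c, []):
                # Each relation type contributes with its own weight.
                gain = a * decay * rel_weights.get(rel, 0.0)
                if gain > nxt.get(nb, 0.0):
                    nxt[nb] = gain
        for nb, a in nxt.items():
            act[nb] = max(act.get(nb, 0.0), a)
        frontier = nxt
    return act  # highly activated concepts are added to the query

g = {"car": [("vehicle", "is-a"), ("wheel", "part-of")]}
print(spread_activation(g, ["car"], {"is-a": 1.0, "part-of": 0.6}))
```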
Abstract: The basic objective of this study is to create a regression analysis method that can estimate the length of the plastic hinge, an important design parameter, by making use of the outcomes (lateral load-lateral displacement hysteretic curves) of experimental studies conducted on reinforced square concrete columns. For this aim, the results of 170 different square reinforced concrete column tests have been collected from the existing literature. The parameters thought to affect the plastic hinge length, such as cross-section properties, features of the materials used, axial loading level, confinement of the column, longitudinal reinforcement bars in the columns, etc., have been obtained from these 170 tests. In the study, when determining the length of the plastic hinge from the experimental test results, several regression analyses have been separately tested and compared with each other. In addition, the outcome of the mentioned methods for determining the plastic hinge length of reinforced concrete columns has been compared to other methods available in the literature.
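For orientation, a minimal least-squares regression of this kind might look as follows; the chosen predictors (section depth, axial load ratio, bar diameter), the linear model form and the numbers are all illustrative stand-ins, not the paper's fitted model or data.

```python
# Illustrative least-squares regression for plastic hinge length L_p.
import numpy as np

# Each row: [h (mm), axial load ratio, d_b (mm)] from one column test (made-up values).
X = np.array([[400, 0.10, 16], [400, 0.30, 20], [500, 0.20, 16],
              [500, 0.40, 25], [600, 0.25, 20]], dtype=float)
Lp = np.array([210.0, 260.0, 250.0, 330.0, 300.0])   # hinge lengths, mm (made-up)

A = np.column_stack([np.ones(len(X)), X])            # add intercept term
coef, *_ = np.linalg.lstsq(A, Lp, rcond=None)
pred = A @ coef
print("coefficients:", coef)
print("R^2:", 1 - np.sum((Lp - pred) ** 2) / np.sum((Lp - Lp.mean()) ** 2))
```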
Abstract: In most cases, natural disasters lead to the necessity of evacuating people. The quality of evacuation management is dramatically improved by the use of information provided by decision support systems, which become indispensable in large-scale evacuation operations. This paper presents a best-practice case study. In November 2007, officers from the Emergency Situations Inspectorate "Crisana" of Bihor County, Romania, participated in a cross-border evacuation exercise in which 700 people were evacuated from the Netherlands to Belgium. One of the main objectives of the exercise was to test four different decision support systems. Afterwards, based on that experience, a software system called TEVAC (Trans-Border Evacuation) was developed "in house" by the experts of this institution. This original software system was successfully tested in September 2008 during the international exercise EU-HUROMEX 2008, whose scenario involved the real evacuation of 200 persons from Hungary to Romania. Based on the lessons learned and the results, since April 2009 the TEVAC software has been used by Emergency Situations Inspectorates all over Romania.
Abstract: A major requirement for Grid application developers is ensuring the performance and scalability of their applications. Predicting the performance of an application demands understanding its specific features. This paper discusses performance modeling and prediction of multi-agent based simulation (MABS) applications on the Grid. An experiment conducted using a synthetic MABS workload identifies the key features to be included in the performance model. The results obtained from the experiment show that the prediction model developed for the synthetic workload can be used as a guideline to estimate the performance characteristics of real-world simulation applications.
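To illustrate the prediction idea in the simplest terms, the sketch below fits a runtime model to synthetic workload runs and extrapolates; the model form T = a + b·agents/nodes + c·agents and the run data are assumptions, not the paper's model.

```python
# Sketch: fit a simple runtime model on synthetic MABS runs, then extrapolate.
import numpy as np

runs = np.array([            # (agents, grid nodes, runtime s) - synthetic values
    [1000, 2, 52.0], [1000, 4, 28.0], [2000, 2, 101.0], [2000, 4, 55.0]])
agents, nodes, T = runs.T
A = np.column_stack([np.ones_like(T), agents / nodes, agents])
(a, b, c), *_ = np.linalg.lstsq(A, T, rcond=None)
predict = lambda ag, nd: a + b * ag / nd + c * ag
print("predicted runtime for 4000 agents on 8 nodes:", predict(4000, 8), "s")
```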
Abstract: As days go by, we hear more and more about HIV,
Ebola, Bird Flu and other dreadful viruses which were unknown a
few decades ago. In both detecting and fighting viral diseases, ordinary methods have come up against some basic and important difficulties. Vaccination is, in a sense, the introduction of the virus to the immune system before a real infection occurs. It is very successful against some viruses (e.g. poliomyelitis), while totally ineffective against others (e.g. HIV or hepatitis C). On the other hand, antiviral drugs are mostly tools to control, not to cure, a viral disease. This is a good motivation to try alternative treatments. In this study, some key features of possible physics-based alternative treatments for viral diseases are presented. The electrification of body parts or fluids (especially blood) with micro electric signals of adjusted current or frequency is also studied. The main approach of this study is to find a suitable energy field, with appropriate parameters, that is able to kill or deactivate viruses. This would be a lengthy, multi-disciplinary research effort requiring the contribution of virology, physics, and signal processing experts. It should be mentioned that all claims made by alternative-cure researchers must be tested carefully and are not advisable at the present time.
Abstract: The purpose of this paper is to present teacher candidates' beliefs about technology integration in their field of study, which in this case is classroom teaching. The study was conducted among first-year students in a college of education in Turkey and is based on both quantitative and qualitative data. For the quantitative data a Likert scale was used, and for the qualitative data pattern matching was employed. The primary findings showed that students defined educational technologies as technologies that improve learning through their visual, easily accessible, and productive features. They also believe these technologies could positively affect their future students' learning.
Abstract: In this paper, we present a new and effective image indexing technique that extracts features directly in the DCT domain. Our proposed approach is object-based image indexing. For each block of size 8x8 in the DCT domain, a feature vector is extracted. Then, the feature vectors of all blocks of the image are clustered into groups using a k-means algorithm. Each cluster represents a distinct object of the image. We then select the clusters that have the most members after clustering. The centroids of the selected clusters are taken as the image feature vectors and indexed into the database. We also propose an approach for using the proposed image indexing method in automatic image classification. Experimental results for automatic image classification on a database of 800 images from 8 semantic groups are reported.
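A compact sketch of this pipeline follows; the choice of 9 low-frequency DCT coefficients per block and k = 5 clusters are assumptions made for illustration.

```python
# Sketch of the indexing pipeline: 8x8 block DCT features clustered by k-means.
import numpy as np
from scipy.fft import dctn
from sklearn.cluster import KMeans

def index_image(img, k=5):
    h, w = (d - d % 8 for d in img.shape)        # crop to whole 8x8 blocks
    feats = []
    for i in range(0, h, 8):
        for j in range(0, w, 8):
            block = dctn(img[i:i+8, j:j+8].astype(float), norm="ortho")
            feats.append(block[:3, :3].ravel())  # 9 low-frequency coefficients
    km = KMeans(n_clusters=k, n_init=10).fit(np.array(feats))
    sizes = np.bincount(km.labels_)
    order = np.argsort(sizes)[::-1]              # clusters with most members first
    return km.cluster_centers_[order]            # centroids = image feature vectors
```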
Abstract: This paper presents a new circuit arrangement for a current-mode Wheatstone bridge that is suitable for low-voltage integrated circuit implementation. Compared to other proposed circuits, this circuit features a drastic reduction in the number of elements, a low supply voltage (1 V) and low power consumption.
Abstract: Diagnosis can be achieved by building a model of a certain organ under surveillance and comparing it with real-time physiological measurements taken from the patient. This paper presents the benefits of using data mining techniques in computer-aided diagnosis (CAD), focusing on cancer detection, in order to help doctors make optimal decisions quickly and accurately. In the field of noninvasive diagnosis techniques, endoscopic ultrasound elastography (EUSE) is a recent elasticity imaging technique that allows characterizing the difference between malignant and benign tumors. Digitizing the EUSE sample movies and summarizing their main features in vector form is carried out using exploratory data analysis (EDA). Neural networks are then trained on the corresponding EUSE sample movie vector inputs in such a way that these intelligent systems are able to offer a very precise and objective diagnosis, discriminating between benign and malignant tumors. A concrete application of these data mining techniques illustrates the suitability and reliability of this methodology in CAD.
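A minimal sketch of the final classification stage is shown below; the feature dimensionality, network size and the synthetic stand-in data are all assumptions, since the abstract does not specify them.

```python
# Hedged sketch: a small neural network trained on EDA-derived feature vectors.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

X = np.random.rand(120, 20)            # stand-in for EUSE movie feature vectors
y = np.random.randint(0, 2, 120)       # 0 = benign, 1 = malignant (synthetic labels)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(Xtr, ytr)
print("held-out accuracy:", net.score(Xte, yte))
```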
Abstract: The quest to provide a more secure identification system has led to a rise in the development of biometric systems. The dorsal hand vein pattern is an emerging biometric which has attracted the attention of many researchers of late. Different approaches have been used to extract vein patterns and match them. In this work, Principal Component Analysis (PCA), a method that has been successfully applied to human faces and hand geometry, is applied to the dorsal hand vein pattern. PCA is used to obtain eigenveins, a low-dimensional representation of vein pattern features. Low-cost CCD cameras were used to obtain the vein images. The vein pattern was extracted by applying morphological operations, and noise reduction filters were applied to enhance the vein patterns. The system has been successfully tested on a database of 200 images using a threshold value of 0.9. The results obtained are encouraging.
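The eigenvein idea can be sketched with a few lines of linear algebra; the component count is an assumption, and applying the 0.9 threshold to cosine similarity in the subspace is likewise our illustrative reading, not necessarily the paper's matching rule.

```python
# Sketch of PCA "eigenveins": subspace projection plus thresholded matching.
import numpy as np

def eigenveins(train_imgs, n_components=20):
    X = np.array([im.ravel() for im in train_imgs], dtype=float)
    mean = X.mean(axis=0)
    # SVD of the centered data: rows of Vt are the eigenveins.
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components]

def project(img, mean, basis):
    return basis @ (img.ravel() - mean)

def match(probe, gallery, mean, basis, threshold=0.9):
    p = project(probe, mean, basis)
    # Cosine similarity against each enrolled template; accept above threshold.
    sims = [p @ g / (np.linalg.norm(p) * np.linalg.norm(g)) for g in gallery]
    best = int(np.argmax(sims))
    return (best, sims[best]) if sims[best] >= threshold else (None, sims[best])
```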
Abstract: In this paper, we introduce a new method for elliptical object identification. The proposed method adopts a hybrid scheme which consists of the eigenvalues of covariance matrices, the circular Hough transform (CHT) and Bresenham's raster scan algorithm. In this approach we use the fact that the large and small eigenvalues of a covariance matrix are associated with the major and minor axial lengths of the ellipse. The centre location of the ellipse is identified using the circular Hough transform. A sparse matrix technique is used to perform the CHT: since sparse matrices squeeze out zero elements and contain only a small number of nonzero elements, they provide an advantage in matrix storage space and computational time. A neighborhood suppression scheme is used to find the valid Hough peaks. The accurate positions of circumference pixels are identified using the raster scan algorithm, which exploits the geometrical symmetry property. This method does not require the evaluation of tangents or of the curvature of edge contours, which are generally very sensitive to noisy working conditions. The proposed method has the advantages of small storage, high speed and accuracy in identifying the feature. The new method has been tested on both synthetic and real images. Several experiments have been conducted on various images with considerable background noise to reveal its efficacy and robustness. Experimental results on the accuracy of the proposed method, with comparisons to the Hough transform and its variants and to other tangent-based methods, are reported.
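The eigenvalue step rests on a simple fact that can be verified numerically: for points sampled uniformly in the parameter of an ellipse, the coordinate covariance has eigenvalues equal to half the squared semi-axes. The sketch below checks this on a synthetic ellipse (the sampling and axis values are illustrative).

```python
# Eigenvalues of the coordinate covariance recover the ellipse semi-axes.
import numpy as np

t = np.linspace(0, 2 * np.pi, 400, endpoint=False)
a_true, b_true, theta = 50.0, 20.0, np.pi / 6          # synthetic ellipse
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
pts = (R @ np.vstack([a_true * np.cos(t), b_true * np.sin(t)])).T

evals, evecs = np.linalg.eigh(np.cov(pts.T))
# For uniform parameter sampling, var = (semi-axis)^2 / 2 along each eigenvector.
b_est, a_est = np.sqrt(2 * evals)                      # eigh sorts ascending
print("estimated semi-axes:", a_est, b_est)            # ~50, ~20
```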
Abstract: One of the main processes of supply chain management is the supplier selection process, whose accurate implementation can dramatically increase company competitiveness. In the presented article, a model is developed based on the features of second-tier suppliers, and four scenarios are predicted in order to help the decision maker (DM) make up his or her mind. In addition, two tiers of suppliers have been considered as a chain of suppliers. The proposed approach is then solved by a method combining concepts of fuzzy set theory (FST) and linear programming (LP), fed with real data extracted from an engineering design and parts supply company. In the end, the results reveal the high importance of considering second-tier suppliers' features as criteria for selecting the best supplier.
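One generic way such an FST + LP combination can work is sketched below: triangular fuzzy criterion weights are defuzzified and a small LP allocates orders. The criteria, scores, weights and capacity bound are illustrative assumptions, not the paper's data or model.

```python
# Toy sketch: defuzzified fuzzy weights feeding a small supplier-allocation LP.
import numpy as np
from scipy.optimize import linprog

tri = {"cost": (0.2, 0.3, 0.4), "quality": (0.3, 0.4, 0.5), "tier2": (0.2, 0.3, 0.4)}
w = np.array([sum(v) / 3 for v in tri.values()])        # centroid defuzzification
scores = np.array([[0.7, 0.8, 0.6],                     # supplier 1 per criterion
                   [0.9, 0.6, 0.8],                     # supplier 2
                   [0.6, 0.9, 0.7]])                    # supplier 3
utility = scores @ w

# Maximize total utility of the order split x (sum x = 1, per-supplier cap 0.6).
res = linprog(-utility, A_eq=[[1, 1, 1]], b_eq=[1], bounds=[(0, 0.6)] * 3)
print("order allocation:", res.x)
```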
Abstract: Environmental awareness and the depletion of petroleum resources are among the vital factors motivating a number of researchers to explore the potential of reusing natural fiber as an alternative composite material in industries such as packaging, automotive and building construction. Natural fibers are available in abundance and at low cost, yield lightweight polymer composites and, most importantly, are biodegradable, which is why such composites are often called "eco-friendly" materials. However, their applications are still limited by several factors such as moisture absorption, poor wettability and large scatter in mechanical properties. Among the main challenges of natural-fiber-reinforced composites is the fibers' inclination to entangle and form agglomerates during processing due to fiber-fiber interaction. This tends to prevent better dispersion of the fibers in the matrix, resulting in poor interfacial adhesion between the hydrophobic matrix and the hydrophilic reinforcing natural fiber. To overcome this challenge, fiber treatment is one common alternative that can be used to modify the fiber surface topology by chemical, physical or mechanical techniques. This paper focuses on the effect of mercerization treatment on the enhancement of the mechanical properties of natural-fiber-reinforced composites, or so-called bio-composites. It specifically discusses mercerization parameters and the resulting enhancement of the composites' mechanical properties.
Abstract: When the profile information of an existing road is missing or not up to date and the parameters of the vertical alignment are needed for engineering analysis, the engineer has to re-create the geometric design features of the road alignment using collected profile data. The profile data may be collected using traditional surveying methods, global positioning systems, or digital imagery. This paper develops a method that estimates the parameters of the geometric features that best characterize the existing vertical alignment in terms of tangents and the expressions of the curves, which may be symmetrical, asymmetrical, reverse, or complex vertical curves. The method is implemented using an Excel-based optimization that minimizes the differences between the observed profile and the profiles estimated from the equations of the vertical curve. The method uses a 'wireframe' representation of the profile that makes it applicable to all types of vertical curves. A secondary contribution of this paper is to introduce the properties of the equal-arc asymmetrical curve that has been recently developed in the highway geometric design field.
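For the simplest case, a symmetrical parabolic vertical curve, the profile-fitting idea reduces to a linear least-squares problem. The sketch below recovers the entry and exit grades from observed station/elevation pairs; the curve length and the synthetic profile data are illustrative assumptions (the paper uses Excel-based optimization over more general curve types).

```python
# Least-squares fit of a symmetrical parabolic vertical curve:
#   y = y0 + g1*x + (g2 - g1)*x^2 / (2L)
import numpy as np

L = 200.0                                   # assumed curve length (m)
x = np.linspace(0, L, 21)                   # stations along the curve
y_obs = 100 + 0.03 * x + (-0.02 - 0.03) * x**2 / (2 * L)   # synthetic profile

A = np.column_stack([np.ones_like(x), x, x**2 / (2 * L)])
(y0, g1, dg), *_ = np.linalg.lstsq(A, y_obs, rcond=None)
print("entry grade g1:", g1, " exit grade g2:", g1 + dg)   # ~0.03, ~-0.02
```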