Abstract: As a result of the high cost of housing, a growing share of the population is forced to live in substandard housing and unhealthy conditions, giving rise to poor residential neighborhoods. This paper examines the causes and characteristics of poor residential neighborhoods, tracing their emergence to poverty, the growth of the informal sector, and housing shortage. The paper asserts that poor residential neighborhoods have adverse effects on their inhabitants.
Secondary data were obtained from books, journals and seminar papers, while primary data on building and environmental quality came from a structured questionnaire administered to a sample of 500 household heads drawn from a sampling frame of 5,000 housing units.
The study reveals that the majority of respondents are poor and employed in the informal sector. The paper suggests urban renewal and slum upgrading programs as methods of dealing with the situation, together with an improvement in the socio-economic circumstances of the inhabitants.
Abstract: Methods for organizing web data into groups in order to analyze web-based hypertext data and facilitate data availability are increasingly important given the number of documents available online. Consequently, the task of clustering web-based document structures has many applications, e.g., improving information retrieval on the web, better understanding of user navigation behavior, improving the servicing of web users' requests, and increasing web information accessibility. In this paper we investigate a new approach for clustering web-based hypertexts on the basis of their graph structures. The hypertexts are represented as so-called generalized trees, which are more general than the usual directed rooted trees, e.g., DOM trees. As an important preprocessing step we measure the structural similarity between the generalized trees on the basis of a similarity measure d. Then we apply agglomerative clustering to the obtained similarity matrix in order to create clusters of hypertext graph patterns representing navigation structures. In the present paper we run our approach on a data set of hypertext structures and obtain good results in Web Structure Mining. Furthermore, we outline the application of our approach in Web Usage Mining as future work.
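As an illustration of the clustering step, the following sketch performs average-linkage agglomerative clustering until two clusters remain. The 4×4 matrix of structural distances is hypothetical, standing in for values derived from the measure d; it is not the authors' data.

```python
# Average-linkage agglomerative clustering on a precomputed distance matrix.
# D is a hypothetical symmetric matrix of structural distances d(G_i, G_j)
# between four generalized trees; the real values come from the measure d.
D = [
    [0.00, 0.10, 0.90, 0.80],
    [0.10, 0.00, 0.85, 0.90],
    [0.90, 0.85, 0.00, 0.15],
    [0.80, 0.90, 0.15, 0.00],
]

def average_linkage(c1, c2):
    """Mean pairwise distance between two clusters (lists of indices)."""
    return sum(D[i][j] for i in c1 for j in c2) / (len(c1) * len(c2))

def agglomerate(n_items, k):
    """Merge the closest pair of clusters until only k clusters remain."""
    clusters = [[i] for i in range(n_items)]
    while len(clusters) > k:
        # find the pair of clusters with the smallest average distance
        a, b = min(((a, b) for a in range(len(clusters))
                    for b in range(a + 1, len(clusters))),
                   key=lambda ab: average_linkage(clusters[ab[0]], clusters[ab[1]]))
        clusters[a] = clusters[a] + clusters[b]
        del clusters[b]
    return clusters

print(agglomerate(4, 2))   # → [[0, 1], [2, 3]]
```

Documents 0 and 1 (mutually close) and documents 2 and 3 end up in separate clusters, which is the grouping the distance matrix encodes.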
Abstract: Ethnicity identification of face images is of interest in
many areas of application, but existing methods are few and limited.
This paper presents a fusion scheme that uses block-based uniform
local binary patterns and Haar wavelet transform to combine local
and global features. In particular, the LL subband coefficients of the
whole face are fused with the histograms of uniform local binary
patterns from block partitions of the face. We applied principal component analysis to the fused features and reduced the dimensionality of the feature space from 536 down to around 15
without sacrificing too much accuracy. We have conducted a number
of preliminary experiments using a collection of 746 subject face
images. The test results show good accuracy and demonstrate the
potential of fusing global and local features. The fusion approach is
robust, making it easy to further improve the identification at both
feature and score levels.
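A minimal sketch of the fusion-then-PCA pipeline described above, assuming a 2×2-average Haar LL subband as the global feature and plain intensity histograms as a stand-in for the uniform-LBP block histograms; the image size, block layout, batch size and random "faces" are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def fuse(face):
    """Fuse a global Haar LL subband with per-block histograms (LBP stand-in)."""
    # Haar LL subband: 2x2 block sums scaled by 1/2 (global feature)
    ll = (face[0::2, 0::2] + face[1::2, 0::2]
          + face[0::2, 1::2] + face[1::2, 1::2]) / 2.0
    feats = [ll.ravel()]
    # 16-bin intensity histogram per 8x8 block, as a simple stand-in for
    # the uniform-LBP block histograms used in the paper
    for i in (0, 8):
        for j in (0, 8):
            hist, _ = np.histogram(face[i:i + 8, j:j + 8], bins=16, range=(0, 1))
            feats.append(hist / hist.sum())
    return np.concatenate(feats)          # 64 + 4*16 = 128 dims here

# a batch of random 16x16 "faces" (illustrative data only)
X = np.stack([fuse(rng.random((16, 16))) for _ in range(40)])

# PCA via SVD: project the fused features onto the top 15 components
Xc = X - X.mean(axis=0)
_, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:15].T
print(Z.shape)                            # → (40, 15)
```

The same projection step is what reduces the paper's 536-dimensional fused vector to around 15 dimensions.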
Abstract: A SnO2/CdS/CdTe heterojunction was fabricated by the thermal evaporation technique. The fabricated cells were annealed at
573K for periods of 60, 120 and 180 minutes. The structural
properties of the solar cells have been studied by using X-ray
diffraction. Capacitance-voltage measurements were performed on the as-prepared and annealed cells at a frequency of 10² Hz. The capacitance-voltage measurements indicated that these cells have abrupt junctions. The capacitance decreases with increasing annealing time.
The zero bias depletion region width and the carrier concentration
increased with increasing annealing time. The carrier transport
mechanism for the CdS/CdTe heterojunction in the dark is tunneling recombination. The ideality factor is 1.56 and the reverse-bias saturation current is 9.6×10⁻¹⁰ A. The energy band lineup for the n-CdS/p-CdTe heterojunction was investigated using current-voltage and capacitance-voltage characteristics.
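The ideality factor quoted above can be extracted from the slope of the forward ln I–V characteristic. The sketch below synthesizes data from the abstract's reported n = 1.56 and I₀ = 9.6×10⁻¹⁰ A and recovers n; the two bias points are hypothetical.

```python
import math

q = 1.602e-19      # electron charge (C)
k = 1.381e-23      # Boltzmann constant (J/K)
T = 300.0          # assumed temperature (K)

n_true, I0 = 1.56, 9.6e-10   # values reported in the abstract

def diode_current(V):
    """Ideal diode law: I = I0 * (exp(qV / (n*k*T)) - 1)."""
    return I0 * (math.exp(q * V / (n_true * k * T)) - 1.0)

# two forward-bias points (hypothetical) on the exponential region
V1, V2 = 0.30, 0.40
# slope method: n = q*(V2 - V1) / (k*T*ln(I2/I1)), valid where I >> I0
n_est = q * (V2 - V1) / (k * T * math.log(diode_current(V2) / diode_current(V1)))
print(round(n_est, 2))   # → 1.56
```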
Abstract: The implementation of new software and hardware technologies in tritium processing nuclear plants, especially those of an experimental character or based on new technological developments, is complicated by the issues raised in integrating high-performance instrumentation and equipment into a unitary monitoring system for the nuclear technological process of tritium removal. Preserving the system's flexibility is a requirement of experimental nuclear plants, for which changes of configuration, process and parameters are routine. The large amount of data that must be processed, stored and accessed for real-time simulation and optimization demands a virtual technological platform in which the data acquisition, control and analysis systems of the technological process are integrated with a purpose-built technological monitoring system. Thus, the integrated computing and monitoring systems needed for supervising the technological process will be implemented, followed by an optimization system employing new, high-performance methods suited to the technological processes of tritium removal plants. The software applications are developed with program packages dedicated to industrial processes and include acquisition and monitoring sub-modules (referred to as "virtual"), as well as a storage sub-module for the process data later required by the optimization and simulation software for the tritium removal process. The system plays an important role in environmental protection and sustainable development through new technologies, namely the reduction of, and fight against, industrial accidents at tritium processing nuclear plants. Research into the monitoring and optimisation of nuclear processes is also a major driving force for economic and social development.
Abstract: Today, incorrect land use and land-use changes, excessive grazing, improper management of agricultural farms, plowing on steep slopes, road and building construction, mine excavation, etc., have increased soil erosion and sediment yield. Erosion and sediment can be estimated with statistical and empirical methods, which require a land-unit map and maps of the effective factors. However, these empirical methods are usually time consuming and do not give accurate estimates of erosion. In this study, we applied GIS techniques to estimate the erosion and sediment of the Menderjan watershed, upstream of the Zayandehrud River in central Iran. Erosion facies in each land unit were defined on the basis of the land use, geology and land-unit maps using GIS. The UTM coordinates of each erosion type showing larger erosion amounts, such as rills and gullies, were entered into the GIS from GPS data. The frequency of erosion indicators in each land unit and land use, and the sediment yield associated with these indicators, were calculated. In addition, trend analysis of sediment-yield changes at the watershed outlet (the Menderjan hydrometric gauge station) was used to calculate the related parameters and estimation errors. Together with implemented watershed management projects, the results of this study can be used for more rapid and more accurate estimation of erosion than traditional methods, as well as for regional erosion assessment and remote sensing image processing.
Abstract: Iris recognition technology is more accurate, faster and less invasive than other biometric techniques based on, for example, fingerprints, the face, the retina, hand geometry, voice or signature patterns. The system developed in this study has the potential to play a key role in high-risk security areas and can provide organizations with a fast and secure means of granting access only to authorized personnel. The aim of the paper is to detect the iris region and localize the inner and outer iris boundaries. The system was implemented on the Windows platform using the Visual C# programming language, an easy and efficient tool for image processing that achieves good performance. In particular, the system includes two main parts. The first preprocesses the iris images using Canny edge detection, segments the iris region from the rest of the image, and determines the location of the iris boundaries by applying the Hough transform. The proposed system was tested on 756 iris images of 60 eyes from the CASIA iris database.
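The boundary-localization step can be illustrated with a toy circular Hough transform: edge pixels vote for candidate circle centers at a known radius, and the accumulator peak gives the boundary's center. The synthetic edge map and fixed radius below are stand-ins; the actual system applies Canny and the Hough transform to real iris images.

```python
import math
from collections import Counter

R = 8                       # assumed known radius of the circular boundary
CX, CY = 15, 15             # true center of the synthetic circle

# synthetic "edge map": points on a circle, as Canny might produce
edges = {(CX + round(R * math.cos(math.radians(t))),
          CY + round(R * math.sin(math.radians(t))))
         for t in range(0, 360, 10)}

# circular Hough transform: each edge point votes for every center that
# would place it on a circle of radius R
votes = Counter()
for (x, y) in edges:
    for t in range(0, 360, 10):
        a = round(x - R * math.cos(math.radians(t)))
        b = round(y - R * math.sin(math.radians(t)))
        votes[(a, b)] += 1

(best_a, best_b), _ = votes.most_common(1)[0]
print(best_a, best_b)    # accumulator peak falls at the circle's center
```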
Abstract: Transformers are critical apparatus in power systems, so their reliability and safe operation make it important to determine their operating conditions, and the industry uses quality-control tests in the insulation design of oil-filled transformers. Hence, the effect of the service period on AC dielectric strength is significant. The effect of aging on the physical, chemical and electrical properties of transformer oil was studied using the international testing methods for the evaluation of transformer oil quality. The study was carried out on six transformers operating in the field, with monitoring periods of over twenty years. The properties that are strongly time dependent were identified, as were those with a great impact on transformer oil acidity, breakdown voltage and dissolved gas analysis. Several tests on the transformer oils were used to determine when the oil should be purified or changed, and to predict its characteristics under different operating conditions.
Abstract: The leaching rate of ¹³⁷Cs from spent mixed-bead (anion and cation) exchange resins in a cement-bentonite matrix has been studied. The transport phenomena involved in the leaching of a radioactive material from a cement-bentonite matrix are investigated using three methods based on theoretical equations: the diffusion equation for a plane source, an equation for diffusion coupled to a first-order equation, and an empirical method employing a polynomial equation. The results presented in this paper are from a 25-year mortar and concrete testing project that will influence the design choices for radioactive waste packaging for a future Serbian radioactive waste disposal center.
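For reference, the plane-source diffusion model mentioned above is commonly written, for a semi-infinite medium, in the standard form (the symbols below are the usual ones, not taken from the paper):

```latex
\frac{\sum_{n} a_n}{A_0} \;=\; 2\,\frac{S}{V}\sqrt{\frac{D_e\, t}{\pi}}
```

where \(\sum_n a_n\) is the cumulative activity leached up to time \(t\), \(A_0\) the initial activity, \(S\) the exposed surface area, \(V\) the specimen volume, and \(D_e\) the effective diffusion coefficient; a plot of cumulative fraction leached against \(\sqrt{t}\) is then linear with slope proportional to \(\sqrt{D_e}\).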
Abstract: Particulate matter (PM) in ambient air is responsible for adverse health effects in adults and children. Relatively little is known about the concentrations, sources and health effects of PM in indoor air. A monitoring study comprising three campaigns was conducted in Ankara to measure PM levels in indoor and outdoor environments and to identify and quantify associations between sources and concentrations. Approximately 82 homes (42 in the 1st campaign, 12 in the 2nd, and 28 in the 3rd), three rooms (living room, baby's room, and living room used as a baby's room) and the outdoor environment at each home were sampled with a Grimm Environmental Dust Monitoring (EDM) 107 instrument during different seasonal periods of 2011 and 2012. In this study, the relationships between indoor and outdoor PM levels were investigated for particulate matter smaller than 10 micrometers (µm) (PM10), smaller than 2.5 µm (PM2.5) and smaller than 1.0 µm (PM1). For all three sampling campaigns, the mean concentrations of PM10, PM2.5 and PM1.0 in living rooms used as baby's rooms were higher than in living rooms and baby's rooms (or bedrooms). It is concluded that household activities and environmental conditions strongly affect PM concentrations in indoor environments during the sampling periods. The number of smokers, and proximity to a main street and/or construction activities, increased the PM concentrations. This study thus assesses the relationship between indoor and outdoor PM levels, household activities and environmental conditions.
Abstract: Pattern recognition is the research area of Artificial
Intelligence that studies the operation and design of systems that
recognize patterns in the data. Important application areas are image
analysis, character recognition, fingerprint classification, speech
analysis, DNA sequence identification, man and machine
diagnostics, person identification and industrial inspection. The
interest in improving the classification systems of data analysis is
independent of the context of application. In fact, in many studies one often has to recognize and distinguish groups of various objects, which calls for valid instruments capable of performing this task. The objective of this article
is to show several methodologies of Artificial Intelligence for data
classification applied to biomedical patterns. In particular, this work
deals with the realization of a Computer-Aided Detection system
(CADe) that is able to assist the radiologist in identifying types of
mammary tumor lesions. As an additional biomedical application of
the classification systems, we present a study conducted on blood
samples which shows how these methods may help to distinguish
between carriers of Thalassemia (or Mediterranean Anaemia) and
healthy subjects.
Abstract: This article analyses the relationship between sovereign credit risk ratings and gross domestic product for Central and Eastern European countries over the period 1996-2010. In order to study the mentioned relationship, we used a numerical transformation of the risk qualification: we assigned the score 0 to the lowest risk and then ascended in steps of 5 up to a score of 355, corresponding to the maximum risk. The method of analysis is econometric modelling with the EViews 7.0 programme. This software allows the analysis of data in a panel-type system, involving a mix of time periods and data series for different entities. The main conclusion of the work is that it confirms the negative relationship between sovereign credit risk and gross domestic product for the Central and Eastern European countries during the reviewed period.
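The numerical transformation can be sketched as a simple ordered mapping. The rating labels below are hypothetical placeholders (the abstract does not list the categories); the full scale in the paper runs in steps of 5 up to 355, i.e., 72 categories.

```python
# Hypothetical ordered rating labels, from lowest to highest risk.
# The paper's actual category list is not given in the abstract; a full
# scale of 72 categories would end at 5 * 71 = 355.
labels = ["AAA", "AA+", "AA", "AA-", "A+", "A", "A-", "BBB+", "BBB"]

# lowest risk -> 0, then ascending in steps of 5
score = {label: 5 * i for i, label in enumerate(labels)}

print(score["AAA"], score["AA"], score["BBB"])   # → 0 10 40
```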
Abstract: Heuristic and metaheuristic approaches are among the most robust search techniques for large-scale search spaces. They are especially useful when the combinatorial explosion of the search space prevents a problem from being solved by classical computing methods, i.e., when the problem is NP-complete. In this research, the winner determination problem in combinatorial auctions is formulated and, after assessing older heuristic functions, we solve it using a genetic algorithm and show that this method yields better performance than other heuristics such as simulated annealing and the greedy approach.
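A minimal sketch of the genetic-algorithm formulation, on a toy instance with five hypothetical bids: a chromosome carries one bit per bid, and allocations that sell an item twice are scored zero. This is an illustration of the approach, not the authors' exact operators or parameters.

```python
import random

random.seed(1)

# toy combinatorial-auction instance: (bundle of items, bid price)
bids = [({"A", "B"}, 5), ({"C", "D"}, 4), ({"A", "C"}, 6),
        ({"B"}, 2), ({"D"}, 3)]

def fitness(mask):
    """Total revenue of the selected bids, or 0 if an item is sold twice."""
    used, revenue = set(), 0
    for bit, (bundle, price) in zip(mask, bids):
        if bit:
            if used & bundle:
                return 0          # infeasible: item allocated twice
            used |= bundle
            revenue += price
    return revenue

pop = [[random.randint(0, 1) for _ in bids] for _ in range(30)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    nxt = [row[:] for row in pop[:6]]            # elitism: keep the best
    while len(nxt) < 30:
        a, b = random.sample(pop[:15], 2)        # parents from the fitter half
        cut = random.randrange(1, len(bids))     # one-point crossover
        child = a[:cut] + b[cut:]
        if random.random() < 0.2:                # mutation: flip one bit
            i = random.randrange(len(bids))
            child[i] ^= 1
        nxt.append(child)
    pop = nxt

best = max(pop, key=fitness)
print(fitness(best))   # optimum for this instance is 11 ({A,C} + {B} + {D})
```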
Abstract: The IFS is a scheme for describing and manipulating complex fractal attractors using simple mathematical models. More precisely, the most popular "fractal-based" algorithms for both the representation and the compression of computer images involve some implementation of the method of Iterated Function Systems (IFS) on complete metric spaces. In this paper a new generalized space, called the Multi-Fuzzy Fractal Space, is constructed. On this space a distance function is defined and its completeness is proved. The completeness of this space ensures the existence of a fixed-point theorem for the family of continuous mappings. This theorem is the fundamental result on which the IFS methods are based and from which the fractals are built. The defined mappings are proved to satisfy some generalizations of the contraction condition.
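For context, the classical conditions that such generalizations extend are, on a complete metric space \((X, d)\) with mappings \(w_1, \dots, w_n\):

```latex
d\big(w_i(x),\, w_i(y)\big) \;\le\; s_i\, d(x, y), \qquad 0 \le s_i < 1,
\qquad\text{and}\qquad
A \;=\; W(A) \;=\; \bigcup_{i=1}^{n} w_i(A),
```

where the contraction condition (left) guarantees, by the Banach fixed-point theorem applied to the Hutchinson operator \(W\), a unique compact attractor \(A\) satisfying the fixed-point equation (right).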
Abstract: An efficient transient flow simulation for gas
pipelines and networks is presented. The proposed transient flow
simulation is based on transfer function models and MATLAB-Simulink.
The equivalent transfer functions of the nonlinear
governing equations are derived for different types of the boundary
conditions. Next, a MATLAB-Simulink library that accommodates any type of boundary condition is developed. To verify the
accuracy and the computational efficiency of the proposed
simulation, the results obtained are compared with those of the
conventional finite difference schemes (such as TVD, method of
lines, and other finite difference implicit and explicit schemes). The
effects of the flow inertia and the pipeline inclination are
incorporated in this simulation. It is shown that the proposed
simulation has sufficient accuracy and is computationally more efficient than the other methods.
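As a generic illustration of transfer-function-based time simulation (not the pipeline equations themselves), a first-order system G(s) = 1/(τs + 1) can be stepped forward in time explicitly; the time constant and step size here are arbitrary.

```python
# Step response of G(s) = 1 / (tau*s + 1) by explicit Euler integration.
# The transfer function corresponds to dy/dt = (u - y) / tau with a
# unit step input u = 1.
tau = 1.0        # time constant (arbitrary)
dt = 0.01        # integration step (arbitrary)
y = 0.0
for _ in range(500):          # simulate five time constants
    y += dt * (1.0 - y) / tau

print(round(y, 3))   # ≈ 0.993, close to the steady-state value 1
```

Simulink's transfer-function blocks do essentially this integration (with better solvers), which is why a transfer-function library can replace a finite-difference discretization of the governing equations.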
Abstract: Most of the well-known methods for generating Gaussian variables require at least one standard uniformly distributed value for each Gaussian variable generated. The period of the random number generator therefore limits the number of independent Gaussian distributed variables that can be generated, while the statistical analysis of complex systems requires a large number of random numbers. We propose an alternative, simple method of generating an almost unlimited number of Gaussian distributed variables from a limited number of standard uniformly distributed random numbers.
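For context, the classical Box-Muller transform below shows the conventional cost the paper aims to avoid: it consumes two uniform values to produce two Gaussian values. This is the textbook method, not the paper's proposed one.

```python
import math
import random

random.seed(42)

def box_muller():
    """Return two independent N(0,1) samples from two uniform samples."""
    u1 = 1.0 - random.random()      # shift into (0, 1] to avoid log(0)
    u2 = random.random()
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2 * math.pi * u2), r * math.sin(2 * math.pi * u2)

samples = []
for _ in range(5000):
    samples.extend(box_muller())    # 10000 Gaussian samples

mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
print(round(mean, 2), round(var, 2))   # sample mean near 0, variance near 1
```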
Abstract: The fault-proneness of a software module is the probability that the module contains faults. Different techniques have been proposed to predict the fault-proneness of modules, including statistical methods, machine learning, neural networks and clustering. The aim of the proposed study is to explore whether metrics available in the early lifecycle (i.e., requirement metrics), metrics available in the late lifecycle (i.e., code metrics), and the combination of the two can be used to identify fault-prone modules using a Genetic Algorithm technique. The approach has been tested on real defect datasets from NASA software projects written in the C programming language. The results show that the fusion of requirement and code metrics yields the best prediction model for detecting faults, compared with the commonly used code-based model.
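The paper's model search is driven by a Genetic Algorithm; as a simplified, hypothetical illustration of why fusing the two metric sets can help, the sketch below scores a nearest-centroid classifier on requirement metrics alone, code metrics alone, and their concatenation, using toy hand-made data with one metric per set (not the NASA datasets).

```python
# Toy module data: one requirement metric and one code metric per module,
# with fault labels (1 = fault-prone). Entirely hypothetical values.
req  = [[1], [1], [3], [2], [4], [3], [5], [4]]
code = [[1], [3], [1], [2], [4], [5], [3], [3]]
y    = [0, 0, 0, 0, 1, 1, 1, 1]

def accuracy(X):
    """Nearest-centroid accuracy on the training data."""
    c0 = [sum(x[d] for x, lab in zip(X, y) if lab == 0) / y.count(0)
          for d in range(len(X[0]))]
    c1 = [sum(x[d] for x, lab in zip(X, y) if lab == 1) / y.count(1)
          for d in range(len(X[0]))]
    def dist2(x, c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    preds = [0 if dist2(x, c0) < dist2(x, c1) else 1 for x in X]
    return sum(p == lab for p, lab in zip(preds, y)) / len(y)

fused = [r + c for r, c in zip(req, code)]      # metric fusion by concatenation
print(accuracy(req), accuracy(code), accuracy(fused))   # → 0.875 0.875 1.0
```

On this toy data each metric set misclassifies one module on its own, while the fused vector separates the classes perfectly, mirroring the paper's finding that the requirement-plus-code model predicts best.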
Abstract: The growing volume of information on the Internet creates an increasing need for new (semi-)automatic methods for retrieving documents and ranking them according to their relevance to the user query. In this paper, after a brief review
on ranking models, a new ontology based approach for ranking
HTML documents is proposed and evaluated in various
circumstances. Our approach is a combination of conceptual, statistical and linguistic methods. This combination preserves the precision of ranking without losing speed. Our approach
exploits natural language processing techniques for extracting
phrases and stemming words. Then an ontology based conceptual
method will be used to annotate documents and expand the query.
To expand a query the spread activation algorithm is improved so
that the expansion can be done in various aspects. The annotated
documents and the expanded query will be processed to compute
the relevance degree exploiting statistical methods. The outstanding
features of our approach are (1) combining conceptual, statistical
and linguistic features of documents, (2) expanding the query with
its related concepts before comparing to documents, (3) extracting
and using both words and phrases to compute relevance degree, (4)
improving the spread activation algorithm to do the expansion based
on weighted combination of different conceptual relationships and
(5) allowing variable document vector dimensions. A ranking
system called ORank is developed to implement and test the
proposed model. The test results are included at the end of the paper.
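The improved spread activation is not specified in the abstract; the basic algorithm it builds on can be sketched as follows, with a hypothetical weighted concept graph: activation flows from the query concepts along weighted relations, attenuated by a decay factor, and the resulting activations give the expanded query terms.

```python
# Hypothetical weighted concept graph (ontology fragment).
graph = {
    "car":     {"vehicle": 0.9, "engine": 0.8},
    "vehicle": {"transport": 0.7},
    "banana":  {"fruit": 0.9},
}

def spread_activation(seeds, decay=0.5, iterations=2):
    """Propagate activation from seed concepts along weighted edges."""
    activation = dict(seeds)
    frontier = dict(seeds)
    for _ in range(iterations):
        new = {}
        for node, act in frontier.items():
            for neighbor, weight in graph.get(node, {}).items():
                new[neighbor] = new.get(neighbor, 0.0) + act * weight * decay
        for node, act in new.items():
            activation[node] = activation.get(node, 0.0) + act
        frontier = new
    return activation

# expand the query concept "car": related concepts receive activation,
# unrelated ones ("banana", "fruit") receive none
act = spread_activation({"car": 1.0})
print(sorted(act.items()))
```

The paper's improvement weights different conceptual relationship types separately; this sketch uses a single decay for all edges.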
Abstract: Conventional approaches to implementing logic programming applications on embedded systems are solely of a software nature. As a consequence, a compiler is needed that transforms the initial declarative logic program into its equivalent procedural one, to be programmed on the microprocessor. This approach increases the complexity of the final implementation and reduces the overall system's performance. Conversely, hardware implementations that can only support logic programs prevent their use in applications where logic programs need to be intertwined with traditional procedural ones. We exploit HW/SW codesign methods to present a microprocessor capable of supporting hybrid applications using both programming approaches. We take advantage of the close relationship between attribute grammar (AG) evaluation and knowledge engineering methods to present a programmable hardware parser that performs logic derivations, and combine it with an extension of a conventional RISC microprocessor that performs the unification process to report the success or failure of those derivations. The extended RISC microprocessor is still capable of executing conventional procedural programs, so hybrid applications can be implemented. The presented implementation is programmable, supports the execution of hybrid applications, increases the performance of logic derivations (experimental analysis yields an approximate 1000% increase in performance) and reduces the complexity of the final implemented code. The proposed hardware design is supported by an extended C language called C-AG.
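The unification step performed by the extended RISC core can be illustrated in software with a textbook Robinson-style unifier; terms are nested tuples and variables are "?"-prefixed strings. This is the standard algorithm, not the hardware design itself.

```python
def is_var(t):
    return isinstance(t, str) and t.startswith("?")

def walk(t, subst):
    """Follow variable bindings to the representative term."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    """Occurs check: does variable v appear inside term t?"""
    t = walk(t, subst)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, subst) for a in t)

def unify(a, b, subst=None):
    """Return a substitution unifying a and b, or None on failure."""
    subst = {} if subst is None else subst
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        return None if occurs(a, b, subst) else {**subst, a: b}
    if is_var(b):
        return unify(b, a, subst)
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

print(unify(("f", "?X", "b"), ("f", "a", "?Y")))   # → {'?X': 'a', '?Y': 'b'}
print(unify(("f", "?X"), ("g", "a")))              # → None
```

Each derivation step produced by the hardware parser succeeds or fails exactly when such a unification succeeds or fails.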
Abstract: There are reports of gas and oil well fires caused by various accidents, and many different methods are used for firefighting in the gas and oil industry. Traditional fire-extinguishing techniques face many problems: they are usually time consuming, need a great deal of equipment, cause damage to facilities, and create health and environmental problems. This article proposes an innovative fire-extinguishing approach for the oil and gas industry, especially applicable to burning oil wells located offshore. Fire extinguishment employing a turbojet is a novel approach that can help extinguish the fire in a short period of time. Divergent and convergent turbojets modeled at laboratory scale, together with a high-pressure flame, were used. Different experiments were conducted to determine the relationship between the output discharges of the trumpet and the oil wells. The results were corrected, and the relationships between the dimensionless parameters of flame and fire-extinguishment distances, and between the output discharge of the turbojet and that of the oil wells at specified distances, are demonstrated by specific curves.