Abstract: This paper presents a new classification algorithm that uses colour and texture for obstacle detection. Colour information is computationally cheap to learn and process; however, in many cases colour alone does not provide enough information for classification. Texture information can improve classification performance but usually comes at a high computational cost. Our algorithm uses both colour and texture features, but texture is consulted only when colour is unreliable. During the training stage, texture features are learned specifically to improve the performance of a colour classifier: the algorithm learns a set of simple texture features, and only the most effective ones are used in the classification stage. As a result, our algorithm achieves a very good classification rate while remaining fast enough to run on a limited computing platform. The proposed algorithm was tested on a challenging outdoor image set. Test results show that it achieves a much better trade-off between classification performance and efficiency than a typical colour classifier.
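The two-stage idea described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the colour and texture cues, the class names, and the thresholds are all hypothetical, chosen only to show how the costlier texture test fires solely in the ambiguous colour band.

```python
def colour_score(pixel_rgb):
    """Toy colour cue: relative greenness of a pixel (illustrative only)."""
    r, g, b = pixel_rgb
    return g / (r + g + b + 1e-9)

def texture_score(patch):
    """Toy texture cue: local intensity variance of a patch (list of values)."""
    mean = sum(patch) / len(patch)
    return sum((v - mean) ** 2 for v in patch) / len(patch)

def classify(pixel_rgb, patch, lo=0.30, hi=0.40, var_thresh=50.0):
    """Return 'ground' or 'obstacle'; the texture cue is evaluated only
    when the colour score falls in the ambiguous band (lo, hi)."""
    s = colour_score(pixel_rgb)
    if s >= hi:
        return "ground"
    if s <= lo:
        return "obstacle"
    # colour is unreliable here -> fall back to the costlier texture cue
    return "obstacle" if texture_score(patch) > var_thresh else "ground"
```

Because most pixels are resolved by the cheap colour test alone, the average per-pixel cost stays close to that of a pure colour classifier.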
Abstract: In an unsupervised segmentation context, we propose a bi-dimensional hidden Markov chain model (X, Y) adapted to the image segmentation problem. The bi-dimensional observed process Y = (Y1, Y2) is such that Y1 represents the noisy image and Y2 represents noisy supplementary information on the image, for example a noisy proportion of pixels of the same type in a neighborhood of the current pixel. The proposed model can be seen as a competitive alternative to the Hilbert-Peano scan. We propose a Bayesian algorithm to estimate the parameters of the considered model. The performance of this algorithm is globally favorable compared to the bi-dimensional EM algorithm, as shown on numerical and visual data.
Abstract: Aerial and satellite images are information-rich. They are also complex to analyze. For GIS systems, many features require fast and reliable extraction of roads and intersections. In this paper, we study efficient and reliable automatic extraction algorithms to address some difficult issues that are commonly seen in high-resolution aerial and satellite images but not well addressed by existing solutions, such as blurring, broken or missing road boundaries, lack of road profiles, heavy shadows, and interfering surrounding objects. The new scheme is based on a new method, the reference circle, to properly identify the pixels that belong to the same road, and uses this information to recover the whole road network. This feature is invariant to the shape and direction of roads and tolerates heavy noise and disturbances. Road extraction based on reference circles is much more noise-tolerant and flexible than previous edge-detection-based algorithms. The scheme is able to extract roads reliably from images with complex content and heavy obstructions, such as the high-resolution aerial/satellite images available from Google Maps.
Abstract: Compliance requires effective communication
within an enterprise as well as towards a company's external
environment. This requirement begins with the implementation
of compliance within large-scale compliance projects and
persists in compliance reporting within standard operations.
On the one hand, the understanding of compliance necessities
within the organization is promoted; on the other hand, a
reduction of asymmetric information with compliance
stakeholders is achieved. To reach this goal, a central
reporting must provide a consolidated view of the statuses of
different compliance efforts. A concept that could be adapted
for this purpose is the balanced scorecard by Kaplan and
Norton. This concept has not yet been analyzed in detail
concerning its adequacy for a holistic compliance reporting
that starts in compliance projects and continues into regular
compliance operations.
At first, this paper evaluates whether a holistic compliance
reporting can be designed by using the balanced scorecard
concept. The current status of compliance reporting clearly
shows that scorecards are generally accepted as a compliance
reporting tool and are already used for corporate governance
reporting; in addition, specialized compliance IT solutions
exist in the market. After the scorecard's adequacy is
thoroughly examined and proven, an example strategy map is
defined as the basis from which to derive a compliance
balanced scorecard. This definition answers the question of
how to proceed in designing a compliance reporting tool.
Abstract: Of all the visual arts, including painting, sculpture,
graphics, photography, and architecture, architecture is by far
the most complex, because the art category is only one of its
determinants. Architecture, to some extent, includes the other arts,
which can significantly influence the shaping of an urban space
(artistic interventions). These arts largely shape visual culture in
combination with other categories: film, TV, the Internet, information
technologies that are "changing the world", etc. In the area of
architecture and urbanism, visual culture is achieved through the
aspects of visual spatial effects. In this context, a complex visual
deliberation about designing urban areas so as to contribute to the
urban visual culture, and with it restore the cultural identity of the
city, is becoming almost the primary concept of contemporary urban
and architectural practice. The research in this paper relates to the
city of Niksic and its place in the visual urban culture. We examine
the city's existing visual effects and determine the directions of
transformability of its physical structure in order to achieve the
visual realization of an urban area and the renewal of the cultural
identity of a modern city.
Abstract: This article analyzes the development prospects of the
education system in Kazakhstan. Education is among the key sources of
culture and social mobility. Modern education must become civic,
which means the availability of high-quality education to all people
irrespective of their racial, ethnic, religious, social, gender and any
other differences. The socially focused nature of the modernization of
Kazakhstan's society is predicated upon the formation of a civic
education model in the future. Kazakhstan's education system is
undergoing intensive reforms, first of all intended to achieve
international education standards and integration into the global
educational and information space.
Abstract: The design of a complete expansion that allows for
compact representation of certain relevant classes of signals is a
central problem in signal processing applications. Achieving such a
representation means knowing the signal features for the purposes of
denoising, classification, interpolation and forecasting. Multilayer
Neural Networks are a relatively new class of techniques that are
mathematically proven to approximate any continuous function
arbitrarily well. Radial Basis Function Networks, which make use of
a Gaussian activation function, have also been shown to be universal
approximators. In this age of ever-increasing digitization in the
storage, processing, analysis and communication of information,
there are numerous examples of applications where one needs to
construct a continuously defined function or numerical algorithm to
approximate, represent and reconstruct the given discrete data of a
signal. One often wishes to manipulate the data in a way that
requires information not included explicitly in the data, which is
done through interpolation and/or extrapolation.
Tidal data are a near-perfect example of a time series, and many
statistical techniques have been applied to tidal data analysis and
representation; ANNs are a recent addition to these techniques. In the
present paper we describe the time series representation capabilities
of a special type of ANN, the Radial Basis Function network, and
present the results of tidal data representation using RBFs. Tidal data
analysis and representation is one of the important requirements in
marine science for forecasting.
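The core of an RBF representation of a time series can be sketched in a few lines. This is a generic Gaussian-RBF interpolation sketch, not the authors' trained network: the centres are simply the sample times, the kernel width is an assumed constant, and the "tide" series is a synthetic sinusoid standing in for real tidal observations.

```python
import numpy as np

def rbf_fit(t, y, width=1.0):
    """Solve Phi w = y for the RBF weights, where
    Phi[i, j] = exp(-((t_i - t_j) / width)**2)."""
    t = np.asarray(t, float)
    phi = np.exp(-((t[:, None] - t[None, :]) / width) ** 2)
    return np.linalg.solve(phi, np.asarray(y, float))

def rbf_eval(t_new, t, w, width=1.0):
    """Evaluate the fitted RBF expansion at new times t_new."""
    t = np.asarray(t, float)
    t_new = np.asarray(t_new, float)
    phi = np.exp(-((t_new[:, None] - t[None, :]) / width) ** 2)
    return phi @ w

t = np.linspace(0, 12, 13)            # hourly samples (toy data)
y = np.sin(2 * np.pi * t / 12.4)      # synthetic semi-diurnal tide signal
w = rbf_fit(t, y)
```

With one centre per sample, the fit interpolates the data exactly at the sample times; values between samples come from the smooth Gaussian expansion.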
Abstract: Users of computer systems may often require the
private transfer of messages/communications between parties across
a network. Information warfare and the protection and dominance of
information in the military context is a prime example of an
application area in which the confidentiality of data needs to be
maintained. The safe transportation of critical data is therefore often
a vital requirement for many private communications. However,
unwanted interception/sniffing of communications is also a
possibility. An elementary stealthy transfer scheme is therefore
proposed by the authors. This scheme makes use of encoding,
splitting of a message and the use of a hashing algorithm to verify the
correctness of the reconstructed message. As a proof of concept,
the authors have experimented with the random sending of
encoded parts of a message and their reconstruction to
demonstrate how data can stealthily be transferred across a network
so as to prevent the obvious retrieval of the data.
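The split-and-verify idea can be sketched with standard library tools. This is an illustrative proof-of-concept outline, not the authors' exact protocol: the choice of SHA-256, the chunking rule, and the shuffle standing in for out-of-order network delivery are all assumptions.

```python
import hashlib
import random

def split_message(msg: str, n_parts: int):
    """Encode the message, split it into indexed parts, and compute a
    SHA-256 digest for later integrity verification."""
    data = msg.encode("utf-8")
    digest = hashlib.sha256(data).hexdigest()
    size = -(-len(data) // n_parts)          # ceiling division
    parts = [(i, data[i * size:(i + 1) * size]) for i in range(n_parts)]
    return parts, digest

def reassemble(parts, digest):
    """Reorder parts by index, rejoin, and verify against the digest."""
    data = b"".join(chunk for _, chunk in sorted(parts))
    if hashlib.sha256(data).hexdigest() != digest:
        raise ValueError("reconstructed message failed the hash check")
    return data.decode("utf-8")

parts, digest = split_message("critical data", 4)
random.shuffle(parts)    # parts may arrive in any order over the network
```

The indices let the receiver reorder whatever arrives, and the hash check detects any corrupted or missing part before the message is trusted.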
Abstract: In open settings, the participants in virtual
organization are autonomous and there is no central authority to
ensure the felicity of their interactions. When agents interact in such
settings, each relies upon being able to model the trustworthiness of
the agents with whom it interacts. Fundamentally, such models must
consider the past behavior of the other parties in order to predict their
future behavior. Further, it is sensible for the agents to share
information via referrals to trustworthy agents. In this article, trust
is defined as "a bet on the future contingent actions of others", and
six major factors supporting it are enumerated: (1) reputation, (2)
performance, (3) appearance, (4) accountability, (5) precommitment,
and (6) contextual facilitation.
Abstract: In this study, a 3D combustion chamber was simulated
using FLUENT 6.32. The aim is to obtain detailed information on the
combustion characteristics and nitrogen oxides in the furnace, and on
the effect of oxygen enrichment on the combustion process.
Oxygen-enriched combustion is an effective way to reduce emissions.
This paper analyzes NO emission, including thermal NO and prompt NO.
The flow rate ratio of air to fuel is varied as 1.3, 3.2 and 5.1, and
the oxygen-enriched flow rates are 28, 54 and 68 lit/min. The 3D
Reynolds-Averaged Navier-Stokes (RANS) equations with the standard
k-ε turbulence model are solved together by the FLUENT 6.32 software.
A first-order upwind scheme is used to discretize the governing
equations, and the SIMPLE algorithm is used for pressure-velocity
coupling. Results show that for AF = 1.3, increasing the oxygen flow
rate of the lance reduces NO emissions. Moreover, in a fixed oxygen
enrichment condition, increasing the air-to-fuel ratio will increase
the temperature peak, but not the NO emission rate. As a result,
oxygen enrichment can reduce the NO emission in this kind of furnace
at low air-to-fuel rates.
Abstract: The amount of information being churned out by the field of biology has grown manifold and now requires the extensive use of computer techniques for its management. Among this sea of biological information, protein sequence similarity is key to detecting protein evolutionary relationships. Protein sequence similarity typically implies homology, which in turn may imply structural and functional similarities. In this work, we propose a learning method for detecting remote protein homology. The proposed method uses a transformation that converts a protein sequence into a fixed-dimensional representative feature vector. Each feature vector records the sensitivity of a protein sequence to a set of amino acid substrings generated from the protein sequences of interest. These features are then used in conjunction with support vector machines for the detection of remote protein homology. The proposed method is tested and evaluated on two different benchmark protein datasets, and it is able to deliver improvements over most of the existing homology detection methods.
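The fixed-dimensional transformation can be illustrated with a simple substring-count variant. This is a hedged sketch, not the paper's exact sensitivity measure: the use of raw 3-mer counts, the toy sequences, and the shared vocabulary construction are all illustrative assumptions; the resulting vectors are what one would feed to an SVM.

```python
def kmers(seq, k=3):
    """All length-k substrings of a sequence."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def feature_vector(seq, vocabulary, k=3):
    """Map a sequence to a fixed-dimensional vector: one count per
    vocabulary substring (a crude stand-in for 'sensitivity')."""
    counts = {}
    for mer in kmers(seq, k):
        counts[mer] = counts.get(mer, 0) + 1
    return [counts.get(mer, 0) for mer in vocabulary]

# Shared vocabulary built from the sequences of interest (toy data).
seqs = ["MKVLAA", "MKVLGG", "TTPWRS"]
vocab = sorted({m for s in seqs for m in kmers(s)})
vectors = [feature_vector(s, vocab) for s in seqs]
```

Sequences sharing substrings (here the first two, which share MKV and KVL) get overlapping feature vectors, while an unrelated sequence maps to a disjoint region of the feature space.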
Abstract: XML files contain data in a well-formatted manner, and studying the format or semantics of the grammar helps in fast retrieval of the data. Many algorithms describe searching for data in XML files. A number of approaches either use a data structure or relate to the contents of the document. In the former case the user must know the structure of the document, while information retrieval techniques using NLP relate to the content of the document; hence the results may be irrelevant or unsuccessful, and the search may take more time. This paper presents fast XML retrieval techniques using a new indexing technique and the concept of RXML. When indexing an XML document, the system takes into account both the document content and the document structure and assigns a value to each tag from the file. To query the system, the user is not constrained by a fixed query format.
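Indexing both content and structure can be sketched with the standard library. This is only an illustrative outline of the idea, not the paper's RXML indexing scheme: the path notation, the index layout, and the toy document are assumptions.

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

def build_index(xml_text):
    """Walk the XML tree once, recording for every tag both its
    structural path and its text value (content + structure index)."""
    index = defaultdict(list)          # tag -> [(path, value), ...]
    def walk(node, path):
        here = f"{path}/{node.tag}"
        if node.text and node.text.strip():
            index[node.tag].append((here, node.text.strip()))
        for child in node:
            walk(child, here)
    walk(ET.fromstring(xml_text), "")
    return index

doc = ("<library><book><title>XML Retrieval</title>"
       "<year>2009</year></book></library>")
idx = build_index(doc)
```

A query on a tag name then becomes a dictionary lookup instead of a rescan of the file, and the stored path preserves the structural context of each hit.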
Abstract: Information and communication technology (ICT) is
essential to the operation of business and creates many employment
opportunities. High volumes of students graduate in ICT; however,
graduates struggle to find job placement. A discrepancy exists between
graduate skills and industry skill requirements. To address the need
for the ICT skills required, universities must create programs to meet
the demands of a changing ICT industry. This requires a partnership
between industry, universities and other stakeholders. This situation
may be viewed as a critical systems thinking problem situation, as
there are various role players, each with their own needs and
requirements. Jackson states that a typical critical systems method
has a pluralistic nature. This paper explores the applicability and
suitability of Maslow and Dooyeweerd to guide understanding and make
recommendations for change in ICT WIL, to foster an all-inclusive
understanding of the situation by stakeholders. These methods
provide tools for understanding softer issues beyond the skills
required. The study findings suggest that, besides skills requirements,
the process of empowering students to move from being a student to
being a professional needs to be understood and addressed.
Abstract: The recognition of handwritten numerals is an
important area of research because of its applications in post
offices, banks and other organizations. This paper presents automatic
recognition of handwritten Kannada numerals based on structural
features. Five different types of features, namely profile-based
10-segment string, water reservoir, vertical and horizontal strokes,
end points, and average boundary length from the minimal bounding
box, are used in the recognition of numerals. The effect of each
feature and their combinations on the numeral classification is
analyzed using nearest neighbor classifiers. It is common to combine
multiple categories of features into a single feature vector for
classification. Instead, separate classifiers can be used to classify
based on each visual feature individually, and the final
classification can be obtained by combining the separate base
classification results. One popular approach is to combine the
classifier results into a feature vector and leave the decision to a
next-level classifier. This method is extended here to extract better
information, a possibility distribution, from the base classifiers
when resolving conflicts among the classification results. We use
fuzzy k-Nearest Neighbor (fuzzy k-NN) as the base classifier for the
individual feature sets, the results of which together form the
feature vector for the final k-Nearest Neighbor (k-NN) classifier.
Testing is done, using the different features individually and in
combination, on a database containing 1600 samples of different
numerals, and the results are compared with those of different
existing methods.
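The fuzzy k-NN base classifier that produces a possibility distribution rather than a hard label can be sketched as below. This is a generic fuzzy k-NN in the style of the standard algorithm, not the paper's feature-specific classifiers: the toy 2D points, the fuzzifier m = 2, and crisp training memberships are assumptions.

```python
import numpy as np

def fuzzy_knn(x, X, y, n_classes, k=3, m=2.0):
    """Return a possibility (membership) distribution over classes for
    query x: each of the k nearest training samples contributes a
    distance-weighted vote, weight = 1 / d**(2/(m-1))."""
    d = np.linalg.norm(X - x, axis=1)
    nn = np.argsort(d)[:k]
    w = 1.0 / np.maximum(d[nn], 1e-12) ** (2.0 / (m - 1.0))
    memberships = np.zeros(n_classes)
    for idx, wi in zip(nn, w):
        memberships[y[idx]] += wi
    return memberships / memberships.sum()

# Toy training data standing in for per-feature numeral descriptors.
X = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]], float)
y = np.array([0, 0, 0, 1, 1, 1])
u = fuzzy_knn(np.array([0.2, 0.2]), X, y, n_classes=2)
```

Concatenating such membership vectors from one base classifier per feature set yields the feature vector for the final-level k-NN decision, which is the combination scheme the abstract describes.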
Abstract: Knowledge is indispensable, but voluminous knowledge becomes a bottleneck for efficient processing. A great challenge for data mining activity is the generation of a large number of potential rules as a result of the mining process; in fact, the result size is sometimes comparable to that of the original data. Traditional data mining pruning criteria such as support do not sufficiently reduce the huge rule space. Moreover, many practical applications are characterized by continual change of data and knowledge, thereby making knowledge more voluminous with each change. The most predominant representation of the discovered knowledge is the standard Production Rule (PR) of the form If P Then D. Michalski and Winston proposed Censored Production Rules (CPRs), as an extension of production rules, that exhibit variable precision and support an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form If P Then D Unless C, where C (the censor) is an exception to the rule. Such rules are employed in situations in which the conditional statement 'If P Then D' holds frequently and the assertion C holds rarely. By using a rule of this type we are free to ignore the exception conditions when the resources needed to establish their presence are tight, or when there is simply no information available as to whether they hold or not. Thus the 'If P Then D' part of a CPR expresses important information, while the Unless C part acts only as a switch that changes the polarity of D to ~D. In this paper a scheme based on a Dempster-Shafer Theory (DST) interpretation of a CPR is suggested for discovering CPRs from the discovered flat PRs. The discovery of CPRs from flat rules results in a considerable reduction of the already discovered rules. The proposed scheme incrementally incorporates new knowledge and also reduces the size of the knowledge base considerably with each episode. Examples are given to demonstrate the behaviour of the proposed scheme.
The suggested cumulative learning scheme would be useful in mining data streams.
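The semantics of "If P Then D Unless C" can be captured in a tiny interpreter. This is a minimal sketch of the rule semantics described above, not the DST discovery scheme itself; treating an unchecked censor as "unknown" and firing with D is the behaviour the abstract attributes to CPRs.

```python
def apply_cpr(p_holds, censor_holds):
    """Evaluate a censored production rule 'If P Then D Unless C'.
    Returns True for conclusion D, False for ~D, None if the rule
    does not fire. censor_holds may be True, False, or None (unknown,
    e.g. too costly to establish)."""
    if not p_holds:
        return None            # premise fails: rule does not fire
    if censor_holds is True:
        return False           # censor holds: polarity flips to ~D
    return True                # censor false or unknown: conclude D
```

When resources are tight the censor is simply left as None and the rule still yields D, which is exactly the variable-precision behaviour that makes CPRs cheaper than fully checked rules.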
Abstract: Most fuzzy clustering algorithms have shortcomings:
for example, they are not able to detect clusters with non-convex
shapes, the number of clusters must be known a priori, and they
suffer from numerical problems, such as sensitivity to
initialization. This paper studies the synergistic combination of
the hierarchical, graph-theoretic minimal-spanning-tree-based
clustering algorithm with the partitional Gath-Geva fuzzy clustering
algorithm. The aim of this hybridization is to increase the robustness
and consistency of the clustering results and to decrease the number
of heuristically defined parameters of these algorithms, thereby
decreasing the influence of the user on the clustering results. For
the analysis of the resulting fuzzy clusters, a new tool based on a
fuzzy similarity measure is presented. The calculated similarities of
the clusters can be used for the hierarchical clustering of the
resulting fuzzy clusters, and this information is useful for cluster
merging and for the visualization of the clustering results. As the
examples used to illustrate the operation of the new algorithm show,
the proposed algorithm can detect clusters from data with arbitrary
shapes and does not suffer from the numerical problems of the
classical Gath-Geva fuzzy clustering algorithm.
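The graph-theoretic half of such a hybrid can be sketched as follows: build a minimal spanning tree over the data (Prim's algorithm) and delete its longest edges to obtain initial hard clusters, which a partitional method such as Gath-Geva could then refine. This is an illustrative sketch under those assumptions, with toy data, not the paper's full hybrid algorithm.

```python
import numpy as np

def mst_edges(X):
    """Prim's algorithm: return MST edges as (distance, i, j) tuples."""
    n = len(X)
    d = np.linalg.norm(X[:, None] - X[None, :], axis=2)
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        best = min(((d[i, j], i, j) for i in in_tree
                    for j in range(n) if j not in in_tree),
                   key=lambda e: e[0])
        edges.append(best)
        in_tree.add(best[2])
    return edges

def mst_clusters(X, n_clusters):
    """Cut the (n_clusters - 1) longest MST edges; label components."""
    keep = sorted(mst_edges(X))[: len(X) - n_clusters]
    parent = list(range(len(X)))
    def find(i):
        while parent[i] != i:
            i = parent[i]
        return i
    for _, i, j in keep:            # union the endpoints of kept edges
        parent[find(i)] = find(j)
    return [find(i) for i in range(len(X))]

X = np.array([[0, 0], [0.2, 0.1], [0.1, 0.3], [5, 5], [5.1, 5.2]])
labels = mst_clusters(X, 2)
```

Because the MST cut follows the data's own connectivity rather than a fixed cluster prototype, the initial partition can follow arbitrarily shaped groups, which is what makes it a useful seed for the fuzzy stage.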
Abstract: Imaging is defined as the process of obtaining
geometric images, either two-dimensional or three-dimensional, by
scanning or digitizing existing objects or products. In this research,
imaging is applied to retrieve 3D information of the human skin
surface in a medical application. This research focuses on analyzing
and determining the volume of leg ulcers using imaging devices.
Volume determination is one of the important criteria in the clinical
assessment of leg ulcers: the volume and size of the leg ulcer wound
indicate the response to treatment, whether healing or worsening.
Different imaging techniques are expected to give different results
(and accuracies) in generating data and images. A midpoint projection
algorithm was used to reconstruct the cavity into a solid model and
compute the volume; misinterpretation of the results can affect the
treatment efficacy. The objective of this paper is to compare the
accuracy of two 3D data acquisition methods, laser triangulation and
structured light. Using models with known volume, it was shown that
the structured-light-based 3D technique produces better accuracy than
the laser triangulation data acquisition method for leg ulcer volume
determination.
Abstract: Realistic 3D face models are desired in various
applications such as face recognition, games, avatars, animations,
etc. Construction of a 3D face model is composed of 1) building a face
shape model and 2) rendering the face shape model. Thus, building a
realistic 3D face shape model is an essential step towards a realistic
3D face model. Recently, the 3D morphable model has been successfully
introduced to deal with the variety of human face shapes. The 3D dense
correspondence problem must first be resolved in order to construct a
realistic 3D dense morphable face shape model. Several approaches to
the 3D dense correspondence problem in 3D face modeling have been
proposed previously; among them, optical-flow-based algorithms and TPS
(Thin Plate Spline) based algorithms are representative. Optical-flow-
based algorithms require texture information of faces, which is
sensitive to variation of illumination. In the TPS-based algorithms
proposed so far, the TPS process is performed on a 2D projection
representation of the 3D face data in cylindrical coordinates, not
directly on the 3D face data, and thus errors due to distortion of the
data during the 2D TPS process may be inevitable.
In this paper, we propose a new 3D dense correspondence algorithm
for 3D dense morphable face shape modeling. The proposed algorithm
does not need texture information and applies TPS directly to the 3D
face data. Through the construction procedures, it is observed that
the proposed algorithm constructs a realistic 3D morphable face model
reliably and quickly.
Abstract: The adoption of building information modeling (BIM)
is increasing in the construction industry. However, quantity
surveyors are slow in adoption compared to other professions due to a
lack of awareness of BIM's potential in their profession. It is still
unclear how BIM application can enhance quantity surveyors'
work performance and project performance. The aim of this research
is to identify the capabilities of BIM in quantity surveying practices
and examine the relationship between BIM capabilities and project
performance. A questionnaire survey and interviews were adopted for
data collection. Literature reviews identified eleven BIM capabilities
in quantity surveying practice. Questionnaire results showed that
several BIM capabilities are significantly correlated with project
performance in time, cost and quality aspects, and the results were
validated through interviews. These findings show that BIM has the
capabilities to enhance quantity surveyors' performance and
subsequently improve project performance.
Abstract: A multi-block algorithm and its implementation in two-dimensional finite element numerical model CCHE2D are presented. In addition to a conventional Lagrangian Interpolation Method (LIM), a novel interpolation method, called Consistent Interpolation Method (CIM), is proposed for more accurate information transfer across the interfaces. The consistent interpolation solves the governing equations over the auxiliary elements constructed around the interpolation nodes using the same numerical scheme used for the internal computational nodes. With the CIM, the momentum conservation can be maintained as well as the mass conservation. An imbalance correction scheme is used to enforce the conservation laws (mass and momentum) across the interfaces. Comparisons of the LIM and the CIM are made using several flow simulation examples. It is shown that the proposed CIM is physically more accurate and produces satisfactory results efficiently.
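The conventional LIM step at a block interface can be illustrated with a one-dimensional Lagrange interpolation. This is a generic sketch, not the CCHE2D implementation: the donor-block node locations and values are toy data, and CIM itself (re-solving the governing equations on auxiliary elements around the interpolation nodes) is beyond this illustration.

```python
def lagrange_interpolate(x, nodes, values):
    """Evaluate the Lagrange interpolating polynomial through
    (nodes[i], values[i]) at the point x."""
    total = 0.0
    for i, (xi, fi) in enumerate(zip(nodes, values)):
        li = 1.0                        # Lagrange basis polynomial l_i(x)
        for j, xj in enumerate(nodes):
            if j != i:
                li *= (x - xj) / (xi - xj)
        total += fi * li
    return total

# Donor-block nodes and a field sampled on them (toy: f(x) = x^2).
nodes = [0.0, 1.0, 2.0]
values = [n ** 2 for n in nodes]
u_interface = lagrange_interpolate(1.5, nodes, values)
```

Such interpolation transfers nodal values across the interface but does not by itself enforce the governing equations there, which is the gap the consistent interpolation and the imbalance correction scheme described above are meant to close.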