Abstract: Optimization of filter banks based on knowledge of the input statistics has long been of interest. Finite impulse response (FIR) compaction filters are used in the design of optimal signal-adapted orthonormal FIR filter banks. In this paper we discuss three different approaches for the design of interpolated finite impulse response (IFIR) compaction filters. In the first method, the magnitude squared response satisfies the Nyquist constraint approximately. In the second and third methods the Nyquist constraint is satisfied exactly. These methods yield FIR compaction filters whose response is comparable with that of existing methods. At the same time, IFIR filters enjoy a significant saving in the number of multipliers and can be implemented efficiently. Since the eigenfilter approach is used, the method is computationally less complex. The design of IFIR filters in the least-squares sense is also presented.
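The abstract does not give the filter designs themselves; as a rough illustration of why IFIR structures save multipliers, the sketch below (with arbitrary example coefficients, not the paper's) forms an IFIR filter as a stretched model filter G(z^M) cascaded with an interpolator I(z):

```python
def upsample(h, M):
    """Insert M-1 zeros between taps, turning G(z) into G(z^M)."""
    out = []
    for c in h[:-1]:
        out += [c] + [0.0] * (M - 1)
    out.append(h[-1])
    return out

def convolve(a, b):
    """Impulse-response convolution: the cascade of two FIR filters."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

# Arbitrary illustrative coefficients (not from the paper).
model = [0.1, 0.4, 0.4, 0.1]   # model filter G(z)
interp = [0.25, 0.5, 0.25]     # interpolator I(z)
M = 3

ifir = convolve(upsample(model, M), interp)   # overall response G(z^M) I(z)
# The cascade needs only len(model) + len(interp) multipliers, whereas a
# direct FIR filter of the same overall length would need len(ifir).
```

The upsampled taps are zeros, so they cost no multipliers; the saving grows with the stretch factor M.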
Abstract: Due to growing environmental concerns about the cement
industry, alternative cement technologies have become an area of
increasing interest. It is now believed that new binders are
indispensable for improved environmental and durability
performance. Self-compacting geopolymer concrete is an innovative
concreting approach that requires no vibration for placing and is
produced by completely eliminating ordinary Portland cement.
This paper documents the assessment of the compressive strength
and workability characteristics of low-calcium fly-ash-based
self-compacting geopolymer concrete. The essential workability
properties of the freshly prepared self-compacting geopolymer
concrete, such as filling ability, passing ability and segregation
resistance, were evaluated using the slump flow, V-funnel, L-box and
J-ring test methods. The fundamental requirements of high
flowability and segregation resistance specified by the EFNARC
guidelines on self-compacting concrete were satisfied. In addition,
compressive strength was determined and the test results are included
here. This paper also reports the effect of extra water, curing time and
curing temperature on the compressive strength of self-compacting
geopolymer concrete. The test results show that extra water in the
concrete mix plays a significant role. Also, longer curing time and
curing the concrete specimens at higher temperatures will result in
higher compressive strength.
Abstract: Visually impaired people find it extremely difficult to
acquire basic and vital information necessary for their living.
Therefore, they are at a very high risk of being socially excluded as a
result of poor access to information. In recent years, several attempts
have been made at improving communication methods for visually
impaired people involving tactile sensation, such as finger Braille,
manual alphabets, the print-on-palm method and several other
electronic devices. However, such methods suffer from problems such
as lack of privacy and poor compatibility with computer environments.
This paper describes a low-cost Braille hand glove for blind people
using slot sensors and vibration motors, with which they can read and
write e-mails and text messages and read e-books. The glove allows the
wearer to type characters as different Braille combinations using six
slot sensors, while vibration at six positions of the glove,
corresponding to the Braille code, allows characters to be read.
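As a minimal sketch of the read/write mapping such a glove needs (the sensor and motor interfaces are not described in the abstract; the dot patterns shown are the standard Braille letters a through j):

```python
# Braille cell dots are numbered 1-6; each letter is a set of raised dots.
BRAILLE = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5}, "i": {2, 4},
    "j": {2, 4, 5},
}

def decode(sensors):
    """Map the set of triggered slot sensors to a character ('?' if unknown)."""
    for ch, dots in BRAILLE.items():
        if dots == set(sensors):
            return ch
    return "?"

def encode(ch):
    """Which vibration motors to drive so the wearer feels the character."""
    return sorted(BRAILLE[ch])
```

Typing uses `decode` on the active sensors; reading drives the motors returned by `encode`.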
Abstract: Physical urban form is recognized as a medium for
human transactions. It directly influences the travel demand of people
in a specific urban area and the amount of energy used for
transportation. Distorted, sprawling form often creates sustainability
problems in urban areas. EU strategic planning documents declare that
compact urban form and mixed land-use patterns must be the main
focus for achieving better sustainability in urban areas, but the
methods to measure and compare these characteristics are still not
clear.
This paper presents simple methods to measure the spatial
characteristics of urban form by analyzing the location and
distribution of objects in an urban environment. An extended
cellular automata (CA) model is used to simulate urban development
scenarios.
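The abstract does not specify the extended CA model's transition rules; a minimal sketch of a plain urban-growth cellular automaton (with a hypothetical neighbour threshold, not the paper's rule set) is:

```python
def step(grid, threshold=3):
    """One CA step: a vacant cell urbanises if enough neighbours are urban."""
    n, m = len(grid), len(grid[0])
    new = [row[:] for row in grid]          # next state, computed from the old grid
    for y in range(n):
        for x in range(m):
            if grid[y][x]:
                continue                    # already urban
            urban = sum(
                grid[j][i]
                for j in range(max(0, y - 1), min(n, y + 2))
                for i in range(max(0, x - 1), min(m, x + 2))
                if (i, j) != (x, y)
            )
            if urban >= threshold:
                new[y][x] = 1
    return new

# Tiny seed scenario: an urban corner spreading into vacant land.
seed = [[1, 1, 0], [1, 0, 0], [0, 0, 0]]
grown = step(seed)
```

Real urban CA models extend this with land-use classes, accessibility and suitability layers, and stochastic perturbation, but the update loop keeps this shape.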
Abstract: Many image watermarking methods that use properties of the human visual system (HVS) have been proposed in the literature. The visual threshold component is usually related to either the spatial contrast sensitivity function (CSF) or visual masking. Regarding contrast masking in particular, most methods do not consider effects near edge regions, even though the HVS is sensitive to what happens in edge areas. This paper proposes ultrasound image watermarking using a visual threshold corresponding to the HVS, in which the coefficients in a DCT block are classified according to texture, edge, and plain areas. This classification is useful not only for imperceptibility when the watermark is inserted into an image but also for achieving robust watermark detection. A comparison of the proposed method with other methods shows that it is robust to blockwise memoryless manipulations and also robust against noise addition.
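The paper's exact classification criterion is not given in the abstract; a plausible heuristic sketch that labels an 8x8 DCT block as plain, edge, or texture from the distribution of its AC-coefficient energy (the thresholds below are hypothetical) might look like:

```python
def classify_dct_block(block, plain_thresh=100.0, edge_ratio=0.7):
    """Label an 8x8 DCT block by where its AC energy sits (hypothetical thresholds)."""
    ac = [(u, v, block[u][v]) for u in range(8) for v in range(8) if (u, v) != (0, 0)]
    energy = sum(c * c for _, _, c in ac)
    if energy < plain_thresh:
        return "plain"          # nearly all energy is in the DC term
    # Edges tend to concentrate energy in the first row/column of the block.
    edge_energy = sum(c * c for u, v, c in ac if u == 0 or v == 0)
    return "edge" if edge_energy / energy >= edge_ratio else "texture"
```

A watermark embedder could then scale the embedding strength per block according to the returned label.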
Abstract: This paper presents three models which enable the
customisation of Universal Description, Discovery and Integration
(UDDI) query results, based on some pre-defined and/or real-time
changing parameters. These proposed models detail the requirements,
design and techniques which make ranking of Web service discovery
results from a service registry possible. Our contribution is twofold.
First, we present an extension to the UDDI inquiry capabilities, which
enables a private UDDI registry owner to customise or rank the query
results based on its business requirements. Second, our proposal
utilises existing technologies and standards, requiring minimal
changes to existing UDDI interfaces or data structures. We believe
these models will serve as a valuable reference for enhancing the
service discovery methods within a private UDDI registry
environment.
Abstract: Bio-chips are used for experiments on genes and
contain various information such as genes, samples and so on.
Two-dimensional bio-chips, in which one axis represents genes and the
other represents samples, are now widely used. Instead of
experimenting with real genes, which is costly and time-consuming,
bio-chips are used for biological experiments. Extracting data from
bio-chips with high accuracy and finding patterns or useful
information in such data is therefore very important. Bio-chip
analysis systems extract data from various kinds of bio-chips and
mine the data to obtain useful information. One of the commonly used
mining methods is classification. The appropriate classification
algorithm varies depending on the data types, the characteristics of
the values, and so on. Considering that bio-chip data is extremely
large, an algorithm that imitates an ecosystem, such as the ant
algorithm, is well suited for classification. This paper focuses on
finding classification rules in bio-chip data using the Ant Colony
algorithm. The developed system takes into consideration the accuracy
of the discovered rules when applying them to bio-chip data in order
to predict the classes.
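The abstract does not detail the rule-discovery procedure; the sketch below is a heavily simplified, single-term variant in the spirit of ant-colony rule mining (Ant-Miner), with invented toy data: each ant samples an (attribute, value) term in proportion to its pheromone, the rule's accuracy on the samples it covers is its quality, and good terms are reinforced.

```python
import random

random.seed(1)

def majority_class(rows):
    """The most frequent class among the covered samples."""
    counts = {}
    for row in rows:
        counts[row["class"]] = counts.get(row["class"], 0) + 1
    return max(counts, key=counts.get)

def ant_miner(data, terms, n_ants=40, evap=0.15):
    """Pheromone-guided discovery of single-term classification rules."""
    tau = {t: 1.0 for t in terms}          # pheromone per (attribute, value) term
    best, best_q = None, -1.0
    for _ in range(n_ants):
        # Sample a term with probability proportional to its pheromone.
        total = sum(tau.values())
        r, acc, chosen = random.uniform(0, total), 0.0, terms[-1]
        for t in terms:
            acc += tau[t]
            if acc >= r:
                chosen = t
                break
        attr, val = chosen
        covered = [row for row in data if row[attr] == val]
        if not covered:
            continue
        cls = majority_class(covered)
        quality = sum(row["class"] == cls for row in covered) / len(covered)
        tau[chosen] = (1 - evap) * tau[chosen] + quality   # reinforce useful terms
        if quality > best_q:
            best, best_q = {"term": chosen, "class": cls}, quality
    return best, best_q

# Invented toy data: discretised expression levels plus a class label per sample.
data = [
    {"gene1": "high", "gene2": "high", "class": "tumor"},
    {"gene1": "high", "gene2": "low", "class": "tumor"},
    {"gene1": "low", "gene2": "high", "class": "normal"},
    {"gene1": "low", "gene2": "low", "class": "normal"},
]
terms = [("gene1", "high"), ("gene1", "low"), ("gene2", "high"), ("gene2", "low")]
best_rule, best_quality = ant_miner(data, terms)
```

A full ant-colony rule miner builds multi-term rules, uses a heuristic alongside pheromone, and prunes rules, but the sample-evaluate-reinforce loop is the same.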
Abstract: A trend in the agent community and in enterprises is a shift from closed to open architectures composed of a large number of autonomous agents. One implication is that interface agent frameworks are becoming more important in multi-agent systems (MAS), so that systems built for different application domains can share a common understanding of human-computer interface (HCI) methods, as well as of human-agent and agent-agent interfaces. However, interface agent frameworks usually receive less attention than other aspects of MAS. In this paper, we propose an interface web agent framework based on our earlier project, WAF, and a distributed HCI template. A group of new functionalities and implications is discussed, such as web agent presentation, off-line agent reference, and a reconfigurable activation map of agents. Their enabling techniques and current standards (e.g. an existing ontological framework) are also suggested and illustrated with examples from our own implementation in WAF.
Abstract: A Self-Excited Induction Generator (SEIG) builds up voltage as it enters its magnetic saturation region. Due to the non-linear magnetic characteristics, performance analysis of the SEIG involves cumbersome mathematical computations. The dependence of air-gap voltage on saturated magnetizing reactance can be established at rated frequency only by conducting a laboratory test commonly known as the synchronous run test. However, there is no laboratory method to determine the saturated magnetizing reactance and air-gap voltage of a SEIG at varying speed, terminal capacitance and other loading conditions. For overall analysis of the SEIG, prior knowledge of the magnetizing reactance, generated frequency and air-gap voltage is essential, so analytical methods are the only alternative for determining these variables. The lack of a direct mathematical relationship between these variables under different terminal conditions has forced researchers to evolve new computational techniques. Artificial Neural Networks (ANNs) are very useful for such complex problems, as they do not require any a priori information about the system. In this paper, an attempt is made to use cascaded neural networks to first determine the generated frequency and magnetizing reactance under varying terminal conditions and then the air-gap voltage of the SEIG. The results obtained from the ANN model are used to evaluate the overall performance of the SEIG and are found to be in good agreement with experimental results. Hence, it is concluded that analysis of the SEIG can be carried out effectively using ANNs.
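The abstract describes a cascade in which one network predicts frequency and magnetizing reactance and a second uses those predictions to obtain air-gap voltage. The sketch below shows only that cascaded structure with tiny untrained networks and random weights; the real model's architecture, training data and input scaling are not given in the abstract.

```python
import math
import random

random.seed(0)

def mlp(hidden_w, out_w, x):
    """One hidden tanh layer feeding linear outputs."""
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in hidden_w]
    return [sum(w * hi for w, hi in zip(row, h)) for row in out_w]

def rand_layer(rows, cols):
    return [[random.uniform(-1, 1) for _ in range(cols)] for _ in range(rows)]

# Stage 1 (illustrative only): (speed, capacitance, load) -> (frequency, Xm)
stage1 = (rand_layer(6, 3), rand_layer(2, 6))
# Stage 2: the three inputs plus the stage-1 estimates -> air-gap voltage
stage2 = (rand_layer(6, 5), rand_layer(1, 6))

def cascade(speed, capacitance, load):
    """Feed stage-1 outputs forward into stage 2, as the cascaded scheme requires."""
    freq, xm = mlp(*stage1, [speed, capacitance, load])
    (air_gap_v,) = mlp(*stage2, [speed, capacitance, load, freq, xm])
    return freq, xm, air_gap_v
```

The point of the cascade is that the second network's inputs include quantities the first network has already estimated, mirroring the paper's two-step determination.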
Abstract: Polynomial bases and normal bases are both used for
elliptic curve cryptosystems, but field arithmetic operations such as
multiplication, inversion and doubling for each basis are implemented
by different methods. In general, it is said that normal bases,
especially optimal normal bases (ONB), which are special cases of
normal bases, are more efficient than polynomial bases for hardware
implementation. However, more needs to be examined by implementing
and analyzing these systems under similar conditions. In this paper,
we design field arithmetic operators for each basis over GF(2^233),
a field which has both a polynomial basis recommended by SEC2 and a
type-II ONB, and analyze the implementation results. In addition, we
predict the efficiency of two elliptic curve cryptosystems using
these field arithmetic operators.
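As an illustration of polynomial-basis arithmetic over GF(2^233), the software sketch below multiplies two field elements by carry-less multiplication followed by reduction modulo the SEC2 polynomial x^233 + x^74 + 1 (the paper concerns hardware operators; this only shows the underlying arithmetic):

```python
# SEC2 reduction polynomial for the binary field GF(2^233): x^233 + x^74 + 1.
M = 233
POLY = (1 << 233) | (1 << 74) | 1

def gf2m_mul(a, b, m=M, poly=POLY):
    """Multiply two field elements given as integer bit-vectors of coefficients."""
    r = 0
    while b:                    # carry-less (XOR) schoolbook multiplication
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    # Reduce the degree-(2m-2) product modulo the field polynomial.
    for i in range(r.bit_length() - 1, m - 1, -1):
        if (r >> i) & 1:        # clear bit i by adding poly * x^(i-m)
            r ^= poly << (i - m)
    return r
```

Because the field has characteristic 2, addition is plain XOR, and multiplication is linear in each operand, which the tests below exploit.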
Abstract: The main objective of this paper is to determine the
isolated effect of silica fume on the tensile, compressive and flexural strengths of high-strength lightweight concrete. Many experiments
were carried out in which cement was replaced with different percentages of silica fume at different constant water-binder ratios, keeping the other mix
design variables constant. Cement was replaced by silica fume at 0%, 5%,
10%, 15%, 20% and 25% for water-binder ratios ranging from 0.26
to 0.42. For all mixes, the split tensile, compressive and flexural strengths
were determined at 28 days. The results showed that the tensile, compressive and flexural strengths increased with silica fume incorporation, but the optimum replacement percentage is not
constant, because it depends on the water-cementitious material (w/cm) ratio of the mix. Based on the results, a relationship between the
split tensile, compressive and flexural strengths of silica fume concrete was developed using statistical methods.
Abstract: Medical imaging takes advantage of digital
technology in imaging and teleradiology. In teleradiology systems a
large amount of data is acquired, stored and transmitted. A major
technology that may help to solve the problems associated with the
massive data storage and data transfer capacity is data compression
and decompression. There are many methods of image compression
available. They are classified as lossless and lossy compression
methods. In a lossy method the decompressed image
contains some distortion. Fractal image compression (FIC) is a lossy
method in which an image is coded as a set of contractive
transformations in a complete metric space. The set of contractive
transformations is guaranteed to
produce an approximation to the original image. In this paper FIC is
achieved by partitioned iterated function systems (PIFS) using
quadtree partitioning. PIFS is applied to different modalities of
images: ultrasound, CT scan, angiogram, X-ray and mammograms. For
each modality approximately twenty images are considered, and the
average values of compression ratio and PSNR are computed. In this
method of fractal encoding, the tolerance factor Tmax is varied from
1 to 10, keeping the other standard parameters constant. For all
modalities the compression ratio and Peak Signal to Noise Ratio
(PSNR) are computed and studied; the quality of the decompressed
image is assessed by its PSNR value. The results show that the
compression ratio increases with the tolerance factor, and that
mammograms have the highest compression ratio. Because of the
properties of fractal compression, image quality is not degraded up
to an optimum tolerance factor of Tmax = 8.
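The abstract does not give the partitioning code; a minimal sketch of tolerance-driven quadtree splitting (using the pixel-value range of a block as a stand-in for the matching error compared against the tolerance Tmax) is:

```python
def quadtree_blocks(img, x, y, size, tmax, min_size=2):
    """Recursively split a square region until its pixel range is within tmax."""
    vals = [img[y + j][x + i] for j in range(size) for i in range(size)]
    if size <= min_size or max(vals) - min(vals) <= tmax:
        return [(x, y, size)]           # this block is homogeneous enough
    h = size // 2
    blocks = []
    for dx, dy in ((0, 0), (h, 0), (0, h), (h, h)):
        blocks += quadtree_blocks(img, x + dx, y + dy, h, tmax, min_size)
    return blocks
```

In actual PIFS encoding the split criterion is the error of the best contractive domain-to-range match rather than the raw pixel range, but the recursion has this form, and a larger tolerance yields fewer, larger blocks and hence a higher compression ratio.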
Abstract: Apart from geometry, functionality is one of the most
significant hallmarks of a product; it can be considered the
fundamental justification for a product's existence. Therefore a
functional analysis, including a complete and reliable descriptor,
has high potential to improve the product development process in
various fields, especially in knowledge-based design. One important
application of functional analysis and indexing is in retrieval and
the design-reuse concept: more than 75% of the design activity in new
product development consists of reusing earlier and existing design
know-how. Thus, analysis and categorization of product functions,
concluded by functional indexing, directly influences design
optimization. This paper elucidates and evaluates the major classes
of functional analysis by discussing their major methods, and
concludes by presenting a novel hybrid approach for functional
analysis.
Abstract: Hydrogen used as fuel in fuel cell vehicles can be
produced from renewable sources such as wind, solar, and hydro
technologies. A PV-electrolyzer is one of the promising methods for
producing hydrogen with zero pollution emission. Hydrogen
production from a PV-electrolyzer system depends on the efficiency
of the electrolyzer and photovoltaic array, and on the sun irradiance
at the site. In this study, the amount of hydrogen is obtained using
mathematical equations for different driving distances and sun peak
hours. The results show that a minimum of 99 PV modules is required
to generate 1.75 kg of H2 per day for two vehicles.
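The paper's equations are not reproduced in the abstract; a common idealised sizing relation uses the hydrogen higher heating value (about 39.4 kWh/kg). The module rating, peak sun hours and electrolyzer efficiency below are illustrative assumptions, not the paper's values:

```python
import math

HHV_H2 = 39.4  # kWh per kg of hydrogen (higher heating value)

def daily_hydrogen(n_modules, module_kw, peak_sun_hours, electrolyzer_eff):
    """kg of H2 produced per day by an idealised PV-electrolyzer system."""
    pv_energy_kwh = n_modules * module_kw * peak_sun_hours
    return pv_energy_kwh * electrolyzer_eff / HHV_H2

def modules_needed(kg_per_day, module_kw, peak_sun_hours, electrolyzer_eff):
    """Smallest PV array (in whole modules) meeting a daily hydrogen demand."""
    required_kwh = kg_per_day * HHV_H2 / electrolyzer_eff
    return math.ceil(required_kwh / (module_kw * peak_sun_hours))
```

With the particular assumed parameters in the tests (0.2 kW modules, 5 peak sun hours, 70% electrolyzer efficiency) the sizing happens to land near the abstract's 99-module figure, but that should not be read as reproducing the paper's calculation.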
Abstract: 2D/3D registration is a special case of medical image
registration which is of particular interest to surgeons. Applications
of 2D/3D registration include radiotherapy planning and treatment
verification, spinal surgery, hip replacement, neurointerventions and
aortic stenting [1]. The purpose of this paper is to provide a
literature review of the main image registration methods for the
2D/3D case. At the end of the paper an algorithm is proposed for
2D/3D registration based on a Chebyshev polynomial iteration loop.
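The abstract does not define the iteration loop itself; for reference, Chebyshev polynomials of the first kind satisfy the three-term recurrence T_0 = 1, T_1 = x, T_{k+1} = 2x T_k - T_{k-1}, which a sketch can evaluate as:

```python
def chebyshev(n, x):
    """Evaluate T_n(x) via the recurrence T_{k+1} = 2x*T_k - T_{k-1}."""
    if n == 0:
        return 1.0
    t_prev, t = 1.0, x          # T_0 and T_1
    for _ in range(n - 1):
        t_prev, t = t, 2 * x * t - t_prev
    return t
```

The recurrence evaluates T_n in O(n) multiplications, which is what makes Chebyshev bases attractive inside an iterative optimisation loop.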
Abstract: Knowledge Discovery in Databases (KDD) is the process of extracting previously unknown, hidden and interesting patterns from a huge amount of data stored in databases. Data mining is the stage of the KDD process that selects and applies a particular data mining algorithm to extract interesting and useful knowledge. Data mining methods are expected to find patterns that are interesting according to some measures, so it is of vital importance to define good measures of interestingness that allow the system to discover only the useful patterns. Measures of interestingness are divided into objective and subjective measures. Objective measures depend only on the structure of a pattern and can be quantified using statistical methods, while subjective measures depend on the subjectivity and understanding of the user who examines the patterns. Subjective measures are further divided into actionable, unexpected and novel. A key issue facing the data mining community is how to take actions on the basis of discovered knowledge. For a pattern to be actionable, the user's subjectivity is captured by providing his or her background knowledge about the domain. Here, we consider the actionability of discovered knowledge as a measure of interestingness and raise important issues which need to be addressed to discover actionable knowledge.
Abstract: While consistently innovative business models can
give companies a competitive advantage, longitudinal empirical
research that reflects dynamic business-model changes has yet
to prove a definitive connection. This study consequently employs a
dynamic perspective in conjunction with innovation theory to examine
the relationship between the types of business-model innovation and
firm value. It examines various types of business-model innovation in
the high-end and low-end technology industries, using HTC and the
7-Eleven chain stores as cases, with research periods of 14 years and
32 years, respectively. The empirical results
suggest that adopting radical business-model innovation in addition to
expanding new target markets can successfully lead to a competitive
advantage. Sustained advanced technological competences and
service/product innovation are the key successful factors in high-end
and low-end technology industry business-models, respectively. In
sum, business-model innovation can yield higher market value and
financial value in high-end technology industries than in low-end
ones.
Abstract: Date palm (Phoenix dactylifera L.) seeds are a waste stream considered a major problem for the food industry. They contain potentially useful protein (10-15% of the whole date's weight). Global production, industrialisation and utilisation of dates are increasing steadily. Worldwide production of date palm fruit increased from 1.8 million tons in 1961 to 6.9 million tons in 2005; thus almost 800,000 tonnes of date palm seeds from global production are not currently used [1]. The current study was carried out to convert date palm seeds into a useful protein powder. Compositional analysis showed that the seeds were rich in protein and fat (5.64% and 8.14% respectively). We used several laboratory-scale methods to extract proteins from the seeds to produce a high-protein powder. These methods included simple acid or alkali extraction, with or without ultrafiltration, and phenol/trichloroacetic acid with acetone precipitation (Ph/TCA method). The highest protein content powder (68%) was obtained by the Ph/TCA method, with a material yield of 44%, whereas alkali extraction alone gave the lowest protein content of 8% and a yield of 32%.
Abstract: Text categorization is the problem of classifying text documents into a set of predefined classes. After a preprocessing step the documents are typically represented as large sparse vectors. When training classifiers on large collections of documents, both the time and memory restrictions can be quite prohibitive. This justifies applying feature selection methods to reduce the dimensionality of the document-representation vector. Four feature selection methods are evaluated: Random Selection, Information Gain (IG), Support Vector Machine (SVM_FS) and Genetic Algorithm with SVM (GA_FS). We show that the best results were obtained with the SVM_FS and GA_FS methods for a relatively small feature-vector dimension, compared with the IG method, which requires longer vectors for quite similar classification accuracies. We also present a novel method to better correlate the SVM kernel's parameters (polynomial or Gaussian kernel).
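Of the four selectors evaluated, Information Gain is the most standard; a minimal sketch of IG scoring for binary term-presence features (over an invented toy corpus) is:

```python
import math
from collections import Counter

def entropy(counts):
    """Shannon entropy (bits) of a class-count distribution."""
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts if c)

def information_gain(docs, labels, term):
    """IG of a binary term-presence feature; docs are sets of terms."""
    base = entropy(list(Counter(labels).values()))
    cond = 0.0
    for keep in (True, False):
        subset = [l for d, l in zip(docs, labels) if (term in d) == keep]
        if subset:
            cond += len(subset) / len(docs) * entropy(list(Counter(subset).values()))
    return base - cond

# Invented toy corpus: rank the vocabulary by IG and keep the top terms.
docs = [{"ball", "goal"}, {"ball", "score"}, {"vote", "law"}, {"vote", "poll"}]
labels = ["sports", "sports", "politics", "politics"]
vocab = sorted(set().union(*docs))
ranked = sorted(vocab, key=lambda t: information_gain(docs, labels, t), reverse=True)
```

Keeping only the top-k terms of `ranked` shrinks the document vectors; the abstract's point is that SVM- and GA-based selection reaches similar accuracy with even fewer features than IG.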
Abstract: This study focuses on examining why the range of
experience with respect to HIV infection is so diverse, especially in
regard to the latency period. An agent-based approach to modelling
the infection is used to extract high-level behaviour which cannot be
obtained analytically from the set of interaction rules at the cellular
level. A prototype model encompasses local variation in baseline
properties, contributing to the individual disease experience, and is
included in a network which mimics the chain of lymph nodes. The
model also accounts for stochastic events such as viral mutations.
The size and complexity of the model require major computational
effort and parallelisation methods are used.