Abstract: The use of the Inverse Discrete Fourier Transform (IDFT), implemented in the form of the Inverse Fast Fourier Transform (IFFT), is one of the standard methods of reconstructing Magnetic Resonance Imaging (MRI) images from uniformly sampled K-space data. In this tutorial, three of the major problems associated with the use of the IFFT in MRI reconstruction are highlighted. The tutorial also gives a brief introduction to MRI physics; the MRI system from an instrumentation point of view; the K-space signal; and the process of the IDFT and IFFT for one- and two-dimensional (1D and 2D) data.
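As a hedged illustration (not taken from the tutorial itself), the 1D IDFT can be written directly from its definition; an IFFT computes the identical result in O(N log N) instead of O(N^2):

```python
import cmath

def dft(x):
    """Forward DFT, used here to simulate a uniformly sampled 1D K-space line."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Direct inverse DFT: x[n] = (1/N) * sum_k X[k] * exp(2j*pi*k*n/N).

    FFT-based libraries produce the same values far faster; this naive
    form only makes the definition explicit.
    """
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]
```

Applying `dft` then `idft` recovers the original samples up to rounding, mirroring how a sampled K-space line is inverted back to image space.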
Abstract: This study aimed to assess the feasibility of producing
fiberboard from durian rind using latex with phenolic resin as the
binding agent. The durian rind was boiled with NaOH [7], [8], and the
resulting fiber was formed into fiberboard by heat pressing. This
suggests that durian rind could replace plywood in the plywood
industry by using durian fiber as a composite material with an
adhesive substance. First, the durian rind was split, exposed to
light, boiled and steamed in order to obtain durian fiber.
Fiberboards were then produced at densities of 600 kg/m3 and 800
kg/m3 in order to find a suitable ratio of durian fiber to latex.
Afterwards, mechanical properties were tested according to the ASTM
and JIS A 5905-1994 standards. Once the suitable ratio was known, the
test results were compared with medium density fiberboard (MDF) and
other related studies. According to the results, for fiberboard made
from durian rind with latex and phenolic resin at a density of 800
kg/m3 and a ratio of 1:1, the moisture content was 5.05%, the
specific gravity (ASTM D 2395-07a) was 0.81, the density (JIS A
5905-1994) was 0.88 g/cm3, and the tensile strength, hardness (ASTM
D2240) and flexibility (elongation at break) yielded values similar
to those of medium density fiberboard (MDF).
Abstract: Kansei engineering is a technology that converts human
feelings into quantitative terms and helps designers develop new
products that meet customers' expectations. The standard Kansei
engineering procedure involves finding relationships between human
feelings and design elements, for which many researchers have
established forward and backward relationships through various soft
computing techniques. In this paper, we propose a Kansei engineering
framework that links not only human feelings and individual design
elements but also the product as a whole, by constructing association
rules. In our experiment, the input is the emotion scores that
subjects give when they see the whole product, collected using
semantic differentials. Association rules are then constructed to
discover the combinations of design elements that affect human
feelings. The results of our experiment suggest patterns of
relationships among design elements, according to human feelings,
that can be derived from the product as a whole.
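As a hedged sketch of the association-rule machinery (the data and element names below are hypothetical, not from the study), the support and confidence of a rule linking design elements to a reported feeling can be computed directly:

```python
def support(transactions, itemset):
    """Fraction of transactions that contain every item in `itemset`."""
    itemset = set(itemset)
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Estimated P(consequent | antecedent) over the transactions."""
    both = set(antecedent) | set(consequent)
    return support(transactions, both) / support(transactions, antecedent)

# Hypothetical data: each transaction holds the design elements of one
# product plus the dominant feeling subjects reported for it.
transactions = [
    {"round_shape", "red", "feels:cute"},
    {"round_shape", "blue", "feels:cute"},
    {"square_shape", "red", "feels:formal"},
    {"round_shape", "red", "feels:cute"},
]
```

Rules whose support and confidence exceed chosen thresholds are the candidate "design element combination affects feeling" patterns the abstract refers to.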
Abstract: Securing instream flows for aquatic ecosystems is
critical for sustainable water management and the promotion of
human and environmental health. Using a case study from the semi-arid
region of southern Alberta (Canada), this paper considers how the
determination of instream flow standards requires judgments with
respect to: (1) the relationship between instream flow indicators and
assessments of overall environmental health; (2) the indicators used
to determine adequate instream flows; and (3) the assumptions
underlying efforts to model instream flows given data constraints. It
argues that judgments in each of these areas have an inherently
ethical component because instream flows have direct effects on the
water(s) available to meet obligations to humans and non-humans.
The conclusion expands from the case study to generic issues
regarding instream flows, the growing water ethics literature and
prospects for linking science to policy.
Abstract: In this paper, the dam-reservoir interaction is
analyzed using a finite element approach. The fluid is assumed to be
incompressible, irrotational and inviscid. The assumed boundary
conditions are that the interface of the dam and reservoir is vertical
and the bottom of reservoir is rigid and horizontal. The governing
equation for these boundary conditions is implemented in the
developed finite element code, considering the horizontal and
vertical earthquake components. The standard weighted-residual
Galerkin finite element technique with 8-node elements is used to
discretize the equation, producing a symmetric matrix equation for
the dam-reservoir system. A new boundary condition is proposed for
the truncating surface of the unbounded fluid domain to represent the
energy dissipation in the reservoir through radiation in the infinite
upstream direction. Sommerfeld's and perfect damping boundary
conditions are also implemented at the truncated boundary for
comparison with the proposed far-end boundary. The results are
compared with an analytical solution to demonstrate the accuracy of
the proposed formulation and the other truncated boundary conditions
in modeling the hydrodynamic response of an infinite reservoir.
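The weighted-residual Galerkin technique the abstract applies in 2D with 8-node elements can be illustrated in 1D with linear elements. This is only a generic sketch of the method, not the paper's dam-reservoir formulation: it solves the model problem -u''(x) = f(x) on (0,1) with u(0) = u(1) = 0.

```python
def galerkin_1d(f, n_el):
    """Linear-element Galerkin solution of -u'' = f on (0,1), u(0)=u(1)=0.

    Assembles the tridiagonal stiffness system K u = b (K[i][i] = 2/h,
    off-diagonals -1/h, b_i ~ h*f(x_i)) and solves it with the Thomas
    algorithm. Returns the interior nodal values.
    """
    h = 1.0 / n_el
    n = n_el - 1                      # number of interior nodes
    a = [-1.0 / h] * (n - 1)          # sub-diagonal of K
    d = [2.0 / h] * n                 # diagonal of K
    c = [-1.0 / h] * (n - 1)          # super-diagonal of K
    b = [h * f((i + 1) * h) for i in range(n)]  # lumped load vector
    # forward elimination (Thomas algorithm)
    for i in range(1, n):
        m = a[i - 1] / d[i - 1]
        d[i] -= m * c[i - 1]
        b[i] -= m * b[i - 1]
    # back substitution
    u = [0.0] * n
    u[-1] = b[-1] / d[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (b[i] - c[i] * u[i + 1]) / d[i]
    return u
```

For constant f = 2 the exact solution is u = x(1 - x), and the linear-element Galerkin solution reproduces it at the nodes.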
Abstract: In this paper, an approach for finding optimized
layouts for connecting PV units delivering maximum array output
power is suggested. The approach is based on considering the
different varying parameters of PV units that might be extracted from
a general two-diode model. These are mainly, solar irradiation,
reverse saturation currents, ideality factors, series and shunt
resistances, in addition to operating temperature. The approach has
been tested on 19 possible 2×3 configurations and made it possible to
determine the optimized configurations as well as to examine the
effects of the different units' parameters on the maximum output
power. Using this approach, standard arrays with n×m units can thus
be configured for maximum generated power, which allows the design of
PV-based systems with reduced surface area to fit a specific required
power, as is the case for solar cars and other mobile systems.
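A hedged sketch of the two-diode model the approach builds on (all parameter values below are hypothetical, not the paper's): the cell equation is implicit in the current I, so it is solved here by bisection on the residual, which is strictly decreasing in I.

```python
import math

def two_diode_current(V, Iph, I01, I02, a1, a2, Rs, Rsh, Vt=0.025693):
    """Solve the two-diode cell equation for current I at voltage V:

        I = Iph - I01*(exp(Vd/(a1*Vt)) - 1) - I02*(exp(Vd/(a2*Vt)) - 1) - Vd/Rsh,
        Vd = V + I*Rs.

    I appears on both sides (through Vd), so the equation is implicit;
    the residual below is strictly decreasing in I, making bisection safe.
    """
    def residual(I):
        Vd = V + I * Rs
        return (Iph
                - I01 * (math.exp(Vd / (a1 * Vt)) - 1.0)
                - I02 * (math.exp(Vd / (a2 * Vt)) - 1.0)
                - Vd / Rsh
                - I)
    lo, hi = -Iph, 2.0 * Iph + 1.0    # residual(lo) > 0 > residual(hi)
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Evaluating each unit's current this way for every candidate series/parallel layout is the kind of per-unit computation an optimized-configuration search requires.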
Abstract: The purpose of this paper is to present a Dynamic
Time Warping technique which reduces significantly the data
processing time and memory size of multi-dimensional time series
sampled by the biometric smart pen device BiSP. The acquisition
device is a novel ballpoint pen equipped with a diversity of sensors
for monitoring the kinematics and dynamics of handwriting
movement. The DTW algorithm has been applied for time series
analysis of five different sensor channels providing pressure,
acceleration and tilt data of the pen generated during handwriting on
a paper pad. However, the standard DTW has processing time and memory
space problems that limit its practical use for online handwriting
recognition. To address this problem, the DTW has been applied to the
sum of the five sensor signals after an adequate down-sampling of the
data. Preliminary results have shown that processing time and memory
size can be significantly reduced without deterioration of
performance in single character and word recognition. Furthermore,
excellent recognition accuracy was achieved, which is mainly due to
the reduced dynamic time warping (RDTW) technique and the novel BiSP
pen device.
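For reference, the standard DTW the abstract starts from is the classic dynamic program below (a generic textbook sketch, not the BiSP implementation); its O(n·m) table is exactly the time and memory burden that summing the five channels and down-sampling reduce.

```python
def dtw_distance(s, t):
    """Classic O(len(s)*len(t)) dynamic-programming DTW with |a-b| local cost.

    D[i][j] holds the minimal accumulated cost of aligning s[:i] with t[:j];
    each cell extends the best of the three predecessor alignments.
    """
    inf = float("inf")
    n, m = len(s), len(t)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(s[i - 1] - t[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]
```

Note how a repeated sample in one series ([1, 2, 2, 3] vs. [1, 2, 3]) still aligns at zero cost, which is what makes DTW tolerant of local variations in writing speed.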
Abstract: Research has suggested that implicit learning tasks
may rely on episodic processing to generate above chance
performance on the standard classification tasks. The current
research examines the invariant features task (McGeorge and Burton,
1990) and argues that such episodic processing is indeed important.
The results of the experiment suggest that both rejection and
similarity strategies are used by participants in this task to
simultaneously reject unfamiliar items and to accept (falsely) familiar
items. These decisions are primarily based on the presence of low- or
high-frequency goal-based features of the stimuli presented in the
incidental learning phase. It is proposed that a goal-based analysis
of the incidental learning task provides a simple step toward
understanding which features of episodic processing are most
important for explaining the match between incidental, implicit
learning and test performance.
Abstract: A comparative study on the feasibility of producing instant high-fibre plantain flour for diabetic fufu was carried out by blending soy residue with different plantain (Musa spp) varieties (Horn, False Horn and French), all sieved at 60 mesh and mixed in a ratio of 60:40; the blends were analyzed for their pasting properties using standard analytical methods. Results show that VIIIS60 had the highest peak viscosity (303.75 RVU), trough value (182.08 RVU) and final viscosity (284.50 RVU), and the lowest breakdown viscosity (79.58 RVU), setback value (88.17 RVU), peak time (4.36 min) and pasting temperature (81.18°C), and differed significantly (p
Abstract: Alcohol and water extracts of Cymbopogon citratus were
investigated for anti-bacterial properties and phytochemical
constituents. The extracts were screened against four gram-negative
bacteria (Escherichia coli, Klebsiella pneumoniae, Pseudomonas
aeruginosa, Proteus vulgaris) and two gram-positive bacteria
(Bacillus subtilis and Staphylococcus aureus) at four different
concentrations (1:1, 1:5, 1:10 and 1:20) using the disc diffusion
method, while the phytochemical constituents were investigated using
standard chemical methods. Results showed that the extracts inhibited
the growth of both standard and local strains of the organisms used.
The treatments were significantly different (P = 0.05). The minimum
inhibitory concentration of the extracts against the tested
microorganisms ranged between 50 mg/ml and 150 mg/ml. The alcohol
extracts were found to be generally more effective than the water
extracts. The phytochemical analysis revealed the presence of
alkaloids and phenols but the absence of cardiac and cyanogenic
glycosides. The presence of alkaloids and phenols was inferred to be
responsible for the anti-bacterial properties of the extracts.
Abstract: An approach to develop the FPGA of a flexible key
RSA encryption engine that can be used as a standard device in the
secured communication system is presented. The VHDL modeling of
this RSA encryption engine has the unique characteristics of
supporting multiple key sizes, thus can easily be fit into the systems
that require different levels of security. Simple nested-loop
addition and subtraction have been used to implement the RSA
operation, which makes the processing faster and uses a comparatively
small amount of space in the FPGA. The hardware design is targeted at
the Altera STRATIX II family, and it was determined that the
flexible-key RSA encryption engine is best suited to the device
EP2S30F484C3. The RSA encryption implementation used 13,779 logic
elements and achieved a clock frequency of 17.77 MHz. It has been
verified that this RSA encryption engine can perform 32-bit, 256-bit
and 1024-bit encryption operations in less than 41.585 us, 531.515 us
and 790.61 us, respectively.
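A hedged software analogue of the add/subtract approach (this is a generic sketch, not the paper's FPGA datapath): modular multiplication by interleaved shift-and-add with conditional subtraction, inside square-and-multiply exponentiation, which is the core RSA operation c = m^e mod n.

```python
def mod_mul(a, b, n):
    """Modular multiplication built only from shifts, additions and
    conditional subtractions, mirroring an add/subtract hardware loop.
    Invariant: a and result stay below n, so one subtraction suffices."""
    result = 0
    a %= n
    while b:
        if b & 1:
            result += a
            if result >= n:
                result -= n
        a <<= 1
        if a >= n:
            a -= n
        b >>= 1
    return result

def mod_exp(base, exp, n):
    """Square-and-multiply RSA core: computes base**exp mod n."""
    result = 1 % n
    base %= n
    while exp:
        if exp & 1:
            result = mod_mul(result, base, n)
        base = mod_mul(base, base, n)
        exp >>= 1
    return result
```

With the textbook toy key (n = 3233, e = 17, d = 2753), encrypting and then decrypting recovers the message, and the routine agrees with Python's built-in `pow`.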
Abstract: Trace element speciation of an integrated soil
amendment matrix was studied with a modified BCR sequential
extraction procedure. The analysis included pseudo-total
concentration determinations according to USEPA 3051A and
relevant physicochemical properties by standardized methods. Based
on the results, the soil amendment matrix possessed neutralization
capacity comparable to commercial fertilizers. Additionally, the
pseudo-total concentrations of all trace elements included in the
Finnish regulation for agricultural fertilizers were lower than the
respective statutory limit values. According to chemical speciation,
the lability of trace elements increased in the following order: Hg <
Cr < Co < Cu < As < Zn < Ni < Pb < Cd < V < Mo < Ba. The
validity of the BCR approach as a tool for chemical speciation was
confirmed by the additional acid digestion phase: recovery of trace
elements during the procedure supported the validity of the approach
and indicated the good quality of the analytical work.
Abstract: This paper presents a new version of the SVM mixture algorithm initially proposed by Kwok for classification and regression problems. In both cases, a slight modification of the mixture model leads to a standard SVM training problem, guarantees the existence of an exact solution, and allows the direct use of well-known decomposition and working set selection algorithms. Only the regression case is considered in this paper, but classification has been addressed in a very similar way. The method has been successfully applied to modeling engine pollutant emissions.
Abstract: Different types of Islamic debt have been increasingly
utilized as a preferred means of debt funding by Malaysian private
firms in recent years. This study examines the impact of Islamic debt
announcements on private firms' stock returns. Our sample includes
forty-five companies listed on Bursa Malaysia that issued Islamic
debt during 2005 to 2008. The abnormal returns and cumulative average
abnormal returns are calculated and tested using standard event study
methodology. The results show that a significant negative abnormal
return occurs one day before the announcement date. This negative
abnormal return represents market participants' adverse attitude
toward Islamic private debt announcements during the research period.
Abstract: Existing image coding standards generally degrade at low bit-rates because of the underlying block-based Discrete Cosine Transform scheme. Over the past decade, the success of wavelets in solving many different problems has contributed to their unprecedented popularity. Due to implementation constraints, scalar wavelets do not simultaneously possess all the properties that are essential for signal processing, such as orthogonality, short support, linear phase symmetry, and a high order of approximation through vanishing moments. A new class of wavelets called 'multiwavelets', which possess more than one scaling function, overcomes this problem. This paper presents a new image coding scheme based on nonlinear approximation of multiwavelet coefficients along with multistage vector quantization. The performance of the proposed scheme is compared with the results obtained from scalar wavelets.
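The idea of nonlinear approximation can be sketched with the simplest scalar wavelet, the Haar transform, as a hedged 1D baseline (multiwavelets generalize this with vector-valued scaling functions): transform, keep only the k largest-magnitude coefficients, and invert.

```python
import math

def haar_fwd(x):
    """One-level orthonormal Haar transform (input length must be even):
    pairwise scaled sums (approximation) followed by differences (detail)."""
    s = 1.0 / math.sqrt(2.0)
    avg = [s * (x[2 * i] + x[2 * i + 1]) for i in range(len(x) // 2)]
    det = [s * (x[2 * i] - x[2 * i + 1]) for i in range(len(x) // 2)]
    return avg + det

def haar_inv(c):
    """Inverse of haar_fwd: rebuilds each pair from (approximation, detail)."""
    s = 1.0 / math.sqrt(2.0)
    h = len(c) // 2
    out = []
    for a, d in zip(c[:h], c[h:]):
        out += [s * (a + d), s * (a - d)]
    return out

def nonlinear_approx(x, k):
    """Nonlinear approximation: keep only the k largest-magnitude
    coefficients, zero the rest, and reconstruct."""
    c = haar_fwd(x)
    keep = set(sorted(range(len(c)), key=lambda i: -abs(c[i]))[:k])
    return haar_inv([v if i in keep else 0.0 for i, v in enumerate(c)])
```

A piecewise-constant signal such as [4, 4, 2, 2] is reconstructed exactly from just two coefficients, which is why coefficient selection compresses well on blocky image content.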
Abstract: This paper presents the application of a signal intensity
independent similarity criterion for rigid and non-rigid body
registration of binary objects. The criterion is defined as the
weighted ratio image of two images. The ratio is computed on a
voxel-per-voxel basis, and weighting is performed by setting the
ratios between signal and background voxels to a standard high value.
mean squared value of the weighted ratio is computed over the union
of the signal areas of the two images and it is minimized using the
Chebyshev polynomial approximation.
Abstract: Knowing about the customer behavior in a grocery has
been a long-standing issue in the retailing industry. The advent of
RFID has made it easier to collect moving data for an individual
shopper's behavior. Most previous studies used traditional
statistical clustering techniques to find the major characteristics
of customer behavior, especially the shopping path. However, due to
various spatial constraints in the store, standard clustering methods
are not feasible: moving data such as shopping paths must be adjusted
in advance of the analysis, which is time-consuming and causes data
distortion. To alleviate this
problem, we propose a new approach to spatial pattern clustering
based on the longest common subsequence. Experimental results using
real data obtained from a grocery confirm the good performance of the
proposed method in finding the hot spot, dead spot and major path
patterns of customer movements.
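The longest common subsequence underlying the proposed clustering is the classic dynamic program below (a generic sketch; the zone names are hypothetical). Normalizing its length gives a similarity score usable as a clustering affinity between two shopping paths:

```python
def lcs_length(p, q):
    """Longest common subsequence length between two shopping paths,
    each a sequence of visited zone IDs; standard O(n*m) DP table."""
    n, m = len(p), len(q)
    D = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if p[i - 1] == q[j - 1]:
                D[i][j] = D[i - 1][j - 1] + 1
            else:
                D[i][j] = max(D[i - 1][j], D[i][j - 1])
    return D[n][m]

def lcs_similarity(p, q):
    """Normalized LCS similarity in [0, 1] between two paths."""
    return lcs_length(p, q) / max(len(p), len(q))
```

Because LCS only requires visits to occur in the same order, not at the same positions, it tolerates detours and pauses in the raw RFID traces without the pre-adjustment that grid-based clustering needs.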
Abstract: Data security in u-Health system can be an important
issue because wireless networks are vulnerable to hacking. However,
it is not easy to implement a proper security algorithm in an
embedded u-Health monitoring device because of hardware constraints
such as low performance, power consumption and limited memory size.
To secure data that contain personal and biosignal information, we
implemented several security algorithms, namely Blowfish, the Data
Encryption Standard (DES), the Advanced Encryption Standard (AES) and
Rivest Cipher 4 (RC4), for our u-Health monitoring system, and the
results were successful. Under the same experimental conditions, we
compared these algorithms: RC4 had the fastest execution time, and
memory usage was most efficient for DES. However, considering both
performance and safety capability, we concluded that AES was the most
appropriate algorithm for a personal u-Health monitoring system.
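For illustration of why RC4 is so light on constrained hardware, a minimal pure-Python sketch of the cipher is shown below (RC4 is nowadays considered broken, and the abstract itself concludes AES is the better fit; this is not the paper's implementation). Because the keystream is XORed with the data, the same call both encrypts and decrypts:

```python
def rc4(key: bytes, data: bytes) -> bytes:
    """RC4 stream cipher: key-scheduling (KSA) then keystream generation
    (PRGA), XORed with the data. XOR is its own inverse, so calling this
    twice with the same key returns the original data."""
    # key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # pseudo-random generation algorithm (PRGA)
    out = bytearray()
    i = j = 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)
```

The whole cipher is a 256-byte state plus swaps and XORs, which explains its speed advantage on a low-power node, while its known keystream biases explain why AES wins on safety.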
Abstract: In this paper, we propose an effective system for digital music retrieval. The proposed system is divided into a client part and a server part. The client part consists of pre-processing and content-based feature extraction stages. In the pre-processing stage, we minimized the time-code gap that occurs among identical music contents. As the content-based feature, first-order differentiated MFCCs were used; these approximately represent the envelope of the music feature sequences. The server part includes the music server and the music matching stage. Features extracted from 1,000 digital music files were stored in the music server. In the music matching stage, the retrieval result is found through similarity measurement by DTW. In the experiment, we used 450 queries, made by mixing different compression standards and sound qualities from 50 digital music files. Retrieval accuracy was 97%, and the average retrieval time was 15 ms per query. Our experiment proved that the proposed system is effective in retrieving digital music and robust across various web user environments.
Abstract: Nowadays companies strive to survive in a
competitive global environment. To speed up product
development/modifications, it is suggested to adopt a collaborative
product development approach. However, despite the advantages of
new IT improvements still many CAx systems work separately and
locally. Collaborative design and manufacture requires a product
information model that supports related CAx product data models. To
solve this problem many solutions are proposed, which the most
successful one is adopting the STEP standard as a product data model
to develop a collaborative CAx platform. However, several factors
that usually slow down the implementation of the STEP standard in
collaborative data exchange, management and integration should be
considered: the evolution of STEP's Application Protocols (APs) over
time, the huge number of STEP APs and conformance classes (CCs), the
high costs of implementation, the costly process of converting older
CAx software files to the STEP neutral file format, and a general
lack of STEP knowledge. In this paper, the requirements for a
successful collaborative CAx system are discussed. The STEP
standard's capability for product data integration and its
shortcomings, as well as the dominant platforms for supporting CAx
collaboration management and product data integration, are reviewed.
Finally, a platform named LAYMOD is proposed to fulfil the
requirements of a CAx collaborative environment and to integrate the
product data. It is a layered platform that enables global
collaboration among different CAx software packages and developers.
It also adopts the STEP modular architecture and XML data structures
to enable collaboration between CAx software packages while
overcoming the STEP standard's limitations. The architecture and
procedures of the LAYMOD platform for managing collaboration and
avoiding contradictions in product data integration are introduced.