Abstract: Money laundering has been described by many as the lifeblood of crime and is a major threat to the economic and social well-being of societies. It has long been recognized that the banking system is the central element of money laundering, in part because of the complexity and confidentiality of the banking system itself. It is generally accepted that effective anti-money laundering (AML) measures adopted by banks make it tougher for criminals to get their "dirty money" into the financial system. Indeed, law enforcement agencies consider banks an important source of valuable information for the detection of money laundering. From the banks' perspective, however, their main reason for existence is to make as much profit as possible; their cultural and commercial interests are thus quite distinct from those of the law enforcement authorities. AML laws therefore create a major dilemma for banks, as they produce a significant shift in the way banks interact with their customers. Furthermore, implementing these laws not only creates significant compliance problems for banks but also has the potential to adversely affect their operations. It is therefore legitimate to ask whether these laws are effective in preventing money launderers from using banks, or whether they simply place an unreasonable burden on banks and their customers. This paper addresses these issues and analyzes them against the background of the Malaysian AML laws. Effective coordination between the AML regulator and the banking industry is vital to minimize the problems faced by banks and thereby to ensure effective implementation of the laws in combating money laundering.
Abstract: A series of microarray experiments produces observations
of differential expression for thousands of genes across multiple
conditions.
Principal component analysis (PCA) has been widely used in multivariate data analysis to reduce the dimensionality of the data, in order to simplify subsequent analysis and allow summarization of the data in a parsimonious manner. PCA, which can be implemented via a singular value decomposition (SVD), is useful for the analysis of microarray data. As an application of PCA via SVD, we use the DNA microarray data for the small round blue cell tumors (SRBCT) of childhood from Khan et al. (2001). To decide the number of components that account for a sufficient amount of information, we draw a scree plot. The biplot, a graphical display associated with PCA, reveals important features of the data: the relationships between variables, and the relationships of variables with observations.
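As an illustration of the PCA-via-SVD approach described above (not the authors' own code), here is a minimal sketch in Python/NumPy; the random data matrix and its dimensions are placeholders standing in for the SRBCT expression data:

```python
import numpy as np

# Hypothetical expression matrix: rows = samples, columns = genes
X = np.random.rand(63, 2308)      # SRBCT-like dimensions, placeholder values
Xc = X - X.mean(axis=0)           # center each gene (column)

# PCA via SVD: Xc = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

explained = s**2 / np.sum(s**2)   # variance explained per component (scree plot input)
scores = U * s                    # principal component scores (observations)
loadings = Vt.T                   # gene loadings (variables), used in a biplot
print(explained[:5])              # inspect leading components
```

Plotting `explained` against the component index gives the scree plot mentioned above, and plotting the first two columns of `scores` together with `loadings` gives a biplot.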
Abstract: The crystalline quality of an AlGaN/GaN high electron mobility transistor (HEMT) structure grown on a 200 mm silicon substrate has been investigated using UV-visible micro-Raman scattering and photoluminescence (PL). Visible Raman scattering probes the whole nitride stack together with the Si substrate and shows a small residual in-plane stress component in the thick GaN buffer resulting from wafer bowing, while UV micro-Raman indicates a tensile interfacial stress induced in the top GaN/AlGaN/AlN layers. PL shows a good-crystal-quality GaN channel, in which the yellow-band intensity is very low compared with that of the near-band-edge transition. The uniformity of the sample is demonstrated by measurements at several points across the epiwafer.
Abstract: This empirical research examines how marketing managers evaluate their firms' performance and decide when to innovate. Managers use several standards, namely the firm's past performance, the firm's target performance, competitor performance, and the industry's average performance, to compare and evaluate their firms' performance. It is hypothesized that marketing managers and firm owners compare the firm's current performance with these four standards simultaneously when deciding whether to innovate in any aspect of the firm, whether management style or products. The relationships between comparisons of the firm's performance against these standards and innovation are examined within a single regression model. The results of the regression analysis are discussed, and recommendations are made for future studies and practitioners.
Abstract: Embedded systems need to respect stringent real-time constraints. Various hardware components included in such systems, such as cache memories, exhibit variability and therefore affect execution time. Indeed, a cache memory access from an embedded microprocessor may result in a cache hit, where the data is available, or a cache miss, where the data must be fetched from an external memory with an additional delay. It is therefore highly desirable to predict future memory accesses during execution in order to prefetch data appropriately without incurring delays. In this paper, we evaluate the potential of several artificial neural networks for the prediction of instruction memory addresses. Neural networks have the potential to capture the nonlinear behavior observed in memory accesses during program execution, and their numerous demonstrated hardware implementations favor them over traditional forecasting techniques for inclusion in embedded systems. However, embedded applications execute millions of instructions, and therefore millions of addresses must be predicted. We approach this very challenging problem of neural-network-based prediction of large time series by evaluating various architectures based on the recurrent neural network paradigm, with pre-processing based on the Self-Organizing Map (SOM) classification technique.
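As a rough sketch of the SOM pre-processing stage described above (not the authors' actual architecture; the address trace, map size, and learning schedule are all assumptions), a 1-D SOM can quantize a raw address stream into a compact symbol sequence that a recurrent network would then be trained on:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical instruction-address trace (placeholder data)
addresses = rng.integers(0x1000, 0x2000, size=10_000).astype(float)

# --- 1-D Self-Organizing Map: quantize addresses into a small codebook ---
n_nodes = 16
weights = np.linspace(addresses.min(), addresses.max(), n_nodes)  # initial codebook

lr, sigma = 0.5, 2.0
for t, a in enumerate(addresses):
    bmu = np.argmin(np.abs(weights - a))              # best-matching unit
    d = np.abs(np.arange(n_nodes) - bmu)              # grid distance to the BMU
    h = np.exp(-d**2 / (2 * sigma**2))                # neighborhood function
    decay = np.exp(-t / len(addresses))               # decaying learning rate
    weights += lr * decay * h * (a - weights)         # pull neighbors toward sample

# Cluster-index sequence: the compact symbol stream that a recurrent
# network (e.g., an Elman RNN) would be trained on to predict the next access.
symbols = np.array([np.argmin(np.abs(weights - a)) for a in addresses])
print(symbols[:20])
```

The point of the pre-processing is dimensionality reduction: the recurrent predictor sees a small alphabet of cluster indices instead of millions of raw addresses.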
Abstract: The objectives of this research are to produce prototype coconut oil based solvent offset printing inks and to analyze the basic print quality obtained with them, by using coconut oil to produce a varnish and then using that varnish to produce a black offset printing ink. The print quality, i.e., the CIELAB, density, and dot gain values of work printed on 130 g/m2 gloss-coated woodfree paper, was then analyzed. The results indicated that the suitable varnish formulation uses 51% coconut oil, 36% phenolic resin, and 14% solvent oil, while the suitable black offset ink formula uses the varnish mixed with 20% coconut oil. For the printed work, the results were as follows: the CIELAB values of the black offset printing ink were L* = 31.90, a* = 0.27, and b* = 1.86; the density value was 1.27; and dot gain was highest in the midtone area of the image.
Abstract: Requirements are critical to system validation as they guide all subsequent stages of systems development. Inadequately specified requirements generate systems that require major revisions or cause outright system failure. Use cases have become the main vehicle for requirements capture in many current Object-Oriented (OO) development methodologies, and a means for developers to communicate with different stakeholders. In this paper we present the results of a laboratory experiment that explored whether different use case formats are equally effective in facilitating the understanding of high-knowledge users. Results showed that providing diagrams alongside the textual use case descriptions significantly improved user comprehension of system requirements in both familiar and unfamiliar application domains. However, when comparing groups that received textual descriptions accompanied by diagrams at different levels of detail (simple and detailed), we found no significant difference in performance.
Abstract: Echocardiography is one of the most common diagnostic tests widely used for assessing abnormalities of regional heart ventricle function. The main goal of the image enhancement task in 2D echocardiography (2DE) is to address two major problems: speckle noise and low image quality. Speckle noise reduction is therefore an important pre-processing step used to reduce distortion effects in 2DE image segmentation. In this paper, we present common filters based on low-pass spatial smoothing, such as the mean, Gaussian, and median filters; the Laplacian filter was used as a high-pass sharpening filter. A comparative analysis tests the effectiveness of these filters applied to original 2DE images of 4-chamber and 2-chamber views. Three statistical quantity measures, root mean square error (RMSE), peak signal-to-noise ratio (PSNR), and signal-to-noise ratio (SNR), are used to evaluate filter performance quantitatively on the output enhanced image.
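A minimal sketch of this kind of filter comparison (mean, Gaussian, and median smoothing plus RMSE/PSNR metrics) using SciPy; the synthetic test image, noise model, and kernel sizes are placeholders, not the paper's settings:

```python
import numpy as np
from scipy import ndimage

# Placeholder 2DE-like image with synthetic multiplicative (speckle-like) noise
rng = np.random.default_rng(1)
clean = rng.random((256, 256))
noisy = clean * (1 + 0.3 * rng.standard_normal(clean.shape))

filters = {
    "mean":     ndimage.uniform_filter(noisy, size=3),
    "gaussian": ndimage.gaussian_filter(noisy, sigma=1.0),
    "median":   ndimage.median_filter(noisy, size=3),
}

def rmse(ref, img):
    return np.sqrt(np.mean((ref - img) ** 2))

def psnr(ref, img, peak=1.0):
    return 20 * np.log10(peak / rmse(ref, img))

for name, out in filters.items():
    print(f"{name:8s} RMSE={rmse(clean, out):.4f} PSNR={psnr(clean, out):.2f} dB")
```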
Abstract: The National Agricultural Biotechnology Information
Center (NABIC) plays a leading role in the biotechnology information
database for agricultural plants in Korea. Since 2002, we have
concentrated on functional genomics of major crops, building an
integrated biotechnology database for agro-biotech information that
focuses on bioinformatics of major agricultural resources such as rice,
Chinese cabbage, and microorganisms. NABIC's integrated biotechnology database provides useful information through a user-friendly web interface that allows analysis of genome infrastructure, multiple plants, microbial resources, and living modified organisms.
Abstract: The Flexible Job Shop Problem (FJSP) is an extension of the classical Job Shop Problem (JSP). The FJSP extends the routing flexibility of the JSP, i.e., the assignment of a machine to each operation, which makes it more difficult than the JSP. In this study, a Cooperative Co-evolutionary Genetic Algorithm (CCGA) is presented to solve the FJSP. Makespan (the time needed to complete all jobs) is used as the performance measure for the CCGA. To test the performance and efficiency of our CCGA, benchmark problems are solved. Computational results show that the proposed CCGA is comparable with other approaches.
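To make the makespan objective concrete, here is a toy fitness evaluation for one candidate solution of a tiny flexible job shop instance (illustrative only; the instance, machine assignments, and greedy job-order decoding are assumptions, not the paper's CCGA):

```python
# Toy makespan evaluation: each job is a list of operations, and the
# candidate solution has already assigned each operation a (machine, time).
jobs = {  # hypothetical 2-job, 2-machine instance
    "J1": [("M1", 3), ("M2", 2)],
    "J2": [("M2", 4), ("M1", 1)],
}

machine_free = {}   # when each machine becomes available
job_free = {}       # when each job's previous operation finishes

makespan = 0
for job, ops in jobs.items():
    for machine, p in ops:
        start = max(machine_free.get(machine, 0), job_free.get(job, 0))
        end = start + p
        machine_free[machine] = end
        job_free[job] = end
        makespan = max(makespan, end)

print("makespan:", makespan)  # the fitness value the CCGA would minimize
```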
Abstract: Falling has been one of the major concerns and threats to the independence of the elderly in their daily lives. With the significant worldwide growth of the aging population, it is essential to have a fall-detection solution that operates with high accuracy in real time and supports large-scale deployment using multiple cameras. The Field Programmable Gate Array (FPGA) is a highly promising hardware accelerator for many emerging embedded vision-based systems. The main objective of this paper is therefore to present an FPGA-based solution for visual fall detection that meets stringent real-time requirements with high accuracy. A hardware architecture for visual fall detection that exploits pixel locality to reduce memory accesses is proposed. By exploiting the parallel and pipelined architecture of the FPGA, our hardware implementation achieves a performance of 60 fps for a series of video analytics functions at VGA resolution (640x480). The results of this work show that FPGAs have great potential for enabling large-scale vision systems in the future healthcare industry, owing to their flexibility and scalability.
Abstract: Research in quantum computation explores the consequences of having information encoding, processing, and communication exploit the laws of quantum physics, i.e., the laws that govern our most fundamental knowledge, to date, of the world of elementary particles, as described by quantum mechanics. This paper starts with a short survey of the principles underlying quantum computing and of some of the major breakthroughs of the first ten to fifteen years of research in this domain; quantum algorithms and quantum teleportation are very briefly presented. The next sections are devoted to one among the many directions of current research in the quantum computation paradigm, namely quantum programming languages and their semantics. A few other hot topics and open problems in quantum information processing and communication are mentioned briefly in the concluding remarks, the most difficult of them being the physical implementation of a quantum computer. The interested reader will find a list of useful references at the end of the paper.
Abstract: In this manuscript, a wavelet-based blind watermarking scheme is proposed as a means to secure the authenticity of a fingerprint. The information used for identification or verification of a fingerprint lies mainly in its minutiae. By robustly watermarking the minutiae into the fingerprint image itself, the useful information can be extracted accurately even if the fingerprint is severely degraded. The minutiae are converted into a binary watermark, and embedding this watermark in the detail regions increases the robustness of the watermarking with little to no additional impact on image quality. It is shown experimentally that when the minutiae are embedded into the wavelet detail coefficients of a fingerprint image in a spread spectrum fashion using a pseudorandom sequence, robustness increases while perceptual invisibility decreases in proportion to the amplification factor K. The DWT-based technique is found to be very robust against noise, geometric distortions, filtering, and JPEG compression attacks, and also gives remarkably better performance than a DCT-based technique in terms of correlation coefficient and number of erroneous minutiae.
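A minimal sketch of spread-spectrum embedding into wavelet detail coefficients, in the spirit of the scheme above (using PyWavelets; the wavelet choice, gain K, and random watermark are placeholder assumptions, not the authors' parameters):

```python
import numpy as np
import pywt

rng = np.random.default_rng(42)
img = rng.random((256, 256))              # placeholder fingerprint image

# Binary watermark (e.g., encoded minutiae) mapped to +/-1, spread by a PN sequence
wm_bits = rng.integers(0, 2, size=64) * 2 - 1
pn = rng.choice([-1.0, 1.0], size=64)
K = 0.05                                  # amplification factor: robustness vs. invisibility

cA, (cH, cV, cD) = pywt.dwt2(img, "haar")  # one-level 2-D DWT

flat = cD.flatten()
flat[:64] += K * wm_bits * pn             # additive spread-spectrum embedding in detail band
cD_marked = flat.reshape(cD.shape)

marked = pywt.idwt2((cA, (cH, cV, cD_marked)), "haar")
```

The trade-off the abstract describes falls out of this form directly: a larger K makes the embedded term easier to detect after attacks but also more visible in the reconstructed image.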
Abstract: An immunomodulator bioproduct is prepared in a batch bioprocess with a modified bacterium, Pseudomonas aeruginosa. The bioprocess is performed in a 100 L Bioengineering bioreactor with 42 L of cultivation medium made of peptone, meat extract, and sodium chloride. The optimal bioprocess parameters were determined: temperature 37 °C, agitation speed 300 rpm, aeration rate 40 L/min, pressure 0.5 bar, Dow Corning Antifoam M at a maximum of 4% of the medium volume, and duration 6 hours. Such bioprocesses are considered difficult to control because their dynamic behavior is highly nonlinear and time-varying. The aim of the paper is to present and compare different models based on experimental data. The analysis criteria were modeling error and convergence rate. The estimation and modeling analysis were performed using TableCurve 2D. The preliminary conclusions indicate Andrews's model, with a maximum specific growth rate of the bacterium in the range of 0.8 h⁻¹.
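For reference, Andrews's (substrate-inhibition) kinetic model mentioned above has the familiar form μ(S) = μmax·S / (Ks + S + S²/Ki); a sketch of fitting it to growth data with SciPy follows (the measurements and initial guesses are hypothetical, not the paper's data):

```python
import numpy as np
from scipy.optimize import curve_fit

def andrews(S, mu_max, Ks, Ki):
    # Andrews (Haldane) model: specific growth rate vs. substrate concentration
    return mu_max * S / (Ks + S + S**2 / Ki)

# Hypothetical measurements: substrate (g/L) and specific growth rate (1/h)
S = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
mu = np.array([0.30, 0.45, 0.60, 0.70, 0.62, 0.45])

params, _ = curve_fit(andrews, S, mu, p0=[0.8, 1.0, 10.0])
print("mu_max=%.2f 1/h, Ks=%.2f, Ki=%.2f" % tuple(params))
```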
Abstract: In this paper, we propose a modified version of the Constant Modulus Algorithm (CMA) tailored for the blind Decision Feedback Equalizer (DFE) of first-order Markovian time-varying channels. The proposed NonStationary CMA (NSCMA) is designed to explicitly take into account the Markovian structure of the channel nonstationarity. Hence, unlike the classical CMA, the NSCMA is not blind with respect to the channel time variations. This greatly helps the equalizer in the case of realistic channels and avoids frequent transmission of training sequences. The paper develops a theoretical analysis of the steady-state performance of the CMA and the NSCMA for DFEs in a time-varying context, deriving approximate expressions for the mean square errors. We prove that in the steady state the NSCMA outperforms the classical CMA, and these results are confirmed by simulation. Through an experimental study, we demonstrate that the NSCMA-DFE reduces the Bit Error Rate (BER), and that the BER improvement it achieves grows with the severity of the channel time variations.
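For context, the classical CMA baseline that the NSCMA modifies updates the equalizer taps by stochastic gradient descent on the constant-modulus cost; here is a minimal linear-equalizer sketch (the channel, step size, and tap count are placeholders, and the NSCMA's Markovian extension, being the paper's contribution, is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(7)

# QPSK symbols through a toy FIR channel with additive noise (placeholder setup)
symbols = (rng.choice([-1, 1], 2000) + 1j * rng.choice([-1, 1], 2000)) / np.sqrt(2)
channel = np.array([1.0, 0.4 + 0.3j, 0.2])
x = np.convolve(symbols, channel, mode="full")[: len(symbols)]
x += 0.01 * (rng.standard_normal(len(x)) + 1j * rng.standard_normal(len(x)))

L, mu = 11, 1e-3
w = np.zeros(L, dtype=complex)
w[L // 2] = 1.0                        # center-tap initialization
R2 = np.mean(np.abs(symbols) ** 4) / np.mean(np.abs(symbols) ** 2)  # CM dispersion constant

for n in range(L, len(x)):
    u = x[n - L:n][::-1]               # regressor (most recent sample first)
    y = np.dot(w, u)                   # equalizer output
    e = y * (np.abs(y) ** 2 - R2)      # CMA error term
    w -= mu * e * np.conj(u)           # stochastic-gradient tap update
```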
Abstract: VIPs' personal pages on Social Network Sites (SNS) face real threats: violation of privacy and identity theft through fake pages that exploit VIPs' names and pictures to attract victims and spread lies. In this paper, we propose a new secure architecture that improves trust, reduces fake pages, and makes genuine VIP pages on SNS recognizable. The proposed architecture works as a third party added to Facebook to provide a trust service for VIPs' personal pages. Through this mechanism, the architecture verifies the real identity of the applicant through electronic authentication of personal information, by storing this information within the content of the VIP's own website. The significance of the proposed architecture is that it secures and provides trust for VIPs' personal pages. Furthermore, it can help to discover fake pages, protect privacy, reduce identity-theft crimes, and increase the sense of trust and satisfaction of friends and admirers interacting with the SNS.
Abstract: This paper presents an application of level sets to the segmentation of abdominal and thoracic aortic aneurysms in CTA datasets. An important challenge in reliably detecting aortic aneurysms is the need to overcome problems associated with intensity inhomogeneities. Level sets belong to an important class of methods that utilize partial differential equations (PDEs) and have been extensively applied to image segmentation. A kernel function in the level set formulation aids the suppression of noise in the extracted regions of interest and then guides the motion of the evolving contour for the detection of weak boundaries. The speed of curve evolution has been significantly improved, with a resulting decrease in segmentation time compared with previous level set implementations, and the method is shown to cope with intensity inhomogeneities more effectively than other approaches. We apply the Courant-Friedrichs-Lewy (CFL) condition as the stability criterion for our algorithm.
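The CFL condition referred to above bounds the time step by the grid spacing and the maximum propagation speed; a toy sketch of a CFL-limited level-set update follows (the level-set function, speed field, and the simple central-difference update are illustrative assumptions, not the paper's full upwind scheme):

```python
import numpy as np

# Placeholder level-set function and speed field on a uniform grid
phi = np.random.rand(128, 128)
F = np.random.rand(128, 128)        # speed function (e.g., kernel/edge-based)
dx = 1.0

# CFL condition: dt <= dx / max|F| keeps the explicit update stable
dt = 0.9 * dx / np.max(np.abs(F))

# One explicit evolution step: phi_t + F * |grad(phi)| = 0
gy, gx = np.gradient(phi, dx)
phi = phi - dt * F * np.sqrt(gx**2 + gy**2)
```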
Abstract: Developing a stable early warning system (EWS) model capable of giving accurate predictions is a challenging task. This paper introduces the k-nearest neighbour (k-NN) method, which has not previously been applied to predicting currency crises, with the aim of increasing prediction accuracy. The performance of the proposed k-NN depends on the choice of distance measure; in our analysis we consider the Euclidean and Manhattan distances. For comparison, we employ three other methods: logistic regression analysis (logit), back-propagation neural network (NN), and sequential minimal optimization (SMO). The analysis, using datasets from 8 countries with 13 macro-economic indicators for each country, shows that the proposed k-NN method with k = 4 and the Manhattan distance performs better than the other methods.
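A minimal sketch of the winning configuration reported above (k = 4 with Manhattan distance) using scikit-learn; the indicator matrix and crisis labels are random placeholders, not the paper's country data:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Placeholder data: 13 macro-economic indicators, binary crisis label
X = rng.standard_normal((400, 13))
y = rng.integers(0, 2, size=400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

knn = KNeighborsClassifier(n_neighbors=4, metric="manhattan")
knn.fit(X_tr, y_tr)
print("accuracy:", knn.score(X_te, y_te))
```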
Abstract: The paper presents a one-dimensional transient mathematical model of compressible thermal multi-component gas mixture flows in pipes. The set of mass, momentum, and enthalpy conservation equations for the gas phase is solved. The thermo-physical properties of the multi-component gas mixture are calculated by solving an Equation of State (EOS) model; the Soave-Redlich-Kwong (SRK) EOS is chosen. Gas mixture viscosity is calculated on the basis of the Lee-Gonzalez-Eakin (LGE) correlation. A numerical analysis of rapid decompression in conventional dry gases is performed using the proposed mathematical model. The model is validated against measured values of the decompression wave speed in dry natural gas mixtures. All predictions show excellent agreement with the experimental data at both high and low pressures. The presented model predicts the decompression in dry natural gas mixtures much better than the GASDECOM and OLGA codes, which are the most frequently used codes in oil and gas pipeline transport service.
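For reference, the SRK equation of state mentioned above is commonly solved for the compressibility factor Z from its cubic form Z³ − Z² + (A − B − B²)Z − AB = 0; a single-component sketch follows (pure-methane constants stand in for the paper's multi-component mixing rules):

```python
import numpy as np

R = 8.314                                 # J/(mol K)
Tc, Pc, omega = 190.56, 4.599e6, 0.011    # methane critical constants (placeholder component)
T, P = 300.0, 5e6                         # state point of interest

# SRK parameters
a = 0.42748 * R**2 * Tc**2 / Pc
b = 0.08664 * R * Tc / Pc
m = 0.480 + 1.574 * omega - 0.176 * omega**2
alpha = (1 + m * (1 - np.sqrt(T / Tc)))**2

A = a * alpha * P / (R * T)**2
B = b * P / (R * T)

# Cubic in the compressibility factor: Z^3 - Z^2 + (A - B - B^2) Z - A B = 0
roots = np.roots([1.0, -1.0, A - B - B**2, -A * B])
Z = np.max(roots[np.isreal(roots)].real)  # largest real root = vapor phase
print("Z =", Z)
```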
Abstract: Skin color based tracking techniques often assume a static skin color model obtained either from an offline library of images or from the first few frames of a video stream. Such models can perform poorly in the presence of changing lighting or imaging conditions. We propose an adaptive skin color model based on the Gaussian mixture model to handle these changing conditions. Initial estimates of the number and weights of the skin color clusters are obtained using a modified form of the general Expectation-Maximization (EM) algorithm. The model adapts to changes in imaging conditions and refines its parameters dynamically using spatial and temporal constraints. Experimental results show that the method can effectively track hand and face regions.
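A minimal sketch of fitting a Gaussian mixture to skin-color pixels with EM, using scikit-learn's standard EM-based GaussianMixture rather than the authors' modified variant (the color-space data, component count, and likelihood threshold are placeholders):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)

# Placeholder skin pixels in a 2-D chrominance space, shape (N, 2)
skin_pixels = rng.standard_normal((5000, 2)) * 0.05 + [0.45, 0.30]

# Fit K skin-color clusters with EM; weights_/means_/covariances_ form the model
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
gmm.fit(skin_pixels)

# Classify new pixels as skin if their log-likelihood exceeds a threshold
new_pixels = rng.standard_normal((10, 2)) * 0.05 + [0.45, 0.30]
log_lik = gmm.score_samples(new_pixels)
is_skin = log_lik > -2.0                  # hypothetical threshold
print(is_skin)
```

In an adaptive scheme like the one described, the fitted parameters would be re-estimated over time as lighting changes, rather than held fixed after initialization.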