Abstract: This paper presents the use of a newly created network
structure known as a Self-Delaying Dynamic Network (SDN) to
create a high resolution image from a set of time-stepped input
frames. These SDNs are non-recurrent temporal neural networks
which can process time-sampled data. SDNs can store input data
over a lifecycle and feature dynamic, logic-based connections between
layers. Several low resolution images and one high resolution image
of a scene were presented to the SDN during training by a Genetic
Algorithm. The SDN was trained to process the input frames in order
to recreate the high resolution image. The trained SDN was then used
to enhance a number of unseen noisy image sets. The quality of high
resolution images produced by the SDN is compared to that of high
resolution images generated using Bi-Cubic interpolation. The SDN
produced images are superior in several ways to the images produced
using Bi-Cubic interpolation.
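The bi-cubic baseline this abstract compares against is a standard interpolation method; a minimal sketch (not the authors' SDN, and using illustrative data rather than the paper's frames) with scipy's cubic spline zoom:

```python
import numpy as np
from scipy.ndimage import zoom

# Hypothetical low-resolution frame; values are illustrative, not the paper's data.
low_res = np.arange(16, dtype=float).reshape(4, 4)

# Upscale 2x with cubic (order-3) spline interpolation, the kind of
# bi-cubic baseline the SDN results are compared against.
high_res = zoom(low_res, 2, order=3)

print(high_res.shape)
```

In practice the interpolated image and the network output would each be compared to the reference high resolution image with a quality metric such as PSNR.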
Abstract: Money laundering has been described by many as the lifeblood of crime and is a major threat to the economic and social well-being of societies. It has been recognized that the banking system has long been the central element of money laundering. This is in part due to the complexity and confidentiality of the banking system itself. It is generally accepted that effective anti-money laundering (AML) measures adopted by banks will make it tougher for criminals to get their "dirty money" into the financial system. In fact, for law enforcement agencies, banks are considered to be an important source of valuable information for the detection of money laundering. However, from the banks' perspective, the main reason for their existence is to make as much profit as possible. Hence their cultural and commercial interests are totally distinct from those of the law enforcement authorities. Undoubtedly, AML laws create a major dilemma for banks as they produce a significant shift in the way banks interact with their customers. Furthermore, the implementation of the laws not only creates significant compliance problems for banks, but also has the potential to adversely affect the operations of banks. As such, it is legitimate to ask whether these laws are effective in preventing money launderers from using banks, or whether they simply put an unreasonable burden on banks and their customers. This paper attempts to address these issues and analyze them against the background of the Malaysian AML laws. It must be said that effective coordination between the AML regulator and the banking industry is vital to minimize the problems faced by banks and thereby to ensure effective implementation of the laws in combating money laundering.
Abstract: A comparison of two approaches for the simulation of
the dynamic behaviour of a permanent magnet linear actuator is
presented. These are a fully coupled model, in which the electromagnetic
field, electric circuit and mechanical motion problems are solved
simultaneously, and a decoupled model, in which a set of static
magnetic field analyses is first carried out and then the electric circuit and
mechanical motion equations are solved employing bi-cubic spline
approximations of the field analysis results. The results show that the
proposed decoupled model is of satisfactory accuracy and gives more
flexibility when the actuator response is required to be estimated for
different external conditions, e.g. external circuit parameters or
mechanical loads.
Abstract: Multi-energy systems will enhance system
reliability and power quality. This paper presents an integrated
approach for the design and operation of distributed energy resources
(DER) systems, based on energy hub modeling. A multi-objective
optimization model is developed that takes an integrated view of the
electricity and natural gas networks to analyze the optimal design and
operating conditions of DER systems, considering two conflicting
objectives: minimization of total cost and minimization of
environmental impact, assessed in terms of CO2
emissions. The mathematical model considers the energy demands of the
site, local climate data, and utility tariff structure, as well as technical
and financial characteristics of the candidate DER technologies. To
meet the energy demands, energy systems including photovoltaic and
co-generation systems, a boiler, and the central power grid are considered. As
an illustrative example, a hotel in Iran is used to demonstrate potential
applications of the proposed method. The results show that
increasing the satisfaction degree of the environmental objective leads to
an increased total cost.
Abstract: A series of microarray experiments produces observations
of differential expression for thousands of genes across multiple
conditions.
Principal component analysis (PCA) has been widely used in
multivariate data analysis to reduce the dimensionality of the data in
order to simplify subsequent analysis and allow for summarization of
the data in a parsimonious manner. PCA, which can be implemented
via a singular value decomposition (SVD), is useful for the analysis of
microarray data.
For the application of PCA using SVD we use the DNA microarray
data for the small round blue cell tumors (SRBCT) of childhood
by Khan et al. (2001). To decide the number of components that
account for a sufficient amount of information, we draw a scree plot.
The biplot, a graphic display associated with PCA, reveals important
features that exhibit the relationships between variables and also the
relationships of variables with observations.
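The PCA-via-SVD pipeline described above can be sketched in a few lines of numpy; the matrix here is a random stand-in, not the SRBCT data of Khan et al. (2001):

```python
import numpy as np

# Illustrative expression-like matrix (rows: samples, cols: genes);
# a random stand-in for real microarray data.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 50))

# Center the columns, then compute PCA via SVD: X_c = U S V^T.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Fraction of variance explained per component -- the quantities
# a scree plot displays when choosing how many components to keep.
var_explained = S**2 / np.sum(S**2)

# Principal component scores (projections of samples onto components),
# one ingredient of the biplot display.
scores = U * S
```

The scree plot is then just `var_explained` against component index; a biplot overlays `scores` with the variable loadings in the rows of `Vt`.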
Abstract: We present a novel scheme to evaluate sinusoidal functions with low complexity and high precision using cubic spline interpolation. To this end, two different approaches are proposed to find the interpolating polynomial of sin(x) within the range [-π, π]. The first uses only a single data point, while the other uses two, to keep the realization cost as low as possible. An approximation error optimization technique for cubic spline interpolation is introduced next and is shown to increase the interpolator accuracy without increasing the complexity of the associated hardware. The architectures for the proposed approaches are also developed, and exhibit flexibility of implementation with low power requirements.
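A generic version of the interpolation underlying this scheme can be checked with scipy; the knot count is illustrative and this is not the paper's fixed-point architecture:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Knots over [-pi, pi]; the knot count is an assumption for illustration.
x_knots = np.linspace(-np.pi, np.pi, 17)
spline = CubicSpline(x_knots, np.sin(x_knots))

# Evaluate on a fine grid and measure the worst-case approximation error.
x = np.linspace(-np.pi, np.pi, 1001)
max_err = np.max(np.abs(spline(x) - np.sin(x)))
print(max_err)
```

The hardware approaches described in the abstract effectively trade knot count and coefficient precision against this maximum error.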
Abstract: The crystalline quality of the AlGaN/GaN high electron mobility transistor (HEMT) structure grown on a 200 mm silicon substrate has been investigated using UV-visible micro-Raman scattering and photoluminescence (PL). The visible Raman scattering probes the whole nitride stack with the Si substrate and shows the presence of a small component of residual in-plane stress in the thick GaN buffer resulting from wafer bowing, while the UV micro-Raman indicates a tensile interfacial stress induced at the top GaN/AlGaN/AlN layers. PL shows a GaN channel of good crystal quality, in which the yellow band intensity is very low compared to that of the near-band-edge transition. The uniformity of this sample is shown by measurements at several points across the epiwafer.
Abstract: In this empirical research, we examine how marketing managers evaluate their firms' performance and decide to innovate. They use several standards, namely the firm's past performance, the firm's target performance, competitor performance, and the average performance of the industry, to compare and evaluate the firm's performance. It is hypothesized that marketing managers and owners of the firm compare the firm's current performance with these four standards simultaneously to decide when to innovate in any aspect of the firm, whether management style or products. The relationship between the comparison of the firm's performance with these standards and innovation is examined in a single regression model. The results of the regression analysis are discussed and some recommendations are made for future studies and practitioners.
Abstract: Embedded systems need to respect stringent real
time constraints. Various hardware components included in such
systems such as cache memories exhibit variability and therefore
affect execution time. Indeed, a cache memory access from an
embedded microprocessor might result in a cache hit, where the
data is available, or a cache miss, where the data needs to be fetched
with an additional delay from an external memory. It is therefore
highly desirable to predict future memory accesses during
execution in order to appropriately prefetch data without incurring
delays. In this paper, we evaluate the potential of several artificial
neural networks for the prediction of instruction memory
addresses. Neural networks have the potential to tackle the nonlinear
behavior observed in memory accesses during program
execution, and their numerous demonstrated hardware
implementations favor this choice over traditional forecasting
techniques for inclusion in embedded systems. However,
embedded applications execute millions of instructions and
therefore produce millions of addresses to be predicted. This very
challenging problem of neural-network-based prediction of large
time series is approached in this paper by evaluating various neural
network architectures based on the recurrent neural network
paradigm, with pre-processing based on the Self-Organizing Map
(SOM) classification technique.
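The SOM pre-processing stage named above can be illustrated with a toy 1-D map over scalar inputs; the map size, learning rate, neighborhood width, and data are all illustrative stand-ins, not the paper's configuration:

```python
import numpy as np

# Minimal 1-D Self-Organizing Map over scalar inputs -- a toy stand-in
# for SOM pre-processing of (normalized) memory addresses.
rng = np.random.default_rng(1)
data = rng.uniform(0, 1, 500)          # stand-in for normalized addresses
weights = rng.uniform(0, 1, 10)        # 10 map units
lr, sigma = 0.5, 2.0                   # learning rate and neighborhood width

for t, x in enumerate(data):
    bmu = np.argmin(np.abs(weights - x))        # best-matching unit
    dist = np.abs(np.arange(10) - bmu)          # grid distance to the BMU
    h = np.exp(-dist**2 / (2 * sigma**2))       # neighborhood function
    decay = 1.0 - t / len(data)                 # linear decay of updates
    weights += lr * decay * h * (x - weights)   # pull units toward x

# Each input is quantized to its BMU index -- the discrete class a
# downstream recurrent predictor would consume.
classes = np.array([np.argmin(np.abs(weights - x)) for x in data])
```

The point of the classification step is exactly this quantization: the recurrent network then predicts over a small discrete alphabet of classes rather than millions of raw addresses.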
Abstract: The objectives of this research are to produce
prototype coconut-oil-based solvent offset printing inks and to
analyze the basic quality of printing work produced with them, by
using coconut oil to produce varnish and then using that varnish to
produce black offset printing inks. Qualities of the printing work,
i.e. CIELAB value, density value, and dot gain value, were then
analyzed for prints made with the coconut-oil-based solvent offset
printing inks on 130-gram gloss-coated woodfree paper. The results
indicated that a suitable varnish formulation uses 51% coconut oil,
36% phenolic resin, and 14% solvent oil, while the black offset ink
results showed that a suitable printing ink formula uses the varnish
mixed with 20% coconut oil. For the printed work, the results were
as follows: the CIELAB value of the black offset printing ink is
L* = 31.90, a* = 0.27, and b* = 1.86; the density value is 1.27; and
the dot gain value is high in the mid-tone area of the image.
Abstract: Requirements are critical to system validation as they guide all subsequent stages of systems development. Inadequately specified requirements generate systems that require major revisions or cause system failure entirely. Use Cases have become the main vehicle for requirements capture in many current Object Oriented (OO) development methodologies, and a means for developers to communicate with different stakeholders. In this paper we present the results of a laboratory experiment that explored whether different types of use case format are equally effective in facilitating high-knowledge users' understanding. Results showed that providing diagrams along with the textual use case descriptions significantly improved user comprehension of system requirements in both familiar and unfamiliar application domains. However, when comparing groups that received textual descriptions accompanied by diagrams of different levels of detail (simple and detailed), we found no significant difference in performance.
Abstract: Echocardiography imaging is one of the most common diagnostic tests widely used for assessing abnormalities of regional heart ventricle function. The main goal of the image enhancement task in 2D echocardiography (2DE) is to address two major problems: speckle noise and low quality. Therefore, speckle noise reduction is an important pre-processing step used to reduce distortion effects in 2DE image segmentation. In this paper, we present the common filters that are based on some form of low-pass spatial smoothing, such as the Mean, Gaussian, and Median filters. The Laplacian filter was used as a high-pass sharpening filter. A comparative analysis is presented to test the effectiveness of these filters after being applied to original 2DE images of 4-chamber and 2-chamber views. Three statistical quality measures, root mean square error (RMSE), peak signal-to-noise ratio (PSNR) and signal-to-noise ratio (SNR), are used to evaluate filter performance quantitatively on the output enhanced image.
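The three quantitative measures named above have standard closed forms; a minimal numpy sketch (the images here are synthetic 8-bit stand-ins, not 2DE frames):

```python
import numpy as np

def rmse(ref, img):
    # Root mean square error between a reference and a processed image.
    return np.sqrt(np.mean((ref.astype(float) - img.astype(float)) ** 2))

def psnr(ref, img, peak=255.0):
    # Peak signal-to-noise ratio in dB, with peak = 255 for 8-bit images.
    return 20 * np.log10(peak / rmse(ref, img))

def snr(ref, img):
    # Signal power over error power, in dB.
    err = ref.astype(float) - img.astype(float)
    return 10 * np.log10(np.sum(ref.astype(float) ** 2) / np.sum(err ** 2))

# Toy example: a synthetic "clean" image and a noisy copy (illustrative only).
rng = np.random.default_rng(0)
clean = rng.integers(0, 256, (64, 64)).astype(float)
noisy = np.clip(clean + rng.normal(0, 10, (64, 64)), 0, 255)

print(round(psnr(clean, noisy), 1))
```

In a filter comparison like the one described, each filtered image would be scored against the reference with these three functions.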
Abstract: The purpose of this paper is to improve the electromagnetic characteristics of a grounding grid by applying conductive concrete. The conductive concrete in this study is applied under an extra-high-voltage (EHV, 345 kV) system located in a high-tech industrial park or science park. Replacing the soil surrounding the grounding grid with conductive concrete can reduce equipment damage and bodily injury caused by switching surges. Two cases on the EHV distribution system in a high-tech industrial park are presented to analyze four soil material styles. By comparing these soil material styles, the study results show that conductive concrete can effectively reduce the negative damage caused by electromagnetic transients. Adopting a grounding grid located 1.0 m underground, with conductive concrete extending from the ground surface to 1.25 m underground, markedly improves the electromagnetic characteristics and thereby advances protective efficiency.
Abstract: The National Agricultural Biotechnology Information
Center (NABIC) plays a leading role in the biotechnology information
database for agricultural plants in Korea. Since 2002, we have
concentrated on functional genomics of major crops, building an
integrated biotechnology database for agro-biotech information that
focuses on bioinformatics of major agricultural resources such as rice,
Chinese cabbage, and microorganisms. NABIC's
integrated biotechnology database provides useful information
through a user-friendly web interface that allows analysis of genome
infrastructure, multiple plants, microbial resources, and living
modified organisms.
Abstract: In modern human-computer interaction (HCI)
systems, emotion recognition is becoming an imperative characteristic.
The quest for effective and reliable emotion recognition in HCI has
resulted in a need for better face detection, feature extraction and
classification. In this paper we present results of feature space analysis
after briefly explaining our fully automatic vision based emotion
recognition method. We demonstrate the compactness of the feature
space and show how the 2d/3d based method achieves superior features
for the purpose of emotion classification. We also show that
feature normalization creates a largely person-independent feature
space. As a consequence, the classifier architecture has
only a minor influence on the classification result. This is particularly
elucidated with the help of confusion matrices. For this purpose
advanced classification algorithms, such as Support Vector Machines
and Artificial Neural Networks are employed, as well as the simple k-
Nearest Neighbor classifier.
Abstract: Falling has been one of the major concerns and threats
to the independence of the elderly in their daily lives. With the
worldwide significant growth of the aging population, it is essential
to have a promising solution of fall detection which is able to operate
at high accuracy in real-time and supports large scale implementation
using multiple cameras. The Field Programmable Gate Array (FPGA) is a
highly promising tool to be used as a hardware accelerator in many
emerging embedded vision-based systems. Thus, the main
objective of this paper is to present an FPGA-based solution for visual
fall detection that meets stringent real-time requirements with
high accuracy. A hardware architecture for visual fall
detection which utilizes pixel locality to reduce memory accesses
is proposed. By exploiting the parallel and pipelined architecture of
the FPGA, our hardware implementation of visual fall detection
achieves a performance of 60 fps for a series of
video analytics functions at VGA resolution (640x480). The results
of this work show that FPGAs have great potential and impact in
enabling large-scale vision systems in the future healthcare industry
due to their flexibility and scalability.
Abstract: With the development of the Internet and database application techniques, it has become common for authorized users to query and access many databases remotely, which raises the problem of how to protect the copyright of relational databases. This paper first briefly introduces the cloud model, including cloud generators and similar clouds. Then, combining the properties of the cloud with the idea of digital watermarking and the properties of relational databases, a method of protecting relational database copyright with a cloud watermark is proposed. The corresponding watermark algorithms, namely the cloud watermark embedding algorithm and the detection algorithm, are presented. Experiments are then run and the results analyzed to validate the correctness and feasibility of the watermarking scheme. Finally, the prospects of relational database watermarking and its research directions are discussed.
Abstract: Research in quantum computation is looking for the consequences of having information encoding, processing and communication exploit the laws of quantum physics, i.e. the laws which govern the ultimate knowledge that we have, today, of the foreign world of elementary particles, as described by quantum mechanics. This paper starts with a short survey of the principles which underlie quantum computing, and of some of the major breakthroughs brought by the first ten to fifteen years of research in this domain; quantum algorithms and quantum teleportation are very briefly presented. The next sections are devoted to one among the many directions of current research in the quantum computation paradigm, namely quantum programming languages and their semantics. A few other hot topics and open problems in quantum information processing and communication are mentioned in a few words in the concluding remarks, the most difficult of them being the physical implementation of a quantum computer. The interested reader will find a list of useful references at the end of the paper.
Abstract: A review of the literature found that domestic
violence and child maltreatment co-occur in many families. This
study attempts to emphasize the factors relating to
intra-family relationships (an order point of view) in violence against
children. For this purpose, a survey of a sample of
200 students from the governmental guidance schools of the city of
Gilanegharb in Iran was conducted. Violence against
children (VAC) was measured using the CTS scale.
The results showed that children have experienced violence more
than once during the last year, and that the degree of order in the family is high.
The explanatory results indicated that the order variables in the family,
including collective thinking, empathy, and communal co-circumstance,
have significant effects on VAC.
Abstract: In this manuscript, a wavelet-based blind
watermarking scheme is proposed as a means to secure the
authenticity of a fingerprint. The information used for
identification or verification of a fingerprint mainly lies in its
minutiae. By robustly watermarking the minutiae into the fingerprint
image itself, the useful information can be extracted accurately even
if the fingerprint is severely degraded. The minutiae are converted
into a binary watermark, and embedding this watermark in the detail
regions increases the robustness of the watermarking with little to no
additional impact on image quality. It has been experimentally shown
that when the minutiae are embedded into the wavelet detail coefficients
of a fingerprint image in spread-spectrum fashion using a
pseudorandom sequence, the robustness responds proportionally,
while perceptual invisibility responds inversely proportionally, to the
amplification factor K. The DWT-based
technique has been found to be very robust against noise,
geometrical distortions, filtering and JPEG compression attacks, and is
also found to give remarkably better performance than the DCT-based
technique in terms of correlation coefficient and number of erroneous
minutiae.
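The spread-spectrum embedding in wavelet detail coefficients can be sketched with a hand-rolled one-level Haar DWT; the image, PN sequence and gain K are illustrative, and a single bit stands in for the binary minutiae watermark of the paper:

```python
import numpy as np

def haar2d(img):
    # One-level 2-D Haar DWT: approximation (LL) and detail (LH, HL, HH) subbands.
    a = (img[0::2, :] + img[1::2, :]) / 2
    d = (img[0::2, :] - img[1::2, :]) / 2
    LL = (a[:, 0::2] + a[:, 1::2]) / 2
    LH = (a[:, 0::2] - a[:, 1::2]) / 2
    HL = (d[:, 0::2] + d[:, 1::2]) / 2
    HH = (d[:, 0::2] - d[:, 1::2]) / 2
    return LL, LH, HL, HH

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64)).astype(float)  # stand-in, not a real fingerprint
_, _, _, HH = haar2d(img)

pn = rng.choice([-1.0, 1.0], HH.shape)  # pseudorandom spreading sequence
K = 5.0                                 # amplification factor (illustrative)
bit = 1                                 # one watermark bit (minutiae-derived in the paper)
HH_w = HH + K * bit * pn                # spread-spectrum embedding in detail coefficients

# Blind detection: correlate the watermarked subband with the same PN
# sequence; the host term averages toward zero, leaving the bit's sign.
corr = np.mean(HH_w * pn)
detected = 1 if corr > 0 else -1
```

Detection works because the host coefficients are roughly uncorrelated with the PN sequence, so the correlation is dominated by the K·bit term; raising K strengthens detection at the cost of invisibility, matching the trade-off reported above.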