Abstract: A new blind gray-level watermarking scheme is described. In the proposed method, the host image is first divided into 4×4 non-overlapping blocks. For each block, the first two AC coefficients of its Hadamard transform are then estimated using the DC coefficients of its neighboring blocks. A gray-level watermark is then added to the estimated values. Since embedding the watermark does not change the DC coefficients, the watermark can be extracted by estimating the AC coefficients and comparing them with their actual values. Several experiments were conducted, and the results demonstrate the robustness of the proposed algorithm.
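The block transform at the core of this scheme can be sketched as follows. This is a minimal illustration of the 4×4 Hadamard transform only; the neighbor-based AC estimator and the embedding rule are specific to the paper and are not reproduced here.

```python
import numpy as np

# 4x4 Hadamard matrix (entries +/-1), built from the order-2 matrix.
H2 = np.array([[1, 1], [1, -1]])
H4 = np.kron(H2, H2)

def hadamard_4x4(block):
    """2-D Hadamard transform of a 4x4 block, normalized by 1/4.
    coeffs[0, 0] is the DC coefficient; the others are AC coefficients."""
    return H4 @ block @ H4 / 4.0

block = np.full((4, 4), 10.0)   # a constant 4x4 block
coeffs = hadamard_4x4(block)
# For a constant block, all energy sits in the DC coefficient.
assert coeffs[0, 0] == 40.0
assert np.allclose(coeffs.flatten()[1:], 0.0)
```

Because the watermark perturbs only AC coefficients, the DC values that the extractor uses for estimation survive embedding unchanged, which is what makes blind extraction possible.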
Abstract: Recently, there have been significant improvements in the
capabilities of mobile devices; nevertheless, rendering large
terrains remains tedious because of the resource constraints of
these devices. This paper focuses on the implementation of terrain
rendering on a mobile device in order to observe the issues and
current constraints encountered. Experiments are performed using
two datasets, with results based on rendering speed and appearance
to ascertain both the issues and the constraints. The results show
a drop in frame-rate performance as the number of triangles
increases. Since the resolutions of a computer and a mobile device
differ, the terrain surface on the mobile device looks less
realistic than on a computer. Thus, more attention to the
development of terrain rendering on mobile devices is required. The
problems highlighted in this paper will be the focus of future
research and will be of great importance for 3D visualization on
mobile devices.
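The relationship between terrain size and triangle load noted above can be made concrete with the triangle budget of a regular-grid terrain; the grid sizes below are hypothetical, not the paper's datasets.

```python
# A regular grid of n x n height samples is rendered as 2*(n-1)^2
# triangles, so each halving of grid resolution (one coarser level of
# detail) cuts the triangle load roughly fourfold -- the usual lever
# for keeping frame rate acceptable on a resource-constrained device.
def grid_triangles(n):
    return 2 * (n - 1) ** 2

for n in (1025, 513, 257, 129):   # hypothetical LOD levels
    print(n, grid_triangles(n))

assert grid_triangles(257) == 2 * 256 ** 2
assert grid_triangles(513) // grid_triangles(257) == 4  # 4x per LOD step
```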
Abstract: In this research, a biofiltration process for removing
ammonia gas from a gas stream using agricultural-residue biofilter
media is studied. The experiments were conducted in a
laboratory-scale biofilter. The biofilter media were mixtures of
manure fertilizer and bagasse at various ratios, i.e., 1:3, 1:5 and
1:7. The experiments were performed over a period of 40 days. The
empty bed retention time (EBRT) was 78 s. The moisture content of
the biofilter media was maintained at 45-60% using water. The
results showed that the agricultural residues (manure fertilizer
and bagasse) are suitable as biofilter media for ammonia gas
removal in a biofiltration process. The maximum ammonia gas removal
efficiency, 89.93%, was observed at the 1:5 manure fertilizer to
bagasse ratio. Biofiltration is more effective at low ammonia gas
concentrations. In addition, the mixture ratio of the biofilter
media is not a significant factor in biofiltration operation,
whereas the most significant factor is the inlet ammonia gas
concentration.
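The two quantities reported above, removal efficiency and empty bed retention time, follow standard definitions; a minimal sketch with made-up inlet/outlet values (only the 89.93% optimum comes from the study):

```python
def removal_efficiency(c_in, c_out):
    """Removal efficiency in percent from inlet/outlet concentrations."""
    return (c_in - c_out) / c_in * 100.0

def ebrt(bed_volume_m3, flow_m3_per_s):
    """Empty bed retention time in seconds: bed volume / gas flow rate."""
    return bed_volume_m3 / flow_m3_per_s

# Illustrative numbers only; an outlet of 10.07 at an inlet of 100
# reproduces the paper's reported optimum efficiency of 89.93%.
assert abs(removal_efficiency(100.0, 10.07) - 89.93) < 1e-9
# Hypothetical bed volume and flow chosen to give the paper's 78 s EBRT.
assert abs(ebrt(0.0078, 0.0001) - 78.0) < 1e-6
```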
Abstract: In this paper, we present a new method for solving quadratic programming problems that are not strictly convex. The constraints of the problem are linear equalities and inequalities with bounded variables. The suggested method combines active-set strategies with support methods. The algorithm of the method and numerical experiments are presented, and our approach is compared with the active-set method on randomly generated problems.
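The core subproblem that any active-set strategy solves at each iteration is an equality-constrained QP over the currently active constraints; a minimal sketch via the KKT linear system, with an illustrative two-variable problem (the paper's combined active-set/support machinery is not reproduced here):

```python
import numpy as np

def eq_qp(Q, c, A, b):
    """Solve min 0.5 x^T Q x + c^T x  s.t.  A x = b via the KKT system
    [[Q, A^T], [A, 0]] [x; lam] = [-c; b]."""
    n, m = Q.shape[0], A.shape[0]
    K = np.block([[Q, A.T], [A, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([-c, b]))
    return sol[:n], sol[n:]        # primal x, multipliers lam

Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-2.0, -5.0])
A = np.array([[1.0, 1.0]])         # one active constraint: x1 + x2 = 1
b = np.array([1.0])
x, lam = eq_qp(Q, c, A, b)
assert np.allclose(A @ x, b)                     # feasibility
assert np.allclose(Q @ x + c + A.T @ lam, 0.0)   # stationarity
```

The sign of the recovered multiplier is what an active-set method inspects to decide whether a constraint should leave the working set.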
Abstract: Response surface methodology with a Box–Behnken (BB) design-of-experiments approach has been utilized to study the mechanism of interface slip damping in layered and jointed tack-welded beams with varying surface roughness. The design uses the initial amplitude of excitation, the tack length and the surface roughness at the interfaces to develop a model for the logarithmic damping decrement of the layered and jointed welded structures. Statistically designed experiments have been performed to estimate the coefficients in the mathematical model, predict the response, and check the adequacy of the model. Comparison of predicted and experimental response values outside the design conditions has shown good correspondence, implying that the empirical model derived from the response surface approach can be used effectively to describe the mechanism of interface slip damping in layered and jointed tack-welded structures.
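The empirical model behind a response-surface study of this kind is a second-order polynomial fitted by least squares; a minimal sketch on synthetic, noiseless data in two factors (the coefficients are illustrative, not the damping model of the paper):

```python
import numpy as np

# Fit z = b0 + b1*x + b2*y + b3*x^2 + b4*y^2 + b5*x*y by least squares,
# the second-order model form used in response surface methodology.
def fit_quadratic_surface(x, y, z):
    X = np.column_stack([np.ones_like(x), x, y, x**2, y**2, x*y])
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    return beta

rng = np.random.default_rng(0)
x, y = rng.uniform(-1, 1, 50), rng.uniform(-1, 1, 50)
z = 1.0 + 2.0*x - 3.0*y + 0.5*x**2 + 0.25*y**2 - x*y  # known surface
beta = fit_quadratic_surface(x, y, z)
# On noiseless data the true coefficients are recovered exactly.
assert np.allclose(beta, [1.0, 2.0, -3.0, 0.5, 0.25, -1.0])
```

With real measurements the fit is inexact, and the adequacy checks mentioned in the abstract (ANOVA, residual analysis) decide whether the quadratic form suffices.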
Abstract: The MFCAV Riemann solver is used in practice in many Lagrangian and ALE methods due to its merit of sharp shock profiles and rarefaction corners, though very often with numerical oscillations. By viewing it as a modification of the WWAM Riemann solver, we apply the MFCAV Riemann solver to the Lagrangian method recently developed by P.-H. Maire et al. The numerical experiments show that the application is successful in that the shock profiles and rarefaction corners are sharpened compared with results obtained using other Riemann solvers. Though there are still numerical oscillations, they are within the range observed when MFCAV is applied in other Lagrangian methods.
Abstract: The Institute of Product Development, as a member of the
Collaborative Research Centre 499 "Design, Production and Quality
Assurance of Molded Micro Components made of Metallic and Ceramic
Materials", deals with the development, design and dimensioning of
micro components and systems. Because of technological restrictions
in the miniaturization of conventional manufacturing techniques,
shape and material deviations cannot be scaled down in the same
proportion as the micro parts, leaving components with relatively
wide tolerance fields. Systems that include such components must be
designed with this particularity in mind, often requiring large
clearances. In the end, the output of such systems is variable and
prone to dynamic instability. To save production time and
resources, any study of these effects should take place early in
the product development process and be based on computer simulation
to avoid costly prototypes. A suitable method is proposed here and
applied, by way of example, to a micro-technology demonstrator
developed by the CRC 499. It consists of a one-stage planetary gear
train in a sun-planet-ring configuration, with input through the
sun gear and output through the carrier. The simulation procedure
relies on ordinary Multi-Body Simulation methods and subsequently
adds other techniques to further investigate details of the
system's behavior and to predict its response. The selection of the
relevant parameters and output functions followed the engineering
standards for regular-sized gear trains. The first step, performed
through a whole-mechanism Sensitivity Analysis (SA), is to quantify
the variability and to reveal the most critical points of the
system. Owing to the lack of previous knowledge about the system's
behavior, different DOE methods involving small and large numbers
of experiments were selected to perform the SA. In this particular
case the parameter space can be divided into two well-defined
groups, one containing the gears' profile information and the other
the components' spatial locations. This has been exploited to
explore the different DOE techniques more promptly. A reduced set
of parameters is derived for further investigation and to feed the
final optimization process, whether as optimization parameters or
as an external perturbation collective. The 10 most relevant
perturbation factors and 4 to 6 prospective variable parameters are
considered in a new, simplified model. All of the parameters are
affected by the aforementioned production variability. The
objective functions of interest are based on variability measures
of scalar outputs, so the problem becomes an optimization under
robustness and reliability constraints. The study represents an
initial step on the development path of a method to design and
optimize complex micro mechanisms composed of widely toleranced
elements, accounting for the robustness and reliability of the
systems' output.
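The sensitivity-analysis step above rests on standard DOE machinery; a minimal sketch of main-effect estimation from a two-level full factorial design, with a toy three-parameter response standing in for the gear-train model:

```python
import numpy as np
from itertools import product

# Hypothetical system output in three coded factors (not the CRC 499
# gear-train model): linear terms plus one interaction.
def response(a, b, c):
    return 3.0*a - 1.5*b + 0.2*c + 0.5*a*b

# Two-level full factorial design: all 2^3 corner points in coded units.
levels = [-1.0, 1.0]
runs = np.array(list(product(levels, repeat=3)))
y = np.array([response(*r) for r in runs])

# Main effect of factor i: mean response at its high level minus the
# mean at its low level; interactions average out of these estimates.
effects = [y[runs[:, i] > 0].mean() - y[runs[:, i] < 0].mean()
           for i in range(3)]
assert np.allclose(effects, [6.0, -3.0, 0.4])  # twice each linear slope
```

Ranking factors by the magnitude of such effects is one common way to arrive at the reduced parameter set the abstract mentions.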
Abstract: In this paper we propose a novel method to acquire the
ROI (region of interest) of an unsupervised, touch-less palmprint
captured from a web camera in real time. We use the Viola-Jones
approach and a skin model to get the target area in real time. Then
an innovative coarse-to-fine approach to detect the key points on
the hand is described. A new algorithm is used to find the
candidate key points coarsely and quickly. In the fine stage, we
verify the hand key points with the shape context descriptor. To
make the system more comfortable for the user, it can process hand
images in different poses, even when the hand is closed.
Experiments show promising results using the proposed method under
various conditions.
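The skin-model stage can be sketched with a fixed-box chroma classifier in YCbCr space. The thresholds below are a commonly used range for skin chroma, not necessarily the exact model of the paper, and the pixel values are made up:

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """ITU-R BT.601 full-range RGB -> YCbCr conversion."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.299*r + 0.587*g + 0.114*b
    cb = 128.0 - 0.168736*r - 0.331264*g + 0.5*b
    cr = 128.0 + 0.5*r - 0.418688*g - 0.081312*b
    return y, cb, cr

def skin_mask(rgb):
    """Label a pixel as skin if its chroma falls in a fixed box
    (a widely used range; assumed here, not taken from the paper)."""
    _, cb, cr = rgb_to_ycbcr(rgb.astype(float))
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)

img = np.array([[[200, 140, 110],    # skin-like pixel
                 [ 20, 200,  40]]])  # green pixel
mask = skin_mask(img)
assert mask[0, 0] and not mask[0, 1]
```

In a full pipeline this mask would be intersected with the Viola-Jones detection window to isolate the hand region before key-point search.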
Abstract: Quantitative methods of economic decision-making, as the
methodological base of so-called operational research, represent an
important set of tools for managing complex economic systems, both
at the microeconomic level and on the macroeconomic scale.
Mathematical models of controlled and controlling processes allow,
by means of artificial experiments, obtaining information for
optimal or near-optimal managerial decision-making. The
quantitative methods of economic decision-making usually include a
methodology known as structural analysis - an analysis of
interdisciplinary production-consumption relations.
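Structural analysis of production-consumption relations rests on the Leontief input-output model; a minimal sketch with illustrative coefficients:

```python
import numpy as np

# Leontief model: given the matrix A of technical coefficients
# (input of sector i needed per unit output of sector j), the gross
# output x required to meet final demand d solves (I - A) x = d.
A = np.array([[0.2, 0.3],
              [0.1, 0.4]])       # illustrative two-sector economy
d = np.array([100.0, 50.0])      # final demand per sector

x = np.linalg.solve(np.eye(2) - A, d)
assert np.allclose((np.eye(2) - A) @ x, d)  # x indeed meets demand
assert np.all(x > d)  # gross output exceeds final demand (intermediate use)
```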
Abstract: Heart sound is an acoustic signal, and many techniques
used nowadays for human recognition tasks borrow from speech
recognition. One popular choice for feature extraction from
acoustic signals is the Mel Frequency Cepstral Coefficients (MFCC),
which map the signal onto a non-linear Mel scale that mimics human
hearing. However, the Mel scale is almost linear in the frequency
region of heart sounds and should therefore produce results similar
to the standard cepstral coefficients (CC). In this paper, MFCC is
investigated to see if it produces superior results for a PCG-based
human identification system compared to CC. Results show that the
MFCC system is still superior to CC despite the linear filter banks
in the lower frequency range, giving up to 95% correct recognition
for MFCC and 90% for CC. Further experiments show that the high
recognition rate is due to the implementation of the filter banks
and not to Mel scaling.
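The near-linearity claim can be checked directly from the standard Mel-scale formula; a minimal sketch (the frequencies are chosen to represent the heart-sound range, not taken from the paper):

```python
import math

# Standard Mel scale: mel(f) = 2595 * log10(1 + f / 700). Below a few
# hundred Hz -- the range of heart sounds -- the curve is nearly
# linear, which is why MFCC and plain cepstral coefficients were
# expected to behave alike.
def hz_to_mel(f):
    return 2595.0 * math.log10(1.0 + f / 700.0)

# Compare the local slope of the scale at 50 Hz and at 150 Hz.
slope_low  = hz_to_mel(51)  - hz_to_mel(50)
slope_high = hz_to_mel(151) - hz_to_mel(150)
assert abs(slope_low - slope_high) / slope_low < 0.15  # within ~12%
```

At speech frequencies (several kHz) the same comparison gives a much larger slope change, which is where Mel warping actually matters.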
Abstract: This paper proposes an architecture for a dynamically
reconfigurable arithmetic circuit. Dynamic reconfiguration is a
technique for realizing required functions by changing the hardware
construction during operation. The proposed circuit is based on a
complex-number multiply-accumulation circuit, which is used
frequently in the field of digital signal processing. In addition,
the proposed circuit performs real-number double-precision
arithmetic operations. The data formats are single- and
double-precision floating-point numbers based on IEEE 754. The
proposed circuit is designed in VHDL, and its correct operation is
verified through simulations and experiments.
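The complex multiply-accumulate operation at the heart of such a circuit can be sketched in software; the decomposition into four real multiplies and adds mirrors what the datapath must implement (the operand values are illustrative):

```python
# Complex MAC: acc += a * b, with the complex product expanded as
# (ar + i*ai)(br + i*bi) = (ar*br - ai*bi) + i*(ar*bi + ai*br),
# i.e. four real multiplications and two additions per product.
def cmac(acc, a, b):
    ar, ai = a
    br, bi = b
    return (acc[0] + ar*br - ai*bi, acc[1] + ar*bi + ai*br)

acc = (0.0, 0.0)
for a, b in [((1.0, 2.0), (3.0, 4.0)), ((0.5, -1.0), (2.0, 2.0))]:
    acc = cmac(acc, a, b)

# Cross-check against Python's built-in complex arithmetic.
ref = (1 + 2j) * (3 + 4j) + (0.5 - 1j) * (2 + 2j)
assert abs(complex(*acc) - ref) < 1e-12
```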
Abstract: Simultaneous Saccharification and Fermentation (SSF) of sugarcane bagasse by cellulase and Pachysolen tannophilus MTCC 1077 was investigated in the present study. Important process variables for ethanol production from pretreated bagasse were optimized using Response Surface Methodology (RSM) based on central composite design (CCD) experiments. A 2³ five-level CCD with central and axial points was used to develop a statistical model for the optimization of process variables such as incubation temperature (25–45 °C) X1, pH (5.0–7.0) X2 and fermentation time (24–120 h) X3. Data obtained from RSM on ethanol production were subjected to analysis of variance (ANOVA) and fitted with a second-order polynomial equation, and contour plots were used to study the interactions among the three relevant variables of the fermentation process. The fermentation experiments were carried out in an online-monitored modular fermenter of 2 L capacity. The maximum response for ethanol production was reached when applying the optimum values for temperature (32 °C), pH (5.6) and fermentation time (110 h). A maximum ethanol concentration of 3.36 g/l was obtained from 50 g/l pretreated sugarcane bagasse at the optimized process conditions in aerobic batch fermentation. Kinetic models such as the Monod model, the modified logistic model, the modified logistic incorporated Luedeking–Piret model and the modified logistic incorporated modified Luedeking–Piret model were evaluated and their constants predicted.
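Of the kinetic models evaluated, the logistic family has a simple closed form; a sketch with illustrative parameter values (not the constants fitted in the study):

```python
import math

# Logistic growth model of the kind used for biomass in SSF kinetics:
# X(t) = Xmax / (1 + exp(-k*(t - tc))). Parameter values below are
# hypothetical: Xmax (g/l), rate k (1/h), inflection time tc (h).
def logistic_biomass(t, x_max=5.0, k=0.1, t_c=48.0):
    return x_max / (1.0 + math.exp(-k * (t - t_c)))

assert abs(logistic_biomass(48.0) - 2.5) < 1e-12  # half of Xmax at t = tc
assert logistic_biomass(120.0) > 0.99 * 5.0       # plateau at late times
```

The Luedeking-Piret extension then relates product formation rate to both biomass level and its growth rate, which is what couples ethanol concentration to curves of this shape.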
Abstract: The study of the interaction between humans and computers
has been growing during the last few years. This interaction will
be more powerful if computers are able to perceive and respond to
human non-verbal communication such as emotions. In this study, we
present an image-based approach to emotion classification through
lower facial expressions. We employ a set of feature points in the
lower face image according to the particular face model used and
consider their motion across each emotive expression. The vector of
displacements of all feature points is input to an Adaptive Support
Vector Machine (A-SVM) classifier that classifies it into one of
seven basic emotions, namely neutral, angry, disgust, fear, happy,
sad and surprise. The system was tested on the Japanese Female
Facial Expression (JAFFE) dataset of frontal-view facial
expressions [7]. Our experiments on emotion classification through
lower facial expressions demonstrate the robustness of the Adaptive
SVM classifier and verify the high efficiency of our approach.
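The feature construction step described above can be sketched as follows; this shows only how the displacement vector is assembled, with made-up landmark coordinates, and does not reproduce the A-SVM classifier itself:

```python
import numpy as np

# Hypothetical (x, y) positions of three lower-face landmarks in the
# neutral frame and in an emotive frame of the same subject.
neutral = np.array([[10.0, 20.0], [30.0, 20.0], [20.0, 35.0]])
emotive = np.array([[ 9.0, 22.0], [31.0, 22.0], [20.0, 40.0]])

# Stack the per-landmark displacements into one feature vector,
# [dx1, dy1, dx2, dy2, ...] -- the input the classifier receives.
features = (emotive - neutral).flatten()
assert features.shape == (6,)
assert np.allclose(features, [-1.0, 2.0, 1.0, 2.0, 0.0, 5.0])
```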
Abstract: Over the past decades, automatic face recognition has become a highly active research area, mainly due to the countless application possibilities in both the private and the public sector. Numerous algorithms have been proposed in the literature to cope with the problem of face recognition; nevertheless, a group of methods commonly referred to as appearance-based has emerged as the dominant solution to the face recognition problem. Many comparative studies concerned with the performance of appearance-based methods have already been presented in the literature, often with inconclusive and sometimes contradictory results. No consensus has been reached within the scientific community regarding the relative ranking of the efficiency of appearance-based methods for the face recognition task, let alone regarding their susceptibility to appearance changes induced by various environmental factors. To tackle these open issues, this paper assesses the performance of the three dominant appearance-based methods: principal component analysis, linear discriminant analysis and independent component analysis, and compares them on an equal footing (i.e., with the same preprocessing procedure, with parameters optimized for the best possible performance, etc.) in face verification experiments on the publicly available XM2VTS database. In addition to the comparative analysis on the XM2VTS database, ten degraded versions of the database are also employed in the experiments to evaluate the susceptibility of the appearance-based methods to various image degradations which can occur in "real-life" operating conditions. Our experimental results suggest that linear discriminant analysis ensures the most consistent verification rates across the tested databases.
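The baseline of the three compared methods, principal component analysis, can be sketched in a few lines; the data here are random stand-ins for vectorized face images, and the dimensions are arbitrary:

```python
import numpy as np

# PCA ("eigenfaces"): project mean-centred image vectors onto the top
# eigenvectors of the sample covariance, obtained here via the SVD of
# the centred data matrix.
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 64))       # 40 "images", 64 "pixels" each

mean = X.mean(axis=0)
Xc = X - mean                       # centre the data
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
W = Vt[:10]                         # top 10 principal axes
proj = Xc @ W.T                     # 10-D feature vectors per image

assert proj.shape == (40, 10)
assert np.allclose(W @ W.T, np.eye(10))  # axes are orthonormal
```

LDA and ICA differ in the criterion used to choose the projection (class separability and statistical independence, respectively), but all three operate on the same vectorized-appearance representation shown here.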
Abstract: The selection of appropriate requirements for product
releases can make a big difference in a product's success. The
selection of requirements is done with various requirements
prioritization techniques, which are based on pre-defined,
systematic steps for calculating the requirements' relative
weights. Prioritization is complicated by new development settings,
as development shifts from traditional co-located teams to
geographically distributed ones. The stakeholders connected to a
project are distributed all over the world, and this geographic
distribution makes it hard to prioritize requirements, as each
stakeholder has their own perception of and expectations for the
requirements in a software project. This paper discusses the
limitations of the Analytic Hierarchy Process (AHP) with respect to
requirements prioritization by geographically distributed
stakeholders (GDS). It also provides a solution, in the form of a
modified AHP, for prioritizing requirements for GDS. We conduct two
experiments and analyze the results in order to discuss the
limitations of AHP with respect to GDS. The modified AHP variant is
also validated in this paper.
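The standard AHP step that any such variant builds on derives priority weights from a pairwise comparison matrix as its principal eigenvector, then checks consistency; a minimal sketch with an illustrative three-requirement comparison (the paper's modification for distributed stakeholders is not reproduced):

```python
import numpy as np

# Pairwise comparison matrix: P[i, j] states how much requirement i is
# preferred over requirement j (reciprocal by construction).
P = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

vals, vecs = np.linalg.eig(P)
i = np.argmax(vals.real)
w = np.abs(vecs[:, i].real)
w /= w.sum()                        # priority weights, summing to 1

n = P.shape[0]
ci = (vals.real[i] - n) / (n - 1)   # Saaty's consistency index
assert abs(w.sum() - 1.0) < 1e-12
assert w[0] > w[1] > w[2]           # ranking matches the comparisons
assert ci < 0.1                     # judgments are acceptably consistent
```

With geographically distributed stakeholders, each participant produces such a matrix, and the limitations the paper discusses arise in how these individual judgments are collected and aggregated.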
Abstract: The present study addresses problems and solutions
related to the production of new functional foods. Wheat (Triticum
aestivum L.) bran obtained from the industrial mill company
"Dobeles dzirnavieks" was investigated as a raw material supplying
nutrients for Bifidobacterium lactis Bb-12. Enzymatic hydrolysis of
wheat bran starch was carried out with α-amylase from Bacillus
amyloliquefaciens (Sigma Aldrich). Viscozyme L (Sigma Aldrich) was
used to release reducing sugars. Bifidobacterium lactis Bb-12
(Probio-Tec®, CHR Hansen) was cultivated in the enzymatically
hydrolysed wheat bran mash. All procedures ensured that the number
of active Bifidobacterium lactis Bb-12 in the final product reached
10⁵ CFU g⁻¹. After the enzymatic and bacterial fermentations, the
samples were freeze-dried for analysis of chemical compounds. All
experiments were performed at the Faculty of Food Technology of the
Latvia University of Agriculture in January–March 2013. The
obtained results show that both types of wheat bran (enzymatically
treated and non-treated) influenced the fermentative activity and
the number of viable Bifidobacterium lactis Bb-12 in the wheat bran
mash. Acidity increased strongly during the fermentation of the
wheat bran mash. The main objective of this work was to create a
low-energy, enzymatically and bacterially treated functional food
from wheat bran using enzymatic hydrolysis of carbohydrates
followed by cultivation of Bifidobacterium lactis Bb-12.
Abstract: The two-dimensional gel electrophoresis (2-DE) method is
widely used in proteomics to separate thousands of proteins in a
sample. By comparing the expression levels of proteins in a normal
sample with those in a diseased one, it is possible to identify a
meaningful set of marker proteins for the targeted disease. The
major shortcomings of this approach involve inherent noise and
irregular geometric distortions of the spots observed in 2-DE
images. Various experimental conditions can be the major causes of
these problems, which eventually lead to incorrect conclusions in
the protein analysis of samples. In order to minimize their
influence, this paper proposes a partition-based pair extension
method that performs spot matching on a set of gel images multiple
times and segregates the more reliable mapping results, which
improves the accuracy of gel image analysis. The improved accuracy
of the proposed method is demonstrated through various experiments
on real 2-DE images of human liver tissues.
Abstract: This study aims to segment objects using the K-means
algorithm on texture features. First, the algorithm transforms
color images into gray-level images. This paper then describes a
novel technique for the extraction of texture features from an
image. Next, within a group of similar features, objects and
backgrounds are differentiated using the K-means algorithm.
Finally, this paper proposes a new object segmentation algorithm
based on morphological techniques. The experiments described
include the segmentation of both single and multiple objects. The
region of an object can be segmented out accurately. As shown in
this paper, the results can help to perform image retrieval and to
analyze the features of an object.
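The clustering step that separates object from background can be sketched with a minimal K-means on one-dimensional gray-level features; the intensity values are synthetic, and the paper's texture-feature extraction is not reproduced:

```python
import numpy as np

def kmeans_1d(x, k=2, iters=50):
    """Minimal K-means on scalar features (e.g. gray levels)."""
    centers = np.quantile(x, np.linspace(0, 1, k))  # spread initial centers
    for _ in range(iters):
        # Assign each sample to its nearest center, then recompute means.
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        centers = np.array([x[labels == j].mean() if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

# Two well-separated intensity populations: dark background, bright object.
x = np.concatenate([np.full(50, 30.0), np.full(50, 200.0)])
labels, centers = kmeans_1d(x)
assert np.allclose(sorted(centers), [30.0, 200.0])
```

The resulting binary label map is what the morphological post-processing stage would then clean up into a coherent object region.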
Abstract: The simultaneous effects of temperature, immersion time, salt concentration, sucrose concentration, pressure and convective dryer temperature on the combined osmotic dehydration and convective drying of edible button mushrooms were investigated. Experiments were designed according to a Central Composite Design with six factors, each at five different levels. Response Surface Methodology (RSM) was used to determine the optimum processing conditions that yield maximum water loss and rehydration ratio and minimum solid gain and shrinkage in the osmotic-convective drying of edible button mushrooms. By applying response surface profiles and contour plots, the optimum operating conditions were found to be a temperature of 39 °C, an immersion time of 164 min, a salt concentration of 14%, a sucrose concentration of 53%, a pressure of 600 mbar and a drying temperature of 40 °C. At these optimum conditions, the water loss, solid gain, rehydration ratio and shrinkage were found to be 63.38 (g/100 g initial sample), 3.17 (g/100 g initial sample), 2.26 and 7.15%, respectively.
Abstract: Using Dynamic Bayesian Networks (DBNs) to model genetic regulatory networks from gene expression data is one of the major paradigms for inferring the interactions among genes. When predicting a network, averaging over a collection of models is preferable to relying on a single high-scoring model. In this paper, two kinds of model-searching approaches are compared: Greedy hill-climbing Search with Restarts (GSR) and Markov Chain Monte Carlo (MCMC) methods. GSR is preferred in many papers, but there has been no comparative study of which is better for DBN models. Different types of experiments have been carried out to benchmark these approaches. Our experimental results demonstrate that, on average, the MCMC methods outperform GSR in the accuracy of the predicted network while having comparable time efficiency. By proposing different variations of MCMC and employing a simulated annealing strategy, the MCMC methods become more efficient and stable. Apart from the comparison between these approaches, another objective of this study is to investigate the feasibility of using DBN modeling approaches for inferring gene networks from a few snapshots of high-dimensional gene profiles. Through experiments on synthetic data as well as systematic data, the results reveal how the performance of these approaches is influenced as the target gene network varies in network size, data size and system complexity.
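The combination of MCMC structure search with simulated annealing can be sketched on a toy problem; the score function below is a made-up stand-in for a DBN scoring function, and the move set (toggle one directed edge) is the simplest possible structure proposal:

```python
import math
import random

# Metropolis-Hastings over network structures: a state is a set of
# directed edges, a proposal toggles one edge, and a worse candidate is
# accepted with probability exp(delta / T), with T annealed downward.
random.seed(42)
EDGES = [(i, j) for i in range(4) for j in range(4) if i != j]
TRUE = {(0, 1), (1, 2), (2, 3)}       # hypothetical ground-truth network

def score(state):                     # toy score: higher is better
    return -len(state ^ TRUE)         # penalize edge differences

state, t = set(), 1.0
for _ in range(2000):
    e = random.choice(EDGES)
    cand = state ^ {e}                # toggle one edge
    delta = score(cand) - score(state)
    if delta >= 0 or random.random() < math.exp(delta / t):
        state = cand
    t = max(0.05, t * 0.999)          # simulated-annealing cooling

for e in EDGES:                       # final zero-temperature sweep
    if score(state ^ {e}) > score(state):
        state ^= {e}
assert state == TRUE                  # search settles on the best network
```

In a real application the chain's visited structures would be averaged (e.g. per-edge posterior frequencies) rather than reduced to the single final state, matching the model-averaging point made in the abstract.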