Abstract: With the growth of the Internet and database applications, it has become common for authorized users to query and access many databases remotely, which raises the problem of how to protect the copyright of relational databases. This paper first briefly introduces the cloud model, including cloud generators and similar clouds. Drawing on the idea of digital watermarking and the properties of both the cloud model and relational databases, a method for protecting relational database copyright with a cloud watermark is then proposed, together with the corresponding algorithms for watermark embedding and detection. Experiments are run and their results analyzed to validate the correctness and feasibility of the watermark scheme. Finally, the prospects and research directions of relational database watermarking are discussed.
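The abstract does not detail the embedding mechanism, but the general shape of a relational-database watermark can be sketched. The following is a minimal, hypothetical Agrawal-Kiernan-style scheme (not the paper's cloud-model method): a keyed hash selects a fraction of tuples and forces the least significant bit of a numeric attribute to a key-dependent value. The key, the parameter `gamma`, and the `(primary_key, value)` row layout are illustrative assumptions.

```python
import hmac, hashlib

KEY = b"secret-key"   # hypothetical watermarking key

def keyed_hash(key, primary_key):
    """Deterministic keyed hash of a tuple's primary key."""
    digest = hmac.new(key, str(primary_key).encode(), hashlib.sha256).digest()
    return int.from_bytes(digest[:8], "big")

def embed(rows, gamma=3):
    """Mark roughly 1/gamma of the tuples by forcing the LSB of a
    numeric attribute to a key-dependent bit."""
    marked = []
    for pk, value in rows:
        h = keyed_hash(KEY, pk)
        if h % gamma == 0:                 # this tuple is selected
            bit = (h >> 8) & 1             # key-dependent mark bit
            value = (value & ~1) | bit     # overwrite the LSB
        marked.append((pk, value))
    return marked

def detect(rows, gamma=3):
    """Count how many selected tuples still carry the expected bit."""
    total = matches = 0
    for pk, value in rows:
        h = keyed_hash(KEY, pk)
        if h % gamma == 0:
            total += 1
            matches += (value & 1) == ((h >> 8) & 1)
    return matches, total
```

Detection compares the match count against the total of selected tuples; a near-perfect match rate indicates the watermark is present.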
Abstract: In this manuscript, a wavelet-based blind watermarking scheme is proposed to secure the authenticity of a fingerprint. The information used for identification or verification of a fingerprint lies mainly in its minutiae; by robustly watermarking the minutiae into the fingerprint image itself, this information can be extracted accurately even if the fingerprint is severely degraded. The minutiae are converted into a binary watermark, and embedding this watermark in the detail regions increases the robustness of the watermarking at little to no additional cost in image quality. It is shown experimentally that when the minutiae are embedded into the wavelet detail coefficients of a fingerprint image in spread-spectrum fashion using a pseudorandom sequence, robustness increases, and perceptual invisibility decreases, in proportion to the amplification factor "K". The DWT-based technique is found to be very robust against noise, geometric distortions, filtering and JPEG compression attacks, and also gives markedly better performance than a DCT-based technique in terms of correlation coefficient and number of erroneous minutiae.
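As a rough illustration of the spread-spectrum idea described above (a generic sketch, not the authors' exact algorithm), the code below embeds one watermark bit into the HH detail band of a one-level Haar DWT using a keyed pseudorandom ±1 sequence, and detects it blindly by correlating the band with the same sequence. The parameter `K` plays the role of the amplification factor "K" in the abstract.

```python
import numpy as np

def haar2(img):
    """One-level 2D Haar DWT (orthonormal) of an even-sized array."""
    a = (img[0::2] + img[1::2]) / np.sqrt(2)   # row-pair low-pass
    d = (img[0::2] - img[1::2]) / np.sqrt(2)   # row-pair high-pass
    LL = (a[:, 0::2] + a[:, 1::2]) / np.sqrt(2)
    LH = (a[:, 0::2] - a[:, 1::2]) / np.sqrt(2)
    HL = (d[:, 0::2] + d[:, 1::2]) / np.sqrt(2)
    HH = (d[:, 0::2] - d[:, 1::2]) / np.sqrt(2)
    return LL, LH, HL, HH

def embed_bit(HH, bit, key, K=2.0):
    """Spread one watermark bit (+1/-1) over the HH detail band."""
    rng = np.random.default_rng(key)            # keyed PRN sequence
    p = rng.choice([-1.0, 1.0], size=HH.shape)
    return HH + K * bit * p, p

def detect_bit(HH_marked, key):
    """Blind detection: correlate the band with the regenerated PRN."""
    rng = np.random.default_rng(key)
    p = rng.choice([-1.0, 1.0], size=HH_marked.shape)
    return 1 if (HH_marked * p).mean() > 0 else -1
```

A larger `K` makes the correlation term dominate the image's own coefficients (more robust), at the cost of visibility, mirroring the trade-off the abstract reports.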
Abstract: The most severe damage to a turbine rotor is its distortion. The rotor straightening process must first remove the stresses from the material by annealing and then straighten the plastic distortion, without leaving any residual stress, by hot spotting. The straightening method does not produce stress concentrations, and the heating technique, developed specifically for solid forged rotors and disks, makes it possible to avoid local overheating and structural changes in the material; the process likewise leaves no stresses in the shaft material. An experimental study of hot spotting is carried out on a large turbine rotor, and some of the most important parameters that must be considered in the annealing and hot-spotting processes are investigated in this paper.
Abstract: Skin-color-based tracking techniques often assume a static skin color model obtained either from an offline set of library images or from the first few frames of a video stream. Such models perform poorly in the presence of changing lighting or imaging conditions. We propose an adaptive skin color model based on a Gaussian mixture model to handle these changing conditions. Initial estimates of the number and weights of the skin color clusters are obtained using a modified form of the general expectation-maximization algorithm. The model then adapts to changes in imaging conditions, refining its parameters dynamically using spatial and temporal constraints. Experimental results show that the method can be used to track hand and face regions effectively.
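The core of such a model is EM fitting of a Gaussian mixture to color samples. The sketch below is a minimal one-dimensional EM (e.g., over a single hue channel); real skin models operate in a 2-D or 3-D color space and add the spatial/temporal refinements the abstract mentions. The quantile-based initialization is an illustrative choice, not the authors' modified initialization.

```python
import numpy as np

def em_gmm(x, k=2, iters=50):
    """Fit a 1-D Gaussian mixture with plain EM (weights, means, variances)."""
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)   # spread initial means
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        pdf = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = w * pdf
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and variances
        n = r.sum(axis=0)
        w = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n
    return w, mu, var
```

In an adaptive tracker, the same M-step updates can be re-run on each new frame's pixels, initialized from the previous frame's parameters, which is what lets the model follow lighting changes.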
Abstract: Batch fermentation of 5, 10 and 25 g/L biodiesel-derived crude glycerol was carried out at 30, 37 and 45 °C by Clostridium pasteurianum cells immobilized on silica. Maximum yields of 1,3-propanediol (PDO) (0.60 mol/mol) and ethanol (0.26 mol/mol) were obtained from 10 g/L crude glycerol at 30 and 37 °C, respectively. The maximum yield of butanol (0.28 mol/mol substrate added) was obtained at 37 °C with 25 g/L substrate. None of the three products was detected at 45 °C even after 10 days of fermentation; only traces of ethanol (0.01 mol/mol) were detected at 45 °C with 5 g/L substrate. The results for 25 g/L substrate utilization were fitted to a first-order rate equation to obtain the rate constant for the bioconversion of glycerol at the three temperatures. The first-order rate constants at 30, 37 and 45 °C were found to be 0.198, 0.294 and 0.029/day, respectively, and the activation energy (Ea) for crude glycerol bioconversion was calculated to be 57.62 kcal/mol.
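The first-order fit mentioned above amounts to linear regression of ln C(t) on t, since C(t) = C0·exp(-kt) implies ln C = ln C0 - kt. A minimal sketch, using a hypothetical concentration series (the paper's raw time-course data are not given in the abstract):

```python
import math

def first_order_k(times, conc):
    """Estimate the first-order rate constant k from C(t) = C0*exp(-k*t)
    by least-squares on ln(C) versus t; the slope is -k."""
    n = len(times)
    y = [math.log(c) for c in conc]
    tm = sum(times) / n
    ym = sum(y) / n
    slope = (sum((t - tm) * (yi - ym) for t, yi in zip(times, y))
             / sum((t - tm) ** 2 for t in times))
    return -slope
```

Fitting a synthetic series generated with k = 0.294/day (the reported 37 °C value) recovers that constant exactly, which is the self-consistency check one would apply before an Arrhenius analysis.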
Abstract: This research aims to create a model for analyzing student motivation behavior in e-Learning, based on association rule mining, for the case of the Information Technology for Communication and Learning course at Suan Sunandha Rajabhat University. The model was built from association rules, a data mining technique, subject to a minimum confidence threshold. The results show that the association rule model can identify the important variables that influence student motivation behavior in e-Learning.
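Association rule mining of the kind described reduces to counting itemset supports and filtering by confidence. The sketch below mines single-antecedent rules from hypothetical transactions of course activities (the actual course variables are not given in the abstract):

```python
from itertools import combinations

def mine_rules(transactions, min_support=0.3, min_confidence=0.6):
    """Mine single-antecedent rules A -> B meeting minimum support
    and confidence; transactions are sets of items."""
    n = len(transactions)
    items = {i for t in transactions for i in t}

    def support(itemset):
        return sum(itemset <= t for t in transactions) / n

    rules = []
    for a, b in combinations(sorted(items), 2):
        for ante, cons in ((a, b), (b, a)):
            s = support({ante, cons})            # support of {A, B}
            sa = support({ante})                 # support of antecedent
            if s >= min_support and sa > 0 and s / sa >= min_confidence:
                rules.append((ante, cons, round(s, 2), round(s / sa, 2)))
    return rules
```

Each returned tuple is (antecedent, consequent, support, confidence); lowering the thresholds yields more, weaker rules, which is the usual tuning knob in such a study.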
Abstract: In this paper, a novel technique for contrast enhancement of low-contrast satellite images is proposed, based on singular value decomposition (SVD) and the discrete cosine transform (DCT). The singular value matrix represents the intensity information of the given image, and any change in the singular values changes the intensity of the input image. The proposed technique converts the image into the SVD-DCT domain and, after normalizing the singular value matrix, reconstructs the enhanced image using the inverse DCT. The visual and quantitative results show the increased efficiency and flexibility of the proposed SVD-DCT method over existing methods such as linear contrast stretching, GHE, DWT-SVD, DWT, decorrelation stretching and gamma-correction-based techniques.
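The singular-value normalization step can be sketched without the DCT stage: a correction factor ξ is taken as the ratio of the largest singular value of a globally histogram-equalized (GHE) reference to that of the input, and the input's singular values are rescaled by ξ before reconstruction. This is a simplified, SVD-only sketch of the idea, not the paper's full SVD-DCT pipeline.

```python
import numpy as np

def ghe(img):
    """Global histogram equalization of an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum() / img.size
    return cdf[img] * 255.0

def svd_enhance(img):
    """Scale the input's singular values by
    xi = sigma_max(GHE(img)) / sigma_max(img), then reconstruct."""
    U, s, Vt = np.linalg.svd(img.astype(np.float64), full_matrices=False)
    s_eq = np.linalg.svd(ghe(img), compute_uv=False)
    xi = s_eq[0] / s[0]                       # correction factor
    out = U @ np.diag(xi * s) @ Vt            # reconstruct with scaled sigmas
    return np.clip(out, 0, 255).astype(np.uint8)
```

For a low-contrast input, ξ > 1 and the reconstruction stretches the intensity range toward that of the equalized reference; in the full method the same operation is applied to the DCT coefficient matrix before the inverse DCT.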
Abstract: One of the difficulties of vibration-based damage identification methods is the non-uniqueness of their results: different damage locations and severities may produce identical response signals, a problem that is even more severe when detecting multiple damage. This paper proposes a new damage detection strategy to avoid this non-uniqueness. The strategy first determines the approximate damage area by statistical pattern recognition using the dynamic strain signal measured by distributed fiber Bragg gratings, and then accurately evaluates the damage by Bayesian model updating using the experimental modal data; a stochastic simulation method is used to compute the high-dimensional integral in the Bayesian problem. Finally, an experiment on a plate structure, simulating one part of a mechanical structure, is used to verify the effectiveness of the approach.
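The stochastic simulation used for the Bayesian integral is typically a Markov chain Monte Carlo sampler. As a stand-in (the paper's specific sampler and likelihood are not given in the abstract), the sketch below is a random-walk Metropolis sampler applied to a hypothetical one-dimensional posterior over a damage-severity parameter:

```python
import math, random

def metropolis(log_post, x0, steps=5000, scale=0.2, seed=1):
    """Random-walk Metropolis sampler for a 1-D log-posterior."""
    rng = random.Random(seed)
    x, samples = x0, []
    lp = log_post(x)
    for _ in range(steps):
        cand = x + rng.gauss(0, scale)          # symmetric proposal
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:
            x, lp = cand, lp_cand               # accept the move
        samples.append(x)
    return samples
```

With a (hypothetical) Gaussian posterior centered on a stiffness reduction of 0.3, the post-burn-in sample mean recovers that value; in model updating the same machinery runs over the full parameter vector of the finite-element model.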
Abstract: In this paper, a new information fusion method, Dezert-Smarandache Theory (DSmT), is introduced to manage and deal with the uncertain information arising in robot map building. We build a grid map from sonar sensors and a laser range finder (LRF), which are the main sources of uncertainty. To address the uncertainty in a static environment, we propose a classic DSm (DSmC) model for the sonar sensors and the LRF, and construct the corresponding general basic belief assignment functions (gbbaf). Because evidence sources in a physical system are generally unreliable, discounting theory must be considered before applying DSmT. Finally, a Pioneer II mobile robot serves as the simulation platform: we build a 3D grid map of the belief layout and compare map building with DSmT against DST. The simulation experiment shows that DSmT is effective and valid, especially in dealing with highly conflicting information. In short, this study not only provides a new method for map building in a static environment, but also supplies a theoretical foundation for further applying hybrid DSmT (DSmH) to dynamic unknown environments and to cooperative map building by multiple robots.
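The key difference from Dempster-Shafer combination is that the classic DSm rule keeps conflicting mass on the (non-empty) intersection instead of renormalizing it away. A minimal sketch over the usual two-element grid-cell frame {E (empty), O (occupied)} with hyper-power set {E, O, E∩O, E∪O} (the mass values below are illustrative, not the paper's gbbaf):

```python
# Pairwise intersections in the hyper-power set of {E, O} (free DSm model):
# "E&O" denotes the intersection E∩O, "E|O" the union E∪O.
INTERSECT = {
    ("E", "E"): "E", ("O", "O"): "O", ("E", "O"): "E&O",
    ("E", "E&O"): "E&O", ("O", "E&O"): "E&O", ("E&O", "E&O"): "E&O",
    ("E", "E|O"): "E", ("O", "E|O"): "O", ("E&O", "E|O"): "E&O",
    ("E|O", "E|O"): "E|O",
}

def dsm_combine(m1, m2):
    """Classic DSm combination: the product mass m1(A)*m2(B) accumulates
    on A∩B, so conflict between sources is kept, not renormalized away."""
    out = {}
    for a, ma in m1.items():
        for b, mb in m2.items():
            c = INTERSECT.get((a, b)) or INTERSECT[(b, a)]
            out[c] = out.get(c, 0.0) + ma * mb
    return out
```

With two sources that largely disagree (one favoring E, the other O), most mass lands on E∩O, which in map building flags a cell whose state the sensors genuinely conflict about.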
Abstract: Although face detection is not a recent topic in image processing, it is still an open area of research. The greatest step in this field is the work reported by Viola, whose recent analogue is that of Huang et al.; both use similar features and a similar training process. The former detects only upright faces, but the latter can detect multi-view faces in still grayscale images using new features called 'sparse features'. Finding these features with the proposed methods, however, is very time-consuming and inefficient. Here, we propose a new approach for finding sparse features using a genetic algorithm. This method requires less computational cost and finds more effective features during the learning process, leading to more accurate face detection.
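The genetic search over feature configurations can be sketched generically. Below, candidate features are encoded as bit-strings and evolved with tournament selection, one-point crossover and bit-flip mutation; the fitness function is a placeholder (here simply the bit count, a OneMax stand-in), whereas the paper's fitness would score a sparse feature's discriminative power on training faces.

```python
import random

def genetic_search(fitness, length=20, pop_size=30, gens=40,
                   p_mut=0.02, seed=0):
    """Generic GA over bit-strings: tournament selection,
    one-point crossover, and per-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(gens):
        def pick():                               # tournament of size 2
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, length)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < p_mut) for b in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)
```

Swapping in a real fitness (e.g., feature discriminability under AdaBoost) turns this skeleton into a feature-selection search of the kind the abstract describes.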
Abstract: This paper examines the interplay of policy options and cost-effective technology in providing sustainable distance education, based on a case study conducted among learners and teachers. Learning technologies delivered through CDs, the internet and mobile devices are increasingly adopted by distance-education institutes for their quick delivery and cost-effectiveness, but their sustainability depends on the structure of the learner body as well as the teaching community. Learners from rural and urban backgrounds showed similar adoption and utilization of mobile learning; in other words, the technology transcended the rural-urban dichotomy. The teaching community, by contrast, was divided into two groups on policy issues. The study thus revealed both cost-effectiveness and sustainability impacts across learner groups divided by rural and urban location.
Abstract: The identification and classification of spinal deformity play an important role in surgical planning for adolescent patients with idiopathic scoliosis. The subject of this article is the Lenke classification of scoliotic spines using Cobb angle measurements. The purpose is two-fold: (1) to design a rule-based diagram to assist clinicians in the classification process, and (2) to investigate a computer classifier that improves classification time and accuracy. The efficiency of the rule-based diagram was evaluated in a series of scoliotic classifications by 10 clinicians, and the computer classifier was tested on a radiographic measurement database of 603 patients. Classification accuracy was 93% using the rule-based diagram and 99% for the computer classifier. Both the computer classifier and the rule-based diagram can efficiently assist clinicians in their Lenke classification of spinal scoliosis.
Abstract: An experimental study of anaerobic treatment was performed in a hybrid upflow anaerobic sludge blanket (HUASB) reactor treating produced water (PW) from an onshore crude oil terminal (COD: 1597 mg/L, NH3-N: 14.7 mg/L, phenol: 13.8 mg/L, BOD5: 862 mg/L, sodium: 6240 mg/L and chloride: 9530 mg/L). The high salinity and other toxic substances in produced water inhibit methanogenic performance unless the biomass is acclimatized before anaerobic digestion. COD removal was investigated at five dilutions of produced water with tap water (TW), without any nutrient addition or pre-treatment: 1PW:4TW, 2PW:3TW, 3PW:2TW, 4PW:1TW and 5PW:0TW. The reactor was operated under mesophilic conditions (35 ± 2 °C) at an HRT of 5 days with continuous feed for 250 days. The average COD removals for 1PW:4TW, 2PW:3TW, 3PW:2TW, 4PW:1TW and 5PW:0TW were approximately 76.1%, 73.8%, 70.3%, 46.3% and 61.82%, respectively, with final average effluent CODs of 123.7 mg/L, 240 mg/L, 294 mg/L, 589 mg/L and 738 mg/L, respectively.
Abstract: The charge pump is an important component of a phase-locked loop (PLL): it converts the Up and Down signals from the phase/frequency detector (PFD) into current. A conventional CMOS charge-pump circuit consists of two switched current sources that pump charge into or out of the loop filter according to two logical inputs. Mismatch between the charging and discharging currents causes phase offset and reference spurs in a PLL. We propose a new charge-pump circuit that reduces this current mismatch by using a regulated cascode circuit. The proposed circuit is designed and simulated with Spectre in TSMC 0.18-μm 1.8-V CMOS technology.
Abstract: In order to improve the effectiveness of isolation structures, the principles and behaviour of base-isolation systems are studied, and the types and characteristics of base isolation are discussed. Compared with traditional earthquake-resistant structures, base-isolated structures markedly decrease the seismic response: the total structural seismic response decreases to 1/4-1/32, and the seismic shear stress in the upper structure decreases to 1/14-1/23. In a major earthquake, the structure thus exhibits a pronounced seismic-mitigation effect.
Abstract: In this research, the separation efficiency of a deoiling hydrocyclone is evaluated using three-dimensional simulation of multiphase flow based on an Eulerian-Eulerian finite volume method. The mixture approach of the Reynolds stress model is employed to capture the features of the turbulent multiphase swirling flow. The separation efficiency obtained for Colman's design is compared with available experimental data, showing that the separation curve of deoiling hydrocyclones can be predicted by numerical simulation.
Abstract: Zero-truncated models are commonly used in modeling count data without zeros; they are the counterpart of zero-inflated models. Zero-truncated Poisson and zero-truncated negative binomial models have been used by researchers to analyze, for example, the abundance of rare species and lengths of hospital stay, and zero-truncated models also serve as the basis for hurdle models. In this study, we develop a new model, the zero-truncated strict arcsine model, as an alternative for modeling count data without zeros and with extra variation. Two simulated data sets and one real-life data set are fitted with the model, and the results show that it provides a good fit. Maximum likelihood estimation is used to estimate the parameters.
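The simplest member of this family, the zero-truncated Poisson, illustrates both the truncation and the maximum likelihood estimation mentioned above (the strict arcsine model itself is not reproduced here). Its pmf divides the Poisson pmf by P(X > 0) = 1 - e^{-λ}, and the MLE of λ solves λ/(1 - e^{-λ}) = sample mean, which the sketch below does by bisection:

```python
import math

def ztp_pmf(k, lam):
    """P(X = k | X > 0) for the zero-truncated Poisson, k >= 1."""
    return (math.exp(-lam) * lam ** k
            / (math.factorial(k) * (1.0 - math.exp(-lam))))

def ztp_lambda(sample_mean, lo=1e-6, hi=50.0, tol=1e-10):
    """MLE of the zero-truncated Poisson rate: solve
    lam / (1 - exp(-lam)) = sample_mean by bisection (LHS is increasing)."""
    f = lambda lam: lam / (1.0 - math.exp(-lam)) - sample_mean
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2
```

Note the truncated mean λ/(1 - e^{-λ}) always exceeds 1, reflecting that zeros are impossible; the same moment-matching structure underlies likelihood equations for other zero-truncated models.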
Abstract: To fight the economic crisis, the French government, like many others in Europe, has decided to give a boost to high-speed line projects. This paper explores the implementation and decision-making process in TGV projects and their evolution, especially since the Mediterranean TGV line. That project was probably the most controversial, yet paradoxically it represents today a huge success for all the actors involved.
What lessons can we learn from this experience? How should we evaluate the impact of this project on TGV line planning? How can we characterize this implementation and decision-making process with regard to sustainability challenges?
The construction of the Mediterranean TGV line was the occasion for several innovations: introducing more dialogue into the decision-making process, taking the environment into account, and introducing new project management practices and technological innovations. That is why this project appears today as an example of the integration of sustainable development.
In this paper we examine the different kinds of innovation developed in this project, using concepts from the sociology of innovation to understand how these solutions emerged in a controversial situation. We then analyze the lessons drawn from this decision-making process (both immediately and a posteriori) and the way in which procedures evolved, including the creation of new tools and devices (public consultation, project management, and so on). Finally, we try to highlight the impact of this evolution on the governance of TGV projects: in particular, new methods of implementation and financing involve a reconfiguration of the system of actors. The aim of this paper is to define the impact of this reconfiguration on negotiations between stakeholders.
Abstract: Medical image data hiding has strict requirements of high imperceptibility, high capacity and high robustness, and achieving all three simultaneously is highly challenging. Several works on data hiding, watermarking and steganography suitable for telemedicine applications have been reported in the literature, but none is reliable in all respects. Hiding Electronic Patient Record (EPR) data for telemedicine demands that the scheme be blind and reversible. This paper proposes a novel blind reversible data hiding approach based on the integer wavelet transform. Experimental results show that the scheme outperforms prior art in terms of zero BER (bit error rate), higher PSNR (peak signal-to-noise ratio) and large EPR embedding capacity, with a WPSNR (weighted peak signal-to-noise ratio) of around 53 dB, compared with existing reversible data hiding schemes.
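Reversibility rests on the integer wavelet transform being exactly invertible on integer pixels. The sketch below shows the 1-D integer Haar (S-transform) together with difference expansion (Tian's classic reversible embedding, used here as a stand-in for the paper's scheme): each detail coefficient d becomes 2d + bit, which is losslessly undone on extraction, so both the EPR bits and the original pixels are recovered exactly.

```python
def int_haar(pixels):
    """Forward integer Haar (S-transform): s = floor((a+b)/2), d = a-b."""
    s = [(a + b) >> 1 for a, b in zip(pixels[0::2], pixels[1::2])]
    d = [a - b for a, b in zip(pixels[0::2], pixels[1::2])]
    return s, d

def int_haar_inv(s, d):
    """Exact inverse: a = s + floor((d+1)/2), b = a - d."""
    a = [si + ((di + 1) >> 1) for si, di in zip(s, d)]
    b = [ai - di for ai, di in zip(a, d)]
    return [v for pair in zip(a, b) for v in pair]

def embed_bits(d, bits):
    """Difference expansion: d' = 2*d + bit is exactly invertible
    (overflow handling omitted in this sketch)."""
    return [2 * di + bi for di, bi in zip(d, bits)]

def extract_bits(d2):
    """Recover the hidden bits and the original detail coefficients."""
    return [di & 1 for di in d2], [di >> 1 for di in d2]
```

A full scheme would apply this per row/column of the image, restrict embedding to coefficients that cannot overflow, and record a location map, but the reversibility argument is exactly the round-trip shown here.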
Abstract: Most scientific programs have large input and output data sets that require out-of-core programming or the use of virtual memory management (VMM). Out-of-core programming is very error-prone and tedious and is therefore generally avoided; however, in many instances VMM is not an effective alternative because it often causes a substantial loss of performance. In contrast, compiler-driven I/O management allows a program's data sets to be retrieved in parts, called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a compiler combined with a user-level runtime system that can replace standard VMM for out-of-core programs. We describe Comanche and demonstrate on a number of representative problems that it substantially outperforms VMM. Significantly, our system requires neither special services from the operating system nor modification of the operating system kernel.
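The block/tile retrieval pattern the abstract describes can be illustrated in a few lines: instead of mapping a huge data set into memory and letting VMM page it, the program streams fixed-size tiles through a small resident buffer. This is a hand-written sketch of the pattern (Comanche generates such access code automatically); the tile size is an illustrative constant.

```python
import array, os, tempfile

TILE = 4096  # elements per tile (a stand-in for a compiler-chosen block size)

def tiled_sum(path):
    """Sum a large file of float64 values tile by tile, so resident
    memory stays at one tile regardless of the file size."""
    total = 0.0
    with open(path, "rb") as f:
        while True:
            buf = f.read(TILE * 8)        # fetch one tile (8 bytes/double)
            if not buf:
                break                     # end of data set
            tile = array.array("d")
            tile.frombytes(buf)
            total += sum(tile)
    return total
```

The application-level logic (`sum` here) never sees the blocking; a compiler-managed cache makes the same separation for arbitrary loop nests by rewriting their array accesses into tile fetches.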