Abstract: In high-bitrate information hiding techniques, 1 bit is
embedded within each 4x4 Discrete Cosine Transform (DCT)
coefficient block by means of vector quantization, and the hidden bit
can then be effectively extracted at the receiving end. In this paper,
high-bitrate information hiding algorithms are summarized, and a
video-in-video scheme is implemented. Experimental results show that
a host video embedded with a large amount of auxiliary information
suffers little visible quality degradation: the luma Peak Signal-to-Noise
Ratio (PSNR-Y) of the host video degrades by only 0.22 dB on average,
while the hidden information survives H.264/AVC compression with
high robustness, with an average Bit Error Rate (BER) of 0.015%.
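The two figures of merit quoted above can be made concrete with a short sketch; the function names and the 8-bit peak value of 255 are conventional assumptions, not details from the paper:

```python
import numpy as np

def psnr(original, distorted, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB between two frames
    (computed on the luma plane for a PSNR-Y figure)."""
    diff = original.astype(np.float64) - distorted.astype(np.float64)
    mse = np.mean(diff ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def bit_error_rate(sent_bits, recovered_bits):
    """Fraction of hidden bits flipped by compression (the BER figure)."""
    sent = np.asarray(sent_bits)
    recovered = np.asarray(recovered_bits)
    return float(np.mean(sent != recovered))
```

A 0.22 dB PSNR-Y drop and a 0.015% BER would correspond to `psnr` of host vs. marked video and `bit_error_rate` of embedded vs. extracted bits, averaged over the test sequences.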
Abstract: The angular distribution of Compton scattering of two
quanta originating in the annihilation of a positron with an electron
is investigated as a quantum key distribution (QKD) mechanism in
the gamma spectral range. The geometry of coincident Compton
scattering is observed on the two sides as a way to obtain partially
correlated readings on the quantum channel. We derive the noise
probability density function of a conceptually equivalent prepare
and measure quantum channel in order to evaluate the limits of the
concept in terms of the device secrecy capacity and estimate it at
roughly 1.9 bits per 1 000 annihilation events. The high error rate
is well above the tolerable error rates of the common reconciliation
protocols; therefore, the proposed key agreement protocol by public
discussion requires key reconciliation using classical error-correcting
codes. We constructed a prototype device based on the readily
available monolithic detectors in the least complex setup.
Abstract: The current work focuses on reviewing the harmful
effects of mercury released from a number of sources, most of
which are industrial wastewater. Different techniques of mercury
removal are discussed and briefly compared. Experimental work
was conducted for the two most widely used methods of mercury
removal, and a comparison of their efficiencies has been made.
Abstract: The face and facial expressions play essential roles in
interpersonal communication. Most current work on facial
expression recognition attempts to recognize a small set of
prototypic expressions such as happiness, surprise, anger, sadness,
disgust and fear. However, most human emotions are
communicated by changes in one or two discrete facial features. In this
paper, we develop a facial expression synthesis system based on
tracking facial characteristic points (FCPs) in frontal image
sequences. Selected FCPs are automatically tracked using
cross-correlation-based optical flow. The proposed synthesis system uses a
simple deformable facial-feature model with a small set of control
points that can be tracked in the original facial image sequences.
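The tracking step can be sketched as template matching with zero-mean normalized cross-correlation: the patch around an FCP in one frame is compared against displaced patches in the next frame, and the best-scoring displacement is taken as the point's motion. The patch and search sizes and all names here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def track_point(prev_frame, next_frame, y, x, radius=2, search=2):
    """Return the (dy, dx) displacement of the patch centred at (y, x)
    that maximizes NCC between consecutive frames."""
    tpl = prev_frame[y - radius:y + radius + 1, x - radius:x + radius + 1]
    best_mv, best_score = (0, 0), -2.0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cy, cx = y + dy, x + dx
            if cy - radius < 0 or cx - radius < 0:
                continue  # candidate patch leaves the frame
            cand = next_frame[cy - radius:cy + radius + 1,
                              cx - radius:cx + radius + 1]
            if cand.shape != tpl.shape:
                continue
            score = ncc(tpl, cand)
            if score > best_score:
                best_score, best_mv = score, (dy, dx)
    return best_mv
```

Applied once per FCP per frame pair, this yields the sparse motion field that drives the deformable feature model.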
Abstract: Since the advent of the information era, the Internet has
brought various positive effects to everyday life. Nevertheless,
problems and side-effects have recently been noted; Internet
witch-trials and the spread of pornography are only a few of these
problems. In this study, the problems and causes of malicious replies on
Internet boards were analyzed using the key ideas of game theory. The
study provides a mathematical model of the Internet reply game to
devise three possible plans that could efficiently counteract malicious
replies. Furthermore, seven specific measures that comply with one of
the three plans were proposed and evaluated according to the
importance and utility of each measure, using an orthogonal-array
survey and SPSS conjoint analysis. The conclusion was that the most
effective measure would be forbidding unsigned (anonymous) users
from posting replies. Also notable was that some analytically proposed
measures, when implemented, could backfire and encourage malicious
replies.
Abstract: This article summarizes ways to verify neutron
fluence for neutron transmutation doping of silicon with phosphorus
on the LVR-15 reactor. Neutron fluence is determined using
activation detectors placed along the crystal in a strip or encapsulated
in a rod holder. Holders are placed at the centre of a water-filled
capsule or in an aluminum or silicon ingot that simulates a real single
crystal. If the diameter of the crystal is significantly less than the
capsule diameter and water from the primary circuit enters the free
space in the capsule, neutron interaction in the water changes neutron
fluence, affecting axial irradiation homogeneity. The effect of
moving the capsule vertically in the channel relative to maximum
neutron fluence in the reactor core was also measured. Even a small
shift of the capsule's centre causes great irradiation inhomogeneity.
This effect was measured using activation detectors, and was also
confirmed by MCNP calculation.
Abstract: Discretization of spatial derivatives is an important
issue in meshfree methods especially when the derivative terms
contain non-linear coefficients. In this paper, various methods used
for discretization of second-order spatial derivatives are investigated
in the context of Smoothed Particle Hydrodynamics. Three popular
forms (i.e. "double summation", "second-order kernel derivation",
and "difference scheme") are studied using one-dimensional unsteady
heat conduction equation. To assess these schemes, transient response
to a step function initial condition is considered. Due to parabolic
nature of the heat equation, one can expect smooth and monotone
solutions. It is shown in this paper, however, that regardless of
the type of kernel function used and the size of the smoothing radius,
the double-summation discretization form leads to non-physical
oscillations which persist in the solution. Results also show that when
a second-order kernel derivative is used, a high-order kernel function
should be employed such that the distance of the kernel's inflection
point from the origin is less than the nearest particle distance.
Otherwise, solutions may exhibit oscillations near discontinuities,
unlike the "difference scheme", which unconditionally produces
monotone results.
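One common realization of such a "difference scheme" for the second derivative is the Brookshaw-style SPH approximation d2T/dx2 at particle i ≈ 2 Σ_j (m_j/ρ_j)(T_i − T_j) W′(r_ij)/r_ij. Whether this exact form matches the paper's scheme is an assumption; the Gaussian kernel, uniform particle spacing, and function names below are likewise illustrative:

```python
import numpy as np

def sph_laplacian(x, T, h, dx):
    """Brookshaw-style 'difference scheme' estimate of d2T/dx2 on
    uniformly spaced 1-D particles (m_j / rho_j = dx) with a
    Gaussian kernel W(r, h) = exp(-(r/h)^2) / (h * sqrt(pi))."""
    n = len(x)
    lap = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = abs(x[i] - x[j])
            # dW/dr of the Gaussian kernel at separation r
            dWdr = -2.0 * r / h**2 * np.exp(-(r / h) ** 2) / (h * np.sqrt(np.pi))
            lap[i] += 2.0 * dx * (T[i] - T[j]) * dWdr / r
    return lap
```

On a quadratic field T = x^2, the estimate at an interior particle recovers the exact second derivative of 2, and the pairwise (T_i − T_j) form makes the monotone behaviour near discontinuities plausible.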
Abstract: A Loop Heat Pipe (LHP) is a two-phase device with extremely
high effective thermal conductivity that utilizes the thermodynamic
pressure difference to circulate a cooling fluid. A thermodynamic
analytical model is developed to explore the effects of different
parameters on the LHP. The effects of pipe length, pipe
diameter, condenser temperature, and heat load are reported. As pipe
length increases and/or pipe diameter decreases, a higher temperature
is expected in the evaporator.
Abstract: The motion planning technique described in this paper has been developed to eliminate or reduce the residual vibrations of belt-driven rotary platforms, while maintaining unchanged the motion time and the total angular displacement of the platform. The proposed approach is based on a suitable choice of the motion command given to the servomotor that drives the mechanical device; this command is defined by some numerical coefficients which determine the shape of the displacement, velocity and acceleration profiles. Using a numerical optimization technique, these coefficients can be changed without altering the continuity conditions imposed on the displacement and its time derivatives at the initial and final time instants. The proposed technique can be easily and quickly implemented on an actual device, since it requires only a simple modification of the motion command profile mapped in the memory of the electronic motion controller.
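The kind of smooth rest-to-rest command profile the abstract describes can be sketched with a classic fifth-degree polynomial whose velocity and acceleration vanish at both endpoints; the polynomial choice and names here are illustrative assumptions, not the authors' optimized coefficients:

```python
def quintic_profile(t, T, q):
    """Displacement at time t for a rest-to-rest move of total angle q
    over time T: s(t) = q * (10*tau^3 - 15*tau^4 + 6*tau^5)."""
    tau = t / T
    return q * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

def quintic_velocity(t, T, q):
    """Time derivative of the profile; zero at t = 0 and t = T."""
    tau = t / T
    return (q / T) * (30 * tau**2 - 60 * tau**3 + 30 * tau**4)
```

An optimizer can then perturb the profile's coefficients, subject to these boundary conditions on displacement and its derivatives, to shape the acceleration command and suppress the belt's residual vibration without changing the motion time T or total displacement q.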
Abstract: This paper is devoted to predicting laminar and turbulent
heating rates around blunt re-entry spacecraft at hypersonic
conditions. Heating calculations for a hypersonic body are normally
performed during the critical part of its flight trajectory. The
procedure is an inverse method: a shock wave is assumed,
and the body shape that supports this shock, as well as the flowfield
between the shock and the body, are calculated. For simplicity, the
normal momentum equation is replaced with a second-order pressure
relation; this simplification significantly reduces computation time.
The geometries specified in this research are parabolas and ellipsoids,
which may have conical afterbodies. Excellent agreement is
observed between the results obtained in this paper and those
reported by other researchers. Since this method is much faster than
Navier-Stokes solutions, it can be used in the preliminary design and
parametric study of hypersonic vehicles.
Abstract: In syntactic pattern recognition a pattern can be
represented by a graph. Given an unknown pattern represented by
a graph g, the problem of recognition is to determine if the graph g
belongs to a language L(G) generated by a graph grammar G. The
so-called IE graphs have been defined in [1] for a description of
patterns. The IE graphs are generated by so-called ETPL(k) graph
grammars defined in [1]. An efficient parsing algorithm for ETPL(k)
graph grammars for syntactic recognition of patterns represented by
IE graphs has been presented in [1]. In practice, structural
descriptions may contain pattern distortions, so that the assignment
of a graph g, representing an unknown pattern, to
a graph language L(G) generated by an ETPL(k) graph grammar G is
rejected by the ETPL(k) type parsing. Therefore, there is a need for
constructing effective parsing algorithms for recognition of distorted
patterns. The purpose of this paper is to present a new approach to
syntactic recognition of distorted patterns. To take into account all
variations of a distorted pattern under study, a probabilistic
description of the pattern is needed. A random IE graph approach is
proposed here for such a description ([2]).
Abstract: Value engineering is an effective tool for managerial
decision-making. Value studies offer managers a suitable instrument
for reducing life-cycle costs, improving quality, improving structures,
shortening the construction schedule, prolonging service life, or a
combination of these aims. The pressures placed on planners, their
accountability within their fields, and the inherent risks and
ambiguities of alternative options place some decision-makers in a
dilemma. Given the complexities of implementing projects, risk
management and value engineering can together be wielded as a
tool to identify and eliminate every item that causes unnecessary
expense and wasted time without damaging the essential project
functions. It should be noted that implementing risk management and
value engineering to improve efficiency and function may lengthen
the project implementation schedule; here, schedule revision does not
mean schedule reduction in all cases. This article first deals with the
concepts of risk and value engineering. The results of their
implementation at Iran Khodro Corporation are considered, together
with the common features and integration of the two approaches;
a blueprint is then proposed for use in engineering and industrial
projects, including at Iran Khodro Corporation.
Abstract: Text similarity measurement is a fundamental issue in
many textual applications such as document clustering, classification,
summarization and question answering. However, prevailing approaches
based on Vector Space Model (VSM) more or less suffer
from the limitation of Bag of Words (BOW), which ignores the semantic
relationship among words. Enriching document representation
with background knowledge from Wikipedia is proven to be an effective
way to solve this problem, but most existing methods still
cannot avoid similar flaws of BOW in a new vector space. In this
paper, we propose a novel text similarity measurement which goes
beyond VSM and can find semantic affinity between documents.
Specifically, it is a unified graph model that exploits Wikipedia as
background knowledge and synthesizes both document representation
and similarity computation. The experimental results on two different
datasets show that our approach significantly improves VSM-based
methods in both text clustering and classification.
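The BOW limitation the abstract refers to is easy to reproduce: under a plain VSM, two documents that use different words for the same concept get zero cosine similarity. A minimal sketch (the whitespace tokenization and names are assumptions):

```python
import math
from collections import Counter

def cosine_bow(doc_a, doc_b):
    """Cosine similarity of two documents in a plain Bag-of-Words
    vector space: term-frequency vectors, no semantic knowledge."""
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0
```

Here `cosine_bow("car", "automobile")` is 0.0 even though the documents concern the same concept; this is the semantic gap that the Wikipedia-based graph model is meant to close.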
Abstract: This paper presents an efficient VLSI architecture
design to achieve real time video processing using Full-Search Block
Matching (FSBM) algorithm. The design employs parallel bank
architecture with minimum latency, maximum throughput, and full
hardware utilization. We use nine parallel processors in our
architecture, each controlled by a state machine. State-machine
control makes the design very simple and cost-effective. The design
is implemented in VHDL, and the programming techniques we
incorporated make the design completely programmable, in the sense
that the search ranges and the block sizes can be varied to suit any
given requirements. The design
can operate at frequencies up to 36 MHz and it can function in QCIF
and CIF video resolution at 1.46 MHz and 5.86 MHz, respectively.
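The FSBM kernel that such an architecture parallelizes can be sketched in software: for each block of the current frame, every candidate displacement in the search range is scored, and the exhaustive minimum is kept. The sum-of-absolute-differences (SAD) criterion and function names are assumptions, since the abstract does not state the matching criterion:

```python
import numpy as np

def full_search(ref, cur, top, left, block=4, search=2):
    """Exhaustive block matching: find the motion vector (dy, dx)
    within +/-search that minimizes the SAD between the current
    block and the displaced block in the reference frame."""
    target = cur[top:top + block, left:left + block].astype(np.int64)
    best_mv, best_sad = (0, 0), None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + block > ref.shape[0] or x + block > ref.shape[1]:
                continue  # candidate block falls outside the frame
            cand = ref[y:y + block, x:x + block].astype(np.int64)
            sad = int(np.abs(target - cand).sum())
            if best_sad is None or sad < best_sad:
                best_mv, best_sad = (dy, dx), sad
    return best_mv, best_sad
```

The hardware design evaluates these candidate displacements on nine processors in parallel rather than in the nested loop shown here.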
Abstract: An important structuring mechanism for knowledge bases is building clusters based on the content of their knowledge objects. The objects are clustered based on the principle of maximizing the intraclass similarity and minimizing the interclass similarity. Clustering can also facilitate taxonomy formation, that is, the organization of observations into a hierarchy of classes that group similar events together. Hierarchical representation allows us to easily manage the complexity of knowledge, to view the knowledge at different levels of detail, and to focus our attention on the interesting aspects only. One such efficient and easy-to-understand system is the Hierarchical Production Rule (HPR) system. An HPR, a standard production rule augmented with generality and specificity information, is of the following form: Decision If <condition> Generality Specificity. HPR systems are capable of handling the taxonomical structures inherent in knowledge about the real world. In this paper, a set of related HPRs is called a cluster and is represented by an HPR-tree. This paper discusses an algorithm based on a cumulative learning scenario for the dynamic structuring of clusters. The proposed scheme incrementally incorporates new knowledge into the set of clusters from previous episodes and also maintains a summary of the clusters as a Synopsis to be used in future episodes. Examples are given to demonstrate the behaviour of the proposed scheme. The suggested incremental structuring of clusters would be useful in mining data streams.
Abstract: In order to evaluate the effects of natural, biological
and chemical fertilizers on grain yield and chickpea quality, field
experiments were carried out in the 2007 and 2008 growing seasons. In
this research the effects of different organic, chemical and biological
fertilizers were investigated on grain yield and quality of chickpea.
Experimental units were arranged in split-split plots based on
randomized complete blocks with three replications. The highest
amounts of yield and yield components were obtained in G1×N5
interaction. The significant increase in N, P, K, Fe and Mg content in
leaves and grains confirmed the superiority of this treatment,
because each of these nutrients has an established role in
chlorophyll synthesis and the photosynthetic capacity of the crop. The
combined application of compost, farmyard manure and chemical
phosphorus (N5) had the best grain quality due to high protein, starch
and total sugar contents, low crude fiber and reduced cooking time.
Abstract: This paper introduces and argues for a new concept of salt
dissolved in water as very tiny solid sodium chloride particles of
nano-scale volume. From this point of view, salt water can be
desalinated by collision with a special surface characterized by
nano-level smoothness, high rigidity and high hardness, under
appropriate conditions of water launching in the form of a thin laminar
flow at a suitable speed and angle of incidence, to obtain desalinated
water.
Abstract: Atherosclerosis is the condition in which an artery
wall thickens as the result of a build-up of fatty materials such as
cholesterol. It is a syndrome affecting arterial blood vessels, a
chronic inflammatory response in the walls of arteries, in large part
due to the accumulation of macrophage white blood cells and
promoted by low density (especially small particle) lipoproteins
(plasma proteins that carry cholesterol and triglycerides) without
adequate removal of fats and cholesterol from the macrophages by
functional high density lipoproteins (HDL). It is commonly referred
to as a hardening or furring of the arteries. It is caused by the
formation of multiple plaques within the arteries.
Abstract: This paper summarizes basic principles and concepts of
intelligent control as implemented in humanoid robotics, as well as
recent algorithms being devised for advanced control of humanoid
robots. Secondly, this paper presents a new neuro-fuzzy approach.
We have included some simulation results from the
computational intelligence technique that will be applied to our
humanoid robot. Subsequently, we determine a relationship between
joint trajectories and the forces located on the robot's foot through
the proposed neuro-fuzzy technique.
Abstract: A literature review revealed the importance of adopting
relationship marketing for loyalty and for retaining profitable
customers (Customer Relationship Management). LPQ
satisfaction will reinforce loyalty and customer brand
attachment, and the customer will recommend the operator to others.
The focus of this study is to examine the relationship between
LPQ and WOM recommendations through customer
satisfaction, loyalty and attachment. The results show that LPQ
affects satisfaction positively and loyalty negatively. LPQ has an
indirect effect on WOM recommendations through
satisfaction and attachment. The mediating effect of satisfaction in
the relationship between LPQ and loyalty is rejected. This finding
can be explained by the nature of the mobile sector in Tunisia.