Abstract: With the advent of digital cinema and digital
broadcasting, copyright protection of video data has been one of the
most important issues.
We present a novel watermarking method for video image data
based on hardware and discrete wavelet transform techniques, and
name it “traceable watermarking” because the watermarked data are
constructed before the transmission process and traced after they have
been received by an authorized user.
In our method, we embed the watermark into the lowest-frequency
part of each image frame in the decoded video using a hardware LSI.
Digital cinema is an important application for traceable
watermarking, since a digital cinema system makes use of watermarking
technology during content encoding, encryption, transmission,
decoding, and all the intermediate processes performed in digital
cinema systems. The watermark is embedded into randomly selected
movie frames using hash functions.
The embedded watermark information can be extracted from the
decoded video data without any need to access the original movie
data. Our experimental results show that the proposed traceable
watermarking method for digital cinema systems is much better than
conventional watermarking techniques in terms of robustness, image
quality, speed, and simplicity.
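The abstract states only that frames are chosen "using hash functions"; one common way to realize such key-dependent selection is a keyed hash over the frame index, sketched below. The key, the selection fraction, and the use of HMAC-SHA256 are illustrative assumptions, not the paper's actual construction.

```python
import hashlib
import hmac

def select_frames(num_frames, key, fraction=0.1):
    """Pick a pseudo-random, key-dependent subset of frame indices.

    Hypothetical sketch: HMAC-SHA256 of the frame index is compared
    against a threshold so that roughly `fraction` of all frames are
    selected, reproducibly for anyone who holds the key.
    """
    threshold = int(fraction * 2**32)
    selected = []
    for i in range(num_frames):
        digest = hmac.new(key, i.to_bytes(8, "big"), hashlib.sha256).digest()
        if int.from_bytes(digest[:4], "big") < threshold:
            selected.append(i)
    return selected

frames = select_frames(1000, b"secret-key", fraction=0.1)
# The same key reproduces the same selection at the detector side:
assert frames == select_frames(1000, b"secret-key", fraction=0.1)
```

Because selection depends only on the key and the frame index, the detector can re-derive the watermarked frame set without access to the original movie data, matching the extraction property described above.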
Abstract: A new approach to the analysis of power system failures
using artificial neural networks is proposed. An analysis of the
possibility of simulating the phenomena accompanying system faults and
restitution is described. It is shown that no universal model exists for
simulating these phenomena over the whole analyzed range.
The main classical methods for finding the optimal structure and
for parameter identification are described briefly, and an example
with calculation results is shown.
Abstract: In this work, several ASP solutions were flooded into
fractured models initially saturated with heavy oil at a constant flow
rate and with different geometrical characteristics of the fractures. The
ASP solutions were constituted from two polymers, i.e. a synthetic
polymer (hydrolyzed polyacrylamide) as well as a biopolymer, a
surfactant, and two types of alkali. The results showed that using the
synthetic hydrolyzed polyacrylamide polymer increases ultimate oil
recovery; however, the type of alkali does not play a significant role in
oil recovery. In addition, the position of the injection well with respect
to the fracture system has remarkable effects on ASP flooding. For
instance, increasing the angle of the fractures with the mean flow
direction yields more oil recovery and delays breakthrough time. This
work can be regarded as a comprehensive survey of ASP flooding
which considers most of the effective factors in this chemical EOR method.
Abstract: An Automated Rapid Maxillary Expander (ARME) is
a specially designed microcontroller-based orthodontic appliance to
overcome the shortcomings imposed by the traditional maxillary
expansion appliances. This new device operates by automatically
widening the maxilla (upper jaw) by expanding the midpalatal suture
[1]. The ARME appliance that has been developed is a combination
of modified butterfly expander appliance, micro gear, micro motor,
and microcontroller to automatically produce light and continuous
pressure to expand the maxilla. For this study, the functionality of the
system is verified through laboratory tests by measuring the force
applied to the teeth each time the maxilla expands. The laboratory
test results show that the developed appliance meets the desired
performance specifications consistently.
Abstract: Semantic Web Technologies enable machines to
interpret data published in a machine-interpretable form on the web.
At the present time, only human beings are able to understand the
product information published online. The emerging semantic Web
technologies have the potential to deeply influence the further
development of the Internet Economy. In this paper we propose a
scenario-based research approach to predict the effects of these new
technologies on electronic markets and on the business models of
traders, intermediaries, and customers. Over 300 million searches are
conducted every day on the Internet by people trying to find what
they need. A majority of these searches are in the domain of
consumer ecommerce, where a web user is looking for something to
buy. This represents a huge cost in terms of people hours and an
enormous drain of resources. Agent enabled semantic search will
have a dramatic impact on the precision of these searches. It will
reduce and possibly eliminate information asymmetry where a better
informed buyer gets the best value. By impacting this key
determinant of market prices semantic web will foster the evolution
of different business and economic models. We submit that there is a
need for developing these futuristic models based on our current
understanding of e-commerce models and nascent semantic web
technologies. We believe these business models will encourage
mainstream web developers and businesses to join the “semantic web
revolution.”
Abstract: The identification and classification of the spine deformity play an important role when considering surgical planning for adolescent patients with idiopathic scoliosis. The subject of this article is the Lenke classification of scoliotic spines using Cobb angle measurements. The purpose is two-fold: (1) design a rule-based diagram to assist clinicians in the classification process and (2) investigate a computer classifier which improves the classification time and accuracy. The rule-based diagram's efficiency was evaluated in a series of scoliotic classifications by 10 clinicians. The computer classifier was tested on a radiographic measurement database of 603 patients. Classification accuracy was 93% using the rule-based diagram and 99% for the computer classifier. Both the computer classifier and the rule-based diagram can efficiently assist clinicians in their Lenke classification of spine scoliosis.
Abstract: The charge-pump circuit is an important component in a phase-locked loop (PLL). The charge-pump converts Up and Down signals from the phase/frequency detector (PFD) into current. A conventional CMOS charge-pump circuit consists of two switched current sources that pump charge into or out of the loop filter according to two logical inputs. The mismatch between the charging current and the discharging current causes phase offset and reference spurs in a PLL. We propose a new charge-pump circuit that reduces the current mismatch by using a regulated cascode circuit. The proposed charge-pump circuit is designed and simulated in Spectre with TSMC 0.18-μm 1.8-V CMOS technology.
Abstract: The amounts of radioactivity in igneous rocks
have been investigated; samples were collected from a total of eight
basalt rock types in the northeastern Kurdistan region of Iraq. The
activity concentrations of the 226Ra (238U) series, the 228Ac (232Th)
series, 40K, and 137Cs were measured using planar HPGe and NaI(Tl)
detectors. Across the study area, the radium equivalent activities Raeq
of the samples under investigation were found in the range of 22.16 to
77.31 Bq/kg, with an average value of 44.8 Bq/kg; this value is well
below the internationally accepted limit of 370 Bq/kg. To estimate
the health effects of this natural radioactive composition, the average
values of the absorbed gamma dose rate D (55 nGy/h), the indoor and
outdoor annual effective dose rates Eied (0.11 mSv/y) and Eoed
(0.03 mSv/y), the external hazard index Hex (0.138), the internal hazard
index Hin (0.154), and the representative level index Iγr (0.386) have been
calculated and found to be lower than the worldwide average values.
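The radium equivalent activity quoted above is conventionally computed as a weighted sum of the three activity concentrations; the standard weights reflect that 370 Bq/kg of 226Ra, 259 Bq/kg of 232Th, and 4810 Bq/kg of 40K each deliver the same gamma dose. A minimal sketch, with illustrative input values rather than the measured sample data:

```python
def radium_equivalent(a_ra, a_th, a_k):
    """Radium equivalent activity Raeq in Bq/kg from the activity
    concentrations (Bq/kg) of the 226Ra series, the 232Th series and
    40K, using the standard formula
    Raeq = A_Ra + 1.43*A_Th + 0.077*A_K."""
    return a_ra + 1.43 * a_th + 0.077 * a_k

# Illustrative concentrations only (not the paper's measured values):
print(round(radium_equivalent(20.0, 15.0, 200.0), 2))  # 56.85
```

A sample is conventionally considered acceptable for building materials when Raeq stays below the 370 Bq/kg limit cited in the abstract.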
Abstract: Fake finger submission attack is a major problem in fingerprint recognition systems. In this paper, we introduce a liveness detection method based on multiple static features derived from a single fingerprint image. The static features comprise individual pore spacing, residual noise, and several first-order statistics. Specifically, a correlation filter is adopted to measure individual pore spacing. The multiple static features are useful for reflecting the physiological and statistical characteristics of live and fake fingerprints. The classification is made by calculating a liveness score from each feature and fusing the scores through a classifier. On our dataset, we compare nine classifiers; the best classification rate of 85% is attained using a Reduced Multivariate Polynomial classifier. Our approach is fast and convenient for liveness checking in field applications.
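The paper fuses the per-feature scores with a trained classifier (the Reduced Multivariate Polynomial model performing best), whose details are not given in the abstract. As a stand-in, the sketch below shows the simplest possible fusion rule, a normalized weighted sum with a threshold; the weights, threshold, and scores are hypothetical.

```python
def fuse_liveness_scores(scores, weights, threshold=0.5):
    """Combine per-feature liveness scores (e.g. pore spacing, residual
    noise, first-order statistics) into a live/fake decision.

    Weighted-sum fusion is an illustrative stand-in for the trained
    classifier used in the paper; weights and threshold are assumed.
    """
    fused = sum(w * s for w, s in zip(weights, scores)) / sum(weights)
    return "live" if fused >= threshold else "fake"

# Hypothetical scores for three feature groups:
print(fuse_liveness_scores([0.9, 0.7, 0.8], [1.0, 0.5, 1.0]))  # live
print(fuse_liveness_scores([0.2, 0.3, 0.1], [1.0, 0.5, 1.0]))  # fake
```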
Abstract: The increasing usage of antibiotics in the animal
farming industry is an emerging worldwide problem contributing to
the development of antibiotic resistance. The purpose of this work was
to investigate the prevalence and antibiotic resistance profile of
bacterial isolates collected from aquatic environments and meats in a
peri-urban community in Daejeon, Korea. In an antibacterial
susceptibility test, the bacterial isolates showed a high incidence of
resistance (~26.04%) to cefazolin, tetracycline, gentamicin,
norfloxacin, erythromycin and vancomycin. The results from a test for
multiple antibiotic resistance indicated that the isolates were
displaying an approximately 5-fold increase in the incidence of
multiple antibiotic resistance to combinations of two different
antibiotics compared to combinations of three or more antibiotics.
Most of the isolates showed multi-antibiotic resistance, and the
resistance patterns were similar among the sampling groups.
Sequencing data analysis of 16S rRNA showed that most of the
resistant isolates appeared to be dominated by the classes
Betaproteobacteria and Gammaproteobacteria in the phylum
Proteobacteria.
Abstract: This paper aims to provide a conceptual framework for examining the competitive disadvantage of banks that suffer from poor performance. Banks generate revenues mainly from the interest rate spread on taking deposits and making loans while collecting fees in the process. To maximize firm value, banks seek loan growth and expense control while managing the risk associated with loans with respect to non-performing borrowers or a narrowing interest spread between assets and liabilities. Competitive disadvantage refers to the failure to access imitable resources and to build managing capabilities to gain sustainable returns given appropriate risk management. A four-quadrant framework of organizational typology is proposed to examine the features of competitive disadvantage in the banking sector, together with a resource configuration model, extracted from CAMEL indicators, to examine the underlying features of bank failures.
Abstract: In this research the separation efficiency of a deoiling hydrocyclone is evaluated using three-dimensional simulation of multiphase flow based on an Eulerian-Eulerian finite volume method. The mixture approach of the Reynolds Stress Model is also employed to capture the features of turbulent multiphase swirling flow. The obtained separation efficiency of Colman's design is compared with available experimental data, showing that the separation curve of deoiling hydrocyclones can be predicted by numerical simulation.
Abstract: Nowadays, there is little information concerning heat
shield systems, and this information is not completely reliable in
many cases; for example, precise calculations cannot be done for
various materials. In addition, full-scale testing has two
disadvantages, high cost and low flexibility, and a new test must be
performed for each case. Hence, a numerical modeling program
that calculates the surface recession rate and the interior temperature
distribution is necessary. A numerical solution of the governing
equation for non-charring material ablation is presented in order to
predict the recession rate and the heat response of non-charring
heat shields. The governing equation is nonlinear, and the Newton-
Raphson method along with the TDMA algorithm is used to solve this
nonlinear equation system. Using the Newton-Raphson method to
solve the governing equation is one of the advantages of the
solution method, because this method is simple and can easily be
generalized to more difficult problems. The obtained results are
compared with reliable sources in order to examine the accuracy of
the compiled code.
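The Newton-Raphson iteration the abstract relies on can be sketched on a scalar surrogate problem. The energy-balance-style equation below (radiative plus convective loss equal to an input flux) is purely illustrative; it is not the paper's ablation model, whose governing equations are not given in the abstract.

```python
def newton_raphson(f, df, x0, tol=1e-10, max_iter=50):
    """Solve f(x) = 0 by Newton-Raphson iteration: x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# Illustrative surrogate: find the surface temperature T where
# radiation + convection balances an imposed heat flux q.
sigma, h, q = 5.67e-8, 50.0, 1.0e5          # assumed coefficients
f = lambda T: sigma * T**4 + h * T - q       # residual
df = lambda T: 4 * sigma * T**3 + h          # analytic derivative
T = newton_raphson(f, df, 1000.0)
print(round(T, 1))
```

The quadratic convergence of the iteration near the root is what makes the method attractive for the nonlinear equation system described above, and the same pattern generalizes directly to vector-valued residuals with a Jacobian in place of `df`.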
Abstract: Fluid flow and heat transfer over a vertical full cone
embedded in porous media are studied in this paper. The nonlinear
differential equation arising from the similarity solution of an inverted
cone (subject to wall temperature boundary conditions) embedded in a
porous medium is solved using a hybrid neural network-particle
swarm optimization method.
To this end, a trial solution of the differential equation is
defined as the sum of two parts. The first part satisfies the initial/
boundary conditions and contains no adjustable parameters, while
the second part is constructed so as not to affect the
initial/boundary conditions and involves the adjustable parameters (the
weights and biases) of a multi-layer perceptron neural network.
Particle swarm optimization (PSO) is applied to find the adjustable
parameters of the trial solution. The obtained solution shows
remarkable accuracy in comparison with numerical ones.
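The two-part trial solution can be sketched on a simple surrogate ODE (y' = -y, y(0) = 1), standing in for the paper's similarity equation, which the abstract does not give. The network size, the PSO coefficients, and the surrogate equation are all illustrative assumptions:

```python
import math
import random

def mlp(x, params):
    """Single-hidden-layer perceptron, 5 tanh units; params packs
    (input weight, bias, output weight) per unit: 15 numbers."""
    return sum(params[3*i + 2] * math.tanh(params[3*i] * x + params[3*i + 1])
               for i in range(5))

def trial(x, params):
    """Two-part trial solution y_t(x) = 1 + x*N(x): the first term
    satisfies y(0) = 1 and has no adjustable parameters, while x*N(x)
    vanishes at x = 0, so the condition holds for any weights."""
    return 1.0 + x * mlp(x, params)

def residual(params, xs, h=1e-4):
    """Mean squared residual of the surrogate ODE y' + y = 0,
    with the derivative taken by central differences."""
    err = 0.0
    for x in xs:
        dy = (trial(x + h, params) - trial(x - h, params)) / (2 * h)
        err += (dy + trial(x, params)) ** 2
    return err / len(xs)

def pso(cost, dim, n_particles=30, iters=150, seed=0):
    """Minimal particle swarm optimizer (inertia 0.7, c1 = c2 = 1.5)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pcost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = pos[i][:], c
                if c < gcost:
                    gbest, gcost = pos[i][:], c
    return gbest, gcost

xs = [i / 10 for i in range(11)]              # collocation points on [0, 1]
params, err = pso(lambda p: residual(p, xs), dim=15)
print(err, abs(trial(1.0, params) - math.exp(-1.0)))
```

By construction, `trial(0.0, params)` equals 1.0 exactly for any parameter vector, which is the point of the two-part decomposition: the optimizer only has to drive the equation residual down, never to fight the boundary condition.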
Abstract: Recently, a lot of attention has been devoted to
advanced techniques of system modeling. PNN (polynomial neural
network) is a GMDH-type algorithm (Group Method of Data
Handling) and is one of the useful methods for modeling nonlinear
systems, but PNN performance depends strongly on the number of
input variables and the order of the polynomial, which are determined
by trial and error. In this paper, we introduce GPNN (genetic
polynomial neural network) to improve the performance of PNN.
GPNN determines the number of input variables and the order of all
neurons with a GA (genetic algorithm). We use the GA to search over
all possible values of the number of input variables and the order of
the polynomial. GPNN performance is evaluated on two nonlinear
systems: a quadratic equation and the Dow Jones stock index time
series.
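The GA search over the two hyperparameters can be sketched with a toy fitness function. In GPNN the fitness would be the validation error of the PNN built from the encoded hyperparameters; here a made-up surrogate with a known optimum at 2 inputs and order 2 stands in, so everything below except the encoding idea is an illustrative assumption.

```python
import random

def fitness(genes):
    """Toy stand-in for (negative) PNN modeling error: the best neuron
    hypothetically uses 2 input variables and a 2nd-order polynomial."""
    n_inputs, order = genes
    return -abs(n_inputs - 2) - abs(order - 2)

def gpnn_search(pop_size=20, generations=30, seed=1):
    """GA over chromosomes encoding (number of inputs, polynomial order)."""
    rng = random.Random(seed)
    pop = [[rng.randint(1, 4), rng.randint(1, 3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [a[0], b[1]]                 # crossover: one gene from each
            if rng.random() < 0.2:               # mutation: redraw one gene
                g = rng.randrange(2)
                child[g] = rng.randint(1, 4) if g == 0 else rng.randint(1, 3)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print(gpnn_search())
```

Because the elitist step always carries the best chromosome forward, the search converges to the surrogate optimum `[2, 2]`, replacing the trial-and-error tuning the abstract criticizes in plain PNN.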
Abstract: Chest pain is one of the most prevalent complaints
among adults that bring people to medical centers. The
aim was to determine the prevalence and risk factors of chest pain
among people over 30 years old in Tehran. In this cross-sectional
study, 787 adults took part from April 2005 until April 2006. The
sampling method was random cluster sampling with 25
clusters. In each cluster, interviews were performed with 32 people
over 30 years old living in the selected houses. In cases with chest
pain, extra questions were asked. The prevalence of CP was 9% (71
cases). Of them, 21 cases (6.5%) were in the 41-60 year age range and
the remainder were over 61 years old. 19 cases (26.8%) mentioned CP
at rest, and all of the cases had exertion-onset CP. The CP duration was
10 minutes or less in all of the cases, and in most of them (84.5%) the
location of the pain was the left anterior part of the chest, the left
anterior part of the sternum, and/or the left arm. There was a positive
history of myocardial infarction in 12 cases (17%). There were
significant relations between CP and age and sex, and between CP and
history of myocardial infarction and the marital status of the study
population. Our results are similar to those of other studies in most
parts; however, it is necessary to perform supplementary tests and
follow-up studies to differentiate exactly between cardiac and
non-cardiac CP.
Abstract: Thyroid cancer's overall contribution to the
worldwide cancer burden is relatively small, but incidence rates have increased over the last three decades throughout the world. This trend has been hypothesised to reflect both technological advances enabling increased detection and changes in
environmental factors, including population exposure to ionising radiation from fallout, diagnostic tests, and treatment for benign and
malignant conditions. The thyroid dose received while apparently
shielded by Cerrobend blocks was about 8 cGy per 100 cGy of exposure.
Abstract: To fight against the economic crisis, the French
Government, like many others in Europe, has decided to give a boost
to high-speed line projects. This paper explores the implementation
and decision-making process in TGV projects and their evolution,
especially since the Mediterranean TGV-line. This project was
probably the most controversial, but paradoxically represents today a
huge success for all the actors involved.
What lessons can we learn from this experience? How can we
evaluate the impact of this project on TGV-line planning? How can
we characterize this implementation and decision-making process
with regard to the sustainability challenges?
The construction of the Mediterranean TGV-line was the occasion
for several innovations: introducing more dialogue into the
decision-making process, taking the environment into account, and
introducing new project management and technological innovations.
That is why this project appears today as an example in terms of the
integration of sustainable development.
In this paper we examine the different kinds of innovations
developed in this project, by using concepts from sociology of
innovation to understand how these solutions emerged in a
controversial situation. Then we analyze the lessons drawn from this
decision-making process (both immediately and a posteriori) and the
way in which procedures evolved: the creation of new tools and
devices (public consultation, project management...).
Finally, we try to highlight the impact of this evolution on the
governance of TGV projects. In particular, new methods of
implementation and financing involve a reconfiguration of the
system of actors. The
aim of this paper is to define the impact of this reconfiguration on
negotiations between stakeholders.
Abstract: Most scientific programs have large input and output
data sets that require out-of-core programming or use virtual memory
management (VMM). Out-of-core programming is very error-prone
and tedious; as a result, it is generally avoided. However, in many
instances, VMM is not an effective approach because it often results
in substantial performance reduction. In contrast, compiler-driven I/O
management allows a program's data sets to be retrieved in parts,
called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a
compiler combined with a user-level runtime system that can be used
to replace standard VMM for out-of-core programs. We describe
Comanche and demonstrate on a number of representative problems
that it substantially outperforms VMM. Significantly, our system
does not require any special services from the operating system and
does not require modification of the operating system kernel.
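The block/tile retrieval idea can be illustrated with a hand-written out-of-core reduction of the kind Comanche would arrange automatically; the block size and file format here are arbitrary choices for the sketch, not anything from the Comanche system itself.

```python
import os
import struct
import tempfile

def sum_out_of_core(path, block_doubles=1024):
    """Stream a large array of doubles from disk in fixed-size blocks
    ('tiles'), keeping only one block in memory at a time -- the access
    pattern a compiler-managed cache would set up for the program."""
    total = 0.0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(block_doubles * 8)    # one tile of raw bytes
            if not chunk:
                break
            total += sum(struct.unpack(f"{len(chunk) // 8}d", chunk))
    return total

# Write a 10,000-element data set, then reduce it block by block.
path = os.path.join(tempfile.mkdtemp(), "data.bin")
with open(path, "wb") as f:
    for i in range(10000):
        f.write(struct.pack("d", float(i)))
print(sum_out_of_core(path))  # 49995000.0
```

Unlike VMM, which faults in fixed pages on demand, explicit blocking like this lets the block size and traversal order be matched to the computation, which is the source of the performance advantage the abstract reports.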
Abstract: We study the spatial design of experiments, where we want to select a most informative subset, of prespecified size, from a set of correlated random variables. The problem arises in many applied domains, such as meteorology, environmental statistics, and statistical geology. In these applications, observations can be collected at different locations and possibly at different times. In spatial design, when the design region and the set of interest are discrete, the covariance matrix completely describes any objective function, and our goal is to choose a feasible design that minimizes the resulting uncertainty. The problem is recast as that of maximizing the determinant of the covariance matrix of the chosen subset; this problem is NP-hard. When such designs are used in computer experiments, the design space is often very large, and it is not possible to calculate the exact optimal solution. Heuristic optimization methods can discover efficient experimental designs in situations where traditional designs cannot be applied, exchange methods are ineffective, and an exact solution is not possible. We developed a genetic algorithm (GA) to take advantage of its exploratory power, and demonstrate its successful application on a large design space in a real case of design of experiments.
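A miniature version of the subset-selection problem can be sketched end to end: candidate locations on a line, an exponential covariance kernel (an assumed model; the abstract does not specify one), and a small GA maximizing the determinant of the selected submatrix. Population size, rates, and the kernel are all illustrative.

```python
import math
import random

def det(m):
    """Determinant by Gaussian elimination with partial pivoting."""
    a = [row[:] for row in m]
    n, d = len(a), 1.0
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(a[r][i]))
        if abs(a[p][i]) < 1e-12:
            return 0.0
        if p != i:
            a[i], a[p] = a[p], a[i]
            d = -d
        d *= a[i][i]
        for r in range(i + 1, n):
            f = a[r][i] / a[i][i]
            for c in range(i, n):
                a[r][c] -= f * a[i][c]
    return d

def cov_matrix(xs, scale=2.0):
    """Assumed exponential covariance kernel between locations."""
    return [[math.exp(-abs(p - q) / scale) for q in xs] for p in xs]

def sub_det(cov, idx):
    """Determinant of the covariance submatrix of the chosen subset."""
    return det([[cov[i][j] for j in idx] for i in idx])

def ga_subset(cov, k, pop_size=30, gens=60, seed=3):
    """GA over size-k index subsets, maximizing sub_det."""
    rng = random.Random(seed)
    n = len(cov)
    pop = [sorted(rng.sample(range(n), k)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda s: sub_det(cov, s), reverse=True)
        keep = pop[: pop_size // 2]              # elitist truncation
        children = []
        while len(children) < pop_size - len(keep):
            a, b = rng.sample(keep, 2)
            pool = list(set(a) | set(b))         # crossover: mix parent indices
            child = sorted(rng.sample(pool, k))
            if rng.random() < 0.3:               # mutation: swap one index
                out = rng.choice(child)
                new = rng.choice([i for i in range(n) if i not in child])
                child = sorted(set(child) - {out} | {new})
            children.append(child)
        pop = keep + children
    return max(pop, key=lambda s: sub_det(cov, s))

cov = cov_matrix(range(8))          # 8 candidate locations on a line
best = ga_subset(cov, 3)
print(best, round(sub_det(cov, best), 4))
```

With this kernel, widely separated locations are least correlated, so the determinant is maximized by spreading the chosen sites apart; on a space this small the GA result can be checked against exhaustive enumeration.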