Abstract: Many image watermarking methods exploiting properties of the human visual system (HVS) have been proposed in the literature. The visual threshold component is usually related to either the spatial contrast sensitivity function (CSF) or visual masking. In the case of contrast masking in particular, most methods do not address the effect near edge regions, even though the HVS is sensitive to what happens in edge areas. This paper proposes ultrasound image watermarking using a visual threshold corresponding to the HVS, in which the coefficients of each DCT block are classified as belonging to texture, edge, or plain areas. This classification not only preserves imperceptibility when the watermark is inserted into an image but also achieves robust watermark detection. A comparison of the proposed method with other methods shows that it is robust to blockwise memoryless manipulations as well as to noise addition.
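As an illustration of the block-classification idea, the following sketch labels an 8x8 DCT block as plain, edge, or texture from the split between low- and high-frequency AC energy. The frequency bands and the threshold value are assumptions chosen for illustration, not the paper's actual classifier.

```python
def classify_block(dct_block):
    """Heuristically classify an 8x8 DCT block as 'plain', 'edge', or 'texture'.

    Low diagonal indices (u+v <= 3) stand in for edge-dominated energy and
    higher ones for texture; the band split and the activity threshold of 100
    are illustrative assumptions only.
    """
    low = sum(dct_block[u][v] ** 2 for u in range(8) for v in range(8)
              if 0 < u + v <= 3)
    high = sum(dct_block[u][v] ** 2 for u in range(8) for v in range(8)
               if u + v > 3)
    if low + high < 100.0:        # almost no AC energy: flat region
        return "plain"
    return "edge" if low > high else "texture"
```

A plain block would then receive a smaller visual threshold (less watermark energy) than a texture block, matching the imperceptibility goal described above.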
Abstract: Requirements analysis, modeling, and simulation have consistently been among the main challenges in the development of complex systems. Scenarios and state machines are two successful models for describing the behavior of an interactive system. Scenarios represent examples of system execution in the form of sequences of messages exchanged between objects, and give only a partial view of the system; state machines, in contrast, can represent the overall system behavior. Automating the translation of scenarios into state machines provides answers to various problems such as system behavior validation and scenario consistency checking. In this paper, we propose a method for translating scenarios into state machines represented in the Discrete EVent system Specification (DEVS) formalism, together with a procedure to detect implied scenarios. Each induced DEVS model represents the behavior of one object of the system. The global system behavior is described by coupling the atomic DEVS models and is validated through simulation. We improve the validation process by integrating formal methods to eliminate logical inconsistencies in the global model; to that end, we use the Z notation.
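To make the notion of an induced atomic DEVS model concrete, here is a minimal sketch for one object whose scenario says "on receiving `req`, reply with `ack` after one time unit". The message names, states, and Python structure are hypothetical illustrations, not the paper's actual translation rules.

```python
import math

class SenderDEVS:
    """Toy atomic DEVS model induced from a single scenario (illustrative).

    Follows the classic DEVS interface: external transition, time advance,
    output function, and internal transition.
    """
    def __init__(self):
        self.state = "idle"

    def time_advance(self):                 # ta: how long to stay in the state
        return 1.0 if self.state == "replying" else math.inf

    def external(self, msg):                # delta_ext: react to an input message
        if self.state == "idle" and msg == "req":
            self.state = "replying"

    def output(self):                       # lambda: emitted just before delta_int
        return "ack" if self.state == "replying" else None

    def internal(self):                     # delta_int: autonomous transition
        if self.state == "replying":
            self.state = "idle"
```

Coupling several such models (one per object) and driving them with a simulator would then yield the global behavior described in the abstract.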
Abstract: In contrast to existing methods for calculating the temperature field of a profile part of a blade with convective cooling, which do not take multiconnectivity into account in the broad sense of the term, we develop mathematical models and a highly effective combined (BIEM and FDM) numerical method that is efficient to implement on a PC. The theoretical substantiation of these methods is proved by appropriate theorems.
Abstract: We present a structural study of an aqueous electrolyte for which experimental results are available: a solution of LiCl-6H2O type in the glassy state (120 K), contrasted with pure water at room temperature, by means of partial distribution functions (PDFs) derived from the neutron scattering technique. Based on these partial functions, the Reverse Monte Carlo (RMC) method computes radial and angular correlation functions that allow exploring a number of structural features of the system. The obtained curves include some artifacts; to remedy this, we propose introducing a screened potential as an additional constraint. The results show good agreement between the experimental and computed functions and a significant improvement in the PDF curves under the potential constraint, suggesting an efficient fit of the pair distribution function curves.
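The core RMC loop with a potential constraint can be sketched as follows: moves are accepted by a Metropolis rule on a cost combining the misfit to the experimental PDFs with a screened (Yukawa-type) potential penalty. Parameter values and function shapes here are illustrative assumptions, not the study's fitted values.

```python
import math
import random

def chi_squared(g_model, g_exp, sigma=0.05):
    """Misfit between computed and experimental pair distribution functions."""
    return sum((m - e) ** 2 for m, e in zip(g_model, g_exp)) / sigma ** 2

def screened_potential(r, q1q2=1.0, kappa=2.0):
    """Yukawa-type screened Coulomb term used as the additional RMC constraint.

    kappa is an assumed screening parameter; kappa = 0 recovers a bare 1/r.
    """
    return q1q2 * math.exp(-kappa * r) / r

def rmc_accept(cost_old, cost_new, rng=random.random):
    """Metropolis rule: always accept improvements, occasionally accept worse
    configurations to avoid getting trapped in artifacts."""
    if cost_new <= cost_old:
        return True
    return rng() < math.exp(cost_old - cost_new)
```

In a full run, `cost` would be `chi_squared(...)` plus a weighted sum of `screened_potential` over ion pairs; the penalty is what suppresses the unphysical configurations that produce artifacts in the unconstrained fit.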
Abstract: The Support Vector Machine (SVM) is a statistical learning tool initially developed by Vapnik in 1979 and later extended into the more general framework of structural risk minimization (SRM). SVMs play an increasing role in detection problems across engineering, notably in statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, the SVM is applied to the detection of medical ultrasound images in the presence of partially developed speckle noise. Simulations were carried out for single-look and multi-look speckle models to give a complete overview of, and insight into, the proposed SVM-based detector. The structure of the SVM was derived, applied to clinical ultrasound images, and its performance evaluated in terms of the mean square error (MSE) metric. We show that the SVM-detected ultrasound images have a very low MSE and are of good quality, and that the quality of the processed speckled images improves for the multi-look model. Furthermore, the contrast of the SVM-detected images was higher than that of the original non-noisy images, indicating that the SVM approach increased the distance between the pixel reflectivity levels (the detection hypotheses) in the original images.
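The MSE metric used to evaluate the detector, and the multi-look averaging that underlies the multi-look speckle model, can be sketched as follows (flat lists stand in for images; purely illustrative, not the paper's implementation):

```python
def mse(reference, detected):
    """Mean square error between a reference image and the detected image."""
    return sum((r - d) ** 2 for r, d in zip(reference, detected)) / len(reference)

def multilook(looks):
    """Average L single-look speckled images pixel-wise.

    For fully developed speckle the variance drops roughly as 1/L, which is
    why the multi-look model yields better-quality processed images.
    """
    n, L = len(looks[0]), len(looks)
    return [sum(look[i] for look in looks) / L for i in range(n)]
```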
Abstract: The paper focuses on enhanced stiffness modeling of robotic manipulators that takes into account the influence of an external force/torque acting upon the end point. It implements the virtual joint technique, which describes the compliance of manipulator elements by a set of localized six-dimensional springs separated by rigid links and perfect joints. In contrast to the conventional formulation, which is valid only for the unloaded mode and small displacements, the proposed approach takes into account that the loading leads to non-negligible changes of the manipulator posture and a corresponding amendment of the Jacobian. The developed numerical technique allows computing the static equilibrium and the relevant force/torque reaction of the manipulator for any given displacement of the end-effector. This enables the designer to detect essentially nonlinear effects in the elastic behavior of the manipulator, similar to the buckling of beam elements. A linearization procedure is also proposed, based on the inversion of a dedicated matrix composed of the stiffness parameters of the virtual springs and the Jacobians/Hessians of the active and passive joints. The developed technique is illustrated by an application example dealing with the stiffness analysis of a parallel manipulator of the Orthoglide family.
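A one-dimensional toy version of the static-equilibrium computation is a Newton iteration balancing an elastic restoring force against an external load. The real method operates on the full six-dimensional virtual-spring model with Jacobians/Hessians; this scalar sketch only illustrates the iteration structure.

```python
def static_equilibrium(force_ext, spring_force, x0=0.0, tol=1e-9, h=1e-6):
    """Find the displacement x where spring_force(x) balances force_ext.

    Newton iteration with a central-difference derivative; spring_force may
    be nonlinear, which is how buckling-like effects would show up.
    """
    x = x0
    for _ in range(100):
        r = spring_force(x) - force_ext          # residual force
        if abs(r) < tol:
            return x
        drdx = (spring_force(x + h) - spring_force(x - h)) / (2 * h)
        x -= r / drdx                            # Newton update
    return x
```

For a linear spring `F = k*x` the iteration converges in one step to `F_ext/k`; a softening or non-monotone `spring_force` is where multiple equilibria and nonlinear effects appear.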
Abstract: A wide spectrum of systems require reliable personal recognition schemes to either confirm or determine the identity of an individual. This paper considers multimodal biometric systems and their applicability to access control, authentication, and security applications. Strategies for feature extraction and sensor fusion are considered and contrasted. Issues related to performance assessment, deployment, and standardization are discussed. Finally, future directions of biometric systems development are outlined.
Abstract: Color categorization is shared among the members of a society; this allows colors to be communicated, especially in natural language such as English. Hence a sociable robot, in order to coexist with humans in human society, must also possess this shared color categorization. To achieve this, much previous work has relied on models of human color perception and considerable mathematical complexity. In contrast, in this work the computer, acting as the brain of the robot, learns color categorization through interaction with humans, without much mathematical complexity.
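One simple way such interactive learning could work, sketched under assumptions of our own (a running mean per category and nearest-mean naming; the paper's actual mechanism may differ), is:

```python
class ColorLearner:
    """Learns color categories from human-labelled RGB examples.

    Keeps a running mean per category name and names new colors by the
    nearest mean (squared Euclidean distance in RGB).
    """
    def __init__(self):
        self.means = {}   # name -> (r, g, b, count)

    def teach(self, rgb, name):
        """A human shows a color and says its name."""
        r0, g0, b0, n = self.means.get(name, (0.0, 0.0, 0.0, 0))
        n += 1
        self.means[name] = (r0 + (rgb[0] - r0) / n,
                            g0 + (rgb[1] - g0) / n,
                            b0 + (rgb[2] - b0) / n, n)

    def name(self, rgb):
        """The robot names a color by its closest learned category."""
        def d2(m):
            return sum((a - b) ** 2 for a, b in zip(rgb, m[:3]))
        return min(self.means, key=lambda k: d2(self.means[k]))
```

No perceptual model is required: the categories emerge entirely from the examples the human provides, which is the point of the interactive approach.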
Abstract: In this paper a procedure for the split-pipe design of looped water distribution networks based on simulated annealing is proposed. Simulated annealing is a heuristic search algorithm motivated by an analogy with the physical annealing of solids, and it is capable of solving combinatorial optimization problems. In contrast to split-pipe designs derived from a continuous-diameter design, as implemented in conventional optimization techniques, the split-pipe design proposed in this paper is derived from a discrete-diameter design in which pipe diameters are chosen directly from a specified set of commercial pipes. The optimality and feasibility of the solutions are guaranteed by the proposed method. Its performance is demonstrated by solving three well-known water distribution network problems taken from the literature. Simulated annealing provides very promising solutions, and the lowest-cost solutions are found for all of these test problems. The results obtained from these applications show that simulated annealing is able to handle the combinatorial optimization problem of least-cost water distribution network design. The technique can be considered an alternative tool for similar areas of research, and further applications and improvements of the technique are expected.
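The discrete-diameter search can be sketched as a generic simulated annealing loop over a vector of commercial diameters. The cost function, neighbourhood move, and cooling schedule below are illustrative assumptions; a real run would use the hydraulic network cost and feasibility checks.

```python
import math
import random

def simulated_annealing(cost, candidates, x0, t0=10.0, alpha=0.95,
                        steps=500, seed=1):
    """Minimize cost over a discrete design space (e.g. pipe diameters).

    Neighbour move: reassign one pipe to a random commercial diameter.
    Acceptance: Metropolis rule with geometric cooling t <- alpha * t.
    """
    rng = random.Random(seed)
    x, best = list(x0), list(x0)
    t = t0
    for _ in range(steps):
        y = list(x)
        y[rng.randrange(len(y))] = rng.choice(candidates)
        d = cost(y) - cost(x)
        if d <= 0 or rng.random() < math.exp(-d / t):
            x = y                      # accept improving or lucky move
        if cost(x) < cost(best):
            best = list(x)             # track incumbent best design
        t *= alpha
    return best
```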
Abstract: In this paper, a robust digital image watermarking scheme for copyright protection applications using the singular value decomposition (SVD) is proposed. In this scheme, an entropy masking model is applied to the host image for texture segmentation. Moreover, the local luminance and texture of the host image are considered in the watermark embedding procedure to increase the robustness of the watermarking scheme. In contrast to existing SVD-based watermarking systems, which have been designed to embed visual watermarks, our system uses a pseudo-random sequence as the watermark. We have tested the performance of our method using a wide variety of image processing attacks on different test images. A comparison is made between the results of our proposed algorithm and those of a wavelet-based method to demonstrate the superior performance of our algorithm.
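The pseudo-random-sequence idea can be illustrated with a generic spread-spectrum embed/detect pair: a keyed ±1 sequence modulates coefficients multiplicatively, and detection correlates against the same keyed sequence. This is a common baseline technique, not the paper's SVD-domain scheme; names and the strength parameter are illustrative.

```python
import random

def embed(coeffs, key, alpha=0.05):
    """Multiplicatively embed a keyed pseudo-random +/-1 sequence.

    alpha is the embedding strength (an assumed value); in the paper's scheme
    the strength would instead be driven by local luminance and texture.
    """
    rng = random.Random(key)
    w = [rng.choice((-1, 1)) for _ in coeffs]
    return [c * (1 + alpha * wi) for c, wi in zip(coeffs, w)], w

def detect(coeffs, key):
    """Correlate coefficients with the keyed sequence.

    A response significantly above that of unmarked content indicates the
    watermark is present (only the key holder can compute the sequence).
    """
    rng = random.Random(key)
    w = [rng.choice((-1, 1)) for _ in coeffs]
    return sum(c * wi for c, wi in zip(coeffs, w)) / len(coeffs)
```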
Abstract: This paper examines two policy spaces, the ARC and TVA, and their spatialized politics. The research observes that the regional concept informs public policy and can contribute to the formation of stable policy initiatives. Using the subsystem framework to understand the political viability of policy regimes, the authors conclude that policy geographies appealing to traditional definitions of regions are more stable over time. In contrast, geographies that fail to reflect pre-existing representations of space are engaged in more competitive subsystem politics. The paper demonstrates that the spatial practices of policy regions and their directional politics influence the political viability of programs, and concludes that policy spaces should institutionalize pre-existing geographies rather than manufacture new ones.
Abstract: In Southeast Asia, forest fires in Indonesia emit pollutants into the atmosphere during the dry season (August to October). Over two years during this period, a total of 67 samples of 2.5 μm particulate matter were collected and analyzed for total mass and elemental composition by ICP-MS after microwave digestion. A study of 60 elements measured during these periods suggests that the concentrations of most elements, even those usually attributed to a crustal source, are extremely high and unpredictable during the haze period. By contrast, trace element concentrations in non-haze months are more stable and cover a lower range. Other unexpected events and their effects on the findings are discussed.
Abstract: This study investigates errors that emerged in written texts produced by 30 Turkish EFL learners, from an explanatory and thus qualitative perspective. Erroneous language elements were first identified by the researcher, and their grammaticality and intelligibility were then checked by five native speakers of English. The analysis of the data showed that it is difficult to claim that an error stems from a single factor, since different features of an error are triggered by different factors. Our findings revealed two types of errors: those which stem from the interference of L1 with L2, and those which are developmental. The former type contains more global errors, whereas the errors of the latter type are more intelligible.
Abstract: We have previously introduced an ultrasonic imaging approach that combines harmonic-sensitive pulse sequences with a post-beamforming quadratic kernel derived from a second-order Volterra filter (SOVF). This approach is designed to produce images with high sensitivity to nonlinear oscillations from microbubble ultrasound contrast agents (UCA) while maintaining high levels of noise rejection. In this paper, we present a two-step algorithm for computing the coefficients of the quadratic kernel that reduces the tissue component introduced by motion, maximizes noise rejection, and increases specificity while optimizing sensitivity to the UCA. In the first step, quadratic kernels from individual singular modes of the PI data matrix are compared in terms of their ability to maximize the contrast-to-tissue ratio (CTR). In the second step, the quadratic kernels yielding the highest CTR values are convolved. The imaging results indicate that a signal processing approach to this clinical challenge is feasible.
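The CTR criterion used to rank the kernels is simply the ratio of mean signal power in the contrast-agent region to that in the tissue region, expressed in dB. A minimal sketch (flat pixel lists, illustrative only):

```python
import math

def ctr_db(contrast_pixels, tissue_pixels):
    """Contrast-to-tissue ratio in dB: mean power in the UCA region divided
    by mean power in the tissue region."""
    pc = sum(p ** 2 for p in contrast_pixels) / len(contrast_pixels)
    pt = sum(p ** 2 for p in tissue_pixels) / len(tissue_pixels)
    return 10.0 * math.log10(pc / pt)
```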
Abstract: In this paper, a new formulation for acoustics coupled with linear elasticity is presented. The primary objective of the work is to develop a three-dimensional hp-adaptive finite element code intended for modeling the acoustics of the human head. The code will have numerous applications, e.g. in designing hearing protection devices for individuals working in high-noise environments. The work presented is at a preliminary stage: the variational formulation has been implemented and tested on a sequence of meshes of concentric multi-layer spheres, with material data representing the tissue (the brain), the skull, and the air. Thus, an efficient solver for coupled elasticity/acoustics problems has been developed and tested on high-contrast material data representing the human head.
Abstract: In contrast to existing methods which do not take multiconnectivity into account in the broad sense of the term, we develop mathematical models and highly effective combined (BIEM and FDM) numerical methods for calculating the stationary and quasi-stationary temperature field of a profile part of a blade with convective cooling (from the point of view of realization on a PC). The theoretical substantiation of these methods is proved by appropriate theorems. To this end, converging quadrature processes have been developed and error estimates obtained in terms of A. Zygmund continuity moduli. For the visualization of profiles, the following are used: the least-squares method with automatic conjecture, spline devices, smooth replenishment, and neural nets. Boundary conditions of heat exchange are determined from the solution of the corresponding integral equations and from empirical relationships. The reliability of the designed methods is proved by computational and experimental investigations of the heat and hydraulic characteristics of the first-stage nozzle blade of a gas turbine.
Abstract: Segmentation is an important step in medical image analysis and classification, whether for radiological evaluation or computer aided diagnosis. Computer aided diagnosis (CAD) of lung CT generally first segments the area of interest (the lung) and then analyzes the obtained area separately for nodule detection in order to diagnose the disease. For a normal lung, segmentation can be performed by exploiting the excellent contrast between air and the surrounding tissues. However, this approach fails when the lung is affected by high-density pathology. Dense pathologies are present in approximately a fifth of clinical scans, and for computer analysis such as the detection and quantification of abnormal areas it is vital that the entire lung region of the image is provided and that no part present in the original image is eradicated. In this paper we propose a lung segmentation technique which accurately segments the lung parenchyma from lung CT scan images. The algorithm was tested against 25 datasets of different patients received from Akron University, USA and Aga Khan Medical University, Karachi, Pakistan.
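The contrast-based baseline that works for normal lungs (and fails under dense pathology) amounts to a density threshold on CT attenuation: air-filled lung has strongly negative Hounsfield units while soft tissue is near zero. The threshold value below is an assumed, illustrative choice.

```python
def segment_lung(hu_values, threshold=-400):
    """Label voxels as lung (1) or non-lung (0) by a Hounsfield-unit threshold.

    Air-filled lung parenchyma sits far below soft tissue in HU, so a fixed
    cut (here -400 HU, an assumption) separates the two for normal lungs.
    Dense pathology raises the HU of affected lung and defeats this rule,
    which motivates the technique proposed in the paper.
    """
    return [1 if v < threshold else 0 for v in hu_values]
```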
Abstract: Electron multiplying charge coupled devices (EMCCDs) have revolutionized low-light imaging by introducing on-chip multiplication gain based on the impact ionization effect in silicon. They combine sub-electron readout noise with high frame rates. The signal-to-noise ratio (SNR) is an important performance parameter for low-light-level imaging systems. This work investigates the SNR performance of an EMCCD operated in non-inverted mode (NIMO) and in inverted mode (IMO). The theory of the noise characteristics and operating modes is presented. The results show that at high gain levels the SNR is determined by the dark current and the clock-induced charge. The optimum SNR is provided by an EMCCD operated in NIMO for short-exposure, strongly cooled applications; otherwise, an IMO EMCCD is preferable.
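The standard EMCCD SNR model behind this comparison treats shot noise, dark current, and clock-induced charge (CIC) as Poisson sources inflated by the multiplication excess-noise factor (F² ≈ 2 at high gain), while readout noise is divided by the EM gain. A sketch with illustrative parameter names:

```python
import math

def emccd_snr(signal_e, dark_e, cic_e, read_noise_e, gain, F2=2.0):
    """SNR of an EM-amplified frame, all charge quantities in electrons.

    Shot, dark, and CIC noise are multiplied by the excess-noise factor F^2
    (about 2 at high gain); readout noise is suppressed by the EM gain.
    """
    noise = math.sqrt(F2 * (signal_e + dark_e + cic_e)
                      + (read_noise_e / gain) ** 2)
    return signal_e / noise
```

In this model the NIMO/IMO trade-off enters through `dark_e` and `cic_e`: IMO suppresses dark current but adds CIC, so which mode wins depends on exposure time and cooling, as the abstract states.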
Abstract: Market-based models are frequently used for resource allocation on the computational grid. However, as the size of the grid grows, it becomes difficult for a customer to negotiate directly with all the providers. Middle agents are therefore introduced to mediate between providers and customers and to facilitate the resource allocation process. The most frequently deployed middle agents are matchmakers and brokers. A matchmaking agent finds candidate providers who can satisfy the requirements of a consumer, after which the customer negotiates directly with those candidates. Broker agents mediate the negotiation with the providers in real time.
In this paper we present a new type of middle agent, the marketmaker. Its operation is based on two parallel processes: through the investment process the marketmaker acquires resources and resource reservations in large quantities, while through the resale process it sells them to the customers. The marketmaker exploits the fact that, through its global view of the grid, it can perform a more efficient resource allocation than is possible in one-to-one negotiations between customers and providers.
We present the operation and algorithms governing the marketmaker agent, contrasting it with the matchmaker and broker agents. Through a series of simulations in the task-oriented domain we compare the operation of the three agent types. We find that use of the marketmaker agent leads to better performance in the allocation of large tasks and a significant reduction of the messaging overhead.
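The two parallel processes can be caricatured in a few lines: the marketmaker buys capacity in bulk and resells it from inventory, so customers never negotiate with providers directly. Class and method names, and the pricing, are illustrative assumptions, not the paper's algorithms.

```python
class MarketMaker:
    """Toy marketmaker holding a resource inventory and a cash balance.

    invest(): bulk acquisition from providers (the investment process).
    resale(): serving customer demand from inventory (the resale process).
    """
    def __init__(self):
        self.inventory = 0.0
        self.cash = 0.0

    def invest(self, capacity, unit_price):
        self.inventory += capacity
        self.cash -= capacity * unit_price

    def resale(self, demand, unit_price):
        sold = min(demand, self.inventory)   # cannot sell more than held
        self.inventory -= sold
        self.cash += sold * unit_price
        return sold
```

Because a single `resale` call replaces a round of one-to-one negotiation messages, the inventory model is also where the reported reduction in messaging overhead comes from.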