Abstract: Future astronomical projects on large space x-ray
imaging telescopes require novel substrates and technologies for the
construction of their reflecting mirrors. The mirrors must be
lightweight and precisely shaped to achieve large collecting area with
high angular resolution. The new materials and technologies must be
cost-effective. Currently, the most promising materials are glass or
silicon foils. We focused on precisely shaping these foils by a
thermal forming process. We studied free and forced slumping in the
temperature region of hot plastic deformation and compared the
shapes obtained by the different slumping processes. We measured
the shapes and the surface quality of the foils. In the experiments, we
varied both heat-treatment temperature and time following our
experimental design. The obtained data and relations can be used for
modeling and optimizing the thermal forming procedure.
Abstract: This paper contributes to the field of Environmental
Awareness Training (EAT) evaluation in the context of military activities.
Environmental management of military activities is a growing concern
for defence forces worldwide and the importance of EAT is becoming
widely recognized. As one of Australia's largest landowners, the
Australian Defence Force (ADF) is extremely mindful of its duty as a
joint environmental manager. It has an integrated Environmental
Management System (EMS) to assist environmental management and
EAT is an essential part of the ADF EMS model. This paper examines
how EAT was conducted during the exercise Talisman Saber in 2009
(TS09) and evaluates its effectiveness, using Shoalwater Bay Training
Area (SWBTA), one of the most significant military training areas and
a significant protected area in Australia, as a case study. A
questionnaire survey showed that, overall, EAT was effective from
the perspective of a sample of participants.
Recommendations are made for the ADF to refine EAT for future
exercises.
Abstract: In this paper, an extended study, building on a previous
one, is performed on the effect of different factors on the quality
of vector data. For the noise factor, Gaussian noise, a kind of
noise that appears in document images, is studied, whereas the
previous study involved only salt-and-pepper noise. Both high and
low noise levels are studied. For the noise-cleaning methods,
algorithms not covered in the previous study are used, namely the
median filter and its variants. For the vectorization factor, one of
the best available commercial raster-to-vector software packages,
VPstudio, is used to convert raster images into vector format. The
performance of line detection is judged using an objective
performance evaluation method. The output of the performance
evaluation is then analyzed statistically to highlight the factors
that affect vector quality.
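As a concrete illustration of the noise-cleaning step, the plain median filter replaces each pixel with the median of its 3x3 neighborhood, which removes isolated salt-and-pepper impulses while preserving edges. The following is a minimal sketch; the image values and window size are illustrative assumptions, not data from the study:

```python
from statistics import median

def median_filter_3x3(img):
    """Apply a 3x3 median filter; border pixels are copied unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[yy][xx] for yy in (y - 1, y, y + 1)
                                  for xx in (x - 1, x, x + 1)]
            out[y][x] = median(window)
    return out

# A salt-and-pepper-corrupted patch: isolated 0/255 impulses on a flat 100 background.
noisy = [
    [100, 100, 100, 100],
    [100, 255, 100, 100],
    [100, 100,   0, 100],
    [100, 100, 100, 100],
]
clean = median_filter_3x3(noisy)
```

The impulses at the interior positions are replaced by the neighborhood median (100), since each 3x3 window contains at most two outliers.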
Abstract: In DMVC, more than one source is available for the construction of side information. Newer techniques use both sources simultaneously by constructing a bitmask that determines the source of every block or pixel of the side information. A lot of computation is done to determine each bit in the bitmask. In this paper, we have tried to define areas that can only be well predicted by temporal interpolation and not by multiview interpolation or synthesis. We predict that all areas that are not covered by two cameras cannot be appropriately predicted by multiview synthesis, and if we can identify such areas in the first place, we don't need to go through the full set of computations for all the pixels that lie in those areas. Moreover, this paper also defines a technique based on KLT to mark the above-mentioned areas before any other processing is done on the side view.
Abstract: We present a method to create special domain
collections from news sites. The method only requires a single
sample article as a seed. No prior corpus statistics are needed and the
method is applicable to multiple languages. We examine various
similarity measures and the creation of document collections for
English and Japanese. The main contributions are as follows. First,
the algorithm can build special domain collections from as little as
one sample document. Second, unlike other algorithms it does not
require a second “general” corpus to compute statistics. Third, in our
testing the algorithm outperformed others in creating collections
made up of highly relevant articles.
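The core selection step of such a single-seed method can be sketched as thresholded similarity between term-frequency vectors. The seed text, candidate texts, and the 0.3 threshold below are illustrative assumptions, not the paper's actual similarity measure:

```python
import math
from collections import Counter

def cosine_similarity(doc_a, doc_b):
    """Cosine similarity between raw term-frequency vectors of two texts."""
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

seed = "solar power plants generate renewable energy"
candidates = [
    "renewable energy from solar power is growing",
    "the recipe calls for two cups of flour",
]
# Keep only candidates sufficiently similar to the single seed article.
collection = [d for d in candidates if cosine_similarity(seed, d) > 0.3]
```

Note that this needs no second "general" corpus: the vectors are built from the two documents alone.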
Abstract: A subsea hydrocarbon production system can undergo planned and unplanned shutdowns during the life of the field. Thermal FEA is used to simulate the cool down to verify the insulation design of the subsea equipment, but it is also used to derive an acceptable insulation design for the cold spots. The driving factors of subsea analyses require fast-responding and accurate models of the equipment cool down. This paper presents a cool down analysis carried out by a Krylov subspace reduction method and compares this approach to the commonly used FEA solvers. The model considered represents a typical component of a subsea production system, a closed valve on a dead leg. The results from the Krylov reduction method exhibit the least error and require the shortest computational time to reach the solution. These findings make the Krylov model order reduction method very suitable for the above-mentioned subsea applications.
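For illustration, the Arnoldi iteration is the standard way to build an orthonormal Krylov subspace basis and project a large linear model onto it. The toy tridiagonal conduction matrix, subspace size, and explicit-Euler time step below are assumptions for the sketch, not the paper's actual FE model or solver:

```python
import numpy as np

def arnoldi(A, b, m):
    """Build an orthonormal basis V of the Krylov subspace
    span{b, Ab, ..., A^(m-1) b} and the reduced matrix H = V^T A V."""
    n = len(b)
    V = np.zeros((n, m))
    H = np.zeros((m, m))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(m - 1):
        w = A @ V[:, j]
        for i in range(j + 1):          # orthogonalize against earlier basis vectors
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    w = A @ V[:, m - 1]                  # fill the last column of the projection
    for i in range(m):
        H[i, m - 1] = V[:, i] @ w
    return V, H

# Toy 1-D conduction matrix (tridiagonal Laplacian) standing in for the thermal FE model.
n = 50
A = -2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
b = np.ones(n)                           # initial temperature deviation
m = 10
V, H = arnoldi(A, b, m)

# One reduced explicit-Euler cool-down step, lifted back to full space,
# compared against the same step on the full model.
dt = 0.01
x_red = V @ ((np.eye(m) + dt * H) @ (V.T @ b))
x_full = b + dt * (A @ b)
```

Transient responses that live (approximately) in the Krylov subspace are reproduced by the small m-by-m system, which is why the reduced model responds so much faster than a full FEA solve.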
Abstract: As digital technology develops, digital cinema is
becoming more widespread. However, content copying and attacks
against digital cinema have become a serious problem. To solve this
security problem, we propose “Additional Watermarking” for a
digital cinema delivery system. With the proposed “Additional
Watermarking” method, we protect content copyrights at the encoder
and user-side information at the decoder. It realizes the traceability
of the watermark embedded at the encoder. The watermark is embedded
into randomly selected frames using a hash function. The embedding
positions are distributed by the hash function so that third parties
cannot break the watermarking algorithm. Finally, our experimental
results show that the proposed method is much better than
conventional watermarking techniques in terms of robustness, image
quality, and its simple but unbreakable algorithm.
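The frame-selection idea can be sketched as deriving a reproducible pseudo-random set of frame indices from a keyed hash, so that only holders of the secret can locate the watermarked frames. The identifiers, key, and frame counts below are hypothetical, and this is a generic keyed-hash selection, not necessarily the paper's exact construction:

```python
import hashlib

def embedding_frames(content_id, secret_key, total_frames, count):
    """Derive a reproducible pseudo-random set of frame indices from a
    keyed SHA-256 hash; without the key, the positions look random."""
    chosen = set()
    counter = 0
    while len(chosen) < count:
        digest = hashlib.sha256(
            f"{secret_key}:{content_id}:{counter}".encode()).digest()
        chosen.add(int.from_bytes(digest[:4], "big") % total_frames)
        counter += 1
    return sorted(chosen)

# Hypothetical movie: ~24000 frames, watermark embedded in 8 of them.
frames = embedding_frames("movie-0001", "studio-secret",
                          total_frames=24000, count=8)
```

Because the selection is a deterministic function of the key, the decoder can regenerate the same index set for extraction, which is what gives the scheme its traceability.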
Abstract: The H.264/AVC standard uses intra prediction with 9
directional modes for 4x4 and 8x8 luma blocks and 4 directional
modes for 16x16 macroblocks and 8x8 chroma blocks. This means
that, for a macroblock, 736 different RDO calculations have to be
performed before the best RDO mode is determined. With this
multiple intra-mode prediction, intra coding of H.264/AVC offers a
considerably higher improvement in coding efficiency compared to
other compression standards, but computational complexity is
increased significantly. This paper presents a fast intra prediction
algorithm for H.264/AVC intra prediction based on a characteristic
of homogeneity information. In this study, the gradient prediction
method is used to predict homogeneous areas and the quadratic
prediction function is used to predict nonhomogeneous areas. Based
on the correlation between homogeneity and block size, the smaller
blocks are predicted by both gradient and quadratic prediction,
while the bigger blocks are predicted by gradient prediction only.
Experimental results show that the proposed method reduces the
complexity by up to 76.07% while maintaining similar PSNR quality
with about a 1.94% bit-rate increase on average.
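A homogeneity test of this kind can be sketched as thresholding the summed absolute horizontal and vertical gradients of a block, so that flat blocks skip the expensive mode search. The threshold value and sample blocks below are illustrative assumptions, not the paper's tuned parameters:

```python
def is_homogeneous(block, threshold=40):
    """Classify a square block as homogeneous when the summed absolute
    horizontal and vertical pixel gradients fall below a threshold
    (the threshold value here is an assumed, untuned example)."""
    g = 0
    n = len(block)
    for y in range(n):
        for x in range(n):
            if x + 1 < n:
                g += abs(block[y][x + 1] - block[y][x])
            if y + 1 < n:
                g += abs(block[y + 1][x] - block[y][x])
    return g < threshold

flat = [[128] * 4 for _ in range(4)]            # uniform luma block
edge = [[0, 0, 255, 255] for _ in range(4)]     # strong vertical edge
```

Only blocks failing the test would proceed to the costlier (e.g. quadratic) prediction path, which is where the complexity saving comes from.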
Abstract: Magnesium wastes and scraps are produced by many industrial activities all over the world, and their growing volume is becoming a problem. In this study, the use of magnesium wastes as a raw material in the production of magnesium borate hydrates is investigated. The method used in the experiments is hydrothermal synthesis. The waste magnesium to B2O3 molar ratio is set to 1:3. Four different reaction times are studied: 30, 60, 120, and 240 minutes. For the identification analyses, X-Ray Diffraction (XRD), Fourier Transform Infrared Spectroscopy (FT-IR), and Raman spectroscopy techniques are used. Magnesium borate hydrates are synthesized at all reaction times, and the most crystalline forms are obtained at a reaction time of 120 minutes. The overall yields of the production are found to be between 65% and 80%.
Abstract: Energy intensity (energy consumption intensity) is a
global index which measures the energy required to produce a
specific value of goods and services in each country. It is computed
in terms of initial energy supply or final energy consumption. In
this study, the Divisia method is used to decompose energy
consumption and energy intensity. This method decomposes
consumption and energy intensity into production, structural, and
net intensity effects, and can be applied as a time series or between
two periods. This study analytically investigates changes in
consumption and energy intensity in the economic sectors of Iran,
focusing on transportation (rail and road). Our results show that
the contribution of the structural effect (change in the composition
of economic activities) is very low, and the net energy intensity
effect makes the larger contribution to changes in consumption and
energy intensity. In other words, the high consumption of energy is
due to the intensity of energy consumption and not to the structural
effect of the transportation sector.
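One common exact variant of Divisia decomposition is the additive log-mean Divisia index (LMDI), which splits the change in total energy use into activity, structural, and intensity effects that sum exactly to the total change. The sketch below uses hypothetical two-sector, two-period numbers, not Iranian data, and LMDI is named here as the illustrated variant, not necessarily the paper's exact formulation:

```python
import math

def logmean(a, b):
    """Logarithmic mean, the LMDI weight; equals a in the limit a == b."""
    return (a - b) / (math.log(a) - math.log(b)) if a != b else a

def lmdi_decompose(sectors_0, sectors_T):
    """Additive LMDI decomposition of the change in total energy use E.
    Each dict maps sector name -> (activity Q_i, energy E_i)."""
    Q0 = sum(q for q, _ in sectors_0.values())
    QT = sum(q for q, _ in sectors_T.values())
    act = struct = inten = 0.0
    for name in sectors_0:
        q0, e0 = sectors_0[name]
        qT, eT = sectors_T[name]
        w = logmean(eT, e0)
        act    += w * math.log(QT / Q0)                # overall activity growth
        struct += w * math.log((qT / QT) / (q0 / Q0))  # sector-share change
        inten  += w * math.log((eT / qT) / (e0 / q0))  # energy-intensity change
    return act, struct, inten

# Hypothetical road/rail activity (e.g. value added) and energy use at two dates.
sectors_0 = {"road": (100.0, 50.0), "rail": (40.0, 10.0)}
sectors_T = {"road": (130.0, 70.0), "rail": (50.0, 11.0)}
act, struct_eff, inten_eff = lmdi_decompose(sectors_0, sectors_T)
delta_E = (70.0 + 11.0) - (50.0 + 10.0)   # total change in energy use
```

The decomposition is exact: the three effects reproduce the observed change in total consumption, which is what makes the relative sizes of the structural and intensity terms meaningful.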
Abstract: This paper highlights the importance of integrating social and technical approaches (a so-called “hybrid socio-technical approach”) as an innovative and strategic program to support social development in geodisaster-prone areas in Indonesia. Such a program is mainly based on public education and community participation, as a partnership program between universities and local government, possibly also with private companies and/or local NGOs. Indigenous, simple, and low-cost technology has also been introduced and developed as part of the hybrid socio-technical system, in order to ensure life and environmental protection, with respect to sustainable human and social development.
Abstract: A self-evolution algorithm for optimizing neural networks using a combination of PSO and JPSO is proposed. The algorithm optimizes both the network topology and the parameters simultaneously, with the aim of achieving the desired accuracy with less complicated networks. The performance of the proposed approach is compared with that of conventional back-propagation networks on several synthetic functions, with better results in the case of the former. The proposed algorithm is also applied to a slope stability problem to estimate the critical factor of safety. Based on the results obtained, the proposed self-evolving network produced a better estimate of the critical safety factor than the conventional BPN network.
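For readers unfamiliar with the parameter-optimization half of such a scheme, standard PSO moves a swarm of candidate solutions under attraction to each particle's personal best and the global best. The sketch below minimizes a simple sphere function rather than a network loss, with textbook constants; it is a generic PSO, not the paper's PSO/JPSO hybrid:

```python
import random

def pso(f, dim, n_particles=20, iters=200, bound=5.0, seed=1):
    """Minimal particle swarm optimization with inertia weight w and
    cognitive/social constants c1, c2 (standard textbook values)."""
    rng = random.Random(seed)
    X = [[rng.uniform(-bound, bound) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                    # personal best positions
    pbest = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest[i])
    G, gbest = P[g][:], pbest[g]             # global best
    w, c1, c2 = 0.7, 1.5, 1.5
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (G[d] - X[i][d]))
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < pbest[i]:                # update personal and global bests
                pbest[i], P[i] = fx, X[i][:]
                if fx < gbest:
                    gbest, G = fx, X[i][:]
    return G, gbest

best_x, best_f = pso(lambda x: sum(v * v for v in x), dim=3)
```

In a neuro-evolution setting, `f` would instead score a decoded network (topology plus weights) on training data.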
Abstract: The use of engineered nanoparticles has rapidly increased
in various applications in the last decade due to their unusual
properties. However, there is ever-increasing concern to understand
their toxicological effect on human health. In particular, metal and
metal oxide nanoparticles have been used in various sectors
including biomedicine, food, and agriculture, but their impact on
human health is yet to be fully understood. In the present
investigation, we assessed the toxic effect of engineered
nanoparticles (ENPs), including Ag, MgO, and Co3O4 nanoparticles
(NPs), on human mesenchymal stem cells (hMSC), adopting cell
viability and cellular morphological changes as tools. The results
suggested that silver NPs are more toxic than MgO and Co3O4 NPs.
The ENPs induced cytotoxicity and nuclear morphological changes in
hMSC in a dose-dependent manner: cell viability decreases as the
concentration of ENPs increases. The cellular morphology studies
revealed that the ENPs damaged the cells. These preliminary findings
have implications for the use of these nanoparticles in the food
industry under systematic regulations.
Abstract: The Boundary Representation of a 3D manifold contains
FACES (connected subsets of a parametric surface S : R^2 → R^3).
In many science and engineering applications it is cumbersome
and algebraically difficult to deal with the polynomial set and
constraints (LOOPs) representing the FACE. Because of this reason, a
Piecewise Linear (PL) approximation of the FACE is needed, which is
usually represented in terms of triangles (i.e. 2-simplices). Solving the
problem of FACE triangulation requires producing quality triangles
which are: (i) independent of the arguments of S, (ii) sensitive to the
local curvatures, and (iii) compliant with the boundaries of the FACE
and (iv) topologically compatible with the triangles of the neighboring
FACEs. In the existing literature there are no guarantees for the point
(iii). This article contributes to the topic of triangulations conforming
to the boundaries of the FACE by applying the concept of the
parameter-independent Gabriel complex, which improves the correctness of the
triangulation regarding aspects (iii) and (iv). In addition, the article
applies the geometric concept of tangent ball to a surface at a point to
address points (i) and (ii). Additional research is needed in algorithms
that (i) take advantage of the concepts presented in the heuristic
algorithm proposed and (ii) can be proved correct.
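The Gabriel condition mentioned above has a simple geometric test: an edge between two sample points belongs to the Gabriel complex exactly when the smallest ball having that edge as a diameter contains no other sample point. A minimal 2-D sketch (the point set is illustrative; the paper works with surface samples in R^3):

```python
def is_gabriel_edge(p, q, points):
    """Edge (p, q) is a Gabriel edge iff the ball with segment pq as its
    diameter is empty of all other sample points."""
    center = [(a + b) / 2 for a, b in zip(p, q)]
    r2 = sum((a - b) ** 2 for a, b in zip(p, q)) / 4   # (|pq| / 2)^2
    for s in points:
        if s is p or s is q:
            continue
        if sum((a - c) ** 2 for a, c in zip(s, center)) < r2:
            return False                                # a point lies inside the ball
    return True

pts = [(0.0, 0.0), (2.0, 0.0), (1.0, 0.2), (5.0, 5.0)]
```

Here the edge from (0,0) to (2,0) fails the test because (1, 0.2) lies inside its diametral ball, while the edge from (0,0) to (1, 0.2) passes; this emptiness criterion is what keeps the triangulation conforming to the sampled boundary.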
Abstract: In this paper, a two-factor scheme is proposed to
generate cryptographic keys directly from biometric data, which,
unlike passwords, are strongly bound to the user. The hash value of
the reference iris code is used as a cryptographic key, and its
length depends only on the hash function, being independent of any
other parameter. The entropy of such keys is 94 bits, which is much
higher than that of any other comparable system. The most important and distinct
feature of this scheme is that it regenerates the reference iris code by
providing a genuine iris sample and the correct user password. Since
iris codes obtained from two images of the same eye are not exactly
the same, error correcting codes (Hadamard code and Reed-Solomon
code) are used to deal with the variability. The scheme proposed here
can be used to provide keys for a cryptographic system and/or for
user authentication. The performance of this system is evaluated on
two publicly available iris biometric databases, namely the CBS and
ICE databases. The operating point of the system (the values of the False
Acceptance Rate (FAR) and False Rejection Rate (FRR)) can be set
by properly selecting the error correction capacity (ts) of the Reed-
Solomon codes, e.g., on the ICE database, at ts = 15, FAR is 0.096%
and FRR is 0.76%.
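The general pattern behind such schemes is the fuzzy-commitment construction: a random secret is encoded with an error-correcting code, XORed with the reference iris code to form public helper data, and the key is a hash of the secret; a fresh, slightly noisy iris sample recovers the secret via decoding. The sketch below substitutes a simple repetition code for the Hadamard/Reed-Solomon concatenation and uses random bits as a stand-in iris code, so it illustrates the pattern rather than the paper's actual scheme (which also mixes in a user password):

```python
import hashlib
import secrets

R = 7  # repetition factor; stands in for the Hadamard/Reed-Solomon layer

def commit(iris_code):
    """Bind a random secret to the reference iris code: key = H(secret),
    helper = encode(secret) XOR iris_code. The iris code is never stored."""
    secret = [secrets.randbelow(2) for _ in range(len(iris_code) // R)]
    codeword = [bit for bit in secret for _ in range(R)]
    helper = [c ^ i for c, i in zip(codeword, iris_code)]
    key = hashlib.sha256(bytes(secret)).hexdigest()
    return helper, key

def reveal(helper, sample_code):
    """Recover the key from a fresh, noisy sample via majority decoding."""
    noisy = [h ^ s for h, s in zip(helper, sample_code)]
    secret = [int(sum(noisy[i * R:(i + 1) * R]) > R // 2)
              for i in range(len(noisy) // R)]
    return hashlib.sha256(bytes(secret)).hexdigest()

reference = [secrets.randbelow(2) for _ in range(70)]   # stand-in iris code
helper, key = commit(reference)
sample = reference[:]
sample[3] ^= 1
sample[40] ^= 1        # two bit errors, within the repetition code's capacity
```

Raising the error-correction capacity admits noisier genuine samples (lower FRR) at the cost of accepting more impostors (higher FAR), which is exactly the FAR/FRR trade-off the abstract tunes via the Reed-Solomon parameter ts.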
Abstract: Video watermarking is usually considered as watermarking of a set of still images. In the frame-by-frame watermarking approach, each video frame is seen as a single watermarked image, so collusion attacks are more critical in video watermarking. If the same or a redundant watermark is used for embedding in every frame of a video, the watermark can be estimated and then removed by the watermark estimate remodulation (WER) attack. Also, if uncorrelated watermarks are used for every frame, these watermarks can be washed out with frame temporal filtering (FTF). The switching watermark system, or so-called SS-N system, has better performance against WER and FTF attacks. In this system, for each frame, the watermark is randomly picked from a finite pool of watermark patterns. First, the SS-N system is surveyed, and then a new collusion attack on the SS-N system is proposed, using a new algorithm for separating video frames based on watermark pattern. In this way, N sets are built, each containing frames carrying the same watermark. After that, using the WER attack on every set, the N different watermark patterns are estimated and later removed.
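The frame-separation step can be sketched as assigning each frame to whichever candidate watermark pattern it correlates with most strongly. The toy patterns and flat frames below are illustrative, and this correlation-based grouping is a generic sketch of the idea, not the paper's actual separation algorithm:

```python
def correlation(a, b):
    """Normalized (Pearson) correlation between two equal-length signals."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def group_by_watermark(frames, patterns):
    """Assign each (flattened) frame to the candidate watermark pattern it
    correlates with most, yielding the per-pattern frame sets."""
    groups = [[] for _ in patterns]
    for idx, frame in enumerate(frames):
        best = max(range(len(patterns)),
                   key=lambda i: correlation(frame, patterns[i]))
        groups[best].append(idx)
    return groups

# Two orthogonal ±1 test patterns; toy frames are flat content plus one pattern.
p0 = [1, -1, 1, -1, 1, -1, 1, -1]
p1 = [1, 1, -1, -1, 1, 1, -1, -1]
frames = [[128 + w for w in p0], [128 + w for w in p1], [128 + w for w in p0]]
groups = group_by_watermark(frames, [p0, p1])
```

Once frames are grouped this way, averaging within each group yields an estimate of that group's watermark, which is the input the per-set WER attack needs.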
Abstract: In contrast to existing methods, which do not take into account multiconnectivity in the broad sense of this term, we develop mathematical models and highly effective combined (BIEM and FDM) numerical methods for calculating the stationary and quasi-stationary temperature field of the profile part of a blade with convective cooling (from the point of view of implementation on a PC). The theoretical substantiation of these methods is provided by appropriate theorems. To this end, converging quadrature processes have been developed and error estimates in terms of A. Zygmund continuity moduli have been obtained. For the visualization of profiles, the method of least squares with automatic conjecture, spline devices, smooth replenishment, and neural nets are used. Boundary conditions of heat exchange are determined from the solution of the corresponding integral equations and empirical relationships. The reliability of the designed methods is proved by computational and experimental investigations of the heat and hydraulic characteristics of the first-stage nozzle blade of a gas turbine.
Abstract: Because of architectural conditions and structural use, the centers of mass and stiffness sometimes do not coincide, and the structure is irregular. The structure might also be asymmetric, for example due to asymmetric bracing in plan, which leads to an unbalanced distribution of stiffness, or due to an unbalanced distribution of mass. Both conditions lead to eccentricity and torsion in the structure. The inability of ordinary codes to evaluate the performance of steel structures against earthquakes has led to the use of design based on performance levels or the capacity spectrum. Using these methods, it is possible to design a structure whose behavior under different earthquakes is predictable. In this article, 5-story buildings with different percentages of asymmetry caused by stiffness changes have been designed. Static and dynamic nonlinear analyses under three acceleration records have been performed. Finally, the performance level of the structure has been evaluated.
Abstract: Electron multiplying charge coupled devices (EMCCDs) have revolutionized the world of low light imaging by introducing on-chip multiplication gain based on the impact ionization effect in silicon. They combine sub-electron readout noise with high frame rates. Signal-to-Noise Ratio (SNR) is an important performance parameter for low-light-level imaging systems. This work investigates the SNR performance of an EMCCD operated in Non-inverted Mode (NIMO) and in Inverted Mode (IMO). The theory of the noise characteristics and operation modes is presented. The results show that the SNR is determined by dark current and clock-induced charge at high gain levels. The optimum SNR performance is provided by an EMCCD operated in NIMO in short-exposure and strong-cooling applications; in the other cases, an IMO EMCCD is preferable.
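A common textbook model for EMCCD SNR applies the multiplication excess noise factor F (approaching sqrt(2) at high gain) to the shot noise of the signal, dark current, and clock-induced charge, while the read noise is divided by the gain. The formula and the example numbers below are a generic sketch under that model, not this work's measured values:

```python
import math

def emccd_snr(signal, dark, cic, read_noise, gain, F=math.sqrt(2)):
    """Textbook EMCCD SNR model (all charge quantities in electrons/pixel):
    multiplied shot noise on signal + dark current + clock-induced charge,
    scaled by the excess noise factor F, plus gain-suppressed read noise."""
    variance = F ** 2 * (signal + dark + cic) + (read_noise / gain) ** 2
    return signal / math.sqrt(variance)

# At high gain the read noise term vanishes, so dark current and CIC
# become the limiting noise sources, as the abstract states.
low_light = emccd_snr(signal=5, dark=0.5, cic=0.1, read_noise=10, gain=1000)
no_spurious = emccd_snr(signal=5, dark=0.0, cic=0.0, read_noise=10, gain=1000)
```

Comparing the two calls shows how removing dark current and CIC (strong cooling, favorable clocking mode) directly improves the high-gain SNR.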
Abstract: Morgan's refinement calculus (MRC) is one of the
well-known methods allowing the formality presented in the program
specification to be continued all the way to code. On the other hand,
Object-Z (OZ) is an extension of Z adding support for classes and
objects. There are a number of methods for obtaining code from OZ
specifications that can be categorized into refinement and animation
methods. As far as we know, only one refinement method exists
which refines OZ specifications into code. However, this method
does not have fine-grained refinement rules and thus cannot be
automated. On the other hand, existing animation methods do not
present mapping rules formally and do not support the mapping of
several important constructs of OZ, such as all cases of operation
expressions and most constructs in the global paragraph. In this paper,
with the aim of providing an automatic path from OZ specifications
to code, we propose an approach to map OZ specifications into their
counterparts in MRC in order to use fine-grained refinement rules of
MRC. In this way, having counterparts of our specifications in MRC,
we can refine them into code automatically using MRC tools such as
RED. Other advantages of our work pertain to proposing mapping
rules formally, supporting the mapping of all important constructs of
Object-Z, and considering dynamic instantiation of objects while OZ
itself does not cover this facility.