Abstract: A power transformer consists of components that are under constant thermal and electrical stresses. The major component that degrades under these stresses is the transformer's paper insulation. On site, lightning impulses and cable faults may cause winding deformation; in addition, the winding may deform due to impact during transportation. A deformed winding exerts more stress on its insulating paper and thus degrades it, and insulation degradation shortens the life-span of the transformer. Currently there are two methods of detecting winding deformation: Sweep Frequency Response Analysis (SFRA) and the Low Voltage Impulse (LVI) test. The latter injects current pulses into the winding and captures the admittance plot. In this paper, a transformer that had experienced overheating and arcing was identified, and both SFRA and LVI were performed. The transformer was then brought to the factory for untanking. The untanking results revealed that LVI is more accurate than SFRA for this case study.
Abstract: This paper proposes a method, combining color and layout features, for identifying documents captured by low-resolution handheld devices. On one hand, the document image color density surface is estimated and represented by an equivalent ellipse; on the other hand, the document's shallow layout structure is computed and represented hierarchically. The combined color and layout features are arranged in a symbolic file, which is unique for each document and is called the document's visual signature. Our identification method first uses the color information in the signatures to narrow the search space to documents with a similar color distribution, and then selects the document with the most similar layout structure from the remaining search space. Our experiments consider slide documents, which are often captured using handheld devices.
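As an illustration of the equivalent-ellipse idea above, one common construction derives the ellipse from the first and second moments of the color distribution; the sketch below assumes this moment-based construction (the function name and the 2-sigma axis scaling are illustrative, not necessarily the authors' exact method):

```python
import numpy as np

def equivalent_ellipse(points):
    """Fit an equivalent ellipse to a 2-D density via first/second moments.

    points: (N, 2) array, e.g. per-pixel chromaticity coordinates.
    Returns (center, semi_axes, angle_rad).
    """
    pts = np.asarray(points, dtype=float)
    center = pts.mean(axis=0)
    cov = np.cov(pts, rowvar=False)        # 2x2 second-moment matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    # Semi-axes of the ellipse with matching second moments (2*sqrt(lambda)).
    semi_axes = 2.0 * np.sqrt(eigvals[::-1])
    major = eigvecs[:, 1]                   # direction of largest variance
    angle = np.arctan2(major[1], major[0])
    return center, semi_axes, angle
```

Two such signatures can then be compared by ellipse center, axis lengths, and orientation before falling back to the layout comparison.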
Abstract: MinC plays an important role in the bacterial cell division system by inhibiting FtsZ assembly; however, the molecular mechanism of this action is poorly understood. The E. coli MinC N-terminal domain was purified and crystallized using 1.4 M sodium citrate, pH 6.5, as a precipitant. X-ray diffraction data were collected from a native crystal and processed to 2.3 Å. The crystal belonged to space group P212121, with unit cell parameters a = 52.7, b = 54.0, c = 64.7 Å. Assuming the presence of two molecules in the asymmetric unit, the Matthews coefficient is 1.94 ų Da⁻¹, which corresponds to a solvent content of 36.5%. The overall structure of MinCN is a dimer formed through an anti-parallel β-strand interaction.
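The reported crystallographic numbers can be cross-checked with the standard Matthews relation V_M = V_cell / (Z · M_w) and the common solvent-fraction approximation 1 − 1.23/V_M; the sketch below assumes an illustrative molecular weight of roughly 11.9 kDa for the domain, a value not stated in the abstract:

```python
def matthews_coefficient(a, b, c, n_mol_asym, n_asym_units, mw_da):
    """Matthews coefficient V_M (A^3/Da) for an orthorhombic cell."""
    v_cell = a * b * c             # orthorhombic: alpha = beta = gamma = 90 deg
    z = n_mol_asym * n_asym_units  # molecules per unit cell
    return v_cell / (z * mw_da)

def solvent_fraction(vm):
    """Approximate solvent content from V_M (Matthews' empirical relation)."""
    return 1.0 - 1.23 / vm

# P2_1 2_1 2_1 has 4 asymmetric units per cell; 2 molecules per AU as above.
# The ~11.9 kDa molecular weight is an assumed value for illustration only.
vm = matthews_coefficient(52.7, 54.0, 64.7, n_mol_asym=2, n_asym_units=4,
                          mw_da=11870.0)
```

With these inputs, V_M comes out near the 1.94 ų Da⁻¹ quoted above, and the solvent fraction near 36-37%.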
Abstract: The importance of research into the understanding of happiness stems from the fundamental changes in people's value systems experienced across the post-Soviet countries. The "time of changes", characterized by the destruction of old values without the creation of new ones, leads individuals to experience an existential vacuum. This research is relevant not only to the formation of meaning, but also to the need to adapt creatively within an integrative space. Following numerous works [1,2,3], we define happiness as a peak experience connected with the satisfaction of a correlated system of needs and dependent on the subject's style of coping behavior.
Abstract: The development of the Web has affected different aspects of our lives, such as communication, knowledge sharing, job searching, and social activities. A web portal, as a gateway to the World Wide Web, is a starting point for people connecting to the Internet. As a type of knowledge management system, a web portal provides a rich space to share and search for information, as well as communication services such as free email and content provision for its users. This research aims to discover what universities need from a web portal as a necessary tool that helps students obtain the information they require. A survey was conducted to gather students' requirements, which can then be incorporated into the portal to be developed.
Abstract: Digital watermarking has become an important technique for copyright protection, but its robustness against attacks remains a major problem. In this paper, we propose a normalization-based robust image watermarking scheme. In the proposed scheme, the original host image is first normalized to a standard form. The Zernike transform is then applied to the normalized image to calculate Zernike moments, and dither modulation is adopted to quantize the magnitudes of the Zernike moments according to the watermark bit stream. The watermark extraction method is blind. Security analysis and false-alarm analysis are then performed. The quality degradation of the watermarked image caused by the embedded watermark is visually imperceptible. Experimental results show that the proposed scheme is highly robust against various image processing operations and geometric attacks.
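Dither modulation of moment magnitudes, as mentioned above, can be sketched as binary quantization-index modulation: each magnitude is snapped to one of two interleaved lattices depending on the watermark bit. The quantization step below is an arbitrary illustrative value, not the paper's parameter:

```python
import numpy as np

def dm_embed(mags, bits, step=2.0):
    """Embed bits into magnitudes via binary dither modulation (QIM).

    Each magnitude is quantized on a lattice shifted by a bit-dependent
    dither: d = 0 for bit 0, step/2 for bit 1.
    """
    mags = np.asarray(mags, dtype=float)
    dither = np.where(np.asarray(bits) == 1, step / 2.0, 0.0)
    return np.round((mags - dither) / step) * step + dither

def dm_extract(mags, step=2.0):
    """Blind extraction: for each magnitude, pick the bit whose shifted
    lattice lies closest (no original image needed)."""
    mags = np.asarray(mags, dtype=float)
    bits = []
    for m in mags:
        d0 = abs(m - np.round(m / step) * step)
        d1 = abs(m - (np.round((m - step / 2) / step) * step + step / 2))
        bits.append(0 if d0 <= d1 else 1)
    return bits
```

Because extraction only compares distances to the two lattices, the detector is blind, matching the property claimed above.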
Abstract: In this article we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of cryptographic algorithms are of great interest. Besides some basic lemmata, we explore an implementation of a complex function used in cryptography; more precisely, we describe formal properties of this implementation that we prove by computer. We describe formalized probability distributions (σ-algebras, probability spaces, and conditional probabilities), given in the formal language of the proof system Isabelle/HOL, and we also give a computer proof of Bayes' formula. We then describe an application of the presented formalized probability distributions to cryptography. Furthermore, this article shows that computer proofs of complex cryptographic functions are possible, by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards the computer verification of cryptographic primitives: they provide a basis for computer verification in cryptography, which can be applied to further problems in cryptographic research once the corresponding basic mathematical knowledge is available in a database.
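For reference, the Miller-Rabin primality test mentioned above is short enough to sketch in its standard, unverified form; this Python version is illustrative only and is not the formally verified Isabelle/HOL implementation:

```python
import random

def miller_rabin(n, rounds=20):
    """Probabilistic primality test. Returns True for primes; a composite
    passes all rounds with probability at most 4**(-rounds)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):   # quick trial division by small primes
        if n % p == 0:
            return n == p
    # Write n - 1 = 2^s * d with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False             # a is a witness that n is composite
    return True
```

The verified version proves exactly this kind of error bound as a formal theorem rather than relying on testing.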
Abstract: In this study, the locations and areas of commercial accumulations were detected using digital yellow-page data. An original buffering method that can accurately create polygons of commercial accumulations is proposed; using this method, the distribution of commercial accumulations can be easily mapped and monitored over a wide area. The locations, areas, and time-series changes of commercial accumulations in the South Kanto region can be monitored by integrating these polygons with the time-series yellow-page data. The circumstances of commercial accumulations were shown to vary by area, that is, among highly urbanized regions such as the city center of Tokyo and the prefectural capitals, suburban areas near large cities, and suburban and rural areas.
Abstract: Heat pipes are used to manage thermal problems in electronic cooling. It is especially difficult to dissipate heat to a heat sink in the space environment compared with on Earth. To address this problem, the Poiseuille (Po) number, a main measure of heat pipe performance, is studied by CFD in this work; the heat pipe performance is then verified against experimental results. A heat pipe is fabricated for the space environment, and an in-house code is developed. Further, a heat pipe subsystem, which consists of a heat pipe, MLI (Multi-Layer Insulation), SSM (Second Surface Mirror), and a radiator, is tested and correlated with the TMM (Thermal Mathematical Model) using a commercial code. The correlation results satisfy the 3 K requirement, and the resulting thermal model is verified for application in a space environment.
Abstract: A combination of image fusion and quadtree decomposition is used to detect the sunspot trajectories in each month and to compute the latitudes of these trajectories in each solar hemisphere. Daily solar images taken by the SOHO satellite are fused for each month, and the fused image is decomposed with the quadtree decomposition method in order to classify the sunspot trajectories and obtain precise information about their latitudes. The fusion also allows us to draw some remarkable physical conclusions about the behavior of the Sun's magnetic fields. Using quadtree decomposition, we obtain information about the regions on the solar surface, and the solid angles through which tremendous flares and hot plasma gases permeate interplanetary space and threaten satellites and human technical systems. Here, sunspot images from June, July, and August 2001 are used, and a method is given to compute the latitude of sunspot trajectories in each month from sunspot images.
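Quadtree decomposition, as used above, can be sketched as a recursive split of the image into quadrants until each block is homogeneous. The homogeneity criterion below (intensity range under a threshold) is an assumption for illustration, not necessarily the authors' exact criterion:

```python
import numpy as np

def quadtree(img, x=0, y=0, size=None, thresh=0.1, min_size=2):
    """Recursively split a square image into quadrants until the intensity
    range within a block falls below `thresh`.

    Returns the homogeneous leaf blocks as (x, y, size) tuples.
    Assumes a square image whose side is a power of two.
    """
    if size is None:
        size = img.shape[0]
    block = img[y:y + size, x:x + size]
    if size <= min_size or block.max() - block.min() <= thresh:
        return [(x, y, size)]
    h = size // 2
    leaves = []
    for dy in (0, h):
        for dx in (0, h):
            leaves += quadtree(img, x + dx, y + dy, h, thresh, min_size)
    return leaves
```

Regions containing sunspots then show up as clusters of small leaf blocks, while quiet regions remain as large blocks.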
Abstract: This paper presents an averaged model of a buck converter derived from the generalized state-space averaging method. Sliding-mode control is used to regulate the output voltage of the converter and is taken into account in the model. The proposed model requires much less computation time than the full topology model. Intensive time-domain simulations of the exact topology model are used as the benchmark. The results show that good agreement between the proposed model and the switching model is achieved in both transient and steady-state responses. The reported model is suitable for optimal controller design using artificial intelligence techniques.
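For reference, state-space averaging of an ideal buck converter typically yields equations of the following form (a sketch with duty cycle d, inductor current i_L, and output voltage v_o across load R; the paper's sliding-mode control law and generalized harmonic terms are not reproduced here):

```latex
\begin{aligned}
L\,\frac{d\langle i_L\rangle}{dt} &= d\,V_{\mathrm{in}} - \langle v_o\rangle,\\
C\,\frac{d\langle v_o\rangle}{dt} &= \langle i_L\rangle - \frac{\langle v_o\rangle}{R}.
\end{aligned}
```

Replacing the switched input with its duty-cycle average is what removes the fast switching dynamics and makes the model cheap to simulate.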
Abstract: With the emergence of new applications such as robot control in image processing, artificial vision for visual servoing is a rapidly growing discipline, and human-machine interaction plays a significant role in controlling robots. This paper presents a new algorithm, based on spatio-temporal volumes, for visual servoing aimed at robot control. In this algorithm, after the necessary pre-processing of the video frames, a spatio-temporal volume is constructed for each gesture and a feature vector is extracted. These volumes are then analyzed for matching in two consecutive stages. For hand gesture recognition and classification, we tested different classifiers, including k-nearest neighbor, learning vector quantization, and back-propagation neural networks. We tested the proposed algorithm on the collected data set, and the results showed a correct gesture recognition rate of 99.58 percent. We also tested the algorithm on noisy images, where it achieved a correct recognition rate of 97.92 percent.
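Of the classifiers tested above, k-nearest neighbor is the simplest to sketch; the minimal version below (Euclidean distance, majority vote) is illustrative only and makes no claim about the actual feature vectors or distance metric used:

```python
import numpy as np

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest neighbours
    in Euclidean distance, as in a basic k-NN gesture classifier."""
    train_X = np.asarray(train_X, dtype=float)
    dists = np.linalg.norm(train_X - np.asarray(query, dtype=float), axis=1)
    nearest = np.argsort(dists)[:k]          # indices of k closest samples
    labels, counts = np.unique(np.asarray(train_y)[nearest],
                               return_counts=True)
    return labels[np.argmax(counts)]         # most common label wins
```

In a gesture pipeline, `train_X` would hold the feature vectors extracted from the spatio-temporal volumes and `train_y` their gesture labels.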
Abstract: In this work, we present, to the best of our knowledge for the first time, an efficient digital watermarking scheme for MPEG audio layer 3 files that operates directly in the compressed data domain while manipulating the time and subband/channel domains. In addition, it does not need the original signal to detect the watermark. Our scheme was implemented with special care for the efficient use of the two limited resources of computer systems: time and space. It offers the industrial user watermark embedding and detection in a time directly comparable to the real playing time of the original audio file (which depends on the MPEG compression), while the end user/audience does not perceive any artifacts or delays when hearing the watermarked audio file. Furthermore, it overcomes the vulnerability of algorithms operating in the PCM data domain to compression/recompression attacks, as it places the watermark in the scale-factor domain and not in the digitized audio data. The strength of our scheme, which allows it to be used successfully for both authentication and copyright protection, lies in the fact that users establish their ownership of the audio file not simply by detecting the bit pattern that comprises the watermark itself, but by showing that the legal owner knows a hard-to-compute property of the watermark.
Abstract: This paper presents an efficient method of obtaining straight-line motion in the tool configuration space between two specified points using an articulated robot. The simulation and implementation results show the effectiveness of the method.
Abstract: Statistical process control (SPC) is one of the most powerful tools developed to assist in the effective control of quality; it involves collecting, organizing, and interpreting data during production. This article aims to show how industries can use SPC to control and continuously improve product quality by monitoring production, detecting deviations in the parameters representing the process, and thereby reducing the amount of off-specification product and hence production costs. The study conducted a technological forecast in order to characterize the research being done on SPC. The survey was carried out in the Espacenet and WIPO databases and in that of the National Institute of Industrial Property (INPI). The United States accounts for the largest share of depositors, together with deposits filed via the PCT, and the most abundant classification section was F.
Abstract: More recent satellite projects and programs make extensive use of real-time embedded systems. 16-bit processors that meet the MIL-STD-1750 standard architecture have been used in on-board systems, and most space applications have been written in Ada. From a futuristic point of view, 32-bit/64-bit processors are needed in the area of spacecraft computing, and therefore an effort is desirable in the study and survey of 64-bit architectures for space applications. This will also result in significant technology development in terms of VLSI and software tools for Ada (as the legacy code is in Ada).
There are several basic requirements for a special processor for this purpose. They include radiation-hardened (RadHard) devices, very low power dissipation, compatibility with existing operational systems, scalable architectures for higher computational needs, reliability, higher memory and I/O bandwidth, predictability, a real-time operating system, and the manufacturability of such processors. Further considerations include the selection of FPGA devices, the selection of EDA tool chains, design flow, partitioning of the design, pin count, performance evaluation, timing analysis, etc.
This project comprises a brief study of the 32- and 64-bit processors readily available on the market, and the design and fabrication of a 64-bit RISC processor, named RISC MicroProcessor, with the added functionality of an extended double-precision floating-point unit and a 32-bit signal processing unit acting as co-processors. In this paper, we emphasize the ease and importance of using an open core (the OpenSparc T1 Verilog RTL) and open-source EDA tools such as Icarus to develop FPGA-based prototypes quickly. Commercial tools such as Xilinx ISE for synthesis are also used when appropriate.
Abstract: In this paper we address the balancing problem for transfer lines, seeking the optimal line balance that minimizes non-productive time. We focus on the tool change time and the face orientation change time, both of which influence the makespan. We consider machine capacity limitations and the technological constraints associated with the manufacturing process of automotive cylinder heads. The problem is represented by a mixed integer programming model that distributes the design features to workstations and sequences the machining processes at minimum non-productive time. The proposed model is solved by an algorithm built on linearization schemes and the Benders decomposition approach. The experiments show the efficiency of the algorithm in reaching the exact solution of small and medium problem instances in reasonable time.
Abstract: Clustering algorithms help to uncover the hidden information present in datasets. A dataset may contain intrinsic and nested clusters, the detection of which is of utmost importance. This paper presents a distributed grid-based density clustering algorithm capable of identifying arbitrarily shaped embedded clusters, as well as multi-density clusters, over large spatial datasets. For handling massive datasets, we implemented our method on a 'shared-nothing' architecture in which multiple computers are interconnected over a network. Experimental results are reported that establish the superiority of the technique in terms of scale-up, speedup, and cluster quality.
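A toy version of grid-based density clustering can be sketched as binning points into cells, keeping cells with enough points as dense, and merging adjacent dense cells by flood fill. The distributed, multi-density aspects of the proposed algorithm are not reproduced in this single-machine sketch:

```python
import numpy as np
from collections import deque

def grid_density_cluster(points, cell=1.0, min_pts=3):
    """Toy grid-based density clustering: bin 2-D points into grid cells,
    keep cells with >= min_pts points as 'dense', and merge dense cells
    that are 4-neighbours into clusters via BFS flood fill."""
    cells = {}
    for p in points:
        key = (int(np.floor(p[0] / cell)), int(np.floor(p[1] / cell)))
        cells.setdefault(key, []).append(p)
    dense = {k for k, v in cells.items() if len(v) >= min_pts}
    clusters, seen = [], set()
    for start in dense:
        if start in seen:
            continue
        comp, queue = [], deque([start])
        seen.add(start)
        while queue:
            cx, cy = queue.popleft()
            comp.append((cx, cy))
            for nb in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                if nb in dense and nb not in seen:
                    seen.add(nb)
                    queue.append(nb)
        clusters.append(comp)   # one connected component of dense cells
    return clusters
```

Because only per-cell counts and cell adjacency are needed, the binning step parallelizes naturally across the nodes of a shared-nothing cluster.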
Abstract: Mean velocity measurements have been taken in the wake of a single circular cylinder, for three different diameters at two different velocities. The effects of changes in diameter and velocity are studied in a self-similar coordinate system. The spatial variations of the velocity defect and of the half-width have been investigated, and the results are compared with those published by H. Schlichting. In the normalized coordinates, it is observed that all cases except the first station are self-similar. Examining the self-similar profiles of mean velocity, it is further observed that, for all cases, the curves at each station tend to zero at the same point.
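For reference, Schlichting's classical far-wake result for a plane (cylinder) wake, against which such measurements are typically compared, takes the following form (a sketch; C_D is the drag coefficient, d the cylinder diameter, b the half-width, and u_1 the velocity defect):

```latex
b \;\propto\; \sqrt{x\,C_D\,d}, \qquad
u_{1,\max} \;\propto\; U_\infty \sqrt{\frac{C_D\,d}{x}}, \qquad
\frac{u_1(y)}{u_{1,\max}} \;=\; \bigl(1-\eta^{3/2}\bigr)^{2}, \quad \eta=\frac{y}{b}.
```

The half-width growing as x^{1/2} while the centerline defect decays as x^{-1/2} is what makes the normalized profiles collapse onto a single curve beyond the first station.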
Abstract: In this paper, a decision-aid method for pre-optimization is presented. The method is called "negotiation", and it is based on the identification, formulation, modeling, and use of indicators defined as "negotiation indicators". These negotiation indicators are used to explore the solution space by means of a class-based approach. The classes are subdomains of the negotiation indicators' domain; they represent cognitively equivalent solutions in terms of the negotiation indicators being used. With this method, we reduce the size of the solution space and the number of criteria, thus aiding the optimization methods. An example is presented to illustrate the method.