Abstract: In this work, a special case of the image super-resolution problem, in which the only motion is global translational motion and the blurs are shift-invariant, is investigated. The necessary conditions for exact reconstruction of the original image using finite impulse response (FIR) reconstruction filters are developed. Given that these conditions are satisfied, a method for exact super-resolution is presented and some simulation results are shown.
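The exact-reconstruction setting can be illustrated in its simplest form. The sketch below is an illustrative assumption, not the paper's general FIR-filter method: it shows the degenerate 1-D case with no blur and integer sub-pixel shifts, where exact reconstruction reduces to interleaving low-res frames whose shifts cover every sub-sample offset.

```python
# Minimal sketch (assumptions: 1-D signal, integer sub-pixel shifts, no blur):
# with decimation factor M and M low-res frames whose global shifts cover all
# sub-sample offsets, the high-res signal is recovered exactly by interleaving.

def decimate(signal, factor, shift):
    """Low-res frame: keep every `factor`-th sample starting at offset `shift`."""
    return signal[shift::factor]

def reconstruct(frames, shifts, factor):
    """Interleave the low-res frames back onto the high-res grid."""
    n = sum(len(f) for f in frames)
    hi = [0.0] * n
    for frame, shift in zip(frames, shifts):
        for k, v in enumerate(frame):
            hi[shift + k * factor] = v
    return hi

signal = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]
factor = 2
shifts = [0, 1]                      # shifts cover every sub-sample offset
frames = [decimate(signal, factor, s) for s in shifts]
assert reconstruct(frames, shifts, factor) == signal
```

With blur present, the same idea requires the FIR reconstruction filters whose existence conditions the paper develops.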
Abstract: The presented work is motivated by a French law regarding nuclear waste management. A new conceptual Accelerator Driven System (ADS) designed for Minor Actinide (MA) transmutation has been assessed by numerical simulation. The MUltiple Spallation Target (MUST) ADS combines high thermal power (up to 1.4 GWth) with high specific power. A 30 mA, 1 GeV proton beam is divided into three secondary beams delivered to three liquid lead-bismuth spallation targets. Neutronic and thermal-hydraulic simulations have been performed with the code MURE, based on the Monte Carlo transport code MCNPX. A methodology has been developed to define the characteristics of the MUST ADS concept according to a specific transmutation scenario. The reference scenario is based on a MA flux (neptunium, americium and curium) coming from European Pressurized Reactors (EPR), and a plutonium multi-reprocessing strategy is accounted for. The MUST ADS reference concept is a sodium-cooled fast reactor. The MA fuel at equilibrium is mixed with an MgO inert matrix to limit the core reactivity and improve the fuel thermal conductivity. The fuel is irradiated over five years; five years of cooling and two years for fuel fabrication are also taken into account. The MUST ADS reference concept burns about 50% of the initial MA inventory during a complete cycle. In terms of mass, up to 570 kg/year are transmuted in one unit. The methodology used to design the MUST ADS and to calculate the fuel composition at equilibrium is described in detail in the paper. A detailed fuel evolution analysis is performed, and the reference scenario is compared to a scenario where only americium transmutation is performed.
Abstract: With the surge of stream processing applications, novel techniques are required for the generation and analysis of association rules in streams. Traditional rule mining solutions cannot handle streams because they generally require multiple passes over the data and do not guarantee results within a predictable, small time. Though researchers have proposed algorithms for generating rules from streams, there has not been much focus on their analysis. We propose association rule profiling, a user-centric process for analyzing association rules and attaching suitable profiles to them depending on their changing frequency behavior over a recent snapshot of time in a data stream. Association rule profiles provide insights into the changing nature of associations and can be used to characterize them. We discuss the importance of characteristics such as the predictability of linkages present in the data and propose a metric to quantify it. We also show how association rule profiles can aid in the generation of user-specific, more understandable and actionable rules. The framework is implemented as SUPAR: System for User-centric Profiling of Association Rules in streaming data. The proposed system offers the following capabilities:
i) Continuous monitoring of the frequency of streaming item-sets and detection of significant changes therein for association rule profiling.
ii) Computation of metrics for quantifying the predictability of associations present in the data.
iii) User-centric control of the characterization process: the user can control the framework through a) constraint specification and b) non-interesting rule elimination.
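Capability (i) can be sketched as follows. The function names, the window representation, and the 50% relative-change threshold are illustrative assumptions, not the SUPAR implementation:

```python
# Hypothetical sketch: monitor an item-set's frequency per window and attach
# a simple profile label based on its change between consecutive windows.

def window_frequency(transactions, itemset):
    """Fraction of transactions in the window containing the item-set."""
    hits = sum(1 for t in transactions if set(itemset) <= set(t))
    return hits / len(transactions)

def profile_change(freq_old, freq_new, threshold=0.5):
    """Label the item-set by its relative frequency change across windows."""
    if freq_old == 0:
        return "emerging" if freq_new > 0 else "absent"
    change = (freq_new - freq_old) / freq_old
    if change > threshold:
        return "rising"
    if change < -threshold:
        return "fading"
    return "stable"

old_win = [["a", "b"], ["a", "b", "c"], ["c"], ["a", "b"]]
new_win = [["a", "b"], ["c"], ["c"], ["c"]]
f_old = window_frequency(old_win, ["a", "b"])   # 0.75
f_new = window_frequency(new_win, ["a", "b"])   # 0.25
print(profile_change(f_old, f_new))             # fading
```

A real stream implementation would maintain these counts incrementally rather than rescanning windows.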
Abstract: The design and implementation of a novel B-ACOSD CFAR algorithm are presented in this paper. It is proposed for detecting radar targets in a log-normally distributed clutter environment. The B-ACOSD detector is capable of automatically detecting the number of interfering targets in the reference cells and then detecting the real target with an adaptive threshold. The detector is implemented as a System on Chip on an Altera Stratix II FPGA using parallelism and pipelining techniques. For a reference window of 16 cells, the experimental results showed that the processor works properly at a processing speed of up to 115.13 MHz with a processing time of 0.29 µs, thus meeting the real-time requirements of a typical radar system.
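The B-ACOSD detector itself is specialized for log-normal clutter and interference censoring; as a hedged illustration of the basic CFAR idea it builds on, here is a plain cell-averaging (CA) CFAR, where the threshold for the cell under test adapts to the mean of the surrounding reference cells (guard cells excluded). Window sizes and the scaling factor are illustrative:

```python
# Basic CA-CFAR sketch: detect cells exceeding alpha * local noise estimate.

def ca_cfar(samples, num_ref=8, num_guard=2, alpha=4.0):
    """Return indices of cells whose value exceeds alpha * mean of the
    reference cells on either side (guard cells next to the test cell
    are excluded from the noise estimate)."""
    detections = []
    half = num_ref // 2 + num_guard
    for i in range(half, len(samples) - half):
        lead = samples[i - half : i - num_guard]          # left reference cells
        lag = samples[i + num_guard + 1 : i + half + 1]   # right reference cells
        noise = (sum(lead) + sum(lag)) / (len(lead) + len(lag))
        if samples[i] > alpha * noise:
            detections.append(i)
    return detections

# Flat noise floor of 1.0 with a strong target at index 10.
signal = [1.0] * 20
signal[10] = 30.0
print(ca_cfar(signal))  # [10]
```

The "automatic censoring" aspect of B-ACOSD would additionally remove cells identified as interfering targets from `lead`/`lag` before averaging.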
Abstract: In this paper, an automatic QRS-complex detection algorithm was applied to the analysis of ECG recordings, and five criteria for diagnosing dangerous arrhythmias were applied in a prototype automatic arrhythmia diagnosis system. The detection algorithm applied in this paper located the distribution of QRS complexes in the ECG recordings and derived related information, such as heart rate and RR interval. In this investigation, twenty sampled ECG recordings of patients with different pathologic conditions were collected for off-line analysis. A combined application of four digital filters for improving ECG signal quality and raising the QRS detection rate was proposed as pre-processing. Both hardware filters and digital filters were applied to eliminate the different types of noise mixed with the ECG recordings. Then, the automatic QRS detection algorithm was applied to verify the distribution of QRS complexes. Finally, the quantitative clinical criteria for diagnosing arrhythmia were programmed in a practical application for automatic arrhythmia diagnosis as a post-processor. The diagnoses produced by the automatic dangerous-arrhythmia diagnosis were compared with off-line diagnoses by experienced clinical physicians. The comparison showed that the automatic dangerous-arrhythmia diagnosis achieved a matching rate of 95% against an experienced physician's diagnoses.
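The post-detection measurements mentioned above can be sketched as follows. The fixed-threshold peak picker is an illustrative stand-in for the paper's filtered detection algorithm, and the sampling rate is an assumption:

```python
# Hedged sketch: from QRS sample indices, derive RR intervals and heart rate.

def detect_qrs(ecg, threshold):
    """Indices of local maxima above threshold (crude QRS picker)."""
    peaks = []
    for i in range(1, len(ecg) - 1):
        if ecg[i] > threshold and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]:
            peaks.append(i)
    return peaks

def rr_intervals(peaks, fs):
    """RR intervals in seconds from QRS sample indices at sampling rate fs."""
    return [(b - a) / fs for a, b in zip(peaks, peaks[1:])]

def heart_rate(rr):
    """Mean heart rate in beats per minute."""
    return 60.0 / (sum(rr) / len(rr))

fs = 250                                   # assumed sampling rate (Hz)
ecg = [0.0] * 1000
for beat in (100, 350, 600, 850):          # synthetic R peaks, 1 s apart
    ecg[beat] = 1.0
peaks = detect_qrs(ecg, threshold=0.5)
rr = rr_intervals(peaks, fs)
print(peaks)           # [100, 350, 600, 850]
print(heart_rate(rr))  # 60.0
```

RR intervals and heart rate are exactly the quantities the diagnostic criteria operate on in the post-processing stage.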
Abstract: Adapting wireless devices to communicate within grid networks opens up a wide range of possibilities. These devices create a mechanism for consumers and publishers to form modern networks with or without peer-device utilization. Emerging mobile networks create new challenges in the areas of reliability, security, and adaptability. In this paper, we propose a system encompassing mobility management using AAA context transfer for mobile grid networks. This system ultimately results in seamless task processing and reduced packet loss, communication delays, bandwidth consumption, and errors.
Abstract: We introduce an extended resource leveling model that abstracts real-life projects by considering specific work ranges for each resource. Contrary to traditional resource leveling problems, this model considers scarce resources and multiple objectives: the minimization of the project makespan and the leveling of each resource's usage over time. We formulate this model as a multiobjective optimization problem and propose a multiobjective genetic algorithm-based solver to optimize it. This solver consists of a two-stage process: a main stage where we obtain non-dominated solutions for all the objectives, and a post-processing stage where we seek to specifically improve the resource leveling of these solutions. We propose an intelligent encoding for the solver that allows domain-specific knowledge to be included in the solving mechanism. The chosen encoding proves effective for solving leveling problems with scarce resources and multiple objectives. The outcomes of the proposed solver represent optimized trade-offs (alternatives) that can later be evaluated by a decision maker; this multi-solution approach is an advantage over the traditional single-solution approach. We compare the proposed solver with state-of-the-art resource leveling methods and report competitive results.
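The two objectives being traded off can be made concrete. The task data and the variance-based leveling measure below are illustrative assumptions, not the paper's model, which additionally handles work ranges and scarce-resource constraints:

```python
# Hedged sketch: for a schedule of tasks (start, duration, resource demand),
# compute the makespan and a leveling measure (variance of resource usage).

def usage_profile(tasks, horizon):
    """Resource units in use at each time step."""
    usage = [0] * horizon
    for start, duration, demand in tasks:
        for t in range(start, start + duration):
            usage[t] += demand
    return usage

def makespan(tasks):
    return max(start + duration for start, duration, _ in tasks)

def leveling(tasks):
    """Variance of the usage profile: lower means a flatter, better-leveled plan."""
    profile = usage_profile(tasks, makespan(tasks))
    mean = sum(profile) / len(profile)
    return sum((u - mean) ** 2 for u in profile) / len(profile)

stacked = [(0, 2, 3), (0, 2, 3), (2, 2, 0)]   # work bunched up, then idle filler
spread = [(0, 2, 3), (2, 2, 3)]               # same work spread evenly
print(makespan(stacked), leveling(stacked))   # 4 9.0
print(makespan(spread), leveling(spread))     # 4 0.0
```

The genetic solver searches for schedules that are non-dominated across the makespan and the leveling value of each resource simultaneously.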
Abstract: This paper proposes and implements a core transform architecture for one of the major processes in the HEVC video compression standard. The proposed core transform architecture is implemented with only adders and shifters instead of area-consuming multipliers. The shifters in the proposed architecture are realized as wires and multiplexers, which significantly reduces chip area. The architecture can also process 4×4 up to 16×16 blocks with common hardware by reusing processing elements. Synthesized in 0.13 µm technology, the designed core transform architecture can process a 16×16 block with a 2-D transform in 130 cycles, and its gate count is 101,015 gates.
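The shift-and-add principle can be sketched with the HEVC 4-point core transform, whose coefficients (64, 83, 36) each decompose into a few shifts and additions, so no multiplier is needed. This is a software model of the principle only; the paper's architecture extends it to 16×16 with hardware reuse, and the standard's scaling/rounding stages are omitted here:

```python
# Multiplierless constant multiplications for the HEVC 4-point transform.
def m64(x): return x << 6                               # 64*x
def m83(x): return (x << 6) + (x << 4) + (x << 1) + x   # (64+16+2+1)*x = 83*x
def m36(x): return (x << 5) + (x << 2)                  # (32+4)*x = 36*x

def hevc_fwd4(x):
    """1-D 4-point HEVC core transform using only shifts and adds."""
    e0, e1 = x[0] + x[3], x[1] + x[2]          # even butterfly
    o0, o1 = x[0] - x[3], x[1] - x[2]          # odd butterfly
    return [m64(e0) + m64(e1),
            m83(o0) + m36(o1),
            m64(e0) - m64(e1),
            m36(o0) - m83(o1)]

# Cross-check against the plain matrix form of the same transform.
T = [[64, 64, 64, 64],
     [83, 36, -36, -83],
     [64, -64, -64, 64],
     [36, -83, 83, -36]]
x = [1, 2, 3, 4]
ref = [sum(T[r][c] * x[c] for c in range(4)) for r in range(4)]
assert hevc_fwd4(x) == ref
```

In hardware, constant shifts cost nothing but wiring, which is why realizing them as wires and multiplexers saves so much area.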
Abstract: This paper presents a wavelet transform and Support Vector Machine (SVM) based algorithm for estimating fault location on transmission lines. The Discrete Wavelet Transform (DWT) is used for data pre-processing, and these data are used for training and testing the SVM. Five mother wavelet families are used in the signal processing to identify the wavelet family most appropriate for estimating fault location. The results demonstrate the ability of the SVM to generalize from the provided patterns and to accurately estimate the location of faults with varying fault resistance.
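The pre-processing step can be sketched as a one-level discrete wavelet decomposition that splits a fault transient into approximation (low-pass) and detail (high-pass) coefficients; the detail energy is a typical feature fed to an SVM. Haar is used below purely for brevity, whereas the paper compares five mother wavelet families:

```python
# Hedged sketch: one-level Haar DWT and a simple detail-energy feature.
import math

def haar_dwt(signal):
    """One-level Haar DWT: (approximation, detail) coefficient lists."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def detail_energy(signal):
    """Sum of squared detail coefficients -- a simple transient feature."""
    _, d = haar_dwt(signal)
    return sum(c * c for c in d)

smooth = [1.0, 1.0, 1.0, 1.0]            # no transient -> zero detail energy
transient = [1.0, -1.0, 1.0, -1.0]       # sharp edges -> high detail energy
print(detail_energy(smooth))     # 0.0
print(detail_energy(transient))  # close to 4.0
```

Features of this kind, extracted from fault-generated transients, form the input patterns on which the SVM is trained and tested.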
Abstract: Today's children, who are born into a more colorful, more creative, more abstract and more accessible communication environment than their ancestors as a result of dizzying advances in technology, have an interesting capacity to perceive and make sense of the world. Millennium children, who live in an environment where marketing communication efforts of all kinds are more intensive than ever, are subject to all kinds of persuasive messages from early childhood on. Among marketing communication efforts, advertising outperforms all others in creating little consumer individuals and, through the processing of codes and signs, plays a significant part in building a world of seeing, thinking and understanding for children. Children who are raised with metaphorical expressions such as tales and riddles also meet this fast and effective communication of meaning in advertisements. The subject of this study is children's perception of metaphors, which help them grasp the "product and its promise" both verbally and visually and facilitate the association between them. By stimulating and activating the imagination, metaphors have unique advantages in promoting the product and its promise, especially in print advertisements, which have certain limitations. This study deals comparatively with literal and metaphoric versions of print advertisements belonging to various product groups and attempts to discover to what extent the advertisements are liked, recalled, perceived and persuasive. The sample group of the study, which was conducted in two elementary schools situated in areas with different socioeconomic features, consisted of children aged 12.
Abstract: Using a high-strength Pulsed Electric Field (PEF) in the food industry is a non-thermal process that can deactivate microorganisms and increase penetration into plant and animal tissues without serious impact on food taste and quality. In this paper, the design and fabrication of a PEF generator are presented. Pulse generation methods were surveyed and the best of them selected. Via its controller, the equipment can generate square pulses with adjustable parameters: amplitude 1-5 kV, frequency 0.1-10 Hz, pulse width 10-100 s, and duty cycle 0-100%. Setting the number of pulses and presenting the output voltage and current waveforms on an oscilloscope screen are further advantages of this equipment. Finally, some food samples were tested, yielding satisfactory results. Applying PEF had considerable effects on potato, banana and purple cabbage: it increased the Brix factor of potato solution from 0.05 to 0.15 and was also very effective in extracting color material from purple cabbage. In the last experiment, the effect of PEF voltage on color extraction from saffron scum was surveyed (yield increased by about 6%).
Abstract: The complex shape of the human pelvic bone was successfully imaged and modeled using finite element (FE) processing. The bone was subjected to quasi-static and dynamic loading conditions simulating the effects of both weight gain and impact. Loads varying between 500-2500 N (~50-250 kg of weight) were used to simulate 3D quasi-static weight gain. Two different 3D dynamic analyses, body free fall from two different heights (1 and 2 m) and forced side impact at two different velocities (20 and 40 km/h), were also studied. The computed stresses were compared for the four loading cases; under quasi-static loading, the von Mises stresses increase linearly with the weight gain. For the dynamic models, the von Mises stress histories were studied for the affected area and applied load with respect to time. The von Mises stresses normalized with respect to the applied load were used to compare the free-fall and forced-impact results. It was found that under the forced-impact loading condition an overlapping behavior was noticed, whereas for the free fall the normalized von Mises stress behavior was found to differ nonlinearly. This phenomenon was explained through the energy dissipation concept. This study will help designers in different specializations to define the weakest spots when designing various supporting systems.
Abstract: Cryo-electron microscopy (CEM) in combination with single particle analysis (SPA) is a widely used technique for elucidating structural details of macromolecular assemblies at close-to-atomic resolutions. However, the development of automated software for SPA processing remains vital, since thousands to millions of individual particle images need to be processed. Here, we present our workflow for automated particle picking. Our approach integrates peak-shape analysis with the classical correlation, together with an iterative approach that separates macromolecules from background by classification. This particle selection workflow furthermore provides a robust means for SPA with little user interaction. The performance of the presented tools is assessed by processing simulated and experimental data.
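The classical correlation step the workflow builds on can be illustrated in 1-D: slide a template over the data, compute normalized cross-correlation, and keep strong, locally maximal peaks. Real particle picking is 2-D and adds the peak-shape analysis and classification described above; the template, data, and threshold below are invented for illustration:

```python
# Hedged 1-D sketch of correlation-based picking.
import math

def ncc(window, template):
    """Normalized cross-correlation between two equal-length sequences."""
    n = len(template)
    mw, mt = sum(window) / n, sum(template) / n
    num = sum((w - mw) * (t - mt) for w, t in zip(window, template))
    den = math.sqrt(sum((w - mw) ** 2 for w in window)
                    * sum((t - mt) ** 2 for t in template))
    return num / den if den else 0.0

def pick(data, template, threshold=0.9):
    """Positions where the template correlates strongly with the data,
    keeping only local maxima of the score within one template length."""
    n = len(template)
    scores = [ncc(data[i:i + n], template) for i in range(len(data) - n + 1)]
    return [i for i, s in enumerate(scores)
            if s > threshold and s == max(scores[max(0, i - n):i + n])]

template = [0.0, 1.0, 2.0, 1.0, 0.0]          # idealized particle profile
data = [0.0] * 30
for pos in (5, 18):                           # two embedded "particles"
    for k, v in enumerate(template):
        data[pos + k] += v
print(pick(data, template))  # [5, 18]
```

Peak-shape analysis then inspects each correlation peak's width and symmetry to reject false positives that a pure threshold would accept.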
Abstract: This paper presents an efficient VLSI architecture design to achieve real-time video processing using the Full-Search Block Matching (FSBM) algorithm. The design employs a parallel-bank architecture with minimum latency, maximum throughput, and full hardware utilization. We use nine parallel processors in our architecture, each controlled by a state machine. The state-machine control implementation makes the design very simple and cost-effective. The design is implemented in VHDL, and the programming techniques we incorporated make the design completely programmable, in the sense that the search ranges and the block sizes can be varied to suit any given requirements. The design can operate at frequencies up to 36 MHz, and it can process QCIF and CIF video at 1.46 MHz and 5.86 MHz, respectively.
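A software model of the Full-Search Block Matching the architecture implements: exhaustively test every candidate displacement in the search range and keep the one minimizing the Sum of Absolute Differences (SAD). Frame sizes and the example data are illustrative; the hardware evaluates many candidates in parallel rather than in this sequential loop:

```python
# Hedged sketch of FSBM motion estimation with a SAD cost.

def sad(cur, ref, bx, by, dx, dy, n):
    """SAD between an n x n block at (bx, by) and its displaced reference."""
    return sum(abs(cur[by + j][bx + i] - ref[by + dy + j][bx + dx + i])
               for j in range(n) for i in range(n))

def full_search(cur, ref, bx, by, n, search_range):
    """Best motion vector (dx, dy) over the full +/- search_range window."""
    best = None
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            if not (0 <= bx + dx and bx + dx + n <= len(ref[0])
                    and 0 <= by + dy and by + dy + n <= len(ref)):
                continue  # candidate block would fall outside the frame
            cost = sad(cur, ref, bx, by, dx, dy, n)
            if best is None or cost < best[0]:
                best = (cost, dx, dy)
    return best[1], best[2]

# Reference frame with a bright 2x2 patch; the current frame has it shifted.
ref = [[0] * 8 for _ in range(8)]
ref[2][2] = ref[2][3] = ref[3][2] = ref[3][3] = 9
cur = [[0] * 8 for _ in range(8)]
cur[3][3] = cur[3][4] = cur[4][3] = cur[4][4] = 9
print(full_search(cur, ref, bx=3, by=3, n=2, search_range=2))  # (-1, -1)
```

Because full search visits every candidate, it is the most computation-heavy but also the optimal matching strategy, which is why a parallel hardware architecture is attractive.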
Abstract: A circular knitting machine produces fabric with more than two knitting tools. Variation of yarn tension between different knitting tools causes the loop lengths of stitches to differ during the knitting process. In this research, a new intelligent method is applied to control the loop length of stitches in the various tools, based on the ideal shape of stitches and the real angle of the stitch direction, since differing loop lengths cause stitch deformation and deviation of that angle. To measure the deviation of stitch direction caused by tension variation, an image processing technique was applied to pictures of different fabrics taken under constant front light. The measured deformation is then translated into the compensation of loop-length cam degree needed to cure the stitch deformation. A fuzzy control algorithm was applied to the loop-length modification in the knitting tools. The presented method was tested on knitted fabrics of various structures and yarns. The results show that the presented method is usable for controlling loop-length variation between different knitting tools, based on stitch deformation, for various knitted fabrics with different structures, densities and yarn types.
Abstract: In Blind Source Separation (BSS) processing, taking advantage of the scaling-factor indetermination and based on the floating-point representation, we propose a scaling technique applied to the separation matrix to avoid saturation or weakness in the recovered source signals. This technique performs an Automatic Gain Control (AGC) in an on-line BSS environment. We demonstrate the effectiveness of this technique using an implementation of a division-free BSS algorithm with two inputs and two outputs. The technique is computationally cheap and efficient for a hardware implementation.
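The scaling idea can be sketched as follows. Because BSS leaves each output's scale undetermined, the rows of the separation matrix may be rescaled freely; restricting the gain to powers of two only changes the floating-point exponent, so the rescaling is exact and division-free. The control rule below (steer each output's power toward a target) is an illustrative assumption, not the paper's exact AGC law:

```python
# Hedged sketch: power-of-two AGC scaling of one separation-matrix row.
import math

def agc_row_scale(row, output_power, target=1.0):
    """Scale a separation-matrix row by a power of two so the recovered
    source power lands near the target (avoiding saturation or weakness)."""
    # Amplitude gain needed, rounded to the nearest power-of-two exponent:
    # power scales with gain squared, hence the factor 0.5 on the log.
    exp = round(0.5 * math.log2(target / output_power))
    return [math.ldexp(w, exp) for w in row]   # exact: exponent shift only

row = [0.5, -0.25]          # a row of the separation matrix
power = 16.0                # measured power of that recovered source
scaled = agc_row_scale(row, power)
print(scaled)  # [0.125, -0.0625]  (gain 1/4 -> power scaled by 1/16)
```

In hardware, `ldexp` by a constant is just an exponent-field addition, which is why this AGC stays cheap in an on-line, division-free implementation.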
Abstract: Iris-based biometric authentication is gaining importance in recent times. Iris biometric processing, however, is a complex and computationally very expensive process. Within the overall processing of an iris-based biometric authentication system, feature processing is an important task: we extract iris features, which are ultimately used in matching. Since the number of iris features is large and computational time increases with the number of features, it is a challenge to develop an iris processing system with as few features as possible without compromising correctness. In this paper, we address this issue and present an approach to the feature extraction and feature matching process. We apply the Daubechies D4 wavelet with 4 levels to extract features from iris images. These features are encoded with 2 bits each by quantizing them into 4 quantization levels. With our proposed approach it is possible to represent an iris template with only 304 bits, whereas existing approaches require as many as 1024 bits. In addition, we assign different weights to different iris regions when comparing two iris templates, which significantly increases the accuracy. Further, we match the iris templates based on a weighted similarity measure. Experimental results on several iris databases substantiate the efficacy of our approach.
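The encoding and matching steps can be sketched as follows: quantize wavelet coefficients into 4 levels (2 bits each) and compare templates with a region-weighted mismatch fraction. The quantization thresholds and region weights are illustrative assumptions, not the paper's tuned values:

```python
# Hedged sketch: 2-bit quantization of features and weighted template matching.

def quantize(coeffs, edges=(-0.5, 0.0, 0.5)):
    """Map each coefficient to a 2-bit level (0-3) via three thresholds."""
    return [sum(1 for e in edges if c > e) for c in coeffs]

def weighted_distance(t1, t2, weights):
    """Weighted fraction of mismatching 2-bit codes between two templates."""
    num = sum(w for a, b, w in zip(t1, t2, weights) if a != b)
    return num / sum(weights)

coeffs_a = [-0.9, -0.2, 0.1, 0.8]
coeffs_b = [-0.9, -0.2, 0.6, 0.8]          # differs in the third region
ta, tb = quantize(coeffs_a), quantize(coeffs_b)
weights = [2.0, 2.0, 1.0, 1.0]             # e.g. inner iris regions weigh more
print(ta, tb)                              # [0, 1, 2, 3] [0, 1, 3, 3]
print(weighted_distance(ta, tb, weights))  # 1/6, about 0.167
```

With 152 coefficients at 2 bits each, a template of this form occupies the 304 bits cited above; the weighting lets discriminative regions dominate the similarity score.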
Abstract: The present study was carried out to evaluate the nutritional value of sorghum flour during the processing of injera (unleavened thick bread). The proximate composition of sorghum flour before and after fermentation, and that of injera, was determined. Compared to the raw and fermented flours, injera had lower protein (11.55%), ash (1.57%) and fat (2.40%) contents but a higher fiber content. Moreover, injera was found to have significantly (P ≤ 0.05) higher energy (389.08 kcal/100 g) than raw and fermented sorghum flour. Injera contained lower levels of anti-nutritional factors (polyphenols, phytate and tannins) than raw and fermented sorghum. It was also found to be richer in Ca (4.75 mg/100 g), Fe (3.95 mg/100 g), and Cu (0.7 mg/100 g) than the raw and fermented flours. Moreover, both the extractable minerals and the protein digestibility were high for injera, due to its low amount of anti-nutrients. Injera was found to contain appreciable amounts of all amino acids except arginine and tyrosine.
Abstract: Meteorological and environmental data are now widely used in applications such as plant variety selection systems. Selecting the right variety for a planting area is of utmost importance for all crops, including sugarcane, which has many varieties. A variety not suited to the climate or soil conditions of the planting area leads to poor growth, bloom drop, poor fruit, and low prices. This paper presents a plant variety selection system for planting areas in Thailand, built from meteorological and environmental data using decision tree techniques. With this software, developed as an environmental data analysis tool, analysis becomes easier and faster. Our software is a front end to WEKA that provides fundamental data mining functions such as classification, clustering, and analysis, and it also supports pre-processing, analysis, and decision tree output with result export. The software can then export and display the results on the Google Maps API in order to present them and plot plant icons effectively.
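The decision-tree machinery the tool exposes via WEKA rests on choosing split attributes by information gain (entropy reduction). The tiny climate/variety table below is invented purely for illustration:

```python
# Hedged sketch: entropy and information gain for decision-tree splitting.
import math

def entropy(labels):
    """Shannon entropy of a class-label list, in bits."""
    total = len(labels)
    probs = [labels.count(c) / total for c in set(labels)]
    return -sum(p * math.log2(p) for p in probs)

def info_gain(rows, attr, label):
    """Entropy reduction from splitting the rows on one attribute."""
    base = entropy([r[label] for r in rows])
    remainder = 0.0
    for v in set(r[attr] for r in rows):
        subset = [r[label] for r in rows if r[attr] == v]
        remainder += len(subset) / len(rows) * entropy(subset)
    return base - remainder

rows = [
    {"rainfall": "high", "soil": "clay", "variety": "A"},
    {"rainfall": "high", "soil": "sand", "variety": "A"},
    {"rainfall": "low", "soil": "clay", "variety": "B"},
    {"rainfall": "low", "soil": "sand", "variety": "B"},
]
print(info_gain(rows, "rainfall", "variety"))  # 1.0 (perfect split)
print(info_gain(rows, "soil", "variety"))      # 0.0 (uninformative)
```

A tree builder recursively picks the attribute with the highest gain; here it would split on rainfall first, which is exactly the kind of climate-driven rule the variety selection system surfaces.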
Abstract: This paper provides an introduction to the evolution of information and communication technology and illustrates its usage in the work domain. The paper is sub-divided into two parts. The first part gives an overview of the different phases of information processing in the work domain. It starts by charting the past and present usage of computers in work environments and shows current technological trends that are likely to influence future business applications. The second part starts by briefly describing how the usage of computers changed business processes in the past, and then presents the first Ambient Intelligence applications based on identification and localization information, which are already used in the production and retail sectors. Based on current systems and prototype applications, the paper gives an outlook on how Ambient Intelligence technologies could change business processes in the future.