Abstract: For a given specific problem, the design of an efficient algorithm has traditionally been the object of study. However, an alternative approach orthogonal to this one has emerged, called reduction. In general, the reduction approach studies how to convert an original problem into subproblems. This paper proposes a formal modeling language that supports this reduction approach in order to build solvers quickly. We show three examples from the wide area of learning problems. The benefit is fast prototyping of algorithms for a given new problem. Note that our formal modeling language is not intended to provide an efficient notation for data mining applications, but to assist a designer who develops solvers in machine learning.
Abstract: In this paper, we present a cost-effective wireless
distributed load shedding system for non-emergency scenarios. In
power transformer locations where a SCADA system cannot be used,
the proposed solution provides a reasonable alternative that combines
the use of microcontrollers and existing GSM infrastructure to send
early warning SMS messages to users advising them to proactively
reduce their power consumption before system capacity is reached
and systematic power shutdown takes place.
A novel communication protocol and message set have been
devised to handle the messaging between the transformer sites, where
the microcontrollers are located and where the measurements take
place, and the central processing site where the database server is
hosted. Moreover, the system sends warning messages to the end-users'
mobile devices, which are used as communication terminals. The
system has been implemented and tested, and experimental results are
presented.
Abstract: With the great advances in multimedia
technology, digital multimedia has become vulnerable to malicious
manipulations. In this paper, a public-key self-recovery block-based
video authentication technique is proposed which can not only
precisely localize alterations but also recover the missing
data with high reliability. In the proposed block-based technique,
multiple description coding (MDC) is used to generate two codes (two
descriptions) for each block. Although one block code (one
description) is enough to rebuild an altered block, the altered block
is rebuilt with better quality from the two block descriptions, so
using MDC increases the reliability of data recovery. A block signature is
computed using a cryptographic hash function and a doubly linked
chain is utilized to embed the block signature copies and the block
descriptions into the LSBs of distant blocks and the block itself. The
doubly linked chain scheme gives the proposed technique the
capability to thwart vector quantization attacks. In our proposed
technique, anyone can check the authenticity of a given video using
the public key. The experimental results show that the proposed
technique is reliable for detecting, localizing and recovering the
alterations.
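As an illustration of the embedding idea described above, the following minimal Python sketch (our own illustration, not the authors' implementation) hashes a block's pixel values and writes payload bits into the least significant bits of a carrier block. The paper's public-key signature is stood in for by a plain SHA-256 digest, and blocks are modeled as flat lists of 8-bit pixel values; all names and values are hypothetical:

```python
import hashlib

def block_signature(block):
    # digest of a block's pixel bytes; stands in for the paper's
    # public-key signature (an assumption for illustration)
    return hashlib.sha256(bytes(block)).digest()

def embed_lsb(carrier, payload_bits):
    # write payload bits into the least significant bits of a carrier block
    return [(pixel & ~1) | bit for pixel, bit in zip(carrier, payload_bits)]

def extract_lsb(block):
    # read the embedded bits back out
    return [pixel & 1 for pixel in block]

signature = block_signature([12, 34, 56, 78])     # 32-byte digest
bits = [1, 0, 1, 1, 0, 0, 1, 0]                   # hypothetical payload
stego = embed_lsb([10, 11, 12, 13, 14, 15, 16, 17], bits)
```

Each pixel changes by at most one intensity level, which is what makes LSB embedding visually imperceptible.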
Abstract: This paper addresses the problem of determining the current 3D location of a moving object and robustly tracking it from a sequence of camera images. The approach presented here uses a particle filter and does not perform any explicit triangulation. Only the color of the object to be tracked is required, not a precise motion model. The observation model we have developed avoids color filtering of the entire image. Together with the Monte Carlo techniques inside the particle filter, this provides real-time performance. Experiments with two real cameras are presented and lessons learned are discussed. The approach scales easily to more than two cameras and to new sensor cues.
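The predict-weight-resample cycle of a bootstrap particle filter of the kind the abstract describes can be sketched in a few lines. The 1-D example below is our own illustration, not the authors' code: the color-likelihood observation model is replaced by a hypothetical Gaussian score around a static target at x = 2.0, and the absent motion model is approximated by pure diffusion:

```python
import math
import random

def particle_filter_step(particles, weights, observe, noise=0.1, rng=random):
    # predict: diffuse each particle (no explicit motion model, as in the text)
    moved = [p + rng.gauss(0.0, noise) for p in particles]
    # weight: score each moved particle against the observation likelihood
    w = [wi * observe(p) for p, wi in zip(moved, weights)]
    total = sum(w) or 1.0
    w = [wi / total for wi in w]
    # resample: draw particles in proportion to their weights
    resampled = rng.choices(moved, weights=w, k=len(moved))
    return resampled, [1.0 / len(moved)] * len(moved)

# hypothetical 1-D run: track a static target at x = 2.0
rng = random.Random(0)
observe = lambda p: math.exp(-(p - 2.0) ** 2)   # color-likelihood stand-in
particles = [rng.uniform(0.0, 4.0) for _ in range(200)]
weights = [1.0 / 200] * 200
for _ in range(10):
    particles, weights = particle_filter_step(particles, weights, observe, rng=rng)
estimate = sum(particles) / len(particles)
```

After a few cycles the particle cloud concentrates around the likelihood mode, which is why no explicit triangulation is needed.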
Abstract: A dissimilarity measure between the empirical
characteristic functions of the subsamples associated with the different
classes in a multivariate data set is proposed. This measure can be
efficiently computed, and it depends on all the cases of each class. It
may be used to find groups of similar classes, which could be joined
for further analysis, or it could be employed to perform an
agglomerative hierarchical cluster analysis of the set of classes. The
final tree can serve to build a family of binary classification models,
offering an alternative approach to the multi-class SVM problem. We
have tested this dendrogram-based SVM approach against the one-against-one
SVM approach over four publicly available data sets,
three of them being microarray data. Both performances have been
found equivalent, but the first solution requires a smaller number of
binary SVM models.
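The kind of dissimilarity the abstract describes can be illustrated in a few lines of Python. The sketch below is our own one-dimensional simplification, not the authors' measure: it compares the empirical characteristic functions of two subsamples on a hypothetical frequency grid and sums the squared modulus of their gap:

```python
import cmath

def ecf(sample, t):
    # empirical characteristic function of a 1-D sample at frequency t
    return sum(cmath.exp(1j * t * x) for x in sample) / len(sample)

def ecf_dissimilarity(a, b, freqs):
    # squared-modulus gap between the two empirical CFs, summed over the grid
    return sum(abs(ecf(a, t) - ecf(b, t)) ** 2 for t in freqs)

freqs = [0.1 * k for k in range(1, 51)]      # hypothetical frequency grid
near = ecf_dissimilarity([0.0, 1.0, 2.0], [0.1, 1.1, 2.1], freqs)
far = ecf_dissimilarity([0.0, 1.0, 2.0], [5.0, 6.0, 7.0], freqs)
```

Because the measure depends on every case through the empirical CF, similar classes score low and could be merged before the hierarchical clustering step.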
Abstract: The public sector holds large amounts of data from
various areas such as social affairs, economy, or tourism. Various
initiatives such as Open Government Data or the EU Directive on
public sector information aim to make these data available for public
and private service providers. Requirements for the provision of
public sector data are defined by legal and organizational
frameworks. Surprisingly, the defined requirements hardly cover
security aspects such as integrity or authenticity.
In this paper we discuss the importance of these missing
requirements and present a concept to assure the integrity and
authenticity of provided data based on electronic signatures. We
show that our concept is well suited to the provisioning of
unaltered data. We also show that it can be extended
to data that needs to be anonymized before provisioning by
incorporating redactable signatures. Our proposed concept enhances
trust and reliability of provided public sector data.
Abstract: This paper presents an implementation of an object tracking system for video sequences. Object tracking is an important task in many vision applications. Video analysis involves two main steps: detection of interesting moving objects and tracking of such objects from frame to frame. Most tracking algorithms rely on pre-specified preprocessing methods. In our work, we have implemented several object tracking algorithms (Mean-shift, Camshift, Kalman filter) with different preprocessing methods. We have then evaluated the performance of these algorithms on different video sequences. The results obtained show good performance with respect to applicability and the evaluation criteria.
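Of the trackers listed, the Kalman filter is the simplest to sketch. The fragment below is a generic constant-position 1-D Kalman filter written for illustration; it is not the paper's implementation, and the process and measurement variances q and r are hypothetical:

```python
def kalman_1d(measurements, q=1e-3, r=0.5):
    # constant-position 1-D Kalman filter over a list of noisy measurements
    x, p = measurements[0], 1.0        # initial state and variance
    estimates = [x]
    for z in measurements[1:]:
        p += q                         # predict: state kept, variance grows
        k = p / (p + r)                # Kalman gain
        x += k * (z - x)               # correct with the innovation
        p *= 1.0 - k                   # posterior variance shrinks
        estimates.append(x)
    return estimates

track = kalman_1d([1.0, 1.2, 0.8, 1.1, 0.9])   # noisy positions around 1.0
```

The gain k balances trust between prediction and measurement, which is what smooths the jitter in the raw detections.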
Abstract: Rounding of coefficients is a common practice in the
hardware implementation of digital filters. When some coefficients
are very close to zero or one, as assumed in this paper, rounding
also leads to some reduction in computation. Furthermore, if the
discarded coefficient is of high order, a reduced-order filter is
obtained; otherwise the order does not change but computation is
reduced. In this paper, the least-squares approximation to the rounded
(or discarded) coefficient FIR filter is investigated. The result is also
succinctly extended to general FIR filters.
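A rough Python illustration of the rounding idea, written by us under the paper's assumption that some taps lie very close to zero or one (the tap values and tolerance below are hypothetical): near-zero taps are discarded, eliminating their multiplications, near-one taps become plain copies, and the frequency response changes only slightly:

```python
import cmath

def round_taps(h, tol=0.02):
    # round taps within tol of 0 or 1; everything else is kept exactly
    out = []
    for hk in h:
        if abs(hk) < tol:
            out.append(0.0)        # discarded tap: no multiplication needed
        elif abs(hk - 1.0) < tol:
            out.append(1.0)        # unity tap: multiplication becomes a copy
        else:
            out.append(hk)
    return out

def freq_response(h, w):
    # H(e^{jw}) of an FIR filter with taps h
    return sum(hk * cmath.exp(-1j * w * k) for k, hk in enumerate(h))

h = [0.01, 0.25, 0.99, 0.25, 0.01]        # hypothetical taps
hr = round_taps(h)                        # [0.0, 0.25, 1.0, 0.25, 0.0]
error = abs(freq_response(h, 0.5) - freq_response(hr, 0.5))
```

Since the two outer taps are discarded but the highest-order tap survives, the filter order here is unchanged while two multiplications are saved.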
Abstract: Medical image segmentation based on image smoothing followed by edge detection is of great importance in the field of image processing. This paper proposes a novel algorithm for medical image segmentation based on robust smoothing, achieved by identifying the type of noise present, followed by edge detection, an approach that holds considerable promise for medical image diagnosis. The main objective of the algorithm is to take a medical image as input, preprocess it to remove the noise content with a suitable filter chosen according to the identified noise type, and finally carry out edge detection for image segmentation. The algorithm consists of three parts. First, the type of noise present in the medical image is identified as additive, multiplicative or impulsive by analysis of local histograms, and the image is denoised with a median, Gaussian or Frost filter accordingly. Second, edge detection on the filtered medical image is carried out using the Canny edge detection technique. Third, the edge-detected medical image is segmented by the method of normalized-cut eigenvectors. The method is validated through experiments on real images, with the algorithm simulated on the MATLAB platform. The simulation results show that the proposed algorithm is effective: it can deal with low-quality or blurred images exhibiting high spatial redundancy, low contrast and heavy noise, and it has practical potential for medical image diagnosis.
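For the impulsive-noise branch of the first stage, a median filter is the standard remedy. The sketch below is a generic 3×3 median filter in plain Python (our illustration, not the paper's MATLAB code), with the image modeled as a list of rows of grey levels:

```python
def median_filter(img, k=1):
    # (2k+1) x (2k+1) median filter; border pixels are left untouched
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(k, h - k):
        for x in range(k, w - k):
            window = sorted(img[yy][xx]
                            for yy in range(y - k, y + k + 1)
                            for xx in range(x - k, x + k + 1))
            out[y][x] = window[len(window) // 2]   # the median replaces the pixel
    return out

noisy = [[10, 10, 10],
         [10, 255, 10],    # a single salt impulse in a flat region
         [10, 10, 10]]
clean = median_filter(noisy)
```

The impulse disappears because eight of the nine window values agree; a Gaussian or Frost filter would be substituted for the additive and multiplicative cases.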
Abstract: In this paper, three types of defected ground structure
(DGS) units, namely triangular-head (TH), rectangular-head (RH)
and U-shape (US), are investigated. They are then used in low-pass
and band-pass filter (LPF and BPF) designs, and the obtained
performances are examined. The LPF employing the RH-DGS geometry
offers the advantages of compact size, low insertion loss and wide
stopband compared to the other filters. It provides a cutoff frequency of
2.5 GHz, the largest 20 dB rejection bandwidth, from 2.98 to 8.76
GHz, the smallest transition region and the sharpest cutoff. The BPF based on RH-DGS has the highest bandwidth
(BW) of about 0.74 GHz and the lowest center frequency of 3.24
GHz, whereas the other BPFs have BWs less than 0.7 GHz.
Abstract: Determining depth of anesthesia is a challenging problem
in the context of biomedical signal processing. Various methods
have been suggested to determine a quantitative index of the depth of
anesthesia, but most of these methods suffer from high sensitivity
during surgery. A novel method based on the energy scattering of
samples in the wavelet domain is suggested to represent the basic
content of the electroencephalogram (EEG) signal. In this method, the
EEG signal is first decomposed into different sub-bands; the samples
are then squared, and the energy of the sample sequence is constructed
across each scale and time and normalized; finally, the entropy of the
resulting sequences is proposed as a reliable index. Empirical results
showed that applying the proposed method to EEG signals can
classify the awake, moderate and deep anesthesia states similarly to
BIS.
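The decompose-square-normalize-entropy pipeline can be sketched in plain Python. The fragment below is our own simplification, not the authors' method: a Haar wavelet stands in for whatever wavelet the paper uses, and the entropy is taken over the per-sub-band energy distribution of a short hypothetical signal:

```python
import math

def haar_step(signal):
    # one level of the Haar wavelet transform: approximation and detail halves
    approx = [(signal[2*i] + signal[2*i + 1]) / math.sqrt(2.0)
              for i in range(len(signal) // 2)]
    detail = [(signal[2*i] - signal[2*i + 1]) / math.sqrt(2.0)
              for i in range(len(signal) // 2)]
    return approx, detail

def subband_entropy(signal, levels=3):
    # decompose, square the samples, normalize the energies, take the entropy
    energies, current = [], list(signal)
    for _ in range(levels):
        current, detail = haar_step(current)
        energies.append(sum(d * d for d in detail))
    energies.append(sum(c * c for c in current))
    total = sum(energies) or 1.0
    p = [e / total for e in energies]
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

flat = subband_entropy([1.0] * 8)      # all energy lands in one band
ramp = subband_entropy([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
```

A flat signal concentrates its energy in a single band and scores zero entropy; richer signals spread energy across scales, which is the behaviour such an index exploits to separate anesthesia depths.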
Abstract: Freeze concentration freezes or crystallises the water
molecules out as ice crystals and leaves behind a highly concentrated
solution. In conventional suspension freeze concentration, where ice
crystals form as a suspension in the mother liquor, separation of the
ice is difficult. The size of the ice crystals is very limited, which
requires the use of scraped-surface heat exchangers; these are very
expensive and account for approximately 30% of the capital cost.
This research is conducted using a newer method of freeze
concentration, which is progressive freeze concentration. Ice crystals
were formed as a layer on the designed heat exchanger surface. In
this particular research, a helical structured copper crystallisation
chamber was designed and fabricated. The effect of two operating
conditions on the performance of the newly designed crystallisation
chamber was investigated, which are circulation flowrate and coolant
temperature. The performance of the design was evaluated by the
effective partition constant, K, calculated from the volume and
concentration of the solid and liquid phase. The system was also
monitored by a data acquisition tool in order to see the temperature
profile throughout the process. On completing the experimental
work, it was found that higher flowrate resulted in a lower K, which
translated into high efficiency. The efficiency is the highest at 1000
ml/min. It was also found that the process gives the highest
efficiency at a coolant temperature of -6 °C.
Abstract: A Bloom filter is a probabilistic, memory-efficient
data structure designed to answer rapidly whether an element is
present in a set. It can state with certainty that an element is not in
the set, but reports its presence only with a certain probability. The
trade-off in using a Bloom filter is a configurable risk of false
positives. The odds of a false positive can be made very low if the
number of hash functions is sufficiently large. For spam detection, a weight is attached to each set
of elements. The spam weight for a word is a measure used to rate the
e-mail. Each word is assigned to a Bloom filter based on its weight.
The proposed work introduces an enhanced concept in Bloom filter
called Bin Bloom Filter (BBF). The performance of BBF over
conventional Bloom filter is evaluated under various optimization
techniques. Real and synthetic data sets are used for the
experimental analysis, and results are demonstrated for bin sizes 4,
5, 6 and 7. Analysis of the results shows that the BBF
using heuristic techniques performs better than the traditional
Bloom filter in spam detection.
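To make the data structure concrete, here is a minimal Bloom filter in Python together with a bin-per-weight arrangement in the spirit of the BBF idea. Everything below is an illustrative assumption, not the paper's implementation: the k hash functions are derived from salted SHA-256, and the bins mirror the weights 4-7 mentioned above:

```python
import hashlib

class BloomFilter:
    # k salted hash functions over an m-bit array
    def __init__(self, m=1024, k=4):
        self.m, self.k, self.bits = m, k, bytearray(m)

    def _positions(self, item):
        # derive k positions from salted SHA-256 digests
        for salt in range(self.k):
            digest = hashlib.sha256(f"{salt}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def __contains__(self, item):
        # no false negatives; false positives with small probability
        return all(self.bits[pos] for pos in self._positions(item))

# one filter per spam-weight bin, as in the BBF arrangement
bins = {weight: BloomFilter() for weight in (4, 5, 6, 7)}
bins[7].add("lottery")   # a hypothetical high-weight spam word
```

Scoring an e-mail then reduces to looking each word up in every bin and summing the weights of the bins that report membership.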
Abstract: The preparation of size-controlled nano-particles of silver catalyst on a carbon substrate from e-waste has been investigated. A chemical route was developed by extraction of the available metals in nitric acid followed by treatment with hydrofluoric acid. Silver metal particles were deposited with an average size of 4-10 nm. A stabilizer concentration of 10-40 g/l was used. The average size of the prepared silver decreased with increasing anode current density. The size uniformity of the silver nano-particles improved distinctly at higher current densities of no more than 20 mA. Grain size increased with EK time, whereby aggregation of particles was observed after 6 h of reaction. The chemical method involves adsorption of silver nitrate on the carbon substrate. Adsorbed silver ions were directly reduced to metal particles using hydrazine hydrate. An alternative method is treatment with ammonia followed by heating the carbon-loaded silver hydroxide at 980 °C. The product was characterized with the help of XRD, XRF, ICP, SEM and TEM techniques.
Abstract: In this work, a special case of the image super-resolution
problem, where the only type of motion is global
translational motion and the blurs are shift-invariant, is investigated.
The necessary conditions for exact reconstruction of the original
image by using finite impulse-response reconstruction filters are
developed. Given that the conditions are satisfied, a method for exact
super-resolution is presented and some simulation results are shown.
Abstract: Oilsands bitumen is an extremely important source of
energy for North America. However, due to the presence of large
molecules such as asphaltenes, the density and viscosity of the
bitumen recovered from these sands are much higher than those of
conventional crude oil. As a result the extracted bitumen has to be
diluted with expensive solvents, or thermochemically upgraded in
large, capital-intensive conventional upgrading facilities prior to
pipeline transport. This study demonstrates that globally abundant
natural zeolites such as clinoptilolite from Saint Clouds, New Mexico
and Ca-chabazite from Bowie, Arizona can be used as very effective
reagents for cracking and visbreaking of oilsands bitumen. Natural
zeolite cracked oilsands bitumen products are highly recoverable (up
to ~ 83%) using light hydrocarbons such as pentane, which indicates
substantial conversion of heavier fractions to lighter components.
The resultant liquid products are much less viscous, and have lighter
product distribution compared to those produced from pure thermal
treatment. These natural minerals impart a similar effect on industrially
extracted Athabasca bitumen.
Abstract: Traditionally, the Internet has provided best-effort service to every user regardless of requirements. However, as the Internet becomes universally available, users demand more bandwidth, applications require more and more resources, and interest has developed in having the Internet provide some degree of Quality of Service (QoS). Although QoS is an important issue, the question of how it will be brought into the Internet has not yet been solved. Researchers, driven by rapid advances in technology, are proposing new and more desirable capabilities for the next generation of IP infrastructures. But not all applications demand the same amount of resources, nor are all users service providers. This paper is therefore the first in a series that presents an architecture as a first step toward the optimization of QoS in the Internet environment, as a solution to an SMSE's problem whose objective is to provide public Internet service with certain Quality of Service expectations. The service provides new business opportunities, but also presents new challenges. We have designed and implemented a scalable service framework that supports adaptive bandwidth based on user demands, and billing based on usage and on QoS. The developed application has been evaluated, and the results show that traffic limiting performs optimally, as does the distribution of excess bandwidth. Research is currently under way in two basic areas: (i) developing and testing new transfer protocols, and (ii) developing new strategies for traffic improvement based on service differentiation.
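Traffic limiting of the kind evaluated above is typically realised with a token-bucket policer. The sketch below is our generic illustration, with hypothetical rates and packet sizes, and makes no claim about the paper's actual mechanism: tokens accumulate at a fixed rate up to the bucket capacity, and a packet passes only if enough tokens are available:

```python
def token_bucket(events, rate, capacity):
    # events: (timestamp, size) pairs in time order; returns those that pass
    tokens, last, passed = capacity, 0.0, []
    for t, size in events:
        tokens = min(capacity, tokens + (t - last) * rate)   # refill
        last = t
        if size <= tokens:
            tokens -= size            # spend tokens and admit the packet
            passed.append((t, size))
    return passed

# a burst of four 5-unit packets against a 1-unit/s, 10-unit bucket
admitted = token_bucket([(0.0, 5), (0.1, 5), (0.2, 5), (2.0, 5)],
                        rate=1.0, capacity=10.0)
```

The first two packets drain the bucket; later packets are dropped until the refill catches up, which is the limiting behaviour that makes per-user QoS and usage-based billing enforceable.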
Abstract: Oxidative stress and the overwhelming free radicals
associated with diabetes mellitus are likely to be linked with the
development of certain complications such as retinopathy,
nephropathy and neuropathy. Treatment of diabetic subjects with
antioxidant may be of advantage in attenuating these complications.
Olive leaf (Oleaeuropaea), has been endowed with many beneficial
and health promoting properties mostly linked to its antioxidant
activity. This study aimed to evaluate the significance of
supplementation of Olive leaves extract (OLE) in reducing oxidative
stress, hyperglycemia and hyperlipidemia in Streptozotocin (STZ)-
induced diabetic rats. After induction of diabetes, a significant rise in
plasma glucose, lipid profiles except high-density lipoprotein
cholesterol (HDLc), and malondialdehyde (MDA), and a significant decrease
in plasma insulin, HDLc and plasma reduced glutathione (GSH), as
well as alteration in enzymatic antioxidants was observed in all
diabetic animals. During treatment of diabetic rats with 0.5 g/kg body
weight of olive leaves extract (OLE), the levels of plasma MDA,
GSH, insulin, lipid profiles, blood glucose and
erythrocyte antioxidant enzymes were significantly
restored to values not different from those of normal
control rats. Untreated diabetic rats on the other hand demonstrated
persistent alterations in the oxidative stress marker (MDA), blood
glucose, insulin, lipid profiles and the antioxidant parameters. These
results demonstrate that OLE may be of advantage in inhibiting
hyperglycemia, hyperlipidemia and oxidative stress induced by
diabetes, and suggest that administration of OLE may be helpful in
the prevention, or at least reduction, of diabetic complications
associated with oxidative stress.
Abstract: The design and implementation of a novel B-ACOSD CFAR algorithm is presented in this paper. It is proposed for detecting radar targets in a log-normal distributed environment. The B-ACOSD detector is capable of automatically estimating the number of interfering targets in the reference cells and detecting the real target with an adaptive threshold. The detector is implemented as a System on Chip on an FPGA Altera Stratix II using parallelism and pipelining techniques. For a reference window of 16 cells, the experimental results showed that the processor works properly at a processing speed of up to 115.13 MHz with a processing time of 0.29 µs, and thus meets the real-time requirements of a typical radar system.
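For readers unfamiliar with CFAR processing, the baseline idea can be sketched in a few lines. The fragment below implements plain cell-averaging CFAR, not the B-ACOSD variant of the paper (which additionally censors interfering targets among the ordered reference cells); the window sizes and scale factor are hypothetical:

```python
def ca_cfar(samples, guard=2, train=8, scale=4.0):
    # cell-averaging CFAR: each cell is compared against a threshold formed
    # from the mean of the surrounding training cells, guard cells excluded
    detections = []
    half = guard + train
    for i in range(half, len(samples) - half):
        left = samples[i - half : i - guard]
        right = samples[i + guard + 1 : i + half + 1]
        noise = sum(left + right) / (len(left) + len(right))
        if samples[i] > scale * noise:
            detections.append(i)
    return detections

clutter = [1.0] * 40
clutter[20] = 50.0           # one strong target in uniform clutter
hits = ca_cfar(clutter)
```

An ordered-statistics or censored detector such as B-ACOSD replaces the mean with a selected order statistic, so that interfering targets inside the window do not inflate the threshold.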
Abstract: In the field of bridges, whether newly built or
repaired, fast construction is required today more than ever. For this
reason, precast prefabricated bridges that enable rapid construction are
actively discussed and studied. In South Korea, they are called
modular bridges. The cross beam is an integral component of the
modular bridge. It serves to distribute loads, reduce bending
moments, and resist horizontal forces on the lateral upper structure. In
this study, the structural characteristics of domestic and foreign cross
beam types were compared. Based on this, alternative cross beam
connection types suitable for modular bridges were selected, and
bulb-T girder specimens were fabricated with each type of connection.
The behavior of each specimen was analyzed under static loading, and
the cross beam connection type expected to be best suited to the
modular bridge is proposed.