Abstract: The technique of k-anonymization has been proposed to obfuscate private data by associating it with at least k identities. This paper investigates the basic tabular structures that underlie the notion of k-anonymization using cell suppression. These structures are studied under idealized conditions to identify the essential features of the k-anonymization notion. We optimize data k-anonymization by requiring a minimum number of anonymized values, balanced over all columns and rows. We study the relationship between the sizes of the anonymized tables, the value k, and the number of attributes. This study has theoretical value in contributing to the development of a mathematical foundation for the k-anonymization concept. Its practical significance remains to be investigated.
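As a concrete illustration of the property itself (not the paper's optimization algorithm), k-anonymity requires that every combination of quasi-identifier values occur at least k times; cell suppression replaces values with a wildcard such as '*' so that rows merge into such groups. The column names below are hypothetical.

```python
from collections import Counter

def is_k_anonymous(rows, quasi_ids, k):
    """Check that every combination of quasi-identifier values
    occurs at least k times among the table's rows (dicts)."""
    counts = Counter(tuple(row[q] for q in quasi_ids) for row in rows)
    return all(c >= k for c in counts.values())

# Suppressed cells ('*') merge otherwise-distinct rows into one group.
table = [
    {"age": "3*", "zip": "130**"},
    {"age": "3*", "zip": "130**"},
    {"age": "3*", "zip": "130**"},
]
ok = is_k_anonymous(table, ["age", "zip"], k=3)
```

Here `ok` is `True`: all three rows share the same (suppressed) quasi-identifier tuple, so the group size meets k = 3.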
Abstract: This paper uses a fuzzy Kohonen neural network for medical image segmentation. Image segmentation plays an important role in many medical imaging applications by automating or facilitating diagnosis. The paper analyses tumors by extracting features (area, entropy, mean, and standard deviation); these measurements give a description of a tumor.
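The fuzzy Kohonen clustering network blends Kohonen-style learning with fuzzy c-means updates; the fuzzy c-means core can be sketched as below. This is a generic stand-in for the paper's network, applied to toy one-dimensional "pixel intensities", not the authors' implementation.

```python
import numpy as np

def fcm(X, c, m=2.0, iters=50, seed=0):
    """Minimal fuzzy c-means: alternate fuzzy membership and centroid
    updates. Illustrative stand-in for the clustering step of a fuzzy
    Kohonen network, which adds Kohonen-style learning-rate control."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # random fuzzy memberships
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))          # standard FCM membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

# Toy intensities with two obvious clusters.
X = np.array([[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]])
centers, U = fcm(X, c=2)
```

For segmentation, each pixel would be assigned to the cluster with the highest membership.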
Abstract: Motion capture devices have been utilized in producing content such as movies and video games. However, since motion capture devices are expensive and inconvenient to use, motions segmented from captured data are recycled and synthesized for reuse in other content, but the motions have generally been segmented manually by content producers. Therefore, automatic motion segmentation has recently attracted a lot of attention. Previous approaches are divided into on-line and off-line: on-line approaches segment motions based on similarities between neighboring frames, while off-line approaches segment motions by capturing global characteristics in feature space. In this paper, we propose a graph-based high-level motion segmentation method. Since high-level motions consist of several repeated frames within a temporal distance, we consider all similarities among all frames within that distance. This is achieved by constructing a graph in which each vertex represents a frame and the edges between frames are weighted by their similarity. The normalized cuts algorithm is then used to partition the constructed graph into several sub-graphs by globally finding minimum cuts. In the experiments, the proposed method outperformed a PCA-based method in the on-line setting and a GMM-based method in the off-line setting, as it segments motions globally from a graph built on similarities between neighboring frames as well as similarities among all frames within the temporal distance.
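A minimal sketch of the graph construction and a single normalized-cut bipartition, assuming one feature vector per frame and a Gaussian similarity within a temporal window; the paper's actual frame features and recursive multi-way partitioning are not reproduced here.

```python
import numpy as np

def ncut_bipartition(frames, window, sigma=1.0):
    """Split a frame sequence in two via a normalized cut. Edges connect
    frames within `window` time steps, weighted by a Gaussian of their
    feature distance; the sign of the Fiedler vector gives the partition."""
    n = len(frames)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(max(0, i - window), min(n, i + window + 1)):
            if i != j:
                d = np.linalg.norm(frames[i] - frames[j])
                W[i, j] = np.exp(-d**2 / (2 * sigma**2))
    deg = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    L = np.eye(n) - D_inv_sqrt @ W @ D_inv_sqrt    # normalized Laplacian
    _, vecs = np.linalg.eigh(L)
    return vecs[:, 1] >= 0                          # second-smallest eigenvector

# Two distinct "motions": low-valued frames then high-valued frames.
frames = np.array([[0.0], [0.1], [0.05], [5.0], [5.1], [4.9]])
labels = ncut_bipartition(frames, window=5)
```

With well-separated motions the sign of the Fiedler vector cleanly splits the two groups of frames.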
Abstract: The African Great Lakes Region refers to the zone around lakes Victoria, Tanganyika, Albert, Edward, Kivu, and Malawi. The main source of electricity in this region is hydropower, whose systems are generally characterized by relatively weak, isolated power schemes, poor maintenance, and technical deficiencies, with limited electricity infrastructure. Most of the hydro sources are rain-fed, so there is normally a shortage of water during dry seasons and extended droughts. In such circumstances, fossil fuel sources, in particular petroleum products and natural gas, are normally used to rescue the situation; but apart from being non-renewable, they also release huge amounts of greenhouse gases into the environment, accelerating global warming, which has now reached an alarming level. Wind power is an ample, renewable, widely distributed, clean, and free energy source that does not consume or pollute water. Wind-generated electricity is one of the most practical and commercially viable options for grid-quality, utility-scale electricity production. However, the main shortcoming of wind power generation is the fluctuation of its output in both space and time. Before deciding to establish a wind park at a site, the wind speed characteristics there should therefore be known thoroughly, as should the local demand and transmission capacity. The main objective of this paper is to use monthly average wind speed data collected from one prospective site within the African Great Lakes Region to demonstrate that the available wind power there is high enough to generate electricity. The mean monthly values were calculated from records gathered on an hourly basis over a period of 5 years (2001 to 2005) at a site in Tanzania. The records, collected at a height of 2 m, were extrapolated to a height of 50 m, the standard hub height of wind turbines. The overall monthly average wind speed was found to be 12.11 m/s, and June to November was established to be the windy season, as the wind speed during this season is above the overall monthly average. The available wind power density corresponding to the overall mean monthly wind speed was evaluated to be 1072 W/m2, a potential worth harvesting for electricity generation.
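The two calculations named above, height extrapolation and available power density, follow standard formulas, sketched below under assumptions the abstract does not state: a 1/7 power-law shear exponent and a sea-level air density of 1.225 kg/m³ (the paper's exact parameters may differ, which would explain its slightly lower 1072 W/m² figure).

```python
AIR_DENSITY = 1.225  # kg/m^3, standard sea-level value (assumed)

def extrapolate_speed(v_ref, h_ref, h_target, alpha=1/7):
    """Power-law wind shear: v(h) = v_ref * (h / h_ref)**alpha."""
    return v_ref * (h_target / h_ref) ** alpha

def power_density(v, rho=AIR_DENSITY):
    """Available wind power per unit swept area: P/A = 0.5 * rho * v^3 (W/m^2)."""
    return 0.5 * rho * v ** 3

v50 = 12.11  # m/s, overall monthly mean at 50 m reported in the abstract
print(round(power_density(v50)))  # ~1088 W/m^2 under these assumptions
```

The cubic dependence on wind speed is why a modest error in extrapolated speed translates into a large error in estimated power density.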
Abstract: Wireless Sensor and Actor Networks (WSANs) constitute an emerging and pervasive technology that is attracting increasing interest in the research community for a wide range of applications. WSANs have two important requirements: coordination interactions and real-time communication to perform correct and timely actions. This paper introduces a methodology that eases the task of the application programmer by focusing on the coordination and real-time requirements of WSANs. The methodology uses a real-time component model, UM-RTCOM, which supports the design and implementation of WSAN applications using the component-oriented paradigm. This allows us to develop software components offering very useful features, such as reusability and adaptability, which are well suited to WSANs as highly dynamic environments with rapidly changing conditions. In addition, a high-level coordination model based on tuple channels (TC-WSAN) is integrated into the methodology through a component-based specification of this model in UM-RTCOM; this satisfies both sensor-actor and actor-actor coordination requirements in WSANs. Finally, we present the design and implementation of an application that shows how the methodology can easily be used to develop WSAN applications.
Abstract: Learning from labeled and unlabeled data has received a considerable amount of attention in the machine learning community due to its potential for reducing the need for expensive labeled data. In this work we present a new method for combining labeled and unlabeled data based on classifier ensembles. The model we propose assumes that each classifier in the ensemble observes the input through a different set of features. The classifiers are initially trained on some labeled samples. The trained classifiers then learn further by labeling unknown patterns using a teaching signal generated from the decision of the classifier ensemble, i.e., the classifiers self-supervise each other. Experiments on a set of object images are presented, investigating different classifier models, fusion techniques, training-set sizes, and input features. The experimental results reveal that the proposed self-supervised ensemble learning approach reduces classification error relative to both a single classifier and the traditional ensemble classifier approach.
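The self-supervision loop described above can be sketched as follows, using minimal nearest-centroid classifiers as ensemble members and a majority vote as the teaching signal. The member classifier and the toy data are illustrative assumptions; the paper's classifier models and fusion techniques are not reproduced.

```python
import numpy as np

class NearestCentroid:
    """Minimal classifier used as an ensemble member (illustrative)."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[d.argmin(axis=1)]

def self_supervised_ensemble(X_lab, y_lab, X_unlab, views, rounds=3):
    """Each member observes a different feature subset ('view'); the
    ensemble's majority vote on unlabeled data is the teaching signal
    used to retrain every member."""
    members = [NearestCentroid().fit(X_lab[:, v], y_lab) for v in views]
    for _ in range(rounds):
        votes = np.stack([m.predict(X_unlab[:, v]) for m, v in zip(members, views)])
        pseudo = np.array([np.bincount(col).argmax() for col in votes.T])  # majority vote
        X_all = np.vstack([X_lab, X_unlab])
        y_all = np.concatenate([y_lab, pseudo])
        members = [NearestCentroid().fit(X_all[:, v], y_all) for v in views]
    return members, pseudo

# Two well-separated classes in 4-D; each member sees 2 of the 4 features.
X_lab = np.array([[0., 0., 0., 0.], [5., 5., 5., 5.]])
y_lab = np.array([0, 1])
X_unlab = np.array([[0.2, 0.1, 0.0, 0.1], [4.9, 5.2, 5.0, 4.8], [0.1, 0.0, 0.2, 0.1]])
members, pseudo = self_supervised_ensemble(X_lab, y_lab, X_unlab, views=[[0, 1], [2, 3]])
```

On this toy data the ensemble's pseudo-labels for the unlabeled points recover the true classes.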
Abstract: The novelty proposed in this study is twofold: the development of a new color similarity metric based on the human visual system, and a new color indexing scheme based on a textual approach. The proposed color similarity metric is based on the color perception of the human visual system; consequently, the results returned by the indexing system fulfill user expectations as closely as possible. We developed a web application to collect users' judgments about the similarities between colors, and used the results to estimate the proposed metric. To index an image's colors, we used a text indexing engine, which facilitates the integration of visual features into a database of text documents. The textual signature is built by weighting the image's colors according to their occurrence in the image. The use of a textual indexing engine provides a simple, fast, and robust solution for indexing images. A typical use of the proposed system is the development of applications whose data are both visual and textual. To evaluate the proposed method, we chose a price comparison engine as a case study, collecting a series of commercial offers each containing a textual description and an image representing the offer.
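The occurrence-weighted textual signature can be sketched as below: pixels are quantized to a small named palette and each color is emitted as a repeated term, so a text engine's term frequency mirrors the color's weight. The palette and the Euclidean nearest-color match are illustrative assumptions; the paper replaces the latter with its perceptual similarity metric.

```python
from collections import Counter

# Illustrative palette; the paper's perceptual metric would replace
# this Euclidean nearest-color match.
PALETTE = {"red": (255, 0, 0), "green": (0, 255, 0), "blue": (0, 0, 255),
           "black": (0, 0, 0), "white": (255, 255, 255)}

def nearest_color(rgb):
    return min(PALETTE, key=lambda n: sum((a - b) ** 2 for a, b in zip(rgb, PALETTE[n])))

def textual_signature(pixels, terms_total=10):
    """Weight each palette color by its occurrence and emit it as a
    repeated term, so term frequency reflects the color's weight."""
    counts = Counter(nearest_color(p) for p in pixels)
    n = len(pixels)
    sig = []
    for name, c in counts.most_common():
        sig += [name] * max(1, round(terms_total * c / n))
    return " ".join(sig)

# A mostly-red toy "image": 6 reddish, 3 bluish, 1 black pixel.
pixels = [(250, 10, 10)] * 6 + [(5, 5, 250)] * 3 + [(0, 0, 0)]
sig = textual_signature(pixels)
```

The resulting string can be handed directly to a text indexing engine alongside the offer's textual description.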
Abstract: An exchange algorithm with separate constraints on magnitude and phase error is presented in this paper in a new way. An important feature of the algorithms presented here is that they allow for design constraints that often arise in practical filter design problems. Meeting a required minimum stopband attenuation or a maximum deviation from the desired magnitude and phase responses in the passbands are common design constraints that can be handled by the proposed methods. The new algorithm may have important advantages over existing techniques with respect to the speed and stability of convergence, memory requirements, and ripple levels.
Abstract: The international community focuses on environmental protection and the management of natural energy sources as part of global cooperation against climate change and for sustainable growth. This study presents an overview of the water shortage status and the necessity of wastewater reuse facilities in military installations, and compares the economics of their introduction by means of cost-benefit analysis. Military characteristics such as the number of users of military barracks and their water use were surveyed according to the design principles for each facility type; the application method of the wastewater reuse facility was selected; the feed water, its application, and the reuse volume were defined; and the expected benefit was estimated, confirming the feasibility of introducing wastewater reuse by means of cost-benefit analysis.
Abstract: The HIV-1 genome is highly heterogeneous. Due to this variation, the features of the HIV-1 genome vary over a wide range, and the infectivity of the virus depends on which chemokine receptors it can use: R5 HIV viruses use the CCR5 coreceptor, X4 viruses use CXCR4, and R5X4 viruses can utilize both coreceptors. Recently, bioinformatics studies have sought to classify R5X4 viruses using experiments on the HIV-1 genome.
In this study, R5X4-type HIV viruses were classified using an Auto Regressive (AR) model together with Artificial Neural Networks (ANNs). The statistical data of R5X4, R5, and X4 viruses were analyzed using signal processing methods and ANNs. The accessible residues of these virus sequences were obtained and modeled by the AR model, since the residue sequences are long and differ in length from each other. Finally, the pre-processed data were used to train various ANN structures for determining R5X4 viruses. Furthermore, ROC analysis was applied to the ANNs to show their real performance. The results indicate that R5X4 viruses were successfully classified, with high sensitivity and specificity values in both training and testing ROC analysis for the RBF network, which gives the best performance among the ANN structures.
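The AR modeling step turns variable-length sequences into fixed-length feature vectors, which is what makes them usable as ANN inputs. A minimal least-squares AR fit is sketched below; the order, data, and fitting method are illustrative, not the paper's.

```python
import numpy as np

def ar_coefficients(x, order):
    """Least-squares fit of an AR(p) model x[n] = sum_k a_k * x[n-k] + e[n].
    The fixed-length coefficient vector can serve as a feature vector for
    an ANN regardless of the original sequence length."""
    x = np.asarray(x, dtype=float)
    A = np.array([x[i:i + order][::-1] for i in range(len(x) - order)])
    b = x[order:]
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    return a

# Noiseless AR(1) sequence with coefficient 0.5: the fit recovers it exactly.
x = [1.0]
for _ in range(19):
    x.append(0.5 * x[-1])
coeffs = ar_coefficients(x, order=1)
```

Sequences of different lengths all map to `order` coefficients, giving the ANN a uniform input dimension.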
Abstract: Most conventional flocking simulations have focused on flocks of a single species. Although single-species flocking does occur in nature, flocking is also frequently observed among multiple species. This paper studies flocking simulation for heterogeneous agents. To simulate flocks of heterogeneous agents, the conventional method uses a flock identifier, whereas the proposed method defines a feature vector for each agent and uses the similarity between agents obtained by comparing their feature vectors. Based on this similarity, the paper proposes attractive and repulsive forces and runs the simulation by applying the two forces. The simulation results showed that flock formation with heterogeneous agents is very natural in both cases. In addition, they showed that, unlike the existing method, the proposed method can not only control the density of the flocks but also allow two different groups of agents to flock close to each other if they have a high similarity.
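A minimal sketch of a similarity-driven force pair: cosine similarity between feature vectors scales attraction, and dissimilarity scales repulsion. The specific force law below is an illustrative assumption, not the paper's exact formulation.

```python
import numpy as np

def similarity(f1, f2):
    """Cosine similarity between agent feature vectors (in [0, 1] for
    nonnegative features)."""
    return float(f1 @ f2 / (np.linalg.norm(f1) * np.linalg.norm(f2)))

def pairwise_force(pos_i, pos_j, feat_i, feat_j, k_att=1.0, k_rep=1.0):
    """Illustrative force law: attraction scales with similarity,
    repulsion with dissimilarity (decaying with distance)."""
    offset = pos_j - pos_i
    dist = np.linalg.norm(offset) + 1e-9
    direction = offset / dist
    s = similarity(feat_i, feat_j)
    return (k_att * s - k_rep * (1.0 - s) / dist) * direction

p1, p2 = np.array([0.0, 0.0]), np.array([1.0, 0.0])
f_same = pairwise_force(p1, p2, np.array([1.0, 0.0]), np.array([1.0, 0.0]))
f_diff = pairwise_force(p1, p2, np.array([1.0, 0.0]), np.array([0.0, 1.0]))
```

Agents with identical features are pulled together (`f_same` points toward the neighbor), while dissimilar agents are pushed apart, which is how high inter-group similarity lets two groups flock close together.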
Abstract: European options on German electricity futures are examined using Lévy processes for the underlying asset. The evolution of implied volatility under each of the considered models is discussed after calibrating the Merton jump diffusion (MJD), variance gamma (VG), normal inverse Gaussian (NIG), Carr, Geman, Madan and Yor (CGMY), and Black and Scholes (B&S) models. Implied volatility is examined for the entire sample period, revealing some curious features of market evolution, and the data-fitting performance of the five models is compared. It is shown that variance gamma processes provide relatively better results and that implied volatility shows significant differences through time, having evolved upward. Volatility changes with changing uncertainty or, alternatively, with increasing futures prices, and there is evidence of the need to account for seasonality when modelling both electricity spot/futures prices and volatility.
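As background for the best-performing model above, a variance gamma process can be simulated as Brownian motion with drift run on a gamma time change, and a European call on a futures price priced by Monte Carlo. The parameters below are illustrative, not calibrated to the paper's data, and this is a generic VG pricer, not the authors' calibration procedure.

```python
import numpy as np

def vg_terminal(rng, n_paths, T, sigma, nu, theta):
    """Variance gamma increment over horizon T: Brownian motion with drift
    `theta` and volatility `sigma`, run on a gamma clock G ~ Gamma(T/nu, nu)."""
    G = rng.gamma(shape=T / nu, scale=nu, size=n_paths)
    Z = rng.standard_normal(n_paths)
    return theta * G + sigma * np.sqrt(G) * Z

def vg_call_price(F0, K, T, r, sigma, nu, theta, n_paths=200_000, seed=0):
    """Monte Carlo price of a European call on a futures price under an
    exponential VG model, with the usual mean correction so E[F_T] = F0."""
    rng = np.random.default_rng(seed)
    X = vg_terminal(rng, n_paths, T, sigma, nu, theta)
    omega = np.log(1.0 - theta * nu - 0.5 * sigma**2 * nu) / nu  # martingale correction
    F_T = F0 * np.exp(omega * T + X)
    return float(np.exp(-r * T) * np.mean(np.maximum(F_T - K, 0.0)))

# Illustrative parameters (hypothetical, not the paper's calibration).
atm = vg_call_price(F0=50.0, K=50.0, T=0.5, r=0.05, sigma=0.3, nu=0.2, theta=-0.1)
otm = vg_call_price(F0=50.0, K=60.0, T=0.5, r=0.05, sigma=0.3, nu=0.2, theta=-0.1)
```

The gamma-clock construction is what gives VG its excess kurtosis relative to Black-Scholes, which helps it fit electricity option smiles.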
Abstract: The image segmentation method described in this paper has been developed as a pre-processing stage for methodologies and tools for video/image indexing and retrieval by content. The method solves the problem of extracting whole objects from the background, producing images of single complete objects from videos or photos. The extracted images are used to calculate the object visual features necessary for both the indexing and retrieval processes.
The segmentation algorithm is based on the cooperation among an optical flow evaluation method, edge detection, and region growing procedures. The optical flow estimator belongs to the class of differential methods. It can detect motions ranging from a fraction of a pixel to a few pixels per frame, achieves good results in the presence of noise without the need for a filtering pre-processing stage, and includes a specialised model for moving object detection.
The first task of the presented method exploits cues from motion analysis to detect moving areas. Objects and background are then refined using edge detection and seeded region growing procedures, respectively. All the tasks are performed iteratively until objects and background are completely resolved.
The method has been applied to a variety of indoor and outdoor scenes in which objects of different types and shapes appear against variously textured backgrounds.
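Of the cooperating procedures above, seeded region growing is the most compact to sketch: starting from a seed pixel, the region absorbs neighbors whose intensity stays close to the running region mean. The 4-connectivity and tolerance rule below are generic assumptions, not the paper's exact formulation.

```python
import numpy as np
from collections import deque

def seeded_region_grow(image, seed, tol):
    """Grow a region from `seed` by absorbing 4-connected neighbors whose
    intensity is within `tol` of the running region mean."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    total, count = float(image[seed]), 1
    q = deque([seed])
    while q:
        i, j = q.popleft()
        for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            if 0 <= ni < h and 0 <= nj < w and not mask[ni, nj]:
                if abs(image[ni, nj] - total / count) <= tol:
                    mask[ni, nj] = True
                    total += float(image[ni, nj])
                    count += 1
                    q.append((ni, nj))
    return mask

# Bright 2x2 object on a dark background; seed inside the object.
image = np.array([[10., 10., 0.],
                  [10., 10., 0.],
                  [0.,  0.,  0.]])
mask = seeded_region_grow(image, (0, 0), tol=2.0)
```

The resulting mask covers exactly the bright object, which is the per-region behavior the cooperation scheme refines with edge and motion cues.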
Abstract: Stipples are desirable for pattern fills and transparency effects. However, some graphics standards, including OpenGL ES 1.1 and 2.0, omit this feature. We present the details of providing line stipples and polygon stipples by combining texture mapping and alpha blending functions. We start from the OpenGL-specified stipple-related API functions. The mathematical transformations needed to obtain the correct texture coordinates are explained in detail. The overall algorithm is then presented, followed by its implementation results. We implemented both line and polygon stipples and verified the results with conformance test routines.
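The texture-mapping emulation starts from the OpenGL line-stipple parameters: a 16-bit pattern (consumed LSB first, as in `glLineStipple`) and a repeat factor. Expanding them into a one-dimensional alpha texture can be sketched as follows; the 0/255 alpha encoding is an assumption of this sketch.

```python
def stipple_to_alpha_texture(pattern, factor=1):
    """Expand a 16-bit OpenGL-style line stipple pattern (LSB first)
    into a 1-D alpha texture: 255 where the bit is set, 0 elsewhere.
    Each bit is repeated `factor` times, as in glLineStipple."""
    texels = []
    for bit in range(16):
        on = (pattern >> bit) & 1
        texels.extend([255 if on else 0] * factor)
    return texels

# 0x00FF: first 8 bits on (dash), next 8 off (gap).
tex = stipple_to_alpha_texture(0x00FF, factor=2)
```

At draw time, each fragment samples this texture with a coordinate proportional to its distance along the line, and alpha testing or blending discards the gap fragments.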
Abstract: In this study, we developed an algorithm for detecting
seam cracks in a steel plate. Seam cracks are generated in the edge
region of a steel plate. We used the Gabor filter and an adaptive double
threshold method to detect them. To reduce the number of pseudo
defects, features based on the shape of seam cracks were used. To
evaluate the performance of the proposed algorithm, we tested 989
images with seam cracks and 9470 defect-free images. Experimental
results show that the proposed algorithm is suitable for detecting seam
cracks. However, it should be improved to increase the true positive
rate.
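The double-threshold step named above works like hysteresis thresholding: pixels above a high threshold seed defect regions, and pixels above a low threshold are kept only if connected to a seed. The sketch below uses fixed thresholds on a generic filter response; the paper's adaptive threshold selection and Gabor filtering are not reproduced.

```python
import numpy as np
from collections import deque

def double_threshold(response, low, high):
    """Hysteresis double threshold: pixels >= `high` are seeds; pixels
    >= `low` are kept only if 8-connected to a seed region."""
    strong = response >= high
    weak = response >= low
    out = np.zeros_like(response, dtype=bool)
    out[strong] = True
    q = deque(zip(*np.nonzero(strong)))
    while q:
        i, j = q.popleft()
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if (0 <= ni < response.shape[0] and 0 <= nj < response.shape[1]
                        and weak[ni, nj] and not out[ni, nj]):
                    out[ni, nj] = True
                    q.append((ni, nj))
    return out

# One strong pixel with an attached weak pixel, plus an isolated weak pixel.
response = np.array([[0.9, 0.5, 0.0],
                     [0.0, 0.0, 0.0],
                     [0.0, 0.0, 0.5]])
mask = double_threshold(response, low=0.4, high=0.8)
```

The attached weak pixel survives while the isolated one is suppressed, which is how the double threshold cuts down pseudo-defects.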
Abstract: In the highly competitive and rapidly changing global
marketplace, independent organizations and enterprises often come
together and form a temporary alignment of virtual enterprise in a
supply chain to better provide products or service. As firms adopt the
systems approach implicit in supply chain management, they must
manage the quality from both internal process control and external
control of supplier quality and customer requirements. How to
incorporate quality management of upstream and downstream supply
chain partners into their own quality management system has recently
received a great deal of attention from both academia and practice. This paper investigates the collaborative features and entity relationships in a supply chain, and presents an ontology of the collaborative supply chain from an approach that aligns a service-oriented framework with service-dominant logic. This perspective facilitates the separation of material flow management from manufacturing capability management, which provides a foundation for the coordination and integration of business processes to measure, analyze, and continually improve the quality of products, services, and processes. Further, this approach characterizes the different interests of supply chain partners, providing an innovative way to analyze the collaborative features of a supply chain. This ontology is also the foundation for developing a quality management system that internalizes quality management in upstream and downstream supply chain partners and manages quality in the supply chain systematically.
Abstract: Sonogram images of normal and lymphocytic thyroid tissues overlap considerably, which makes them difficult to interpret and distinguish. Classification from sonogram images of the thyroid gland is tackled in a semiautomatic way. When making a manual diagnosis from images, some relevant information may not be recognized by the human visual system; quantitative image analysis can therefore support the manual diagnostic process performed by the physician. Two classes are considered: normal tissue and chronic lymphocytic thyroiditis (Hashimoto's thyroiditis). The data structure is analyzed using K-nearest-neighbors classification. The paper shows that, unlike the energy of the wavelet subbands, histograms and Haralick features are not appropriate for distinguishing between normal tissue and Hashimoto's thyroiditis.
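The pipeline of wavelet subband energies fed to a K-nearest-neighbors vote can be sketched in miniature as below. The one-dimensional Haar decomposition and the toy smooth-versus-oscillating signals are illustrative assumptions standing in for the paper's 2-D wavelet features on sonogram images.

```python
import numpy as np

def haar_subband_energies(signal, levels=3):
    """Energies of 1-D Haar wavelet subbands: at each level, split the
    signal into approximation and detail and record the detail energy."""
    x = np.asarray(signal, dtype=float)
    energies = []
    for _ in range(levels):
        a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation
        d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail
        energies.append(float(np.sum(d ** 2)))
        x = a
    energies.append(float(np.sum(x ** 2)))      # final approximation energy
    return np.array(energies)

def knn_predict(X_train, y_train, x, k=3):
    """Plain k-nearest-neighbors majority vote in feature space."""
    dist = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(dist)[:k]]
    return int(np.bincount(nearest).argmax())

# Class 0: smooth signals; class 1: oscillating signals (toy data).
smooth = [np.full(8, c) for c in (1.0, 2.0, 3.0)]
rough = [c * np.array([1., -1., 1., -1., 1., -1., 1., -1.]) for c in (1.0, 2.0, 3.0)]
X_train = np.array([haar_subband_energies(s) for s in smooth + rough])
y_train = np.array([0, 0, 0, 1, 1, 1])
label = knn_predict(X_train, y_train, haar_subband_energies(np.full(8, 1.5)))
```

Smooth signals concentrate energy in the approximation band while oscillating ones load the detail bands, so the subband-energy feature separates the two toy classes cleanly.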
Abstract: Bioinformatics methods for predicting T cell coreceptor usage from the HIV-1 membrane protein sequence are investigated. In this study, we aim to propose an effective prediction method for the three-class classification problem of CXCR4 (X4), CCR5 (R5), and CCR5/CXCR4 (R5X4). We address the coreceptor prediction problem as follows: 1) proposing a feature set of informative physicochemical properties, used with an SVM to achieve a high prediction test accuracy of 81.48%, compared with an existing method's accuracy of 70.00%; 2) establishing a large, up-to-date data set by increasing its size from 159 to 1225 sequences to verify the proposed prediction method, for which the mean test accuracy is 88.59%; and 3) analyzing the set of 14 informative physicochemical properties to further understand the characteristics of HIV-1 coreceptors.
Abstract: This paper presents a simple method for estimating the additional load, as a factor of the existing load, that may be drawn before reaching the point of maximum line loadability of a radial distribution system (RDS) with different realistic load models at different substation voltages. The proposed method involves a simple line loadability index (LLI) that measures how close the present state of a line in the distribution system is to its loadability limit. The LLI can be used to assess voltage instability and the line loading margin. The proposed method is also compared with the existing maximum loadability index method [10]. The simulation results show that the LLI can identify not only the weakest line/branch causing system instability but also the system voltage collapse point as the index approaches unity. This feature enables an index threshold to be set to monitor and predict system stability on-line, so that proper action can be taken to prevent system collapse. To demonstrate the validity of the proposed algorithm, computer simulations are carried out on two-bus and 69-bus RDSs.
Abstract: This paper presents the design, analysis, and comparison of permanent magnet machines with different rotor types. The presented machines are designed with the same geometrical dimensions and the same materials for comparison. The main parameters of interior and exterior rotor type machines, including the eddy current effect, torque-speed characteristics, and magnetic analysis, are investigated using the MAXWELL program. With this program, the components of the permanent magnet machines can be calculated with high accuracy. Six types of permanent magnet machines are compared with respect to their topology, size, magnetic field, air gap flux, voltage, torque, losses, and efficiency. The analysis results demonstrate the effectiveness of the proposed machine design methodology. We believe this study will be a helpful resource for the examination and comparison of the basic structure and magnetic features of PM (permanent magnet) machines with different rotor structures.