Abstract: Tagging data (users, tags, and resources) constitute a folksonomy, the user-driven, bottom-up approach to organizing and classifying information on the Web. The tagging data stored in a folksonomy contain a great deal of useful information and knowledge. However, an appropriate approach for analyzing tagging data and discovering the hidden knowledge within them remains one of the main problems in folksonomy mining research. In this paper, we propose a folksonomy data mining approach based on Formal Concept Analysis (FCA) for easily discovering hidden knowledge in a folksonomy. We also demonstrate through an experiment how the proposed approach can be applied in a collaborative tagging system. The proposed approach can be applied to related areas such as social network analysis and semantic web mining.
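The FCA machinery the abstract refers to can be sketched on a toy projection of a folksonomy. The context, tag names, and resource names below are hypothetical and purely illustrative; the brute-force enumeration is not the paper's algorithm, only a minimal demonstration of what a formal concept is.

```python
from itertools import chain, combinations

# Hypothetical resource -> tags projection of a folksonomy (illustrative only).
context = {
    "page1": {"python", "web"},
    "page2": {"python", "ml"},
    "page3": {"web", "css"},
}

def extent(tags):
    """All resources carrying every tag in the given set."""
    return {r for r, ts in context.items() if tags <= ts}

def intent(resources):
    """All tags shared by every resource in the given set."""
    tag_sets = [context[r] for r in resources]
    if not tag_sets:
        return set(chain.from_iterable(context.values()))
    return set.intersection(*tag_sets)

def formal_concepts():
    """Brute-force enumeration of all (extent, intent) pairs; feasible
    only for tiny contexts, but it shows what FCA extracts."""
    all_tags = sorted(set(chain.from_iterable(context.values())))
    concepts = set()
    for k in range(len(all_tags) + 1):
        for tags in combinations(all_tags, k):
            e = extent(set(tags))
            concepts.add((frozenset(e), frozenset(intent(e))))
    return concepts
```

Each concept pairs a maximal set of resources with the full set of tags those resources share; for instance, ({page1, page2}, {python}) emerges as one concept of this toy context.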
Abstract: In this paper, we investigate a new stability analysis for discrete-time switched linear systems based on comparison systems, the overvaluing principle, the application of the Borne-Gentina criterion, and the Kotelyanski conditions. These stability conditions, derived from vector norms, correspond to a vector Lyapunov function. The switched system to be controlled is represented in companion form. A comparison system relative to a regular vector norm is used in order to obtain the simple arrow form of the state matrix, which permits a suitable application of the Borne-Gentina criterion for the establishment of sufficient conditions for global asymptotic stability. The proposed approach can provide a constructive solution to the state and static output feedback stabilization problems.
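For context, the Kotelyanski conditions invoked above can be stated in their standard textbook form (not quoted from the paper): a matrix $M$ whose off-diagonal entries are all nonnegative has all eigenvalues with negative real parts if and only if its leading principal minors alternate in sign,

```latex
(-1)^{k}\,\Delta_{k} > 0, \qquad k = 1,\dots,n,
```

where $\Delta_k$ is the determinant of the leading $k \times k$ submatrix of $M$. The Borne-Gentina criterion applies this test to an overvaluing matrix of the comparison system.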
Abstract: Parallel programming models exist as an abstraction of hardware and memory architectures. Several parallel programming models are in common use: the shared memory model, thread model, message passing model, data parallel model, hybrid model, Flynn's models, the embarrassingly parallel computations model, and the pipelined computations model. These models are not specific to a particular type of machine or memory architecture. This paper presents a model program for a concurrent approach to the data parallel model using Java programming.
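The paper's model program is in Java; as a language-neutral illustration, the data parallel pattern it describes (partition the data, run the same operation on every partition concurrently, merge the results) can be sketched in a few lines of Python. The function names here are invented for the sketch.

```python
from concurrent.futures import ThreadPoolExecutor

def square_chunk(chunk):
    # The same operation is applied to each partition of the data.
    return [x * x for x in chunk]

def parallel_squares(data, workers=4):
    """Data parallel model: split the input into chunks, process all
    chunks concurrently, then merge the partial results in order."""
    size = (len(data) + workers - 1) // workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(square_chunk, chunks)
    return [y for part in results for y in part]
```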
Abstract: This study performs a comparative analysis of the 21 Greek universities in terms of the public funding awarded to cover their operating expenditure. First, it introduces a DEA/MCDM model that allocates the funds among four expenditure factors in the most favorable way for each university. Then, it presents a common, consensual assessment model to reallocate the amounts while remaining at the same level of total public budget. The analysis shows that a number of universities cannot justify their public funding in terms of their size and operational workload. For these universities, a sufficient reduction of the public funding amount is estimated as a future target. Due to the lack of precise data for a number of expenditure criteria, the analysis is based on a mixed crisp-ordinal data set.
Abstract: Text categorization is the problem of classifying text documents into a set of predefined classes. In this paper, we investigate three approaches to building a meta-classifier in order to increase classification accuracy. The basic idea is to learn a meta-classifier that optimally selects the best component classifier for each data point. The experimental results show that combining classifiers can significantly improve classification accuracy and that our meta-classification strategy gives better results than any individual classifier. On 7083 Reuters text documents we obtained classification accuracies of up to 92.04%.
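The idea of selecting the best component classifier per data point can be sketched with a toy selection rule. This is an assumption-laden illustration, not the paper's method: here the meta-classifier simply reuses whichever component was correct on the nearest validation example. The component classifiers and data are invented.

```python
def euclid(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

class MetaClassifier:
    """Per-point classifier selection: for a new point, use the component
    classifier that was correct on the nearest validation example."""
    def __init__(self, components):
        self.components = components
        self.memory = []  # (validation point, index of a correct component)

    def fit(self, X_val, y_val):
        for x, y in zip(X_val, y_val):
            correct = [i for i, clf in enumerate(self.components) if clf(x) == y]
            self.memory.append((x, correct[0] if correct else 0))

    def predict(self, x):
        _, best = min(self.memory, key=lambda item: euclid(item[0], x))
        return self.components[best](x)

# Two toy component classifiers, each reliable in a different region.
by_first = lambda x: int(x[0] > 0)
by_second = lambda x: int(x[1] > 0)

meta = MetaClassifier([by_first, by_second])
meta.fit([(1, -1), (-1, 1)], [1, 1])
```

In a real text categorization setting the components would be trained text classifiers and the distance would operate on document feature vectors, but the selection logic is the same.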
Abstract: This study reveals that anti-immigrant policies in Europe result from a process of securitization and that, within this process, radical right parties have formulated discourses and approaches through a construction process that draws on common security themes. These security themes can be classified as national security, economic security, cultural security, and internal security. The frequency with which radical right parties use these themes may vary according to the specific historical, social, and cultural characteristics of a particular country.
Abstract: This paper presents a simplified version of Data Envelopment Analysis (DEA), a conventional approach to evaluating the performance and ranking of competitive objects characterized by two groups of factors acting in opposite directions: inputs and outputs. DEA with a Perfect Object (DEA PO) augments the group of actual objects with a virtual perfect object, the one having the greatest outputs and smallest inputs. This allows for obtaining an explicit analytical solution and taking a step toward absolute efficiency. The paper develops this approach further and introduces a DEA model with Partially Perfect Objects. DEA PPO consecutively eliminates the smallest relative inputs or greatest relative outputs, and applies DEA PO to the reduced collections of indicators. The partial efficiency scores are then combined into a weighted efficiency score. The computational scheme remains as simple as that of DEA PO, but DEA PPO has the advantage of taking into account all of the inputs and outputs of each actual object. Firm evaluation is considered as an example.
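The perfect-object construction can be illustrated with a minimal sketch. The efficiency formula below (equal weights, average normalized output over average normalized input) is a hypothetical simplification chosen for the illustration, not the paper's explicit analytical solution; only the perfect-object definition itself follows the abstract.

```python
def perfect_object(objects):
    """Virtual object with the greatest outputs and the smallest inputs."""
    n_in = len(objects[0]["inputs"])
    n_out = len(objects[0]["outputs"])
    return {
        "inputs": [min(o["inputs"][i] for o in objects) for i in range(n_in)],
        "outputs": [max(o["outputs"][r] for o in objects) for r in range(n_out)],
    }

def efficiency(obj, perfect):
    """Hypothetical ratio score against the perfect object: average
    normalized output divided by average normalized input."""
    out = sum(y / yp for y, yp in zip(obj["outputs"], perfect["outputs"])) / len(obj["outputs"])
    inp = sum(x / xp for xp, x in zip(perfect["inputs"], obj["inputs"])) / len(obj["inputs"])
    # inp >= 1 and out <= 1, so the score is at most 1,
    # with equality attained only by the perfect object itself.
    return out / inp
```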
Abstract: This paper presents a new steganography approach suitable for Arabic texts. It can be classified among steganography feature coding methods. The approach hides secret information bits within the letters, benefiting from their inherent pointing (dots). To mark the specific letters holding secret bits, the scheme considers two features: the existence of points in the letters and the redundant Arabic extension character (kashida). We use pointed letters with extension to hold the secret bit 'one' and unpointed letters with extension to hold 'zero'. This steganography technique is also attractive for other languages whose scripts are similar to Arabic, such as Persian and Urdu.
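The bit-to-letter mapping described above can be sketched directly. This is a simplified reading of the scheme: it ignores letter-joining rules (a kashida is only typographically valid after a connecting letter), so treat it as a sketch of the encoding idea rather than a faithful implementation.

```python
TATWEEL = "\u0640"  # the redundant Arabic extension character (kashida)
POINTED = set("بتثجخذزشضظغفقني")    # letters with dots
UNPOINTED = set("احدرسصطعكلمهو")    # letters without dots

def embed(cover, bits):
    """Append a kashida after a pointed letter to hide '1',
    after an unpointed letter to hide '0'."""
    pending = list(bits)
    out = []
    for ch in cover:
        out.append(ch)
        if pending:
            want = POINTED if pending[0] == "1" else UNPOINTED
            if ch in want:
                out.append(TATWEEL)
                pending.pop(0)
    if pending:
        raise ValueError("cover text too short for the message")
    return "".join(out)

def extract(stego):
    """Read back the bits: every kashida reveals one hidden bit."""
    bits = []
    for prev, ch in zip(stego, stego[1:]):
        if ch == TATWEEL:
            bits.append("1" if prev in POINTED else "0")
    return "".join(bits)
```

Letters not followed by a kashida carry no data, which is what makes the extension character "redundant" from the reader's point of view.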
Abstract: This paper reports a case study on how a conceptual and analytical thinking approach was used in the Art and Design Department at Multimedia University (Malaysia) to address the issues of one nation and its impact on society through artworks. The art project was designed for students to increase their know-how and develop creative thinking in design and communication. The goals of the design project were: (1) to develop creative thinking in design and communication, (2) to increase student understanding of the process of problem solving for design work, and (3) to use design elements and principles to generate interest, attention, and emotional responses. An exhibition entitled "One Nation" was showcased to local and international viewers consisting of the general public, professionals, academics, artists, and students. Findings indicate that the project supported several visual art standards and generated awareness in society. This project may be of interest to current and future art educators and others interested in the potential of using global issues as content for art, community, and environment studies for educational purposes.
Abstract: Software Development Risk Identification (SDRI) using Fault Tree Analysis (FTA) is a proposed technique to identify not only the risk factors but also the causes of the appearance of those risk factors in the software development life cycle. The method is based on analyzing the probable causes of software development failures before they become problems and adversely affect a project. It uses fault tree analysis to determine the probability of particular system-level failures, defined by a taxonomy for sources of software development risk, and to perform a failure analysis in which an undesired state of a system is examined by using Boolean logic to combine a series of lower-level events. The major purpose of this paper is to use the probabilistic calculations of the fault tree analysis approach to determine all possible causes that lead to the occurrence of software development risk.
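The probabilistic calculation behind a fault tree reduces to combining independent basic-event probabilities through AND and OR gates. The sketch below shows these standard gate formulas on an invented two-branch tree; the event names and probabilities are hypothetical, not taken from the paper's taxonomy.

```python
from functools import reduce

def p_and(probs):
    # AND gate: all independent basic events must occur.
    return reduce(lambda a, b: a * b, probs, 1.0)

def p_or(probs):
    # OR gate: at least one independent basic event occurs.
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

# Hypothetical basic-event probabilities.
unclear_reqs, changing_reqs = 0.10, 0.20
staff_turnover, low_skill = 0.05, 0.08

# Intermediate events feeding the top event "project failure".
requirements_failure = p_or([unclear_reqs, changing_reqs])  # 1 - 0.9 * 0.8
personnel_failure = p_and([staff_turnover, low_skill])      # 0.05 * 0.08
top_event = p_or([requirements_failure, personnel_failure])
```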
Abstract: Software estimation accuracy is among the greatest challenges for software developers. This study aimed at building and evaluating a neuro-fuzzy model to estimate software project development time. Forty-one modules developed from ten programs were used as the dataset. The proposed approach is compared with fuzzy logic and neural network models, and the results show that the MMRE (mean magnitude of relative error) obtained with the neuro-fuzzy model was substantially lower than the MMRE obtained with the fuzzy logic and neural network models.
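The evaluation metric used above is standard and easy to state precisely. The effort values in the example are invented for illustration.

```python
def mmre(actual, estimated):
    """Mean Magnitude of Relative Error over paired effort values:
    mean of |actual - estimated| / actual."""
    return sum(abs(a - e) / a for a, e in zip(actual, estimated)) / len(actual)

# Hypothetical actual vs estimated development times;
# individual MREs are 0.2, 0.1 and 0.1.
actual = [10.0, 20.0, 40.0]
estimated = [12.0, 18.0, 44.0]
```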
Abstract: This paper proposes an approach using a genetic algorithm for computing region-based image similarity. An image is represented by a set of segmented regions reflecting its color and texture properties, and is thus associated with a family of image features corresponding to the regions. The resemblance of two images is then defined as the overall similarity between the two families of image features, quantified by a similarity measure that integrates the properties of all the regions in the images. A genetic algorithm is applied to decide the most plausible matching. The performance of the proposed method is illustrated using examples from a database of general-purpose images and is shown to produce good results.
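A genetic algorithm for region matching can be sketched with a permutation encoding: each chromosome assigns every region of image A to a distinct region of image B, and fitness is the summed pairwise similarity. The similarity measure, operators, and parameters below are generic GA choices for illustration, not the paper's.

```python
import random

def region_sim(a, b):
    # Similarity of two region feature vectors (inverse Euclidean distance).
    d = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return 1.0 / (1.0 + d)

def fitness(perm, regions_a, regions_b):
    # Overall similarity of the matching encoded by the permutation.
    return sum(region_sim(regions_a[i], regions_b[j]) for i, j in enumerate(perm))

def crossover(p1, p2):
    # Order crossover (OX): copy a slice of p1, fill the rest from p2.
    n = len(p1)
    i, j = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[i:j] = p1[i:j]
    fill = [g for g in p2 if g not in child]
    for idx in range(n):
        if child[idx] is None:
            child[idx] = fill.pop(0)
    return child

def mutate(perm, rate=0.2):
    # Occasionally swap two assignments.
    if random.random() < rate:
        i, j = random.sample(range(len(perm)), 2)
        perm[i], perm[j] = perm[j], perm[i]
    return perm

def ga_match(regions_a, regions_b, pop_size=30, gens=100):
    """Evolve permutations toward the most plausible region matching."""
    n = len(regions_a)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda p: fitness(p, regions_a, regions_b), reverse=True)
        elite = pop[: pop_size // 2]
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=lambda p: fitness(p, regions_a, regions_b))
```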
Abstract: Contamination of tin tailings by heavy metals has generated interest in scientific approaches to their remediation. One such approach is phytoremediation, which uses tree species to extract heavy metals from contaminated soils. Tin tailings comprise slime and sand tailings. This paper reports only the findings for four timber species, namely Acacia mangium, Hopea odorata, Intsia palembanica and Swietenia macrophylla, on the removal of cadmium (Cd) and lead (Pb) from the slime tailings. Established methods were employed for sampling and soil analysis. Six trees of each species were randomly selected from a 0.25 ha plot for the extraction and determination of their heavy metals. The soil samples were systematically collected on a 5 x 5 m grid from each plot. Results showed that the concentrations of heavy metals in soils and trees varied according to species. Higher concentrations of heavy metals were found in the stem than in the primary roots of all the species. A. mangium accumulated the highest total amount of Pb on a per-hectare basis.
Abstract: Research on the effectiveness of environmental assessment (EA) is a milestone effort to evaluate the state of the field, involving many contributors from many countries over more than two decades. In the 1960s, there was a surge of interest among modern industrialized countries over the unexpected adverse effects of technological invention. This interest led to the adoption of approaches for assessing and predicting the impacts of technology and development on social and economic conditions, public health and safety, stability, and the environment. These approaches consist of risk assessment, technology assessment, environmental impact assessment and cost-benefit analysis. In this research contribution, the authors describe the research status of environmental assessment in cumulative environmental systems. The article discusses methods for cumulative effects assessment (CEA).
Abstract: Rapid economic development and population growth in Malaysia have accelerated the generation of solid waste. This puts pressure on Malaysia to achieve effective management of municipal solid waste (MSW), given the increased cost of landfill. This paper discusses optimal planning of waste-to-energy (WTE) using a combined simulation and optimization model based on a mixed integer linear programming (MILP) approach. The proposed multi-period model is tested on Iskandar Malaysia (IM) as a case study for a period of 12 years (2011-2025) to illustrate the economic potential and tradeoffs involved. Three scenarios are used to demonstrate the applicability of the model: (1) an incineration scenario, (2) a landfill scenario, and (3) an optimal scenario. The model revealed that the minimum cost of electricity generation from 9,995,855 tonnes of MSW is estimated at USD 387 million, with a total electricity generation of 50 MW/yr in the optimal scenario.
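The integer part of such a multi-period WTE model is a technology choice per period. The toy sketch below makes that structure concrete by exhaustively enumerating binary incinerate-vs-landfill decisions (what an MILP solver does far more cleverly); every cost and energy figure is invented and the model is drastically simplified relative to the paper's.

```python
from itertools import product

# Hypothetical per-tonne figures (illustrative only).
COST = {"incinerate": 45.0, "landfill": 30.0}   # USD per tonne
ENERGY = {"incinerate": 0.5, "landfill": 0.0}   # MWh per tonne

def plan(waste_per_year, min_total_mwh):
    """Pick one technology per year to minimise total cost while meeting
    a total electricity target, by brute force over the binary decisions
    that an MILP solver would branch on."""
    best = None
    for choice in product(COST, repeat=len(waste_per_year)):
        cost = sum(COST[c] * w for c, w in zip(choice, waste_per_year))
        mwh = sum(ENERGY[c] * w for c, w in zip(choice, waste_per_year))
        if mwh >= min_total_mwh and (best is None or cost < best[0]):
            best = (cost, choice)
    return best
```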
Abstract: This paper examines the problem of designing a robust H∞ state-feedback controller for a class of nonlinear two-time scale systems with Markovian jumps described by a Takagi-Sugeno (TS) fuzzy model. Based on a linear matrix inequality (LMI) approach, LMI-based sufficient conditions for the uncertain Markovian jump nonlinear two-time scale systems to have an H∞ performance are derived. The proposed approach does not involve the separation of states into slow and fast ones, and it can be applied not only to standard but also to nonstandard nonlinear two-time scale systems. A numerical example is provided to illustrate the design developed in this paper.
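For readers unfamiliar with LMI-based H∞ conditions, their flavor can be seen from the standard bounded real lemma for a linear system $\dot{x} = Ax + Bw$, $z = Cx$ (a textbook condition, not the paper's two-time scale, Markovian-jump result): $\lVert G \rVert_{\infty} < \gamma$ holds if there exists $P = P^{\top} \succ 0$ with

```latex
\begin{bmatrix}
A^{\top}P + PA + C^{\top}C & PB \\
B^{\top}P & -\gamma^{2} I
\end{bmatrix} \prec 0 .
```

The paper derives conditions of this type that additionally account for the fuzzy rules, the jump modes, and the two-time scale structure.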
Abstract: In this article we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of cryptographic algorithms are of great interest. Besides some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we prove mechanically. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities). These are given in the formal language of the proof system Isabelle/HOL. Moreover, we give a machine-checked proof of Bayes' formula. We then describe an application of the presented formalized probability distributions to cryptography. Furthermore, this article shows that computer proofs of complex cryptographic functions are possible by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards computer verification of cryptographic primitives. They provide a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research if the corresponding basic mathematical knowledge is available in a database.
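For reference, the Miller-Rabin test mentioned above is short enough to state in full. The version below is a standard Python rendition, not the paper's Isabelle/HOL implementation; it is the algorithm whose formal verification the article discusses.

```python
import random

def miller_rabin(n, rounds=20):
    """Probabilistic primality test: 'composite' answers are certain,
    'probably prime' answers err with probability at most 4**(-rounds)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 = 2**s * d with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness of compositeness
    return True
```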
Abstract: In this paper, we propose a robust scheme for face alignment and recognition under various influences. For face representation, illumination and variable expressions are important factors that affect the accuracy of facial localization and face recognition. To address these factors, we propose a robust approach consisting of two phases. In the first phase, face images are preprocessed with the proposed illumination normalization method, and the locations of facial features are fitted more efficiently and quickly based on the proposed image blending. In addition, based on template matching, we improve the active shape model (called IASM) to locate the face shape more precisely, which raises the recognition rate in the next phase. The second phase performs feature extraction using principal component analysis and face recognition using support vector machine classifiers. The results show that the proposed method achieves good facial localization and face recognition under varied illumination and local distortion.
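The PCA step in the second phase reduces each face image to its projections onto the leading eigenvectors of the data covariance. A minimal pure-Python sketch of finding one such direction by power iteration is given below; real eigenface pipelines work on high-dimensional image vectors and extract many components, so this is only the core idea.

```python
def top_principal_component(data, iters=200):
    """Power iteration for the leading eigenvector of the sample
    covariance matrix: the first PCA direction."""
    n, d = len(data), len(data[0])
    mean = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - mean[j] for j in range(d)] for row in data]
    # Sample covariance matrix C = X^T X / (n - 1).
    cov = [[sum(r[i] * r[j] for r in centered) / (n - 1) for j in range(d)]
           for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```

Features are then obtained by projecting each centered sample onto the returned direction(s), and those projections feed the SVM classifier.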
Abstract: This paper presents a fuzzy-logic-controlled shunt active power filter used to compensate for harmonic distortion in three-phase four-wire systems. The shunt active filter employs a simple method for the calculation of the reference compensation current based on the Fast Fourier Transform. The presented filter is able to operate under both balanced and unbalanced load conditions. A fuzzy-logic-based current controller strategy is used to regulate the filter current and hence ensure a harmonic-free supply current. The validity of the presented approach to harmonic mitigation is verified via simulation results of the proposed test system under different loading conditions.
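The reference-current calculation can be sketched on one cycle of sampled load current: extract the fundamental component (here via a single DFT bin rather than a full FFT, an assumption made for brevity), and take the remainder, i.e. the harmonic content, as the compensation reference.

```python
import math

def reference_compensation_current(load_current, samples_per_cycle):
    """Sketch of the FFT-based reference calculation: the compensation
    reference is the load current minus its fundamental component."""
    n = samples_per_cycle
    # Single-bin DFT at the fundamental frequency (both quadratures).
    re = sum(load_current[k] * math.cos(2 * math.pi * k / n) for k in range(n))
    im = sum(load_current[k] * math.sin(2 * math.pi * k / n) for k in range(n))
    fundamental = [2.0 / n * (re * math.cos(2 * math.pi * k / n)
                              + im * math.sin(2 * math.pi * k / n))
                   for k in range(n)]
    return [i_l - i_f for i_l, i_f in zip(load_current, fundamental)]
```

Injecting the negative of this reference through the shunt filter leaves only the fundamental in the supply current.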
Abstract: Textures are replications, symmetries and combinations of various basic patterns, usually with some random variation in the gray-level statistics. This article proposes a new approach to segmenting texture images. The proposed approach proceeds in two stages. First, the local texture information of a pixel is obtained by a fuzzy texture unit, and the global texture information of the image is obtained by the fuzzy texture spectrum. The purpose of this paper is to demonstrate the usefulness of the fuzzy texture spectrum for texture segmentation. The second stage of the method is devoted to a decision process, applying a global analysis followed by a fine segmentation focused only on ambiguous points. The proposed approach was applied to a brain image to identify the components of the brain, which were in turn used to locate a brain tumor and its growth rate.
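The fuzzy texture unit generalizes the classic crisp texture unit of He and Wang, which is simple enough to sketch: each of the 8 neighbors of a 3x3 window is compared with the center pixel and the resulting ternary pattern is encoded as a base-3 number; the histogram of these numbers over the image is the texture spectrum. The fuzzy version replaces the hard less/equal/greater decision with membership degrees, which this crisp sketch does not show.

```python
def texture_unit_number(neighborhood):
    """Crisp texture unit of a 3x3 window: compare each of the 8
    neighbors with the center (less=0, equal=1, greater=2) and encode
    the ternary pattern as a base-3 number in the range 0..6560."""
    center = neighborhood[1][1]
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    number = 0
    for i, (r, c) in enumerate(order):
        v = neighborhood[r][c]
        e = 0 if v < center else (1 if v == center else 2)
        number += e * 3 ** i
    return number
```

A perfectly uniform window encodes to 3280 (all digits equal to 1), the midpoint of the 6561 possible texture units.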