Abstract: With the evolution of technology, the expression of
opinions has largely shifted to the digital world. The domain of
politics, one of the hottest topics of opinion-mining research, is
here merged with behavior analysis for the determination of political
affiliation in texts, which constitutes the subject of this paper.
This study aims to classify the text of news articles and blogs as
either Republican or Democrat with the minimum number of features. As
an initial set, 68 features, 64 of which were Linguistic Inquiry and
Word Count (LIWC) features, were tested against 14 benchmark
classification algorithms. In later experiments, the dimensionality
of the feature vector was reduced using 7 feature selection
algorithms. The results show that the “Decision Tree”, “Rule
Induction” and “M5 Rule” classifiers, when used with the “SVM” and
“IGR” feature selection algorithms, performed best, reaching up to
82.5% accuracy on the given dataset. Further tests on a single
feature and on linguistic-based feature sets showed similar results.
The feature “Function”, an aggregate feature of the linguistic
category, was found to be the most differentiating among the 68
features, classifying articles as either Republican or Democrat with
an accuracy of 81%.
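Classifying with a single sufficiently discriminative feature amounts to a one-feature decision stump. The sketch below illustrates the idea on hypothetical per-article rates of LIWC "function"-word usage; the values, labels, and resulting threshold are invented for illustration and are not taken from the paper's dataset:

```python
def best_stump(values, labels):
    """One-feature 'decision stump': try midpoints between consecutive
    sorted feature values and keep the threshold/orientation with the
    highest training accuracy."""
    pairs = sorted(zip(values, labels))
    best = (0.0, None, None)          # (accuracy, threshold, label above)
    for i in range(len(pairs) - 1):
        thr = (pairs[i][0] + pairs[i + 1][0]) / 2
        for above, below in (("R", "D"), ("D", "R")):
            hits = sum(1 for v, y in pairs
                       if y == (above if v > thr else below))
            acc = hits / len(pairs)
            if acc > best[0]:
                best = (acc, thr, above)
    return best

# Hypothetical per-article function-word rates (illustrative only)
values = [0.42, 0.45, 0.47, 0.55, 0.58, 0.60]
labels = ["D", "D", "D", "R", "R", "R"]
acc, thr, above = best_stump(values, labels)
```

On these toy values the stump separates the two classes perfectly with a threshold midway between the highest "D" and lowest "R" rate.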
Abstract: Any signal transmitted over a channel is corrupted by noise and interference. A host of channel coding techniques has been proposed to alleviate the effects of such noise and interference. Among these, Turbo codes are recommended because of their increased capacity at higher transmission rates and their superior performance over convolutional codes. Multimedia elements, which involve large amounts of data, are best protected by Turbo codes. A Turbo decoder employs the Maximum A-posteriori Probability (MAP) and Soft Output Viterbi Algorithm (SOVA) decoding algorithms. Conventional Turbo-coded systems employ Equal Error Protection (EEP), in which all the data in an information message are protected uniformly. Some applications require Unequal Error Protection (UEP), in which the level of protection is higher for important information bits than for the others. In this work, the traditional Log MAP decoding algorithm is enhanced by using optimized scaling factors for both decoders. The error-correcting performance in the presence of UEP over the Additive White Gaussian Noise (AWGN) channel and Rayleigh fading is analyzed for the transmission of an image, with the Discrete Cosine Transform (DCT) as the source coding technique. This paper compares the performance of the log MAP, Modified log MAP (MlogMAP) and Enhanced log MAP (ElogMAP) algorithms for image transmission. The MlogMAP algorithm is found to be best at lower Eb/N0 values, but at higher Eb/N0 the ElogMAP algorithm with optimized scaling factors performs better. The performance comparison between the AWGN and fading channels indicates the robustness of the proposed algorithm. According to the performance of the three message classes, class 3 is protected more strongly than the other two. From the performance analysis, it is observed that the ElogMAP algorithm with UEP is best for the transmission of an image compared to the Log MAP and MlogMAP decoding algorithms.
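The log-MAP recursions are built from the Jacobian logarithm (the max* operation), and scaled variants multiply the extrinsic information exchanged between the two constituent decoders by a correction factor. A minimal sketch of both ingredients; the factor 0.7 used here is a commonly quoted default, standing in for the paper's optimized factors:

```python
from math import exp, log1p

def max_star(a, b):
    """Jacobian logarithm at the core of log-MAP decoding:
    max*(a, b) = log(e^a + e^b) = max(a, b) + log(1 + e^-|a-b|)."""
    return max(a, b) + log1p(exp(-abs(a - b)))

def scale_extrinsic(llrs, factor=0.7):
    """Scale the extrinsic LLRs passed between the two constituent
    decoders; 0.7 is a commonly used value, not the paper's optimized one."""
    return [factor * llr for llr in llrs]
```

The correction term `log1p(exp(-|a-b|))` is what the plain max-log-MAP approximation drops; scaling the extrinsic LLRs partially compensates for that loss.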
Abstract: This paper presents a study of three algorithms: channel
equalization with the ZF and MMSE criteria, applied to the BRAN A
channel, and the adaptive filtering algorithms LMS and RLS used to
estimate the parameters of the equalizer filter, i.e., to track the
channel estimate and thereby follow the temporal variations of the
channel and reduce the error in the transmitted signal. The
performance of the equalizer under the ZF and MMSE criteria is first
assessed in the noiseless case, followed by a comparison of the
performance of the LMS and RLS algorithms.
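The LMS update itself is only a few lines. A minimal sketch of LMS-based identification of an unknown FIR channel; the 3-tap toy channel, the step size, and the noiseless setting are assumptions for illustration, not the BRAN A configuration of the paper:

```python
import random

def lms_identify(h, n_samples=20000, mu=0.02, seed=1):
    """Identify the unknown FIR channel h with the LMS algorithm."""
    rng = random.Random(seed)
    taps = len(h)
    w = [0.0] * taps                  # adaptive filter weights
    x = [0.0] * taps                  # input delay line
    for _ in range(n_samples):
        x = [rng.uniform(-1.0, 1.0)] + x[:-1]       # shift in a new sample
        d = sum(hi * xi for hi, xi in zip(h, x))    # channel (desired) output
        y = sum(wi * xi for wi, xi in zip(w, x))    # adaptive filter output
        e = d - y                                   # estimation error
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]   # LMS update
    return w

h_true = [0.9, 0.4, -0.2]   # toy channel impulse response (assumed)
w_est = lms_identify(h_true)
```

In the noiseless case the weights converge to the true channel taps; RLS reaches the same solution in far fewer samples at a higher per-sample cost.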
Abstract: Hybrid electric vehicles can reduce pollution and
improve fuel economy. Power-split hybrid electric vehicles (HEVs)
provide two power paths between the internal combustion engine
(ICE) and energy storage system (ESS) through the gears of an
electrically variable transmission (EVT). The EVT allows the ICE to
operate independently of vehicle speed at all times. Therefore, the ICE can
operate in the efficient region of its characteristic brake specific fuel
consumption (BSFC) map. The two-mode powertrain can operate in
input-split or compound-split EVT modes and in four different fixed
gear configurations. Power-split architecture is advantageous because
it combines conventional series and parallel power paths. This
research focuses on input-split and compound-split modes in the
two-mode power-split powertrain. Fuzzy Logic Control (FLC) for the
ICE and PI control for the electric machines
(EMs) are derived for the urban driving cycle simulation. These
control algorithms reduce vehicle fuel consumption and improve ICE
efficiency while maintaining the state of charge (SOC) of the energy
storage system in an efficient range.
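The PI part of the control scheme can be sketched on a generic first-order plant; this toy plant and the gains are assumptions for illustration, standing in for the electric-machine dynamics rather than the paper's vehicle model:

```python
def simulate_pi(kp, ki, setpoint=1.0, dt=0.01, steps=3000):
    """Discrete PI control of a first-order plant dx/dt = -x + u,
    a crude stand-in for an electric machine's speed loop."""
    x, integral = 0.0, 0.0
    for _ in range(steps):
        error = setpoint - x
        integral += error * dt
        u = kp * error + ki * integral   # PI control law
        x += dt * (-x + u)               # Euler step of the plant
    return x

speed = simulate_pi(kp=2.0, ki=5.0)
```

The integral term removes the steady-state error that a proportional-only controller would leave, so the plant output settles at the setpoint.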
Abstract: In this paper we propose a computer-aided solution based
on Genetic Algorithms to reduce the effort of drafting the reports
(the FMEA analysis and the Control Plan) required at product launch
in manufacturing, and to improve the knowledge of development teams
for future projects. The solution allows the design team to enter
the data required for the FMEA. The actual analysis is performed
using Genetic Algorithms to find an optimum between the RPN risk
factor and the cost of production. A feature of Genetic Algorithms
is that they can serve as a means of finding solutions to
multi-criteria optimization problems; in our case, the three
specific FMEA risk factors are considered together with the
reduction of the production cost. The analysis tool generates the
final reports for all FMEA processes. The data obtained in the FMEA
reports are automatically integrated, together with the other
entered parameters, into the Control Plan. The solution is
implemented as an application running on an intranet over two
servers: one hosting the analysis and plan-generation engine, the
other hosting the database where the initial parameters and the
results are stored. The results can then be reused as starting
solutions in the synthesis of other projects. The solution was
applied to the welding, laser cutting and bending processes used to
manufacture bus chassis. Its advantages are the efficient
elaboration of documents in the current project, by automatically
generating the FMEA and Control Plan reports through multi-criteria
optimization of production, and the building of a solid knowledge
base for future projects. Being implemented with Open Source tools,
the proposed solution is a cheap alternative to other solutions on
the market.
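One way the RPN/cost trade-off can be encoded is sketched below: each gene selects one mitigation option per process step, and the fitness is a weighted sum of total RPN (Severity x Occurrence x Detection) and extra cost. All option data, weights and GA settings are invented for illustration and are not the paper's:

```python
import random

# Hypothetical FMEA data: for each process step (weld, laser cut, bend),
# candidate actions with (Severity, Occurrence, Detection) ratings and the
# extra production cost of the action. All numbers are invented.
OPTIONS = [
    [(8, 6, 5, 0), (8, 3, 5, 40), (8, 3, 2, 90)],
    [(6, 5, 4, 0), (6, 2, 4, 30), (6, 2, 2, 70)],
    [(7, 4, 6, 0), (7, 4, 3, 25), (7, 2, 3, 60)],
]
W_RPN, W_COST = 1.0, 1.5          # assumed weights of the two criteria

def score(genome):
    """Weighted sum of total RPN (S*O*D) and total extra cost (minimized)."""
    total = 0.0
    for step, choice in enumerate(genome):
        s, o, d, cost = OPTIONS[step][choice]
        total += W_RPN * (s * o * d) + W_COST * cost
    return total

def evolve(pop_size=30, gens=60, p_mut=0.2, seed=0):
    rng = random.Random(seed)
    n = len(OPTIONS)
    pop = [tuple(rng.randrange(3) for _ in range(n)) for _ in range(pop_size)]
    best = min(pop, key=score)
    for _ in range(gens):
        nxt = [best]                                   # elitism
        while len(nxt) < pop_size:
            p1 = min(rng.sample(pop, 3), key=score)    # tournament selection
            p2 = min(rng.sample(pop, 3), key=score)
            cut = rng.randrange(1, n)                  # one-point crossover
            child = list(p1[:cut] + p2[cut:])
            if rng.random() < p_mut:                   # mutation
                child[rng.randrange(n)] = rng.randrange(3)
            nxt.append(tuple(child))
        pop = nxt
        best = min(pop, key=score)
    return best, score(best)

plan, objective = evolve()
```

Because the fitness is a scalarized combination of the two criteria, shifting W_RPN and W_COST traces out different compromises between risk and cost.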
Abstract: In this study, the pedestrian simulation platform VISWALK
is integrated with a program implementing ant algorithms to
construct a schedule-planning model for renovation engineering. The
simulation platform models the construction site; after the delays
experienced by users walking through the site under construction are
computed, the ant algorithm searches for the schedule plan with the
minimum delay time, and the loss of business is estimated from the
deactivated floor area. Finally, the best schedule plan is selected
by weighing the two different positions of the owner and of the
users. To assess and validate its effectiveness, the model was
applied to a floor-renovation engineering case in a shopping mall.
The case study shows that the schedule plan produced by the proposed
model can effectively reduce both the delay time and the mall's loss
of business due to users' walking, keeping the impact of the
renovation work on the facilities in the building to a minimum.
Abstract: Mammography is a widely used technique for breast cancer
screening. There are various other techniques for breast cancer
screening, but mammography is the most reliable and effective one. The
images obtained through mammography are of low contrast, which makes
them difficult for radiologists to interpret. Hence, a high-quality
image is mandatory before any kind of information can be extracted
from it. Many contrast enhancement algorithms have been developed over
the years. In the present work, an efficient morphology-based
technique is proposed for the contrast enhancement of masses in
mammographic images. The proposed method is based on Multiscale
Morphology and takes into consideration the scale of the structuring
element. The proposed method is compared with other state-of-the-art
techniques. The experimental results show that it is better, both
qualitatively and quantitatively, than the other standard contrast
enhancement techniques.
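A single-scale sketch of the underlying morphological operations may help fix ideas: the paper's method varies the structuring-element scale, whereas this toy version uses one fixed 3x3 flat structuring element and adds the white top-hat (bright detail) while subtracting the black top-hat (dark detail):

```python
def _filter(img, op, size=1):
    """Apply min (erosion) or max (dilation) over a flat
    (2*size+1)x(2*size+1) window, clamped at the image borders."""
    h, w = len(img), len(img[0])
    return [[op(img[a][b]
                for a in range(max(0, i - size), min(h, i + size + 1))
                for b in range(max(0, j - size), min(w, j + size + 1)))
             for j in range(w)] for i in range(h)]

def erode(img):
    return _filter(img, min)

def dilate(img):
    return _filter(img, max)

def enhance(img):
    """out = img + (img - opening) - (closing - img)
           = 3*img - opening - closing (top-hat contrast enhancement)."""
    opening = dilate(erode(img))     # suppresses small bright structures
    closing = erode(dilate(img))     # suppresses small dark structures
    return [[3 * img[i][j] - opening[i][j] - closing[i][j]
             for j in range(len(img[0]))] for i in range(len(img))]

img = [[0] * 5 for _ in range(5)]
img[2][2] = 5                        # one small bright detail
out = enhance(img)
```

Small bright details removed by the opening are re-injected with doubled amplitude, which is precisely how top-hat enhancement boosts the local contrast of mass-like structures.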
Abstract: In a practical power system, the power plants are not
located at the same distance from the load centers and their fuel
costs differ. Moreover, under normal operating conditions the
generation capacity exceeds the total load demand plus losses, so
there are many options for scheduling generation. In an
interconnected power system, the objective is to find the real and
reactive power schedule of each power plant that minimizes the
operating cost: the generators' real and reactive powers are allowed
to vary within certain limits so as to meet a particular load demand
with minimum fuel cost. This is called the optimal power flow
problem. In this paper, the Economic Load Dispatch (ELD) of real
power generation is considered, i.e., the scheduling of generators
to minimize the total operating cost of the generating units,
subject to the equality constraint of power balance and to the
minimum and maximum operating limits of the units. Genetic
algorithms are applied: ELD solutions are found by solving the
conventional load flow equations while simultaneously minimizing the
fuel costs.
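The paper solves the constrained problem with genetic algorithms; as a reference point, the unconstrained core of ELD has a classical closed form. With quadratic cost curves F_i(P) = a_i + b_i*P + c_i*P^2 and losses and limits neglected, all units run at the same incremental cost lambda. The cost coefficients below are textbook-style example data, not the paper's test system:

```python
def economic_dispatch(units, demand):
    """Loss-free ELD for quadratic costs: at the optimum every unit runs
    at the same incremental cost lambda = b_i + 2*c_i*P_i, and the P_i
    sum to the demand. units is a list of (a, b, c) coefficients."""
    lam = (demand + sum(b / (2 * c) for _, b, c in units)) \
          / sum(1 / (2 * c) for _, b, c in units)
    return lam, [(lam - b) / (2 * c) for _, b, c in units]

# Illustrative three-unit system (a, b, c) and a 800 MW demand
units = [(500, 5.3, 0.004), (400, 5.5, 0.006), (200, 5.8, 0.009)]
lam, P = economic_dispatch(units, 800.0)
```

A GA-based solver for the same loss-free case should reproduce this equal-incremental-cost dispatch; its advantage appears once limits, losses, or non-convex cost curves are added.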
Abstract: This paper investigates simple implicit force control
algorithms realizable with industrial robots. Many approaches
already published are difficult to implement in commercial robot
controllers, because they require access to the robot joint torques
or use the complete dynamic model of the manipulator. In previous
work we dealt with explicit force control of a position-controlled
robot. Well-known schemes of implicit force control are stiffness
control, damping control and impedance control. With such
algorithms, the contact force cannot be set directly; instead, it
results from the controller impedance, the environment impedance and
the commanded robot motion/position. The relationships between these
properties are worked out in detail in this paper for the chosen
implicit approaches, which have been adapted to be implementable on
a position-controlled robot. The behaviors of stiffness control and
damping control are verified by practical experiments, for which a
suitable test bed was configured. Using the full mechanical
impedance within the controller structure is not practical when the
robot is in physical contact with the environment; this fact is
verified by simulation.
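The central observation, that under stiffness control the contact force follows from the controller impedance, the environment impedance and the commanded position rather than being set directly, reduces in the static case to two springs in series. A minimal sketch with assumed stiffness values:

```python
def contact_equilibrium(k_c, x_cmd, k_e, x_env):
    """Static equilibrium of stiffness control in contact: the controller
    spring k_c pulls the tool toward the commanded position x_cmd while
    the environment spring k_e pushes back from the surface at x_env.
    Force balance k_c*(x_cmd - x) = k_e*(x - x_env) gives the tool
    position x and the resulting contact force."""
    x = (k_c * x_cmd + k_e * x_env) / (k_c + k_e)
    f = k_e * (x - x_env)        # contact force on the environment
    return x, f

# Assumed values: soft controller (1 kN/m) commanding 10 mm into a stiff
# environment (9 kN/m) whose surface is at x = 0
pos, force = contact_equilibrium(1000.0, 0.01, 9000.0, 0.0)
```

Equivalently, the force equals the series stiffness k_c*k_e/(k_c+k_e) times the commanded penetration, which is why the commanded position, not the force, is the tunable quantity.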
Abstract: The paper deals with the classical fiber bundle model
of equal load sharing, sometimes referred to as the Daniels’ bundle
or the democratic bundle. Daniels formulated a multidimensional
integral and also a recursive formula for evaluation of the
strength cumulative distribution function. This paper describes
three algorithms for evaluation of the recursive formula and also
their implementations with source codes in the Python high-level
programming language. A comparison of the algorithms is provided
with respect to execution time. An analysis of the orders of
magnitude of the addends in the recursion is also provided.
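Since the paper provides Python implementations, a compact Python sketch of one way to evaluate a Daniels-type recursion may be useful for orientation. It exploits the fact that, in this form of the recursion, every recursive call keeps the same total load and only changes the number of surviving fibers, so the values can be filled in bottom-up; the paper's own formulas may be organized differently:

```python
from math import comb

def bundle_cdf(n, total_load, F):
    """P(strength of an n-fiber equal-load-sharing bundle <= total_load),
    evaluated with Daniels' recursion
    G_m(x) = sum_{k=1..m} (-1)^(k+1) C(m,k) F(x)^k G_{m-k}(m*x/(m-k)),
    where F is the single-fiber strength CDF. With L = total_load fixed,
    H[m] = G_m(L/m) depends only on m."""
    H = [1.0]                       # H[0] = 1 by convention
    for m in range(1, n + 1):
        Fm = F(total_load / m)      # CDF at the current per-fiber load
        H.append(sum((-1) ** (k + 1) * comb(m, k) * Fm ** k * H[m - k]
                     for k in range(1, m + 1)))
    return H[n]

def uniform_cdf(x):
    """Fiber strength uniform on [0, 1] (toy distribution)."""
    return min(max(x, 0.0), 1.0)
```

For n = 2 this reproduces the directly computable result P = 2 F(x) F(2x) - F(x)^2 at total load 2x, which is a convenient sanity check for any implementation.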
Abstract: This paper describes an argumentation approach to the
problem of inductive concept formation. It is proposed to use
argumentation, based on defeasible reasoning with justification
degrees, to improve the quality of the classification models
obtained by generalization algorithms. Experimental results on both
clean and noisy data are also presented.
Abstract: In this work, we explore the capability of the mean
shift algorithm as a powerful preprocessing tool for improving the
quality of spatial data, acquired from airborne scanners, from densely
built urban areas. On the one hand, high-resolution image data
corrupted by noise from lossy compression techniques are
appropriately smoothed while the optical edges are preserved; on the
other hand, low-resolution LiDAR data in the form of a normalized
Digital Surface Map (nDSM) are upsampled through the joint mean
shift algorithm. Experiments on both the edge-preserving smoothing
and upsampling capabilities using synthetic RGB-z data show that the
mean shift algorithm is superior to bilateral filtering as well as to
other classical smoothing and upsampling algorithms. Application of
the proposed methodology to the 3D reconstruction of buildings in a
pilot region of Athens, Greece, results in a significant visual
improvement of the 3D building block model.
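The core of the mean shift iteration is tiny: a point is repeatedly moved to the mean of the samples inside its kernel window until it settles on a local mode. A 1D sketch with a flat kernel and invented sample values; the paper's joint filter runs in a combined spatial/range domain, which this toy example does not attempt:

```python
def mean_shift_1d(points, x, bandwidth=1.0, iters=100):
    """Move x to the mean of its neighbours within the bandwidth (flat
    kernel) until it settles on a local mode of the point density."""
    for _ in range(iters):
        inside = [p for p in points if abs(p - x) <= bandwidth]
        x = sum(inside) / len(inside)
    return x

# Two well-separated clusters of values (e.g., heights on either side of
# a roof edge); values are invented
samples = [0.9, 1.0, 1.1, 4.9, 5.0, 5.1]
```

Starting points on either side of the gap converge to their own cluster mode rather than to a blurred average, which is the mechanism behind the edge-preserving behavior.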
Abstract: Within the Hierarchical Temporal Memory (HTM) paradigm,
the effect of overlap between inputs on the activation of columns in
the spatial pooler is studied. Numerical results suggest that
similar inputs are represented by similar sets of columns and
dissimilar inputs by dissimilar sets of columns. It is shown that
the spatial pooler produces these results under certain conditions
on the connectivity and proximal thresholds. Following a discussion
of the initialization of the threshold parameters, corresponding
qualitative arguments about the learning dynamics of the spatial
pooler are presented.
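The overlap-driven activation can be sketched directly: each column counts how many active input bits its connected synapses cover, and a k-winners-take-all inhibition step picks the active columns. The toy connectivity and inputs below are invented for illustration:

```python
def overlaps(columns, active_bits):
    """Overlap of each column = number of active input bits covered by
    its connected synapses (each column is a set of input-bit indices)."""
    return [len(c & active_bits) for c in columns]

def active_columns(columns, active_bits, k=2, threshold=1):
    """k-winners-take-all inhibition: the k columns with the largest
    overlap (of at least `threshold`) become active."""
    ov = overlaps(columns, active_bits)
    order = sorted(range(len(columns)), key=lambda i: ov[i], reverse=True)
    return {i for i in order[:k] if ov[i] >= threshold}

columns = [{0, 1, 2}, {2, 3, 4}, {5, 6, 7}, {0, 6, 7}]
a1 = active_columns(columns, {0, 1, 2, 3})
a2 = active_columns(columns, {0, 1, 2, 4})   # similar input
a3 = active_columns(columns, {5, 6, 7})      # dissimilar input
```

On this toy connectivity, the two similar inputs activate the same set of columns, while the dissimilar input activates a disjoint set, mirroring the paper's numerical observation in miniature.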
Abstract: Evolutionary optimization methods such as genetic
algorithms have been used extensively for the construction site layout
problem. More recently, ant colony optimization algorithms, which
are evolutionary methods based on the foraging behavior of ants,
have been successfully applied to benchmark combinatorial
optimization problems. This paper proposes a formulation of the site
layout problem in terms of a sequencing problem that is suitable for
solution using an ant colony optimization algorithm.
In the construction industry, site layout is a very important
planning problem. The objective of site layout is to position
temporary facilities both geographically and at the correct time,
such that the construction work can be performed satisfactorily with
minimal costs and an improved safety and working environment. The
proposed ant colony optimization model for construction site layout
is illustrated with a simple case study for a highway project.
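A minimal sketch of the sequencing formulation with an ant colony: each ant builds an ordering of the items (e.g., facilities to place), choosing the next item with probability proportional to pheromone raised to alpha times a heuristic desirability raised to beta, and the pheromone along the best-so-far sequence is reinforced. The 4x4 cost matrix and all parameters are invented for illustration:

```python
import random

def aco_sequence(dist, n_ants=20, iters=50, rho=0.1, alpha=1.0, beta=2.0,
                 seed=0):
    """Ant colony optimization for a sequencing problem."""
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]        # pheromone trails

    def cost(seq):
        return sum(dist[seq[i]][seq[i + 1]] for i in range(n - 1))

    best = list(range(n))
    best_cost = cost(best)
    for _ in range(iters):
        for _ in range(n_ants):
            seq = [rng.randrange(n)]
            while len(seq) < n:
                cur = seq[-1]
                rest = [j for j in range(n) if j not in seq]
                w = [tau[cur][j] ** alpha * (1.0 / dist[cur][j]) ** beta
                     for j in rest]
                seq.append(rng.choices(rest, weights=w)[0])
            c = cost(seq)
            if c < best_cost:
                best, best_cost = seq, c
        # evaporate everywhere, then deposit on the best-so-far sequence
        tau = [[(1 - rho) * t for t in row] for row in tau]
        for i in range(n - 1):
            tau[best[i]][best[i + 1]] += 1.0 / best_cost

    return best, best_cost

# Hypothetical pairwise costs between four facility positions
cost_matrix = [[0, 1, 5, 4], [1, 0, 2, 6], [5, 2, 0, 3], [4, 6, 3, 0]]
seq, c = aco_sequence(cost_matrix)
```

The evaporate-then-reinforce cycle is what concentrates the colony on promising orderings over successive iterations.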
Abstract: In this paper we propose a novel methodology for
extracting a road network and its nodes from satellite images of
Algeria.
The technique developed here extends our previous research work. It
is founded on information theory and mathematical morphology, which
are combined to extract the road segments and link them into a road
network with its nodes.
We therefore define objects as sets of pixels and study the shape of
these objects and the relations that exist between them. In this
approach, geometric and radiometric features of roads are integrated
through a cost function and a set of selected points at road
crossings. The method's performance was tested on satellite images
of Algeria.
Abstract: The scheduling and mapping of tasks onto a set of
processors is a critical problem in parallel and distributed
computing systems. This paper deals with the problem of dynamic
scheduling on a special type of multiprocessor architecture known as
the Linear Crossed Cube (LCQ) network, a hybrid network that
combines the features of linear architectures with those of
cube-based architectures. Two standard dynamic scheduling schemes,
namely Minimum Distance Scheduling (MDS) and Two Round Scheduling
(TRS), are implemented on the LCQ network. Parallel tasks are
mapped, and the load imbalance is evaluated on different sets of
processors in the LCQ network. The simulation results are analyzed
thoroughly to obtain the best solution for the given network in
terms of residual load imbalance and execution time. Other
performance metrics, such as speedup and efficiency, are also
evaluated for the given dynamic algorithms.
Abstract: In the past few years, the amount of malicious software has
increased exponentially and, therefore, machine learning algorithms
have become instrumental in separating clean files from malware
through (semi-)automated classification. When working with very large
datasets, the major challenge is to reach both a very high malware
detection rate and a very low false positive rate; another challenge
is to minimize the time the machine learning algorithm needs to do
so. This paper presents a comparative study of different machine
learning techniques such as linear classifiers, ensembles, decision
trees and various hybrids thereof. The training dataset consists of
approximately 2 million clean files and 200,000 infected files, which
is a realistic quantitative mixture. The paper investigates the
above-mentioned methods with respect to both their performance
(detection rate and false positive rate) and their practicability.
Abstract: A grid is an environment with millions of resources that
are dynamic and heterogeneous in nature. A computational grid is one
in which the resources are computing nodes, intended for
applications that involve large computations. A scheduling algorithm
is efficient only if it performs good resource allocation even in
the case of resource failure. Resource allocation is a tedious
issue, since it has to consider several requirements such as system
load, processing cost and time, the user's deadline and resource
failure. This work designs a resource allocation algorithm that is
cost-effective and also targets load balancing, fault tolerance and
user satisfaction by considering the above requirements. The
proposed Budget Constrained Load Balancing Fault Tolerant algorithm
with user satisfaction (BLBFT) reduces the schedule makespan, the
schedule cost and the task failure rate, and improves resource
utilization. The proposed BLBFT algorithm is evaluated using the
GridSim toolkit, and the results are compared with algorithms that
concentrate on each of these factors separately. The comparison
shows that the proposed algorithm outperforms its counterparts.
Abstract: Association rule mining is one of the most important fields of data mining and knowledge discovery. In this paper, we propose an efficient multiple-support frequent pattern growth algorithm, which we call “MSFP-growth”, that enhances the FP-growth algorithm with an infrequent-child-node pruning step under multiple minimum supports using maximum constraints. The algorithm is implemented and compared with other common algorithms: Apriori with multiple minimum supports using maximum constraints, and FP-growth. The experimental results show that the rules mined by the proposed algorithm are interesting and that our algorithm achieves better performance than the other algorithms without sacrificing accuracy.
Abstract: Wavelength Division Multiplexing (WDM) is the dominant transport technology used in numerous high-capacity backbone networks based on optical infrastructures. Given the importance of the costs (CapEx and OpEx) associated with these networks, resource management is becoming increasingly important, especially how the optical circuits, called “lightpaths”, are routed throughout the network. This requires efficient algorithms that provide routing strategies with the lowest cost. We focus on the lightpath routing and wavelength assignment problem, known as the RWA problem, while optimizing wavelength fragmentation over the network. Wavelength fragmentation poses a serious challenge for network operators, since it leads to misuse of the wavelength spectrum and, in turn, to the refusal of new lightpath requests. In this paper, we first establish a new Integer Linear Program (ILP) for the problem based on a node-link formulation. This formulation follows a multilayer approach in which the original network is decomposed into several network layers, each corresponding to a wavelength. Furthermore, we propose an efficient heuristic for the problem based on a greedy algorithm followed by a post-treatment procedure. The obtained results show that the optimal solution is often reached. We also compare our results with those of other RWA heuristic methods.
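The wavelength continuity constraint at the heart of RWA can be illustrated with the classical first-fit assignment rule, a common baseline heuristic rather than the paper's greedy-plus-post-treatment method; the link names below are invented:

```python
def first_fit_assign(requests, n_wavelengths):
    """Greedy first-fit wavelength assignment under the wavelength
    continuity constraint: each lightpath (given as a list of link ids
    along its route) gets the lowest-indexed wavelength that is free on
    every link of the route; a request with no common free wavelength
    is blocked."""
    used = {}                        # (link, wavelength) -> occupied
    assignment = []
    for path in requests:
        for w in range(n_wavelengths):
            if all((link, w) not in used for link in path):
                for link in path:
                    used[(link, w)] = True
                assignment.append(w)
                break
        else:
            assignment.append(None)  # blocked request
    return assignment

# Three lightpath requests over links "a", "b", "c" with 2 wavelengths
paths = [["a", "b"], ["b", "c"], ["a"]]
assignment = first_fit_assign(paths, 2)
```

Because first-fit always packs the lowest-indexed wavelengths, it tends to leave higher wavelengths contiguous and free; fragmentation-aware heuristics like the paper's go further by scoring how each choice scatters the remaining spectrum.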