Abstract: The decisions made by admission control algorithms are
based on the availability of network resources viz. bandwidth, energy,
memory buffers, etc., without degrading the Quality-of-Service (QoS)
requirement of applications that are admitted. In this paper, we
present an energy-aware admission control (EAAC) scheme which
provides admission control for flows in an ad hoc network based
on the knowledge of the present and future residual energy of the
intermediate nodes along the routing path. The aim of EAAC is to
quantify the energy that the new flow will consume so that it can
be decided whether the future residual energy of the nodes along
the routing path can satisfy the energy requirement. In other words,
this energy-aware routing admits a new flow only if no node along
the routing path runs out of energy during the transmission of
packets. The future residual energy of a node is predicted using
the Multi-layer Neural Network (MNN) model. Simulation results
show that the proposed scheme increases the network lifetime. The
performance of the MNN model is also presented.
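The admission rule described above reduces to a per-node energy check: a flow is admitted only if every node's predicted future residual energy covers the energy the flow would consume there. A minimal sketch follows, in which the residual-energy and cost figures are hypothetical and the MNN predictor is represented only by its output:

```python
def admit_flow(predicted_residual, flow_energy_cost):
    """Admit a flow only if no node along the routing path would
    exhaust its predicted residual energy during transmission.

    predicted_residual: future residual energy of each intermediate
    node (e.g. as output by an MNN predictor).
    flow_energy_cost: energy the new flow would consume at each
    corresponding node (hypothetical figures).
    """
    return all(res > cost for res, cost in
               zip(predicted_residual, flow_energy_cost))

# Hypothetical 3-node path: the third node would run out of energy.
print(admit_flow([5.0, 4.2, 1.1], [1.0, 1.0, 1.5]))  # False
print(admit_flow([5.0, 4.2, 3.1], [1.0, 1.0, 1.5]))  # True
```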
Abstract: In many data mining applications, it is a priori known
that the target function should satisfy certain constraints imposed
by, for example, economic theory or a human-decision maker. In this
paper we consider partially monotone prediction problems, where the
target variable depends monotonically on some of the input variables
but not on all. We propose a novel method to construct prediction
models, where monotone dependencies with respect to some of
the input variables are preserved by construction. Our
method belongs to the class of mixture models. The basic idea is to
convolute monotone neural networks with weight (kernel) functions
to make predictions. By using simulation and real case studies,
we demonstrate the application of our method. To obtain a sound
assessment of the performance of our approach, we use standard
neural networks with weight decay and partially monotone linear
models as benchmark methods for comparison. The results show that
our approach outperforms partially monotone linear models in terms
of accuracy. Furthermore, the incorporation of partial monotonicity
constraints not only leads to models that are in accordance with the
decision maker's expertise, but also reduces considerably the model
variance in comparison to standard neural networks with weight
decay.
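The by-construction monotonicity guarantee invoked above can be illustrated with a standard device: constrain the relevant weights to be non-negative (here via squaring) and use an increasing activation, so the network output is non-decreasing in every input. The shape and parameters below are illustrative only, not the authors' architecture:

```python
import math
import random

def monotone_net(x, W1, b1, w2, b2):
    """One-hidden-layer network whose output is non-decreasing in
    every input: all weights are squared (hence non-negative) and the
    sigmoid activation is increasing."""
    hidden = [1.0 / (1.0 + math.exp(-(sum(w * w * xi
                                          for w, xi in zip(row, x)) + b)))
              for row, b in zip(W1, b1)]
    return sum(w * w * h for w, h in zip(w2, hidden)) + b2

# Random illustrative parameters for a 2-input, 3-hidden-unit net.
random.seed(0)
W1 = [[random.gauss(0, 1) for _ in range(2)] for _ in range(3)]
b1 = [random.gauss(0, 1) for _ in range(3)]
w2 = [random.gauss(0, 1) for _ in range(3)]

# Monotonicity check: increasing an input never decreases the output.
lo = monotone_net([0.1, 0.5], W1, b1, w2, 0.0)
hi = monotone_net([0.9, 0.5], W1, b1, w2, 0.0)
print(hi >= lo)  # True
```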
Abstract: Imperfect knowledge cannot always be avoided. Imperfections may take several forms: uncertainty, imprecision and incompleteness. Among the methods for managing imperfect knowledge are fuzzy set-based techniques. The choice of a method to process data is linked to the choice of knowledge representation, which can be numerical, symbolic, logical or semantic, and depends on the nature of the problem to be solved, for example decision support, which is addressed in this study. Fuzzy logic is used for its ability to manage imprecise knowledge, and it can take advantage of the ability of neural networks to learn coefficients or functions. Such an association of methods is typical of so-called soft computing. In this study a new method is used to manage the imprecision of collected knowledge related to the economic analysis of the construction industry in Turkey. Sudden changes in economic factors decrease the competitive strength of construction companies, so a better evaluation of these changes from the perspective of the construction industry will positively influence the decisions of companies dealing with construction.
Abstract: Preferences over certain criteria in decision making
usually focus on positive degrees without considering the negative
degrees. However, in real-life situations, evaluation becomes more
comprehensive if negative degrees are considered concurrently.
Preference is expected to be more effective when both positive and
negative degrees of preference are considered in evaluating the best
selection. Therefore, the aim of this paper is to propose
conflicting bifuzzy preference relations in group decision making by
utilizing a novel score function. The conflicting bifuzzy
preference relation is obtained by introducing some modifications on
intuitionistic fuzzy preference relations. Releasing the intuitionistic
condition by taking into account positive and negative degrees
simultaneously and utilizing the novel score function are the main
modifications to establish the proposed preference model. The
proposed model is tested with a numerical example and proved to be
simple and practical. The four-step decision model shows the
efficiency of obtaining preference in group decision making.
Abstract: This article presents a method, founded on fuzzy logic, for elections among the members of a group. Linguistic variables are the objects of decision on the election cards, and deduction is based on t-norms and s-norms. In this election method the election cards are questionnaires comprising several questions with several choices, where the choices are words from natural language. The presented method is combined with center of gravity (COG) defuzzification and implemented as a MATLAB program. Finally the method is illustrated by solving two examples: choosing a head for the members of a research group and a representative for the students.
Abstract: In DMVC, more than one source is available for the construction of side information. Newer techniques make use of both sources simultaneously by constructing a bitmask that determines the source of every block or pixel of the side information, and considerable computation is needed to determine each bit in the bitmask. In this paper, we try to define areas that can only be well predicted by temporal interpolation and not by multiview interpolation or synthesis. We argue that areas not covered by two cameras cannot be appropriately predicted by multiview synthesis, and if we can identify such areas in the first place, we don't need to run the full set of computations for the pixels that lie in those areas. Moreover, this paper also defines a technique based on KLT to mark the above-mentioned areas before any other processing is done on the side view.
Abstract: Teachers form the backbone of any educational system, hence selecting qualified candidates is crucial. In Malaysia, the decision making in the selection process involves a few stages: initial filtering through academic achievement, an entry examination and an interview session. The last stage is the most challenging since it depends heavily on human judgment. Therefore, this study sought to identify the selection criteria for teacher candidates that form the basis of an efficient multi-criteria teacher-candidate selection model for that last stage. The relevant criteria were determined from the literature and from expert input, that is, from those involved in interviewing teacher candidates at a public university offering the formal training program. Three main competency criteria were identified: content knowledge, communication skills and personality. Each main criterion was further divided into a few subcriteria. The Analytical Hierarchy Process (AHP) technique was employed to allocate weights to the criteria and was later integrated with a Simple Weighted Average (SWA) scoring approach to develop the selection model. Subsequently, a web-based Decision Support System was developed to assist in selecting the qualified teacher candidates. The Teacher-Candidate Selection (TeCaS) system is able to assist the panel of interviewers during the selection process, which involves a large amount of complex qualitative judgment.
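The weighting and scoring steps described above can be sketched with the standard geometric-mean approximation of AHP priorities followed by SWA aggregation. The pairwise judgments and interview ratings below are hypothetical, not data from the TeCaS system:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a pairwise comparison
    matrix using the row geometric-mean method."""
    n = len(pairwise)
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

def swa_score(weights, ratings):
    """Simple Weighted Average score of one candidate's ratings."""
    return sum(w * r for w, r in zip(weights, ratings))

# Hypothetical pairwise judgments over the three main criteria:
# content knowledge vs. communication skills vs. personality.
pairwise = [[1.0, 2.0, 3.0],
            [1 / 2, 1.0, 2.0],
            [1 / 3, 1 / 2, 1.0]]
weights = ahp_weights(pairwise)

# Hypothetical interview ratings (1-5 scale) for one candidate.
score = swa_score(weights, [4, 3, 5])
print([round(w, 3) for w in weights], round(score, 3))
```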
Abstract: The new situation generated by technological advancements and changes in the global economy raises important issues on how communities and organisations need to innovate upon their traditional processes in order to adapt to the challenges of the Knowledge Society. The DialogoS+ European project aims to study the role of and promote social dialogue in the banking sector, strengthen the link between old and new members and make social dialogue at the European level a force for innovation and change, also given the context of the international crisis emerging in 2008–2009. Under the scope of DialogoS+, this paper describes how the community of Europe's banking-sector trade unions attempted to adapt to the challenges of the Knowledge Society by exploiting the benefits of new channels of communication, learning, and knowledge generation and diffusion, focusing on the concept of roadmapping. Important dimensions of social dialogue such as collective bargaining and working conditions are addressed.
Abstract: In this paper, we first consider the quality-of-service
problems that arise in heterogeneous wireless networks when sending
video data, for which the real-time requirement is pronounced. We
then present a method for ensuring end-to-end quality of service at
the application-layer level for the adaptive sending of video data
over heterogeneous wireless networks. To do this, mechanisms in
different layers are used: the stop mechanism, the adaptation
mechanism and graceful degradation at the application layer, a
multi-level congestion feedback mechanism in the network layer and
a connection cut-off decision mechanism in the link layer. Finally,
the presented method is simulated in the NS-2 software and the
achieved improvement is presented.
Abstract: A data warehouse (DW) is a system whose value lies in supporting decision making through querying. Queries to a DW are critical with regard to their complexity and length; they often access millions of tuples and involve joins between relations and aggregations. Materialized views can provide better performance for DW queries. However, these views incur maintenance costs, so materializing all views is not possible. An important challenge of the DW environment is materialized view selection, because we have to realize the trade-off between query performance and view maintenance cost. Therefore, in this paper, we introduce a new approach aimed at solving this challenge based on Two-Phase Optimization (2PO), a combination of Simulated Annealing (SA) and Iterative Improvement (II), with the use of a Multiple View Processing Plan (MVPP). Our experiments show that our method provides a further improvement in terms of query processing cost and view maintenance cost.
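The two phases of 2PO can be sketched on a toy cost model: iterative improvement finds a local optimum by single-view flips, and simulated annealing then explores around it. The per-view saving and maintenance figures below are hypothetical stand-ins for MVPP-derived costs, not the paper's benchmark:

```python
import math
import random

# Toy cost model: each candidate view i offers a query-cost saving
# and incurs a maintenance cost (hypothetical figures).
SAVING = [30, 25, 40, 10, 15]
MAINT = [12, 20, 18, 9, 14]
BASE_QUERY_COST = 200

def total_cost(selected):
    """Query processing cost plus view maintenance cost."""
    query = BASE_QUERY_COST - sum(SAVING[i] for i in selected)
    maint = sum(MAINT[i] for i in selected)
    return query + maint

def iterative_improvement(state):
    """Phase 1 (II): flip single views while any flip lowers cost."""
    improved = True
    while improved:
        improved = False
        for i in range(len(SAVING)):
            neighbor = state ^ {i}
            if total_cost(neighbor) < total_cost(state):
                state, improved = neighbor, True
    return state

def simulated_annealing(state, rng, t=50.0, cooling=0.9, steps=200):
    """Phase 2 (SA): try to escape the local optimum found by II."""
    best = state
    for _ in range(steps):
        neighbor = state ^ {rng.randrange(len(SAVING))}
        delta = total_cost(neighbor) - total_cost(state)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            state = neighbor
        if total_cost(state) < total_cost(best):
            best = state
        t *= cooling
    return best

rng = random.Random(42)
local_opt = iterative_improvement(set())
result = simulated_annealing(local_opt, rng)
print(sorted(result), total_cost(result))
```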
Abstract: Currently, the Malaysian construction industry is
focusing on transforming construction processes from conventional
building methods to the Industrialized Building System (IBS). Still,
research on the decision making of IBS technology adoption with the
influence of contextual factors is scarce. The purpose of this paper is
to explore how contextual factors influence IBS decision making
in building projects, as perceived by those involved in the
construction industry, namely construction stakeholders and IBS
supply-chain members. Theoretical background, theoretical
frameworks and literature that identify possible contextual factors
influencing decision making towards IBS technology adoption are
presented. This paper also discusses the importance of contextual
factors in IBS decision making, highlighting some possible crossover
benefits and making some suggestions as to how these can be
utilized. Conclusions are drawn and recommendations are made with
respect to the perception of socio-economic, IBS policy and IBS
technology associated with building projects.
Abstract: Ontology is widely used as a tool for organizing
information and creating relations between subjects within a
defined knowledge domain. Various fields such as civil engineering,
biology and management have successfully integrated ontology into
decision support systems for managing domain knowledge and
assisting their decision makers. Gross pollutant traps (GPT) are
devices used to trap large items or hazardous particles and prevent
them from polluting and entering our waterways. However, choosing a
suitable GPT is a challenge in Malaysia, as too few GPT data
repositories are captured and shared. Hence an ontology is needed to
capture, organize and represent this knowledge as meaningful
information, which can contribute to the efficiency of GPT selection
in Malaysian urbanization. A GPT ontology framework
is therefore built as the first step to capture GPT knowledge which
will then be integrated into the decision support system. This paper
will provide several examples of the GPT ontology, and explain how
it is constructed by using the Protégé tool.
Abstract: Feature selection is gaining importance owing to its potential to reduce classification cost in terms of time and computational load. One of the methods for finding essential features is via the decision tree, which acts as an intermediate feature-space inducer for choosing essential features. In decision tree-based feature selection, some studies use the decision tree as a feature ranker with a direct threshold measure, while others retain the decision tree but utilize a pruning condition that acts as a threshold mechanism for choosing features. This paper proposes a threshold measure using the Manhattan hierarchical-cluster distance for feature ranking, in order to choose relevant features as part of the feature selection process. The results are promising, and the method can be improved in the future by including test cases with larger numbers of attributes.
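For one-dimensional importance scores, cutting a single-linkage hierarchy built with the Manhattan (L1) distance into two clusters reduces to splitting at the largest gap between sorted scores; the sketch below uses that simplification. The scores are hypothetical, and the paper's exact threshold measure may differ in detail:

```python
def select_features(scores):
    """Split ranked feature scores into 'keep' and 'drop' groups by
    cutting at the largest gap between sorted scores -- for 1-D
    scores this is what single-linkage hierarchical clustering with
    the Manhattan (L1) distance yields when cut into two clusters."""
    order = sorted(range(len(scores)), key=lambda i: scores[i],
                   reverse=True)
    ranked = [scores[i] for i in order]
    # The largest L1 gap between consecutive ranked scores marks the
    # threshold between relevant and irrelevant features.
    cut = max(range(1, len(ranked)),
              key=lambda k: ranked[k - 1] - ranked[k])
    return [order[i] for i in range(cut)]

# Hypothetical importance scores for six features.
scores = [0.82, 0.10, 0.75, 0.08, 0.12, 0.79]
print(select_features(scores))  # [0, 5, 2]
```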
Abstract: We present a simplified equalization technique for a
π/4 differential quadrature phase shift keying ( π/4 -DQPSK) modulated
signal in a multipath fading environment. The proposed equalizer is
realized as a fractionally spaced adaptive decision feedback equalizer
(FS-ADFE), employing exponential step-size least mean square
(LMS) algorithm as the adaptation technique. The main advantage of
the scheme stems from the use of the exponential step-size LMS
algorithm in the equalizer, which achieves similar convergence behavior
as that of a recursive least squares (RLS) algorithm with significantly
reduced computational complexity. To investigate the finite-precision
performance of the proposed equalizer along with the π/4 -DQPSK
modem, the entire system is evaluated on a 16-bit fixed point digital
signal processor (DSP) environment. The proposed scheme is found
to be attractive even for those cases where equalization is to be
performed within a restricted number of training samples.
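The exponential step-size LMS update can be sketched as follows. The decay schedule mu_k = mu0 * exp(-k / tau) and the toy channel-identification setting are assumptions for illustration, not the authors' equalizer:

```python
import math
import random

def lms_exp_step(x, d, taps, mu0=0.2, tau=200.0):
    """LMS adaptation with an exponentially decaying step size
    mu_k = mu0 * exp(-k / tau): large early steps give fast initial
    convergence, smaller later steps give low misadjustment."""
    w = [0.0] * taps
    buf = [0.0] * taps
    errors = []
    for k, (xk, dk) in enumerate(zip(x, d)):
        buf = [xk] + buf[:-1]                        # tapped delay line
        y = sum(wi * bi for wi, bi in zip(w, buf))   # filter output
        e = dk - y                                   # error signal
        mu = mu0 * math.exp(-k / tau)                # decaying step
        w = [wi + mu * e * bi for wi, bi in zip(w, buf)]
        errors.append(e * e)
    return w, errors

# Toy system identification standing in for the equalizer: recover a
# hypothetical 3-tap channel from its noiseless output.
rng = random.Random(1)
h = [0.8, -0.4, 0.2]
x = [rng.gauss(0, 1) for _ in range(1000)]
d = [sum(h[j] * (x[k - j] if k >= j else 0.0) for j in range(3))
     for k in range(1000)]
w, errors = lms_exp_step(x, d, taps=3)
print(sum(errors[:100]) > sum(errors[-100:]))  # error energy shrinks
```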
Abstract: Knowing consumers' preferences and perceptions in
the sensory evaluation of drink products is very significant to
manufacturers and retailers alike. Without appropriate sensory
analysis, there is a high risk of market disappointment. This paper
aims to rank the selected coffee products and also to determine the
best quality attribute through sensory evaluation using a fuzzy
decision making model. Three products of coffee drinks were used
for sensory evaluation. Data were collected from thirty judges at a
hypermarket in Kuala Terengganu, Malaysia. The judges were asked
to specify their sensory evaluation in linguistic terms of the quality
attributes of colour, smell, taste and mouth feel for each product and
also the weight of each quality attribute. Five fuzzy linguistic
terms representing the quality attributes were introduced prior to
the analysis. The
judgment membership function and the weights were compared to
rank the products and also to determine the best quality attribute. The
Indoc product was ranked first and 'taste' was judged the
best quality attribute. These results indicate the importance of
sensory evaluation in identifying consumers' preferences and also
the competency of the fuzzy approach in decision making.
Abstract: Nowadays, the plant location selection has a critical
impact on the performance of numerous companies. In this paper, a
methodology is presented to solve this problem. The three decision
making methods, namely Delphi, AHP and improved VIKOR, are
hybridized in order to make the best use of information available
based on the decision makers or experts. In this respect, the aim of
using Delphi is to select the most influential criteria by a few decision
makers. The AHP is utilized to give weights of the selected criteria.
Finally, the improved VIKOR method is applied to rank alternatives.
At the end of the paper, an application example demonstrates the
applicability of the proposed methodology.
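The final ranking step can be illustrated with the classic VIKOR procedure; the paper uses an improved variant whose details are not given here, and the sites, criteria values and weights below are hypothetical:

```python
def vikor(matrix, weights, v=0.5):
    """Classic VIKOR ranking for benefit criteria: lower Q is better.
    Returns alternative indices ordered best-first."""
    m, n = len(matrix), len(weights)
    f_best = [max(row[j] for row in matrix) for j in range(n)]
    f_worst = [min(row[j] for row in matrix) for j in range(n)]
    S, R = [], []
    for row in matrix:
        # Weighted normalized distance from the ideal on each criterion.
        terms = [weights[j] * (f_best[j] - row[j]) /
                 ((f_best[j] - f_worst[j]) or 1.0) for j in range(n)]
        S.append(sum(terms))   # group utility
        R.append(max(terms))   # individual regret
    s_b, s_w, r_b, r_w = min(S), max(S), min(R), max(R)
    Q = [v * (S[i] - s_b) / ((s_w - s_b) or 1.0) +
         (1 - v) * (R[i] - r_b) / ((r_w - r_b) or 1.0)
         for i in range(m)]
    return sorted(range(m), key=lambda i: Q[i])

# Hypothetical plant-location example: 3 sites, 3 benefit criteria.
sites = [[7, 8, 6], [9, 6, 7], [6, 9, 8]]
weights = [0.5, 0.3, 0.2]  # e.g. obtained from AHP
print(vikor(sites, weights))  # [1, 0, 2]: site 1 ranks best
```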
Abstract: The main goal of preventive healthcare is to decrease
the likelihood and severity of potentially life-threatening
illnesses through protection and early detection. The
establishment and staffing costs, along with the total travel
and waiting time that clients spend, are considered as the objective
functions of the proposed nonlinear integer programming model. In
this paper, we have proposed a bi-objective mathematical model for
designing a network of preventive healthcare facilities so as to
minimize the aforementioned objectives simultaneously. Moreover, each
facility acts as an M/M/1 queuing system. The number of facilities to be
established, the location of each facility, and the level of technology
for each facility to be chosen are provided as the main determinants
of a healthcare facility network. Finally, to demonstrate performance
of the proposed model, four multi-objective decision making
techniques are presented to solve the model.
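Since each facility is modeled as an M/M/1 queue, the waiting-time component of the objective follows from the standard closed-form M/M/1 results. A minimal sketch with hypothetical arrival and service rates:

```python
def mm1_metrics(lam, mu):
    """Steady-state M/M/1 metrics for arrival rate lam and service
    rate mu (requires lam < mu for stability)."""
    assert lam < mu, "queue is unstable"
    rho = lam / mu                  # server utilization
    L = rho / (1 - rho)             # expected number in system
    Wq = lam / (mu * (mu - lam))    # expected waiting time in queue
    W = 1.0 / (mu - lam)            # expected total time in system
    return rho, L, Wq, W

# Hypothetical facility: 4 clients/hour arrive, 5 served/hour.
rho, L, Wq, W = mm1_metrics(4.0, 5.0)
print(rho, L, Wq, W)
```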
Abstract: The recognition of handwritten numerals is an
important area of research owing to its applications in post
offices, banks and other organizations. This paper presents
automatic recognition of handwritten Kannada numerals based on
structural features. Five types of features, namely profile-based
10-segment string, water reservoir, vertical and horizontal strokes,
end points and average boundary length from the minimal bounding
box, are used in the recognition of numerals. The effect of each feature and their
combination in the numeral classification is analyzed using nearest
neighbor classifiers. It is common to combine multiple categories of
features into a single feature vector for the classification. Instead,
separate classifiers can be used to classify based on each visual
feature individually and the final classification can be obtained based
on the combination of separate base classification results. One
popular approach is to combine the classifier results into a feature
vector and leave the decision to a next-level classifier. This method
is extended to extract richer information, a possibility distribution,
from the base classifiers in order to resolve conflicts among the
classification results. Here, we use the fuzzy k Nearest Neighbor (fuzzy
k-NN) as the base classifier for individual feature sets, the results of
which together form the feature vector for the final k Nearest
Neighbor (k-NN) classifier. Testing is done, using different features,
individually and in combination, on a database containing 1600
samples of different numerals and the results are compared with the
results of different existing methods.
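The fuzzy k-NN base-classifier stage above can be sketched with the standard inverse-distance membership weighting (after Keller et al.); the 2-D toy data below stands in for the actual structural feature vectors:

```python
def fuzzy_knn(train, labels, query, k=3, m=2.0):
    """Fuzzy k-NN: returns class membership degrees for the query,
    weighting each of the k nearest neighbors by inverse distance
    raised to 2 / (m - 1)."""
    dists = sorted((sum((a - b) ** 2
                        for a, b in zip(x, query)) ** 0.5, y)
                   for x, y in zip(train, labels))[:k]
    classes = sorted(set(labels))
    num = {c: 0.0 for c in classes}
    den = 0.0
    for d, y in dists:
        w = 1.0 / max(d, 1e-9) ** (2.0 / (m - 1.0))
        num[y] += w
        den += w
    return {c: num[c] / den for c in classes}

# Toy 2-D example with two classes.
train = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
labels = [0, 0, 0, 1, 1, 1]
print(fuzzy_knn(train, labels, (0.5, 0.5)))
```

The membership dictionary, rather than a crisp label, is what would feed the final k-NN stage.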
Abstract: In this paper, a method for decision making in a fuzzy environment is presented. A new integrated subjective and objective approach is introduced to assign attribute weights in fuzzy multiple attribute decision making (FMADM) problems, and the alternatives are finally ranked by the proposed method.
Abstract: In this paper, a robust-statistics-based filter to remove salt-and-pepper noise in digital images is presented. The algorithm first detects the corrupted pixels, since impulse noise affects only certain pixels in the image while the remaining pixels are uncorrupted. The corrupted pixels are then replaced by an estimated value using the proposed robust-statistics-based filter. The proposed method performs well in removing low- to medium-density impulse noise, with detail preservation up to a noise density of 70%, compared to the standard median filter, weighted median filter, recursive weighted median filter, progressive switching median filter, signal-dependent rank-ordered mean filter, adaptive median filter and a recently proposed decision-based algorithm. The visual and quantitative results show that the proposed algorithm excels in restoring the original image, with superior preservation of edges and better suppression of impulse noise.
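The decision-based principle, detecting extreme-valued pixels and replacing only those, can be sketched as below. This is a generic decision-based median replacement over uncorrupted neighbors, not the paper's robust-statistics estimator:

```python
def remove_salt_pepper(img):
    """Decision-based impulse-noise removal sketch: only pixels at
    the extreme values 0 or 255 are treated as corrupted and replaced
    by the median of the uncorrupted pixels in their 3x3
    neighbourhood; all other pixels are left untouched, which gives
    the detail preservation property."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(h):
        for j in range(w):
            if img[i][j] not in (0, 255):
                continue  # uncorrupted pixel: keep as-is
            good = [img[a][b]
                    for a in range(max(0, i - 1), min(h, i + 2))
                    for b in range(max(0, j - 1), min(w, j + 2))
                    if img[a][b] not in (0, 255)]
            if good:
                good.sort()
                out[i][j] = good[len(good) // 2]
    return out

# Tiny grayscale patch with three impulse-corrupted pixels.
noisy = [[100, 255, 102],
         [0, 104, 106],
         [108, 110, 255]]
print(remove_salt_pepper(noisy))
```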