Abstract: Data mining is the process of discovering interesting patterns in large amounts of data. One of the most important supporting processes for faster data access is clustering. Clustering identifies similarities between data objects according to the characteristics present in the data and groups related objects into clusters. A cluster ensemble combines the outcomes of several runs of different clustering algorithms into a single consensus partition of the original dataset, consolidating a collection of individual clustering results. The performance of a clustering ensemble is mainly affected by two principal factors: diversity and quality. This paper presents an overview of different cluster ensemble algorithms, together with the methods they use to improve diversity and quality, provides a comparative analysis of the different cluster ensembles, and summarizes various cluster ensemble methods. This analysis should be useful to clustering practitioners and should help in selecting the most appropriate method for the problem at hand.
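As background to the consensus step that cluster ensembles perform, the sketch below is a minimal illustration (not any specific algorithm from the survey): it builds a co-association matrix from several base clusterings and extracts a consensus partition by linking strongly co-associated points.

```python
import numpy as np

def co_association(labelings):
    """Co-association matrix: entry (i, j) is the fraction of base
    clusterings that place points i and j in the same cluster."""
    labelings = np.asarray(labelings)          # shape (n_runs, n_points)
    n_runs, n = labelings.shape
    M = np.zeros((n, n))
    for labels in labelings:
        M += (labels[:, None] == labels[None, :])
    return M / n_runs

def consensus(labelings, threshold=0.5):
    """Consensus partition: link points whose co-association exceeds the
    threshold, then take connected components as the final clusters."""
    M = co_association(labelings)
    n = M.shape[0]
    labels = -np.ones(n, dtype=int)
    cluster = 0
    for i in range(n):
        if labels[i] == -1:
            stack = [i]
            labels[i] = cluster
            while stack:                       # flood-fill one component
                j = stack.pop()
                for k in np.where(M[j] > threshold)[0]:
                    if labels[k] == -1:
                        labels[k] = cluster
                        stack.append(k)
            cluster += 1
    return labels

# Three base clusterings of six points; the runs disagree on point 2.
runs = [[0, 0, 0, 1, 1, 1],
        [0, 0, 1, 1, 1, 1],
        [0, 0, 0, 1, 1, 1]]
print(consensus(runs))  # -> [0 0 0 1 1 1]
```

Point 2 is grouped with points 0 and 1 because two of the three base runs place it there, which is how the ensemble averages out the disagreement among individual clusterings.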
Abstract: The paper presents a method in which expert
knowledge is applied to a fuzzy inference model. Even a less
experienced person, e.g. an urban planner or official, can benefit
from the use of such a system. The analysis result is obtained in a
very short time, so a large number of proposed locations can be
verified quickly. The proposed method is intended for assessing
locations of car parks in a city. The paper shows selected examples
of locations of P&R (park-and-ride) facilities in cities planning to
introduce P&R. Analyses of existing facilities are also presented and
confronted with the opinions of the system users, with particular
emphasis on unpopular locations. The results of the analyses are
compared with an expert analysis of P&R facility locations that was
outsourced by the city, and with opinions about existing facilities
that users expressed on social networking sites. The obtained results
are consistent with actual users' feedback. The proposed method
performs well while not requiring the involvement of a large team of
experts or large financial outlays for complicated research. The
method also provides an opportunity to propose alternative locations
for P&R facilities. Although the results of the method are
approximate, they are no worse than the results of analyses by
employed experts. The advantage of this method is its ease of use,
which simplifies professional expert analysis. The ability to analyze
a large number of alternative locations gives a broader view of the
problem. It is valuable that the arduous analysis by a team of people
can be replaced by the model's calculations. According to the
authors, the proposed method is also suitable for implementation on
a GIS platform.
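The fuzzy inference step can be illustrated with a minimal sketch. The inputs, membership functions, rules and output weights below are invented for demonstration; they are not the paper's actual model.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def pr_suitability(walk_min, traffic_load):
    """Toy fuzzy rating of a P&R site from two hypothetical inputs:
    walking time to transit (min) and road congestion (0-100)."""
    near = tri(walk_min, -1, 0, 10)        # short walk to transit
    far = tri(walk_min, 5, 15, 100)
    low = tri(traffic_load, -1, 0, 60)     # light congestion
    high = tri(traffic_load, 40, 100, 101)
    # Rules: near AND low -> good site (0.9); far OR high -> poor (0.2)
    good = min(near, low)
    poor = max(far, high)
    # Weighted-average defuzzification of the rule outputs
    total = good + poor
    return (0.9 * good + 0.2 * poor) / total if total else 0.5

print(round(pr_suitability(3, 20), 2))   # close to transit, light traffic
print(round(pr_suitability(30, 80), 2))  # remote site, heavy traffic
```

A real model of this kind would encode many more expert-derived inputs and rules, but the pattern, fuzzify inputs, fire rules, defuzzify to a single suitability score, is the same.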
Abstract: Throughout history, people have made estimates and
inferences about the future by using their past experiences.
Developing information technologies and improvements in database
management systems make it possible to extract useful information
from the knowledge at hand for strategic decisions. Accordingly,
different methods have been developed; data mining by association
rule learning is one such method. The Apriori algorithm, one of the
well-known association rule learning algorithms, is not
commonly used in spatio-temporal data sets. However, it is possible
to embed time and space features into the data sets and make the
Apriori algorithm a suitable technique for mining spatio-temporal
association rules. Lake Van, the largest lake in Turkey, is a
closed basin. As a result, the volume of the lake increases or
decreases with changes in the amount of water it holds. In this study,
evaporation, humidity, lake altitude, amount of rainfall and
temperature parameters recorded in Lake Van region throughout the
years are used by the Apriori algorithm, and a spatio-temporal data
mining application is developed to identify overflows and newly
formed soil regions (underflows) occurring in the coastal parts of
Lake Van. Identifying the possible causes of overflows and
underflows can help alert experts to take precautions and make the
necessary investments.
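The core of Apriori is the frequent-itemset search: keep only itemsets whose support meets a threshold, then join survivors into larger candidates. The sketch below is a minimal pure-Python version on made-up, discretised weather/lake records (the attribute names are illustrative, not the study's actual data).

```python
def apriori(transactions, min_support):
    """Return frequent itemsets (frozensets) mapped to their support."""
    n = len(transactions)
    # Level 1: frequent single items.
    level = {frozenset([i]) for t in transactions for i in t}
    level = {s for s in level
             if sum(s <= t for t in transactions) / n >= min_support}
    frequent, k = {}, 1
    while level:
        for s in level:
            frequent[s] = sum(s <= t for t in transactions) / n
        # Join step: combine frequent k-itemsets into (k+1)-candidates,
        # then prune candidates below the support threshold.
        candidates = {a | b for a in level for b in level
                      if len(a | b) == k + 1}
        level = {c for c in candidates
                 if sum(c <= t for t in transactions) / n >= min_support}
        k += 1
    return frequent

# Toy spatio-temporal records: each transaction bundles discretised
# weather attributes with a lake-level event (names are invented).
data = [frozenset(t) for t in [
    {"rain:high", "temp:low", "level:overflow"},
    {"rain:high", "temp:low", "level:overflow"},
    {"rain:low", "temp:high", "level:underflow"},
    {"rain:high", "temp:low", "level:overflow"},
]]
freq = apriori(data, min_support=0.5)
print(freq[frozenset({"rain:high", "level:overflow"})])  # -> 0.75
```

Association rules such as "rain:high => level:overflow" are then read off from the frequent itemsets by comparing the supports of an itemset and its antecedent.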
Abstract: The objective of this paper is to analyze the role played by the institute of the public hearings in the Brazilian Supreme Court. The public hearings have been regulated since 1999 by Brazilian Laws nº 9.868 and nº 9.882 and by the Internal Rules of the Brazilian Supreme Court. According to this legislation, public hearings are supposed to be convened when a matter of fact must be clarified, which can be done through hearing the testimony of persons with expertise and authority in the theme related to the case. This work investigates the role played by the public hearings and by the experts in the Brazilian Supreme Court. The hypotheses of this research are: (I) the public hearings in the Brazilian Supreme Court are used to uphold a rhetoric of democratic legitimacy for the Court's decisions; (II) the legislative intent has been distorted. To test these hypotheses, the adopted methodology involves an empirical study of Brazilian jurisprudence. In conclusion, the public hearings convened by the Brazilian Supreme Court do not correspond, in practice, to the role assigned to them by Congress, since they do not properly serve epistemic interests. The public hearings not only fail to democratically legitimate the decisions, but also fail to properly clarify technical issues.
Abstract: Malaysia’s green building development is gaining
momentum and green buildings have become a key focus area,
especially within the commercial sector with the encouragement of
government legislation and policy. Given the emerging awareness
among market players of the benefits associated with the ownership
of green buildings in Malaysia, valuers need to incorporate
sustainability considerations into their assessments of property
market value to ensure that green buildings continue to grow in the
market. This paper analyses valuers' current perceptions of valuation
practices with regard to green issues in Malaysia. The study was
based on a survey of registered real estate valuers, and of experts
whose work relates to valuation, in the Klang Valley area, who rated
their perceptions of the valuation of green buildings. The findings
present evidence that even though Malaysian valuers have limited
knowledge of green buildings, they recognise the importance of
incorporating green features in the valuation process. In practice, the
incorporation of green features in valuations was hindered by
inadequate transaction data in the market. Furthermore, valuers
experienced difficulty in identifying the various input parameters of
a green building and in adjusting them to correctly reflect the benefit
of sustainability features in the valuation process. This paper focuses
on the present challenges confronting Malaysian valuers with regard
to incorporating green features in their valuations.
Abstract: This paper presents an approach to classifying
unstructured format descriptions for the identification of file formats.
The main contribution of this work is the employment of data mining
techniques to support file format selection using only an unstructured
text description that comprises the most important format features for
a particular organisation. Subsequently, the file format identification
method employs a file format classifier and associated configurations
to support digital preservation experts with an estimate of the
required file format. Our goal is to make use of a format specification
knowledge base aggregated from different Web sources in order to
select a file format for a particular institution. Using the naive Bayes
method, the decision support system recommends a file format to the
expert for their institution. The proposed methods facilitate the
selection of a file format and improve the quality of the digital
preservation process.
presented approach is meant to facilitate decision making for the
preservation of digital content in libraries and archives using domain
expert knowledge and specifications of file formats. To facilitate
decision-making, the aggregated information about the file formats is
presented as a file format vocabulary that comprises most common
terms that are characteristic for all researched formats. The goal is to
suggest a particular file format based on this vocabulary for analysis
by an expert. The sample file format calculation and the calculation
results including probabilities are presented in the evaluation section.
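The classification step can be sketched with a minimal multinomial naive Bayes classifier over format-description terms. The training snippets and format names below are hypothetical placeholders, not the paper's knowledge base.

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """docs: list of (label, token list). Returns the model pieces for a
    multinomial naive Bayes classifier with Laplace smoothing."""
    class_docs = defaultdict(int)       # documents per class
    class_words = defaultdict(Counter)  # word counts per class
    vocab = set()
    for label, tokens in docs:
        class_docs[label] += 1
        class_words[label].update(tokens)
        vocab.update(tokens)
    return class_docs, class_words, vocab, len(docs)

def classify(model, tokens):
    """Pick the class maximizing log prior + sum of log likelihoods."""
    class_docs, class_words, vocab, n = model
    best, best_score = None, float("-inf")
    for label in class_docs:
        score = math.log(class_docs[label] / n)        # log prior
        total = sum(class_words[label].values())
        for w in tokens:                               # smoothed likelihoods
            score += math.log((class_words[label][w] + 1) /
                              (total + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

# Hypothetical description snippets for two file formats.
docs = [("pdf",  "page layout print fixed".split()),
        ("pdf",  "print document page".split()),
        ("tiff", "raster image scan lossless".split()),
        ("tiff", "image scan archival".split())]
model = train_nb(docs)
print(classify(model, "archival image scan".split()))  # -> tiff
```

In the paper's setting the tokens would come from the aggregated format vocabulary, and the per-class probabilities (rather than only the argmax) would be shown to the preservation expert.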
Abstract: Experts, enterprises and operators expect that
bandwidth demand will increase to rates of 100 to 1,000 Mbps within
several years. Therefore the most important question is which
technology will satisfy future consumer broadband demand.
Currently the consensus is that fiber technology has the best
technical characteristics to achieve such high bandwidth rates.
However, fiber technology is so far very cost-intensive and
resource-consuming. To avoid these investments, operators are
concentrating on upgrading the existing copper and hybrid fiber-coax
infrastructures. This work presents a comparison of copper and fiber
technologies, including an overview of the current German
broadband market. Both technologies are reviewed in terms of
demand, willingness to pay and economic efficiency in connection
with their technical characteristics.
Abstract: Total Quality Management (TQM) is a managerial
approach that improves the competitiveness of an industry;
Information Technology (IT) has been introduced alongside TQM for
handling technical issues, supported by quality experts, to fulfil
customers' requirements. The present paper aims to utilise the AHP
(Analytic Hierarchy Process) methodology to prioritise and rank the
hierarchy levels of TQM enablers and IT resources together for their
successful implementation in the Information and Communication
Technology (ICT) industry. A total of 17 factors, nine TQM enablers
and eight IT resources, were identified, partitioned into three
categories, and prioritised by the AHP approach. The findings
indicate that the 17 sub-criteria can be grouped into three main
categories, namely organizing, tools and techniques, and culture and
people. Further, out of the 17 sub-criteria, three (top management
commitment and support, total employee involvement, and
continuous improvement) received the highest priority, whereas
three others (structural equation modelling, culture change, and
customer satisfaction) received the lowest priority. The result
suggests a hierarchy model for the ICT industry to prioritise the
enablers and resources and to improve TQM and IT performance.
This paper has managerial implications, suggesting that managers in
the ICT industry implement TQM and IT together in their
organizations to obtain maximum benefits and utilize available
resources effectively. At the end, conclusions, limitations and the
future scope of the study are presented.
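In AHP, priorities are derived from a pairwise comparison matrix, commonly as its principal eigenvector. The sketch below shows that step with power iteration; the 3x3 judgements over the three criteria categories are made up for demonstration and are not the paper's data.

```python
import numpy as np

def ahp_priorities(A, iters=100):
    """Priority vector of a pairwise comparison matrix A, taken as the
    principal eigenvector (power iteration), normalised to sum to 1."""
    w = np.ones(A.shape[0])
    for _ in range(iters):
        w = A @ w        # one power-iteration step
        w /= w.sum()     # keep the vector normalised
    return w

# Illustrative pairwise judgements over the three criteria categories
# (organizing, tools and techniques, culture and people); the values
# are invented: A[i, j] states how much criterion i outweighs j.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])
w = ahp_priorities(A)
print(w.round(3))
```

The same calculation is applied at each level of the hierarchy, and sub-criteria weights are multiplied by their category weight to produce the global ranking.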
Abstract: The paper presents a new method for efficient
innovation process management. Even though innovation
management methods, tools and knowledge are well established and
documented in the literature, most companies still do not manage
innovation efficiently. Especially in SMEs, the front end of
innovation (problem identification, idea creation and selection) is
often not optimally performed. Our eMIPS methodology represents a
sort of "umbrella
methodology" - a well-defined set of procedures, which can be
dynamically adapted to the concrete case in a company. In daily
practice, various methods (e.g. for problem identification and idea
creation) can be applied, depending on the company's needs. It is
based on the proactive involvement of the company's employees
supported by the appropriate methodology and external experts. The
presented phases are performed via a mixture of face-to-face
activities (workshops) and online (eLearning) activities taking place
in eLearning Moodle environment and using other e-communication
channels. One part of the outcomes is an identified set of
opportunities and concrete solutions ready for implementation. The
other, equally important, result is the innovation competences that
the participating employees gain with concrete tools and methods for
idea management. In addition, the employees gain strong experience
in dynamic, efficient and solution-oriented management of the
invention process. The eMIPS also represents a way of establishing
or improving the innovation culture in an organization. A pilot
application in a company showed excellent outcomes regarding both
the motivation of the participants and the results achieved.
Abstract: In medical investigations, uncertainty is a major
challenge for doctors and experts making decisions to identify
diseases that share a common set of symptoms, and it has been
growing extensively in medical diagnosis problems. The theory of
cross entropy for intuitionistic fuzzy sets (IFS) is an effective
approach to coping with uncertainty in decision making for medical
diagnosis. The main focus of this paper is to propose a new
intuitionistic fuzzy cross entropy measure (IFCEM), which aids in
reducing uncertainty so that doctors and experts can more easily
reach a decision about a patient's disease. It is shown that the
proposed measure has some elegant properties, which demonstrate
its potency. Further, the efficiency and utility of the proposed
measure are exemplified in detail using a real-life case study of
disease diagnosis in medical science.
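The abstract does not reproduce the proposed IFCEM itself. As background only, the sketch below computes a widely used earlier IFS cross entropy in the style of Vlachos and Sergiadis, applied to hypothetical (membership, non-membership) symptom profiles; the patient and disease values are invented for illustration.

```python
import math

def ifs_cross_entropy(A, B, eps=1e-12):
    """Cross entropy between two intuitionistic fuzzy sets, following a
    Vlachos-Sergiadis-style formulation (background, not the paper's
    IFCEM). A, B: lists of (membership, non-membership) pairs; eps
    guards the logarithms at zero."""
    total = 0.0
    for (ma, na), (mb, nb) in zip(A, B):
        # Divergence of A's membership degree from the pairwise mean.
        total += ma * math.log((ma + eps) / (0.5 * (ma + mb) + eps))
        # Same for the non-membership degree.
        total += na * math.log((na + eps) / (0.5 * (na + nb) + eps))
    return total

# Hypothetical symptom profiles: a patient vs. a disease prototype.
patient = [(0.8, 0.1), (0.6, 0.3), (0.2, 0.7)]
disease = [(0.7, 0.2), (0.7, 0.2), (0.3, 0.6)]
print(round(ifs_cross_entropy(patient, disease), 4))
```

A diagnosis procedure of this kind computes the measure against each candidate disease prototype and selects the one with the smallest cross entropy, i.e. the closest profile.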
Abstract: This paper reports the worldwide status of building
information modeling (BIM) adoption from the perspectives of the
engagement level, the Hype Cycle model, the technology diffusion
model, and BIM services. An online survey was distributed, and 156
experts from six continents responded. Overall, North America was
the most advanced continent, followed by Oceania and Europe.
Countries in Asia perceived their phase mainly as the slope of
enlightenment (mature) in the Hype Cycle model. In the technology
diffusion model, the main BIM users worldwide were “early majority”
(third phase), but those in the Middle East/Africa and South America
were “early adopters” (second phase). In addition, the more advanced
the country, the more BIM services were employed in general. In
summary, North America, Europe, Oceania, and Asia were advancing
rapidly toward the mature stage of BIM, whereas the Middle
East/Africa and South America were still in the early phase. The
simple indexes used in this study may be used to track the worldwide
status of BIM adoption in long-term surveys.
Abstract: This paper argues that nation-building theories that
prioritize democratic governance best explain the successful
post-independence development of Botswana. Three main competing
schools of thought exist regarding the sequencing of policies that
should occur to rebuild weakened or failed states. The first posits
that economic development should receive foremost attention, while
democratization and a binding sense of nationalism can wait. A
second group of experts holds that constructing a sense of
nationalism among a populace must come first, so that the state
receives the popular legitimacy and obedience that are prerequisites
for development.
Botswana, though, transitioned into a multi-party democracy and
prosperous open economy due to the utilization of traditional
democratic structures, enlightened and accountable leadership, and an
educated technocratic civil service. With these political foundations
already in place when the discovery of diamonds occurred, the
resulting revenues were spent wisely on projects that grew the
economy, improved basic living standards, and attracted foreign
investment. Thus democratization preceded, and therefore provided
an accountable basis for, economic development that might otherwise
have been squandered by greedy and isolated elites to the detriment
of the greater population. Botswana was one of the poorest nations in
the world at the time of its independence in 1966, with little
infrastructure, a dependence on apartheid South Africa for trade, and
a largely subsistence economy. Over the next thirty years, though, its
economy grew the fastest of any nation in the world. The transparent
and judicious use of diamond returns is only a partial explanation, as
the government also pursued economic diversification, mass
education, and rural development in response to public needs.
As nation-building has become a project undertaken by nations
and multilateral agencies such as the United Nations and the North
Atlantic Treaty Organization, Botswana may provide best practices
that others should follow in attempting to reconstruct economically
and politically unstable states.
Abstract: The objectives of the study were to determine the
marketing mix factors influencing tourists' destination decision
making for cultural tourism in Chonburi province. Both quantitative
and qualitative data were used in this study. The sample for the
quantitative analysis comprised 400 tourists (both Thai and foreign)
who were interested in cultural tourism in Chonburi province and
traveled to cultural sites in Chonburi, together with 14
representatives from the provincial tourism committee of Chonburi
and local tourism experts. Statistics utilized in this research included
frequency, percentage, mean, standard deviation, and multiple
regression analysis. The study found that Thai and foreign tourists
are influenced by different marketing mix factors. The
important factors for Thai respondents were physical evidence, price,
people, and place, at a high importance level. For foreign
respondents, physical evidence, price, people, and process were at a
high importance level, whereas product, place and promotion were
at a moderate importance level.
Abstract: Adolescents with Autism Spectrum Disorders (ASD)
often experience social-communication difficulties that negatively
impact their social interactions with typical peers. However, unlike
other age and disability groups, there is little intervention research to
inform best practice for these students. One evidence-based strategy
for younger students with ASD is peer-mediated intervention (PMI).
PMI may be particularly promising for use with adolescents, as peers
are readily available and are natural experts for encouraging authentic
high school conversations. This paper provides a review of previous
research that evaluated the use of PMI to improve the
social-communication skills of students with ASD. Specific
intervention features associated with positive student outcomes are
identified, and recommendations for future research are provided.
Adolescents with ASD are targeted due to the critical importance of
social conversation at the high school level.
Abstract: Despite the highly touted benefits, emerging
technologies have unleashed pervasive concerns regarding unintended
and unforeseen social impacts. Thus, those wishing to create safe and
socially acceptable products need to identify such side effects and
mitigate them prior to market proliferation. Various methodologies
in the field of technology assessment (TA), namely Delphi, impact
assessment, and scenario planning, have been widely applied in such
circumstances. However, the literature faces a major limitation: its
sole reliance on participatory workshop activities, which overlooks a
massive untapped source of futuristic information flooding through
the Internet. This research thus seeks to gain insights into the
utilization of futuristic data, i.e. future-oriented documents from the
Internet, as a supplementary method for generating social impact
scenarios while capturing the perspectives of experts from a wide
variety of disciplines. To this end,
network analysis is conducted based on the social keywords extracted
from the futuristic documents by text mining, which is then used as a
guide to produce a comprehensive set of detailed scenarios. Our
proposed approach facilitates harmonized depictions of possible
hazardous consequences of emerging technologies and thereby makes
decision makers more aware of, and responsive to, broad qualitative
uncertainties.
Abstract: Quantification of cardiac function is performed by
calculating blood volume and ejection fraction in routine clinical
practice. However, this is usually done by manual contouring, which
is laborious and varies from observer to observer. In this paper, an
automatic left ventricle segmentation algorithm for cardiac magnetic
resonance images (MRI) is presented. Using knowledge of cardiac
MRI, a K-means clustering technique is applied to segment the
blood region on a coil-sensitivity corrected image.
Then, a graph searching technique is used to correct segmentation
errors from coil distortion and noises. Finally, blood volume and
ejection fraction are calculated. Using cardiac MRI from 15 subjects,
the presented algorithm is tested against manual contouring by
experts and shows outstanding performance.
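The K-means step can be illustrated on scalar intensities. The sketch below is a minimal one-dimensional K-means over synthetic pixel values standing in for a coil-corrected slice; the three intensity groups and their parameters are invented for illustration.

```python
import numpy as np

def kmeans_1d(values, k=3, iters=50):
    """Plain K-means on scalar pixel intensities, in the spirit of
    separating dark background, mid-grey myocardium and bright blood."""
    # Deterministic, spread-out initialisation from intensity quantiles.
    centers = np.quantile(values, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        # Assign each pixel to its nearest center.
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]),
                           axis=1)
        # Move each center to the mean of its assigned pixels.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels, centers

# Synthetic slice intensities (three made-up tissue classes).
rng = np.random.default_rng(1)
pixels = np.concatenate([rng.normal(20, 3, 200),    # background
                         rng.normal(90, 5, 150),    # myocardium
                         rng.normal(180, 8, 100)])  # blood pool
labels, centers = kmeans_1d(pixels)
print(centers.round(0))
```

In the paper's pipeline, the brightest cluster would be taken as the candidate blood region, which the subsequent graph-searching step then corrects for coil distortion and noise.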
Abstract: A large amount of data is typically stored in relational
databases (DB), which can efficiently handle user queries intended to
elicit the appropriate information from data sources. However, direct
access to and use of these data require the end users to have an
adequate technical background, and they must also cope
with the internal data structure and the values presented.
Consequently, information retrieval is quite a difficult process even
for IT or DB experts, given the limited conceptual contributions of
relational databases. Ontologies enable users to formally describe a
domain of knowledge in terms of concepts and the relations among
them, and hence can be used to unambiguously specify the
information captured by a relational database. However, accessing
information residing in a database using ontologies is feasible only
if the users are familiar with semantic web technologies. To enable
users from different disciplines to retrieve the appropriate data, a
Graphical User Interface is necessary. In this work, we present an
interactive, ontology-based, semantically enabled web tool for
information retrieval. The tool is based entirely on the ontological
representation of the underlying database schema, and it provides a
user-friendly environment through which users can graphically form
and execute their queries.
Abstract: Random epistemologies and hash tables have garnered
minimal interest from both security experts and experts in the last
several years. In fact, few information theorists would disagree with
the evaluation of expert systems. In our research, we discover how
flip-flop gates can be applied to the study of superpages. Though
such a hypothesis at first glance seems perverse, it is derived from
known results.
Abstract: Verification and Validation of Simulated Process
Model is the most important phase of the simulator life cycle.
Evaluation of simulated process models based on Verification and
Validation techniques checks the closeness of each component model
(in a simulated network) with the real system/process with respect to
dynamic behaviour under steady state and transient conditions. The
process of Verification and Validation helps in qualifying the process
simulator for the intended purpose whether it is for providing
comprehensive training or design verification. In general, model
verification is carried out by comparison of simulated component
characteristics with the original requirement to ensure that each step
in the model development process completely incorporates all the
design requirements. Validation testing is performed by comparing
the simulated process parameters to the actual plant process
parameters either in standalone mode or integrated mode.
A Full Scope Replica Operator Training Simulator for PFBR -
Prototype Fast Breeder Reactor has been developed at IGCAR,
Kalpakkam, India, named KALBR-SIM (Kalpakkam Breeder
Reactor Simulator), wherein the main participants are
engineers/experts belonging to the Modeling, Process Design and
Instrumentation & Control design teams. This paper discusses the
Verification and Validation process in general, the evaluation
procedure adopted for the PFBR operator training simulator, the
methodology followed for verifying the models, and the reference
documents and standards used. It details the importance of internal
validation by design experts, subsequent validation by an external
agency consisting of experts from various fields, model
improvement by tuning based on experts' comments, final
qualification of the simulator for the intended purpose, and the
difficulties faced while coordinating the various activities.
Abstract: The research was conducted to empirically validate
the proposed maturity model of e-Government implementation,
composed of four dimensions, further specified by 54 success factors
as attributes. To do so, two steps were performed. First, expert
judgment was gathered to test the model's content validity. Second,
a reliability study was performed to evaluate inter-rater agreement
using the Fleiss Kappa approach. The kappa statistic (kappa
coefficient) is the most commonly used method for testing
consistency among raters; Fleiss Kappa generalizes Kappa to the
case of more than two raters with multi-categorical ratings. Our
findings show that most attributes of the proposed model were
related to their corresponding dimensions. According to our results,
the percentage of "agree" answers given by the experts was 73.69%
in dimension A, 89.76% in B, 81.5% in C and 60.37% in D. This
means that more than half of the attributes of each dimension were
appropriate or relevant to the dimension they were supposed to
measure, while 85% of the attributes were sufficiently relevant to
their corresponding dimensions. The inter-rater reliability
coefficient was also satisfactory and is interpreted as substantial
agreement among raters. Therefore, the proposed model is valid and
reliable for measuring the maturity of e-Government
implementation.
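The Fleiss Kappa calculation itself is short: compare observed per-item agreement against the agreement expected by chance from the category shares. The sketch below uses a made-up rating matrix (six hypothetical raters judging four attributes across three categories), not the study's data.

```python
import numpy as np

def fleiss_kappa(ratings):
    """Fleiss' kappa for an (items x categories) count matrix, where
    entry (i, j) is how many raters assigned item i to category j.
    Assumes every item was rated by the same number of raters."""
    ratings = np.asarray(ratings, dtype=float)
    n_items = ratings.shape[0]
    n_raters = ratings[0].sum()
    # Overall share of assignments falling in each category.
    p_j = ratings.sum(axis=0) / (n_items * n_raters)
    # Observed agreement per item: fraction of concordant rater pairs.
    P_i = ((ratings ** 2).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    P_bar = P_i.mean()            # mean observed agreement
    P_e = (p_j ** 2).sum()        # agreement expected by chance
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical counts: 6 raters placing 4 attributes into 3 categories.
counts = [[5, 1, 0],
          [0, 6, 0],
          [1, 1, 4],
          [0, 0, 6]]
print(round(fleiss_kappa(counts), 3))
```

On this invented matrix the statistic lands in the 0.61-0.80 band that the Landis-Koch scale labels "substantial agreement", the same interpretation band the study reports.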