Abstract: Clustering ensembles combine multiple partitions generated by different clustering algorithms into a single clustering solution. They have emerged as a prominent method for improving the robustness, stability, and accuracy of unsupervised classification solutions. Many contributions have been made so far towards finding a consensus clustering, and one of the major problems in clustering ensembles is the choice of consensus function. In this paper, we first introduce clustering ensembles, the representation of multiple partitions, and their challenges, and present a taxonomy of combination algorithms. Second, we describe consensus functions in clustering ensembles, including hypergraph partitioning, the voting approach, mutual information, co-association-based functions, and the finite mixture model, and explain their advantages, disadvantages, and computational complexity. Finally, we compare the characteristics of clustering ensemble algorithms, such as computational complexity, robustness, simplicity, and accuracy, on different datasets as reported in previous work.
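As an illustration of the co-association family of consensus functions mentioned above, the sketch below builds a co-association matrix from several base partitions and links objects whose co-association exceeds a threshold. This is a minimal hypothetical illustration, not the algorithm of any specific surveyed paper; the 0.5 threshold and the single-link-style grouping are assumptions.

```python
import numpy as np

def co_association(partitions):
    """Co-association matrix: entry (i, j) is the fraction of base
    partitions in which objects i and j share a cluster."""
    n = len(partitions[0])
    m = np.zeros((n, n))
    for labels in partitions:
        labels = np.asarray(labels)
        m += (labels[:, None] == labels[None, :]).astype(float)
    return m / len(partitions)

def consensus(partitions, threshold=0.5):
    """Derive a consensus clustering by linking objects whose
    co-association exceeds the threshold (single-link style)."""
    m = co_association(partitions)
    n = m.shape[0]
    label, current = [-1] * n, 0
    for i in range(n):
        if label[i] == -1:
            # grow a connected component over the thresholded matrix
            stack, label[i] = [i], current
            while stack:
                j = stack.pop()
                for k in range(n):
                    if label[k] == -1 and m[j, k] > threshold:
                        label[k] = current
                        stack.append(k)
            current += 1
    return label

# Three base partitions of six objects; the consensus recovers
# the stable split {0,1,2} vs {3,4,5} despite label permutations.
parts = [[0, 0, 0, 1, 1, 1],
         [1, 1, 1, 0, 0, 0],
         [0, 0, 1, 1, 1, 1]]
print(consensus(parts))  # [0, 0, 0, 1, 1, 1]
```

Note that the co-association matrix is insensitive to cluster label permutations across base partitions, which is precisely why it is a popular intermediate representation.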
Abstract: In this paper an alternative approach to visualising the wake behind different vehicle body shapes with simplified and fully-detailed underbodies is proposed and analysed. It allows for a clearer distinction among the different wake regions. The visualisation is based on a transformation of the Cartesian coordinates of a chosen wake plane to polar coordinates, filtering for velocities lower than the freestream velocity. The transformation produces a polar wake plot that enables the wake to be divided into, and quantified over, a number of angular sections. In this paper, local drag is used to visualise the drag contribution of the flow in the different sections. Visually, a balanced wake can be recognised by the concentric behaviour of the polar plots. Alternatively, integrating the local drag of each angular section as a ratio of the total local drag yields a quantitative measure of wake uniformity, in which the different sections contribute equally to the local drag, with the exception of the wheels.
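The Cartesian-to-polar transformation and per-sector accounting described above might be sketched as follows. This is a hypothetical illustration: the wake centre (y0, z0), the 10-degree sectors, and the simple wake-survey local-drag surrogate u/U*(1 - u/U) are assumptions, not the paper's exact formulation.

```python
import math

def polar_wake(points, u_inf, y0=0.0, z0=0.0):
    """Transform wake-plane points (y, z, u) to polar coordinates about
    an assumed wake centre (y0, z0), keeping only points whose
    streamwise velocity u is below the freestream u_inf."""
    out = []
    for y, z, u in points:
        if u < u_inf:  # the velocity filter described in the abstract
            r = math.hypot(y - y0, z - z0)
            theta = math.degrees(math.atan2(z - z0, y - y0)) % 360.0
            out.append((r, theta, u))
    return out

def sector_local_drag(polar_points, u_inf, sector_deg=10.0):
    """Sum a simple local-drag surrogate, u/U * (1 - u/U), per angular
    sector and return each sector's share of the total."""
    n = int(360 / sector_deg)
    drag = [0.0] * n
    for _r, theta, u in polar_points:
        drag[int(theta // sector_deg) % n] += (u / u_inf) * (1 - u / u_inf)
    total = sum(drag) or 1.0
    return [d / total for d in drag]
```

Each returned sector share can then be compared against the uniform value (1/36 for 10-degree sectors) to judge how balanced the wake is.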
Abstract: Design and land use are closely linked to the energy efficiency levels of an urban area. Current city planning practice does not involve an effective land-use-energy evaluation in its 'blueprint' urban plans. This study proposes an appraisal method, embeddable in GIS programs and based on five planning criteria, that quantifies how far a planner may deviate from the planning principles (criteria) for the greatest energy output he or she can obtain. The case of Balcova, a district in the Izmir metropolitan area, is used to evaluate the proposed master plan and the use of geothermal energy (heating only) for the district concerned. If the land-use design were revised for maximum energy efficiency (a 30% gain was obtained), mainly by increasing the density around the geothermal wells and proposing more mixed-use zones, the result would be a 17% distortion (infidelity to the main planning principles) relative to the original plan. The proposed method can serve as an effective simulation tool for planners, with calculations carried out by readily available GIS tools, to evaluate the efficiency levels of different plan proposals and to show how much energy saving causes how much deviation from the other planning ideals. Lower energy use may be achievable under different land-use proposals in various policy trials.
Abstract: Delay- and Disruption-Tolerant Networking (DTN) is part of the Interplanetary Internet, its primary application being deep-space networks. Its terrestrial form has interesting research applications, such as the Alagappa University Delay Tolerant Water Monitoring Network, which doubles as a test bed for improving its routing scheme. DTNs depend on node mobility to deliver packets using a store-carry-and-forward paradigm. Throwboxes are small, inexpensive stationary devices equipped with wireless interfaces and storage. We propose the use of throwboxes to enhance the contact opportunities of the nodes and hence improve throughput. The enhancement is evaluated using Alunivdtnsim, a desktop simulator written in C, and the results are presented graphically.
Abstract: Cerium-doped lanthanum bromide, LaBr3:Ce(5%), crystals are considered among the most advanced scintillator materials used in PET scanning, combining a high light yield, fast decay time, and excellent energy resolution. Apart from the correct choice of scintillator, it is also important to optimise the detector geometry, not least in terms of source-to-detector distance, in order to obtain reliable measurements and efficiency. In this study, a commercially available 25 mm x 25 mm BrilLanCe™ 380 LaBr3:Ce(5%) detector was characterised in terms of its efficiency at varying source-to-detector distances. Gamma-ray spectra of 22Na, 60Co, and 137Cs were separately acquired at distances of 5, 10, 15, and 20 cm. As a result of the change in solid angle subtended by the detector, the geometric efficiency decreased with increasing distance. High efficiencies at short distances can cause pulse pile-up, where subsequent photons are detected before previously detected events have decayed. To reduce this systematic error, the source-to-detector distance should balance efficiency against pulse pile-up suppression, as pile-up corrections would otherwise be necessary at short distances. In addition to the experimental measurements, Monte Carlo simulations were carried out for the same setup, allowing a comparison of results. The advantages and disadvantages of each approach are highlighted.
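The distance dependence of the geometric efficiency can be made concrete with the standard on-axis solid-angle formula for a disc-faced detector, Omega = 2*pi*(1 - d/sqrt(d^2 + a^2)), giving a geometric efficiency of Omega/(4*pi). The sketch below uses the detector's 25 mm face diameter and the distances from the study; it is a textbook approximation, not the study's Monte Carlo model.

```python
import math

def geometric_efficiency(distance_cm, radius_cm):
    """On-axis geometric efficiency of a disc-faced detector:
    Omega / (4*pi) with Omega = 2*pi*(1 - d / sqrt(d^2 + a^2))."""
    d, a = distance_cm, radius_cm
    return 0.5 * (1.0 - d / math.sqrt(d * d + a * a))

# Detector face radius 1.25 cm (25 mm diameter) at the distances
# used in the study; efficiency falls monotonically with distance.
for d in (5, 10, 15, 20):
    print(f"{d} cm: {geometric_efficiency(d, 1.25):.4%}")
```

At these distances the source is far from the face relative to its radius, so the efficiency falls roughly as the inverse square of the distance, consistent with the solid-angle argument in the abstract.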
Abstract: This paper argues that networks, such as the ECN and the American network, are affected by certain small events which are inherent to path dependence and which preclude full evolution towards efficiency. It is argued that the American network is superior to the ECN in many respects owing to its greater flexibility and longer history. This stems in particular from the creation of the American network, which was based on a small number of cases. Such a structure encourages further changes and modifications which are not necessarily radical. The ECN, by contrast, was established by legislative action, which explains its rigid structure and resistance to change. This paper is an attempt to transpose the superior features of the American network onto the ECN. It looks at concepts such as judicial cooperation, harmonisation of procedure, peer review, regulatory impact assessments (RIAs), and dispute resolution procedures.
Abstract: Recurrent event data are a special type of multivariate survival data. Dynamic models and frailty models are two of the approaches used to deal with this kind of data. A comparison between the two models is carried out using the empirical standard deviation of the standardized martingale residual processes as a way of assessing the fit of the two models within the Aalen additive regression model. We found that both approaches take heterogeneity into account and produce residual standard deviations close to each other, both in the simulation study and in the real data set.
Abstract: The influence of viscosity on droplet diameter in water-in-crude-oil (w/o) emulsions at two different ratios, 20-80% and 50-50% w/o, was examined using a Brookfield rotational digital rheometer. The emulsions were prepared with sorbitan sesquioleate (Span 83) as emulsifier at varied temperatures and stirring speeds, in rotations per minute (rpm). Results showed that the viscosity of the w/o emulsion was strongly increased by a larger water volume and a lower temperature. The change in viscosity also altered the droplet size distribution. The change in droplet diameter depended on the viscosity and on whether the emulsion behaved as a Newtonian or non-Newtonian fluid.
Abstract: Disaster and emergency management are highly debated topics among experts. Fast communication helps in dealing with emergencies; the problem lies in network connectivity and data exchange. This paper suggests a solution: a new, flexible communication platform offering possibilities and perspectives for the protection of communication systems used in crisis management. The platform serves everyday communication as well as communication in crisis situations.
Abstract: Cameron Highlands is a mountainous area subject to torrential tropical showers. It extracts 5.8 million litres of water per day for drinking supply from its rivers at several intake points. The water quality of the rivers in Cameron Highlands, however, has deteriorated significantly due to land clearing for agriculture, excessive use of pesticides and fertilizers, and construction activities in rapidly developing urban areas. These pollution sources, known as non-point pollution sources, are diverse and hard to identify, and they are therefore difficult to estimate. Hence, Geographical Information Systems (GIS) were used to provide an extensive approach for evaluating land use and other mapping characteristics to explain the spatial distribution of non-point sources of contamination in Cameron Highlands. A method for assessing pollution sources was developed using the Cameron Highlands Master Plan (2006-2010), integrating GIS, databases, and pollution loads in the study area. The results show that the highest annual runoff is generated by forest, 3.56 × 10⁸ m³/yr, followed by urban development, 1.46 × 10⁸ m³/yr. Furthermore, urban development causes the highest BOD load (1.31 × 10⁶ kg BOD/yr), while agricultural activities and forest contribute the highest annual loads of phosphorus (6.91 × 10⁴ kg P/yr) and nitrogen (2.50 × 10⁵ kg N/yr), respectively. Therefore, best management practices (BMPs) are suggested to reduce the pollution level in the area.
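The loads reported above follow the usual accounting: annual load equals runoff volume times mean pollutant concentration. The sketch below back-calculates the urban BOD load from the reported urban runoff; the 9 mg/L event-mean concentration is a hypothetical value chosen only because it reproduces the reported figure, not a value taken from the study.

```python
def annual_load(runoff_m3_per_yr, concentration_mg_per_l):
    """Annual pollutant load (kg/yr) = runoff volume x mean
    concentration; 1 m^3 = 1000 L and 1 mg = 1e-6 kg."""
    return runoff_m3_per_yr * 1000 * concentration_mg_per_l * 1e-6

# Hypothetical event-mean concentration of 9 mg BOD/L applied to the
# reported urban runoff of 1.46e8 m^3/yr.
urban_runoff = 1.46e8  # m^3/yr, from the abstract
print(annual_load(urban_runoff, 9.0))  # ~1.31e6 kg BOD/yr
```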
Abstract: In this paper, different approaches to solving the forward kinematics of a three-DOF actuator-redundant hydraulic parallel manipulator are presented. In contrast to serial manipulators, the forward kinematic map of parallel manipulators involves highly coupled nonlinear equations that are almost impossible to solve analytically. The proposed methods use neural network identification with different structures to solve the problem. The accuracy of the results of each method is analyzed in detail, and the advantages and disadvantages of each in computing the forward kinematic map of the given mechanism are discussed. It is concluded that ANFIS gives the best performance compared with MLP, RBF, and PNN networks in this particular application.
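The identification approach can be sketched in miniature: sample the actuator-space-to-pose map, then fit a small neural network to it by gradient descent. Everything below is an illustrative assumption: the smooth surrogate map stands in for the manipulator's unknown forward kinematics, and the single-hidden-layer network is only a generic stand-in for the MLP, RBF, PNN, and ANFIS structures compared in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical surrogate for the forward-kinematic map: actuator
# lengths q (3 inputs) -> platform pose p (3 outputs).
def true_map(q):
    return np.stack([np.sin(q[:, 0]) + q[:, 1],
                     np.cos(q[:, 1]) * q[:, 2],
                     q[:, 0] * q[:, 2]], axis=1)

Q = rng.uniform(-1, 1, (500, 3))   # sampled actuator configurations
P = true_map(Q)                     # corresponding poses

# One-hidden-layer tanh network trained by full-batch gradient descent.
W1 = rng.normal(0, 0.5, (3, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 3)); b2 = np.zeros(3)
lr = 0.05
for _ in range(3000):
    H = np.tanh(Q @ W1 + b1)        # hidden activations
    Y = H @ W2 + b2                 # predicted poses
    E = Y - P                       # residual
    gW2 = H.T @ E / len(Q); gb2 = E.mean(0)
    dH = (E @ W2.T) * (1 - H ** 2)  # backprop through tanh
    gW1 = Q.T @ dH / len(Q); gb1 = dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(Q @ W1 + b1) @ W2 + b2 - P) ** 2))
print(f"training MSE: {mse:.4f}")
```

In practice the accuracy comparison in the paper would be made on held-out configurations rather than on the training MSE shown here.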
Abstract: Novel acrylated epoxidized hemp oil (AEHO) based bioresins were successfully synthesised, characterised, and applied to biocomposites reinforced with woven jute fibre. Characterisation of the synthesised AEHO consisted of acid number titrations and FTIR spectroscopy to assess the success of the acrylation reaction. Three different matrices were produced (vinylester (VE), a 50/50 blend of AEHO/VE, and 100% AEHO) and reinforced with jute fibre to form three types of biocomposite samples. Mechanical properties, in the form of flexural and interlaminar shear strength (ILSS), were investigated and compared for the different samples. Results from the mechanical tests showed that the AEHO and 50/50 based neat bioresins displayed lower flexural properties than the VE samples. However, when applied to biocomposites and compared with VE-based samples, AEHO biocomposites demonstrated comparable flexural performance and improved ILSS. These results are attributed to improved fibre-matrix interfacial adhesion arising from surface-chemical compatibility between the natural fibres and the bioresin.
Abstract: In recent years, the world has witnessed significant work in the field of manufacturing. Special efforts have been made in the implementation of new technologies and of management and control systems, among many other advances that have evolved the field. Following closely on this, given the scope of new projects and the need to turn existing flexible ideas into more autonomous and intelligent ones, i.e., to move toward more intelligent manufacturing, the present paper aims to contribute to the analysis, and to a few customization issues, of a new iCIM 3000 system at the IPSAM. In this process, special emphasis is placed on the material flow problem. To this end, besides a description and analysis of the system and its main parts, some guidance on defining other possible alternative material flow scenarios and a partial analysis of the combinatorial nature of the problem are offered. All of this is done with the intention of relating it to the use of simulation tools, which are briefly addressed with a special focus on the Witness simulation package. For better comprehension, the above is supported by a few figures and expressions that help in obtaining the necessary data. These data and others will be used in the future, when simulating the scenarios in search of the best material flow configurations.
Abstract: A catastrophic earthquake measuring 6.3 on the Richter scale struck the Christchurch, New Zealand Central Business District on February 22, 2011, abruptly disrupting the business of teaching and learning at Christchurch Polytechnic Institute of Technology. This paper presents the findings of a study undertaken on the complexity of delivering an educational programme in the face of this traumatic natural event. Nine interconnected themes emerged from this multiple-method study: communication, decision making, leadership and followership, balancing personal and professional responsibilities, taking action, and preparedness and thinking ahead, all within a disruptive and uncertain context. Sustainable responses that maximise business continuity and provide solutions to practical challenges are among the study's recommendations.
Abstract: Petroleum refineries discharge large amounts of wastewater during the refining process, containing hazardous constituents that are hard to degrade. The anaerobic treatment process is well known as an efficient method for degrading high-strength wastewaters, and the Up-flow Anaerobic Sludge Blanket (UASB) is a common process used for treating various wastewaters. Two UASB reactors were set up and operated in parallel to evaluate the treatment efficiency of petroleum refinery wastewater. In this study, four organic volumetric loading rates were applied (0.58, 0.89, 1.21, and 2.34 kg/m³·d), two loads to each reactor. Each load was applied for a period of 60 days so that the reactor could acclimatize and reach steady state, and then the second load was applied. Chemical oxygen demand (COD) removal was satisfactory, with removal efficiencies at the applied loadings of 78, 82, 83, and 81%, respectively.
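The two quantities reported above are related by simple mass balances: the organic volumetric loading rate is influent COD times flow divided by reactor volume, and removal efficiency compares influent and effluent COD. The influent COD, flow, effluent COD, and reactor volume below are hypothetical values chosen only to reproduce the first reported loading and removal figures.

```python
def volumetric_loading(cod_mg_per_l, flow_m3_per_d, volume_m3):
    """Organic volumetric loading rate in kg COD/m^3.d.
    (mg/L equals g/m^3, so divide by 1000 for kg/m^3.)"""
    return cod_mg_per_l * flow_m3_per_d / volume_m3 / 1000.0

def removal_efficiency(cod_in_mg_per_l, cod_out_mg_per_l):
    """Percent COD removal across the reactor."""
    return 100.0 * (cod_in_mg_per_l - cod_out_mg_per_l) / cod_in_mg_per_l

# Hypothetical numbers: influent COD 1160 mg/L fed at 0.05 m^3/d into
# a 0.1 m^3 reactor gives the first reported loading of 0.58 kg/m^3.d;
# an effluent of 255.2 mg/L gives the reported 78% removal.
print(volumetric_loading(1160, 0.05, 0.1))  # ~0.58
print(removal_efficiency(1160, 255.2))      # ~78.0
```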
Abstract: The Internet is the global data communications infrastructure based on the interconnection of public and private networks using protocols that implement internetworking on a global scale. Hence, the control of protocol and infrastructure development, resource allocation, and network operation are crucial and interlinked aspects. Internet governance is the hotly debated and contentious subject that refers to the global control and operation of key Internet infrastructure, such as domain name servers, and of resources such as domain names. It is impossible to separate technical and political positions, as they are interlinked. Furthermore, the existence of a global market, transparency, and competition impact upon Internet governance and related topics such as network neutrality and security. Current trends and developments regarding Internet governance, with a focus on the policy-making process, security, and control, have been observed to evaluate their current and future implications for the Internet. The multi-stakeholder approach to Internet governance discussed in this paper presents a number of opportunities, issues, and developments that will affect the future direction of the Internet. Internet operation, maintenance, and advisory organisations, such as the Internet Corporation for Assigned Names and Numbers (ICANN) and the Internet Governance Forum (IGF), are currently in the process of formulating policies for future Internet governance. Given the controversial nature of the issues at stake and the current lack of agreement, it is predicted that institutional as well as market governance will remain present for network access and content.
Abstract: Current image-based individual human recognition methods, such as the fingerprint, face, or iris biometric modalities, generally require a cooperative subject, views from certain angles, and physical contact or close proximity. These methods cannot reliably recognize non-cooperating individuals at a distance in the real world under changing environmental conditions. Gait, which concerns recognizing individuals by the way they walk, is a relatively new biometric free of these disadvantages. The inherent gait characteristics of an individual make gait irreplaceable and useful in visual surveillance.
In this paper, an efficient gait recognition system for human identification is proposed, based on two extracted features: the width vector of the binary silhouette and the MPEG-7 region-based shape descriptors. In the proposed method, foreground objects, i.e., humans and other moving objects, are extracted by estimating background information with a Gaussian Mixture Model (GMM); subsequently, a median filtering operation is performed to remove noise from the background-subtracted image. A moving-target classification algorithm, using shape and boundary information, is then used to separate human beings (i.e., pedestrians) from other foreground objects (e.g., vehicles). Subsequently, the width vector of the outer contour of the binary silhouette and the MPEG-7 Angular Radial Transform coefficients are taken as the feature vector. Next, Principal Component Analysis (PCA) is applied to the feature vector to reduce its dimensionality. The extracted feature vectors are used to train a Hidden Markov Model (HMM) for the identification of individuals. The proposed system is evaluated on a set of gait sequences, and the experimental results show the efficacy of the proposed algorithm.
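Two steps of the pipeline above, the silhouette width vector and the PCA reduction, can be sketched as follows. The toy silhouette is invented for illustration, and the PCA is a generic SVD projection rather than the paper's exact procedure; the GMM background model, MPEG-7 descriptors, and HMM stage are not reproduced here.

```python
import numpy as np

def width_vector(silhouette):
    """Width feature of a binary silhouette: for each row, the span
    between the leftmost and rightmost foreground pixels (0 for
    empty rows)."""
    widths = []
    for row in silhouette:
        cols = np.flatnonzero(row)
        widths.append(cols[-1] - cols[0] + 1 if cols.size else 0)
    return np.array(widths, dtype=float)

def pca_reduce(features, k):
    """Project a (frames x dims) feature matrix onto its top-k
    principal components via the SVD of the centred data."""
    X = features - features.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:k].T

# Tiny 4x6 'silhouette': a single torso-like blob.
sil = np.array([[0, 0, 1, 1, 0, 0],
                [0, 1, 1, 1, 1, 0],
                [0, 1, 1, 1, 1, 0],
                [0, 0, 1, 1, 0, 0]])
print(width_vector(sil))  # [2. 4. 4. 2.]
```

In a real system one width vector would be computed per frame, stacked into a sequence matrix, reduced with PCA, and the reduced sequences fed to the HMM.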
Abstract: The chemically defined Schlegel's medium was modified to improve cell growth and the production of other metabolites by the fluorescent pseudomonad strain R62. The modified medium does not require pH control, as pH changes remain within ±0.2 units of the initial pH of 7.1 during fermentation. Siderophore production by the fluorescent pseudomonad strain was optimized in the modified medium containing 1% glycerol as the major carbon source, supplemented with 0.05% succinic acid and 0.5% L-tryptophan. Indole-3-acetic acid (IAA) production was higher when L-tryptophan was used at 0.5%. Production of 2,4-diacetylphloroglucinol (DAPG) was higher when the medium was amended with three trace elements. The optimized medium produced 2.28 g/l of dry cell mass and 900 mg/l of siderophore by the end of 36 h of cultivation, while the production levels of IAA and DAPG were 65 mg/l and 81 mg/l, respectively, at the end of 48 h of cultivation.
Abstract: A mobile agent is software that performs actions autonomously and independently on behalf of a person or an organization. Mobile agents are used for searching for and retrieving information, filtering, intruder recognition in networks, and so on. One of the important issues with mobile agents is their security: different security issues must be considered for the effective and secure use of mobile agents. One of those issues is the protection of the integrity of mobile agents.
In this paper, after reviewing the existing methods, the advantages and disadvantages of each method are examined. Given that each method has its own advantages and disadvantages, it appears that by combining these methods one can arrive at a better method for protecting the integrity of mobile agents. Such a method is therefore presented in this paper and evaluated against the existing methods. Finally, the method is simulated, and the results indicate an improvement in the protection of the integrity of mobile agents.
Abstract: This paper presents a novel two-phase hybrid optimization algorithm with hybrid genetic operators to solve the optimal control problem of a single-stage hybrid manufacturing system. The proposed hybrid real-coded genetic algorithm (HRCGA) is developed in such a way that a simple real-coded GA acts as the base-level search, quickly directing the search towards the optimal region, and a local search method is then employed for fine tuning. The hybrid genetic operators involved in the proposed algorithm improve both the quality of the solution and the convergence speed. Phase 1 uses a conventional real-coded genetic algorithm (RCGA), while phase 2 employs optimisation by direct search with systematic reduction of the size of the search region. A typical numerical example of an optimal control problem, with the number of jobs varying from 10 to 50, is included to illustrate the efficacy of the proposed algorithm. Several statistical analyses are performed to compare the proposed algorithm with conventional RCGA and PSO techniques. A hypothesis t-test and an analysis of variance (ANOVA) test are also carried out to validate the effectiveness of the proposed algorithm. The results clearly demonstrate that the proposed algorithm not only improves solution quality but also converges to the optimal value faster. It outperforms the conventional real-coded GA (RCGA) and the efficient particle swarm optimisation (PSO) algorithm both in the quality of the optimal solution and in convergence to the actual optimum value.
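A minimal sketch of the two-phase structure described above: phase 1 is a simple real-coded GA (blend crossover plus Gaussian mutation), and phase 2 refines the best individual by a coordinate-wise direct search over a systematically shrinking step, here applied to a toy sphere objective. The operators, population size, and shrink schedule are illustrative assumptions, not the paper's HRCGA.

```python
import random

def hybrid_search(f, bounds, pop_size=30, generations=60, seed=1):
    """Two-phase sketch: real-coded GA for global search, then a
    direct search with a shrinking step for local fine tuning."""
    rng = random.Random(seed)
    lo, hi = bounds
    # --- phase 1: simple real-coded GA ---
    pop = [[rng.uniform(lo, hi) for _ in range(2)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f)
        elite = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            w = rng.random()                  # blend crossover weight
            child = [w * x + (1 - w) * y + rng.gauss(0, 0.05)
                     for x, y in zip(a, b)]   # Gaussian mutation
            children.append([min(max(v, lo), hi) for v in child])
        pop = elite + children
    best = min(pop, key=f)
    # --- phase 2: coordinate direct search, shrinking the step ---
    step = (hi - lo) / 10.0
    while step > 1e-6:
        moved = False
        for i in range(len(best)):
            for d in (-step, step):
                trial = best.copy()
                trial[i] += d
                if f(trial) < f(best):
                    best, moved = trial, True
        if not moved:
            step /= 2.0   # systematically reduce the search region
    return best

sphere = lambda v: sum(x * x for x in v)
print(hybrid_search(sphere, (-5.0, 5.0)))  # near [0, 0]
```

The division of labour mirrors the abstract's design choice: the GA only needs to land in the right basin, after which the cheap direct search supplies the precision a GA alone reaches slowly.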