Abstract: The paper discusses optimisation work on a method for processing ceramic/metal composite coatings for various applications, based on preliminary work on processing anodes for solid oxide fuel cells (SOFCs). The composite coating is manufactured by the simultaneous electroless co-deposition of nickel and yttria-stabilised zirconia (YSZ) onto a ceramic substrate. The effects on coating characteristics of substrate surface treatments and of electroless nickel bath parameters such as pH and agitation method are also investigated. Characterisation of the resulting deposit by scanning electron microscopy (SEM) and energy-dispersive X-ray analysis (EDXA) is also discussed.
Abstract: Multi-Radio Multi-Channel Wireless Mesh Networks (MRMC-WMNs) operate at the backbone to access and route high volumes of traffic simultaneously. Such roles demand high network capacity and long "online" time, at the expense of accelerated transmission-energy depletion and poor connectivity. This is the problem of transmission power control. Numerous power control methods for wireless networks exist in the literature; however, contributions towards MRMC configurations still face many challenges. In this paper, an energy-efficient power selection protocol called PMMUP is proposed at the link layer. This protocol first divides the MRMC-WMN into a set of unified channel graphs (UCGs). A UCG consists of multiple radios interconnected via a common wireless channel. In each UCG, a stochastic linear quadratic cost function is formulated. Each user minimizes this cost function, which trades off the size of the unification states against the control action. Unification state variables come from independent UCGs and from higher layers of the protocol stack. PMMUP coordinates power optimizations at the network interface cards (NICs) of wireless mesh routers. The proposed PMMUP-based algorithm is shown analytically to converge at a linear rate. Performance evaluations through simulations confirm the efficacy of the proposed dynamic power control.
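The stochastic linear quadratic trade-off mentioned above can be illustrated with a one-step sketch. All matrices and values below are hypothetical, and this is not the PMMUP protocol itself, which the abstract does not specify in detail:

```python
import numpy as np

# Illustrative one-step linear-quadratic trade-off: a node picks a power
# adjustment u that balances the size of an interference "state" x against
# the control effort, as in J = x'Qx + u'Ru. Values are hypothetical.
def lq_power_update(x, A, B, Q, R):
    """One-step LQ-optimal control for x_next = A x + B u.

    Minimising J(u) = x_next' Q x_next + u' R u gives
    u* = -(B'QB + R)^{-1} B'Q A x  (standard completion of squares).
    """
    gain = np.linalg.solve(B.T @ Q @ B + R, B.T @ Q @ A)
    return -gain @ x

# Hypothetical numbers: 2 interfering channel states, 1 power control input.
A = np.array([[0.9, 0.1], [0.05, 0.8]])
B = np.array([[1.0], [0.5]])
Q = np.eye(2)           # penalise residual interference state
R = np.array([[2.0]])   # penalise large power adjustments
x = np.array([1.0, -0.4])
u = lq_power_update(x, A, B, Q, R)
```

By construction the chosen `u` can never do worse than applying no control at all, which is the sense in which the cost function trades state size against control action.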
Abstract: One main drawback of intrusion detection systems is their inability to detect new attacks that do not have known signatures. In this paper, we discuss an intrusion detection method that uses independent component analysis (ICA) based feature selection heuristics and rough-fuzzy clustering of the data. ICA is used to separate independent components (ICs) from the monitored variables. Rough set theory reduces the amount of data and removes redundancy, while fuzzy methods allow objects to belong to several clusters simultaneously, with different degrees of membership. Our approach allows us not only to recognize known attacks but also to detect activity that may be the result of a new, unknown attack. The method is evaluated experimentally on the Knowledge Discovery and Data Mining (KDD Cup 1999) dataset.
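The fuzzy-membership idea can be sketched with standard fuzzy c-means (not the paper's rough-fuzzy variant; the data are hypothetical):

```python
import numpy as np

# Minimal fuzzy c-means sketch (standard FCM, not the paper's rough-fuzzy
# variant): every point receives a degree of membership in every cluster,
# which is the property the abstract relies on for ambiguous records.
def fcm_memberships(X, centers, m=2.0):
    """U[i, k] = membership of point i in cluster k; each row sums to 1."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)                  # guard against zero distance
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)

def fcm(X, init_centers, m=2.0, iters=50):
    centers = np.asarray(init_centers, dtype=float)
    for _ in range(iters):
        U = fcm_memberships(X, centers, m)
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
    return centers, fcm_memberships(X, centers, m)

# Two hypothetical traffic-feature clouds plus one point midway between
# them; the midway point keeps split membership instead of a hard label.
X = np.vstack([np.zeros((20, 2)), np.full((20, 2), 5.0), [[2.5, 2.5]]])
centers, U = fcm(X, init_centers=[X[0], X[20]])
```

The split membership of the midway point is exactly what lets such a scheme flag records that sit between "normal" and "attack" behaviour instead of forcing a hard decision.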
Abstract: The objective of this study is to propose a statistical
modeling method which enables simultaneous term structure
estimation of the risk-free interest rate, hazard and loss given default,
incorporating the characteristics of the bond issuing company such as
credit rating and financial information. A reduced form model is used
for this purpose. Statistical techniques such as spline estimation and
Bayesian information criterion are employed for parameter estimation
and model selection. An empirical analysis is conducted using Japanese bond market data. The results confirm the usefulness of the proposed method.
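The model-selection criterion can be sketched generically (log-likelihoods and parameter counts below are hypothetical, not results from the paper):

```python
import math

# Sketch of the Bayesian information criterion used for model selection
# (generic BIC, not the paper's specific spline model):
# BIC = k * ln(n) - 2 * ln(L); lower is better.
def bic(log_likelihood, n_params, n_obs):
    return n_params * math.log(n_obs) - 2.0 * log_likelihood

# Hypothetical comparison: a richer spline basis raises the likelihood but
# pays a complexity penalty; BIC picks the better trade-off.
candidates = {"3 knots": (-1240.0, 8), "8 knots": (-1232.0, 18)}
scores = {name: bic(ll, k, n_obs=500) for name, (ll, k) in candidates.items()}
best = min(scores, key=scores.get)
```

Here the extra 10 parameters of the richer basis cost more (10 · ln 500 ≈ 62) than the 16-point likelihood gain, so the smaller model is preferred.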
Abstract: Providing authentication for the messages exchanged
between group members in addition to confidentiality is an important
issue in Secure Group communication. We develop a protocol for
Secure Authentic Communication where we address authentication
for the group communication scheme proposed by Blundo et al.
which only provides confidentiality. The authentication scheme used is a multiparty scheme that allows all users in the system to send and receive messages simultaneously. Our scheme is secure against fewer than k colluding malicious parties.
Abstract: Nonlinear and unbalanced loads in three-phase networks create harmonics and losses. Active and passive filters are used to eliminate or reduce these effects. Passive filters have some limitations: for example, they are designed only for a specific frequency, and they may cause resonance in the network at the point of common coupling. Another drawback of a passive filter is that the sizes of the required elements are normally large. Active filters overcome some of these limitations; for example, they can eliminate more than one harmonic and do not cause resonance in the network. In this paper, the inverter is analyzed in all three phases simultaneously, and the RL impedance of the line is taken into account. A sliding mode control based on the energy feedback of the capacitors is employed in the design; with this method, the dynamic speed of the filter is improved effectively, and harmonics and load unbalance are compensated quickly.
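The switching idea behind sliding mode control can be illustrated on a first-order toy plant. This stand-in is not the paper's capacitor-energy controller, whose details the abstract does not give:

```python
import numpy as np

# Toy sliding-mode controller for a first-order plant dx/dt = -a*x + b*u.
# The control switches on the sign of the sliding variable s = x - x_ref,
# driving the state to the reference and holding it there in a narrow
# chattering band. All parameter values are hypothetical.
def simulate_smc(a=1.0, b=1.0, x0=0.0, x_ref=2.0, K=5.0, dt=1e-3, steps=5000):
    x = x0
    for _ in range(steps):
        s = x - x_ref                 # sliding variable
        u = -K * np.sign(s)           # switching control law
        x += dt * (-a * x + b * u)    # explicit Euler integration
    return x

x_final = simulate_smc()
```

The fast switching is what gives sliding mode control its high dynamic speed, at the cost of a small chattering amplitude set by the time step and the switching gain.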
Abstract: In recent years, response surface methodology (RSM) has attracted the attention of many quality engineers in different industries. Most of the published literature on robust design methodology is concerned with the optimization of a single response or quality characteristic, often the one most critical to consumers. For most products, however, quality is multidimensional, so it is common to observe multiple responses in an experimental situation. This paper familiarizes the interested reader with this methodology through a survey of the most cited technical papers.
It is believed that the proposed procedure in this study can resolve
a complex parameter design problem with more than two responses.
It can be applied to those areas where there are large data sets and a
number of responses are to be optimized simultaneously. In addition,
the proposed procedure is relatively simple and can be implemented
easily by using ready-made standard statistical packages.
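A standard technique for optimising several responses simultaneously in this literature is the desirability-function approach; the sketch below uses hypothetical responses and is not tied to any specific surveyed paper:

```python
import math

# Desirability-function sketch (Derringer–Suich style): each response is
# mapped to [0, 1] and factor settings maximising the geometric mean of the
# individual desirabilities are preferred. All numbers are hypothetical.
def desirability_larger_is_better(y, lo, hi, weight=1.0):
    """0 below lo, 1 above hi, smooth ramp in between."""
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return ((y - lo) / (hi - lo)) ** weight

def overall_desirability(ds):
    if any(d == 0.0 for d in ds):
        return 0.0
    return math.exp(sum(math.log(d) for d in ds) / len(ds))  # geometric mean

# Two hypothetical factor settings, each with two responses.
settings = {
    "A": [desirability_larger_is_better(80, 60, 100),     # e.g. yield
          desirability_larger_is_better(0.7, 0.0, 1.0)],  # e.g. strength
    "B": [desirability_larger_is_better(90, 60, 100),
          desirability_larger_is_better(0.4, 0.0, 1.0)],
}
best = max(settings, key=lambda k: overall_desirability(settings[k]))
```

The geometric mean ensures a setting that completely fails on one response can never win, which is the usual argument for this aggregation.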
Abstract: Unmanned aerial vehicles (UAVs) capable of long-endurance operation have been attracting much attention in the military and civil aviation industries for the past decade. The field of UAV applications is expanding from purely military purposes to civil ones. Because of their low operating cost, high reliability and suitability for various application areas, numerous development programs have been initiated around the world. To obtain optimal solutions for the design variables (i.e., sectional airfoil profile, wing taper ratio and sweep) for high-performance UAVs, both the lift and the lift-to-drag ratio are maximized while the pitching moment is simultaneously minimized. It is found that the lift force and the lift-to-drag ratio are linearly dependent, and a unique, dominant solution exists. However, a trade-off is observed between the lift-to-drag ratio and the pitching moment. As a result of the optimization, sixty-five (65) non-dominated Pareto individuals are obtained at the edge of the design space determined by the airfoil shapes.
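The non-dominated (Pareto) filtering that yields such a set can be sketched as follows (the objective values are hypothetical; signs are flipped so every objective is maximised):

```python
# Sketch of extracting non-dominated (Pareto) individuals from a population,
# the kind of set the abstract's 65 solutions form. The objective tuples are
# hypothetical: (lift, lift-to-drag, -pitching_moment), all to be maximised.
def dominates(a, b):
    """True if a is at least as good as b everywhere and better somewhere."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

population = [
    (1.2, 18.0, -0.03),   # best moment, modest lift: non-dominated
    (1.4, 20.0, -0.09),   # best lift and L/D, worst moment: non-dominated
    (1.1, 17.0, -0.06),   # dominated by the first point
    (1.3, 19.0, -0.04),   # middle compromise: non-dominated
]
front = pareto_front(population)
```

The surviving points illustrate the abstract's trade-off: no single individual is best in both lift-to-drag ratio and pitching moment at once.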
Abstract: A Simultaneous Multithreading (SMT) processor is capable of executing instructions from multiple threads in the same cycle. SMT was in fact introduced as a powerful extension of superscalar architectures to increase processor throughput. Simultaneous Multithreading is a technique that permits multiple instructions from multiple independent applications or threads to compete for limited resources each cycle. Since the fetch unit has been identified as one of the major bottlenecks of the SMT architecture, several fetch schemes have been proposed in prior work to enhance fetching efficiency and overall performance.
In this paper, we propose a novel fetch policy called queue situation identifier (QSI), which counts certain long-latency instructions of each thread in each cycle and then selects which threads to fetch in the next cycle. Simulation results show that in the best case our fetch policy achieves a 30% speedup and also reduces the level-1 data cache miss rate.
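The QSI decision can be imagined as a counter-based selection; the sketch below is a hypothetical reading of the abstract, not the paper's exact policy:

```python
# Hypothetical sketch of a QSI-style fetch decision: each cycle, count the
# outstanding long-latency instructions per thread and fetch from the
# threads with the fewest, so threads likely to clog the instruction queues
# are deprioritised. The counter values below are made up.
def pick_fetch_threads(long_latency_counts, n_fetch=2):
    """Return ids of the n_fetch threads with the fewest pending
    long-latency (e.g. cache-missing) instructions."""
    order = sorted(range(len(long_latency_counts)),
                   key=lambda t: long_latency_counts[t])
    return order[:n_fetch]

# Thread 2 has many pending loads that missed the cache; it is skipped.
counts = [1, 4, 9, 0]
chosen = pick_fetch_threads(counts)
```

This mirrors the intuition behind counter-based SMT fetch policies: keeping stalled threads out of the front end frees queue slots for threads that can actually retire work.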
Abstract: In this paper we are interested in classification problems
with a performance constraint on error probability. In such
problems if the constraint cannot be satisfied, then a rejection option
is introduced. For binary labelled classification, a number of SVM
based methods with rejection option have been proposed over the
past few years. All of these methods use two thresholds on the SVM
output. However, in previous works, we have shown on synthetic data
that using thresholds on the output of the optimal SVM may lead to
poor results for classification tasks with performance constraint. In
this paper, a new method for supervised classification with rejection option is proposed. It consists of two different classifiers jointly optimized to minimize the rejection probability subject to a given constraint on the error rate. This method uses a new kernel-based linear learning machine that we have recently presented. This learning machine is characterized by its simplicity and high training speed, which makes the simultaneous optimization of the two classifiers computationally reasonable. The proposed classification method with rejection option is compared to an SVM-based rejection method proposed in the recent literature. Experiments show the superiority of the proposed method.
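The two-threshold rejection rule that the compared SVM-based methods apply to the classifier output can be sketched as follows (threshold values are hypothetical):

```python
# Sketch of the two-threshold rejection rule applied to a classifier score,
# i.e. the baseline family the paper argues against: scores between the two
# thresholds are rejected rather than classified. Thresholds are made up.
def classify_with_reject(score, t_low=-0.5, t_high=0.5):
    """Return -1 / +1 for confident decisions, None for rejection."""
    if score <= t_low:
        return -1
    if score >= t_high:
        return +1
    return None   # ambiguous region: reject

decisions = [classify_with_reject(s) for s in (-1.2, -0.1, 0.3, 0.9)]
```

The paper's point is that placing such thresholds on the output of an SVM trained without the constraint in mind can be suboptimal, motivating the jointly optimized pair of classifiers instead.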
Abstract: The ever increasing product diversity and competition on the market of goods and services has dictated the pace of growth in the number of advertisements. Despite their admittedly diminished effectiveness over recent years, advertisements remain the favored method of sales promotion. Consequently, the challenge for an advertiser is to explore every possible avenue of making an advertisement more noticeable, attractive and compelling for consumers. One way to achieve this is through celebrity endorsements. On the one hand, the use of a celebrity to endorse a product involves substantial costs; on the other hand, it does not in itself guarantee the success of an advertisement. The question of how celebrities can be used in advertising to the best advantage is therefore of utmost importance. Celebrity endorsements have become commonplace: empirical evidence indicates that approximately 20 to 25 per cent of advertisements feature some famous person as a product endorser. The popularity of celebrity endorsements demonstrates the relevance of the topic, especially in the context of the current global economic downturn, when companies are forced to save in order to survive, yet simultaneously to invest heavily in advertising and sales promotion. The issue of the effective use of celebrity endorsements also figures prominently in the academic discourse. The study presented below is thus aimed at exploring which qualities (characteristics) of a celebrity endorser have an impact on the effectiveness of the advertisement in which he or she appears, and how.
Abstract: Nowadays there is growing interest in biofuel production in most countries because of increasing concerns about hydrocarbon fuel shortages and global climate change, as well as the desire to strengthen the agricultural economy and meet local transportation fuel needs. Ethanol can be produced from biomass by hydrolysis and sugar fermentation processes. In this study, ethanol was produced from sugarcane bagasse without using expensive commercial enzymes. Alkali pretreatment was used to prepare the biomass before enzymatic hydrolysis; a comparison between NaOH, KOH and Ca(OH)2 showed that NaOH is the most effective on bagasse. The enzymes required for biomass hydrolysis were produced by solid-state fermentation of sugarcane with two fungi, Trichoderma longibrachiatum and Aspergillus niger. The results show that the enzyme solution produced by A. niger performed better than that of T. longibrachiatum. Ethanol was produced by simultaneous saccharification and fermentation (SSF) with the crude enzyme solution from T. longibrachiatum and Saccharomyces cerevisiae yeast. To evaluate this procedure, SSF of the pretreated bagasse was also carried out using Celluclast 1.5L from Novozymes. The ethanol yields with the commercial enzyme and with the enzyme solution produced by T. longibrachiatum were 81% and 50%, respectively.
Abstract: In general dynamic analyses, the lower-mode response is of interest; the higher modes of the spatially discretized equations generally do not represent the real behavior and do not affect the global response much. Some implicit algorithms are therefore designed to filter out the high-frequency modes through deliberately introduced numerical dissipation. The objective of this study is to introduce the P-method and the PC α-method and to compare them with the dissipation method and the Newmark method through stability analysis and a numerical example. The PC α-method is more accurate than the other methods because, being based on the α-method, it inherits the superior properties of the implicit α-method. In finite element analysis, the PC α-method is more useful than the other methods because it is an explicit scheme and achieves second-order accuracy and numerical damping simultaneously.
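For reference, the Newmark baseline that the methods are compared against can be written in a few lines (single-DOF sketch; the P- and PC α-methods themselves are not specified in enough detail in the abstract to reproduce):

```python
# One time step of the classical Newmark method for a single-DOF system
# m*u'' + c*u' + k*u = f, the baseline scheme the abstract compares against.
def newmark_step(m, c, k, f_next, u, v, a, dt, beta=0.25, gamma=0.5):
    """Average-acceleration Newmark (beta=1/4, gamma=1/2): unconditionally
    stable, second-order accurate, and free of numerical damping."""
    u_pred = u + dt * v + (0.5 - beta) * dt**2 * a
    v_pred = v + (1.0 - gamma) * dt * a
    a_next = (f_next - c * v_pred - k * u_pred) / (m + gamma * dt * c
                                                   + beta * dt**2 * k)
    u_next = u_pred + beta * dt**2 * a_next
    v_next = v_pred + gamma * dt * a_next
    return u_next, v_next, a_next

# Free vibration of an undamped oscillator (m = k = 1): energy stays
# essentially constant because this parameter choice adds no damping --
# which is exactly why dissipative variants are needed to filter high modes.
u, v, a = 1.0, 0.0, -1.0          # a0 = (f - c*v - k*u)/m
for _ in range(1000):
    u, v, a = newmark_step(1.0, 0.0, 1.0, 0.0, u, v, a, dt=0.05)
energy = 0.5 * v**2 + 0.5 * u**2
```

The absence of numerical damping in this baseline is the motivation for the α-family: those schemes trade a controlled amount of high-frequency dissipation for cleaner low-mode response.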
Abstract: This paper proposes a bi-objective model for the facility location problem under a congestion system. The model is motivated by applications such as locating servers for bank automated teller machines (ATMs) and communication networks. It is specifically suited to situations in which fixed service facilities are congested by stochastic demand within a queueing framework. We formulate the model from two perspectives simultaneously: (i) the customers and (ii) the service provider. The objectives are to minimize (i) the total expected travelling and waiting time and (ii) the average facility idle time. The model is a mixed-integer nonlinear programming problem belonging to the class of NP-hard problems. To solve it, two metaheuristic algorithms, the non-dominated sorting genetic algorithm (NSGA-II) and the non-dominated ranking genetic algorithm (NRGA), are proposed. To evaluate the performance of the two algorithms, numerical examples are generated and analyzed with several metrics to determine which algorithm performs better.
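The customer-side objective combines travel and queueing delay; a toy evaluation under an M/M/1 queue (rates and travel times below are hypothetical, and the abstract's actual model is richer):

```python
# Toy evaluation of the two competing objectives for a single candidate
# facility modelled as an M/M/1 queue. The demand rate lam, service rate mu
# and travel times are hypothetical; the paper's formulation is richer.
def mm1_expected_wait(lam, mu):
    """Expected time in system W = 1/(mu - lam); requires lam < mu."""
    assert lam < mu, "queue must be stable"
    return 1.0 / (mu - lam)

def facility_objectives(lam, mu, travel_time):
    customer_obj = travel_time + mm1_expected_wait(lam, mu)  # minimise (i)
    idle_fraction = 1.0 - lam / mu                           # minimise (ii)
    return customer_obj, idle_fraction

# A heavily loaded nearby site keeps the server busy (low idle time), but
# congestion inflates the customers' total time: the bi-objective conflict.
site_near = facility_objectives(lam=9.9, mu=10.0, travel_time=2.0)
site_far = facility_objectives(lam=4.0, mu=10.0, travel_time=5.0)
```

Neither site wins on both objectives, which is why Pareto-based solvers such as NSGA-II and NRGA are natural choices here.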
Abstract: The aim of this paper is to investigate the performance of a two-point block method, designed for two processors, for directly solving large non-stiff systems of higher-order ordinary differential equations (ODEs). The method calculates the numerical solution at two points simultaneously, producing two new equally spaced solution values within a block, so that the computational tasks at each time step can each be assigned to a single processor. The algorithm was developed in the C language, and the parallel computation was performed in a shared-memory environment. Numerical results are given to compare the efficiency of the developed method against the sequential timing. For large problems, the parallel implementation produced a speed-up of 1.95 and an efficiency of 98% on the two processors.
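The reported figures follow the standard definitions of parallel speed-up and efficiency; a quick check (the raw timings below are hypothetical, chosen only to reproduce the stated ratio):

```python
# Standard parallel performance metrics. The timings are hypothetical; only
# the ratio matters: a speed-up of 1.95 on 2 processors is ~98% efficiency.
def speedup(t_seq, t_par):
    return t_seq / t_par

def efficiency(t_seq, t_par, n_proc):
    return speedup(t_seq, t_par) / n_proc

s = speedup(39.0, 20.0)        # 39 s sequential vs 20 s parallel -> 1.95
e = efficiency(39.0, 20.0, 2)  # 1.95 / 2 processors -> 0.975
```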
Abstract: In this paper, we address a multi-objective scheduling problem for unrelated parallel machines. In unrelated parallel systems, the processing cost/time of a given job may vary across machines. The objective of scheduling is to simultaneously determine the job-machine assignment and the job sequence on each machine so that the total cost of the schedule is minimized. The cost function consists of three components, namely machining cost, earliness/tardiness penalties and makespan-related cost. Such a scheduling problem is combinatorial in nature. Therefore, a simulated annealing approach is employed to provide good solutions within reasonable computational times. Computational results show that the proposed approach can efficiently solve such complicated problems.
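A minimal simulated-annealing sketch for the assignment part of such a problem (job sequencing and the three-component cost are omitted; the cost matrix is hypothetical):

```python
import math
import random

# Simulated annealing over job-machine assignments for unrelated machines:
# each job's cost depends on the machine chosen. Sequencing and the full
# three-part cost from the abstract are omitted; cost values are made up.
def total_cost(assign, cost):
    return sum(cost[j][m] for j, m in enumerate(assign))

def anneal(cost, n_machines, t0=10.0, cooling=0.995, steps=4000, seed=1):
    rng = random.Random(seed)
    assign = [rng.randrange(n_machines) for _ in cost]
    cur = total_cost(assign, cost)
    best, best_c = assign[:], cur
    t = t0
    for _ in range(steps):
        j = rng.randrange(len(cost))
        old = assign[j]
        assign[j] = rng.randrange(n_machines)   # neighbour: move one job
        new = total_cost(assign, cost)
        if new <= cur or rng.random() < math.exp((cur - new) / t):
            cur = new
            if cur < best_c:
                best, best_c = assign[:], cur
        else:
            assign[j] = old                     # reject the move
        t *= cooling                            # geometric cooling schedule
    return best, best_c

# 4 jobs x 3 machines; with no capacity coupling the optimum simply picks
# the cheapest machine per job (total cost 2 + 1 + 5 + 2 = 10).
cost = [[4, 2, 9], [3, 7, 1], [6, 5, 8], [2, 4, 4]]
best_assign, best_cost = anneal(cost, n_machines=3)
```

On the real problem the earliness/tardiness and makespan terms couple the jobs, which is what makes the search genuinely combinatorial rather than separable as in this toy.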
Abstract: This paper and its companion (Part 2) deal with the modeling and optimization of two NP-hard problems in the production planning of flexible manufacturing systems (FMS): the part type selection problem and the loading problem. These two problems are strongly related and heavily influence the system's efficiency and productivity. The problems become even harder when operational flexibilities, such as the possibility of processing an operation on alternative machines with alternative tools, are considered. The problems are modeled and solved simultaneously using a real-coded genetic algorithm (RCGA), which uses an array of real numbers as the chromosome representation. These real numbers can be decoded into the part type sequence and the machines used to process the part types. This first part focuses on modeling the problems and discusses how the novel chromosome representation can be applied to solve them. The second part will discuss the effectiveness of the RCGA in solving various test-bed problems.
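The decoding idea described above can be sketched with random keys (the gene values and group sizes below are hypothetical; the paper's exact mapping may differ):

```python
# Random-key style decoding of a real-coded chromosome: sorting one group
# of real genes yields a part-type sequence, and a second group maps to
# machine choices. The gene values are hypothetical and the paper's exact
# encoding may differ from this sketch.
def decode_sequence(genes):
    """Order part types by their gene values (random-key decoding)."""
    return sorted(range(len(genes)), key=lambda i: genes[i])

def decode_machines(genes, n_machines):
    """Map each real gene in [0, 1) to one of the alternative machines."""
    return [min(int(g * n_machines), n_machines - 1) for g in genes]

chromosome_seq = [0.62, 0.13, 0.90, 0.44]     # 4 part types
chromosome_mach = [0.10, 0.55, 0.80, 0.55]    # machine-choice gene per part
sequence = decode_sequence(chromosome_seq)    # -> [1, 3, 0, 2]
machines = decode_machines(chromosome_mach, n_machines=3)
```

A key advantage of this representation is that ordinary real-valued crossover and mutation always produce feasible sequences, since any real vector decodes to a valid permutation.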
Abstract: For the past decade, biclustering has become a popular data mining technique not only in the field of biological data analysis but also in other applications, such as text mining and market data analysis, involving high-dimensional two-way datasets. Biclustering clusters both rows and columns of a dataset simultaneously, as opposed to traditional clustering, which clusters either rows or columns. It retrieves subgroups of objects that are similar in one subgroup of variables and different in the remaining variables. The Firefly Algorithm (FA) is a recently proposed metaheuristic inspired by the collective behavior of fireflies. This paper provides a preliminary assessment of a discrete version of FA (DFA) on the task of mining coherent, large-volume biclusters from web usage data. Experiments were conducted on two web usage datasets from a public repository, and the performance of DFA was compared with that of another population-based metaheuristic, binary Particle Swarm Optimization (PSO). The results demonstrate the usefulness of DFA in tackling the biclustering problem.
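The canonical continuous firefly move that DFA discretises can be sketched as follows (parameter values are the usual defaults; the paper's discrete operators are not given in the abstract):

```python
import math
import random

# Core firefly move in its canonical continuous form (the paper uses a
# discrete variant, DFA, whose exact operators the abstract does not give):
# a dimmer firefly i moves toward a brighter firefly j, with attractiveness
# decaying with squared distance, plus a small random step.
def firefly_move(xi, xj, beta0=1.0, gamma=1.0, alpha=0.2, rng=random):
    r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
    beta = beta0 * math.exp(-gamma * r2)      # attractiveness at distance r
    return [a + beta * (b - a) + alpha * (rng.random() - 0.5)
            for a, b in zip(xi, xj)]

random.seed(0)
xi, xj = [0.0, 0.0], [1.0, 1.0]   # xj is brighter (better bicluster score)
xi_new = firefly_move(xi, xj)
```

In a biclustering setting, "brightness" would be a coherence measure of the encoded row/column selection, and the move operator must be discretised so positions stay valid selections.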
Abstract: Starting from the basic pillars of supportability analysis, this paper examines its characteristics in an LCI (Life Cycle Integration) environment. The research methodology comprises a review of the modern logistics engineering literature with the objective of collecting and synthesizing the knowledge relating to standards of supportability design in an e-logistics environment. The results show that the LCI framework has properties that are fully compatible with the requirements of simultaneous logistics support and product-service bundle design. The proposed approach contributes to a more comprehensive and efficient supportability design process. Further contributions include greater consistency of the collected data, automated creation of reports suitable for different analyses, and the possibility of customizing those reports according to customer needs. An additional advantage of this approach is its practical use in real time. In a broader sense, LCI allows the integration of enterprises on a worldwide basis, facilitating electronic business.
Abstract: This work describes the use of a synthetic transmit aperture (STA), with a single element transmitting and all elements receiving, in medical ultrasound imaging. The STA technique is a novel alternative to today's commercial systems, in which an image is acquired sequentially one image line at a time, which puts a strict limit on the frame rate and the amount of data needed for high image quality. STA imaging allows data to be acquired simultaneously from all directions over a number of emissions, after which the full image can be reconstructed.
In the experiments, a 32-element linear transducer array with 0.48 mm inter-element spacing was used. A single-element transmit aperture was used to generate a spherical wave covering the full image region. 2D ultrasound images of a wire phantom, obtained using STA and the commercial ultrasound scanner Antares, are presented to demonstrate the benefits of SA imaging.
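The delay-and-sum reconstruction underlying STA imaging can be sketched as follows (geometry, sampling rate and data are hypothetical; apodisation and interpolation are omitted):

```python
import numpy as np

C, FS = 1540.0, 40e6   # speed of sound [m/s], sampling rate [Hz] (assumed)

def sta_beamform_point(rf, elem_x, point):
    """Delay-and-sum over all emission/receive pairs for one image point.
    rf[e, r, t] is the echo recorded by receive element r after a
    single-element emission from element e (spherical transmit wave)."""
    px, pz = point
    value = 0.0
    for e, xe in enumerate(elem_x):
        d_tx = np.hypot(px - xe, pz)          # path from emitting element
        for r, xr in enumerate(elem_x):
            d_rx = np.hypot(px - xr, pz)      # path back to receiver
            idx = int((d_tx + d_rx) / C * FS + 0.5)
            if idx < rf.shape[2]:
                value += rf[e, r, idx]
    return value

# Synthetic data: a unit echo at the exact round-trip delay of one
# scatterer for every emission/receive pair (8 elements for brevity).
n, pitch = 8, 0.48e-3                         # 0.48 mm inter-element spacing
elem_x = (np.arange(n) - (n - 1) / 2) * pitch
scat = (0.0, 20e-3)                           # scatterer at 20 mm depth
rf = np.zeros((n, n, 4096))
for e, xe in enumerate(elem_x):
    for r, xr in enumerate(elem_x):
        d = np.hypot(scat[0] - xe, scat[1]) + np.hypot(scat[0] - xr, scat[1])
        rf[e, r, int(d / C * FS + 0.5)] = 1.0

on_target = sta_beamform_point(rf, elem_x, scat)        # all 64 pairs add
off_target = sta_beamform_point(rf, elem_x, (5e-3, 25e-3))
```

Because every emission insonifies the whole region, each of the n² emission/receive pairs contributes coherently at the true scatterer position and incoherently elsewhere, which is what lets the full image be reconstructed from just n emissions.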