Abstract: The Support Vector Machine (SVM) is a statistical
learning tool initially developed by Vapnik in 1979 and later
extended into the broader framework of structural risk minimization
(SRM). SVM plays an increasing role in detection problems
across engineering fields, notably in
statistical signal processing, pattern recognition, image analysis, and
communication systems. In this paper, SVM was applied to the
detection of SAR (synthetic aperture radar) images in the presence of
partially developed speckle noise. The simulation was done for single
look and multi-look speckle models to give a complete overview of
and insight into the proposed SVM-based detector model. The
structure of the SVM was derived and applied to real SAR images
and its performance in terms of the mean square error (MSE) metric
was calculated. We showed that the SVM-detected SAR images have
a very low MSE and are of good quality. The quality of the
processed speckled images improved for the multi-look model.
Furthermore, the contrast of the SVM detected images was higher
than that of the original non-noisy images, indicating that the SVM
approach increased the distance between the pixel reflectivity levels
(the detection hypotheses) in the original images.
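The MSE figure of merit used above is straightforward to compute; a minimal sketch in plain Python, assuming images given as nested lists of pixel intensities:

```python
def mse(reference, detected):
    """Mean square error between two equally sized 2-D intensity images,
    given as nested lists of pixel values."""
    rows, cols = len(reference), len(reference[0])
    total = sum(
        (reference[i][j] - detected[i][j]) ** 2
        for i in range(rows)
        for j in range(cols)
    )
    return total / (rows * cols)
```

A low MSE between the SVM-detected image and the noise-free reference indicates good detection quality, as reported above.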
Abstract: This paper examines the modeling and analysis of a
cruise control system using a Petri-net-based approach, task graphs,
invariant analysis and behavioral properties. It shows how the
structures used can be verified and optimized.
Abstract: QoS routing is an important component of Traffic
Engineering in networks that provide QoS guarantees. QoS routing depends on link-state information, which is typically flooded across the network. This dependence affects both the quality
of the routing and the utilization of the network resources. In
this paper, we examine establishing QoS routes with partial state
updates in wired sensor networks.
Abstract: Tandem mass spectrometry (MS/MS) is the engine
driving high-throughput protein identification. Protein mixtures, possibly
representing thousands of proteins from multiple species, are
treated with proteolytic enzymes that cut the proteins into smaller
peptides, which are then analyzed to generate MS/MS spectra. The
task of determining the identity of the peptide from its spectrum
is currently the weak point in the process. Current approaches to de
novo sequencing are able to compute candidate peptides efficiently.
The problem lies in the limitations of current scoring functions. In this
paper we introduce the concept of proteome signature. By examining
proteins and compiling proteome signatures (amino acid usage) it is
possible to characterize likely combinations of amino acids and better
distinguish between candidate peptides. Our results strongly support
the hypothesis that a scoring function that considers amino acid usage
patterns is better able to distinguish between candidate peptides. This
in turn leads to higher accuracy in peptide prediction.
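The idea of a proteome signature, amino acid usage compiled from known proteins and used to score candidate peptides, can be sketched as follows; the smoothing floor and the log-likelihood scoring form are illustrative assumptions, not the paper's exact scoring function.

```python
from collections import Counter
import math

def proteome_signature(proteins):
    """Amino acid usage frequencies compiled from a set of protein sequences."""
    counts = Counter()
    for seq in proteins:
        counts.update(seq)
    total = sum(counts.values())
    return {aa: n / total for aa, n in counts.items()}

def signature_score(peptide, signature, floor=1e-6):
    """Log-likelihood of a candidate peptide under the proteome signature;
    higher scores indicate more plausible amino acid combinations."""
    return sum(math.log(signature.get(aa, floor)) for aa in peptide)
```

Between two candidate peptides with equal spectral fit, the one whose composition better matches the proteome signature receives the higher score.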
Abstract: Recordings from recent earthquakes have provided evidence that ground motions in the near field of a rupturing fault differ from ordinary ground motions, as they can contain a large-energy, or “directivity”, pulse. This pulse can cause considerable damage during an earthquake, especially to structures with natural periods close to those of the pulse. Failures of modern engineered structures observed within the near-fault region in recent earthquakes have revealed the vulnerability of existing RC buildings to pulse-type ground motions. This may be due to the fact that these modern structures were designed primarily using the design spectra of available standards, which were developed using stochastic processes with relatively long durations that characterize more distant ground motions. Many recently designed and constructed buildings may therefore require strengthening in order to perform well when subjected to near-fault ground motions. Fiber Reinforced Polymers (FRP) are considered a viable strengthening alternative, due to their relatively easy and quick installation, low life-cycle costs and zero maintenance requirements. The objective of this paper is to investigate the adequacy of Artificial Neural Networks (ANN) for determining the three-dimensional dynamic response of FRP-strengthened RC buildings under near-fault ground motions. For this purpose, an ANN model is proposed to estimate the base shear force, base bending moments and roof displacement of buildings in two directions. A training set of 168 buildings and a validation set of 21 buildings are produced from finite element analysis results of the dynamic response of RC buildings under near-fault earthquakes. It is demonstrated that the neural-network-based approach is highly successful in determining the response.
Abstract: Decision Support Systems (DSS) are interactive
software systems built to assist the management of an
organization in the decision-making process when faced with non-routine
problems in a specific application domain. Non-functional
requirements (NFRs) for a DSS deal with the desirable qualities and
restrictions that the DSS functionalities must satisfy. Unlike the
functional requirements, which are tangible functionalities provided
by the DSS, NFRs are often hidden from DSS users but
affect the quality of the provided functionalities. NFRs are often
overlooked or added later to the system in an ad hoc manner, leading
to a poor overall quality of the system. In this paper, we discuss the
development of NFRs as part of the requirements engineering phase
of the system development life cycle of DSSs. To help elicit
NFRs, we provide a comprehensive taxonomy of NFRs for DSSs.
Abstract: Accurate evaluation of damping ratios involving soil-structure interaction (SSI) effects is a prerequisite for the seismic design of in-situ buildings. This study proposes a combined approach to identifying the damping ratios of SSI systems based on an ambient excitation technique. The proposed approach is illustrated through an engineering example covering the main test process, sampling principle and algorithm steps, along with its feasibility and validity. The proposed approach is employed for damping ratio identification of 82 buildings in Xi'an, China. Based on the experimental data, the variation range and tendency of the damping ratios of these SSI systems, along with preliminary influencing factors, are shown and discussed. In addition, a fitting curve indicates the relation between the damping ratio and the fundamental natural period of an SSI system.
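A fitting curve such as the one mentioned can be obtained by ordinary least squares; the sketch below assumes a simple linear relation between damping ratio and fundamental period, which is an illustrative choice rather than the functional form fitted in the study, and the data are hypothetical.

```python
def linear_fit(periods, dampings):
    """Ordinary least-squares fit damping = a * period + b (closed form)."""
    n = len(periods)
    mean_t = sum(periods) / n
    mean_z = sum(dampings) / n
    sxx = sum((t - mean_t) ** 2 for t in periods)
    sxy = sum((t - mean_t) * (z - mean_z) for t, z in zip(periods, dampings))
    a = sxy / sxx
    b = mean_z - a * mean_t
    return a, b
```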
Abstract: This paper deals with the experimental investigations
of the in-cylinder tumble flows in an unfired internal combustion
engine with a flat piston at engine speeds ranging from 400 to
1000 rev/min, and also with dome and dome-cavity pistons at an
engine speed of 1000 rev/min, using particle image velocimetry.
From the two-dimensional in-cylinder flow measurements, tumble
flow analysis is carried out in the combustion space on a vertical
plane passing through the cylinder axis. To analyze the tumble flows,
ensemble-averaged velocity vectors are used, and to characterize them,
the tumble ratio is estimated. From the results, we have generally found
that the tumble ratio varies mainly with crank angle position. Also, at the
end of the compression stroke, the average turbulent kinetic energy is
higher at higher engine speeds. We have also found that, at the 330°
crank angle position, the flat piston shows an improvement of about 85%
and 23% in tumble ratio, and about 24% and 2.5% in average turbulent
kinetic energy, compared to the dome and dome-cavity pistons respectively.
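One common convention defines the tumble ratio as the bulk rotation rate of the in-plane flow (angular momentum of the PIV vector field over its moment of inertia about a chosen center) normalized by the crankshaft rotation rate; the sketch below assumes that convention, which may differ from the exact definition used in the paper.

```python
import math

def tumble_ratio(points, velocities, center, engine_rpm):
    """Tumble ratio: bulk rotation rate of the in-plane (y-z) flow about
    'center', normalized by the crankshaft rotation rate.
    points: (y, z) positions of PIV vectors; velocities: (v, w) components."""
    ang_momentum = 0.0
    inertia = 0.0
    for (y, z), (v, w) in zip(points, velocities):
        ry, rz = y - center[0], z - center[1]
        ang_momentum += ry * w - rz * v   # out-of-plane component of r x u
        inertia += ry * ry + rz * rz
    omega_flow = ang_momentum / inertia            # flow rotation rate, rad/s
    omega_engine = 2.0 * math.pi * engine_rpm / 60.0
    return omega_flow / omega_engine
```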
Abstract: For the past couple of decades, weak signal detection
has been of crucial importance in various engineering and scientific
applications. It finds application in areas such as wireless
communication, radar, aerospace engineering, and control systems,
among many others. Usually, weak signal detection requires a
phase-sensitive detector and a demodulation module to detect and
analyze the signal. This article introduces an intrusion detection
system that can effectively detect a weak signal from a multiplexed
signal. By carefully inspecting and analyzing the respective signal,
this system can successfully indicate any peripheral intrusion. An
intrusion detection system (IDS) is a comprehensive and
straightforward approach to detecting and analyzing any signal that
is weakened and garbled due to a low signal-to-noise ratio (SNR).
This approach
finds significant importance in applications like peripheral security
systems.
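For reference, the SNR mentioned above is conventionally expressed in decibels from the signal and noise powers; the sketch below estimates both powers from sample sequences, an illustrative convention rather than anything specific to the IDS described.

```python
import math

def snr_db(signal, noise):
    """Signal-to-noise ratio in decibels, estimated from sample sequences
    via their mean squared amplitudes (powers)."""
    p_signal = sum(x * x for x in signal) / len(signal)
    p_noise = sum(x * x for x in noise) / len(noise)
    return 10.0 * math.log10(p_signal / p_noise)
```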
Abstract: With the continuous rise of oil prices, how to develop alternative energy sources has become a hot topic around the world. This study discusses the dynamic characteristics of an island power system operating under random wind speeds lower than the nominal wind
speeds of the wind turbines. The system primarily consists of three diesel-engine power generation systems, three constant-speed variable-pitch wind turbines, a small hydraulic induction generation system, and lumped static loads. Detailed models based on Matlab/Simulink were developed to capture the dynamic behavior of the system. The results suggest that this island power system can operate stably in this operational mode. This study can serve as an important reference for the planning, operation, and further expansion of island power systems.
Abstract: The Canadian aerospace industry faces many
challenges. One of them is the difficulty in estimating costs. In
particular, the design effort required in a project impacts resource
requirements and lead-time, and consequently the final cost. This
paper presents the findings of a case study conducted for a recognized
global leader in the design and manufacture of aircraft engines. The
study models parametric cost estimation relationships to estimate the
design effort of integrated blade-rotor low-pressure compressor fans.
Several effort drivers are selected to model the relationship.
Comparative analyses of three types of models are conducted. The
model with the best accuracy and significance in design estimation is
retained.
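Parametric cost-estimation relationships of the kind modeled here are often fitted as power laws in an effort driver; the sketch below shows that classic form (effort = a·driver^b, fitted by least squares in log-log space) with illustrative data. The actual drivers and the three model types compared in the study are not specified here.

```python
import math

def fit_power_cer(drivers, efforts):
    """Fit a power-law cost-estimation relationship effort = a * driver**b
    by ordinary least squares in log-log space."""
    xs = [math.log(d) for d in drivers]
    ys = [math.log(e) for e in efforts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b
```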
Abstract: Research papers are usually evaluated via peer
review. However, peer review has limitations in evaluating research
papers. In this paper, Scienstein and the new idea of 'collaborative
document evaluation' are presented. Scienstein is a project to
evaluate scientific papers collaboratively based on ratings, links,
annotations and classifications by the scientific community using the
internet. In this paper, critical success factors of collaborative
document evaluation are analyzed, namely the scientists' motivation
to participate as reviewers, the reviewers' competence, and the
reviewers' trustworthiness. It is shown that if these factors are
ensured, collaborative document evaluation may prove to be a more
objective, faster and less resource intensive approach to scientific
document evaluation in comparison to the classical peer review
process. It is shown that additional advantages exist as collaborative
document evaluation supports interdisciplinary work, allows
continuous post-publishing quality assessments and enables the
implementation of academic recommendation engines. In the long
term, it seems possible that collaborative document evaluation will
successively substitute peer review and decrease the need for
journals.
Abstract: Verification of real-time software systems can be
expensive in terms of time and resources. Testing is the main method
of proving correctness but has been shown to be a long and
time-consuming process. Practicing engineers are often unwilling to
adopt formal approaches to correctness because of the overhead
associated with developing their knowledge of such techniques.
Performance modelling techniques allow systems to be evaluated
with respect to timing constraints. This paper describes PARTES, a
framework which guides the extraction of performance models from
programs written in an annotated subset of C.
Abstract: Nowadays, the focus on renewable energy and alternative fuels has increased due to rising oil prices, environmental pollution, and concern for preserving nature. Biodiesel has been known as an attractive alternative fuel, although biodiesel produced from edible oil is much more expensive than conventional diesel. Therefore, biodiesel produced from non-edible oils is a much better option. Currently, Jatropha biodiesel (JBD) is receiving attention as an alternative fuel for diesel engines. Biodiesel is non-toxic, biodegradable, highly lubricating, and highly renewable, and its use therefore produces a real reduction in petroleum consumption and carbon dioxide (CO2) emissions. Although biodiesel has many advantages, several of its properties still need improvement, such as its lower calorific value, lower effective engine power, higher emission of nitrogen oxides (NOX) and greater sensitivity to low temperatures. Exhaust gas recirculation (EGR) is an effective technique to reduce NOX emissions from diesel engines because it lowers the flame temperature and oxygen concentration in the combustion chamber. Some studies have succeeded in reducing the NOX emissions of biodiesel by EGR, but they observed increasing soot emissions. The aim of this study was to investigate engine performance and soot emission using blended Jatropha biodiesel at different EGR rates. A water-cooled, turbocharged CI engine with an indirect injection system was used for the investigation. Soot emission, NOX, CO2, and carbon monoxide (CO) were recorded, and various engine performance parameters were also evaluated.
Abstract: The requirement to improve software productivity has
promoted the research on software metric technology. There are
metrics for identifying the quality of reusable components but the
function that makes use of these metrics to find reusability of
software components is still not clear. These metrics if identified in
the design phase or even in the coding phase can help us to reduce the
rework by improving quality of reuse of the component and hence
improve the productivity due to probabilistic increase in the reuse
level. The CK metric suite is the most widely used set of metrics for
object-oriented (OO) software; we critically analyzed the CK metrics, tried
to remove their inconsistencies, and devised a framework of metrics
to obtain a structural analysis of OO-based software components.
A neural network can learn new relationships from new input data and
can be used to refine fuzzy rules to create a fuzzy adaptive system.
Hence, a neuro-fuzzy inference engine can be used to evaluate the
reusability of an OO-based component using its structural attributes as
inputs. In this paper, an algorithm has been proposed in which the
inputs are given to the neuro-fuzzy system in the form of tuned WMC,
DIT, NOC, CBO, and LCOM values of the OO software component, and the
output is obtained in terms of reusability. The developed
reusability model has produced high-precision results consistent with
the expectations of human experts.
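A fuzzy-style evaluation over the five CK metrics can be sketched as a zero-order (Sugeno-style) system; the membership ranges and the equal weighting of the rules below are illustrative assumptions and do not reproduce the tuned neuro-fuzzy system described above.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def reusability_score(wmc, dit, noc, cbo, lcom):
    """Crisp reusability in [0, 1] as the mean firing strength of one
    'metric value is favorable' rule per CK metric. Ranges are assumptions."""
    goods = [
        tri(wmc, 0, 10, 50),             # weighted methods per class
        tri(dit, 0, 3, 8),               # depth of inheritance tree
        tri(noc, 0, 2, 10),              # number of children
        tri(cbo, 0, 5, 20),              # coupling between objects
        1.0 - min(max(lcom, 0.0), 1.0),  # lack of cohesion: lower is better
    ]
    return sum(goods) / len(goods)
```

In the paper's approach, a neural network would tune such memberships and rule weights from training data rather than fixing them by hand.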
Abstract: The quantified residence time distribution (RTD)
provides a numerical characterization of mixing in a reactor, thus
allowing the process engineer to better understand the mixing
performance of the reactor. This paper discusses computational
studies to investigate flow patterns in a two-impinging-streams
cyclone reactor (TISCR). Flow in the reactor was modeled with
computational fluid dynamics (CFD). Utilizing the Eulerian-Lagrangian
approach, implemented in FLUENT (V6.3.22), particle
trajectories were obtained by solving the particle force balance
equations. From simulation results obtained at different Δt values, the mean
residence time (tm) and the mean square deviation (σ²) were
calculated. Good agreement can be observed between predicted and
experimental data. Simulation results indicate that the behavior of
complex reactor systems can be predicted using the CFD technique
with minimum data requirement for validation.
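The two RTD moments reported, the mean residence time tm and the mean square deviation σ², follow directly from the individual particle residence times extracted from the simulated trajectories; a minimal sketch:

```python
def rtd_moments(residence_times):
    """Mean residence time t_m and mean square deviation sigma^2
    from a list of individual particle residence times."""
    n = len(residence_times)
    t_m = sum(residence_times) / n
    sigma2 = sum((t - t_m) ** 2 for t in residence_times) / n
    return t_m, sigma2
```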
Abstract: There is significant interest in achieving technology
innovation through new product development activities. It is
recognized, however, that traditional project management practices
focused only on performance, cost, and schedule attributes, can often
lead to risk mitigation strategies that limit new technology
innovation. In this paper, a new approach is proposed for formally
managing and quantifying technology innovation. This approach uses
a risk-based framework that simultaneously optimizes innovation
attributes along with traditional project management and system
engineering attributes. To demonstrate the efficacy of the new risk-based
approach, a comprehensive product development experiment
was conducted. This experiment simultaneously managed the
innovation risks and the product delivery risks through the proposed
risk-based framework. Quantitative metrics for technology
innovation were tracked and the experimental results indicate that the
risk-based approach can simultaneously achieve both project
deliverable and innovation objectives.
Abstract: This paper focuses on the probabilistic numerical
solution of the problems in biomechanics and mining. Applications of
the Simulation-Based Reliability Assessment (SBRA) method are
presented in the design of the external fixators applied
in traumatology and orthopaedics (these fixators can be applied for
the treatment of open and unstable fractures, etc.) and in the solution
of a hard-rock (ore) disintegration process (i.e. the bit moves into the
ore and subsequently disintegrates it); the results are compared with
experiments, and a new design of the excavation tool is proposed.
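SBRA is a Monte-Carlo-based reliability method: the probability of failure is estimated as the fraction of sampled load/resistance pairs in which the load exceeds the resistance. The sketch below is a generic illustration with user-supplied sampling functions, not the fixator or rock-disintegration model itself.

```python
import random

def sbra_failure_probability(load_dist, resistance_dist, n=100_000, seed=1):
    """Monte Carlo estimate of P(failure) = P(load > resistance), the core
    idea behind Simulation-Based Reliability Assessment (SBRA).
    load_dist / resistance_dist each draw one random sample per call."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n) if load_dist(rng) > resistance_dist(rng)
    )
    return failures / n
```

For example, with load ~ U(0, 1) and resistance ~ U(0.5, 1.5), the analytic failure probability is 0.125, which the estimator approaches as n grows.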
Abstract: This paper focuses on a critical component of
situational awareness (SA): the control of autonomous vertical flight for a tactical unmanned aerial vehicle (TUAV). Within the SA strategy,
we propose a two-stage flight control procedure using two autonomous control subsystems to address the dynamics variation
and the difference in performance requirements between the initial and final stages of the flight trajectory for a nontrivial nonlinear eight-rotor helicopter
model. This control strategy for the chosen mini-TUAV model has been verified by simulating hovering maneuvers in the software
package Simulink and demonstrated good performance for fast
stabilization of the engines in hovering; consequently, fast SA with
economical use of battery energy can be achieved during search-and-rescue
operations.
Abstract: In this paper multi-objective genetic algorithms are
employed for Pareto-based optimization of ideal turboshaft
engines. In multi-objective optimization, a number of conflicting
objective functions are to be optimized simultaneously. The
important objective functions considered for optimization are the
specific thrust (F/ṁ₀), the specific fuel consumption (S_P), the
specific output shaft power (Ẇ_shaft/ṁ₀), and the overall
efficiency (η_O).
These objectives are usually conflicting with each other. The design
variables consist of thermodynamic parameters (compressor pressure
ratio, turbine temperature ratio and Mach number).
In the first stage, single-objective optimization is
investigated, and the NSGA-II method is used for the multi-objective
optimization. Optimization procedures are performed for
two and four objective functions, and the results are compared for
the ideal turboshaft engine. In order to investigate the optimal
thermodynamic behavior of pairs of objectives, different sets, each
including two of the output objectives, are considered
individually. For each set, the Pareto front is depicted. The sets of
decision variables selected from this Pareto front yield the
best possible combinations of the corresponding objective functions.
No point on the Pareto front is superior to any other point on the
front, but all of them are superior to any other point. In the case of
four-objective optimization, the results are given in tables.
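The Pareto dominance relation underlying these fronts can be made concrete with a small non-dominated filter (all objectives minimized; maximized objectives such as thrust or efficiency would be negated first). This is a generic sketch, not the NSGA-II implementation used in the paper.

```python
def pareto_front(points):
    """Return the non-dominated subset of objective vectors (all minimized).
    A point is dominated if some other point is no worse in every objective
    and differs from it (hence strictly better in at least one)."""
    front = []
    for p in points:
        dominated = any(
            all(qi <= pi for qi, pi in zip(q, p)) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front
```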