Abstract: Different variants of the buoyancy-affected terms in the k-ε turbulence model have been utilized to predict the flow parameters more accurately and to investigate the applicability of alternative buoyant k-ε closures in the numerical simulation of a horizontal gravity current. The additional non-isotropic turbulent stress due to buoyancy has been considered in the production term, based on the Algebraic Stress Model (ASM). To account for the turbulent scalar fluxes, the general gradient diffusion hypothesis has been used along with the Boussinesq gradient diffusion hypothesis with a variable turbulent Schmidt number and the additional empirical constant c3ε. To simulate the buoyant flow domain, a 2D vertical numerical model (WISE, Width Integrated Stratified Environments), based on the Reynolds-Averaged Navier-Stokes (RANS) equations, has been deployed, and the model has been further developed for the different k-ε turbulence closures. Results are compared against measured laboratory values of a saline gravity current to identify the most efficient turbulence model.
Abstract: Positron emission particle tracking (PEPT) is a
technique in which a single radioactive tracer particle can be
accurately tracked as it moves. A limitation of PET is that
reconstructing a tomographic image requires a large volume of data
(millions of events), so it is difficult to study rapidly changing
systems. PEPT, by contrast, is a very fast process compared with
PET.
In PEPT, detecting both photons defines a line, and the annihilation
is assumed to have occurred somewhere along this line. The location
of the tracer can be determined to within a few mm by triangulation
from the coincident detection of a small number of pairs of
back-to-back gamma rays. This can be achieved many times per second, and
the track of a moving particle can be reliably followed. This
technique was invented at the University of Birmingham [1].
The aim in PEPT is not to form an image of the tracer particle
but simply to determine its location over time. If this tracer is
followed for a long enough period within a closed, circulating system,
it explores all possible types of motion.
The application of PEPT to industrial process systems carried out
at the University of Birmingham falls into two areas: the behaviour
of granular materials and of viscous fluids. Granular materials are
processed in industry, for example in the manufacture of
pharmaceuticals, ceramics, food and polymers, and PEPT has been used
in a number of ways to study the behaviour of these systems [2].
PEPT allows the possibility of tracking a single particle within the
bed [3]. PEPT has also been used to study systems such as fluid
flow and viscous fluids in mixers [4], using a neutrally-buoyant
tracer particle [5].
Abstract: The majority of existing predictors for time series are
model-dependent and therefore require some prior knowledge for the
identification of complex systems, usually involving system
identification, extensive training, or online adaptation in the case of
time-varying systems. Additionally, since a time series is usually
generated by complex processes such as the stock market or other
chaotic systems, identification, modeling or the online updating of
parameters can be problematic. In this paper a model-free predictor
(MFP) for a time series produced by an unknown nonlinear system or
process is derived using tracking theory. An equivalent derivation of
the MFP, based on the properties of the Newton form of the
interpolating polynomial, is also presented. The MFP is able to accurately predict
future values of a time series, is stable, has few tuning parameters and
is desirable for engineering applications due to its simplicity, fast
prediction speed and extremely low computational load. The
performance of the proposed MFP is demonstrated using the
prediction of the Dow Jones Industrial Average stock index.
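As an illustration of the Newton-form view mentioned in the abstract: extrapolating the interpolating polynomial through the last m equally spaced samples yields the classical one-step-ahead predictor below. This is the textbook polynomial extrapolator, not necessarily identical to the paper's MFP derived from tracking theory.

```python
from math import comb

def mfp_predict(history, order):
    """One-step-ahead prediction by extrapolating the Newton-form
    interpolating polynomial through the last `order` equally spaced
    samples: x[k+1] = sum_{j=1..m} (-1)^(j+1) * C(m, j) * x[k-j+1].
    With order=2 this reduces to the linear extrapolant 2x[k] - x[k-1]."""
    m = order
    x = list(history)[-m:]
    return sum((-1) ** (j + 1) * comb(m, j) * x[-j] for j in range(1, m + 1))
```

The predictor has a single tuning parameter (the order), no training phase, and a negligible computational load, matching the properties claimed in the abstract.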
Abstract: Electronic commerce is growing rapidly, with on-line
sales already heading for hundreds of billions of dollars per year.
Due to the huge amount of money transferred every day, an increased
security level is required. In this work we present the architecture of
an intelligent speaker verification system, which is able to accurately
verify the registered users of an e-commerce service using only their
voices as input. According to the proposed architecture, a
transaction-based e-commerce application should be complemented
by a biometric server where each customer's unique set of speech
models (voiceprint) is stored. The verification procedure asks the
user to pronounce a personalized sequence of digits; speech is
captured and voice features are extracted at the client side, and these
are then sent to the biometric server. The biometric server uses
pattern recognition to decide whether the received features match the
stored voiceprint of the customer the user claims to be, and grants
verification accordingly. The proposed architecture can provide
e-commerce applications with a higher degree of certainty regarding
the identity of a customer, and prevent impostors from executing
fraudulent transactions.
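The server-side matching decision can be sketched with a toy similarity score. This is purely illustrative: real speaker verification systems use statistical speaker models (e.g. Gaussian mixture models), and the threshold below is a made-up value.

```python
import numpy as np

def verify(features, voiceprint, threshold=0.8):
    """Toy matching step: cosine similarity between the feature vector
    received from the client and the stored voiceprint template.
    Illustrative only -- production systems score against statistical
    speaker models, and the threshold here is hypothetical."""
    a = np.asarray(features, dtype=float)
    b = np.asarray(voiceprint, dtype=float)
    score = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return bool(score >= threshold)
```

The threshold trades off false acceptances (impostors verified) against false rejections (genuine customers refused), the central tuning decision in any biometric verification service.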
Abstract: In this paper, an analysis of a target location estimation
system using the best linear unbiased estimator (BLUE) for high
performance radar systems is presented. In synthetic environments,
we are concerned with three key elements of radar system modeling
that allow a radar system to operate accurately in tactical situations
on virtual terrain. Radar Cross Section (RCS) modeling is used to
determine the actual amount of electromagnetic energy reflected
from a tactical object. The Pattern Propagation Factor (PPF) is an
attenuation coefficient of the radar equation that accounts for
reflection from the surface of the earth, diffraction, refraction and
scattering by the atmospheric environment. Clutter comprises the
unwanted echoes of electronic systems. For the fusion of the output
results from radar detection in the synthetic environment, the BLUE
is used and compared with the mean values of the simulation results.
Simulation results demonstrate the performance of the radar system.
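For independent, unbiased measurements of a scalar quantity with known variances, the BLUE reduces to the inverse-variance weighted mean, which is the basic fusion step alluded to above (a generic sketch, not the paper's full radar data-fusion pipeline):

```python
import numpy as np

def blue_fuse(z, var):
    """Best linear unbiased estimate of a scalar parameter from
    independent, unbiased measurements z with known variances var:
    the inverse-variance weighted mean, returned together with the
    variance of the fused estimate."""
    w = 1.0 / np.asarray(var, dtype=float)      # precision weights
    est = (w @ np.asarray(z, dtype=float)) / w.sum()
    return est, 1.0 / w.sum()
```

Because the fused variance 1/Σ(1/σᵢ²) is smaller than any individual σᵢ², the BLUE always outperforms a plain (unweighted) mean of the simulation results when the measurement variances differ.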
Abstract: The leisure boatbuilding industry has tight profit margins that demand boats be built to high quality but at low cost. Consequently, reduced design times combined with increased use of design for production can yield large benefits. The evolutionary nature of the boatbuilding industry means that new designs reuse much from previous vessels. With the increase in automated tools for concurrent engineering within structural design, it is important that these tools can reuse this information and subsequently feed it to designers. The ability to accurately gather this materials and parts data is also a key component of these tools. This paper therefore aims to develop an architecture, made up of neural networks and databases, to feed information effectively to designers based on previous design experience.
Abstract: Accurately predicting non-peak traffic is crucial to
daily traffic management for all forecasting models. In this paper,
least squares support vector machines (LS-SVMs) are investigated to
solve this practical problem. This is the first time the approach has
been applied and its forecast performance analyzed in this domain.
For comparison, two parametric and two non-parametric techniques
are selected because of their effectiveness proven in past research.
Having good generalization ability and guaranteeing a global
minimum, LS-SVMs perform better than the others. The sufficient
improvement in stability and robustness reveals that the approach is
practically promising.
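The LS-SVM training step amounts to solving a single linear system (the Suykens formulation), which is what gives the method its guaranteed global minimum. Below is a minimal regression sketch; the kernel width and regularization values are illustrative, not taken from the paper.

```python
import numpy as np

def rbf(A, B, sigma):
    """Gaussian (RBF) kernel matrix between row-wise sample sets."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    """Train an LS-SVM regressor by solving the KKT linear system
        [ 0    1^T         ] [b    ]   [0]
        [ 1    K + I/gamma ] [alpha] = [y]
    gamma (regularization) and sigma (kernel width) are illustrative
    tuning values, not the paper's settings."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, support values alpha

def lssvm_predict(Xtr, alpha, b, Xte, sigma):
    """Evaluate the trained model on new samples."""
    return rbf(Xte, Xtr, sigma) @ alpha + b
```

Unlike a standard SVM, every training point receives a (generally nonzero) alpha, trading sparsity for a closed-form solve with no quadratic programming.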
Abstract: A specially designed flat plate was mounted vertically
over the axial line in the wind tunnel of the Aerospace Department of
the Pusan National University. The plate is 2 m long, 0.8 m high and 8
cm thick. The measurements were performed in the velocity range
from 15 to 60 m/s. A sandpaper turbulator was placed close to the
plate nose to provide a fully developed turbulent boundary layer over
most of the plate. Strain balances were mounted in the trailing part
of the
plate to measure the skin friction drag over removable insertions of
0.55×0.25m2 size. A set of the insertions was designed and
manufactured: 3mm thick polished metal surface and three compliant
surfaces. The compliant surfaces were manufactured of a silicone
rubber Silastic® S2 (Dow Corning company). To modify the
viscoelastic properties of the rubber, its composition was varied: 90%
of the rubber + 10% catalyst (standard), 92.5% + 7.5% (weak), 85% +
15% (strong). The modulus of elasticity and the loss tangent of these
materials were measured accurately in the frequency range from
40 Hz to 3 kHz using the proposed technique.
Abstract: This article discusses the stress analysis and shape characteristics of the inflatable wing, and then introduces a design method for the inflatable wing that accurately approximates a standard airfoil. It analyses the aerodynamic characteristics of the inflatable wing using CFD and compares them with those of the standard airfoil; afterwards, the manufacture of the inflatable wing and a flight test are carried out.
Abstract: In recent years, much research has been actively conducted
to mine the exploding Web, especially User Generated Content (UGC)
such as weblogs, for knowledge about various phenomena and events
in the physical world, and Web services based on the Web-mined
knowledge have begun to be developed for the public. However, there
are few detailed investigations of how accurately Web-mined data
reflect physical-world data. It would be problematic to utilize
Web-mined data uncritically in public Web services without
sufficiently ensuring their accuracy. Therefore, this paper introduces
the simplest Web Sensor and a spatiotemporally-normalized Web
Sensor to extract spatiotemporal data about a target phenomenon
from weblogs retrieved by keyword(s) representing the target
phenomenon, and tries to validate the potential and reliability of the
Web-sensed spatiotemporal data through four kinds of granularity
analyses of the correlation coefficient with the daily, per-region
temperature, rainfall, snowfall and earthquake statistics of the Japan
Meteorological Agency as physical-world data: spatial granularity (a
region's population density), temporal granularity (time period, e.g.,
per day vs. per week), representation granularity (e.g., "rain" vs.
"heavy rain"), and media granularity (weblogs vs. microblogs such as
Tweets).
Abstract: This work presents a fusion of Log Gabor Wavelet
(LGW) and Maximum a Posteriori (MAP) estimator as a speech
enhancement tool for acoustical background noise reduction. The
probability density function (pdf) of the speech spectral amplitude is
approximated by a Generalized Laplacian Distribution (GLD).
Compared to earlier estimators, the proposed method estimates the
underlying statistical model more accurately by appropriately
choosing the model parameters of the GLD. Experimental results show
that the proposed estimator yields a higher improvement in
Segmental Signal-to-Noise Ratio (S-SNR) and lower Log-Spectral
Distortion (LSD) in two different noisy environments compared to
other estimators.
Abstract: The design and modeling of nonlinear systems require
knowledge of all internal acting parameters and effects. An empirical
alternative is to identify the system's transfer function from input and
output data as a black-box model. This paper presents a procedure
using a least squares algorithm for the identification of the
coefficients of a feed drive system in the time domain, using a
reduced model based on windowed input and output data. The
command and response of the axis are first measured over the first
4 ms, and then least squares is applied to estimate the transfer
function coefficients for this displacement segment. From the
identified coefficients, the subsequent command response segments
are estimated. The obtained results reveal a considerable potential of
the least squares method to identify the system's time-based
coefficients and to predict the command response accurately as
compared to measurements.
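The windowed least-squares identification described above can be sketched for a generic discrete ARX model. This is a generic sketch: the paper's reduced feed-drive model and 4 ms measurement window are specific to that setup.

```python
import numpy as np

def identify_arx(u, y, na=2, nb=2):
    """Least-squares identification of a discrete ARX model
        y[k] = a1*y[k-1] + ... + a_na*y[k-na]
             + b1*u[k-1] + ... + b_nb*u[k-nb]
    from windowed input/output data. Builds the regression matrix of
    past outputs and inputs and solves it in one least-squares step."""
    n = max(na, nb)
    rows = [[y[k - i] for i in range(1, na + 1)] +
            [u[k - i] for i in range(1, nb + 1)]
            for k in range(n, len(y))]
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(y[n:]), rcond=None)
    return theta  # [a1..a_na, b1..b_nb]
```

Once the coefficients are identified from one segment, the same difference equation can be iterated forward to predict the response to subsequent command segments, as the abstract describes.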
Abstract: This paper presents the potential of the smart phone to
support the mapping of indoor assets. The advantage of using a smart
phone to generate an indoor map is its ability to capture, store and
reproduce still or video images; indeed, most of us carry this
powerful gadget. The captured images are usually used by the
maintenance team to keep a record for future reference. Here, these
images are used to generate 3D models of an object precisely and
accurately, providing an efficient and effective solution for data
gathering. Thus, it could be a resource for an informative database in
asset management.
Abstract: The c-control chart assumes that process nonconformities follow a Poisson distribution. In actuality, however, this Poisson distribution does not always hold. A process control for semiconductors based on a Poisson distribution then underestimates the true average number of nonconformities and the process variance. Quality is described more accurately if a compound Poisson process is used for process control in this case. A cumulative sum (CUSUM) control chart is much better than a c control chart when a small shift must be detected. This study calculates one-sided CUSUM average run lengths (ARLs) using a Markov chain approach to construct a CUSUM control chart with an underlying Poisson-Gamma compound distribution for the failure mechanism. Moreover, an actual data set from a wafer plant is used to demonstrate the operation of the proposed model. The results show that the CUSUM control chart achieves significantly better performance than the EWMA chart.
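The Markov-chain ARL computation can be sketched for the plain Poisson case (the Brook-Evans method); the paper's compound Poisson-Gamma chart would replace the Poisson pmf below with the compound pmf.

```python
import math
import numpy as np

def cusum_arl_poisson(mu, kref, h, xmax=100):
    """Average run length (ARL) of a one-sided upper CUSUM chart for
    Poisson counts, S_k = max(0, S_{k-1} + X_k - kref), signalling when
    S_k >= h, computed by the Brook-Evans Markov-chain method.
    Integer kref and h make the chain exact; the Poisson-Gamma compound
    case would substitute its pmf for the Poisson pmf below."""
    # Poisson pmf computed recursively to avoid factorial overflow.
    pmf = [math.exp(-mu)]
    for x in range(1, xmax + 1):
        pmf.append(pmf[-1] * mu / x)
    # Transition matrix over the transient states S = 0, ..., h-1.
    Q = np.zeros((h, h))
    for i in range(h):
        for x, p in enumerate(pmf):
            j = max(0, i + x - kref)
            if j < h:
                Q[i, j] += p
    # (I - Q) L = 1 gives the expected time to absorption (signal).
    arl = np.linalg.solve(np.eye(h) - Q, np.ones(h))
    return arl[0]  # chart started from S_0 = 0
```

A well-designed chart has a large in-control ARL (few false alarms) and a small out-of-control ARL (fast detection of a shifted mean).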
Abstract: The classification of cancers into their corresponding cohorts has been a key area of research in bioinformatics, aiming at a better prognosis of the disease. The high dimensionality of gene data makes this a complex task and requires a significant-data identification technique to reduce the dimensionality and identify the significant information. In this paper, we propose a novel approach for the classification of oral cancer into metastasis-positive and metastasis-negative patients. We used significance analysis of microarrays (SAM) to identify the significant genes that constitute a gene signature. Three different gene signatures were identified using SAM from three different combinations of training datasets, and their classification accuracy was calculated on the corresponding testing datasets using k-Nearest Neighbour (kNN), Fuzzy C-Means Clustering (FCM), Support Vector Machine (SVM) and Backpropagation Neural Network (BPNN) classifiers. A final gene signature of only 9 genes was obtained from the above 3 individual gene signatures. The 9-gene signature's classification capability was compared using the same classifiers on the same testing datasets. The experimental results show that the 9-gene signature classified all samples in the testing dataset accurately, while the individual gene signatures could not classify all of them accurately.
Abstract: Sensorized instruments that accurately measure the interaction forces (between biological tissue and the instrument end-effector) during surgical procedures offer surgeons a greater sense of immersion during minimally invasive robotic surgery. Although there is ongoing research into force measurement involving surgical graspers, little corresponding effort has been devoted to the measurement of forces between scissor blades and tissue. This paper presents the design and development of a force measurement test apparatus, which will serve as a sensor characterization and evaluation platform. The primary aim of the experiments is to ascertain whether the system can differentiate between tissue samples with differing mechanical properties in a reliable, repeatable manner. Force-angular displacement curves highlight trends in the cutting process as well as the forces generated along the blade during a cutting procedure. Future applications of the test equipment will involve the assessment of new direct force sensing technologies for telerobotic surgery.
Abstract: Rapid Prototyping (RP) is a technology that produces models and prototype parts from 3D CAD model data, CT/MRI scan data, and model data created by 3D object digitizing systems. There are several RP processes, such as Stereolithography (SLA), Solid Ground Curing (SGC), Selective Laser Sintering (SLS), Fused Deposition Modeling (FDM) and 3D Printing (3DP); among them, the SLS and FDM processes are used to fabricate patterns for custom cranial implants. RP technology is useful in engineering and biomedical applications. In engineering it supports product design, tooling and manufacture; biomedical applications include the design and development of medical devices, instruments, prosthetics and implants, as well as the planning of complex surgical operations. The traditional approach limits the full appreciation of the movements of the various bony structures, so with custom implants produced this way it is difficult to measure the anatomy of the parts and to analyze changes in facial appearance accurately. Cranioplasty is the surgical correction of a defect in the cranial bone by implanting a metal or plastic replacement to restore the missing part. This paper presents a comparative study of the dimensional error of CAD and SLS RP models for the reconstruction of a cranial defect, comparing the virtual CAD model with the physical RP model of the defect.
Abstract: In gas lifted oil fields, the lift gas should be distributed optimally among the wells which share gas from a common source to maximize total oil production. One of the objectives of the paper is to show that a linear MPC consisting of a control objective and an economic objective can be used both as an optimizer and a controller for gas lifted systems. The MPC is based on linearized model of the oil field developed from first principles modeling. Simulation results show that the total oil production is increased by 3.4%. Difficulties in accurately measuring the bottom hole pressure using sensors in harsh operating conditions can be resolved by using an Unscented Kalman Filter (UKF) for estimation. In oil fields where input disturbance (total supply of gas) is not measured, UKF can also be used for disturbance estimation. Increased total oil production due to optimization leads to increased profit.
Abstract: Fast delay estimation methods, as opposed to
simulation techniques, are needed for incremental performance
driven layout synthesis. On-chip inductive effects are becoming
predominant in deep submicron interconnects due to increasing clock
speed and circuit complexity. Inductance causes noise in signal
waveforms, which can adversely affect the performance of the circuit
and signal integrity. Several approaches have been put forward which
consider the inductance for on-chip interconnect modelling. But at
even higher frequencies, of the order of a few GHz, the shunt
dielectric loss component becomes comparable to the other electrical
parameters in high-speed VLSI design. To cope with this effect, the
on-chip interconnect has to be modelled as a distributed RLCG line.
Elmore delay based methods, although
efficient, cannot accurately estimate the delay for RLCG interconnect
line. In this paper, an accurate analytical delay model has been
derived, based on first and second moments of RLCG
interconnection lines. The proposed model considers both the effect
of the inductance and conductance matrices. We have performed
simulations at the 0.18 μm technology node, and an error as low as
5% has been achieved with the proposed model when compared to
SPICE. The importance of the conductance matrix in interconnect
modelling has also been discussed, and it is shown that if G is
neglected in interconnect line modelling, the delay error can be as
high as 6% when compared to SPICE.
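For reference, the Elmore delay that the abstract contrasts against can be computed in a few lines for a lumped RC ladder. This is the plain RC special case only; the paper's model additionally captures L and G through the first two circuit moments of the RLCG line.

```python
def elmore_delay(R, C):
    """Elmore delay of an RC ladder driven at node 0: each series
    resistance R[k] sees the total capacitance downstream of it,
    so T_D = sum_k R[k] * sum(C[k:]). RC special case only -- the
    paper's RLCG model adds inductance and conductance via moments."""
    return sum(R[k] * sum(C[k:]) for k in range(len(R)))
```

Elmore delay equals the first moment of the impulse response, which is why it is cheap but pessimistic for inductive (RLCG) lines where the second moment matters.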