Abstract: Wellbore problems during a production/injection process can be diagnosed through temperature log analysis. Other applications of this kind of log analysis include evaluation of the fluid distribution along the wellbore and identification of anomalies encountered during production/injection operations. While the accuracy of such predictions is paramount, the common method of determining a wellbore temperature log relies on steady-state energy balance equations, which hardly describe the real conditions observed in typical flowing oil and gas wells during production operations, and thus increase the level of uncertainty. In this study, a practical method is proposed through the development of a simplified semi-analytical model for predicting the temperature profile along the wellbore. The model includes an overall heat transfer coefficient accounting for all modes of heat transfer, and focuses on predicting the temperature profile as a function of depth for injection/production wells. The model has been validated against results obtained from numerical simulation.
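The abstract does not reproduce the authors' semi-analytical model, but the kind of profile it predicts can be illustrated with a classical Ramey-type relation for an injection well, in which the flowing temperature relaxes toward the geothermal line over a characteristic distance A that lumps the overall heat transfer coefficient, mass rate, and heat capacity. This is a textbook sketch under those assumptions, not the authors' model, and all parameter values are illustrative.

```python
import math

def temp_at_depth(z, T_in, T_surf, grad, A):
    """Ramey-type flowing temperature of an injected fluid at depth z.

    T_in   : injection temperature at the surface
    T_surf : formation temperature at the surface
    grad   : geothermal gradient (temperature per unit depth)
    A      : relaxation distance lumping the overall heat transfer
             coefficient, mass flow rate and heat capacity (assumed)
    """
    # geothermal line minus a constant lag grad*A, plus an exponential
    # transient that matches T_in at the surface (z = 0)
    return (T_surf + grad * z - grad * A
            + (T_in - T_surf + grad * A) * math.exp(-z / A))

# illustrative case: 40 °C water injected into a 15 °C surface
# formation with a 0.03 °C/m gradient and A = 600 m
print(round(temp_at_depth(2000.0, 40.0, 15.0, 0.03, 600.0), 1))
```

At z = 0 the expression reduces to the injection temperature, and at depth the fluid tracks the geothermal line with a constant lag of grad·A, which is the behavior steady-state models of this family exhibit.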
Abstract: Modern building automation needs to deal with very
different types of demands, depending on the use of a building and the
persons acting in it. To meet the requirements of situation awareness
in modern building automation, scenario recognition becomes more
and more important in order to detect sequences of events and to react
to them properly. We present two concepts of scenario recognition
and their implementation, one based on predefined templates and the
other applying an unsupervised learning algorithm using statistical
methods. The implemented applications are described, and their advantages
and disadvantages are outlined.
Abstract: Support Vector Machine (SVM) is a statistical
learning tool initially developed by Vapnik in 1979 and later extended into the more comprehensive framework of structural risk minimization (SRM). SVM plays an increasing role in detection problems across various engineering fields, notably in
statistical signal processing, pattern recognition, image analysis, and
communication systems. In this paper, SVM was applied to the
detection of SAR (synthetic aperture radar) images in the presence of
partially developed speckle noise. Simulations were carried out for both single-look and multi-look speckle models to give a complete overview of, and insight into, the proposed SVM-based detector. The
structure of the SVM was derived and applied to real SAR images
and its performance in terms of the mean square error (MSE) metric
was calculated. We showed that the SVM-detected SAR images have
a very low MSE and are of good quality. The quality of the
processed speckled images improved for the multi-look model.
Furthermore, the contrast of the SVM detected images was higher
than that of the original non-noisy images, indicating that the SVM
approach increased the distance between the pixel reflectivity levels
(the detection hypotheses) in the original images.
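The SVM detector itself is not reproduced in the abstract; the sketch below only illustrates the speckle models and the MSE metric it refers to, under the standard assumption that fully developed single-look intensity speckle is multiplicative unit-mean exponential noise and that L-look speckle averages L independent draws. The two-level "image" and all parameters are illustrative.

```python
import random

def speckle(clean, looks, rng):
    """Multiply each pixel by unit-mean speckle: the average of `looks`
    independent exponential(1) draws (L-look intensity model)."""
    out = []
    for x in clean:
        s = sum(rng.expovariate(1.0) for _ in range(looks)) / looks
        out.append(x * s)
    return out

def mse(a, b):
    """Mean square error between two equal-length pixel sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

rng = random.Random(0)
# toy scene with two reflectivity levels (the detection hypotheses)
clean = [1.0 if i % 2 else 4.0 for i in range(5000)]
single = speckle(clean, 1, rng)   # single-look: heaviest speckle
multi4 = speckle(clean, 4, rng)   # 4-look: averaged, milder speckle
print(mse(single, clean) > mse(multi4, clean))  # multi-look lowers MSE
```

Averaging L looks divides the per-pixel speckle variance by L, which is why the abstract's multi-look results show improved quality.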
Abstract: In the last few years, the Semantic Web has gained scientific acceptance as a means of identifying relationships in knowledge bases, widely known as semantic associations. Querying complex relationships between entities is a strong requirement for many applications in analytical domains. In bioinformatics, for example, it is critical to extract interactions between proteins. Currently, such queries typically return paths between connected entities in the data graph. However, they do not always satisfy a user's need for the best association, or a limited set of best associations, because they consider all existing paths but ignore path evaluation. In this paper, we present an approach for supporting association discovery queries. Our proposal includes (i) a query language, PmSPRQL, which provides multiparadigm query expressions for association extraction, and (ii) quantification measures that ease the process of association ranking. The originality of our proposal is demonstrated by a performance evaluation of our approach on real-world datasets.
Abstract: Bootstrapping has gained popularity in tests of hypotheses as an alternative to using the asymptotic distribution when one is not sure of the distribution of the test statistic under the null hypothesis. The method, in general, has two variants: the parametric and the nonparametric approach. However, issues concerning the reliability of the method arise in many applications. This paper addresses the reliability issue by establishing a reliability measure in terms of quantiles with respect to the asymptotic distribution, when the latter is approximately correct. The test of hypothesis used is the F-test. The simulation results show that using nonparametric bootstrapping in the F-test gives better reliability than parametric bootstrapping at relatively higher degrees of freedom.
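The paper's reliability measure is not reproduced in the abstract; as an illustration of the nonparametric variant it studies, the sketch below bootstraps the F statistic (a variance ratio) by resampling from pooled, centred observations so that the null hypothesis of equal variances holds in the resamples. Function names and parameters are illustrative.

```python
import random
import statistics

def f_stat(x, y):
    """F statistic as the ratio of two sample variances."""
    return statistics.variance(x) / statistics.variance(y)

def bootstrap_f_pvalue(x, y, n_boot=2000, seed=0):
    """Nonparametric bootstrap p-value for the F-test.

    Resamples both groups from the pooled, mean-centred observations
    (so equal variances hold under resampling) and counts how often
    the resampled F statistic exceeds the observed one."""
    rng = random.Random(seed)
    pooled = ([v - statistics.mean(x) for v in x]
              + [v - statistics.mean(y) for v in y])
    f_obs = f_stat(x, y)
    hits = 0
    for _ in range(n_boot):
        bx = [rng.choice(pooled) for _ in range(len(x))]
        by = [rng.choice(pooled) for _ in range(len(y))]
        if f_stat(bx, by) >= f_obs:
            hits += 1
    return hits / n_boot

rng = random.Random(42)
same = [rng.gauss(0, 1) for _ in range(30)]
wide = [rng.gauss(0, 3) for _ in range(30)]
# a large variance ratio should yield a small bootstrap p-value
print(bootstrap_f_pvalue(wide, same))
```

A parametric variant would instead resample from a fitted normal model; comparing the quantiles of the two bootstrap distributions against the asymptotic F distribution is the kind of reliability check the abstract describes.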
Abstract: Infrared focal plane arrays (IRFPA) sensors, due to
their high sensitivity, high frame frequency and simple structure, have
become the most prominently used detectors in military applications.
However, they suffer from a common problem called the fixed pattern
noise (FPN), which severely degrades image quality and limits the
infrared imaging applications. It is therefore necessary to perform
non-uniformity correction (NUC) on IR images. NUC algorithms fall into
two main categories: calibration-based and scene-based. Both have
shortcomings, so a novel non-uniformity correction algorithm based on
non-linear fitting is proposed, combining the advantages of the two
categories. Experimental results show that the proposed algorithm
achieves good NUC performance with a lower non-uniformity ratio.
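The calibration-based category the abstract refers to is classically represented by the two-point correction: each pixel is assumed to respond linearly, and a per-pixel gain and offset are computed from two uniform blackbody reference frames; the paper's non-linear fit generalizes this linear assumption. The sketch below shows the linear baseline only, with an illustrative four-pixel "detector".

```python
def two_point_nuc(frame_low, frame_high, t_low, t_high):
    """Per-pixel gain/offset from two uniform blackbody frames.

    Under a linear pixel-response assumption, maps every pixel so that
    all pixels agree on the two reference levels t_low and t_high."""
    gains, offsets = [], []
    for lo, hi in zip(frame_low, frame_high):
        g = (t_high - t_low) / (hi - lo)   # per-pixel gain correction
        gains.append(g)
        offsets.append(t_low - g * lo)     # per-pixel offset correction
    return gains, offsets

def correct(frame, gains, offsets):
    """Apply the per-pixel linear correction to a raw frame."""
    return [g * p + o for p, g, o in zip(frame, gains, offsets)]

# toy 4-pixel detector with fixed-pattern gain/offset errors (assumed)
true_gain = [0.8, 1.1, 0.9, 1.2]
true_off = [5.0, -3.0, 1.0, 0.0]

def sense(level):
    """Raw readings of a perfectly uniform scene at `level`."""
    return [g * level + o for g, o in zip(true_gain, true_off)]

gains, offsets = two_point_nuc(sense(20.0), sense(80.0), 20.0, 80.0)
print(correct(sense(50.0), gains, offsets))  # ~[50.0, 50.0, 50.0, 50.0]
```

When the true pixel response drifts or is non-linear, this two-point map leaves residual fixed-pattern noise, which is the shortcoming a non-linear fit aims to remove.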
Abstract: For the past couple of decades, weak signal detection has been of crucial importance in various engineering and scientific applications, in areas such as wireless communication, radar, aerospace engineering, and control systems. Weak signal detection usually requires a phase-sensitive detector and a demodulation module to detect and analyze the signal. This article gives a preamble to an intrusion detection system that can effectively detect a weak signal within a multiplexed signal. By carefully inspecting and analyzing the respective signal, the system can successfully indicate any peripheral intrusion. The intrusion detection system (IDS) is a comprehensive and straightforward approach to detecting and analyzing any signal that is weakened and garbled due to a low signal-to-noise ratio (SNR), and is of significant importance in applications such as peripheral security systems.
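Phase-sensitive detection of the kind mentioned above can be sketched as a lock-in measurement: the noisy input is correlated with in-phase and quadrature references at the known frequency, and averaging acts as the low-pass filter that pulls the weak tone out of the noise. The tone frequency, sampling rate, and noise level below are illustrative assumptions, not parameters from the paper.

```python
import math
import random

def lockin_amplitude(samples, f_ref, f_s):
    """Phase-sensitive detection: correlate the input with in-phase and
    quadrature references at f_ref (sample rate f_s) and combine the two
    channels to recover the tone amplitude regardless of its phase."""
    n = len(samples)
    i_sum = q_sum = 0.0
    for k, x in enumerate(samples):
        phase = 2 * math.pi * f_ref * k / f_s
        i_sum += x * math.cos(phase)
        q_sum += x * math.sin(phase)
    # the average over n samples is the low-pass step; the factor 2
    # undoes the 1/2 from mixing a sinusoid with itself
    return 2 * math.hypot(i_sum / n, q_sum / n)

rng = random.Random(0)
f_s, f_ref, amp = 10_000.0, 137.0, 0.05        # weak 0.05-amplitude tone
sig = [amp * math.sin(2 * math.pi * f_ref * k / f_s) + rng.gauss(0, 1.0)
       for k in range(100_000)]                # SNR well below 0 dB
print(lockin_amplitude(sig, f_ref, f_s))
```

Even though the noise standard deviation is twenty times the tone amplitude, the long correlation recovers the amplitude closely, which is the essence of detecting a garbled, low-SNR signal.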
Abstract: This paper presents three new methodologies for the basic fuzzy set operations, aimed at finding new ways of computing the union (maximum) and intersection (minimum) membership values by taking into account all of the membership values in a fuzzy set. The new methodologies are conceptually simple and easy to apply, and are illustrated with a variety of problems, such as the Cartesian product of two fuzzy sets, the max-min composition of two fuzzy relations in different product spaces, and an inverted pendulum application used to assess the impact of the new methodologies. The results clearly indicate a difference based on the nature of the fuzzy sets under consideration, and will hence be highly useful in applications where different membership values have a significant impact on the behavior of the system.
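The conventional operations the paper's new methodologies are measured against can be sketched directly: the fuzzy Cartesian product takes the minimum of the two membership values, and the max-min composition of relations takes (R∘S)(x, z) = max over y of min(R(x, y), S(y, z)). The membership values below are illustrative.

```python
def cartesian_product(A, B):
    """Fuzzy Cartesian product: membership min(A(x), B(y))."""
    return [[min(a, b) for b in B] for a in A]

def max_min_composition(R, S):
    """Standard max-min composition of fuzzy relations:
    (R o S)[i][k] = max over j of min(R[i][j], S[j][k])."""
    rows, inner, cols = len(R), len(S), len(S[0])
    return [[max(min(R[i][j], S[j][k]) for j in range(inner))
             for k in range(cols)] for i in range(rows)]

A = [0.2, 0.7, 1.0]                  # membership values of fuzzy set A
B = [0.5, 0.9]                       # membership values of fuzzy set B
R = cartesian_product(A, B)          # 3x2 fuzzy relation on A x B
S = [[0.6, 0.3], [0.8, 1.0]]         # a second 2x2 fuzzy relation
print(max_min_composition(R, S))     # [[0.2, 0.2], [0.7, 0.7], [0.8, 0.9]]
```

Because max and min look only at one pair of values at a time, they ignore the rest of the membership function, which is exactly the limitation that motivates operations considering the entire set of membership values.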
Abstract: Simulation is a helpful and valuable tool in manufacturing. It can be used in industrial settings, allowing a system's behavior to be studied and tested. Simulation provides a low-cost, secure, and fast analysis tool, and offers benefits that can be realized under many different system configurations. Topics discussed include the applications, modeling, validation, software, and benefits of simulation. This paper provides a comprehensive literature review of research efforts in simulation.
Abstract: This paper describes the results of an extensive study
and comparison of popular hash functions SHA-1, SHA-256,
RIPEMD-160 and RIPEMD-320 with JERIM-320, a 320-bit hash
function. The compression functions of hash functions like SHA-1
and SHA-256 are designed using serial successive iteration whereas
those like RIPEMD-160 and RIPEMD-320 are designed using two
parallel lines of message processing. JERIM-320 uses four parallel lines of message processing, resulting in a higher level of security than the other hash functions at comparable speed and memory requirements. The performance of these functions has been evaluated by practical implementation as well as by step-computation methods. JERIM-320 proves to be secure and ensures message integrity to a higher degree. The focus of this work is to establish JERIM-320 as an alternative to present-day hash functions for fast-growing Internet applications.
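JERIM-320 has no standard-library implementation, but the established functions it is compared against can be exercised directly with Python's hashlib; the digest widths are fixed by their specifications (SHA-1: 160 bits, SHA-256: 256 bits; RIPEMD-160 availability in hashlib depends on the underlying OpenSSL build, so the sketch sticks to the SHA family).

```python
import hashlib

msg = b"The quick brown fox jumps over the lazy dog"

for name in ("sha1", "sha256"):
    h = hashlib.new(name, msg)
    # digest_size is in bytes: 20 bytes = 160 bits, 32 bytes = 256 bits
    print(name, h.digest_size * 8, h.hexdigest()[:16])

# integrity check: flipping one byte of the message changes the digest
tampered = b"The quick brown fox jumps over the lazy cog"
print(hashlib.sha256(msg).digest() == hashlib.sha256(tampered).digest())
```

Speed and memory comparisons like those in the paper would time such calls over large inputs; the parallel-line structure of RIPEMD-320 and JERIM-320 affects internal state size rather than the calling interface.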
Abstract: Delivering streaming video over wireless is an
important component of many interactive multimedia applications
running on personal wireless handset devices. Such personal devices
have to be inexpensive, compact, and lightweight. But wireless
channels have a high channel bit error rate and limited bandwidth.
Delay variation of packets due to network congestion and the high bit
error rate greatly degrades the quality of video at the handheld
device. Mobile access to multimedia content therefore requires video transcoding functionality at the edge of the mobile network for interworking with heterogeneous networks and services, and guaranteeing the quality of service (QoS) delivered to the mobile user requires a robust and efficient transcoding scheme in the mobile multimedia transport network. This paper examines the challenges and limitations that video transcoding schemes face in such networks. A mobile and wireless video transcoding approach based on handheld resources, network conditions, and content is then proposed to support high-QoS applications. Extensive experiments, designed to verify the robustness of the proposed approach, demonstrate excellent performance; results for various video clips with different bit rates and frame rates are provided.
Abstract: In this paper, design, fabrication and coupled
multifield analysis of hollow out-of-plane silicon microneedle array
with piezoelectrically actuated microfluidic device for transdermal
drug delivery (TDD) applications are presented. The fabrication process of the silicon microneedle array is first carried out by a series of combined isotropic and anisotropic etching steps using inductively coupled plasma (ICP) etching technology. A coupled multifield analysis of the MEMS-based piezoelectrically actuated device with an integrated 2×2 silicon microneedle array is then presented. To predict the stress distribution and model the fluid flow in the coupled-field analysis, finite element (FE) and computational fluid dynamics (CFD) analyses were performed using ANSYS rather than analytical methods. Static and transient CFD analyses were performed to predict the fluid flow through the microneedle array, with inlet pressures from 10 kPa to 150 kPa considered for the static CFD analysis. In the lumen region, a fluid flow rate of 3.2946 μL/min is obtained at 150 V for the 2×2 microneedle array. In the present study, the authors have simulated the structural, piezoelectric, and CFD behavior of a three-dimensional model of the piezoelectrically actuated microfluidic device integrated with the 2×2 microneedle array.
Abstract: The wide adoption of concurrent programming practices in software development leads to various concurrency errors, among which data races are the most important. Java provides strong support for concurrent programming through its concurrency packages. Aspect-oriented programming (AOP) is a modern programming paradigm that facilitates the runtime interception of events of interest and can be used effectively to handle concurrency problems. AspectJ, an aspect-oriented extension to Java, makes it possible to apply AOP concepts to data race detection. Volatile variables are usually considered thread-safe, but they can become candidates for data races if non-atomic operations are performed on them concurrently. Various data race detection algorithms have been proposed in the past, but this issue of volatility and atomicity remains unaddressed. The aim of this research is to propose conditions for detecting data races on volatile fields in Java programs, taking into account the atomicity support in the Java concurrency packages and making use of pointcuts. Two simple test programs demonstrate the results, which are verified on two different Java Development Kits (JDKs) for comparison.
Abstract: Web services provide significant new benefits for SOA-based applications, but they also expose significant new security risks. There is a huge number of WS security standards and processes, yet a comprehensive approach offering methodical guidance for the construction of secure WS-based SOAs is still lacking. Thus, the main objective of this paper is to address this need by presenting a comprehensive method for guaranteeing Web services security in SOA. The proposed method defines
three stages, Initial Security Analysis, Architectural Security
Guaranty and WS Security Standards Identification. These facilitate,
respectively, the definition and analysis of WS-specific security
requirements, the development of a WS-based security architecture
and the identification of the related WS security standards that the
security architecture must articulate in order to implement the
security services.
Abstract: Talk of technological convergence has been around for almost twenty years; today, the Internet has made it possible. And this is not only a technical evolution: the way it has changed our lives is reflected in the variety of applications, services, and technologies used in day-to-day life. Such benefits impose ever more requirements on heterogeneous and unreliable IP networks.
This paper outlines the QoS management system developed in the NetQoS [1] project. It describes the overall architecture of a management system for heterogeneous networks and proposes automated multi-layer QoS management. The paper focuses on the structure of the most crucial modules of the system, which enable autonomous, multi-layer provisioning and dynamic adaptation.
Abstract: The use of un-activated bentonite, and un-activated
bentonite blended with limestone for the treatment of acid mine
drainage (AMD) was investigated. Batch experiments were
conducted in a 5 L PVC reactor. Un-activated bentonite on its own
did not effectively neutralize and remove heavy metals from AMD.
The final pH obtained was below 4 and the metal removal efficiency
was below 50% for all the metals when bentonite solid loadings of 1,
5 and 10% were used. With un-activated bentonite (1%) blended with
1% limestone, the final pH obtained was approximately 7 and metal
removal efficiencies were greater than 60% for most of the metals.
The Langmuir isotherm gave the best fit to the experimental data, with a correlation coefficient (R²) very close to 1. It was thus concluded that un-activated bentonite blended with limestone is suitable for potential applications in removing heavy metals from, and neutralizing, AMD.
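A Langmuir fit of the kind reported can be sketched with the common linearization C/q = C/q_m + 1/(K·q_m), estimating the monolayer capacity q_m and affinity constant K by least squares on (C, C/q) and reporting the R² of that line. The data below is synthetic, generated from an assumed q_m and K for illustration, not the paper's measurements.

```python
def fit_langmuir(C, q):
    """Least-squares fit of the linearised Langmuir isotherm
    C/q = C/q_m + 1/(K*q_m); returns (q_m, K, R^2 of the line)."""
    x, y = C, [c / qi for c, qi in zip(C, q)]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx                    # = 1/q_m
    intercept = my - slope * mx          # = 1/(K*q_m)
    r2 = sxy ** 2 / (sxx * sum((b - my) ** 2 for b in y))
    return 1 / slope, slope / intercept, r2

# synthetic equilibrium data from q = q_m*K*C/(1 + K*C), q_m=25, K=0.4
q_m, K = 25.0, 0.4
C = [2.0, 5.0, 10.0, 20.0, 50.0]         # equilibrium concentrations
q = [q_m * K * c / (1 + K * c) for c in C]  # adsorbed amounts
print(fit_langmuir(C, q))  # ~ (25.0, 0.4, 1.0)
```

With real batch data, the same R² computed for competing isotherms (e.g. Freundlich) is what supports a "Langmuir fits best" conclusion like the one above.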
Abstract: This paper focuses on the probabilistic numerical solution of problems in biomechanics and mining. Applications of the Simulation-Based Reliability Assessment (SBRA) method are presented in the design of external fixators applied in traumatology and orthopaedics (these fixators can be used for the treatment of open and unstable fractures, etc.) and in the solution of a hard rock (ore) disintegration process (i.e. the bit moves into the ore and subsequently disintegrates it). The results are compared with experiments, and a new design of the excavation tool is proposed.
Abstract: The Virtual Reality Modelling Language (VRML) is a description language belonging to the Window-on-World class of virtual reality systems. A file in VRML format can be interpreted by a VRML browser as a three-dimensional scene. VRML was created with the aim of representing virtual reality on the Internet more easily; the development of 3D graphics is connected with the Silicon Graphics Corporation. VRML 2.0 is a file format for describing interactive 3D scenes and objects. It can be used in collaboration with the WWW and for creating complex 3D representations of scenes, products, or VR applications, and it can represent both static and animated objects. An interesting application of VRML is the presentation of manufacturing systems.
Abstract: A unique combination of adsorption and
electrochemical regeneration with a proprietary adsorbent material
called Nyex 100 was introduced at the University of Manchester for
wastewater treatment applications. Nyex 100 is based on a graphite intercalation compound; it is a non-porous, electrically conducting adsorbent material. This material exhibits a very small BET surface area (2.75 m²g⁻¹) and, consequently, small adsorptive capacities were obtained for the adsorption of various organic pollutants. This work aims to develop a composite adsorbent material capable of electrochemical regeneration coupled with improved adsorption characteristics. An organic dye, Acid Violet 17, was used as the standard organic pollutant. The developed composite material was successfully regenerated electrochemically using a DC current of 1 A for 60 minutes, and the regeneration efficiency was maintained at around 100% over five adsorption-regeneration cycles.
Abstract: Web applications have become very complex and crucial, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering), so the scientific community has focused its attention on Web application design, development, analysis, and testing, studying and proposing methodologies and tools. This paper proposes an approach to automatic multi-dimensional concern mining for Web applications, based on concept analysis, impact analysis, and token-based concern identification. The approach lets the user analyse and traverse the Web software relevant to a particular concern (concept, goal, purpose, etc.) via multi-dimensional separation of concerns, in order to document, understand, and test Web applications. The technique was developed in the context of the WAAT (Web Applications Analysis and Testing) project; a semi-automatic tool to support it is currently under development.