Abstract: In this paper we discuss the development of an Augmented Reality (AR)-based scientific visualization system prototype that supports identification, localization, and 3D visualization of oil-leakage sensor datasets. Sensors generate significant amounts of multivariate data during both normal and leak situations. We have therefore developed a data model to manage such data effectively and to enhance the computational support needed for effective data exploration. A challenge of this approach is to reduce the data inefficiency caused by the disparate, repeated, inconsistent, and missing attributes of most available sensor datasets. To address this challenge, this paper aims to develop an AR-based scientific visualization interface that automatically identifies, localizes, and visualizes all data relevant to a selected region of interest (ROI) along the virtual pipeline network. The necessary system architectural support, as well as the interface requirements for such visualizations, is also discussed in this paper.
Abstract: Current advancements in nanotechnology are dependent
on the capabilities that can enable nano-scientists to extend their eyes
and hands into the nano-world. For this purpose, a haptics-based system
(haptic devices are capable of recreating tactile or force sensations) for
the Atomic Force Microscope (AFM) is proposed. The system enables
the nano-scientists to touch and feel the sample surfaces, viewed
through AFM, in order to provide them with better understanding of
the physical properties of the surface, such as roughness, stiffness and
shape of molecular architecture. At this stage, the proposed work uses
offline images produced using AFM and performs image analysis to
create virtual surfaces suitable for haptic force analysis. The research
work is being extended from an offline to an online process, where
interaction will be performed directly on the material surface for
realistic analysis.
Abstract: A statistical optimization of the saccharification
process of empty fruit bunch (EFB) was studied. The statistical
analysis was done by applying a face-centered central composite design
(FCCCD) under response surface methodology (RSM). In this
investigation, EFB dose, enzyme dose, and saccharification period were
examined, and a maximum 53.45% (w/w) yield of reducing sugar was
found with 4% (w/v) EFB and 10% (v/v) enzyme after 120 hours of
incubation. It can be calculated that the conversion rate of the
cellulose content of the substrate is more than 75% (w/w), which can
be considered a remarkable achievement. All linear, quadratic, and
interaction coefficients were found to be highly significant, except for
two: one quadratic and one interaction coefficient. The coefficient of
determination (R2) is 0.9898, which confirms a satisfactory fit and
indicates that approximately 98.98% of the variability in the dependent
variable, the saccharification of EFB, can be explained by this model.
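As a concrete illustration of the FCCCD/RSM workflow described above, the sketch below fits a full second-order response surface to a synthetic face-centered design by least squares and reports R². The factor effects and noise level are invented for illustration; they are not the paper's data.

```python
import random

def solve_linear(M, v):
    """Solve M b = v by Gaussian elimination with partial pivoting."""
    n = len(M)
    A = [row[:] + [rhs] for row, rhs in zip(M, v)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        for r in range(c + 1, n):
            f = A[r][c] / A[c][c]
            for k in range(c, n + 1):
                A[r][k] -= f * A[c][k]
    b = [0.0] * n
    for r in range(n - 1, -1, -1):
        b[r] = (A[r][n] - sum(A[r][k] * b[k] for k in range(r + 1, n))) / A[r][r]
    return b

def quadratic_terms(x):
    """Full second-order model terms for three coded factors."""
    x1, x2, x3 = x
    return [1, x1, x2, x3, x1 * x1, x2 * x2, x3 * x3, x1 * x2, x1 * x3, x2 * x3]

def fit_rsm(X, y):
    """Least-squares fit via normal equations; returns (coefficients, R2)."""
    A = [quadratic_terms(x) for x in X]
    p = len(A[0])
    AtA = [[sum(r[i] * r[j] for r in A) for j in range(p)] for i in range(p)]
    Aty = [sum(r[i] * yi for r, yi in zip(A, y)) for i in range(p)]
    b = solve_linear(AtA, Aty)
    pred = [sum(c * t for c, t in zip(b, quadratic_terms(x))) for x in X]
    mean = sum(y) / len(y)
    ss_res = sum((yi - pi) ** 2 for yi, pi in zip(y, pred))
    ss_tot = sum((yi - mean) ** 2 for yi in y)
    return b, 1.0 - ss_res / ss_tot

# Face-centered CCD for 3 coded factors: 8 factorial + 6 face-center + 6 center runs.
design = [[i, j, k] for i in (-1, 1) for j in (-1, 1) for k in (-1, 1)]
design += [[s if a == d else 0 for a in range(3)] for d in range(3) for s in (-1, 1)]
design += [[0, 0, 0]] * 6

def true_response(x):  # hypothetical quadratic truth, for illustration only
    return 40 + 5 * x[0] + 8 * x[1] + 3 * x[2] - 4 * x[0] ** 2 - 2 * x[1] ** 2 + 1.5 * x[0] * x[1]

rng = random.Random(0)
y = [true_response(x) + rng.gauss(0, 0.3) for x in design]
coeffs, r2 = fit_rsm(design, y)
```

With a clean quadratic truth and small noise, R² comes out close to 1, mirroring the high coefficient of determination reported in the abstract.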
Abstract: This paper presents an algorithm based on the
wavelet decomposition, for feature extraction from the ECG signal
and recognition of three types of Ventricular Arrhythmias using
neural networks. A set of Discrete Wavelet Transform (DWT)
coefficients, which contain the maximum information about the
arrhythmias, is selected from the wavelet decomposition. A novel
clustering algorithm based on a nature-inspired method, Ant Colony
Optimization, is then developed for classifying arrhythmia types.
The algorithm is applied to ECG recordings from the MIT-BIH
arrhythmia and malignant ventricular arrhythmia databases. The
Daubechies 4 (db4) wavelet is used in our algorithm. The wavelet
decomposition enabled us to perform the task efficiently and
produced reliable results.
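The feature-extraction step can be illustrated with a minimal pure-Python sketch of one level of the Daubechies-4 decomposition the abstract mentions; the paper's coefficient-selection and ACO clustering stages are not reproduced here.

```python
import math

def db4_dwt(x):
    """One level of the Daubechies-4 DWT with periodic extension.

    Returns (approx, detail) coefficient lists; len(x) must be even.
    Further levels are obtained by re-applying this to the approximation.
    """
    s3 = math.sqrt(3.0)
    norm = 4.0 * math.sqrt(2.0)
    # Orthonormal db4 low-pass filter and its alternating-flip high-pass.
    h = [(1 + s3) / norm, (3 + s3) / norm, (3 - s3) / norm, (1 - s3) / norm]
    g = [h[3], -h[2], h[1], -h[0]]
    n = len(x)
    approx, detail = [], []
    for i in range(n // 2):  # convolve and downsample by 2, wrapping at the ends
        approx.append(sum(h[k] * x[(2 * i + k) % n] for k in range(4)))
        detail.append(sum(g[k] * x[(2 * i + k) % n] for k in range(4)))
    return approx, detail
```

Because the filter bank is orthonormal, the transform preserves signal energy, and for a slowly varying signal most of that energy concentrates in the approximation band.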
Abstract: This paper presents a case study that uses process-oriented
simulation to identify bottlenecks in the service delivery
system in an emergency department of a hospital in the United Arab
Emirates. Using results of the simulation, response surface models
were developed to explain patient waiting time and the total time
patients spend in the hospital system. Results of the study could be
used as a service improvement tool to help hospital management in
improving patient throughput and service quality in the hospital
system.
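As an illustration of the discrete-event logic such a simulation rests on, the sketch below estimates average patient waiting time for a drastically simplified single-stage, single-server approximation using the Lindley recursion. The exponential time assumptions and parameter values are illustrative, not the hospital's data.

```python
import random

def simulate_waiting(n_patients, mean_interarrival, mean_service, seed=1):
    """Average waiting time in a single-server queue via the Lindley
    recursion: W[k+1] = max(0, W[k] + S[k] - A[k+1]).
    Interarrival and service times are drawn as exponentials (assumption).
    """
    rng = random.Random(seed)
    w, total = 0.0, 0.0
    for _ in range(n_patients):
        total += w                                # record this patient's wait
        s = rng.expovariate(1.0 / mean_service)   # service time of this patient
        a = rng.expovariate(1.0 / mean_interarrival)  # gap to the next arrival
        w = max(0.0, w + s - a)
    return total / n_patients
```

Raising server utilization (longer service relative to arrivals) sharply increases waiting time, which is the bottleneck behaviour the case study's response surface models capture.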
Abstract: The increasing popularity of multimedia applications, especially in image processing, places a great demand on efficient data storage and transmission techniques. Network communications such as wireless networks can easily be intercepted, causing confidential information to be leaked. Unfortunately, conventional compression and encryption methods are too slow to carry out real-time secure image processing. In this research, the Embedded Zerotree Wavelet (EZW) encoder, which is specially designed for wavelet compression, is examined. Based on this algorithm, three methods are proposed to reduce processing time and space while providing security protection strong enough to protect the data.
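The zerotree idea at the core of EZW can be sketched as a single dominant pass that classifies wavelet coefficients against a threshold. This is a minimal illustration of Shapiro's symbol alphabet with a plain raster scan; the successive thresholds, subordinate passes, and the paper's three proposed speed/space/security methods are omitted.

```python
def dominant_pass(coeffs, threshold):
    """One EZW dominant pass over a square wavelet-coefficient array.

    Labels each coefficient:
      'P'  significant positive, 'N' significant negative,
      'ZT' zerotree root (itself and all descendants insignificant),
      'IZ' isolated zero (insignificant but with a significant descendant).
    Uses the usual parent (i, j) -> children (2i, 2j)... mapping; the DC
    coefficient (0, 0) has children (0, 1), (1, 0), (1, 1).
    """
    n = len(coeffs)

    def children(i, j):
        if (i, j) == (0, 0):
            return [(0, 1), (1, 0), (1, 1)]
        kids = [(2 * i, 2 * j), (2 * i, 2 * j + 1),
                (2 * i + 1, 2 * j), (2 * i + 1, 2 * j + 1)]
        return [(a, b) for a, b in kids if a < n and b < n]

    def subtree_insignificant(i, j):
        if abs(coeffs[i][j]) >= threshold:
            return False
        return all(subtree_insignificant(a, b) for a, b in children(i, j))

    symbols = {}
    skip = set()   # descendants of a zerotree root are not coded at all
    for i in range(n):
        for j in range(n):
            if (i, j) in skip:
                continue
            c = coeffs[i][j]
            if abs(c) >= threshold:
                symbols[(i, j)] = 'P' if c > 0 else 'N'
            elif subtree_insignificant(i, j):
                # In the finest sub-bands a coefficient has no descendants,
                # so 'ZT' there degenerates to a plain zero symbol.
                symbols[(i, j)] = 'ZT'
                stack = children(i, j)
                while stack:
                    node = stack.pop()
                    skip.add(node)
                    stack.extend(children(*node))
            else:
                symbols[(i, j)] = 'IZ'
    return symbols
```

Zerotree roots let whole subtrees of insignificant coefficients be coded with a single symbol, which is where EZW gets its compression efficiency.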
Abstract: This paper investigates the effects of lubrication on
the quantity of heat emitted by two spur gears. Operating the system
with and without lubrication affected the quantity of heat induced in
the gearbox (oil, bearings, gears). Both lubrication and motor speed
affected gear performance. The research investigated lubrication of the
system with and without loading, as well as gear wear and bearing
condition. The gearbox investigated includes the motor, a pump, two
spur gears, and two shafts; speed was changed using pulleys and belts.
The load used was equal to the weight of one gear. The lubrication
mechanism used a jet system (upper and lower jets), with the jets
directed perpendicular to the contact line between two teeth. The
results of this work show that lubrication is the vital parameter
affecting the performance and durability of gears and bearings. In
macroscopic observation, we noted that bearing damage occurred in the
absence of lubrication, as did abrasive wear of the teeth. Higher motor
speed without lubrication increased the noise, but in the presence of
lubrication the noise decreased.
Abstract: The current paper presents a numerical approach to solving conjugate heat transfer problems. A heat conduction code is coupled internally with a computational fluid dynamics solver to develop a coupled conjugate heat transfer solver. A methodology for treating non-matching meshes at the interface is also proposed. Validation results of 1D and 2D cases for the developed conjugate heat transfer code show close agreement with analytical solutions.
Abstract: This paper describes a rapid prototyping (RP)
technology for forming a hydroxyapatite (HA) bone scaffold model.
The HA powder and a silica sol are mixed into bioceramic slurry form
under a suitable viscosity. The HA particles are embedded in the
solidified silica matrix to form green parts after processing by
selective laser sintering (SLS) over a wide range of process parameters. The
results indicate that the proposed process is able to fabricate
multilayer and hollow shell structures that are brittle but have
sufficient integrity for handling prior to post-processing. The
fabricated bone scaffold models had a surface finish of 25
Abstract: The aim of this work is to present a multi-objective optimization method to find maximum-efficiency kinematics for a flapping wing unmanned aerial vehicle. We restricted our study to rectangular wings with the same profile along the span and to harmonic dihedral motion. It is assumed that the birdlike aerial vehicle (whose span and surface area were fixed to 1 m and 0.15 m², respectively) is in horizontal, mechanically balanced motion at fixed speed. We used two flight physics models to describe the vehicle's aerodynamic performance, namely DeLaurier's model, which has been used in many studies dealing with flapping wings, and the model proposed by Dae-Kwan et al. Then, a constrained multi-objective optimization of the propulsive efficiency is performed using a recent evolutionary multi-objective algorithm called ε-MOEA. Firstly, we show that feasible solutions (i.e. solutions that fulfil the imposed constraints) can be obtained using Dae-Kwan et al.'s model. Secondly, we highlight that a single-objective optimization approach (the weighted sum method, for example) can also give optimal solutions as good as the multi-objective one, which nevertheless offers the advantage of directly generating the set of the best trade-offs. Finally, we show that DeLaurier's model does not yield feasible solutions.
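The "set of the best trade-offs" mentioned above is the Pareto-nondominated set. A minimal sketch of extracting it from a list of evaluated candidate kinematics follows; the objective values are invented for illustration, and ε-MOEA's archive maintenance and ε-dominance rules are not reproduced.

```python
def pareto_front(points):
    """Return the nondominated subset when all objectives are maximized.

    points: list of (name, (f1, f2, ...)) candidate evaluations.
    A point is dominated if another point is >= in every objective
    and strictly > in at least one.
    """
    def dominates(a, b):
        return (all(x >= y for x, y in zip(a, b))
                and any(x > y for x, y in zip(a, b)))
    return [(name, f) for name, f in points
            if not any(dominates(g, f) for _, g in points if g != f)]
```

A weighted-sum single-objective run returns one point of this set per weight vector, whereas a multi-objective run yields the whole front at once, which is the advantage the abstract highlights.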
Abstract: This work concerns an experimental investigation
of surfactant flooding in fractured porous media. In this study, a series
of water and surfactant injection processes were performed on
micromodels initially saturated with a heavy crude oil. Eight
fractured glass micromodels were used to illustrate the effects of
surfactant type and concentration on oil recovery efficiency in the
presence of fractures with different properties, i.e., fracture
orientation, length, and number. Two different surfactants at
different concentrations were tested. The results showed that
surfactant flooding is more efficient when an aqueous SDS surfactant
solution is used and when the injection well is located in a proper
position with respect to the fracture properties. This study
demonstrates different physical and chemical conditions that affect
the efficiency of this method of enhanced oil recovery.
Abstract: This paper proposes classification models to be used as a
proxy for the hard disk drive (HDD) functional test equivalent, which
requires more than two weeks to classify HDD status as either "Pass"
or "Fail". These models were constructed using a committee network
consisting of a number of single neural networks. This paper also
includes a method, called the "enforce learning method", for solving
the problem of data sparseness in the failed parts. Our results reveal
that the classification models constructed with the proposed method
perform well under sparse data conditions, and thus the models, which
take only a few seconds for HDD classification, can be used to
substitute for the HDD functional tests.
Abstract: Video streaming over lossy IP networks is a very
important issue, due to the heterogeneous structure of such networks.
The infrastructure of the Internet exhibits variable bandwidths, delays,
congestion, and time-varying packet losses. Because of these variable
attributes of the Internet, video streaming applications should not
only have good end-to-end transport performance but also robust
rate control and, furthermore, a multipath rate allocation mechanism.
Therefore, to provide video streaming service quality, components
such as bandwidth estimation and an adaptive rate controller should
be taken into consideration. This paper gives an overview of the
video streaming concept and of bandwidth estimation tools,
and then introduces special architectures for bandwidth adaptive
video streaming. A bandwidth estimation algorithm – pathChirp,
Optimized Rate Controllers and Multipath Rate Allocation Algorithm
are considered as all-in-one solution for video streaming problem.
This solution is directed and optimized by a decision center which is
designed for obtaining the maximum quality at the receiving side.
Abstract: Motion capture devices have been utilized in
producing various content, such as movies and video games. However,
since motion capture devices are expensive and inconvenient to use,
motions segmented from captured data are recycled and synthesized
for use in other content, but such motions have generally been
segmented manually by content producers. Therefore, automatic
motion segmentation has recently been getting a lot of attention. Previous
approaches are divided into on-line and off-line, where on-line
approaches segment motions based on similarities between
neighboring frames and off-line approaches segment motions by
capturing the global characteristics in feature space. In this paper, we
propose a graph-based high-level motion segmentation method. Since
high-level motions consist of several repeated frames within temporal
distances, we consider all similarities among all frames within the
temporal distance. This is achieved by constructing a graph, where
each vertex represents a frame and the edges between the frames are
weighted by their similarity. Then, the normalized cuts algorithm is used
to partition the constructed graph into several sub-graphs by globally
finding minimum cuts. In the experiments, the proposed method showed
better performance than a PCA-based method in the on-line setting and
a GMM-based method in the off-line setting, as the proposed method
globally segments motions from a graph constructed from similarities
between neighboring frames as well as similarities among all frames
within the temporal distance.
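The graph construction above can be sketched directly. For brevity, this illustration replaces the eigenvector (normalized cuts) relaxation with an exhaustive sweep of the Ncut criterion over temporal split points, which is tractable here because candidate segment boundaries are ordered in time; the feature vectors and similarity scale are assumptions.

```python
import math

def similarity_graph(frames, sigma=1.0, window=None):
    """Weighted graph over frames: w[i][j] = exp(-||fi - fj||^2 / sigma^2),
    restricted to |i - j| <= window (the temporal distance) when given.
    """
    n = len(frames)
    w = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if window is None or abs(i - j) <= window:
                d2 = sum((a - b) ** 2 for a, b in zip(frames[i], frames[j]))
                w[i][j] = math.exp(-d2 / sigma ** 2)
    return w

def best_temporal_cut(w):
    """Minimize Ncut(A,B) = cut/assoc(A,V) + cut/assoc(B,V) over split points t,
    where A = frames[:t] and B = frames[t:]."""
    n = len(w)
    best, best_t = float("inf"), None
    for t in range(1, n):
        cut = sum(w[i][j] for i in range(t) for j in range(t, n))
        assoc_a = sum(w[i][j] for i in range(t) for j in range(n))
        assoc_b = sum(w[i][j] for i in range(t, n) for j in range(n))
        ncut = cut / assoc_a + cut / assoc_b
        if ncut < best:
            best, best_t = ncut, t
    return best_t, best
```

Normalizing the cut by each side's total association is what keeps the criterion from favoring tiny segments, which is the property the abstract relies on for balanced motion segments.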
Abstract: Evapotranspiration (ET) is a major component of the hydrologic cycle, and its accurate estimation is essential for hydrological studies. In the past, various estimation methods have been developed for different climatological data, and the accuracy of these methods varies with climatic conditions. Reference crop evapotranspiration (ET0) is a key variable in procedures established for estimating the evapotranspiration rates of agricultural crops. Values of ET0 are used with crop coefficients for many aspects of irrigation and water resources planning and management. Numerous methods are used for estimating ET0. As per the internationally accepted procedures outlined in the United Nations Food and Agriculture Organization's Irrigation and Drainage Paper No. 56 (FAO-56), use of the Penman-Monteith equation is recommended for computing ET0 from ground-based climatological observations. In the present study, seven methods have been selected for performance evaluation. User-friendly software has been developed in the Visual Basic programming language, which makes it possible to create a graphical environment with little coding. For the given data availability, the developed software estimates reference evapotranspiration for any given area and period for which data are available. The accuracy of the software has been checked against the examples given in FAO-56. The developed software is a user-friendly tool for estimating ET0 under different data availability and climatic conditions.
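The FAO-56 Penman-Monteith computation at the heart of such software can be sketched as below. This simplified daily form uses mean temperature for the saturation vapour pressure term (FAO-56 averages es over Tmax and Tmin) and RHmean for the actual vapour pressure, so it is an approximation of the full procedure rather than the paper's implementation.

```python
import math

def et0_penman_monteith(t_mean, rn, g, u2, rh_mean, altitude):
    """Daily reference evapotranspiration (mm/day), FAO-56 Penman-Monteith.

    t_mean   mean air temperature (deg C)
    rn, g    net radiation and soil heat flux (MJ m-2 day-1)
    u2       wind speed at 2 m height (m/s)
    rh_mean  mean relative humidity (%)
    altitude station elevation (m), used for atmospheric pressure
    """
    # Saturation and actual vapour pressure (kPa).
    es = 0.6108 * math.exp(17.27 * t_mean / (t_mean + 237.3))
    ea = es * rh_mean / 100.0
    # Slope of the saturation vapour pressure curve (kPa/deg C).
    delta = 4098.0 * es / (t_mean + 237.3) ** 2
    # Atmospheric pressure (kPa) and psychrometric constant (kPa/deg C).
    p = 101.3 * ((293.0 - 0.0065 * altitude) / 293.0) ** 5.26
    gamma = 0.000665 * p
    num = (0.408 * delta * (rn - g)
           + gamma * 900.0 / (t_mean + 273.0) * u2 * (es - ea))
    den = delta + gamma * (1.0 + 0.34 * u2)
    return num / den
```

For typical mid-latitude summer inputs this returns a few mm/day, and the estimate increases with wind speed through the aerodynamic term, as expected.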
Abstract: In this paper, we present a matrix game-theoretic cross-layer optimization formulation to maximize the network lifetime in wireless ad hoc networks with network coding. To this end, we introduce a cross-layer formulation of general NUM (network utility maximization) that accommodates routing, scheduling, and stream control from different layers in the coded networks. Specifically, for the scheduling problem and the objective function involved, we develop a matrix game with the strategy sets of the players corresponding to hyperlinks and transmission modes, and design the payoffs specific to the lifetime. In particular, with the inherent merit that a matrix game can be solved with linear programming, our cross-layer programming formulation can benefit from both game-based and NUM-based approaches at the same time by combining the programming model for the matrix game with those for the other layers in a consistent framework. Finally, our numerical example demonstrates its performance results on a well-known wireless butterfly network to verify the cross-layer optimization scheme.
Abstract: A high-performance Monte Carlo simulation, which
simultaneously takes diffusion-controlled and chain-length-dependent
bimolecular termination reactions into account, is developed to
simulate atom transfer radical copolymerization of styrene and
n-butyl acrylate. As expected, increasing the initial feed fraction of styrene
raises the fraction of styrene-styrene dyads (fAA) and reduces that of
n-butyl acrylate dyads (fBB). The randomness parameter (fAB) also
varies significantly during the copolymerization.
Also, there is a drift in copolymer heterogeneity and the highest drift
occurs in the initial feeds containing lower percentages of styrene, i.e.
20% and 5%.
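The dyad bookkeeping can be illustrated with a much-simplified terminal-model Monte Carlo (constant feed composition, no diffusion control or chain-length-dependent termination, unlike the paper's simulation). The reactivity ratios below are assumed illustrative values, not the paper's parameters.

```python
import random

def dyad_fractions(fa0, r_a, r_b, n_units, seed=42):
    """Grow one long chain under the terminal model and count dyads.

    fa0       mole fraction of monomer A in the feed (held constant here)
    r_a, r_b  reactivity ratios
    Returns (fAA, fAB, fBB), the dyad fractions along the chain.
    """
    rng = random.Random(seed)
    fb0 = 1.0 - fa0
    # Terminal-model propagation probabilities.
    p_aa = r_a * fa0 / (r_a * fa0 + fb0)   # chain ending in A adds A
    p_bb = r_b * fb0 / (r_b * fb0 + fa0)   # chain ending in B adds B
    last = 'A' if rng.random() < fa0 else 'B'
    counts = {'AA': 0, 'AB': 0, 'BB': 0}
    for _ in range(n_units):
        if last == 'A':
            nxt = 'A' if rng.random() < p_aa else 'B'
        else:
            nxt = 'B' if rng.random() < p_bb else 'A'
        counts[''.join(sorted(last + nxt))] += 1   # AB and BA pooled together
        last = nxt
    total = sum(counts.values())
    return counts['AA'] / total, counts['AB'] / total, counts['BB'] / total
```

Enriching the feed in A raises fAA and depresses fBB, which is the qualitative trend the abstract reports for styrene-rich feeds.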
Abstract: A technique is proposed for the automatic detection
of spikes in electroencephalograms (EEG). A multi-resolution
approach and a non-linear energy operator are exploited. The
signal on each EEG channel is decomposed into three sub-bands
using a non-decimated wavelet transform (WT). The WT is a
powerful tool for multi-resolution analysis of non-stationary signals
as well as for signal compression, recognition, and restoration.
Each sub-band is analyzed using a non-linear energy operator,
in order to detect spikes. A decision rule detects the presence of
spikes in the EEG, relying upon the energy of the three sub-bands.
The effectiveness of the proposed technique was confirmed by
analyzing both test signals and EEG layouts.
Abstract: In this paper, we study statistical multiplexing of VBR
video in ATM networks. ATM promises to provide high-speed, real-time
multipoint-to-central video transmission for telemedicine
applications in rural hospitals and in emergency medical services.
Video coders are known to produce variable bit rate (VBR) signals
and the effects of aggregating these VBR signals need to be
determined in order to design a telemedicine network infrastructure
capable of carrying these signals. We first model the VBR video
signal and simulate it using a generic continuous-data autoregressive
(AR) scheme. We carry out the queueing analysis by the Fluid
Approximation Model (FAM) and the Markov Modulated Poisson
Process (MMPP). The study has shown a trade-off: multiplexing
VBR signals reduces burstiness and improves resource utilization;
however, the buffer size needs to be increased, with an associated
economic cost. We also show that the MMPP model and the Fluid
Approximation model fit best, respectively, the cell region and the
burst region. Therefore, a hybrid MMPP and FAM completely
characterizes the overall performance of the ATM statistical
multiplexer. The ramifications of this technology are clear: speed,
reliability (lower loss rate and jitter), and increased capacity in video
transmission for telemedicine. With migration to full IP-based
networks still a long way from achieving both high speed and high
quality of service, the proposed ATM architecture will remain of
significant use for telemedicine.
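The AR modelling step can be sketched with the classic first-order autoregressive VBR source of Maglaris et al. The parameter values below are commonly quoted for that model but should be treated as assumptions here, and the generated rates are in normalized units rather than bits per second.

```python
import random

def ar1_vbr_source(n_frames, a=0.8781, b=0.1108, w_mean=0.572, seed=7):
    """First-order autoregressive bit-rate model:
    lam[n] = a * lam[n-1] + b * w[n],  with w[n] ~ N(w_mean, 1),
    clipped at zero since a bit rate cannot be negative.
    The (unclipped) steady-state mean is b * w_mean / (1 - a).
    """
    rng = random.Random(seed)
    lam = b * w_mean / (1.0 - a)   # start the process at its steady-state mean
    rates = []
    for _ in range(n_frames):
        lam = max(0.0, a * lam + b * rng.gauss(w_mean, 1.0))
        rates.append(lam)
    return rates
```

The strong lag-one correlation (roughly the coefficient a) is what makes aggregated VBR sources bursty, which is exactly the behaviour the queueing analysis with FAM and MMPP has to capture.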
Abstract: The power line channel has been proposed as an alternative for broadband data transmission, especially in developing countries like Tanzania [1]. However, the channel is affected by stochastic attenuation and deep notches, which can limit the channel capacity and achievable data rate. Various studies have characterized the channel without giving exactly the maximum performance and the limitation in data transfer rate, possibly due to the complexity of the channel models used. In this paper, the performance of medium-voltage, low-voltage, and indoor power line channels is presented. In the investigations, orthogonal frequency division multiplexing (OFDM) with phase shift keying (PSK) as the carrier modulation scheme is considered for indoor, medium-voltage, and low-voltage channels with a typical ten branches, and Golay coding is also applied to the medium-voltage channel. Deep notches are observed at various frequencies in the channels' frequency responses, which can reduce the achievable data rate. However, it is observed that data rates up to 240 Mbps are realized at a signal-to-noise ratio of about 50 dB for the indoor and low-voltage channels, whereas for the medium voltage a typical link with ten branches is affected by strong multipath, and coding is required for feasible broadband data transfer.
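The link between notches, SNR, and achievable rate can be illustrated with a per-subcarrier Shannon-capacity estimate. The band, subcarrier count, and notch placement below are assumptions chosen for illustration, not measurements from the paper.

```python
import math

def ofdm_capacity(snr_db_per_sub, sub_bw_hz):
    """Shannon-limit rate (bit/s) summed over OFDM subcarriers:
    sum over subcarriers of B * log2(1 + SNR)."""
    return sum(sub_bw_hz * math.log2(1.0 + 10.0 ** (s / 10.0))
               for s in snr_db_per_sub)

# 1024 subcarriers spanning an assumed 30 MHz band at a flat 50 dB SNR.
subs = [50.0] * 1024
flat = ofdm_capacity(subs, 30e6 / 1024)

# Model deep notches by dropping 100 subcarriers to 0 dB SNR.
for k in range(200, 300):
    subs[k] = 0.0
notched = ofdm_capacity(subs, 30e6 / 1024)
```

At 50 dB SNR each subcarrier carries about 16.6 bit/s/Hz at the Shannon limit, so notches that wipe out a tenth of the band cost tens of Mbps, which is why the abstract resorts to coding on the heavily notched medium-voltage link.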