Abstract: The characteristics of fluid flow and phase separation
in an oil-water separator were numerically analysed as part of the
work presented herein. Simulations were performed for different
velocities and droplet diameters, and the way these parameters can
influence the separator geometry was studied.
The simulations were carried out using the software package
Fluent 6.2, which is designed for numerical simulation of fluid flow
and mass transfer. The model consisted of a cylindrical horizontal
separator. A tetrahedral mesh was employed in the computational
domain. The condition of two-phase flow was simulated with the
two-fluid model, taking into consideration turbulence effects using
the k-ε model.
The results showed a strong dependence of phase separation on
mixture velocity and droplet diameter. An increase in mixture
velocity slows phase separation and consequently requires a weir of
greater height. An increase in droplet diameter produces better phase
separation.
The simulations are in agreement with results reported in the
literature and show that CFD can be a useful tool for studying a
horizontal oil-water separator.
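The reported effect of droplet diameter is consistent with Stokes' law for a buoyant droplet, under which the terminal rise velocity scales with the square of the diameter. A minimal sketch (the fluid properties below are illustrative, not values from the paper):

```python
def stokes_rise_velocity(d, rho_water=998.0, rho_oil=850.0, mu=1.0e-3, g=9.81):
    """Terminal rise velocity (m/s) of an oil droplet in water, Stokes regime."""
    return g * d**2 * (rho_water - rho_oil) / (18.0 * mu)

# Doubling the droplet diameter quadruples the rise velocity,
# which is why larger droplets separate faster:
v_small = stokes_rise_velocity(100e-6)   # 100 um droplet
v_large = stokes_rise_velocity(200e-6)   # 200 um droplet
print(round(v_large / v_small, 6))  # → 4.0
```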
Abstract: In this paper we describe a computer-aided diagnosis (CAD) system for automated detection of pulmonary nodules in computed-tomography (CT) images. After extracting the pulmonary parenchyma using a combination of image processing techniques, a region growing method is applied to detect nodules based on 3D geometric features. We applied the CAD system to CT scans collected in a screening program for lung cancer detection. Each scan consists of a sequence of about 300 slices stored in DICOM (Digital Imaging and Communications in Medicine) format. All malignant nodules were detected and a low false-positive detection rate was achieved.
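The region-growing step can be illustrated with a minimal 2D sketch. The paper works on 3D CT volumes and uses geometric features not reproduced here; the binary mask, seed point, and 4-connectivity below are assumptions for illustration (the 3D case simply adds two more neighbours per voxel):

```python
from collections import deque

def region_grow(mask, seed):
    """Collect the 4-connected region of nonzero pixels containing `seed`."""
    h, w = len(mask), len(mask[0])
    region, queue = set(), deque([seed])
    while queue:
        y, x = queue.popleft()
        if (y, x) in region or not (0 <= y < h and 0 <= x < w) or not mask[y][x]:
            continue
        region.add((y, x))
        queue.extend([(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)])
    return region

mask = [[0, 1, 1],
        [0, 1, 0],
        [1, 0, 0]]
# The seed's region has 3 pixels; the isolated pixel at (2, 0) is excluded.
print(len(region_grow(mask, (0, 1))))  # → 3
```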
Abstract: Sickness absence represents a major economic and
social issue. Analysis of sick leave data is a recurrent challenge for analysts because of the complexity of the data structure, which is
often time-dependent, highly skewed and clumped at zero. Ignoring these features when making statistical inference is likely to be inefficient
and misguided. Traditional approaches do not address these problems. In this study, we discuss model methodologies in terms of statistical techniques for addressing the difficulties of sick leave data. We also introduce and demonstrate a new method by performing a longitudinal assessment of long-term absenteeism using
a large registration dataset, available from the Helsinki Health Study of municipal employees in Finland during 1990-1999, as a working example. We present a comparative study on model
selection and a critical analysis of the temporal trends, the occurrence
and degree of long-term sickness absence among municipal employees. The strengths of this working example include the large
sample size over a long follow-up period, providing strong evidence in support of the new model. Our main goal is to propose a way to
select an appropriate model, to introduce a new methodology for analysing sickness absence data, and to demonstrate model
applicability to complicated longitudinal data.
Abstract: A novel thermo-sensitive superabsorbent hydrogel
with salt- and pH-responsive properties was obtained by grafting
of mixtures of acrylic acid (AA) and N-isopropylacrylamide
(NIPAM) monomers onto kappa-carrageenan, kC, using ammonium
persulfate (APS) as a free radical initiator in the presence of
methylene bisacrylamide (MBA) as a crosslinker. Infrared
spectroscopy was carried out to confirm the chemical structure of the
hydrogel. Moreover, morphology of the samples was examined by
scanning electron microscopy (SEM). The effect of MBA
concentration and AA/NIPAM weight ratio on the water absorbency
capacity has been investigated. The swelling variations of hydrogels
were explained according to swelling theory based on the hydrogel
chemical structure. The hydrogels exhibited salt-sensitivity and
cation exchange properties. The temperature- and pH-reversibility
of the hydrogels makes these intelligent polymers good candidates
as potential carriers for bioactive agents, e.g. drugs.
Abstract: In Peer-to-Peer (P2P) service networks, where peers offer any kind of publicly available service or application, intuitive navigation through all services in the network becomes more difficult as the number of services increases. In this article, a concept is discussed that enables users to intuitively browse and use large-scale P2P service networks. The concept extends the idea of creating virtual 3D environments solely based on Peer-to-Peer technologies. Aside from browsing, users shall have the possibility to emphasize services of interest using their own semantic criteria. The appearance of the virtual world shall intuitively reflect network properties that may be of interest to the user. Additionally, the concept comprises options for load and traffic balancing. In this article, the requirements concerning the underlying infrastructure and the graphical user interface are defined. First impressions of the appearance of future systems are presented, and the next steps towards a prototype implementation are discussed.
Abstract: Pretreatment of lignocellulosic biomass materials from
poplar, acacia, oak, and fir with different ionic liquids (ILs)
containing 1-alkyl-3-methyl-imidazolium cations and various anions
has been carried out. The dissolved cellulose from biomass was
precipitated by adding anti-solvents into the solution and vigorous
stirring. Commercial cellulases Celluclast 1.5L and Accelerase 1000
have been used for hydrolysis of untreated and pretreated
lignocellulosic biomass. Among the tested ILs, [Emim]COOCH3
showed the best efficiency, resulting in the highest amount of
liberated reducing sugars. Combined glycerol-ionic liquid
pretreatment and combined dilute acid-ionic liquid pretreatment of
lignocellulosic biomass were also evaluated and compared with
glycerol pretreatment, ionic liquid pretreatment and dilute acid
pretreatment alone.
Abstract: In this paper, we apply and compare two generalized estimating equation approaches to the analysis of car breakdown data in Mauritius. The number of breakdowns experienced by a machine is a highly under-dispersed count random variable, and its value can be attributed to factors related to the mechanical input and output of that machine. Analyzing such under-dispersed count observations as a function of explanatory factors has been a challenging problem. In this paper, we aim to estimate the effects of various factors on the number of breakdowns experienced by a passenger car, based on a study performed in Mauritius over a year. We note that the number of passenger car breakdowns is highly under-dispersed. These data are therefore modelled and analyzed using the Com-Poisson regression model. We use two types of quasi-likelihood estimation approaches to estimate the parameters of the model: the marginal and joint generalized quasi-likelihood estimating equation approaches. The under-dispersion parameter is estimated to be around 2.14, justifying the appropriateness of the Com-Poisson distribution for modelling the under-dispersed count responses recorded in this study.
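The Com-Poisson distribution mentioned above has pmf P(X = x) proportional to λ^x/(x!)^ν, where a dispersion parameter ν > 1 yields under-dispersion (variance below the mean). A minimal sketch, with λ chosen arbitrarily, ν set near the estimated 2.14, and the truncation bound an implementation choice:

```python
from math import factorial

def com_poisson_pmf(lam, nu, max_x=50):
    """Truncated pmf of the Com-Poisson(lam, nu) distribution:
    weights lam**x / (x!)**nu, normalised by their sum."""
    weights = [lam**x / factorial(x)**nu for x in range(max_x + 1)]
    z = sum(weights)
    return [w / z for w in weights]

def mean_var(pmf):
    mean = sum(x * p for x, p in enumerate(pmf))
    var = sum((x - mean)**2 * p for x, p in enumerate(pmf))
    return mean, var

pmf = com_poisson_pmf(lam=3.0, nu=2.14)   # nu > 1 => under-dispersion
m, v = mean_var(pmf)
print(v < m)  # → True
```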
Abstract: Nowadays it is a trend for electronic circuit designers to
integrate all system components on a single chip. This paper proposes
the design of a single-chip proportional-to-absolute-temperature
(PTAT) sensor, including a voltage reference circuit, using CEDEC
0.18 µm CMOS technology. It is a challenge to design a single-chip,
wide-range, linear-response temperature sensor for many applications.
The channel widths between the compensation transistor and the
reference transistor are critical to design the PTAT temperature sensor
circuit. The designed temperature sensor shows excellent linearity
between -100 °C and 200 °C, and its sensitivity is about 0.05 mV/°C.
The chip is designed to operate from a single 1.6 V voltage source.
Abstract: The traditional Failure Mode and Effects Analysis
(FMEA) uses Risk Priority Number (RPN) to evaluate the risk level
of a component or process. The RPN index is determined by
calculating the product of severity, occurrence and detection indexes.
The most critically debated disadvantage of this approach is that
various sets of these three indexes may produce an identical value of
RPN. This research paper seeks to address the drawbacks in
traditional FMEA and to propose a new approach to overcome these
shortcomings. The Risk Priority Code (RPC) is used to prioritize
failure modes, when two or more failure modes have the same RPN.
A new method is proposed to prioritize failure modes, when there is a
disagreement in ranking scale for severity, occurrence and detection.
An Analysis of Variance (ANOVA) is used to compare means of
RPN values. SPSS (Statistical Package for the Social Sciences)
statistical analysis package is used to analyze the data. The results
presented are based on two case studies. It is found that the proposed
methodology resolves the limitations of the traditional FMEA
approach.
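The debated drawback, namely that distinct severity/occurrence/detection index sets collapse to the same RPN, is easy to demonstrate (the conventional 1-10 rating scales are assumed here):

```python
from itertools import product

def rpn(severity, occurrence, detection):
    """Traditional FMEA Risk Priority Number: the product of the three indexes."""
    return severity * occurrence * detection

# Many distinct index triples produce the identical value RPN = 60 --
# the ambiguity that a secondary prioritization such as RPC must resolve:
same_rpn = [t for t in product(range(1, 11), repeat=3) if rpn(*t) == 60]
print((1, 6, 10) in same_rpn, (5, 4, 3) in same_rpn, len(same_rpn) > 10)
# → True True True
```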
Abstract: Resins are used in nuclear power plants for water
ultrapurification. Two approaches are considered in this work:
column experiments and simulations. A software called OPTIPUR
was developed, tested and used. The approach simulates
one-dimensional reactive transport in a porous medium, with
convective-dispersive transport between particles and diffusive
transport within the boundary layer around the particles. The transfer
limitation in the boundary layer is characterized by the mass transfer
coefficient (MTC). The influences on the MTC were measured
experimentally. Variation of the inlet concentration does not
influence the MTC, in contrast to the Darcy velocity, which does.
This is consistent with results obtained using the correlation of
Dwivedi & Upadhyay. With the MTC, and knowing the number of
exchange sites and the relative affinity, OPTIPUR can simulate the
column outlet concentration versus time. The duration of use of the
resins can then be predicted under conditions of binary exchange.
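For reference, a commonly cited form of the Dwivedi & Upadhyay packed-bed correlation relates the MTC to the Reynolds and Schmidt numbers. The sketch below uses that form with illustrative bed parameters (not values from the paper):

```python
def mtc_dwivedi_upadhyay(v, d_p, rho, mu, D, eps):
    """Mass transfer coefficient k (m/s) in a packed bed, from a commonly
    cited form of the Dwivedi & Upadhyay correlation:
        eps * j_D = 0.765/Re**0.82 + 0.365/Re**0.386
    with Re = rho*v*d_p/mu, Sc = mu/(rho*D), and j_D = (k/v)*Sc**(2/3)."""
    Re = rho * v * d_p / mu
    Sc = mu / (rho * D)
    j_d = (0.765 / Re**0.82 + 0.365 / Re**0.386) / eps
    return j_d * v / Sc**(2.0 / 3.0)

# Illustrative values: water flowing through a resin bed at two Darcy velocities.
k1 = mtc_dwivedi_upadhyay(v=1e-3, d_p=6e-4, rho=1000.0, mu=1e-3, D=1e-9, eps=0.4)
k2 = mtc_dwivedi_upadhyay(v=2e-3, d_p=6e-4, rho=1000.0, mu=1e-3, D=1e-9, eps=0.4)
print(k2 > k1)  # → True: the MTC increases with Darcy velocity, as observed
```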
Abstract: The study of the geometric shape of the plunging wave enclosed vortices as a possible indicator for the breaking intensity of ocean waves has been ongoing for almost 50 years with limited success. This paper investigates the validity of using the vortex ratio and vortex angle as methods of predicting breaking intensity. Previously published works on vortex parameters, based on regular wave flume results or solitary wave theory, present contradictory results and conclusions. Through the first complete analysis of field collected irregular wave breaking vortex parameters it is illustrated that the vortex ratio and vortex angle cannot be accurately predicted using standard breaking wave characteristics and hence are not suggested as a possible indicator for breaking intensity.
Abstract: Traditionally, terror groups have been formed by ideologically aligned actors who perceive a lack of options for achieving political or social change. However, terrorist attacks have increasingly been carried out by small groups of actors or lone individuals who may be only ideologically affiliated with larger, formal terrorist organizations. The formation of these groups represents the inverse of traditional organizational growth, whereby structural de-evolution within issue-based organizations leads to the formation of small, independent terror cells. Ideological franchising, the bypassing of formal affiliation with the "parent" organization, represents the de-evolution of traditional concepts of organizational structure in favor of an organic, independent, and focused unit. Traditional definitions of issue-based dark networks include focus on an identified goal, commitment to achieving this goal through unrestrained actions, and selection of symbolic targets. The next step in the de-evolution of small dark networks is the mini-organization, consisting of only a handful of actors working toward a common, violent goal. Information-sharing through social media platforms, coupled with the civil liberties of democratic nations, provides the communication systems, access to information, and freedom of movement necessary for small dark networks to flourish without the aid of a parent organization. As attacks such as the 7/7 bombings demonstrate the effectiveness of small dark networks, terrorist actors will feel increasingly comfortable aligning with an ideology only, without formally organizing. The natural result of this de-evolving organization is the single-actor event, where an individual seems to subscribe to a larger organization's violent ideology with little or no formal ties.
Abstract: The recycling of concrete, bricks and masonry rubble
as concrete aggregates is an important way to contribute to a
sustainable material flow. However, there are still various
uncertainties limiting the widespread use of Recycled Concrete
Aggregates (RCA). The fluctuations in the composition of grade
recycled aggregates and their influence on the properties of fresh and
hardened concrete are of particular concern regarding the use of
RCA. Most problems occurring when recycled concrete aggregates
are used arise from their higher porosity, and hence higher water
absorption, lower mechanical strength, and residual impurities on
the surface of the RCA that form a weaker bond between cement
paste and aggregate. The reuse of RCA is therefore still limited. An
efficient polymer-based treatment is proposed to make RCA easier
to reuse. Silicon-based polymer treatments of the RCA were carried
out and compared. This kind of treatment can improve the properties
of RCA; for example, the rate of water absorption of treated RCA is
significantly reduced.
Abstract: The major problem that wireless communication
systems undergo is multipath fading caused by scattering of the
transmitted signal. However, we can treat multipath propagation as
multiple channels between the transmitter and receiver to improve
the signal-to-scattering-noise ratio. While using Single Input
Multiple Output (SIMO) systems, the diversity receivers extract
multiple signal branches or copies of the same signal received from
different channels and apply gain combining schemes such as Root
Mean Square Gain Combining (RMSGC). RMSGC asymptotically
yields an identical performance to that of the theoretically optimal
Maximum Ratio Combining (MRC) for values of mean Signal-to-
Noise-Ratio (SNR) above a certain threshold value without the need
for SNR estimation. This paper improves RMSGC in two respects:
we found that post-detection combining and de-noising of the
received signals improve the performance of RMSGC and lower the
threshold SNR.
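The abstract does not spell out the RMSGC combining rule itself, so the sketch below shows only the Maximum Ratio Combining baseline it is compared against, in which each branch is weighted by its conjugate channel gain so that branch SNRs add (the channel values are illustrative):

```python
# Maximum Ratio Combining over diversity branches y_i = h_i * s + n_i.
def mrc_combine(received, channel):
    """Combine the branch samples via sum(conj(h_i) * y_i)."""
    return sum(h.conjugate() * y for y, h in zip(received, channel))

s = 1 + 0j                      # transmitted symbol
h = [0.8 + 0.2j, 0.3 - 0.5j]    # two diversity branches (assumed gains)
y = [hi * s for hi in h]        # noiseless reception, for illustration only
z = mrc_combine(y, h)
# The combined statistic equals s scaled by the total channel power:
print(abs(z - s * sum(abs(hi)**2 for hi in h)) < 1e-12)  # → True
```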
Abstract: This document details the process of developing a
wireless device that captures the basic movements of the foot (plantar
flexion, dorsal flexion, abduction, adduction) and of the knee
(flexion). It implements a motion capture system using hardware
based on optical fiber sensors, chosen for their advantages in terms
of range, noise immunity and speed of data transmission and
reception. The operating principle used by this system is the detection
and transmission of joint movement by mechanical elements and
their respective measurement by optical ones (in this case infrared).
Likewise, Visual Basic software is used for the reception, analysis
and processing of the signals acquired by the device, generating a
real-time 3D graphical representation of each movement. The result
is a boot that captures the movement, a transmission module
(implementing XBee technology) and a receiver module that receives
the information and sends it to the PC for processing.
The main aim of this device is to contribute to fields such as
bioengineering and medicine by helping to improve quality of life
and movement analysis.
Abstract: Building intelligent traffic guidance systems has recently
become an interesting subject. A good system should be able to
observe all important visual information in order to analyze the
context of the scene. To this end, signs in general, and traffic signs
in particular, are usually taken into account, as they carry rich
information for these systems. Therefore, many researchers have
devoted effort to the field of sign recognition. Sign localization, or
sign detection, is the most important step in the sign recognition
process: it filters out non-informative areas of the scene and locates
candidates for the later steps. In this paper, we apply a new approach
to detecting sign
locations using a new color invariant model. Experiments are carried
out with different datasets introduced in other works, whose authors
noted the difficulty of detecting signs under unfavorable imaging
conditions. Our method is simple and fast, and most importantly it
achieves a high detection rate in locating signs.
Abstract: Project managers are ultimately responsible for the
overall characteristics of a project, i.e. they should deliver the project
on time, with minimum cost and maximum quality. It is vital for
any manager to decide on a trade-off between these conflicting
objectives, and managers would benefit from any scientific decision
support tool. Our work tries to determine a set of optimal solutions
(rather than a single optimal solution) from which the project
manager can select his desired choice to run the project. In this paper, the
problem in project scheduling notated as
(1,T|cpm,disc,mu|curve:quality,time,cost) will be studied. The
problem is multi-objective and the purpose is finding the Pareto
optimal front of time, cost and quality of a project
(curve:quality,time,cost), whose activities belong to a start to finish
activity relationship network (cpm) and they can be done in different
possible modes (mu) which are non-continuous or discrete (disc), and
each mode has a different cost, time and quality. The project is
constrained by a non-renewable resource, i.e. money (1,T). Because
the problem is NP-hard, a meta-heuristic is developed to solve it,
based on a version of the genetic algorithm specially adapted to
multi-objective problems, namely FastPGA. A sample project
with 30 activities is generated and then solved by the proposed
method.
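Finding the Pareto-optimal front amounts to discarding every schedule that is dominated in all three objectives. A minimal dominance filter (the candidate schedules below are hypothetical, and FastPGA itself is not reproduced here):

```python
def pareto_front(solutions):
    """Keep the (time, cost, quality) triples not dominated by any other.
    Lower time and cost are better; higher quality is better."""
    def dominates(a, b):
        better_eq = a[0] <= b[0] and a[1] <= b[1] and a[2] >= b[2]
        strictly = a[0] < b[0] or a[1] < b[1] or a[2] > b[2]
        return better_eq and strictly
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Hypothetical project schedules: (duration, cost, quality).
sols = [(10, 100, 0.90), (12, 80, 0.90), (11, 90, 0.95), (13, 110, 0.85)]
front = pareto_front(sols)
# (13, 110, 0.85) is dominated by (10, 100, 0.90); the other three survive.
print(len(front))  # → 3
```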
Abstract: This paper aims to develop an algorithm for a finite-capacity
material requirement planning (FCMRP) system for a multi-stage
assembly flow shop. The developed FCMRP system has two
main stages. The first stage is to allocate operations to the first and
second priority work centers and also determine the sequence of the
operations on each work center. The second stage is to determine the
optimal start time of each operation by using a linear programming
model. Real data from a factory is used to analyze and evaluate the
effectiveness of the proposed FCMRP system and also to guarantee a
practical solution to the user. There are five performance measures,
namely, the total tardiness, the number of tardy orders, the total
earliness, the number of early orders, and the average flow-time. The
proposed FCMRP system offers an adjustable solution which is a
compromised solution among the conflicting performance measures.
The user can adjust the weight of each performance measure to
obtain the desired performance. The results show that the
combination of FCMRP NP3 and EDD outperforms the other
combinations in terms of the overall performance index. The calculation time for the
proposed FCMRP system is about 10 minutes which is practical for
the planners of the factory.
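The adjustable compromise can be sketched as a weighted sum of normalised measures. The five measures follow the abstract, while the numbers, scaling and weights below are hypothetical:

```python
def performance_index(measures, weights, scale):
    """Weighted sum of performance measures normalised by `scale`
    (all five measures are to be minimised, so a lower index is better)."""
    return sum(w * m / s for m, w, s in zip(measures, weights, scale))

# Hypothetical alternatives: (total tardiness, tardy orders, total earliness,
# early orders, average flow time).
a = (120.0, 4, 300.0, 10, 35.0)
b = (200.0, 6, 150.0, 5, 33.0)
scale = tuple(max(x, y) for x, y in zip(a, b))      # normalise per measure
w_due_date = (0.4, 0.3, 0.1, 0.1, 0.1)              # planner weights tardiness
print(performance_index(a, w_due_date, scale) <
      performance_index(b, w_due_date, scale))  # → True: a preferred under these weights
```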
Abstract: Negation is useful in the majority of real-world applications. However, its introduction leads to semantic and canonical problems. SEPN nets are a well-adapted extension of predicate nets for the definition and manipulation of stratified programs. This formalism is characterized by two main contributions. The first concerns the management of the whole class of stratified programs. The second is related to the optimization of usual operations (maximal stratification, incremental updates, ...). We propose, in this paper, useful algorithms for manipulating stratified programs using SEPN. These algorithms were implemented and validated with the STRPRO tool.
Abstract: This study investigated the performance of hybrid
solvents blended from primary, secondary, or tertiary amines and
piperazine (PZ) for CO2 removal from flue gas, in terms of CO2
absorption capacity and regeneration efficiency at 90 °C.
Alkanolamines used in this work were monoethanolamine (MEA),
diethanolamine (DEA), and triethanolamine (TEA). The CO2
absorption was experimentally examined under atmospheric pressure
and at room temperature. The results show that the MEA blend with
PZ provided the maximum CO2 absorption capacity of 0.50 mol
CO2/mol amine, while TEA provided the minimum capacity of 0.30
mol CO2/mol amine. TEA was easier to regenerate in both the first
and second cycles, with less loss of absorption capacity. The
regeneration efficiency of TEA was 95.09% and 92.89% for the first
and second regeneration cycles, respectively.