Abstract: Positron emission particle tracking (PEPT) is a
technique in which a single radioactive tracer particle can be
accurately tracked as it moves. A limitation of PET is that
reconstructing a tomographic image requires a large volume of data
(millions of events), so it is difficult to study rapidly changing
systems. PEPT, by contrast, needs only a small number of events per
location and is therefore a much faster process than PET.
In PEPT detecting both photons defines a line and the annihilation
is assumed to have occurred somewhere along this line. The location
of the tracer can be determined to within a few mm from coincident
detection of a small number of pairs of back-to-back gamma rays and
using triangulation. This can be achieved many times per second and
the track of a moving particle can be reliably followed. This
technique was invented at the University of Birmingham [1].
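The triangulation step can be sketched as a small least-squares problem: each coincident photon pair defines a line of response, and the tracer location is estimated as the point minimizing the summed squared perpendicular distance to those lines. The following is a minimal illustration with hypothetical helper names, not the Birmingham implementation (which also iteratively discards corrupt events):

```python
# Minimal sketch: estimate the point closest (in least squares) to a set of
# 3-D lines of response, each given by a point a on the line and a unit
# direction d. Minimizing sum_i |(I - d_i d_i^T)(p - a_i)|^2 leads to the
# linear system  [sum_i (I - d_i d_i^T)] p = sum_i (I - d_i d_i^T) a_i.

def closest_point_to_lines(lines):
    """lines: list of (a, d) pairs; a is a 3-vector on the line, d a unit direction."""
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for a, d in lines:
        for i in range(3):
            for j in range(3):
                m = (1.0 if i == j else 0.0) - d[i] * d[j]  # (I - d d^T)_ij
                A[i][j] += m
                b[i] += m * a[j]
    return solve3(A, b)

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0] * 3
    for r in range(2, -1, -1):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x
```

With three or more well-separated lines the system is well conditioned, and a handful of detected pairs suffices to localize the tracer.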
The aim in PEPT is not to form an image of the tracer particle
but simply to determine its location over time. If the tracer is
followed for a long enough period within a closed, circulating
system, it explores all possible types of motion.
The application of PEPT to industrial process systems carried out
at the University of Birmingham falls into two areas: the behaviour
of granular materials and of viscous fluids. Granular materials are
processed in industry, for example in the manufacture of
pharmaceuticals, ceramics, food and polymers, and PEPT has been
used in a number of ways to study the behaviour of these systems
[2]. PEPT offers the possibility of tracking a single particle within
the bed [3]. PEPT has also been used to study fluid flow, for
example viscous fluids in mixers [4], using a neutrally buoyant
tracer particle [5].
Abstract: Automatic detection of bleeding is of practical
importance since capsule endoscopy produces an extremely large
number of images. Algorithm development of bleeding detection in
the digestive tract is difficult due to different contrasts among the
images, food dregs, secretion and other artefacts. In this study,
weighting factors were assigned, derived from independent features
of contrast and brightness that distinguish bleeding from normal
tissue. Spectral analysis based on these weighting factors was fast
and accurate. The results
were a sensitivity of 87% and a specificity of 90% when the accuracy
was determined for each pixel out of 42 endoscope images.
Abstract: The problem of Small Area Estimation (SAE) is complex because of the variety of information sources and insufficient data. In this paper, an approach to SAE is presented for decision-making at national, regional and local levels. We propose the Empirical Best Linear Unbiased Predictor (EBLUP) as an estimator in order to combine several information sources to evaluate various indicators. First, we present the Urban Audit project and its environmental, social and economic indicators. Secondly, we propose an approach for decision-making in order to estimate indicators. An application is used to validate the theoretical proposal. Finally, a decision support system is presented, based on an open-source environment.
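One common area-level specification of the EBLUP, shown here purely for illustration (the Fay-Herriot model; the paper's exact model specification may differ), combines the direct survey estimate for each area with a regression on auxiliary covariates:

```latex
\hat{\theta}_i^{\mathrm{EBLUP}}
  = \hat{\gamma}_i \, y_i + (1 - \hat{\gamma}_i)\, x_i^{\top}\hat{\beta},
\qquad
\hat{\gamma}_i = \frac{\hat{\sigma}_v^2}{\hat{\sigma}_v^2 + \psi_i},
```

where y_i is the direct estimate for area i, x_i its auxiliary covariates, psi_i the known sampling variance and sigma_v^2 the estimated between-area variance; areas with noisy direct estimates are shrunk toward the regression prediction.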
Abstract: An electronic portal imaging device (EPID) has become
a method of patient-specific IMRT dose verification for radiotherapy.
Research studies have focused on pre and post-treatment verification,
however, there are currently no interventional procedures using EPID
dosimetry that measure the dose in real time as a mechanism to
ensure that overdoses do not occur and underdoses are detected as
soon as is practically possible. As a result, an EPID-based real time
dose verification system for dynamic IMRT was developed and was
implemented with MATLAB/Simulink. The EPID image acquisition
was set to continuous acquisition mode at 1.4 images per second.
The system defined a time constraint, or execution gap, equal to the
image acquisition interval, so that every calculation must be
completed before the next image is captured. In addition, the
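The timing constraint described above can be sketched as a simple per-frame deadline monitor. This is an illustration only (the function names and the dose-computation callback are hypothetical; the actual system was implemented in MATLAB/Simulink):

```python
import time

FRAME_RATE_HZ = 1.4                # EPID continuous acquisition rate from the paper
DEADLINE_S = 1.0 / FRAME_RATE_HZ   # ~0.714 s between successive images

def process_frames(frames, compute_dose):
    """Run the dose calculation on each frame, flagging any frame whose
    computation overruns the acquisition gap (the real-time deadline)."""
    missed = []
    for i, frame in enumerate(frames):
        t0 = time.monotonic()
        compute_dose(frame)
        elapsed = time.monotonic() - t0
        if elapsed > DEADLINE_S:
            missed.append(i)  # calculation finished after the next image capture
    return missed
```

Any frame index returned in `missed` indicates a violated execution gap, i.e. a calculation that could not keep up with the 1.4 images-per-second acquisition.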
Abstract: This paper presents the fundamentals of origami engineering and its applications in present-day as well as future industry. Several core mathematical approaches, such as the Huzita-Hatori axioms and the Maekawa and Kawasaki theorems, are introduced briefly. Flaps and circle packing by Robert Lang are explained to convey the underlying principles of designing a crease pattern. Rigid origami and its corrugation patterns, which are potentially applicable to creating transformable or temporary spaces, are discussed to show the transition of origami from paper to thick materials. Moreover, some innovative applications of origami, such as the eyeglass lens, the origami stent and high-tech origami based on the above theories and principles, are showcased in Section III, while updated origami technologies such as Vacuumatics, the self-folding of polymer sheets and programmable matter folding, which could greatly enhance origami structures, are demonstrated in Section IV to offer more insight into the future of origami.
Abstract: Health problems linked to urban growth are current
major concerns of developing countries. In 2002 and 2005, an
interdisciplinary program “Populations et Espaces à Risques
SANitaires” (PERSAN) was set up under the patronage of the
Development and Research Institute. Centered on health in
Cameroon's urban environment, the program mainly sought to (i)
identify diarrhoea risk factors in Yaoundé and (ii) measure their
prevalence and apprehend their spatial distribution. The cross-
sectional epidemiological study that was carried out revealed a
diarrhoea prevalence of 14.4% (437 cases among the 3,034
children examined). Among the risk factors studied, the household
refuse management methods used by city dwellers were statistically
associated with these diarrhoeas. Moreover, levels of diarrhoeal
attack varied markedly from one neighbourhood to another because
of the uneven urbanization of the Yaoundé metropolis.
Abstract: This article presents results obtained using a parametric approach and the Wavelet Transform to analyse signals emitted by the sperm whale. Extracting the intrinsic characteristics of these unique signals emitted by marine mammals remains a difficult exercise for various reasons: firstly, the signals are non-stationary, and secondly, they are obstructed by interfering background noise. In this article, we compare the advantages and disadvantages of two methods: Auto-Regressive models and the Wavelet Transform. These approaches serve as an alternative to the commonly used estimators based on the Fourier Transform, for which the hypotheses necessary for its application are in certain cases not sufficiently satisfied. These modern approaches provide effective results, particularly for the periodic tracking of the signal's characteristics, and notably when the signal-to-noise ratio adversely affects the tracking. Our objectives are twofold. The first is to identify the animal through its acoustic signature, including recognition of the marine mammal species and ultimately of the individual animal within the species. The second is much more ambitious and directly involves cetologists in studying the sounds emitted by marine mammals in an effort to characterize their behaviour. We work from recordings of marine mammal signals, and the findings presented here result from the Wavelet Transform; this article explores the reasons for using this approach. In addition, thanks to the use of new processors, these once computationally heavy algorithms can now be integrated in a real-time system.
Abstract: Lung cancer accounts for the most cancer-related deaths in both men and women. The identification of cancer-associated genes and the related pathways is essential, offering an important possibility for the prevention of many types of cancer. In this work two filter approaches, namely information gain and the biomarker identifier (BMI), are used for the identification of different types of small-cell and non-small-cell lung cancer. A new method to determine the BMI thresholds is proposed that prioritizes genes (as primary, secondary and tertiary) using a k-means clustering approach. Sets of key genes were identified that can be found in several pathways. It turned out that the modified BMI is well suited for microarray data, and BMI is therefore proposed as a powerful tool in the search for new and so far undiscovered genes related to cancer.
Abstract: Developing a supply chain management (SCM) system is costly but important. However, because of its complicated nature, not many such projects are considered successful, and few research publications directly relate to key success factors (KSFs) for implementing an SCM system. Motivated by the above, this research proposes a hierarchy of KSFs for SCM system implementation in the semiconductor industry using a two-step approach. First, a literature review yields the initial hierarchy. The second step uses a focus group to finalize the proposed KSF hierarchy by extracting valuable experience from executives and managers who actively participated in a project that successfully established a seamless SCM integration between the world's largest semiconductor foundry and the world's largest assembly and testing company. Future project executives may refer to the resulting KSF hierarchy as a checklist for SCM system implementation in the semiconductor or related industries.
Abstract: This study applies Geo-Informatic technology to land
tenure and land use in an economic crop area, in order to create
sustainable land use, provide access to the area, and produce a
sustainable food supply for the people of the community. The
research objectives are to 1)
apply Geo-Informatic Technology on land ownership and agricultural
land use (cash crops) in the research area, 2) create GIS database on
land ownership and land use, and 3) create a database of an online
Geo-information system on land tenure and land use. The results of this
study reveal that, first, the study area lies on steep slopes, mountains
and valleys. The land is mainly in the forest zone, which was included in
the Forest Act 1941 and National Conserved Forest 1964. Residents
gained the rights to exploit the land passed down from their
ancestors. The practice was recognized by communities. The land
was suitable for cultivating a wide variety of economic crops that was
the main income of the family. At present the local residents keep
expanding the land to grow cash crops. Second, the database
created for the geographic information system consisted of the area
extent, announcements from the Interior Ministry, interpretation of satellite
images, transportation routes, waterways, plots of land with a title
deed available at the provincial land office. Most pieces of land
without a title deed are located in the forest and national reserve
areas. Data were created from a field study, with land zones
determined by GPS. Last, an online Geo-Informatic system can
show the land tenure and land use information for each economic
crop. High-resolution satellite data can be updated and checked on
the online Geo-Informatic system simultaneously.
Abstract: Approximate tandem repeats in a genomic sequence are
two or more contiguous, similar copies of a pattern of nucleotides.
They are used in DNA mapping, studying molecular evolution
mechanisms, forensic analysis and research in diagnosis of inherited
diseases. Their functions are not yet fully understood, but growing
biological databases, together with tools for the identification of
these repeats, may lead to the discovery of their specific
role or correlation with particular features. This paper presents a new
approach for finding approximate tandem repeats in a given sequence,
where the similarity between consecutive repeats is measured using
the Hamming distance. It is an enhancement of a method for finding
exact tandem repeats in DNA sequences based on the Burrows-
Wheeler transform.
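As an illustration of the similarity criterion, the Hamming distance between consecutive copies, a minimal check might look like the following sketch. The function names are hypothetical, and this omits the Burrows-Wheeler machinery the method uses to locate candidate repeats efficiently:

```python
def hamming(u, v):
    """Number of mismatching positions between two equal-length strings."""
    return sum(a != b for a, b in zip(u, v))

def is_approx_tandem_repeat(seq, start, period, copies, k):
    """Check whether seq[start:] begins with `copies` contiguous copies of
    length `period`, each within Hamming distance k of the first copy."""
    first = seq[start:start + period]
    for c in range(1, copies):
        copy = seq[start + c * period : start + (c + 1) * period]
        if len(copy) < period or hamming(first, copy) > k:
            return False
    return True
```

For example, in "ACGTACGAACGT" the three consecutive 4-mers ACGT, ACGA, ACGT form an approximate tandem repeat for k = 1 (one mismatch in the middle copy) but not for k = 0.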
Abstract: Few subjects in macroeconomics are so widely
discussed yet remain as controversial and unresolved as the choice
of exchange rate regime. National
authorities need to take into consideration numerous fundamentals,
trying to fulfil goals of economic growth, low and stable inflation
and international stability. This paper focuses on the countries of ex-
Yugoslavia and their exchange rate history as independent states. We
follow the development of the regimes in 6 countries during the
transition through the financial crisis of the second part of the 2000s
to the prospects of their final goal: full membership in the European
Union. The main question is to what extent the exchange rate
regime has contributed to their economic success, taking other
objective factors into account.
Abstract: Morphological operators transform an original image
into another image through its interaction with a second image of
a certain shape and size, known as the structuring element.
Mathematical morphology provides a systematic approach to analyze
the geometric characteristics of signals or images, and has been
applied widely to many applications such as edge detection,
object segmentation, noise suppression and so on. Fuzzy
Mathematical Morphology aims to extend the binary morphological
operators to grey-level images. In order to define the basic
morphological operations such as fuzzy erosion, dilation, opening
and closing, a general method based upon fuzzy implication and
inclusion grade operators is introduced. The fuzzy morphological
operations extend the ordinary morphological operations by using
fuzzy sets, in which the union operation is replaced by the
maximum and the intersection operation by the minimum.
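A minimal 1-D sketch of this max/min replacement, using a flat structuring element, is given below. The fully fuzzy versions are defined via implication and inclusion-grade operators, which are not shown here; this sketch only illustrates how grey-level dilation and erosion reduce to maximum and minimum over the structuring element's window:

```python
def dilate(signal, se_radius):
    """Grey-level dilation with a flat structuring element: union -> max."""
    n = len(signal)
    return [max(signal[max(0, i - se_radius):min(n, i + se_radius + 1)])
            for i in range(n)]

def erode(signal, se_radius):
    """Grey-level erosion with a flat structuring element: intersection -> min."""
    n = len(signal)
    return [min(signal[max(0, i - se_radius):min(n, i + se_radius + 1)])
            for i in range(n)]

def opening(signal, se_radius):
    """Opening = erosion followed by dilation; removes peaks narrower
    than the structuring element."""
    return dilate(erode(signal, se_radius), se_radius)
```

On the signal [0, 0, 5, 0, 0] with a radius-1 element, erosion flattens the isolated spike and the opening therefore suppresses it entirely, while dilation spreads it to its neighbours.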
This work consists of two parts. In the first, fuzzy set theory and
fuzzy mathematical morphology, which is based on fuzzy logic and
fuzzy set theory, together with the fuzzy morphological operations
and their properties, are studied in detail. In the second part, the
application of fuzziness in mathematical morphology to practical
work such as image processing is discussed with illustrative problems.
Abstract: This paper describes the designs of a first and second
generation autonomous gas monitoring system and the successful
field trial of the final system (2nd generation). Infrared sensing
technology is used to detect and measure the greenhouse gases
methane (CH4) and carbon dioxide (CO2) at point sources. The
ability to monitor real-time events is further enhanced through the
implementation of both GSM and Bluetooth technologies to
communicate these data in real time. These systems are robust and
reliable, and a necessary tool wherever the monitoring of gas events
in real time is needed.
Abstract: This paper investigates experimentally and
analytically the torsion behavior of steel fibered high strength self
compacting concrete beams reinforced by GFRP bars. Steel fibered
high strength self compacting concrete (SFHSSCC) and GFRP bars
have become very important materials in the structural engineering
field in recent decades. The use of GFRP bars to replace steel
bars has emerged as one of the many techniques put forward to
enhance the corrosion resistance of reinforced concrete structures.
High strength concrete and GFRP bars attract designers and
architects, as they allow improving the durability as well as the
aesthetics of a construction. One trend in SFHSSCC structures is to
ensure ductile behavior; an additional goal is to limit the
development and propagation of macro-cracks in the body of
SFHSSCC elements. SFHSSCC and GFRP bars are tough, improve
the workability, enhance the corrosion resistance of reinforced
concrete structures, and demonstrate high residual strengths after
appearance of the first crack. Experimental studies were carried out
to select effective fiber contents. Three volume fractions of
hooked-end steel fibers were evaluated in this study: 0.0%, 0.75%
and 1.5%. The beam shape was chosen to create the required
forces (i.e. torsion and bending moments simultaneously) on the test
zone. A total of seven beams were tested, classified into three groups.
All beams have a length of 200 cm, a cross section of 10×20 cm,
and longitudinal bottom reinforcement of 3
Abstract: This paper discusses a method for improving accuracy
of fuzzy-rule-based classifiers using particle swarm optimization
(PSO). Two different fuzzy classifiers are considered and optimized.
The first classifier is based on Mamdani fuzzy inference system
(M_PSO fuzzy classifier). The second classifier is based on Takagi-
Sugeno fuzzy inference system (TS_PSO fuzzy classifier). The
parameters of the proposed fuzzy classifiers including premise
(antecedent) parameters, consequent parameters and structure of
fuzzy rules are optimized using PSO. Experimental results show that
higher classification accuracy can be obtained with a lower number
of fuzzy rules by using the proposed PSO fuzzy classifiers. The
performances of the M_PSO and TS_PSO fuzzy classifiers are also
compared to those of other fuzzy-based classifiers.
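As an illustration of the underlying optimizer (not the paper's fuzzy classifiers themselves), a minimal PSO minimizing a toy objective might be sketched as follows; all parameter values (inertia w, acceleration coefficients c1 and c2, swarm size, bounds) are illustrative assumptions:

```python
import random

def pso(objective, dim, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0, seed=0):
    """Minimal particle swarm optimizer minimizing `objective` over [lo, hi]^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                     # each particle's best position
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]    # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # inertia + cognitive pull toward pbest + social pull toward gbest
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In the paper's setting, the position vector would encode the premise and consequent parameters (and rule structure) of the fuzzy classifier, and the objective would be the classification error on training data.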
Abstract: This paper presents the findings of two experiments performed on the Redundancy in Wireless Connection Model (RiWC) using the 802.11b standard. The experiments were simulated using OPNET 11.5 Modeler software. The first was aimed at finding the maximum number of simultaneous Voice over Internet Protocol (VoIP) users the model would support under the G.711 and G.729 codec standards when the packetization interval was 10 milliseconds (ms). The second experiment examined the model's VoIP user capacity using the G.729 codec standard along with background traffic, using the same packetization interval as in the first experiment. To determine the capacity of the model under the various experiments, we checked three metrics: jitter, delay and data loss. When background traffic was added, we checked the response time in addition to these three metrics. The findings of the first experiment indicated that the maximum number of simultaneous VoIP users the model was able to support under G.711 was 5, which is consistent with recent research findings. When using the G.729 codec, the model was able to support up to 16 VoIP users; similar experiments in the current literature have indicated a maximum of 7 users. The findings of the second experiment demonstrated that, with background traffic present, the maximum number of VoIP users the model was able to support was 12.
Abstract: Methane is the second most important greenhouse gas
(GHG) after carbon dioxide. The amount of methane emitted from
the energy sector is increasing day by day with various activities. In
the present work, various sources of methane emission from the
upstream, midstream and downstream segments of the oil & gas
sector are identified and categorised as per the IPCC 2006
guidelines. Data were collected from various oil & gas activities:
(i) exploration & production of oil & gas, (ii) supply through
pipelines, (iii) refinery throughput & production, (iv) storage &
transportation, and (v) usage. Methane emission factors for the
various categories were determined applying the Tier-II and Tier-I
approach using the collected data. The total methane emission from
the Indian oil & gas sector was thus estimated for the years 1990
to 2007.
Abstract: This study used a positivist quantitative approach to examine the mathematical concept acquisition of KS4 (14-16) Special Educational Needs (SEN) students within the school sector in England. The research is based on a pilot study, and the design is holistic in its approach, mixing methodologies. The study combines qualitative and quantitative methods in gathering formative data for the design process. Although the approach could best be described as mixed methods, it rests fundamentally on a strong positivist paradigm; understanding the differentiation of the students, the student-teacher body and the various indicators being measured requires an attenuated description of individual research subjects. The design process involves four phases with five key stages: literature review and document analysis, survey, interview, observation, and finally analysis of the data set. The research identified the need for triangulation, with Reid's phases of data management providing a scaffold for the study. The study clearly identified the ideological and philosophical aspects of educational research design for the study of mathematics by special educational needs (SEN) students in England using a virtual learning environment (VLE) platform.
Abstract: Traditional object segmentation methods are time-consuming and computationally difficult. In this paper, one-dimensional object detection along secant lines is applied. Statistical features of texture images are computed for the recognition process. Example matrices of these features and formulae for calculating the similarity between two feature patterns are given, and experiments are carried out using these features.