Abstract: Positron emission particle tracking (PEPT) is a
technique in which a single radioactive tracer particle can be
accurately tracked as it moves. A limitation of PET is that, in order to
reconstruct a tomographic image, a large volume of data (millions of
events) must be acquired, so it is difficult to study rapidly
changing systems. Because PEPT requires far fewer events, it is a very
fast process compared with PET.
In PEPT, detecting both photons defines a line, and the annihilation
is assumed to have occurred somewhere along this line. The location
of the tracer can be determined to within a few mm from coincident
detection of a small number of pairs of back-to-back gamma rays and
using triangulation. This can be achieved many times per second and
the track of a moving particle can be reliably followed. This
technique was invented at the University of Birmingham [1].
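The triangulation step described above can be sketched as a least-squares problem: each coincidence pair defines a line, and the tracer position is taken as the point minimising the summed squared distance to those lines. This is a generic illustration of such triangulation, not the actual Birmingham implementation, and the line coordinates below are invented.

```python
# Sketch: least-squares intersection of gamma-ray coincidence lines.
# Solves  sum_i (I - d_i d_i^T) x = sum_i (I - d_i d_i^T) p_i  for the
# point x closest (in squared distance) to all lines (p_i, d_i).

def triangulate(lines):
    """lines: list of (point, unit_direction) 3-vectors, one per
    coincidence line; returns the least-squares tracer position."""
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for p, d in lines:
        for i in range(3):
            for j in range(3):
                m = (1.0 if i == j else 0.0) - d[i] * d[j]
                A[i][j] += m
                b[i] += m * p[j]
    # Solve the 3x3 system A x = b by Gaussian elimination with pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (b[r] - sum(A[r][c] * x[c] for c in range(r + 1, 3))) / A[r][r]
    return x

# Three lines through (1, 2, 3) along the coordinate axes recover that point.
lines = [((1, 0, 3), (0, 1, 0)), ((1, 2, 0), (0, 0, 1)), ((0, 2, 3), (1, 0, 0))]
print(triangulate(lines))
```

With noisy real detections the same normal equations simply average out the measurement error across the set of lines.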
The aim in PEPT is not to form an image of the tracer particle
but simply to determine its location over time. If the tracer is
followed for a long enough period within a closed, circulating system,
it explores all possible types of motion.
The applications of PEPT to industrial process systems carried out
at the University of Birmingham fall into two categories: the
behaviour of granular materials and that of viscous fluids. Granular
materials are processed in industry, for example in the manufacture of
pharmaceuticals, ceramics, food and polymers, and PEPT has been used
in a number of ways to study the behaviour of these systems [2].
PEPT offers the possibility of tracking a single particle within the
bed [3]. PEPT has also been used to study fluid flow, such as
viscous fluids in mixers [4], using a neutrally-buoyant tracer
particle [5].
Abstract: Feature and model selection have attracted considerable
attention because of their impact on classifier performance. The two
selections are usually performed separately, but recent developments
suggest using a combined GA-SVM approach to perform them
simultaneously. This approach improves the performance of the
classifier by identifying the best subset of variables and the optimal
parameter values. Although GA-SVM is an effective method, it is
computationally expensive, so a cheaper approximate method is worth
considering. This paper investigates a combined approach of a genetic
algorithm and kernel matrix criteria to perform feature and model
selection simultaneously for the SVM classification problem. The
purpose of this research is to improve the classification performance
of SVM through an efficient approach, the Kernel Matrix Genetic
Algorithm (KMGA) method.
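The abstract does not specify which kernel matrix criterion is used, so the sketch below shows one widely used choice, kernel-target alignment, purely as a plausible example of a GA fitness that scores a kernel (and hence a feature subset and parameter value) without training an SVM. The data, labels and RBF kernel are illustrative assumptions.

```python
import math

def rbf_kernel_matrix(X, gamma):
    """Gram matrix K[i][j] = exp(-gamma * ||x_i - x_j||^2)."""
    n = len(X)
    K = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            sq = sum((a - b) ** 2 for a, b in zip(X[i], X[j]))
            K[i][j] = math.exp(-gamma * sq)
    return K

def alignment(K, y):
    """Kernel-target alignment <K, yy^T>_F / (||K||_F * n) for labels in
    {-1, +1}. Higher values mean the kernel better matches the class
    structure -- a cheap fitness for a GA chromosome encoding features
    and kernel parameters, with no SVM training required."""
    n = len(y)
    num = sum(K[i][j] * y[i] * y[j] for i in range(n) for j in range(n))
    frob = math.sqrt(sum(K[i][j] ** 2 for i in range(n) for j in range(n)))
    return num / (frob * n)

# Two well-separated classes: the alignment should be high.
X = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
y = [-1, -1, 1, 1]
print(alignment(rbf_kernel_matrix(X, gamma=1.0), y))
```

A GA would maximise this score over candidate feature subsets and gamma values, reserving full SVM training for the final chromosome only.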
Abstract: Using animated videos as teaching materials is an
effective learning method. However, we believe an even more effective
method is to have learners produce the teaching videos themselves.
Learners acting as producers must learn and understand the material
well in order to produce and present teaching videos to others. The
purpose of this study is to propose a project-based learning (PBL)
technique based on co-producing videos of IT (information technology)
teaching materials. We used the T2V player to produce the videos,
based on TVML, a TV program description language. With the proposed
method, we assigned learners to produce animated videos for the
National Examination for Information Processing Technicians (IPA
examination) in Japan, in order to have them learn various knowledge
and skills in the IT field. Experimental results showed that a
learning effect useful for the development of IT personnel occurred
during the video production process.
Abstract: Vapour recompression systems have been used to
reduce energy consumption and improve the energy effectiveness of
distillation columns. However, the effects of certain parameters have
not been taken into consideration. One such parameter is the column
heat loss, which has either been assumed to be a certain percentage of
the reboiler heat transfer or neglected. The purpose of this study was
to evaluate the heat loss from an ethanol-water vapour recompression
distillation column with a pressure increase across the compressor
(VRCAS) and to compare the results, and their effect on some
parameters, with a similar system (VRCCS) in which the column heat
loss is assumed or neglected. Results show that the evaluated heat
loss was higher than that obtained for the VRCCS column. The results
also show that an increase in heat loss can significantly affect the
total energy consumption, reboiler heat transfer, number of trays and
energy effectiveness of the column.
Abstract: A vision-based tracking problem is solved through a
combination of optical flow, a MACH filter and log r-θ mapping.
Optical flow is used for detecting regions of movement in video
frames acquired under variable lighting conditions. The region of
movement is segmented and then searched for the target. A template
is used for target recognition on the segmented regions for detecting
the region of interest. The template is trained offline on a sequence of
target images that are created using the MACH filter and log r-θ
mapping. The template is applied to areas of movement in
successive frames, and strong correlation is observed for in-class
targets. Correlation peaks above a certain threshold indicate the
presence of a target, which is then tracked over successive frames.
Abstract: The aim of this paper is to characterize and alleviate the effect of phase noise due to imperfect local oscillators on the performance of a Multi-Carrier CDMA system. After cancellation of the Common Phase Error (CPE), an iterative approach is introduced which estimates the Inter-Carrier Interference (ICI) components in the frequency domain and cancels their contribution in the time domain. Simulations are conducted in order to investigate the achievable performance for several parameters, such as the spreading factor, the modulation order, the phase noise power and the transmission Signal-to-Noise Ratio.
Abstract: The purpose of this work is to develop an automatic classification system that could be useful for radiologists in breast cancer investigation. The software has been designed in the framework of the MAGIC-5 collaboration. In an automatic classification system, suspicious regions with a high probability of containing a lesion are extracted from the image as regions of interest (ROIs). Each ROI is characterized by features based mainly on morphological lesion differences. A study of the feature space representation is carried out, and several classifiers are tested to distinguish pathological regions from healthy ones. The results, in terms of sensitivity and specificity, are presented through ROC (Receiver Operating Characteristic) curves. In particular, the best performance is obtained with Neural Networks in comparison with K-Nearest Neighbours and the Support Vector Machine: the Radial Basis Function network supplies the best results, with an area under the ROC curve of 0.89 ± 0.01, but similar results are obtained with the Probabilistic Neural Network and a Multi-Layer Perceptron.
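The area under the ROC curve reported above can be computed directly from classifier scores via the rank (Mann-Whitney) statistic: the probability that a randomly chosen pathological ROI scores higher than a randomly chosen healthy one. The sketch below is generic, and the scores are invented, not data from the MAGIC-5 study.

```python
def roc_auc(scores, labels):
    """Area under the ROC curve from raw scores: the fraction of
    (positive, negative) pairs the classifier ranks correctly,
    counting ties as half."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative classifier outputs: 1 = pathological ROI, 0 = healthy.
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1, 1, 0, 1, 0, 0]
print(roc_auc(scores, labels))  # 8 of 9 pairs ranked correctly
```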
Abstract: Recent years have seen increasing use of image analysis techniques in the field of biomedical imaging, in particular in microscopic imaging. The basic step of most image analysis techniques relies on a background image free of objects of interest, whether cells or histological samples, to perform further analysis such as segmentation or mosaicing. Commonly, this image consists of an empty field acquired in advance. However, acquiring an empty field is often not feasible. Moreover, it may differ from the background of the sample actually being studied because of interaction with the organic matter. Finally, it can be expensive, for instance in live cell analyses. We propose a non-parametric, general-purpose approach in which the background is built automatically from a sequence of images that may contain objects of interest. The amount of object-free area in each image affects only the overall speed of obtaining the background. Experiments with different kinds of microscopic images prove the effectiveness of our approach.
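One simple, generic way to build a background from frames that still contain objects, in the spirit of (though not necessarily identical to) the approach described, is a per-pixel temporal median: moving objects occupy any given pixel in only a minority of frames, so the median recovers the empty field. The tiny frames below are invented for illustration.

```python
def median_background(frames):
    """Per-pixel temporal median across a sequence of grayscale frames
    (each frame a list of rows of intensities). Works even when every
    frame contains objects, as long as each pixel is background in a
    majority of frames."""
    h, w = len(frames[0]), len(frames[0][0])
    bg = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = sorted(f[y][x] for f in frames)
            bg[y][x] = vals[len(vals) // 2]
    return bg

# Three 1x3 frames: a bright "cell" (200) drifts over a background of 10.
frames = [[[200, 10, 10]], [[10, 200, 10]], [[10, 10, 200]]]
print(median_background(frames))
```

The more object-free area each frame has, the fewer frames are needed before every pixel has a background majority, which matches the speed trade-off stated in the abstract.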
Abstract: In this paper, we propose a novel concept of relative
distance measurement using stereo vision technology and discuss
its implementation on an FPGA-based real-time image processor. We
capture two images using two CCD cameras and compare them.
Disparity is calculated for each pixel using a real time dense disparity
calculation algorithm. This algorithm is based on the concept of
indexed histogram for matching. Since disparity is inversely
proportional to distance (proved later in the paper), we can obtain the relative
distances of objects in front of the camera. The output is displayed on
a TV screen in the form of a depth image (optionally using pseudo
colors). This system works in real time on a full PAL frame rate (720
x 576 active pixels @ 25 fps).
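The inverse relation between disparity and distance follows from pinhole stereo geometry: similar triangles give Z = f·B/d for focal length f, baseline B and disparity d. A minimal sketch, with illustrative numbers rather than the paper's camera parameters:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo geometry: Z = f * B / d, so distance is inversely
    proportional to the per-pixel disparity."""
    return focal_px * baseline_m / disparity_px

# Assumed rig: focal length 800 px, baseline 0.12 m (illustrative only).
f, B = 800.0, 0.12
near = depth_from_disparity(64.0, f, B)  # large disparity -> close object
far = depth_from_disparity(8.0, f, B)    # small disparity -> distant object
print(near, far)  # 1.5 12.0  (metres)
```

Applying this per pixel to the dense disparity map yields exactly the depth image described, ready for pseudo-colour display.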
Abstract: This paper presents the fundamentals of origami engineering and its applications in present-day as well as future industry. The main cores of the mathematical approaches, such as the Huzita-Hatori axioms and the Maekawa and Kawasaki theorems, are introduced briefly. Flaps and circle packing, as developed by Robert Lang, are explained to clarify the underlying principles of crease pattern design. Rigid origami and its corrugation patterns, which are potentially applicable for creating transformable or temporary spaces, are discussed to show the transition of origami from paper to thick material. Moreover, some innovative applications of origami, such as the origami eyeglass, the origami stent and high-tech origami based on the aforementioned theories and principles, are showcased in Section III, while recent origami technologies such as Vacuumatics, the self-folding of polymer sheets and programmable matter folding, which could greatly enhance origami structures, are demonstrated in Section IV to offer more insight into the future of origami.
Abstract: Tourism is a phenomenon that human communities have valued for a long time. It has been evolving continually in response to a variety of social and economic needs; with the development of communication, the considerable increase in tourist numbers and the resulting exchange income, it has produced many benefits for communities, such as employment. For the purpose of tourism development in this zone, suitable times and locations for tourist visits need to be specified. One of the most important needs of tourists is knowledge of climate conditions and suitable times for sightseeing. In this survey, the climate trend conditions for receiving tourists in Isfahan province were identified using the modified tourism climate index (TCI) together with the SPSS, GIS, Excel and Surfer software packages. This index systematically evaluates the climate conditions for tourism activities using the monthly means of maximum daily temperature, daily mean temperature, minimum relative humidity, daily mean relative humidity, precipitation (mm), total sunshine hours, wind speed and dust. The results obtained with Kendall's correlation test show that all twelve months, from January to December, are significant and have an increasing trend, which indicates improving conditions for receiving tourists. Sunshine, precipitation, mean and maximum temperature and dust were estimated for 1976-2005 and Kendall's correlation test was applied again to determine which parameters were influential. Based on the test, the amount of dust in February, March, April, May, June, July, August, October and November is decreasing, precipitation in September and January is increasing, and the radiation rate in May and August is increasing, all of which indicate more comfortable conditions. The maximum temperature in June is also decreasing.
Isfahan province has two peaks, in spring and fall, and the best places for tourism are in the northern and western areas.
Abstract: In this paper we present a technique to speed up
ICA based on the idea of reducing the dimensionality of the data
set while preserving the quality of the results. In particular we refer to
the FastICA algorithm, which uses kurtosis as the statistical property
to be maximized. By performing a Johnson-Lindenstrauss-like
projection of the data set, we find the minimum dimensionality
reduction rate ρ, defined as the ratio between the size k of the reduced
space and the original dimension d, which guarantees a narrow confidence
interval for this estimator with a high confidence level. The derived
dimensionality reduction rate depends on a system control parameter
β easily computed a priori on the basis of the observations alone.
Extensive simulations have been carried out on different sets of
real-world signals. They show that the achievable dimensionality
reduction is in fact very high, that it preserves the quality of the
decomposition, and that it dramatically speeds up FastICA. On the
other hand, a set of signals for which the estimated reduction rate is
greater than 1 exhibits poor decomposition results if reduced, thus
validating the reliability of the parameter β. We are confident that
our method will lead to a better approach to real-time applications.
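A Johnson-Lindenstrauss-like projection of the kind referred to above can be sketched as multiplication by a scaled random Gaussian matrix, which approximately preserves Euclidean norms while reducing the dimension from d to k. The construction below is one standard choice, not necessarily the authors' exact projection, and the data are synthetic.

```python
import math
import random

def jl_project(X, k, seed=0):
    """Johnson-Lindenstrauss-style projection: multiply each d-dimensional
    sample by a k x d random Gaussian matrix scaled by 1/sqrt(k), so
    Euclidean norms are approximately preserved while d shrinks to k."""
    rng = random.Random(seed)
    d = len(X[0])
    R = [[rng.gauss(0.0, 1.0) / math.sqrt(k) for _ in range(d)]
         for _ in range(k)]
    return [[sum(R[i][j] * x[j] for j in range(d)) for i in range(k)]
            for x in X]

# Synthetic 500-dimensional samples, reduced 5x to k = 100.
rng = random.Random(1)
X = [[rng.gauss(0, 1) for _ in range(500)] for _ in range(5)]
Y = jl_project(X, k=100)
orig = [math.sqrt(sum(v * v for v in x)) for x in X]
proj = [math.sqrt(sum(v * v for v in y)) for y in Y]
# Norm ratios stay close to 1 despite the 5x dimensionality reduction.
print([round(p / o, 2) for p, o in zip(proj, orig)])
```

In the FastICA setting, preserving norms (and, by polarization, inner products) is what allows kurtosis estimates in the reduced space to stay within a narrow confidence interval of their full-dimensional values.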
Abstract: Health problems linked to urban growth are currently
major concerns of developing countries. In 2002 and 2005, an
interdisciplinary program, "Populations et Espaces à Risques
SANitaires" (PERSAN), was set up under the patronage of the
Development and Research Institute. Centered on health in
Cameroon's urban environment, the program mainly sought (i) to
identify diarrhoea risk factors in Yaoundé and (ii) to measure their
prevalence and understand their spatial distribution. The
cross-sectional epidemiological study that was carried out revealed a
diarrhoea prevalence of 14.4% (437 cases among the 3,034 children
examined). Among the risk factors studied, the household refuse
management methods used by city dwellers were statistically
associated with these diarrhoeas. Moreover, diarrhoea attack rates
varied considerably from one neighbourhood to another because of the
uneven urbanization of the Yaoundé metropolis.
Abstract: Malaysia is aggressive in promoting ICT usage among its
mass population through government policies and programs targeting
the general population. However, with the uneven distribution of
basic telecommunication infrastructure between urban and rural areas,
the cost of being "interconnected", which the poorer rural population
considers high, and the lack of local content suited to rural needs
and lifestyles, it is still a challenge for Malaysia to achieve its
Vision 2020 agenda of moving the nation towards an information
society by the year 2020. Among the existing programs carried out by
the government to encourage ICT usage by the rural population is
"Kedaikom", a community telecenter whose general aim is to engage the
community in using ICT and to encourage the diffusion of ICT to the
rural population. The research investigated, through a questionnaire
survey, how Kedaikom, as a community telecenter, could play a role in
encouraging the rural or underserved community to use ICT. The
results from the survey show that a community telecenter can bridge
the digital divide between the underserved rural population and the
well-connected urban population in Malaysia. More of the rural
population, especially the younger generation and those with a higher
educational background, are using the community telecenter to connect
to ICT.
Abstract: Added stresses due to an adjacent structure should be
considered in foundation design and in controlling the stresses in the
soil beneath a structure. This case receives less attention in design
and calculation than others, even though the stresses arising in
practice are greater than the analytical stresses.
Structure loads are transmitted to the ground through the foundation,
whose role is to spread the load over the continuous, semi-infinite
soil. This reduces the resulting stresses to within the allowable
strength of the soil. Researchers such as Boussinesq and Westergaard
studied this issue theoretically under certain assumptions. The aim of
this paper is to study and evaluate the added stresses under a
structure due to an adjacent structure. For this purpose, using
assumptions, theoretical relations and numerical methods, the effects
of an adjacent structure of 4 to 10 storeys on a main structure of 4
storeys are studied, and the effect and sensitivity of the parameters
are evaluated.
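For reference, the Boussinesq solution mentioned above gives the vertical stress increase beneath a point load on the surface of an elastic half-space. The load and depths below are illustrative, not values from the paper.

```python
import math

def boussinesq_sigma_z(Q, x, y, z):
    """Vertical stress increase at depth z and horizontal offset (x, y)
    beneath a surface point load Q on an elastic half-space:
        sigma_z = 3 Q z^3 / (2 pi R^5),  R = sqrt(x^2 + y^2 + z^2)."""
    R = math.sqrt(x * x + y * y + z * z)
    return 3.0 * Q * z ** 3 / (2.0 * math.pi * R ** 5)

# A hypothetical 1000 kN column load from an adjacent building, evaluated
# 3 m below grade: directly under the load and 4 m away horizontally.
under = boussinesq_sigma_z(1000.0, 0.0, 0.0, 3.0)   # kN/m^2
offset = boussinesq_sigma_z(1000.0, 4.0, 0.0, 3.0)  # kN/m^2
print(round(under, 2), round(offset, 2))
```

The rapid decay with horizontal offset is why the added stress from an adjacent structure is often neglected, and why this paper argues it should nonetheless be checked for tall neighbours.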
Abstract: This article presents results obtained using a parametric approach and the Wavelet Transform to analyse signals emitted by the sperm whale. Extracting the intrinsic characteristics of these unique signals emitted by marine mammals remains a difficult exercise for various reasons: firstly, the signals are non-stationary, and secondly, they are obscured by interfering background noise. In this article, we compare the advantages and disadvantages of the two methods: autoregressive models and the Wavelet Transform. These approaches serve as an alternative to the commonly used estimators based on the Fourier Transform, whose underlying hypotheses are, in certain cases, not sufficiently satisfied. These modern approaches provide effective results, particularly for periodic tracking of the signal's characteristics, notably when the signal-to-noise ratio negatively affects signal tracking. Our objectives are twofold. The first is to identify the animal through its acoustic signature, including recognition of the marine mammal species and, ultimately, of the individual animal within the species. The second is much more ambitious and directly involves cetologists in studying the sounds emitted by marine mammals in an effort to characterize their behaviour. We are working on an approach based on recordings of marine mammal signals whose findings result from the Wavelet Transform. This article explores the reasons for using this approach. In addition, thanks to new processors, these algorithms, once heavy in computation time, can now be integrated into a real-time system.
Abstract: Unlike this study, which focuses extensively on the trading
behavior of the option market, previous research has concentrated on
model-driven option pricing. For example, the Black-Scholes (B-S)
model is one of the most famous option pricing models. However,
shortcomings of the B-S model have been noted in reviews of pricing
models. This paper therefore argues for the importance of dynamic
behavior in option pricing, which is also the reason for using a
genetic algorithm (GA). Exploiting its mechanisms of natural
selection and evolution, this study proposes a hybrid model, the
Genetic-BS model, which combines the GA and B-S to estimate prices
more accurately. In the final experiments, the estimated price shows
a lower MAE than the price calculated by either the B-S model or its
enhanced variant, the Gram-Charlier GARCH (G-C GARCH) model. We
conclude that the Genetic-BS pricing model is practical.
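For context, the B-S component of the hybrid is the standard Black-Scholes European call formula. The sketch below implements only that formula with illustrative inputs; the GA layer that would tune inputs such as volatility is not shown.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call price for spot S, strike K, maturity T
    (years), risk-free rate r and volatility sigma. In a Genetic-BS style
    hybrid, a GA would evolve inputs such as sigma instead of fixing them."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# At-the-money call, 1 year to maturity, 20% volatility, zero rate
# (illustrative inputs, not the paper's data):
print(round(bs_call(100.0, 100.0, 1.0, 0.0, 0.2), 4))  # 7.9656
```

A GA fitness in such a hybrid would then compare prices like this against observed market prices, e.g. by minimising MAE over a population of candidate parameter sets.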
Abstract: One problem in object-oriented software development
is the difficulty of finding appropriate and suitable objects with
which to start the system. In this work, ontologies support object
discovery in the initial phase of object-oriented software
development. Much research has tried to demonstrate the great
potential shared by the object model and ontologies. Constructing an
ontology from an object model, called ontology engineering, is
feasible; conversely, this research aims to show that building an
object model from an ontology is also promising and practical.
Ontology classes are available online in many specific areas and can
be found with semantic search engines. There are also many supporting
tools; those used in this research are the Protégé ontology editor
and Visual Paradigm, which together give a good outcome. This
research shows how the approach works efficiently on a real case
study using ontology classes in the travel/tourism domain. Classes,
properties and relationships from more than two ontologies need to be
combined in order to generate the object model. This paper presents a
simple methodology framework which explains the process of
discovering objects. The results show that this framework has great
value and is open to expansion. Reusing existing ontologies offers a
much cheaper alternative than building new ones from scratch. More
ontologies are becoming available on the web, and online ontology
libraries for storing and indexing ontologies are increasing in
number and demand. Semantic and ontology search engines have also
begun to appear, facilitating the search and retrieval of online
ontologies.
Abstract: The memristor is known as the fourth fundamental
passive circuit element. When current flows in one direction through
the device, the electrical resistance increases; when current flows
in the opposite direction, the resistance decreases. When the current
is stopped, the component retains the last resistance it had, and
when the flow of charge starts again, the resistance of the circuit
will be what it was when it was last active. It behaves as a
nonlinear resistor with memory. Memristors have recently generated
wide research interest and have found many applications. In this
paper we survey the various applications of memristors, which include
non-volatile memory, nanoelectronic memories, computer logic,
neuromorphic computer architectures, low-power remote sensing
applications, crossbar latches as transistor replacements, and analog
computation and switches.
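The memory behavior described above can be illustrated with a toy charge-controlled model in which resistance drifts with the time integral of current and is retained when the current stops. The parameters below are illustrative only, not taken from any real device model.

```python
class ToyMemristor:
    """Toy charge-controlled memristor: resistance drifts with the time
    integral of current and is retained when the current stops.
    r0, r_min, r_max and k are illustrative, not device data."""

    def __init__(self, r0=100.0, r_min=10.0, r_max=1000.0, k=50.0):
        self.r, self.r_min, self.r_max, self.k = r0, r_min, r_max, k

    def step(self, current, dt):
        # Resistance rises for current in one direction, falls for the
        # other, and stays clamped between the device limits.
        self.r = min(max(self.r + self.k * current * dt, self.r_min),
                     self.r_max)
        return self.r

m = ToyMemristor()
for _ in range(10):
    m.step(+1.0, 0.01)   # forward current -> resistance increases
r_forward = m.r          # 105.0
m.step(0.0, 0.01)        # no current -> last resistance is retained
r_retained = m.r         # still 105.0
for _ in range(10):
    m.step(-1.0, 0.01)   # reverse current -> resistance decreases
print(r_forward, r_retained, m.r)  # 105.0 105.0 100.0
```

This state-retention property is exactly what the non-volatile memory and crossbar-latch applications listed above exploit.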
Abstract: Nowadays, many manufacturing companies try to
reinforce their competitiveness or find a breakthrough through
collaboration. In Korea, more than 900 manufacturing companies are
using web-based collaboration systems developed by a government-led
project referred to as i-Manufacturing. The system supports some
functions similar to those of Product Data Management (PDM) as well
as a Project Management System (PMS). A web-based collaboration
system provides many useful functions for collaborative work. This
system, however, does not support new linking services between buyers
and suppliers. Therefore, in order to find new
collaborative partners, this paper proposes a framework which creates
new connections between buyers and suppliers facilitating their
collaboration, referred to as Excellent Manufacturer Scouting System
(EMSS). EMSS plays a role as a bridge between overseas buyers and
suppliers. As a part of study on EMSS, we also propose an evaluation
method of manufacturability of potential partners with six main factors.
Based on the results of evaluation, buyers may get a good guideline to
choose their new partners before getting into negotiation processes
with them.