Abstract: Automatic detection of bleeding is of practical
importance since capsule endoscopy produces an extremely large
number of images. Algorithm development of bleeding detection in
the digestive tract is difficult due to different contrasts among the
images, food dregs, secretion and others. In this study, weighting
factors were derived from independent contrast and brightness features
that distinguish bleeding from normal tissue. Spectral analysis based
on these weighting factors was fast and accurate. Results
were a sensitivity of 87% and a specificity of 90% when the accuracy
was determined for each pixel out of 42 endoscope images.
Abstract: In this paper we report the technique of optical
induction of 2 and 3-dimensional (2D and 3D) photonic lattices in
photorefractive materials based on diffraction grating self-replication
(the Talbot effect). 1D and 2D diffraction masks of different rotational
symmetry, with periods of a few tens of micrometers, and a 532 nm cw laser
beam were used in the experiments to form an intensity modulated
light beam profile. Replications of the mask-generated intensity
structures, on a scale of a few hundred micrometers, were observed
along the beam propagation axis. Up to 20 high-contrast replications
were detected for a 1D annular mask with 30
Abstract: Rooted in the study of the social functioning of space in architecture, Space Syntax (SS) and the more recent Network Pattern (NP) research demonstrate the 'spatial structures' of a city, i.e. the hierarchical patterns of streets, junctions and alley ends. Applying SS and NP models, planners can conceptualize the real city's patterns. Although both models yield the optimal paths of the city, their underpinning displays of the city's spatial configuration differ. The Axial Map analyzes the topological, non-distance-based connectivity structure, whereas the Central-Node Map and the Shortcut-Path Map analyze the metrical, distance-based structures. This research contrasts and combines them to understand the various forms of a city's structures. It concludes that, while they reveal different spatial structures, the Space Syntax and Network Pattern urban models support each other. Combined, they simulate both the global access structures and the locally compact structures, namely the central nodes and the shortcuts, of the city.
Abstract: This article presents a detailed analysis and comparative
performance evaluation of model reference adaptive control systems.
In contrast to classical control theory, adaptive control methods make
it possible to deal with time-variant processes. Inspired by the works [1] and
[2], two methods based on the MIT rule and Lyapunov rule are
applied to a linear first order system. The system is simulated and
it is investigated how changes to the adaptation gain affect the
system performance. Furthermore, variations in the reference model
parameters, that is, changes to the desired closed-loop behaviour, are
examined.
Abstract: This paper proposes a method for speckle reduction in
medical ultrasound imaging while preserving the edges with the
added advantages of adaptive noise filtering and speed. A nonlinear
image diffusion method that incorporates a local image parameter,
namely scatterer density, in addition to the gradient to weight the
nonlinear diffusion process is proposed. The method was tested for
the isotropic case with a contrast-detail phantom and a variety of
clinical ultrasound images, and then compared to linear and some
other diffusion enhancement methods. Different diffusion parameters
were tested and tuned to best reduce speckle noise and preserve
edges. The method showed superior performance measured both
quantitatively and qualitatively when incorporating scatterer density
into the diffusivity function. The proposed filter can be used as a
preprocessing step for ultrasound image enhancement before
applying automatic segmentation, automatic volumetric calculations,
or 3D ultrasound volume rendering.
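The weighting idea above can be sketched as a standard Perona-Malik style nonlinear diffusion whose diffusivity is scaled down where a local scatterer-density proxy is high; the density estimator, the edge-stopping function and all parameter values below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def diffusivity(grad_mag, density, k=0.1):
    """Edge-stopping function weighted by gradient magnitude and a
    local scatterer-density proxy (a hypothetical stand-in for the
    paper's density estimator)."""
    return np.exp(-(grad_mag / k) ** 2) * (1.0 / (1.0 + density))

def diffuse(img, steps=20, dt=0.2, k=0.1):
    """One isotropic nonlinear-diffusion pass on a 2D image."""
    u = img.astype(float).copy()
    for _ in range(steps):
        # differences toward the four neighbours (periodic boundary)
        n = np.roll(u, -1, 0) - u
        s = np.roll(u, 1, 0) - u
        e = np.roll(u, -1, 1) - u
        w = np.roll(u, 1, 1) - u
        # crude density proxy: normalized local intensity (assumption)
        density = u / (u.max() + 1e-12)
        u += dt * (diffusivity(abs(n), density, k) * n +
                   diffusivity(abs(s), density, k) * s +
                   diffusivity(abs(e), density, k) * e +
                   diffusivity(abs(w), density, k) * w)
    return u
```

With dt * 4 < 1 the explicit update is stable, so each step strictly smooths homogeneous (speckle-dominated) regions while the exponential term preserves strong edges.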
Abstract: Particle detection in very noisy and low contrast images
is an active field of research in image processing. In this article, a
method is proposed for the efficient detection and sizing of subsurface
spherical particles, which is used for the processing of softly fused
Au nanoparticles. Transmission Electron Microscopy is used for
imaging the nanoparticles, and the proposed algorithm has been
tested with the two-dimensional projected TEM images obtained.
Results are compared with the data obtained by transmission optical
spectroscopy, as well as with conventional circular object detection
algorithms.
Abstract: The design requirements for successful human
accommodation in urban spaces are well known; and the range of
facilities available for meeting urban water quality and quantity
requirements is also well established. Their competing requirements
must be reconciled in order for urban spaces to be successful for
both. This paper outlines the separate human and water imperatives
and their interactions in urban spaces. Stormwater management
facilities' relative potential contributions to urban spaces are
contrasted, and design choices for achieving those potentials are
described. This study uses human success of urban space as the
evaluative criterion of stormwater amenity: human values call on
stormwater facilities to contribute to successful human spaces.
Placing water's contribution under the overall idea of successful
urban space is an evolution from previous subjective evaluations.
The information is based on photographs and notes from
approximately 1,000 stormwater facilities and urban sites collected
during the last 35 years in North America and overseas, and the
author's experience on multi-disciplinary design teams. This
conceptual study combines the disciplinary roles of engineering,
landscape architecture, and sociology in effecting successful urban
design.
Abstract: A larval survey was carried out in 6 localities in the
urban areas (Putrajaya) and suburban areas (Kuala Selangor) from
January until December 2010. A total of 520 representative
households in 6 localities were selected. Breeding habitats were
sampled outdoors in the surroundings of housing areas. The study
indicated that the most predominant species found in both areas was
Aedes albopictus, with gardening utensils as the preferred breeding
microhabitat in Putrajaya, in contrast to artificial containers in
Kuala Selangor. Of a total of 1083 mosquito larvae, 984 were Aedes
albopictus, 67 were Aedes aegypti and 32 were Culex. The Aedes Index
and Container Index were elevated in Putrajaya, at 13% and 11%
respectively, higher than the standard set by the Ministry of Health,
Malaysia. These results indicate that dengue risk is skewed toward
the urban areas. The Breteau Index was also above the standard in
both study locations.
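The three larval indices cited above have standard entomological definitions based on positive houses or containers relative to those inspected; a minimal sketch, using illustrative counts rather than the survey's raw data:

```python
def aedes_index(positive_houses, houses_inspected):
    """House (Aedes) Index: % of inspected houses with larvae."""
    return 100.0 * positive_houses / houses_inspected

def container_index(positive_containers, containers_inspected):
    """Container Index: % of inspected water-holding containers
    with larvae."""
    return 100.0 * positive_containers / containers_inspected

def breteau_index(positive_containers, houses_inspected):
    """Breteau Index: positive containers per 100 houses inspected."""
    return 100.0 * positive_containers / houses_inspected
```

For example, 13 larvae-positive houses out of 100 inspected gives an Aedes Index of 13%, matching the elevated Putrajaya figure reported above.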
Abstract: In this paper, we propose a Perceptually Optimized Embedded ZeroTree Image Coder (POEZIC) that applies a perceptual weighting to the wavelet transform coefficients before the SPIHT encoding stage, in order to reach a targeted bit rate with a perceptual quality improvement with respect to the coding quality obtained using the SPIHT algorithm alone. The paper also introduces a new objective quality metric based on a psychovisual model integrating properties of the human visual system (HVS) that play an important role in our POEZIC quality assessment. Our POEZIC coder is based on a vision model that incorporates various masking effects of HVS perception. Thus, our coder weights the wavelet coefficients based on that model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on 1) luminance masking and contrast masking, 2) the contrast sensitivity function (CSF), to achieve the perceptual decomposition weighting, and 3) the Wavelet Error Sensitivity (WES), used to reduce the perceptual quantization errors. The new perceptually optimized codec has the same complexity as the original SPIHT technique. However, the experimental results show that our coder demonstrates very good performance in terms of quality measurement.
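The CSF-based subband weighting in step 2) can be sketched with the Mannos-Sakrison contrast sensitivity function, a common choice for such weighting; the per-level centre-frequency assignment below is an assumption for illustration, not POEZIC's exact weights:

```python
import math

def csf(f):
    """Mannos-Sakrison contrast sensitivity function, with f in
    cycles per degree; peaks at mid frequencies (~8 cpd)."""
    return 2.6 * (0.0192 + 0.114 * f) * math.exp(-(0.114 * f) ** 1.1)

def subband_weights(levels, f_max):
    """One CSF weight per wavelet decomposition level, evaluated at
    each level's centre frequency (the band (f_max/2**(l+1),
    f_max/2**l) has centre 1.5 * f_max / 2**(l+1)); an illustrative
    scheme, with f_max set by the viewing distance."""
    return [csf(1.5 * f_max / 2 ** (l + 1)) for l in range(levels)]
```

Because the CSF peaks at mid frequencies, such weights emphasize the subbands the eye is most sensitive to and de-emphasize the finest and coarsest bands.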
Abstract: This paper argues that networks, such as the ECN and the American network, are affected by certain small events which are inherent to path dependence and preclude the full evolution towards efficiency. It is advocated that the American network is superior to the ECN in many respects due to its greater flexibility and longer history. This stems in particular from the creation of the American network, which was based on a small number of cases. Such a structure encourages further changes and modifications which are not necessarily radical. The ECN, by contrast, was established by legislative action, which explains its rigid structure and resistance to change. This paper is an attempt to transpose the superiority of the American network on to the ECN. It looks at concepts such as judicial cooperation, harmonisation of procedure, peer review and regulatory impact assessments (RIAs), and dispute resolution procedures.
Abstract: In Multiple Sclerosis, pathological changes in the
brain result in deviations in signal intensity on Magnetic Resonance
Images (MRI). Quantitative analysis of these changes and their
correlation with clinical findings provides important information for
diagnosis. This constitutes the objective of our work. A new approach
is developed. After enhancement of the image contrast and extraction
of the brain by a mathematical morphology algorithm, we proceed to
brain segmentation. Our approach is based on building a statistical
model from the data itself for normal brain MRI, including clustering
by tissue type. Then we detect signal abnormalities (MS lesions) as a
rejection class containing voxels that are not explained by the built
model. We validate the method on MR images of Multiple Sclerosis
patients by comparing its results with those of human expert
segmentation.
Abstract: The POD-assisted projective integration method based on the equation-free framework is presented in this paper. The method is essentially based on the slow manifold governing the given system. We have applied two variants, the "on-line" and "off-line" methods, to solve the one-dimensional viscous Burgers' equation. For the on-line method, we compute the slow manifold by extracting the POD modes and use them on-the-fly during the projective integration process, without assuming knowledge of the underlying slow manifold. In contrast, for the off-line method the underlying slow manifold must be computed prior to the projective integration process. The projective step is performed by the forward Euler method. Numerical experiments show that for a non-periodic system the on-line method is more efficient than the off-line method. Moreover, the on-line approach is more realistic when applying the POD-assisted projective integration method to arbitrary systems. The critical value of the projective time step, which directly limits the efficiency of both methods, is also shown.
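The on-line variant described above can be sketched as follows: extract POD modes from recent snapshots via the SVD, take a few small inner steps, then make one large forward-Euler jump in POD coordinates. The right-hand side, step sizes and mode count below are illustrative assumptions; the paper's Burgers' solver is not reproduced:

```python
import numpy as np

def pod_modes(snapshots, r):
    """Extract the r leading POD modes from a snapshot matrix whose
    columns are states, via the singular value decomposition."""
    U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
    return U[:, :r]

def projective_euler(f, u, dt, inner, big_dt, modes):
    """Take `inner` small steps with right-hand side f, then one
    projective forward-Euler jump of size big_dt taken in POD
    coordinates."""
    for _ in range(inner):
        u = u + dt * f(u)
    a = modes.T @ u                     # project onto the POD subspace
    da = modes.T @ f(u)                 # estimated slow time derivative
    return modes @ (a + big_dt * da)    # extrapolate and lift back
```

For a fast-slow system, the inner steps let the fast transients heal onto the slow manifold before the large extrapolation, which is the essential gain of projective integration.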
Abstract: All practical real-time scheduling algorithms for multiprocessor systems present a trade-off between computational complexity and performance. In real-time systems, tasks have to be performed both correctly and on time. Finding a minimal schedule in multiprocessor systems with real-time constraints is known to be NP-hard. Although some optimal algorithms have been employed in uniprocessor systems, they fail when applied to multiprocessor systems. Practical scheduling algorithms in real-time systems do not have deterministic response times, yet deterministic timing behaviour is an important parameter for system robustness analysis. The intrinsic uncertainty in dynamic real-time systems increases the difficulty of the scheduling problem. To alleviate these difficulties, we propose a fuzzy scheduling approach for arranging real-time periodic and non-periodic tasks in multiprocessor systems. Static and dynamic optimal scheduling algorithms fail under non-critical overload. In contrast, our approach balances the task loads of the processors successfully while providing starvation prevention and fairness, so that higher-priority tasks have a higher running probability. A simulation is conducted to evaluate the performance of the proposed approach. Experimental results show that the proposed fuzzy scheduler creates feasible schedules for homogeneous and heterogeneous tasks. It also considers task priorities, which leads to higher system utilization and lower deadline miss times. According to the results, it performs very close to the optimal schedule of uniprocessor systems.
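A fuzzy scheduler of this kind assigns each task a priority through fuzzy inference over quantities such as laxity and processor load; the toy rule base below (two rules with min as fuzzy AND and weighted-average defuzzification) is purely illustrative and is not the paper's actual rule set:

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_priority(laxity, load):
    """Toy fuzzy priority: rule 1, 'urgent -> high priority';
    rule 2, 'relaxed AND busy -> low priority'; defuzzified as a
    weighted average of the singleton outputs 1.0 and 0.2
    (all shapes and constants are illustrative assumptions)."""
    urgent = tri(laxity, -0.5, 0.0, 0.5)    # laxity near zero
    relaxed = tri(laxity, 0.0, 1.0, 2.0)    # comfortable slack
    w_high = urgent
    w_low = min(relaxed, load)              # fuzzy AND as min
    denom = w_high + w_low
    return 0.5 if denom == 0 else (1.0 * w_high + 0.2 * w_low) / denom
```

A dispatcher would then place the highest-priority ready task on the least-loaded processor, which is one way to realize the load balancing and fairness goals described above.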
Abstract: Biometric techniques are gaining importance for
personal authentication and identification as compared to the
traditional authentication methods. Biometric templates are
vulnerable to a variety of attacks due to their inherent nature. When a
person's biometric is compromised, his identity is lost. In contrast to
a password, a biometric is not revocable. Therefore, providing security
for the stored biometric template is crucial. Crypto-biometric systems
are authentication systems that blend the ideas of cryptography and
biometrics. The fuzzy vault is a proven crypto-biometric construct
used to secure biometric templates. However, the fuzzy vault suffers
from certain limitations such as non-revocability and cross-matching.
The security of the fuzzy vault is affected by the non-uniform nature
of the biometric data. The fuzzy vault, when hardened with a password,
overcomes these limitations. A password
provides an additional layer of security and enhances user privacy.
Retina has certain advantages over other biometric traits. Retinal
scans are used in high-end security applications like access control to
areas or rooms in military installations, power plants, and other high
risk security areas. This work applies the idea of the fuzzy vault to
the retinal biometric template. Multimodal biometric systems perform
well compared to single-modal biometric systems. The proposed
multimodal biometric fuzzy vault includes combined feature points from
the retina and fingerprint. The combined vault is hardened with a user
password to achieve a high level of security.
The security of the combined vault is measured using min-entropy.
The proposed password hardened multi biometric fuzzy vault is
robust towards stored biometric template attacks.
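The min-entropy of a fuzzy vault is commonly approximated from an attacker's chance of selecting enough genuine points from among the chaff; a minimal sketch of that standard style of bound, where the parameter names (r genuine points, c chaff points, k points needed to reconstruct the polynomial) are assumptions and the paper's exact expression may differ:

```python
from math import comb, log2

def vault_min_entropy(r, c, k):
    """Min-entropy (in bits) of a fuzzy vault: the attacker must
    pick k genuine points out of the r + c stored points, so the
    success probability of one guess is C(r, k) / C(r + c, k) and
    the min-entropy is its negative base-2 logarithm."""
    p_unlock = comb(r, k) / comb(r + c, k)
    return -log2(p_unlock)
```

As expected, adding chaff points or raising the polynomial degree increases the min-entropy, which is the security dial the combined, password-hardened vault turns.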
Abstract: A decentralized eco-sanitation system is a promising and sustainable alternative to the century-old centralized conventional sanitation system. The decentralized concept relies on an environmentally and economically sound management of water, nutrient and energy fluxes. Source-separation systems for urban waste management collect different solid waste and wastewater streams separately to facilitate the recovery of valuable resources (energy, nutrients) from wastewater. A resource recovery centre serving 20,000 people will act as the functional unit for the treatment of urban waste in a high-density population community such as Singapore. The decentralized system includes urine treatment, faeces and food waste co-digestion, and treatment of horticultural waste and the organic fraction of municipal solid waste in composting plants. A design model is developed to estimate the inputs and outputs in terms of materials and energy. The inputs of urine (yellow water, YW) and faeces (brown water, BW) are calculated from the daily mean production of urine and faeces by humans and the water consumption of a no-mix vacuum toilet (0.2 and 1 L flushing water for urine and faeces, respectively). The food waste (FW) production is estimated to be 150 g wet weight/person/day. The YW is collected and discharged by gravity into a tank. It was found that two days are required for urine hydrolysis and struvite precipitation. The maximum nitrogen (N) and phosphorus (P) recoveries are 150-266 kg/day and 20-70 kg/day, respectively. In contrast, BW and FW are mixed for co-digestion in a thermophilic acidification tank, and later a decentralized/centralized methanogenic reactor is used for biogas production. It is determined that 6.16-15.67 m3/h of methane is produced, equivalent to 0.07-0.19 kWh/capita/day. The digestion residues are treated with horticultural waste and the organic fraction of municipal waste in co-composting plants.
Abstract: The mosaicing technique has been employed in more and more application fields, from entertainment to scientific ones. In the latter case, the final evaluation is often still left to human beings, who visually assess the quality of the mosaic. Often, a lack of objective measurements in microscopic mosaicing may prevent the mosaic from being used as a starting image for further analysis. In this work we analyze three different metrics and indexes, in the domains of signal analysis, image analysis and visual quality, to measure the quality of different aspects of the mosaicing procedure, such as registration errors and visual quality. As the case study we consider the mosaicing algorithm we developed. The experiments have been carried out on mosaics with very different features: histological samples, which are made of detailed, high-contrast images, and live stem cells, which show very low contrast and low detail levels.
Abstract: Phase-Contrast MR imaging methods are widely used
for measurement of blood flow velocity components. Also there are
some other tools such as CT and Ultrasound for velocity map
detection in intravascular studies. These data are used in deriving
flow characteristics. Some clinical applications that use the pressure
distribution in the diagnosis of intravascular disorders, such as
vascular stenosis, are investigated. In this paper an approach to the
problem of measuring the intravascular pressure field from the velocity
field obtained from flow images is proposed. The method presented in
this paper uses an algorithm to solve the nonlinear Navier-Stokes
equations, assuming blood to be an incompressible, Newtonian fluid.
Flow images usually suffer from a lack of spatial resolution. Our
attempt is to consider the effect of spatial resolution on the pressure
distribution estimated from this method. In order to achieve this aim,
velocity map of a numerical phantom is derived at six different
spatial resolutions. To determine the effects of vascular stenoses on
pressure distribution, a stenotic phantom geometry is considered. A
comparison between the pressure distribution obtained from the
phantom and the pressure resulted from the algorithm is presented. In
this regard we also compared the effects of collocated and staggered
computational grids on the pressure distribution resulting from this
algorithm.
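Recovering pressure from a measured velocity field typically reduces to a pressure Poisson equation derived from the incompressible Navier-Stokes equations; a minimal collocated-grid sketch with Jacobi iteration, where the steady-flow form of the right-hand side and the zero boundary condition are simplifying assumptions, not the paper's exact scheme:

```python
import numpy as np

def poisson_rhs(u, v, rho, h):
    """Right-hand side -rho * div((u . grad) u) for steady
    incompressible 2D flow, via central differences (assumed form)."""
    ux = (u[1:-1, 2:] - u[1:-1, :-2]) / (2 * h)
    uy = (u[2:, 1:-1] - u[:-2, 1:-1]) / (2 * h)
    vx = (v[1:-1, 2:] - v[1:-1, :-2]) / (2 * h)
    vy = (v[2:, 1:-1] - v[:-2, 1:-1]) / (2 * h)
    rhs = np.zeros_like(u)
    rhs[1:-1, 1:-1] = -rho * (ux ** 2 + 2 * uy * vx + vy ** 2)
    return rhs

def pressure_poisson(rhs, h, iters=2000):
    """Solve lap(p) = rhs on a square collocated grid with p = 0 on
    the boundary, by Jacobi iteration on the 5-point stencil."""
    p = np.zeros_like(rhs)
    for _ in range(iters):
        p[1:-1, 1:-1] = 0.25 * (p[2:, 1:-1] + p[:-2, 1:-1] +
                                p[1:-1, 2:] + p[1:-1, :-2] -
                                h * h * rhs[1:-1, 1:-1])
    return p
```

Running this on velocity maps downsampled to different grid spacings h is one direct way to study how spatial resolution degrades the estimated pressure distribution, the question the abstract investigates.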
Abstract: This paper presents an algorithm for the recognition
and tracking of moving objects; a 1/10-scale model car is used to
verify the performance of the algorithm. The presented algorithm
merges the SURF algorithm with the Lucas-Kanade algorithm. The SURF
algorithm is robust to changes in contrast, size and rotation and can
recognize objects, but it is slow due to its computational complexity.
The Lucas-Kanade algorithm is fast but cannot recognize objects; its
optical flow compares the previous and current frames and so can
track the movement of a pixel. A Kalman filter is used to complement
the problems that occur when fusing the two algorithms: it estimates
the next location and compensates for the accumulated error. The
resolution of the camera (vision sensor) is fixed at 640x480. To
verify the performance of the fusion algorithm, it is compared to the
SURF algorithm in three situations: driving straight, driving on a
curve, and recognizing cars behind obstacles. Situations similar to
actual driving are possible using a model vehicle. The proposed
fusion algorithm showed superior performance and accuracy compared to
existing object recognition and tracking algorithms. We will improve
the performance of the algorithm so that it can be tested with images
of an actual road environment.
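The Kalman stage described above, which predicts the next location and then corrects it with the recognition/tracking measurement, can be sketched with a standard constant-velocity filter; the noise parameters below are illustrative assumptions:

```python
import numpy as np

class KalmanCV:
    """Constant-velocity Kalman filter for 2D object tracking, the
    role the abstract assigns to the Kalman stage (q and r are
    illustrative noise parameters)."""
    def __init__(self, q=1e-2, r=1.0):
        self.F = np.array([[1, 0, 1, 0],
                           [0, 1, 0, 1],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], float)   # state: x, y, vx, vy
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], float)   # only x, y measured
        self.Q = q * np.eye(4)
        self.R = r * np.eye(2)
        self.x = np.zeros(4)
        self.P = np.eye(4)

    def step(self, z):
        # predict the next position from the motion model
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # correct with the measurement z (e.g. a SURF / Lucas-Kanade
        # position), which counters accumulated drift
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

In a fusion loop, the predicted position also narrows the search window for the slow detector, which is one way such a combination recovers speed without losing recognition.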
Abstract: In this paper, a new technique for fast painting with
different colors is presented. The idea of painting relies on applying
masks with different colors to the background. Fast painting is
achieved by applying these masks in the frequency domain instead of
spatial (time) domain. New colors can be generated automatically as a
result from the cross correlation operation. This idea was applied
successfully for faster specific data (face, object, pattern, and code)
detection using neural algorithms. Here, instead of performing cross
correlation between the input data (e.g., an image, or a stream of
sequential data) and the weights of neural networks, the cross
correlation is performed between the colored masks and the
background. Furthermore, this approach is developed to reduce the
computation steps required by the painting operation. The principle of
divide and conquer strategy is applied through background
decomposition. Each background is divided into small sub-backgrounds,
and then each sub-background is processed separately by a single
faster painting algorithm. Moreover, the fastest painting is achieved
by using parallel processing techniques to paint the resulting
sub-backgrounds with the same number of faster painting algorithms.
In contrast to using the faster painting algorithm alone, the speed-up
ratio increases with the size of the background when the faster
painting algorithm is combined with background decomposition. Simulation
results show that painting in the frequency domain is faster than that in
the spatial domain.
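The frequency-domain speed-up rests on the convolution theorem: cross-correlating a mask with the background costs O(N log N) via the FFT rather than O(N^2) directly. A minimal sketch of that core operation (the colored-mask generation itself is not reproduced):

```python
import numpy as np

def cross_correlate_fft(background, mask):
    """Circular cross-correlation of a background with a mask,
    computed in the frequency domain:
    corr = IFFT( FFT(background) * conj(FFT(mask)) )."""
    B = np.fft.fft2(background)
    M = np.fft.fft2(mask, s=background.shape)  # zero-pad the mask
    return np.real(np.fft.ifft2(B * np.conj(M)))
```

The peak of the correlation surface marks where the mask best matches the background, which is also how the same machinery serves the face/object/pattern detection applications mentioned above.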
Abstract: In the last few years, three multivariate spectral
analysis techniques, namely Principal Component Analysis (PCA),
Independent Component Analysis (ICA) and Non-negative Matrix
Factorization (NMF) have emerged as effective tools for oscillation
detection and isolation. While the first method is used in determining
the number of oscillatory sources, the latter two methods
are used to identify source signatures by formulating the detection
problem as a source identification problem in the spectral domain.
In this paper, we present a critical drawback of the underlying linear
(mixing) model which strongly limits the ability of the associated
source separation methods to determine the number of sources
and/or identify the physical source signatures. It is shown that the
assumed mixing model is only valid if each unit of the process gives
equal weighting (all-pass filter) to all oscillatory components in its
inputs. This is in contrast to the fact that each unit, in general, acts
as a filter with non-uniform frequency response. Thus, the model
can only facilitate correct identification of a source with a single
frequency component, which is again unrealistic. To overcome
this deficiency, an iterative post-processing algorithm that correctly
identifies the physical source(s) is developed. An additional issue
with the existing methods is that they lack a procedure to pre-screen
non-oscillatory/noisy measurements which obscure the identification
of oscillatory sources. In this regard, a pre-screening procedure
is prescribed based on the notion of sparseness index to eliminate
the noisy and non-oscillatory measurements from the data set used
for analysis.
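A sparseness index of the kind prescribed can be computed on each measurement's power spectrum, for instance with the Hoyer measure, which is near 1 for a single-tone (oscillatory) spectrum and markedly lower for broadband noise; the exact index used in the paper may differ:

```python
import numpy as np

def sparseness_index(signal):
    """Hoyer-style sparseness of a signal's power spectrum, in [0, 1]:
    (sqrt(n) - l1/l2) / (sqrt(n) - 1), where l1 and l2 are the norms
    of the spectrum (a plausible form of the paper's index)."""
    p = np.abs(np.fft.rfft(signal - np.mean(signal))) ** 2
    n = p.size
    l1, l2 = np.sum(p), np.sqrt(np.sum(p ** 2))
    return (np.sqrt(n) - l1 / l2) / (np.sqrt(n) - 1)
```

Pre-screening then amounts to discarding measurements whose index falls below a chosen threshold before running the source separation step.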