Abstract: Segmentation and quantification of stenosis are important tasks in assessing coronary artery disease. One of the main challenges is measuring the real diameter of curved vessels. Moreover, uncertainty in the segmentation of different tissues in narrow vessels is an important issue that affects accuracy. This paper proposes an algorithm to extract coronary arteries and measure the degree of stenosis. A Markovian fuzzy clustering method is applied to model the uncertainty arising from the partial volume effect. The algorithm comprises four steps: segmentation, centreline extraction, estimation of the plane orthogonal to the centreline, and measurement of the degree of stenosis. To evaluate accuracy and reproducibility, the approach has been applied to a vascular phantom and the results compared with the real diameters. The results for 10 patient datasets have been visually judged by a qualified radiologist. The results reveal the superiority of the proposed method over the conventional thresholding method (CTM) on both datasets.
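The degree of stenosis in the final step can be quantified from the per-plane diameter measurements. The sketch below uses the common percent-diameter-stenosis convention, which is an assumption for illustration rather than necessarily the paper's exact definition:

```python
def degree_of_stenosis(diameters, reference=None):
    """Percent diameter stenosis from lumen diameters (mm) measured
    in planes orthogonal to the vessel centreline.

    `reference` is the healthy reference diameter; defaulting to the
    largest measured diameter is an illustrative assumption."""
    d_min = min(diameters)
    d_ref = reference if reference is not None else max(diameters)
    return 100.0 * (1.0 - d_min / d_ref)

# A vessel narrowing from 3.0 mm to 1.5 mm -> 50.0 (% stenosis)
print(degree_of_stenosis([3.0, 2.8, 1.5, 2.9, 3.0]))  # 50.0
```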
Abstract: An attempt was made for availability of wastewater reuse/reclamation for irrigation purposes using phytoremediation “the low cost and less technology", using six local aquatic macrophytes “e.g. T. angustifolia, B. maritimus, Ph. australis, A. donax, A. plantago-aquatica and M. longifolia (Linn)" as biological waste purifiers. Outdoor experiments/designs were conducted from May 03, 2007 till October 15, 2008, close to one of the main sewage channels of Sulaimani City/Iraq*. All processes were mainly based on conventional wastewater treatment processes, besides two further modifications were tested, the first was sand filtration pots, implanted by individual species of experimental macrophytes and the second was constructed wetlands implanted by experimental macrophytes all together. Untreated and treated wastewater samples were analyzed for their key physico-chemical properties (only heavy metals Fe, Mn, Zn and Cu with particular reference to removal efficiency by experimental macrophytes are highlighted in this paper). On the other hand, vertical contents of heavy metals were also evaluated from both pots and the cells of constructed wetland. After 135 days, macrophytes were harvested and heavy metals were analyzed in their biomass (roots/shoots) for removal efficiency assessment (i.e. uptake/ bioaccumulation rate). Results showed that; removal efficiency of all studied heavy metals was much higher in T. angustifolia followed by Ph. Australis, B. maritimus and A. donax in triple experiment sand pots. Constructed wetland experiments have revealed that; the more replicated constructed wetland cells the highest heavy metal removal efficiency was indicated.
Abstract: The optimal grid spacing and turbulence model for the 2D numerical analysis of a vertical-axis water turbine (VAWaterT) operating in a 2 m/s freestream current have been investigated. The results of five different spatial domain discretizations and two turbulence models (k-ω SST and k-ε RNG) have been compared in order to obtain the optimal y+ parameter distribution along the blade walls during a full rotor revolution. The resulting optimal mesh proved to be quite similar to that obtained for the numerical analysis of a vertical-axis wind turbine.
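A target y+ along the blade translates directly into a first-cell height for each candidate mesh. A hedged sketch using the standard flat-plate skin-friction correlation as a pre-meshing estimate (the correlation, the 0.1 m chord, and the water properties are illustrative assumptions, not values from the paper):

```python
def first_cell_height(u_inf, length, nu, y_plus_target=1.0, rho=1000.0):
    """Estimate the first-cell height (m) needed for a target y+ using
    the flat-plate correlation Cf = 0.026 / Re_x**(1/7), a common
    meshing rule of thumb (not taken from the paper)."""
    re_x = u_inf * length / nu                 # Reynolds number
    cf = 0.026 / re_x ** (1.0 / 7.0)           # skin-friction coefficient
    tau_w = 0.5 * cf * rho * u_inf ** 2        # wall shear stress (Pa)
    u_tau = (tau_w / rho) ** 0.5               # friction velocity (m/s)
    return y_plus_target * nu / u_tau

# 2 m/s water current over a hypothetical 0.1 m chord
print(first_cell_height(2.0, 0.1, 1e-6))
```

The result is on the order of ten micrometres, which is why near-wall resolution dominates the cell count in such meshes.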
Abstract: Transition prediction of boundary layers has always
been an important problem in fluid mechanics both theoretically and
practically, yet notwithstanding the great effort made by many
investigators, there is no satisfactory answer to this problem. The most popular method available is the so-called e-N method, which is heavily dependent on experiments and experience. The author has proposed improvements to the e-N method so as to reduce its dependence on experiments and experience to a certain extent. One of the key assumptions is that transition occurs whenever the velocity amplitude of the disturbance reaches 1-2% of the free-stream velocity.
However, the reliability of this assumption needs to be verified. In this
paper, transition prediction on a flat plate is investigated by using both
the improved e-N method and the parabolized stability equations (PSE)
methods. The results show that the transition locations predicted by
both methods agree reasonably well with each other, under the above
assumption. For the supersonic case, the critical velocity amplitude in the improved e-N method should be taken as 0.013, whereas in the subsonic case it should be 0.018; both values lie within the 1-2% range.
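Under the stated amplitude criterion, the transition N factor follows directly from the assumed initial disturbance amplitude, since the e-N method takes A = A0·e^N. A minimal sketch (the initial amplitude ratio of 1e-5 is a hypothetical value for illustration, not one from the paper):

```python
import math

def transition_N(initial_amp_ratio, critical_amp_ratio):
    """N factor at which the disturbance velocity amplitude reaches the
    critical fraction of the free-stream velocity, given A = A0 * e^N."""
    return math.log(critical_amp_ratio / initial_amp_ratio)

# Hypothetical initial amplitude A0/U_inf = 1e-5
print(transition_N(1e-5, 0.018))  # subsonic critical amplitude
print(transition_N(1e-5, 0.013))  # supersonic critical amplitude
```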
Abstract: Selective harmonic elimination pulse-width modulation (SHE-PWM) techniques offer tight control of the harmonic spectrum of a voltage waveform generated by a power electronic converter, along with a low number of switching transitions. Traditional optimization methods suffer from various drawbacks, such as prolonged and tedious computational steps and convergence to local optima; thus, the greater the number of harmonics to be eliminated, the larger the computational complexity and time. This paper presents a novel method for output voltage harmonic elimination and voltage control of PWM AC/AC voltage converters using a hybrid Real-Coded Genetic Algorithm-Pattern Search (RGA-PS) method. RGA is the primary optimizer, exploiting its global search capabilities; PS is then employed to fine-tune the best solution provided by RGA in each evolution. The proposed method enables linear control of the fundamental component of the output voltage and complete elimination of its harmonic content up to a specified order. Theoretical studies have been carried out to show the effectiveness and robustness of the proposed method of selective harmonic elimination. Theoretical results are validated through simulation studies using the PSIM software package.
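For a quarter-wave symmetric two-level waveform, the harmonic amplitudes that such an optimizer drives to zero have a standard closed form. Below is a sketch of the objective a hybrid RGA-PS run might minimize; the quarter-wave formulation and the eliminated orders (5th, 7th) are illustrative assumptions, and the paper's AC/AC converter waveform may differ:

```python
import math

def harmonic(n, angles):
    """n-th odd harmonic amplitude (per unit) of a quarter-wave symmetric
    two-level SHE-PWM waveform with switching angles
    0 < a1 < ... < aM < pi/2 (one common SHE formulation)."""
    s = sum((-1) ** k * math.cos(n * a) for k, a in enumerate(angles))
    return 4.0 / (n * math.pi) * s

def fitness(angles, m_target, eliminated=(5, 7)):
    """Objective a hybrid RGA-PS optimizer might minimize: fundamental
    tracking error plus residual amplitudes of the selected harmonics."""
    err = (harmonic(1, angles) - m_target) ** 2
    err += sum(harmonic(n, angles) ** 2 for n in eliminated)
    return err
```

RGA would search angle vectors globally against `fitness`, after which PS polishes the best candidate, per the abstract.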
Abstract: Nowadays, HPC, Grid and Cloud systems are evolving very rapidly. However, the development of infrastructure solutions related to HPC is lagging behind. While the existing infrastructure is sufficient for simple cases, many computational problems have more complex requirements. Such computational experiments use different resources simultaneously to start a large number of computational jobs. These resources are heterogeneous: they have different purposes, architectures, performance and installed software. Users need a convenient tool that allows them to describe and run complex computational experiments in an HPC environment. This paper introduces a modular workflow system called SEGL, which makes it possible to run complex computational experiments in a real HPC organization. The system can be used by a great number of organizations that provide HPC power. Key requirements for this system are high efficiency and interoperability with the organization's existing HPC infrastructure, without any changes to that infrastructure.
Abstract: In this study, the effect of greywater irrigation on the air-water interfacial area is investigated. Several soil column experiments were conducted for different greywater irrigations to develop pressure-saturation curves. Surface tension was measured for different greywater concentrations and fitted to the Gibbs adsorption equation. The pressure-saturation curves show that the reduction of capillary rise stops when the concentration reaches the critical micelle concentration (CMC). A simple theory is derived from the pressure-saturation curves for calculating the air-water interfacial area in a porous medium during greywater irrigation by introducing a 'hydraulic radius' term for the pores. This term eliminates any effect of pore shape on the air-water interfacial area. The air-water interfacial area was calculated from the pressure-saturation curves and was found to decrease with increasing moisture content. However, no significant effect of different greywater irrigations on the air-water interfacial area was observed; a maximum variation of 10% in interfacial area was seen in the residual saturation zone.
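The paper's own 'hydraulic radius' derivation is not reproduced here, but the general idea of recovering interfacial area from a pressure-saturation curve can be sketched with the common thermodynamic estimate a ≈ (φ/σ)∫Pc dS, used below as a stand-in assumption for the paper's formula:

```python
def interfacial_area(saturations, capillary_pressures, porosity, surface_tension):
    """Estimate specific air-water interfacial area (1/m) by trapezoidal
    integration of the capillary pressure-saturation curve:
        a_wn ~ (porosity / sigma) * integral Pc dS
    (a common thermodynamic estimate, standing in for the paper's
    hydraulic-radius form). saturations increasing, Pc in Pa, sigma in N/m."""
    area = 0.0
    for i in range(len(saturations) - 1):
        dS = saturations[i + 1] - saturations[i]
        area += 0.5 * (capillary_pressures[i] + capillary_pressures[i + 1]) * dS
    return porosity * area / surface_tension

# Hypothetical curve: constant 1 kPa from S = 0.2 to 1.0, phi = 0.4,
# sigma = 0.072 N/m (clean water; greywater lowers sigma toward the CMC)
print(interfacial_area([0.2, 1.0], [1000.0, 1000.0], 0.4, 0.072))
```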
Abstract: The absorption of laser light in the skin during laser irradiation is a critical factor in medical treatments. Delivering the correct amount of laser light is a critical element in photodynamic therapy (PDT): too much laser light can damage skin tissue, while too little cannot enhance the PDT procedure. Knowledge of the skin-tone-dependent distribution of 635 nm radiation and its penetration depth in skin is a very important precondition for investigating laser-induced effects of PDT in epidermal diseases such as psoriasis. The aim of this work was to estimate the optimum effect of a diode laser (635 nm) in the treatment of epidermal diseases in skin of different colors and, furthermore, to improve the safety of lasers used in PDT treatment of epidermal diseases. The Advanced System Analytical Program (ASAP), a new approach to investigating PDT that depends on the optical properties of different skin colors, was used in the present work. A two-layered Realistic Skin Model (RSM), comprising stratum corneum and epidermis, was irradiated with a red laser (635 nm, 10 mW) to study fluence and absorbance at different penetration depths for various human skin colors. Five skin tones (very fair, fair, light, medium and dark) were used in the radiative transfer study. This investigation applied the principles of laser-tissue interaction, with the skin optically injected by a red laser diode. The results demonstrated that the power characteristics of a 635 nm laser diode can affect the treatment of epidermal disease in skins of various colors. The power absorption of the various human skins was recorded and analyzed in order to determine the influence of melanin on PDT treatment of epidermal disease. The two-layered RSM shows that the change in penetration depth in the epidermal layer of colored skin has a large effect on the distribution of absorbed laser light in the skin; this is due to the variation of melanin concentration for each skin color.
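The melanin dependence of penetration depth can be illustrated with a first-order Beer-Lambert sketch. This is only a toy stand-in for the ASAP ray-tracing and two-layer model used in the paper, and the absorption coefficients per skin tone below are hypothetical:

```python
import math

def fluence_at_depth(depth_cm, mu_a, mu_s=0.0, incident=1.0):
    """Toy Beer-Lambert attenuation of 635 nm light in tissue:
    F(z) = F0 * exp(-(mu_a + mu_s) * z). A first-order sketch only;
    the paper uses ASAP radiative transfer on a two-layer skin model.
    mu_a: absorption coefficient (1/cm), larger for higher melanin."""
    return incident * math.exp(-(mu_a + mu_s) * depth_cm)

# Hypothetical epidermal absorption coefficients (1/cm) by skin tone
for tone, mu_a in [("very fair", 2.0), ("medium", 8.0), ("dark", 20.0)]:
    print(tone, fluence_at_depth(0.01, mu_a))
```

Even in this crude form, darker tones (higher mu_a) receive less fluence at a given depth, matching the trend the abstract attributes to melanin concentration.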
Abstract: To extract the important physiological factors related to
diabetes from an oral glucose tolerance test (OGTT) by mathematical
modeling, highly informative but convenient protocols are required.
Current models require a large number of samples and extended
period of testing, which is not practical for daily use. The purpose
of this study is to make model assessments possible even from a
reduced number of samples taken over a relatively short period.
For this purpose, test values were extrapolated using a support
vector machine. A good correlation was found between reference and extrapolated values in the 741 OGTTs evaluated. This result indicates that a reduction in the number of clinical tests is possible through a computational approach.
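The extrapolation step can be illustrated with a small kernel regressor. The sketch below uses kernel ridge regression in pure Python as a stand-in for the paper's support vector machine; the time points, glucose values, and kernel width are all hypothetical:

```python
import math

def rbf(x, y, gamma=0.5):
    """Gaussian (RBF) kernel on scalar time points."""
    return math.exp(-gamma * (x - y) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_predict(times, values, query, lam=1e-6):
    """Kernel ridge regression used as a simple stand-in for the SVM:
    estimate the glucose value at an unmeasured OGTT time point from
    a reduced set of samples."""
    n = len(times)
    K = [[rbf(times[i], times[j]) + (lam if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, values)
    return sum(a * rbf(t, query) for a, t in zip(alpha, times))

# Reduced OGTT samples (hours, mg/dL) -> estimate the unmeasured 2 h value
print(fit_predict([0.0, 0.5, 1.0], [90.0, 150.0, 130.0], 2.0))
```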
Abstract: Array signal processing involves signal enumeration and source localization. It is centered on the ability to fuse temporal and spatial information, captured by sampling signals emitted from a number of sources at the sensors of an array, in order to carry out a specific estimation task: estimating source characteristics (mainly the localization of the sources) and/or array characteristics (mainly the array geometry). Beamforming is a general signal processing technique used to control the directionality of the reception or transmission of a signal; using beamforming, the majority of the signal energy received by a group of array sensors can be steered toward a chosen direction. Multiple signal classification (MUSIC) is a highly popular eigenstructure-based method for high-resolution estimation of the direction of arrival (DOA). This paper examines the effect of missing sensors on DOA estimation. The accuracy of MUSIC-based DOA estimation is degraded significantly both by missing sensors among the receiving array elements and by unequal channel gains and phase errors in the receiver.
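The beamforming idea can be sketched for a uniform linear array. The code below is a conventional delay-and-sum angle scan, not the full MUSIC algorithm (which additionally needs an eigendecomposition of the sample covariance matrix); the array size, spacing, and source angle are chosen for illustration:

```python
import cmath
import math

def steering(n_sensors, theta_deg, d_over_lambda=0.5, active=None):
    """Steering vector of a uniform linear array. `active` flags working
    sensors; False entries model the missing sensors discussed above."""
    th = math.radians(theta_deg)
    a = [cmath.exp(2j * math.pi * d_over_lambda * m * math.sin(th))
         for m in range(n_sensors)]
    if active is not None:
        a = [x if ok else 0.0 for x, ok in zip(a, active)]
    return a

def beam_power(snapshot, theta_deg, d_over_lambda=0.5):
    """Conventional delay-and-sum beamformer output power at a scan angle."""
    w = steering(len(snapshot), theta_deg, d_over_lambda)
    y = sum(x * wi.conjugate() for x, wi in zip(snapshot, w))
    return abs(y) ** 2

# One noiseless snapshot from a source at 20 degrees, 8-element array
src = steering(8, 20.0)
spectrum = {th: beam_power(src, th) for th in range(-90, 91, 5)}
print(max(spectrum, key=spectrum.get))  # 20
```

Regenerating `src` with some `active` entries set to False lowers and broadens the peak, giving a simple picture of the missing-sensor degradation the paper quantifies for MUSIC.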
Abstract: Distance visualization of large datasets often takes the form of remote viewing and zooming of stored static images. However, the continuous increase in the size of datasets and in visualization workload leads to insufficient performance on traditional desktop computers. Additionally, visualization techniques such as isosurface extraction depend on the available resources of the running machine and on the size of the datasets. Moreover, the continuous demand for powerful computing and the continuous growth of datasets result in an urgent need for a grid computing infrastructure. However, some issues arise in current grids, such as resource availability at the client machines, which is insufficient for processing large datasets. On top of that, different output devices and different network bandwidths between the visualization pipeline components often produce output suitable for one machine but not for another. In this paper we investigate how grid services can be used to support remote visualization of large datasets and to break the constraint of physical co-location of the resources by applying grid computing technologies. We present our grid-enabled architecture for remote interactive visualization of large medical datasets (circa 5 million polygons) on clients with modest resources.
Abstract: Many-core GPUs provide high computing ability and
substantial bandwidth; however, optimizing irregular applications
like SpMV on GPUs becomes a difficult but meaningful task. In this
paper, we propose a novel method to improve the performance of
SpMV on GPUs. A new storage format called HYB-R is proposed to exploit the GPU architecture more efficiently. In creating the HYB-R format, the COO portion of the matrix is partitioned recursively into an ELL portion and a COO portion, to ensure that as many non-zeros as possible are stored in ELL format. How to partition the matrix is an important problem for the HYB-R kernel, so we also tune the partitioning parameters for higher performance. Experimental results show that our method outperforms the fastest kernel (HYB) in NVIDIA's SpMV library, with speedups as high as 17%.
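One level of the ELL/COO split at the heart of the HYB-R format can be sketched as follows. The dict-of-rows layout and the width parameter k are illustrative; the paper applies the split recursively to the COO remainder with tuned parameters, which is not reproduced here:

```python
def split_ell_coo(rows, k):
    """One level of a HYB-style split. `rows` maps row index to a list
    of (col, val) non-zeros. The first k entries of each row go to the
    ELL part (zero-padded to a fixed width, which is what makes ELL
    GPU-friendly); the overflow stays in COO triples."""
    ell, coo = {}, []
    for r, entries in rows.items():
        head = entries[:k]
        ell[r] = head + [(0, 0.0)] * (k - len(head))  # pad to width k
        coo.extend((r, c, v) for c, v in entries[k:])  # overflow
    return ell, coo

rows = {0: [(0, 1.0)], 1: [(0, 2.0), (1, 3.0), (2, 4.0)], 2: [(2, 5.0)]}
ell, coo = split_ell_coo(rows, 2)
print(coo)  # [(1, 2, 4.0)]
```

Re-grouping `coo` by row and calling `split_ell_coo` again with a smaller k gives the recursive refinement the abstract describes.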
Abstract: The main problems of data-centric and open-source projects are the large number of developers and changes to the core framework. The Model-View-Controller (MVC) design pattern has significantly improved the development and adjustment of complex projects. Entity Framework, as the Model layer in an MVC architecture, has simplified communication with the database. How often are these new technologies used, and do they have the potential for designing a more efficient Enterprise Resource Planning (ERP) system that is better suited to accountants?
Abstract: The goal of data mining algorithms is to discover
useful information embedded in large databases. One of the most
important data mining problems is discovery of frequently occurring
patterns in sequential data. In a multidimensional sequence each
event depends on more than one dimension. The search space is quite
large and the serial algorithms are not scalable for very large
datasets. To address this, it is necessary to study scalable parallel
implementations of sequence mining algorithms.
In this paper, we present a model for multidimensional sequences and describe a parallel algorithm based on data parallelism. Simulation experiments show good load balancing and scalable, acceptable speedup across different processor counts and problem sizes, and demonstrate that our approach can work efficiently in a real parallel computing environment.
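The data-parallel decomposition can be sketched in a few lines: each processor counts pattern support on its own partition of the sequences, and the partial counts are merged. The candidate set here (ordered item pairs) and the serial map standing in for real processes are simplifications of the paper's multidimensional setting:

```python
from collections import Counter

def local_counts(partition):
    """Support counts of ordered item pairs within one data partition
    (a toy candidate set; the paper's multidimensional events and
    candidate generation are more involved)."""
    counts = Counter()
    for seq in partition:
        seen = set()
        for i in range(len(seq)):
            for j in range(i + 1, len(seq)):
                seen.add((seq[i], seq[j]))
        counts.update(seen)  # each sequence contributes at most 1 per pattern
    return counts

def parallel_support(partitions):
    """Data-parallel scheme: workers count their partitions independently,
    then partial counts are merged. Shown as a serial map here; in
    practice each call would run on a different processor."""
    total = Counter()
    for part in partitions:
        total += local_counts(part)
    return total

data = [["a", "b", "c"], ["a", "c"], ["b", "c"]]
print(parallel_support([data[:2], data[2:]])[("a", "c")])  # 2
```

Because each sequence contributes to exactly one partition, the merged counts are identical for any partitioning, which is what makes the load balancing a free parameter.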
Abstract: Lectins hold considerable promise in current clinical microbiology research. In the present study, the antimicrobial activities of a D-galactose-binding lectin (PnL), purified from the annelid Perinereis nuntia (Polychaeta) by affinity chromatography, were evaluated. The molecular mass of the lectin was determined by SDS-PAGE to be 32 kDa as a single polypeptide under both reducing and non-reducing conditions. The hemagglutinating activity that PnL showed against trypsinized and glutaraldehyde-fixed human erythrocytes was specifically inhibited by D-Gal, GalNAc, Galβ1-4Glc and Galα1-6Glc. PnL was evaluated for in vitro
antibacterial screening studies against 11 gram-positive and
gram-negative microorganisms. From the screening results, it was
revealed that PnL exhibited significant antibacterial activity against
gram-positive bacteria. Bacillus megaterium showed the highest
growth inhibition by the lectin (250 μg/disc). However, PnL did not
inhibit the growth of gram-negative bacteria such as Vibrio cholerae
and Pseudomonas sp. PnL was also examined for in vitro antifungal
activity against six fungal phytopathogens. PnL (100 μg/mL) inhibited
the mycelial growth of Alternaria alternata (24.4%). These results indicate that lectins obtained from annelids may find future applications of importance to the life sciences.
Abstract: In this paper, a class of analog algorithms based on the concept of the Cellular Neural Network (CNN) is applied to processing operations on an important class of medical images, namely retina images, for detecting various symptoms of diabetic retinopathy. Specific processing tasks such as morphological operations, linear filtering and thresholding are proposed, the corresponding template values are given, and simulations on real retina images are provided.
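A single linear-filtering step of the kind such templates implement can be sketched as below. The 3x3 template shown is a generic edge-enhancing example, not one of the paper's retina-specific template values, and the full CNN dynamics (feedback template, output nonlinearity) are omitted:

```python
def apply_template(image, template, bias=0.0):
    """One feed-forward pass of a 3x3 CNN-style control template
    (a linear filter plus bias), with zero padding at the borders."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = bias
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += template[dy + 1][dx + 1] * image[yy][xx]
            out[y][x] = acc
    return out

# Generic edge-enhancing template on a tiny test image
edge = [[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]]
img = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
print(apply_template(img, edge)[1][1])  # 8.0
```

Thresholding the output of such a pass is the kind of pipeline step the abstract lists for highlighting retinopathy symptoms.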
Abstract: Obesity is a frequent attendant phenomenon in patients with endocrinological disease, and there is a close correlation between BMI and endocrinological diseases. In this study we focused on the concentrations of the hormones PTH and TSH, of cholesterol (CHOL), and of the mineral element Ca in blood serum. The examined group consisted of 100 respondents (women) aged 36-83 years, who were divided into two groups: a control group (CG) and a group with diagnosed endocrine disease (DED). The concentrations of PTH, TSH, Ca and CHOL were measured using Cobas e411 (Japan) and Cobas Integra 400 (Switzerland) analyzers. Body weight and stature were measured for each individual, and BMI was calculated from these data. On the basis of Student's t-test, we found a statistically significant difference in the biochemical parameters PTH and Ca (p
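The BMI computation from the measured weight and stature is straightforward; the values below are illustrative, not from the study group:

```python
def bmi(weight_kg, stature_m):
    """Body mass index from measured body weight and stature."""
    return weight_kg / stature_m ** 2

print(round(bmi(70.0, 1.65), 1))  # 25.7
```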
Abstract: Air bending is one of the important metal forming processes because of its simplicity and its large field of application. The accuracy of analytical and empirical models reported for the analysis of bending processes is limited by simplifying assumptions, and these models do not consider the effect of dynamic parameters. A number of studies have reported finite element analysis (FEA) of V-bending, U-bending, and air V-bending processes. FEA of bending is found to be very sensitive to many physical and numerical parameters, and FE models must be computationally efficient for practical use. The reported work presents 3D FEA of the air bending process using Hyperform LS-DYNA and its comparison with published 3D FEA results of air bending in Ansys LS-DYNA and with experimental results. Observing the planar symmetry and assuming plane strain conditions, the air bending problem was modeled in 2D with a symmetric boundary condition across the width. The stress-strain results of the 2D FEA were compared with the 3D FEA results and with experiments. Simplifying the air bending problem from 3D to 2D resulted in a tremendous reduction in solution time with only a marginal effect on the stress-strain results. FE model simplification by studying the problem symmetry is a more efficient and practical approach for the solution of more complex, large-dimension, slow forming processes.
Abstract: Background noise is particularly damaging to speech intelligibility for people with hearing loss, especially for patients with sensorineural loss. Several investigations of speech intelligibility have demonstrated that sensorineural loss patients need a 5-15 dB higher SNR than normal-hearing subjects. This paper describes a Discrete Cosine Transform Power-Normalized Least Mean Square (DCT-LMS) algorithm to improve the SNR and the convergence rate of the LMS for sensorineural loss patients. Since it requires only real arithmetic, it achieves a faster convergence rate than the time-domain LMS, and the transformation improves the eigenvalue distribution of the input autocorrelation matrix of the LMS filter. The DCT has good orthonormal, separable, and energy compaction properties. Although the DCT does not separate frequencies, it is a powerful signal decorrelator; it is a real-valued function and thus can be used effectively in real-time operation. The advantages of DCT-LMS over the standard LMS algorithm are shown via SNR and eigenvalue ratio computations. Exploiting the symmetry of the basis functions, the DCT transform matrix [AN] can be factored into a series of ±1 butterflies and rotation angles. This factorization results in one of the fastest DCT implementations. There are different ways to obtain such factorizations; this work uses the fast factored DCT algorithm developed by Chen and co-workers. The computer simulation results show the superior convergence characteristics of the proposed algorithm: the SNR is improved by at least 10 dB for input SNRs at or below 0 dB, with faster convergence speed and better time and frequency characteristics.
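The core update can be sketched in pure Python: transform the tap vector with a real-arithmetic DCT, track the per-coefficient power, and normalize each LMS correction by it. The step size, forgetting factor, and tiny identification example are illustrative choices, and the fast factored DCT of Chen and co-workers is replaced by a direct O(N^2) transform:

```python
import math

def dct(x):
    """Orthonormal DCT-II of a real vector (real arithmetic only)."""
    n = len(x)
    out = []
    for k in range(n):
        s = sum(v * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for i, v in enumerate(x))
        out.append(s * math.sqrt((1.0 if k == 0 else 2.0) / n))
    return out

def dct_lms_step(w, power, x_tap, d, mu=0.1, beta=0.9, eps=1e-8):
    """One DCT power-normalized LMS update: transform the tap vector,
    update each coefficient's running power estimate, and normalize the
    LMS correction by it. Returns the a-priori error."""
    u = dct(x_tap)
    y = sum(wi * ui for wi, ui in zip(w, u))   # filter output
    e = d - y                                  # a-priori error
    for i, ui in enumerate(u):
        power[i] = beta * power[i] + (1.0 - beta) * ui * ui
        w[i] += mu * e * ui / (power[i] + eps)
    return e

# Tiny sanity run: adapt toward a constant desired signal for a fixed tap
# vector (powers initialized to 1.0 to keep the first steps stable)
w, power = [0.0] * 4, [1.0] * 4
errs = [dct_lms_step(w, power, [1.0, 0.5, -0.3, 0.2], 1.0) for _ in range(100)]
```

In a real aid, `x_tap` would be the sliding window of noisy input and `d` the desired (clean) reference, as in any LMS noise canceller.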
Abstract: In an era of great competition, understanding and satisfying customers' requirements are critical tasks for a company seeking to make a profit. Customer relationship management (CRM) has thus become an important business issue. With the help of data mining techniques, a manager can explore and analyze large quantities of data to discover meaningful patterns and rules. Among these methods, the well-known association rule is the most commonly used. This paper builds on the Apriori algorithm and uses genetic algorithms combined with a data mining method to discover fuzzy classification rules. The mined results can be applied in CRM to help decision makers make correct business decisions for marketing strategies.
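The Apriori layer of the approach can be sketched as a plain frequent-itemset pass. The crisp transactions and toy basket below are illustrative, and the genetic-algorithm search for fuzzy rules that the paper builds on top is not reproduced:

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Minimal Apriori: grow candidate itemsets level by level, keeping
    only those whose support (number of containing transactions) meets
    the threshold. Candidate generation is unpruned for brevity."""
    transactions = [frozenset(t) for t in transactions]
    items = {i for t in transactions for i in t}
    candidates = [frozenset([i]) for i in items]
    frequent, k = {}, 1
    while candidates:
        level = {}
        for c in candidates:
            n = sum(1 for t in transactions if c <= t)
            if n >= min_support:
                level[c] = n
        frequent.update(level)
        k += 1
        prev = list(level)
        candidates = list({a | b for a, b in combinations(prev, 2)
                           if len(a | b) == k})
    return frequent

tx = [{"milk", "bread"}, {"milk", "eggs"}, {"milk", "bread", "eggs"}, {"bread"}]
freq = apriori(tx, 2)
print(freq[frozenset({"milk", "bread"})])  # 2
```

In the paper's setting, the frequent patterns found this way seed a genetic algorithm that evolves fuzzy membership functions and classification rules for CRM decisions.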