Abstract: Liver segmentation is the first significant step in liver diagnosis from computed tomography (CT) images. It separates the liver structure from the other abdominal organs. Sophisticated filtering techniques are indispensable for proper segmentation. In this paper, we employ 3D anisotropic diffusion as a preprocessing step. While removing image noise, this technique preserves the significant parts of the image, typically edges, lines, or other details that are important for interpreting the image. The segmentation task is performed by thresholding with automatic threshold-value selection, and finally the false liver regions are eliminated using 3D connected-component analysis. The results show that by employing 3D anisotropic filtering, better liver segmentation can be achieved even though a simple segmentation technique is used.
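As a rough illustration of the pipeline this abstract describes, the following sketch applies Perona-Malik-style anisotropic diffusion, thresholding, and connected-component filtering to a synthetic 2-D slice. The paper works in 3D; the threshold value, iteration count, and conductance parameter below are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy import ndimage

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, gamma=0.15):
    """Perona-Malik diffusion: smooths noise, preserves strong edges."""
    img = img.astype(float)
    for _ in range(n_iter):
        # finite differences to the four in-plane neighbours
        diffs = [np.roll(img, sh, ax) - img
                 for ax in (0, 1) for sh in (1, -1)]
        # edge-stopping conductance g = exp(-(|grad| / kappa)^2)
        img = img + gamma * sum(np.exp(-(d / kappa) ** 2) * d for d in diffs)
    return img

# toy 2-D "slice": a bright organ-like square on a noisy background
rng = np.random.default_rng(0)
slice_ = rng.normal(20.0, 5.0, (64, 64))
slice_[16:48, 16:48] += 100.0
smoothed = anisotropic_diffusion(slice_)
mask = smoothed > 70.0                 # stand-in for automatic thresholding
labels, n = ndimage.label(mask)        # connected-component analysis
sizes = np.bincount(labels.ravel())[1:]
largest = labels == (np.argmax(sizes) + 1)   # keep the biggest region
```

The edge-stopping function is what lets the filter smooth the noisy background while leaving the organ boundary sharp enough for a simple threshold to work.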
Abstract: A great deal of research in the field of information systems security has been based on a positivist paradigm. Applying the reductionist concept of the positivist paradigm to information security means missing the bigger picture; this lack of holism could be one of the reasons why security is still overlooked, comes as an afterthought, or is perceived from a purely technical dimension. We need to reshape our thinking and attitudes towards security, especially in a complex and dynamic environment such as e-Business, to develop a holistic understanding of e-Business security in relation to its context, as well as considering all the stakeholders in the problem area. In this paper we argue the suitability of, and need for, a more inductive, interpretive approach and qualitative research methods to investigate e-Business security. Our discussion is based on a holistic framework of enquiry, the nature of the research problem, the underlying theoretical lens, and the complexity of the e-Business environment. Finally, we present a research strategy for developing a holistic framework for understanding e-Business security problems in the context of developing countries, based on an interdisciplinary inquiry that considers their needs and requirements.
Abstract: This study aims to develop a framework of social network management to enhance customer relationships. In this research, social network management is derived from social network site management and from individual and organizational social network usage motivation. A survey was conducted with employees of organizations who have used social networks to interact with customers. The results reveal that content, links, privacy and security, page design, and interactivity are the major issues of social network site management. Content, links, privacy and security, and individual and organizational motivation have major impacts on encouraging business knowledge sharing among employees. Moreover, page design and interactivity, content, organizational motivation, and knowledge sharing can improve customer relationships.
Abstract: Accurate timing alignment and stability are important to maximize the true counts and minimize the random counts in positron emission tomography. The signals output from the detectors must therefore be centered for the two isotopes before operation and fed into four pulse-processing units, each of which can accept up to eight inputs. The dual-source computed tomography system consists of two units on the left for the 15 detector signals of the Cs-137 isotope and two units on the right for the 15 detector signals of the Co-60 isotope. The gamma spectrum consists of either a single photopeak or multiple photopeaks. This allows the energy-discrimination electronic hardware associated with the data acquisition system to acquire photon-count data at a specific energy, even if detectors with poor energy resolution are used. It also helps to avoid counting Compton-scatter events, especially if a single discrete gamma photopeak is emitted by the source, as in the case of Cs-137. In this study, the polyenergetic version of the alternating minimization algorithm is applied to the dual-energy gamma computed tomography problem.
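The energy-discrimination idea can be sketched in software as a simple counting window around a photopeak. The window width and the toy spectrum below are assumptions for illustration, not the hardware settings of the system described.

```python
import numpy as np

def window_counts(energies_kev, center_kev, width_kev):
    """Count only photons whose deposited energy falls inside the
    window around a photopeak, rejecting lower-energy Compton events."""
    lo = center_kev - width_kev / 2.0
    hi = center_kev + width_kev / 2.0
    return int(np.sum((energies_kev >= lo) & (energies_kev <= hi)))

rng = np.random.default_rng(1)
# toy Cs-137 spectrum: photopeak at 662 keV (broadened by a
# poor-resolution detector) plus a Compton continuum below the edge
peak = rng.normal(662.0, 25.0, 5000)
compton = rng.uniform(50.0, 478.0, 8000)
spectrum = np.concatenate([peak, compton])
counts_662 = window_counts(spectrum, 662.0, 120.0)
```

Because Cs-137's Compton continuum lies entirely below the 662 keV window, almost all photopeak events are kept while the scatter events are rejected, which is the point the abstract makes about single-photopeak sources.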
Abstract: Reinforced concrete has good durability and excellent structural performance. However, there are cases of early deterioration due to a number of factors, one prominent factor being corrosion of the steel reinforcement. The corrosion process sets in due to the ingress of moisture, oxygen, and other ingredients into the body of concrete that is unsound, permeable, and absorbent. Cracks due to structural and other causes, such as creep, shrinkage, etc., also allow the ingress of moisture and other harmful ingredients and thus accelerate the rate of corrosion. There are several interactive factors, both external and internal, which lead to corrosion of the reinforcement and ultimately failure of structures. Suitable addition of a mineral admixture such as silica fume (SF) improves the strength and durability of concrete due to considerable improvement in the microstructure of concrete composites, especially at the transition zone. Secondary reinforcement in the form of fibre is added to the concrete, providing three-dimensional random reinforcement in the entire mass of concrete. Reinforced concrete beams of size 0.1 m × 0.15 m and length 1 m were cast using M 35 grade concrete. After curing, the beams were subjected to a corrosion process by impressing an external direct current (galvanostatic method) for a period of 15 days under stressed and unstressed conditions. The corroded beams were tested by applying two-point loads to determine the ultimate load-carrying capacity and cracking pattern, and the results were compared with those of companion specimens. The gravimetric method was used to quantify the corrosion that had occurred.
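The gravimetric quantification mentioned at the end follows the standard mass-loss relation (as in ASTM G1): the corrosion rate in mm/year is K·W/(A·T·D), with W the mass loss (g), A the exposed area (cm²), T the exposure time (h), D the metal density (g/cm³), and K = 8.76 × 10⁴. A minimal sketch with hypothetical specimen values (not the paper's measurements):

```python
def corrosion_rate_mm_per_year(mass_loss_g, area_cm2, hours,
                               density_g_cm3):
    """ASTM G1 mass-loss relation; K = 8.76e4 yields mm/year when
    mass is in g, area in cm^2, time in h and density in g/cm^3."""
    K = 8.76e4
    return K * mass_loss_g / (area_cm2 * hours * density_g_cm3)

# hypothetical rebar specimen after the 15-day impressed-current test
rate = corrosion_rate_mm_per_year(mass_loss_g=2.5, area_cm2=150.0,
                                  hours=15 * 24, density_g_cm3=7.85)
```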
Abstract: A mathematical model for the hydrodynamics of a surface water treatment pilot plant was developed and validated by determining the residence time distribution (RTD) for the main units of the plant. The well-known models of ideal/real mixing, ideal displacement (plug flow), and (one-dimensional axial) dispersion were combined in order to identify the structure that best fits the experimental data for each unit of the pilot plant. The RTD experimental results show that the pilot plant hydrodynamics can be approximated quite well by a combination of simple mathematical models, a structure that is suitable for engineering applications. The validated hydrodynamic models will be further used in the evaluation and selection of the most suitable coagulation-flocculation reagents and the optimum operating conditions (injection point, reaction times, etc.) in order to improve the quality of the drinking water.
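One of the simple building blocks mentioned above, the tanks-in-series approximation of real mixing, can be sketched as follows; the values of τ and n are illustrative, not fitted parameters from the pilot plant.

```python
import numpy as np
from math import factorial

def tanks_in_series_rtd(t, tau, n):
    """E(t) for n ideal stirred tanks in series with overall mean
    residence time tau; a standard building block for real mixers."""
    ti = tau / n
    return t ** (n - 1) / (factorial(n - 1) * ti ** n) * np.exp(-t / ti)

t = np.linspace(0.0, 50.0, 2001)
dt = t[1] - t[0]
E = tanks_in_series_rtd(t, tau=10.0, n=4)
area = np.sum(E) * dt          # a proper RTD integrates to 1
t_mean = np.sum(t * E) * dt    # and its first moment recovers tau
```

Fitting n (and, for the dispersion model, the Peclet number) to the measured tracer curves is exactly the model-structure identification the abstract describes.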
Abstract: In this work we present, to the best of our knowledge for the first time, an efficient digital watermarking scheme for MPEG Audio Layer 3 (MP3) files that operates directly in the compressed data domain, while manipulating the time and subband/channel domains. In addition, it does not need the original signal to detect the watermark. Our scheme was implemented taking special care for the efficient usage of the two limited resources of computer systems: time and space. It offers the industrial user watermark embedding and detection in a time comparable to the real playing time of the original audio file, depending on the MPEG compression, while the end user/audience does not experience any artifacts or delays when hearing the watermarked audio file. Furthermore, it overcomes the vulnerability to compression/recompression attacks of algorithms operating in the PCM data domain, as it places the watermark in the scale-factor domain and not in the digitized sound data. The strength of our scheme, which allows it to be used successfully for both authentication and copyright protection, relies on the fact that users can establish ownership of the audio file not simply by detecting the bit pattern that comprises the watermark itself, but by showing that the legal owner knows a hard-to-compute property of the watermark.
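To make the compressed-domain idea concrete, here is a deliberately simplified toy: watermark bits are written into the least-significant bits of an integer array standing in for scale factors, and detection is blind (no original signal needed). Real MP3 scale-factor embedding must also respect the bitstream syntax and perceptual constraints, which this sketch ignores entirely.

```python
import numpy as np

def embed(scalefactors, bits):
    """Toy embedding: write watermark bits into the least-significant
    bits of the integer scale factors (a real scheme must preserve
    the MP3 bitstream syntax and audibility constraints)."""
    sf = scalefactors.copy()
    sf[: len(bits)] = (sf[: len(bits)] & ~1) | np.asarray(bits)
    return sf

def detect(scalefactors, n_bits):
    """Blind detection: no original file needed, read the LSBs back."""
    return [int(b) for b in scalefactors[:n_bits] & 1]

rng = np.random.default_rng(2)
sf = rng.integers(0, 16, 64)     # stand-in scale-factor band values
wm = [1, 0, 1, 1, 0, 0, 1, 0]
marked = embed(sf, wm)
recovered = detect(marked, len(wm))
```

Because the mark lives in parameters that survive the codec (unlike PCM samples), a decode/re-encode cycle does not automatically destroy it, which is the robustness argument the abstract makes.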
Abstract: Recent medical studies have investigated the importance of enteral feeding and the use of feeding pumps for recovering patients who are unable to feed themselves or gain nourishment and nutrients by natural means. Most enteral feeding systems use a peristaltic tube pump. A peristaltic pump is a form of positive displacement pump in which a flexible tube is progressively squeezed externally so that the resulting enclosed pillow of fluid progresses along it. The squeezing of the tube requires a precise and robust controller of the geared motor to overcome the parametric uncertainty of the pumping system, which arises from wide variations in friction and slip between tube and roller. This paper therefore proposes a fuzzy adaptive controller for the robust control of the peristaltic tube pump. The new adaptive controller uses a fuzzy multi-layered architecture comprising several independent fuzzy controllers in parallel, each with a different robust-stability area. Among these independent fuzzy controllers, the most suitable one is selected by a system identifier that observes variations in the controlled system's parameters. The paper proposes a design procedure that can be carried out mathematically and systematically from the model of the controlled system. Finally, the good control performance of the developed feeding pump, accurate dose rate and robust system stability, is confirmed through experimental and clinical testing.
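The multi-layered architecture described above, several controllers in parallel with an identifier selecting the best-suited one, can be sketched as follows. The parameter ranges and gains are invented for illustration (and a plain PI law stands in for the fuzzy rule base); they are not the paper's tuned values.

```python
# Bank of controllers, each robust over a different range of the
# uncertain friction parameter (ranges and gains are illustrative).
controllers = [
    {"range": (0.0, 0.4), "kp": 2.0, "ki": 0.5},
    {"range": (0.4, 0.8), "kp": 3.5, "ki": 0.9},
    {"range": (0.8, 1.2), "kp": 5.0, "ki": 1.4},
]

def select_controller(friction_estimate):
    """The system identifier's role: pick the controller whose
    robust-stability area covers the observed plant parameter."""
    for c in controllers:
        lo, hi = c["range"]
        if lo <= friction_estimate < hi:
            return c
    return controllers[-1]       # saturate at the widest setting

def pi_step(c, error, integral, dt=0.01):
    """One control step with the selected gains (PI stand-in)."""
    integral += error * dt
    return c["kp"] * error + c["ki"] * integral, integral

chosen = select_controller(0.65)          # identifier's estimate
u, i_term = pi_step(chosen, error=0.2, integral=0.0)
```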
Abstract: By taking advantage of both k-NN, which is highly accurate, and K-means clustering, which is able to reduce the classification time, we introduce Cluster-k-Nearest-Neighbor, a "variable-k" NN that deals with the centroids (mean points) of the subclasses generated by the clustering algorithm. In general, the K-means clustering algorithm is not stable in terms of accuracy; for that reason we develop another algorithm for clustering our space, which gives higher accuracy than K-means clustering, a smaller number of subclasses, stability, and a classification time that is bounded with respect to the data size. We obtain between 96% and 99.7% accuracy in the classification of six different types of time series using the K-means clustering algorithm, and 99.7% using the new clustering algorithm.
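A minimal sketch of the Cluster-k-NN idea, clustering each class and then classifying against the centroids only, using plain K-means (not the paper's improved clustering algorithm) and invented 2-D data:

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Plain Lloyd's K-means (stand-in for the paper's clustering)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def cluster_knn_fit(X, y, k_per_class=2):
    """Replace each class by its cluster centroids, so classification
    compares against far fewer points than plain k-NN."""
    cents, labels = [], []
    for c in np.unique(y):
        cs = kmeans(X[y == c], k_per_class, seed=int(c))
        cents.append(cs)
        labels += [int(c)] * len(cs)
    return np.vstack(cents), np.array(labels)

def cluster_knn_predict(x, centroids, labels):
    """Nearest centroid wins (the 'variable k' reduces to this here)."""
    return labels[np.argmin(((centroids - x) ** 2).sum(-1))]

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
C, L = cluster_knn_fit(X, y)
pred = cluster_knn_predict(np.array([2.9, 3.1]), C, L)
```

With 100 training points reduced to 4 centroids, each prediction touches 25× fewer points, which is the bounded-classification-time benefit the abstract claims.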
Abstract: Ultrasound is useful for demonstrating the bone mineral density of regenerating osseous tissue as well as its structural alterations. A proposed ultrasound method, combining ultrasonography and acoustic parameter measurement, was evaluated for its efficacy in monitoring bone callus changes in a rabbit tibial distraction osteogenesis (DO) model.
The findings demonstrated that the ultrasonographic images depicted characteristic changes of the bone callus during the distraction phase, consistent with the histological findings. Follow-up measurements of the acoustic parameters of the bone callus, including speed of sound, reflection, and attenuation, showed significant linear changes over time during the distraction phase. The acoustic parameters obtained during the distraction phase also showed moderate to strong correlations with the consolidated bone callus density and micro-architecture measured by micro-computed tomography at the end of the consolidation phase. The results support the preferred use of ultrasound imaging in the early monitoring of bone callus changes during DO treatment.
Abstract: To simulate expected climate change, we implemented a two-factor (temperature and soil moisture) field experiment in a forest in Ontario, Canada. To manipulate moisture input, we erected rain-exclusion structures. Under each structure, plots were watered with one of three treatments and thermally controlled with one of three heat treatments to simulate changes in air temperature and rainfall based on the general circulation model (GCM) predictions for the study area. Environmental conditions (including untreated controls) were monitored, tracking air temperature, soil temperature, soil moisture, and photosynthetically active radiation. We measured rainfall and relative humidity at the site outside the rain-exclusion structures. Analyses of the environmental conditions demonstrate that the temperature manipulation was most effective at maintaining the target temperature during the early part of the growing season, but it was more difficult to keep the warmest treatment at 5 °C above ambient by late summer. Target moisture regimes were generally achieved; however, incoming solar radiation was slightly attenuated by the structures.
Abstract: A product goes through various processes in a production flow, also known as an assembly line in manufacturing process management. Toyota created a new concept in the manufacturing industry, known as the lean concept; today it is the leading model in manufacturing plants throughout the globe. The linear walking worker assembly line is a flexible assembly system in which each worker travels down the line carrying out each assembly task at each station, so that each worker accomplishes the assembly of a unit from start to finish. This paper attempts to combine the flexibility of the walking worker with lean principles in order to quantify the benefits of applying the shop-floor principles of lean management.
Abstract: Response surface methodology was used for a quantitative investigation of water and solids transfer during osmotic dehydration of beetroot in an aqueous salt solution. The effects of temperature (25–45 °C), processing time (30–150 min), salt concentration (5–25%, w/w), and solution-to-sample ratio (5:1–25:1) on the osmotic dehydration of beetroot were estimated. Quadratic regression equations describing the effects of these factors on water loss and solids gain were developed. It was found that the effects of temperature and salt concentration on water loss were more significant than those of processing time and solution-to-sample ratio. For solids gain, processing time and salt concentration were the most significant factors. The osmotic dehydration process was optimized for water loss, solute gain, and weight reduction. The optimum conditions were found to be: temperature 35 °C, processing time 90 min, salt concentration 14.31%, and solution-to-sample ratio 8.5:1. At these optimum values, water loss, solids gain, and weight reduction were found to be 30.86, 9.43, and 21.43 g/100 g of initial sample, respectively.
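The quadratic-regression step of response surface methodology can be sketched as an ordinary least-squares fit of a second-order model. Only two of the four factors are shown for brevity, and the synthetic response below follows an invented law; it is not the paper's data.

```python
import numpy as np

def quadratic_design_matrix(x1, x2):
    """Second-order RSM model terms for two factors:
    1, x1, x2, x1^2, x2^2, x1*x2."""
    return np.column_stack([np.ones_like(x1), x1, x2,
                            x1 ** 2, x2 ** 2, x1 * x2])

rng = np.random.default_rng(4)
temp = rng.uniform(25.0, 45.0, 40)     # temperature, deg C
conc = rng.uniform(5.0, 25.0, 40)      # salt concentration, % w/w
# synthetic water-loss response following a known quadratic law
wl = 5.0 + 0.8 * temp + 1.2 * conc - 0.01 * temp ** 2 - 0.02 * conc ** 2
A = quadratic_design_matrix(temp, conc)
coef, *_ = np.linalg.lstsq(A, wl, rcond=None)
```

Maximizing the fitted surface over the factor ranges is then what yields optimum conditions of the kind reported above.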
Abstract: Truss spars are used for oil exploitation in deep and ultra-deep water when crude oil storage is not needed. A linear hydrodynamic analysis of the truss spar under random sea wave loads is necessary for determining its behaviour; this understanding is important not only for the design of the mooring lines but also for optimising the truss spar design. In this paper, the linear hydrodynamic analysis of a truss spar is carried out in the frequency domain. The hydrodynamic forces are calculated using the modified Morison equation and diffraction theory. The added mass and drag coefficients of the truss section, together with the normal acceleration and velocity components acting on each element, are computed by the transmission matrix method, while those of the hull section are computed by strip theory. The stiffness of the truss spar can be separated into two components: hydrostatic stiffness and mooring line stiffness. The platform response amplitudes are then obtained by solving the equation of motion; this equation is non-linear due to the viscous damping term and is therefore linearised by an iterative method [1]. Finally, the RAOs and significant response amplitudes are computed, and the results are compared with experimental data.
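The drag-plus-inertia structure of the (unmodified) Morison equation underlying the force calculation can be sketched as follows; the coefficient values and water-particle kinematics are generic assumptions for illustration, not the transmission-matrix or strip-theory results of the paper.

```python
import math

def morison_force(u, du_dt, D, rho=1025.0, cd=1.0, cm=2.0):
    """Morison wave force per unit length on a slender cylinder:
    inertia term (added-mass effect) plus quadratic drag term."""
    inertia = rho * cm * math.pi * D ** 2 / 4.0 * du_dt
    drag = 0.5 * rho * cd * D * u * abs(u)
    return inertia + drag

# hypothetical kinematics at one truss element (velocity in m/s,
# acceleration in m/s^2, member diameter in m); force in N/m
f = morison_force(u=1.5, du_dt=0.8, D=0.6)
```

The quadratic drag term u·|u| is the non-linearity the abstract mentions; frequency-domain analysis replaces it with an equivalent linear damping and iterates until the linearisation is consistent with the response amplitude.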
Abstract: This paper presents the initial findings of
research into distributed computer rendering. The goal of the
research is to create a distributed computer system capable of
rendering a 3D model into an MPEG-4 stream. This paper outlines
the initial design, software architecture and hardware setup for the
system.
Distributed computing means designing and implementing
programs that run on two or more interconnected computing systems.
Distributed computing is often used to speed up the rendering of
graphical imaging. Distributed computing systems are used to
generate images for movies, games and simulations.
A topic of interest is the application of distributed computing to
the MPEG-4 standard. During the course of the research, a
distributed system will be created that can render a 3D model into an
MPEG-4 stream. It is expected that applying distributed computing
principles will speed up rendering, thus improving the usefulness and
efficiency of the MPEG-4 standard.
Abstract: Exhaustive quality control is becoming more and more important for commercializing competitive products in the globalized world market. This is especially critical in market sectors that must meet the strictest quality requirements. One such example is the mass production of percussion caps, a critical element assembled in firearm ammunition. These elements, built in great quantities at very high speed, must be manufactured within minimal tolerance deviations because of their vital role in firing the piece of ammunition into which they are built. This paper outlines a machine vision development for 100% inspection of percussion caps, obtaining data from simultaneous 2D and 3D images. The acquisition speed and precision of these images from a metallic reflective part such as a percussion cap, the accuracy of the measurements taken from the images, and the multiple fabrication errors detected constitute the main findings of this work.
Abstract: This paper presents a highly efficient algorithm for detecting and tracking humans and objects in video surveillance sequences. Mean shift clustering is applied to background-differenced image sequences. For efficiency, all calculations are performed on integral images. Novel corresponding exponential integral kernels are introduced to allow the application of non-uniform kernels for clustering, which dramatically increases robustness without giving up the efficiency of the integral data structures. Experimental results demonstrating the power of this approach are presented.
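The integral-image (summed-area table) structure that underpins the efficiency claim works as follows: after one cumulative-sum pass, the sum over any rectangle costs four lookups regardless of its size. A minimal sketch (the paper's exponential integral kernels are not reproduced here):

```python
import numpy as np

def integral_image(img):
    """Summed-area table: one pass, then any rectangle sum is O(1)."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] from four integral-image lookups."""
    total = ii[r1 - 1, c1 - 1]
    if r0 > 0:
        total -= ii[r0 - 1, c1 - 1]
    if c0 > 0:
        total -= ii[r1 - 1, c0 - 1]
    if r0 > 0 and c0 > 0:
        total += ii[r0 - 1, c0 - 1]
    return total

img = np.arange(16.0).reshape(4, 4)
ii = integral_image(img)
s = rect_sum(ii, 1, 1, 3, 3)     # equals img[1:3, 1:3].sum()
```

Uniform (box) kernels fall out of this directly; the paper's contribution is extending the same constant-time lookups to non-uniform exponential kernels.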
Abstract: Wavelets have provided researchers with significant positive results in the texture defect detection domain. The weak point of wavelets is that they are one-dimensional by nature, so they are not efficient enough to describe and analyze two-dimensional functions. In this paper we present a new method to detect defects in texture images using the curvelet transform. Simulation results of the proposed method on a set of standard texture images confirm its correctness. A comparison of the obtained results indicates the superior ability of the curvelet transform, relative to the wavelet transform, in describing discontinuities in two-dimensional functions.
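To illustrate the detail-energy principle behind transform-based defect detection, here is a sketch using a one-level Haar wavelet as a stand-in (the curvelet transform of the paper is not reproduced): a defect that breaks the texture raises the energy in the detail sub-bands.

```python
import numpy as np

def haar_detail_energy(img):
    """One-level 2-D Haar transform; energy in the detail sub-bands
    (LH, HL, HH) rises where the texture pattern is broken."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    lh = (a + b - c - d) / 2.0   # horizontal details
    hl = (a - b + c - d) / 2.0   # vertical details
    hh = (a - b - c + d) / 2.0   # diagonal details
    return float((lh ** 2 + hl ** 2 + hh ** 2).sum())

rng = np.random.default_rng(5)
texture = rng.normal(0.0, 1.0, (32, 32))
defect = texture.copy()
defect[13:19, 13:19] += 8.0      # a blemish that breaks the texture
e_ok = haar_detail_energy(texture)
e_def = haar_detail_energy(defect)
```

Separable wavelets like Haar only capture horizontal, vertical, and diagonal edges well; the paper's point is that curvelets, with many orientations, represent curved discontinuities far more sparsely.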
Abstract: Urban centers in northeastern Brazil are strongly influenced by intense rainfalls, which can occur after long periods of drought and during which flood events can be observed. This paper therefore studies the rainfall frequencies in this region using the wavelet transform. Wavelet analysis is applied to long time series of total monthly rainfall at the capital cities of northeastern Brazil. The main frequency components in the time series are studied by means of the global wavelet spectrum, and the modulation in separate periodicity bands is examined in order to extract additional information; e.g., the 8- and 16-month bands were examined by averaging over all scales, giving a measure of the average annual variance versus time, from which periods of low or high variance could be identified. Important increases in the average variance were identified for some periods, e.g., 1947 to 1952 at Teresina city, which can be considered highly wet periods. Although the precipitation at those sites showed similar global wavelet spectra, the local wavelet spectra revealed particular features. This study can be considered an important tool for time series analysis and can support studies concerning flood control, mainly when applied together with rainfall-runoff simulations.
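The global wavelet spectrum used above can be sketched with a plain Morlet continuous wavelet transform; the toy monthly series (with an annual cycle) and the scale grid below are assumptions for illustration, not the Brazilian rainfall data.

```python
import numpy as np

def global_wavelet_spectrum(signal, scales, w0=6.0):
    """Morlet CWT power averaged over time: the global wavelet
    spectrum, whose peaks mark the dominant periodicities."""
    x = signal - signal.mean()
    gws = np.empty(len(scales))
    for i, s in enumerate(scales):
        t = np.arange(-4 * s, 4 * s + 1)
        # Morlet wavelet; its envelope is symmetric, so convolving
        # with psi is equivalent to correlating with its conjugate
        psi = np.exp(1j * w0 * t / s - (t / s) ** 2 / 2) / np.sqrt(s)
        w = np.convolve(x, psi, mode="same")
        gws[i] = np.mean(np.abs(w) ** 2)
    return gws

# toy monthly rainfall series with an annual (12-month) cycle
rng = np.random.default_rng(6)
months = np.arange(600)
rain = 100 + 40 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 10, 600)
scales = np.arange(2, 40)
gws = global_wavelet_spectrum(rain, scales)
peak_scale = scales[np.argmax(gws)]    # ~ w0 * P / (2*pi) for period P
```

Averaging |W|² over selected scale bands instead of over time gives the band-limited variance-versus-time curves (e.g. the 8- and 16-month bands) described in the abstract.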
Abstract: There are three main ways of categorizing capital in banking operations: accounting, regulatory, and economic capital. However, the 2008-2009 global crisis has shown that none of these categories adequately reflects the real risks of bank operations, especially in light of the failures of Bear Stearns, Lehman Brothers, and Northern Rock. This paper deals with the economic capital allocation of global banks. In theory, economic capital should reflect the real risks of a bank and should be publicly available. Yet, as discovered during the global financial crisis, even when economic capital information was publicly disclosed, the underlying assumptions rendered the information useless. Specifically, some global banks that reported relatively high levels of economic capital before the crisis went bankrupt or had to be bailed out by their governments. Moreover, only 15 out of 50 global banks reported their economic capital during the 2007-2010 period. In this paper, we analyze the changes in reported bank economic capital disclosure during this period. We conclude that the relative shares of credit and business risks increased in 2010 compared to 2007, while both operational and market risks decreased their shares of the total economic capital of top-rated global banks. Generally speaking, higher levels of disclosure and transparency of bank operations are required to obtain more confidence from stakeholders. Moreover, additional risks such as liquidity risk should be included in these disclosures.