Abstract: In the present study, a numerical analysis is carried
out to investigate the unsteady MHD (magnetohydrodynamic) flow and
heat transfer of a non-Newtonian second-grade viscoelastic fluid
over an oscillatory stretching sheet. The flow is induced by an
infinite elastic sheet that is stretched back and forth in its own
plane. The effects of viscous dissipation and Joule heating are
taken into account. The non-linear differential equations governing
the problem are transformed into a system of non-dimensional
differential equations using similarity transformations. A newly
developed meshfree numerical technique, the element-free Galerkin
method (EFGM), is employed to solve the coupled non-linear
differential equations. Results illustrating the effects of various
parameters, such as the viscoelastic parameter, the Hartmann number,
the ratio of the oscillation frequency of the sheet to its stretching
rate, and the Eckert number, on the velocity and temperature fields
are reported in graphs and tables. The present model finds application
in polymer extrusion, the drawing of plastic films and wires, and
glass, fiber and paper production.
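The unsteady MHD system of the paper is solved by EFGM; as a hedged baseline (our simplification, not the paper's method), the steady, non-magnetic, Newtonian limit reduces under the similarity transformation to Crane's equation f''' + f f'' - (f')^2 = 0 with f(0) = 0, f'(0) = 1, f'(inf) = 0, whose exact velocity profile is f'(eta) = exp(-eta). A plain RK4 march checks the reduced ODE against that closed form:

```python
def deriv(y):
    """Crane's ODE f''' = (f')^2 - f*f'' written as a first-order system."""
    f, fp, fpp = y
    return (fp, fpp, fp * fp - f * fpp)

def integrate(s, eta_max=5.0, h=1e-3):
    """RK4 from the wall with f(0)=0, f'(0)=1, f''(0)=s; returns the
    velocity profile f'(eta) sampled at every step."""
    y = (0.0, 1.0, s)
    profile = {0.0: y[1]}
    for n in range(1, int(round(eta_max / h)) + 1):
        k1 = deriv(y)
        k2 = deriv(tuple(y[i] + 0.5 * h * k1[i] for i in range(3)))
        k3 = deriv(tuple(y[i] + 0.5 * h * k2[i] for i in range(3)))
        k4 = deriv(tuple(y[i] + h * k3[i] for i in range(3)))
        y = tuple(y[i] + h / 6.0 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
                  for i in range(3))
        profile[round(n * h, 9)] = y[1]
    return profile

prof = integrate(-1.0)   # the exact similarity solution has f''(0) = -1
```

With the exact wall shear f''(0) = -1, the computed f'(eta) tracks exp(-eta) to high accuracy, confirming the similarity reduction before any viscoelastic, magnetic or oscillatory terms are added.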
Abstract: In this paper, a new secure watermarking scheme for
color images is proposed. It splits the watermark into two shares
using a (2, 2)-threshold Visual Cryptography Scheme (VCS) with an
adaptive order-dithering technique and embeds one share into a highly
textured subband of the luminance channel of the color image. The
other share is used as the key and is available only to the super-user
or the author of the image, so only the super-user can reveal the
original watermark. The proposed scheme is dynamic in the sense that,
to maintain the perceptual similarity between the original and the
watermarked image, the selected subband coefficients are modified
by varying the watermark scaling factor. The experimental results
demonstrate the effectiveness of the proposed scheme. Further, the
scheme is able to resist all common attacks, even those of strong
amplitude.
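A hedged sketch of the (2, 2)-threshold VCS pixel expansion the scheme starts from (the dithering and subband embedding steps are omitted): each secret pixel becomes a 2x2 block in each share; the two shares use identical blocks for a white pixel and complementary blocks for a black pixel, so stacking them (logical OR, as with printed transparencies) reveals the secret while either share alone is uniformly half black.

```python
import random

PATTERNS = [(1, 0, 0, 1), (0, 1, 1, 0), (1, 1, 0, 0),
            (0, 0, 1, 1), (1, 0, 1, 0), (0, 1, 0, 1)]  # 1 = black subpixel

def split_pixel(bit, rng=random):
    """Return the two 2x2 share blocks for one secret bit (1 = black)."""
    p = rng.choice(PATTERNS)
    if bit:                                  # black: complementary blocks
        q = tuple(1 - v for v in p)
    else:                                    # white: identical blocks
        q = p
    return p, q

def stack(p, q):
    """Overlay two share blocks (stacking transparencies = OR)."""
    return tuple(a | b for a, b in zip(p, q))

s1, s2 = split_pixel(1)
assert sum(stack(s1, s2)) == 4   # black pixel: fully black when stacked
w1, w2 = split_pixel(0)
assert sum(stack(w1, w2)) == 2   # white pixel: half black (appears gray)
```

Because every pattern has exactly two black subpixels, a single share leaks nothing about the watermark, which is why the second share can safely act as the super-user's key.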
Abstract: Eigenvector methods are gaining increasing acceptance in the area of spectrum estimation. This paper presents a successful attempt at testing and evaluating the performance of two of the most popular subspace techniques in determining the parameters of multiexponential signals with real decay constants buried in noise. In particular, MUSIC (Multiple Signal Classification) and minimum-norm techniques are examined. It is shown that these methods perform almost equally well on multiexponential signals, with MUSIC displaying better-defined peaks.
Abstract: The load flow study in a power system is of paramount importance. It reveals the electrical performance and the real and reactive power flows for specified conditions when the system is operating in steady state. This paper gives an overview of the different techniques used for load flow studies under different specified conditions.
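As a minimal illustration of one classic technique (the network data below is invented), the Gauss-Seidel iteration can be applied to a linearised "DC" load flow B'theta = P on a 3-bus system with bus 1 as slack and every line susceptance equal to 10 p.u.:

```python
B = [[20.0, -10.0],
     [-10.0, 20.0]]   # reduced susceptance matrix (buses 2 and 3)
P = [1.0, -1.0]       # net injections in p.u.: generation at 2, load at 3

theta = [0.0, 0.0]    # initial guess for the bus voltage angles (rad)
for _ in range(200):  # Gauss-Seidel sweeps
    for i in range(2):
        s = sum(B[i][j] * theta[j] for j in range(2) if j != i)
        theta[i] = (P[i] - s) / B[i][i]
# converged angles: theta2 = 1/30, theta3 = -1/30
```

The full AC load flow iterates the same way on the complex power-balance equations; Newton-Raphson variants replace the sweep with a Jacobian solve for faster convergence.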
Abstract: This study investigated the climatic factors associated
with influenza incidence in Nakhon Si Thammarat, Southern
Thailand. The climatic factors comprised the amount of rainfall,
the percentage of rainy days, relative humidity, wind speed,
maximum and minimum temperatures, and the temperature difference.
A multiple stepwise regression technique was used to fit the
statistical model. The results showed that the temperature difference
and the percentage of rainy days were positively associated with
influenza incidence in Nakhon Si Thammarat.
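The first step of forward stepwise selection can be sketched on invented weekly data (not the Nakhon Si Thammarat records): each candidate predictor is scored by the R-squared of a simple fit, and the strongest one enters the model first; later steps repeat this on the residuals until no candidate improves the fit enough.

```python
def r_squared(xs, y):
    """R^2 of a simple one-predictor least-squares fit y ~ a + b*x."""
    n = len(y)
    mx, my = sum(xs) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, y))
    sxx = sum((a - mx) ** 2 for a in xs)
    syy = sum((b - my) ** 2 for b in y)
    return 0.0 if sxx == 0 or syy == 0 else sxy * sxy / (sxx * syy)

# hypothetical weekly data: incidence driven by temperature difference
temp_diff = [2.0, 4.0, 6.0, 8.0, 10.0, 12.0]
wind      = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0]      # weak candidate
cases     = [5.1, 8.9, 13.2, 16.8, 21.1, 24.9]  # roughly 2*temp_diff + 1

candidates = {"temp_diff": temp_diff, "wind": wind}
best = max(candidates, key=lambda k: r_squared(candidates[k], cases))
```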
Abstract: Arbitrarily shaped video objects are an important
concept in modern video coding methods, which operate not on image
elements but on video objects having an arbitrary shape. In this
paper, spatial shape error concealment techniques for object-based
video in error-prone environments are proposed. We consider a
geometric shape representation consisting of the object boundary,
which can be extracted from the α-plane. Three different approaches
are used to replace a missing boundary segment: Bézier interpolation,
Bézier approximation and NURBS approximation. Experimental results on
object shapes of different concealment difficulty demonstrate the
performance of the proposed methods. Comparisons among the proposed
methods are also presented.
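A sketch of the first of the three repair approaches, cubic Bézier interpolation of a missing boundary segment. The endpoints and control points below are hypothetical; in the paper they would be derived from the surviving α-plane boundary on each side of the gap.

```python
def bezier(p0, p1, p2, p3, t):
    """de Casteljau evaluation of a cubic Bezier at parameter t in [0, 1]."""
    def lerp(a, b, t):
        return (a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t)
    a, b, c = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    d, e = lerp(a, b, t), lerp(b, c, t)
    return lerp(d, e, t)

# endpoints of the lost segment, plus control points placed along the
# last known boundary direction on each side of the gap (invented values)
p0, p3 = (0.0, 0.0), (9.0, 0.0)
p1, p2 = (3.0, 4.0), (6.0, 4.0)
gap = [bezier(p0, p1, p2, p3, i / 10.0) for i in range(11)]
```

The interpolating curve passes exactly through both endpoints, so the concealed segment joins the intact boundary without discontinuity; the approximation variants instead fit the curve loosely to several surviving boundary samples.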
Abstract: Web applications have become very complex and
crucial, especially when combined with areas such as CRM
(Customer Relationship Management) and BPR (Business Process
Reengineering). The scientific community has therefore focused its
attention on Web application design, development, analysis, and
testing, studying and proposing methodologies and tools. This paper
proposes an approach to automatic multi-dimensional concern
mining for Web applications, based on concept analysis, impact
analysis, and token-based concern identification. The approach lets
the user analyse and traverse the Web software relevant to a particular
concern (concept, goal, purpose, etc.) via multi-dimensional
separation of concerns, in order to document, understand and test Web
applications. The technique was developed in the context of the WAAT
(Web Applications Analysis and Testing) project. A semi-automatic
tool supporting it is currently under development.
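A deliberately naive sketch of the token-based concern identification step (file names, source strings and seed tokens are all invented; the WAAT analysis is far richer): a concern is described by a set of seed tokens, and source units are ranked by how many of those tokens they contain.

```python
import re

concern_tokens = {"cart", "checkout", "order"}   # the "purchase" concern

pages = {
    "cart.php":     "function addToCart($item) { $_SESSION['cart'][] = $item; }",
    "checkout.php": "function checkout() { placeOrder($_SESSION['cart']); }",
    "news.php":     "function listNews() { return fetchAll('news'); }",
}

def score(src):
    """Count distinct concern tokens appearing in a source unit."""
    toks = {t.lower() for t in re.findall(r"[A-Za-z]+", src)}
    return len(concern_tokens & toks)

ranking = sorted(pages, key=lambda p: score(pages[p]), reverse=True)
```

Crossing such rankings for several concerns yields the multi-dimensional view the paper describes, letting a tester traverse only the pages relevant to the concern under test.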
Abstract: The purpose of this study was to determine the
influence of physical activity and dietary fat intake on the Body
Mass Index (BMI) of lecturers in an institution of higher learning.
The study adopted a cross-sectional correlational design
and included 120 lecturers selected proportionately, by simple
random sampling, from a population of 600 lecturers. Data
were collected using questionnaires whose sections included a
physical activity checklist adopted from the International Physical
Activity Questionnaire (IPAQ), a 24-hour food recall, and
anthropometric measurements, mainly weight and height. The analysis
involved bivariate correlations and linear regression. A significant
inverse association was registered between BMI and the duration (in
minutes) spent doing moderately intense physical activity per day
(r=-0.322,
p
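The bivariate step can be reproduced on invented numbers (not the study's data): Pearson's r between minutes of moderate activity per day and BMI, where a negative r matches the direction of the reported association.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

minutes = [10, 20, 30, 45, 60, 90]           # moderate activity per day
bmi     = [29.5, 28.0, 27.2, 25.9, 24.1, 22.8]
r = pearson_r(minutes, bmi)                  # negative: more activity, lower BMI
```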
Abstract: The dispersion of a line of heavy particles in isotropic,
incompressible three-dimensional turbulent flow has been studied
using kinematic simulation techniques to find the evolution of the
line's fractal dimension. In this study, the fractal dimension of
the line is found for different cases of heavy-particle inertia
(different Stokes numbers) in the absence of particle gravity, and
compared with the fractal dimension obtained for the diffusion of a
material line at the same Reynolds number. It can be concluded that
particle inertia affects the fractal dimension of a line released in
a turbulent flow for Stokes numbers 0.02 < St < 2. At small times,
most cases are unaffected by inertia until a certain time, the
particle response time τa, which is later for larger particle
inertia; at larger times the fractal dimension of the line increases
as the particles become more sensitive to the small scales, which
change the line's shape during its journey.
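The fractal dimension of a particle line is commonly estimated by box counting. A minimal sketch (sampling a straight line, for which the estimate should be close to 1, rather than a simulated heavy-particle line):

```python
import math

def box_count(points, eps):
    """Number of cubic boxes of side eps occupied by the sampled line
    (assumes non-negative coordinates)."""
    return len({(int(x / eps), int(y / eps), int(z / eps))
                for x, y, z in points})

# densely sampled straight line from (0,0,0) toward (1,1,1)
line = [(i / 10000.0,) * 3 for i in range(10000)]

e1, e2 = 0.1, 0.01
dim = math.log(box_count(line, e2) / box_count(line, e1)) / math.log(e1 / e2)
```

Counting occupied boxes at two scales and taking the log-ratio slope gives the dimension; for the simulated heavy-particle lines the same estimator would report values above 1 as inertia wrinkles the line.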
Abstract: Optical Burst Switching (OBS) is a relatively new
optical switching paradigm. Contention and burst loss in OBS
networks are major concerns. To resolve contentions, an interesting
alternative to discarding the entire data burst is to drop it only
partially. Partial burst dropping is based on the burst segmentation
concept, whose implementation is constrained by some technical
challenges, besides the complexity it adds to the algorithms and
protocols at both the edge and core nodes. In this paper, the burst
segmentation concept is investigated, and an implementation scheme is
proposed and evaluated. An appropriate dropping policy that
effectively manages the size of the segmented data bursts is
presented. The dropping policy is further supported by a new control
packet format that provides constant transmission overhead.
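A sketch of one plausible tail-dropping policy (segment sizes and the contention overlap are invented, and the paper's actual policy manages segment sizes more carefully): when a contending burst overlaps the tail of the burst in transmission, only the overlapped tail segments are dropped instead of the whole burst.

```python
def drop_tail(segments, overlap):
    """segments: list of segment lengths (bytes); overlap: contended tail
    length. Returns (kept, dropped); a partially overlapped segment is
    dropped whole, since segment boundaries cannot be split."""
    kept, acc = [], 0
    for seg in reversed(segments):
        if acc < overlap:
            acc += seg            # this tail segment falls in the overlap
        else:
            kept.append(seg)
    kept.reverse()
    return kept, segments[len(kept):]

burst = [500, 500, 500, 500]     # four 500-byte segments
kept, dropped = drop_tail(burst, overlap=700)  # contention over last 700 B
```

Only the head of the burst is delivered, which is why the control packet must carry enough information (at constant overhead, per the proposed format) for downstream nodes to learn the burst's new length.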
Abstract: Marketing is essential to the survival of any
real estate company in Turkey. Several factors constrain the
achievement of marketing and sales strategies in the Turkish real
estate industry. This study aims to identify and prioritise the most
significant constraints on marketing in the real estate sector, and
to suggest new strategies based on those constraints. The study is
based on a survey in which respondents such as credit counsellors,
real estate investors, consultants, academics and marketing
representatives in Turkey were asked to rank forty-seven sub-factors
according to their level of impact. The results of the multi-attribute
analytical technique indicated that the main sub-components affecting
marketing in the real estate sector are, in order, interest rates,
real estate credit availability, accessibility, company image and
consumer real income. The identified constraints are expected to
guide the marketing team in a sales-effective way.
Abstract: A multi-agent system is developed here to predict
monthly details of the upcoming peak of the 24th solar magnetic
cycle. While studies typically predict the timing and magnitude of
cycle peaks using annual data, this one utilizes the unsmoothed
monthly sunspot number instead. Monthly numbers display more
pronounced fluctuations during periods of strong solar magnetic
activity than the annual sunspot numbers. Because strong magnetic
activities may cause significant economic damages, predicting
monthly variations should provide different and perhaps helpful
information for decision-making purposes. The multi-agent system
developed here operates in two stages. In the first, it produces twelve
predictions of the monthly numbers. In the second, it uses those
predictions to deliver a final forecast. Acting as expert agents, genetic
programming and neural networks produce the twelve fits and
forecasts as well as the final forecast. According to the results
obtained, the next peak is predicted to be 156 and is expected to
occur in October 2011, with an average of 136 for that year.
Abstract: The Brazilian legislation has only established
diagnostic reference levels (DRLs) in terms of Multiple Scan
Average Dose (MSAD) as a quality control parameter for computed
tomography (CT) scanners. Compliance with DRLs can be verified
by measuring the Computed Tomography Kerma Index (Ca,100) with
a pencil ionization chamber or by obtaining the kerma distribution in
CT scans with radiochromic films or rod-shaped lithium fluoride
thermoluminescent dosimeters (TLD-100). TL dosimeters were used
to record kerma profiles and to determine MSAD values for a GE
Bright Speed CT scanner. Measurements were done with
radiochromic films and TL dosimeters distributed in cylinders
positioned in the center and in four peripheral bores of a standard
polymethylmethacrylate (PMMA) body CT dosimetry phantom.
Irradiations were done using a protocol for adult chest. The
maximum values were found at the midpoint of the longitudinal axis.
The MSAD values obtained with three dosimetric techniques were
compared.
Abstract: Interventional cardiologists are at greater risk from
radiation exposure, as a result of the procedures they undertake,
than most other medical specialists. A study was performed to
evaluate operator dose during interventional cardiology procedures
and to establish methods of operator dose reduction using a radiation
protective device. Differences in procedural technique and in the use
of protective tools can explain the large differences in the annual
equivalent dose received by these professionals. Strategies to
prevent and monitor radiation exposure, advanced protective
shielding, and effective radiation monitoring methods should be
applied.
Abstract: Since the pioneering work of Zadeh, fuzzy set theory has been applied to a myriad of areas. Song and Chissom introduced the concept of fuzzy time series and applied some methods to the enrollments of the University of Alabama. In recent years, a number of techniques have been proposed for forecasting based on fuzzy set theory; these have used either enrollment numbers or differences of enrollments as the universe of discourse. In this communication, we instead propose using the year-to-year percentage change as the universe of discourse, modifying the approach of Jilani, Burney, and Ardil accordingly. We use enrollment figures for the University of Alabama to illustrate the proposed method, which yields better forecasting accuracy than existing models.
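The proposed universe of discourse is easy to sketch; the enrollment figures below are illustrative stand-ins in the style of the Alabama series, and the interval count is an arbitrary choice.

```python
def pct_changes(series):
    """Year-to-year percentage change: the proposed universe of discourse."""
    return [100.0 * (b - a) / a for a, b in zip(series, series[1:])]

def partition(lo, hi, n):
    """n equal-width fuzzy intervals covering [lo, hi]."""
    w = (hi - lo) / n
    return [(lo + i * w, lo + (i + 1) * w) for i in range(n)]

enroll = [13055, 13563, 13867, 14696, 15460, 15311]
changes = pct_changes(enroll)
intervals = partition(min(changes), max(changes), 4)
```

Each percentage change is then fuzzified to the interval(s) it falls in, fuzzy logical relationships are extracted, and the forecast percentage is applied back to the last known enrollment, exactly as with the enrollment- or difference-based universes it replaces.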
Abstract: The characteristics of ad hoc networks, and even their existence, depend on the nodes forming them. Thus, services and applications designed for ad hoc networks should adapt to this dynamic and distributed environment. In particular, multicast algorithms with reliability and scalability requirements should abstain from centralized approaches. We aspire to define a reliable and scalable multicast protocol for ad hoc networks, and our target is to utilize epidemic techniques for this purpose. In this paper, we present a brief survey of epidemic algorithms for reliable multicasting in ad hoc networks, and describe formulations and analytical results for simple epidemics. Then, the P2P anti-entropy algorithm for content distribution and our prototype simulation model are described, together with initial results demonstrating the behavior of the algorithm.
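A minimal sketch of a simple push epidemic (our toy simulation, far simpler than the paper's P2P anti-entropy prototype): in each round, every node holding the update pushes it to one uniformly random peer, and with high probability all n nodes converge in O(log n) rounds.

```python
import random

def anti_entropy_rounds(n, seed=1):
    """Simulate push gossip until all n nodes hold the update;
    return the number of rounds taken."""
    rng = random.Random(seed)
    infected = {0}                    # node 0 initially holds the content
    rounds = 0
    while len(infected) < n:
        rounds += 1
        for _ in list(infected):      # every infected node pushes once
            infected.add(rng.randrange(n))
    return rounds

rounds = anti_entropy_rounds(50)
```

Real anti-entropy additionally exchanges digests so nodes pull only the messages they are missing, which is what gives the protocol its reliability without any centralized coordinator.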
Abstract: The distinction among urban, periurban and rural areas represents a classical example of uncertainty in land classification. Satellite images, geostatistical analysis and all kinds of spatial data are very useful in urban sprawl studies, but it is important to define precise rules for combining large amounts of data to build complex knowledge about territory. Rough Set theory may be a useful method to employ in this field: it offers a different mathematical approach to uncertainty by capturing indiscernibility, whereby two different phenomena can be indiscernible in some contexts and classified in the same way when the available information about them is combined. This approach has been applied in a case study, comparing the results achieved with the Map Algebra technique and with Spatial Rough Sets. The case study area, Potenza Province, is particularly suitable for the application of this theory because it includes 100 municipalities with different numbers of inhabitants and different morphologic features.
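A sketch of the indiscernibility relation at the heart of rough set theory, on an invented municipality table: areas with identical values on the chosen attributes fall into the same equivalence class and cannot be told apart when classifying, say, urban versus rural.

```python
def indiscernibility(table, attrs):
    """Partition objects into equivalence classes of equal attribute values."""
    classes = {}
    for name, row in table.items():
        key = tuple(row[a] for a in attrs)
        classes.setdefault(key, set()).add(name)
    return list(classes.values())

municipalities = {
    "A": {"density": "high", "services": "many"},
    "B": {"density": "high", "services": "many"},
    "C": {"density": "low",  "services": "few"},
}
classes = indiscernibility(municipalities, ["density", "services"])
```

Lower and upper approximations of a target set (e.g. "urban") are then built from these classes, and the boundary between the two approximations is exactly where the urban/periurban/rural uncertainty lives.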
Abstract: A great deal of work has been done on predicting the fault proneness of software systems. However, the severity of faults matters more than their number, since major faults concern a developer most and need immediate attention. In this paper, we try to predict the level of impact of the faults present in software systems. Neuro-fuzzy predictor models are applied to NASA's public-domain defect dataset, coded in the C programming language. Correlation-based Feature Selection (CFS) evaluates the worth of a subset of attributes by considering the individual predictive ability of each feature along with the degree of redundancy between them, so CFS is used to select the metrics most highly correlated with the level of fault severity. The results are compared with the prediction results of Logistic Model Trees (LMT), earlier quoted as the best technique in [17], and are recorded in terms of Accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). They show that the neuro-fuzzy model provides relatively better prediction accuracy than the other models and hence can be used to model the level of impact of faults in function-based systems.
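The CFS merit heuristic the paper relies on can be written down directly; the correlation values below are invented for illustration.

```python
import math

def cfs_merit(r_cf, r_ff, k):
    """CFS subset merit = k*r_cf / sqrt(k + k*(k-1)*r_ff), where r_cf is
    the mean feature-class correlation and r_ff the mean pairwise
    feature-feature correlation of the k features."""
    return k * r_cf / math.sqrt(k + k * (k - 1) * r_ff)

# a redundant pair of metrics versus a complementary pair
redundant     = cfs_merit(r_cf=0.6, r_ff=0.9, k=2)
complementary = cfs_merit(r_cf=0.6, r_ff=0.1, k=2)
```

With equal class correlation, the low-redundancy subset scores higher, which is exactly why CFS favours a compact set of severity-relevant, mutually non-redundant metrics.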
Abstract: The information revealed by derivatives can help to
better characterize digital near-end crosstalk signatures with the
ultimate goal of identifying the specific aggressor signal.
Unfortunately, derivatives tend to be very sensitive to even low
levels of noise. In this work we approximated the derivatives of both
quiet and noisy digital signals using a wavelet-based technique. The
results are presented for Gaussian digital edges, IBIS Model digital
edges, and digital edges in oscilloscope data captured from an actual
printed circuit board. Tradeoffs between accuracy and noise
immunity are presented. The results show that the wavelet technique
can produce first derivative approximations that are accurate to
within 5% or better, even under noisy conditions. The wavelet
technique can be used to calculate the derivative of a digital signal
edge when conventional methods fail.
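One way to realize a wavelet-style derivative estimator (our stand-in; the paper's specific wavelet and scales are not reproduced here) is convolution with a derivative-of-Gaussian kernel, which differentiates and smooths in a single pass and therefore tolerates noise far better than a plain finite difference.

```python
import math

def dog_kernel(sigma, dt):
    """Samples of g'(t)*dt for a normalized Gaussian g, t in [-4s, 4s]."""
    half = int(4 * sigma / dt)
    norm = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return [-(i * dt) / sigma**2 * norm
            * math.exp(-(i * dt) ** 2 / (2 * sigma**2)) * dt
            for i in range(-half, half + 1)]

def smooth_derivative(x, sigma, dt):
    """Estimate x'(t) by convolving with g' (edges clamped)."""
    k = dog_kernel(sigma, dt)
    h, n = len(k) // 2, len(x)
    return [sum(k[j] * x[min(max(i + h - j, 0), n - 1)] for j in range(len(k)))
            for i in range(n)]

dt = 0.01
ramp = [i * dt for i in range(101)]          # x(t) = t, so x'(t) = 1
d = smooth_derivative(ramp, sigma=0.05, dt=dt)
```

The width sigma sets the accuracy/noise-immunity tradeoff the abstract mentions: a wider kernel averages away more noise but rounds off fast digital edges.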
Abstract: This paper presents a new fingerprint coding technique
based on the contourlet transform and multistage vector quantization.
Wavelets have shown their ability to represent natural images that
contain smooth areas separated by edges. However, wavelets
cannot efficiently exploit the fact that the edges usually
found in fingerprints are smooth curves. This issue is addressed by
directional transforms, known as contourlets, which have the
property of preserving edges. The contourlet transform is a new
extension to the wavelet transform in two dimensions using
nonseparable and directional filter banks. The computation and
storage requirements are the major difficulty in implementing a
vector quantizer. In the full-search algorithm, the computation and
storage complexity is an exponential function of the number of bits
used in quantizing each frame of spectral information. The storage
requirement in multistage vector quantization is less when compared
to full search vector quantization. The coefficients of contourlet
transform are quantized by multistage vector quantization. The
quantized coefficients are encoded by Huffman coding. The results
obtained are tabulated and compared with those of existing
wavelet-based techniques.
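A hedged sketch of the multistage idea on a toy 2-D vector (the codebooks are invented, not trained on fingerprint contourlet coefficients): stage 1 quantizes the vector coarsely, stage 2 quantizes the stage-1 residual, and the reconstruction is the sum of the two codewords, so two small codebooks act like one much larger product codebook at a fraction of the search and storage cost.

```python
def nearest(cb, v):
    """Nearest codeword to v under squared Euclidean distance."""
    return min(cb, key=lambda c: sum((a - b) ** 2 for a, b in zip(c, v)))

def msvq_encode(v, cb1, cb2):
    c1 = nearest(cb1, v)
    residual = [a - b for a, b in zip(v, c1)]
    return c1, nearest(cb2, residual)

def msvq_decode(c1, c2):
    return [a + b for a, b in zip(c1, c2)]

cb1 = [[0.0, 0.0], [4.0, 4.0]]                             # coarse stage 1
cb2 = [[-1.0, 0.0], [1.0, 0.0], [0.0, -1.0], [0.0, 1.0]]   # residual stage 2

v = [3.2, 4.1]
c1, c2 = msvq_encode(v, cb1, cb2)
vhat = msvq_decode(c1, c2)
```

In the paper's pipeline the inputs would be contourlet coefficients and the chosen codeword indices would then be entropy-coded with Huffman coding.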