Abstract: Smoothing or filtering of data is the first preprocessing
step for noise suppression in many applications involving data
analysis. The moving average is the most popular method of smoothing
data; its generalization led to the development of the Savitzky-Golay
filter. Many window smoothing methods were developed by convolving
the data with different window functions for different applications;
the most widely used window functions are the Gaussian and Kaiser
windows. Function approximation of the data by polynomial regression,
Fourier expansion or wavelet expansion also yields smoothed data.
Wavelet methods also smooth the data to a great extent by
thresholding the wavelet coefficients. Almost all smoothing methods
destroy peaks, flattening them as the support of the window is
increased. In certain applications it is desirable to retain peaks
while smoothing the data as much as possible. In this paper we
present a methodology called peak-wise smoothing that will smooth the
data to any desired level without losing the major peak features.
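As context for the peak-flattening behaviour described above, moving-average smoothing can be sketched as follows (a minimal illustration of the baseline method, not the peak-wise smoothing proposed in the paper; the signal and window sizes are hypothetical):

```python
import numpy as np

def moving_average(x, window=5):
    """Smooth a 1-D signal by convolving it with a flat window.

    Widening the window suppresses more noise but, as noted in the
    abstract, also flattens sharp peaks.
    """
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

# A noisy signal containing one narrow peak.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
signal = np.exp(-((t - 0.5) ** 2) / 0.001)          # narrow Gaussian peak
noisy = signal + 0.1 * rng.standard_normal(t.size)

smooth_small = moving_average(noisy, window=5)      # peak mostly preserved
smooth_large = moving_average(noisy, window=41)     # peak strongly flattened
print(smooth_small.max(), smooth_large.max())
```

The wider window removes more noise at the cost of peak amplitude, which is exactly the trade-off peak-wise smoothing aims to avoid.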
Abstract: The objective of this research is to determine the
optimal inventory lot size for each supplier and minimize the total
inventory cost, which includes the joint purchase cost of the
products, the transaction cost for the suppliers, and the holding
cost for remaining inventory. Genetic algorithms (GAs) are applied to
multi-product, multi-period inventory lot-sizing problems with
supplier selection under storage space constraints: a maximum storage
space available to the decision maker in each period is considered.
The decision maker needs to determine what products to order, in what
quantities, with which suppliers, in which periods. It is assumed
that the demand for the multiple products is known over the planning
horizon. The problem is formulated as a mixed integer program and is
solved with the GAs. Detailed computational results are presented.
Abstract: This paper describes a project carried out at the
University of Cordoba, specifically at the High Polytechnic School,
in collaboration with two other organizations belonging to the
Andalusian Ministry of Innovation, Science and Business: the
Andalusian Innovation and Development Agency (IDEA agency) [1]
and the Territorial Net of Entrepreneurship Support (in Spanish, Red
Territorial de Apoyo al Emprendedor) [11].
The project is being developed in several stages, of which only the
first has been completed so far. However, several important
preliminary results derive from it, based mainly on the description
of the nature of entrepreneurship in the field of university
education and its impact on students' competencies as recommended by
the European Higher Education Area. Some problems holding back the
project's future development, derived from its specific context of
application, are also discussed.
Abstract: High-precision motion is required to manipulate micro
objects in precision industries, e.g. for micro assembly and cell
manipulation. Precision manipulation is achieved through the
appropriate mechanism design of micro devices such as microgrippers.
A compliant mechanism design is the better option for achieving
highly precise and controlled motion. This article presents a method
of designing a compliant three-fingered microgripper suitable for
holding asymmetric objects. Topology optimization, a systematic
design technique, is employed to arrive at a topologically optimized
design of the mechanism needed to perform the required micro motion
of the gripper. Topology optimization has the drawback of generating
senseless regions, such as node-to-node connectivity and staircase
effects at the boundaries, so post-processing of the design is
required to make it manufacturable. To reduce the effort of the
post-processing stage and to preserve the edges of the image, a cubic
spline interpolation technique is introduced in the MATLAB program.
The structural performance of the topologically developed mechanism
design is verified using finite element method (FEM) software. The
microgripper structure is further examined for its fatigue life and
vibration characteristics.
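The cubic-spline boundary smoothing can be illustrated as follows (a sketch in Python with SciPy rather than the paper's MATLAB program; the staircase boundary points below are hypothetical):

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical boundary samples from a topology-optimized outline;
# the step-like y values mimic the staircase effect at the edges.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.0, 0.0, 1.0, 1.0, 2.0, 2.0])

# Fit a C2-continuous cubic spline through the boundary points and
# resample it on a finer grid to obtain a manufacturable edge.
spline = CubicSpline(x, y)
x_fine = np.linspace(0.0, 5.0, 51)
y_smooth = spline(x_fine)
print(y_smooth.min(), y_smooth.max())
```

The spline still passes through every original boundary point, so the edge locations are preserved while the discrete jumps are replaced by a smooth curve.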
Abstract: In this paper, we propose a block-wise watermarking scheme for color image authentication to resist malicious tampering of digital media. A thresholding technique is incorporated into the scheme such that the tampered region of the color image can be recovered with high quality while the verification result is obtained. The watermark for each block consists of its dual authentication data and the corresponding feature information. The feature information for recovery is computed by the thresholding technique. In the verification process, we propose a dual-option parity check method to prove the validity of image blocks. In the recovery process, the feature information of each block embedded into the color image is rebuilt for high-quality recovery. The simulation results show that the proposed watermarking scheme can effectively detect the tampered region with a high detection rate and can recover the tampered region with high quality.
Abstract: Most simple nonlinear thresholding rules for
wavelet-based denoising assume that the wavelet coefficients are
independent. However, the wavelet coefficients of natural images have
significant dependencies. This paper attempts to give a recipe for
selecting one of the popular image-denoising algorithms based on
VisuShrink, SureShrink, OracleShrink, BayesShrink and BiShrink, and
also compares different bivariate models used for image denoising.
The first part of the paper compares different shrinkage functions
used for image denoising. The second part compares different
bivariate models, and the third part uses a bivariate model with
modified marginal variance based on a Laplacian assumption. The paper
gives an experimental comparison on six commonly used 512x512 images:
Lenna, Barbara, Goldhill, Clown, Boat and Stonehenge. Noise at powers
of 25 dB, 26 dB, 27 dB, 28 dB and 29 dB is added to the six standard
images, and the corresponding Peak Signal-to-Noise Ratio (PSNR)
values are calculated for each noise level.
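The PSNR figure of merit used in the comparison follows the standard definition; a minimal sketch (the toy image below is hypothetical):

```python
import numpy as np

def psnr(original, distorted, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB for 8-bit images:
    PSNR = 10 * log10(peak^2 / MSE)."""
    mse = np.mean((original.astype(float) - distorted.astype(float)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)

# Toy example: a constant image shifted by a known offset,
# so MSE = 25 and PSNR = 10 * log10(255^2 / 25).
img = np.full((8, 8), 100.0)
noisy = img + 5.0
print(round(psnr(img, noisy), 2))   # -> 34.15
```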
Abstract: Segmentation is an important step in medical image
analysis and classification for radiological evaluation or computer
aided diagnosis. This paper presents the problem of inaccurate lung
segmentation as observed in algorithms presented by researchers
working in the area of medical image analysis. The different lung
segmentation techniques have been tested using a dataset of 19
patients consisting of a total of 917 images. We obtained datasets of
11 patients from Akron University, USA, and of 8 patients from
Aga Khan Medical University, Pakistan. After testing the algorithms
against these datasets, the deficiencies of each algorithm have been
highlighted.
Abstract: Supply Chain Management (SCM) is the integration of
manufacturers, transporters and customers to form one seamless chain
that allows the smooth flow of raw materials, information and
products throughout the entire network, helping to minimize all
related efforts and costs. The main objective of this paper is to
develop a model that can accept a specified number of spare parts
within the supply chain and simulate its inventory operations
throughout all stages, in order to minimize the inventory holding
costs, base stock and safety stock, and to find the optimum inventory
levels, thereby suggesting a way forward to adapt some factors of
Just-In-Time to minimize the inventory costs throughout the entire
supply chain. The model has been developed using Microsoft Excel and
Visual Basic in order to study inventory allocations in any network
of the supply chain. The application and reproducibility of this
model were tested by comparing the actual system implemented in the
case study with the results of the developed model. The findings
showed that the total inventory costs of the developed model are
about 50% less than the actual costs of the inventory items within
the case study.
Abstract: This paper introduces a new signal denoising method based on the empirical mode decomposition (EMD) framework. The method is a fully data-driven approach. The noisy signal is decomposed adaptively into oscillatory components called intrinsic mode functions (IMFs) by means of a process called sifting. EMD denoising involves filtering or thresholding each IMF and reconstructing the estimated signal from the processed IMFs. The EMD can be combined with a filtering approach or with a nonlinear transformation; in this work the Savitzky-Golay filter and soft thresholding are investigated. For thresholding, IMF samples are shrunk or scaled below a threshold value. The standard deviation of the noise is estimated for every IMF, and the threshold is derived for Gaussian white noise. The method is tested on simulated and real data and compared with averaging, median and wavelet approaches.
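The per-IMF soft-thresholding step can be sketched as follows (an illustrative sketch: the median-based noise estimate and the universal threshold T = sigma * sqrt(2 ln N) are standard choices for Gaussian white noise and are assumptions here, not necessarily the paper's exact derivation):

```python
import numpy as np

def estimate_sigma(imf):
    """Robust noise estimate from the median absolute deviation."""
    return np.median(np.abs(imf)) / 0.6745

def soft_threshold(imf, sigma):
    """Shrink IMF samples toward zero; samples whose magnitude is
    below T = sigma * sqrt(2 ln N) are set to zero."""
    t = sigma * np.sqrt(2.0 * np.log(imf.size))
    return np.sign(imf) * np.maximum(np.abs(imf) - t, 0.0)

# Toy IMF: small noise samples plus two large oscillation samples.
imf = np.array([0.1, -0.2, 0.15, 5.0, -0.1, 0.05, -4.8, 0.2])
denoised = soft_threshold(imf, estimate_sigma(imf))
print(denoised)
```

In a full EMD denoiser this shrinkage is applied to each noisy IMF, and the estimated signal is reconstructed by summing the processed IMFs.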
Abstract: Speckle noise affects all coherent imaging systems
including medical ultrasound. In medical images, noise suppression
is a particularly delicate and difficult task. A tradeoff between noise
reduction and the preservation of actual image features has to be made
in a way that enhances the diagnostically relevant image content.
Even though wavelets have been extensively used for denoising
speckle images, we have found that denoising using contourlets gives
much better performance in terms of SNR, PSNR, MSE, variance and
correlation coefficient. The objective of the paper is to determine
the number of levels of Laplacian pyramidal decomposition, the number
of directional decompositions to perform at each pyramidal level, and
the thresholding scheme that together yield optimal despeckling of
medical ultrasound images in particular. The proposed method consists of the
log transformed original ultrasound image being subjected to contourlet
transform, to obtain contourlet coefficients. The transformed
image is denoised by applying thresholding techniques on individual
band pass sub bands using a Bayes shrinkage rule. We quantify the
achieved performance improvement.
Abstract: The soil moisture content is an important property of
the soil. The results of mean weekly gravimetric soil moisture
content, measured for the three soil layers within the A horizon,
showed that it was higher for the top 5 cm over the whole period of
monitoring (15/7/2004 to 10/11/2005), with the variation becoming
greater during winter. This reflects the pattern of rainfall in
Ireland which is spread over the whole year and shows that light
rainfall events during summer time were compensated by loss
through evapotranspiration, but only in the top 5 cm of soil. This
layer had the highest porosity and highest moisture holding capacity
due to the high content of organic matter. The gravimetric soil
moisture contents of the top 5 cm and the underlying 5-15 and 15-25
cm layers show that the bottom site of the Hill Field had a higher
soil moisture content than the middle and top sites during the whole
period of monitoring.
Abstract: This paper presents a new steganography approach suitable for Arabic texts. It can be classified as a feature-coding steganography method. The approach hides secret information bits within the letters, benefiting from their inherent points. To mark the specific letters holding secret bits, the scheme considers two features: the existence of points in the letters and the redundant Arabic extension character. We use pointed letters with extension to hold the secret bit 'one' and un-pointed letters with extension to hold 'zero'. This steganography technique is also attractive for other languages with scripts similar to Arabic, such as Persian and Urdu.
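The encoding rule can be sketched as follows (an illustrative sketch: the pointed/un-pointed letter sets below are partial, and the Arabic letter-joining rules that govern where an extension character may legally appear are ignored here):

```python
KASHIDA = "\u0640"  # Arabic extension character (tatweel)

# Illustrative, non-exhaustive classification of Arabic letters.
POINTED = set("بتثجخذزشضظغفقني")     # letters carrying points (dots)
UNPOINTED = set("احدرسصطعكلمهو")     # letters without points

def embed(cover, bits):
    """Hide bits by appending a kashida after a pointed letter for
    '1' and after an un-pointed letter for '0'."""
    out, i = [], 0
    for ch in cover:
        out.append(ch)
        if i < len(bits):
            if bits[i] == "1" and ch in POINTED:
                out.append(KASHIDA)
                i += 1
            elif bits[i] == "0" and ch in UNPOINTED:
                out.append(KASHIDA)
                i += 1
    return "".join(out)

def extract(stego):
    """Recover the hidden bits from letters followed by a kashida."""
    bits = []
    for prev, ch in zip(stego, stego[1:]):
        if ch == KASHIDA:
            if prev in POINTED:
                bits.append("1")
            elif prev in UNPOINTED:
                bits.append("0")
    return "".join(bits)

stego = embed("بسم الله الرحمن الرحيم", "10")
print(extract(stego))   # -> 10
```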
Abstract: Sharing a manufacturing facility through remote
operation and monitoring of a machining process is a challenge for
effective use of the production facility. Several automation tools,
in terms of hardware and software, are necessary for successful
remote operation of a machine. This paper presents a prototype
workpiece-holding attachment for remote operation of a milling
process by self-configuration of the workpiece setup. The prototype
is designed with a mechanism to reorient the work surface into the
machining spindle direction with high positioning accuracy. A variety
of part geometries can be held by the attachment to perform
single-setup machining. Pins arranged in an array pattern
additionally clamp the workpiece surface from two opposite directions
to increase machining rigidity. The optimum pin configuration for
conforming to the workpiece geometry with minimum deformation is
determined through hybrid algorithms: Genetic Algorithms (GA) and
Particle Swarm Optimization (PSO). The prototype with this
intelligent optimization technique is able to hold a wide variety of
workpiece geometries, making it suitable for machining low-volume
repetitive production in remote operation.
Abstract: In this paper, a new robust and efficient algorithm for automatic text extraction from colored book and journal cover sheets is proposed. First, we perform a wavelet transform. Next, for edge detection from the detail wavelet coefficients, we use a dynamic threshold. By blurring the approximation coefficients with an alternative heuristic thresholding, an effective edge map is achieved. Afterward, a binary image is obtained with an ROI technique. Finally, text boxes are extracted with a new projection profile.
Abstract: This study assesses the vulnerability of Bulgarian
agriculture to drought using the WINISAREG model and seasonal
standard precipitation index SPI(2) for the period 1951-2004. This
model was previously validated for maize on soils of different water
holding capacity (TAW) in various locations. Simulations are
performed for Plovdiv, Stara Zagora and Sofia. Results for Plovdiv
show that in soils of large TAW (180 mm m⁻¹), net irrigation
requirements (NIRs) range from 0-40 mm in wet years to 350-380 mm in
dry years. In soils of small TAW (116 mm m⁻¹), NIRs reach 440 mm in
the very dry year. NIRs in Sofia are about 80 mm smaller. Rainfed
maize is associated with great yield variability (29%
Abstract: This paper presents parametric probability density
models for call holding times (CHTs) in an emergency call center,
based on actual data collected over a week in the public Emergency
Information Network (EIN) in Mongolia. When the chosen set of
candidates from the Gamma distribution family is fitted to the call
holding time data, it is observed that the whole area of the CHT
empirical histogram is underestimated, due to spikes of higher
probability and long tails of lower probability in the histogram.
Therefore, we provide a parametric model based on a mixture of
lognormal distributions, with explicit analytical expressions, for
the modeling of the CHTs of PSNs. Finally, we show that the CHTs for
PSNs are fitted reasonably by a mixture of lognormal distributions
via the expectation-maximization algorithm. This result is
significant as it provides a useful mathematical tool, a mixture of
lognormal distributions, in an explicit manner.
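Since a mixture of lognormals on the holding times is exactly a Gaussian mixture on their logarithms, the EM fit can be sketched in log space as follows (a minimal sketch with synthetic durations; the two-component setup and the quantile-based initialization are illustrative assumptions, not the paper's procedure):

```python
import numpy as np

def fit_lognormal_mixture(durations, k=2, iters=200):
    """Fit a k-component lognormal mixture by EM on the log-durations."""
    x = np.log(durations)
    mu = np.quantile(x, [(i + 1) / (k + 1) for i in range(k)])
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each sample.
        dens = (w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)
                / np.sqrt(2.0 * np.pi * var))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, log-means and log-variances.
        nk = r.sum(axis=0)
        w = nk / x.size
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Synthetic holding times drawn from two lognormal populations.
rng = np.random.default_rng(1)
data = np.concatenate([rng.lognormal(2.0, 0.3, 500),
                       rng.lognormal(4.0, 0.4, 500)])
w, mu, var = fit_lognormal_mixture(data)
print(np.sort(mu))   # log-means recovered near 2.0 and 4.0
```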
Abstract: A new conceptual architecture for low-level neural
pattern recognition is presented. The key ideas are that the brain
implements support vector machines and that support vectors are
represented as memory patterns in competitive queuing memories. A
binary classifier is built from two competitive queuing memories
holding positive and negative valence training examples respectively.
The support vector machine classification function is calculated in
synchronized evaluation cycles. The kernel is computed by bisymmetric
feed-forward networks fed by sensory input and by competitive queuing
memories traversing the complete sequence of support vectors.
Temporal summation generates the output classification. It is
speculated that the perception apparatus in the brain reuses
structures that evolved to enable fluent execution of prepared action
sequences, so that pattern recognition is built on internalized motor
programmes.
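The classification function referenced above has the standard support-vector form f(x) = Σᵢ αᵢ yᵢ K(x, vᵢ) + b; a minimal sketch (the 2-D support vectors, coefficients and RBF kernel below are hypothetical illustrations, not the proposed neural implementation):

```python
import numpy as np

def rbf_kernel(x, v, gamma=1.0):
    """Gaussian (RBF) kernel between input x and support vector v."""
    return np.exp(-gamma * np.sum((x - v) ** 2))

def svm_decision(x, support_vectors, alphas, labels, bias=0.0):
    """Accumulate a_i * y_i * K(x, v_i) over all support vectors,
    mirroring the temporal summation over evaluation cycles."""
    total = bias
    for v, a, y in zip(support_vectors, alphas, labels):
        total += a * y * rbf_kernel(x, v)
    return np.sign(total)

# One positive-valence and one negative-valence support vector.
svs = [np.array([1.0, 1.0]), np.array([-1.0, -1.0])]
alphas = [1.0, 1.0]
labels = [+1, -1]
print(svm_decision(np.array([0.9, 1.1]), svs, alphas, labels))   # -> 1.0
```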
Abstract: This paper describes a novel projection algorithm, the Projection Onto Span Algorithm (POSA), for wavelet-based super-resolution and for removing speckle of unknown variance (in the wavelet domain) from Synthetic Aperture Radar (SAR) images. Although POSA is a good new super-resolution algorithm for image enhancement, image metrology and biometric identification, here it is used as a despeckling tool; this is the first time a super-resolution algorithm is used for despeckling SAR images. Specifically, the speckled SAR image is decomposed into wavelet subbands, POSA is applied to the high-frequency subbands, and a SAR image is reconstructed from the modified detail coefficients. Experimental results demonstrate that the new method compares favorably to several other despeckling methods on test SAR images.
Abstract: In this paper, the supply policy and procurement of
shared resources in certain kinds of concurrent construction projects
are investigated. This is oriented to the problems of holding
construction companies that are involved in different projects
concurrently and have to supply limited resources to several projects
while preventing delays in any of them. Limits on transportation
vehicles and storage facilities for potential construction materials,
as well as the available resources (such as cash or manpower), are
examples of factors that considerably affect the management of all
the projects overall. The research includes an investigation of some
real multi-storey buildings during their execution periods and a
survey of the history of their activities. It is shown that the
common resource-demand variation curve of the projects may be
expanded or displaced to achieve an optimum distribution scheme. This
may cause some delay to some projects, but it has minimal influence
on the overall execution period of all the projects, while its
influence on the procurement cost of the projects is considerable.
These observations, from the investigation of several multi-storey
buildings built in Iran, are presented in this paper.
Abstract: In this work, a novel approach for color image
segmentation using higher-order entropy as a textural feature for the
determination of thresholds over a two-dimensional image histogram is
discussed. A similar approach is applied to achieve multi-level
thresholding in both grayscale and color images. The paper discusses
two methods of color image segmentation using the RGB space as the
standard processing space. The threshold for segmentation is decided
by maximizing the conditional entropy in the two-dimensional
histogram of the color image, separated into three grayscale images
of R, G and B. The features are first developed independently for the
three (R, G, B) spaces and then combined to obtain different
color-component segmentations. Considering local maxima instead of
the maximum of the conditional entropy yields multiple thresholds for
the same image, which forms the basis for multilevel thresholding.
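As a one-dimensional simplification of the entropy-maximization rule described above (the full method works on a two-dimensional histogram per color plane), a Kapur-style maximum-entropy threshold can be sketched as:

```python
import numpy as np

def max_entropy_threshold(hist):
    """Pick the threshold t that maximizes the sum of the entropies
    of the below-t and above-t class distributions (Kapur's rule)."""
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p)):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        q0 = p[:t][p[:t] > 0] / w0       # background distribution
        q1 = p[t:][p[t:] > 0] / w1       # foreground distribution
        h = -(q0 * np.log(q0)).sum() - (q1 * np.log(q1)).sum()
        if h > best_h:
            best_h, best_t = h, t
    return best_t

# Bimodal toy histogram: dark mode near bin 2, bright mode near bin 10.
hist = np.array([1, 5, 9, 5, 1, 0, 0, 0, 1, 4, 8, 4, 1, 0, 0, 0], float)
t = max_entropy_threshold(hist)
print(t)   # threshold lands at or near the valley between the modes
```

Collecting the local maxima of the same criterion, rather than only its global maximum, gives the multiple thresholds used for multilevel thresholding.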