Abstract: This paper proposes a novel model for short-term load forecasting (STLF) in the electricity market. The model is composed of several neural networks whose input data are preprocessed using a wavelet technique, and it is implemented as a simulation program written in MATLAB. The historical electricity load data are treated as time series and decomposed into several wavelet coefficient series using the Non-decimated Wavelet Transform (NWT); this technique is chosen for its potential to extract hidden patterns from the time series data. The wavelet coefficient series are used to train the neural networks (NNs) and serve as the NN inputs for electricity load prediction, with the Scaled Conjugate Gradient (SCG) algorithm as the learning algorithm. To obtain the final forecast, the outputs of the NNs are recombined using the same wavelet technique. The model was evaluated on the electricity load data of the Electronic Engineering Department of Mandalay Technological University in Myanmar. The simulation results showed that the model was capable of producing reasonable forecasting accuracy in STLF.
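Below is a minimal sketch of the decompose-train-recombine pipeline this abstract describes, assuming synthetic hourly load data, PyWavelets' stationary wavelet transform (swt) as the non-decimated transform, and scikit-learn's MLPRegressor in place of an SCG-trained network (scikit-learn does not provide SCG, so the lbfgs solver stands in). The lag length, wavelet, and network size are illustrative assumptions, not the paper's settings.

```python
# Sketch: wavelet-decomposed neural forecasting, one NN per coefficient series.
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

N_LAGS, HORIZON, LEVEL = 24, 4, 2   # HORIZON = 2**LEVEL keeps lengths valid for iswt

def windows(series):
    """Sliding windows: predict the next HORIZON values from the last N_LAGS."""
    n = len(series) - N_LAGS - HORIZON + 1
    X = np.stack([series[i:i + N_LAGS] for i in range(n)])
    y = np.stack([series[i + N_LAGS:i + N_LAGS + HORIZON] for i in range(n)])
    return X, y

rng = np.random.default_rng(0)
t = np.arange(24 * 40)                       # 40 days of synthetic hourly load
load = 100 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 2, t.size)

# Decompose into approximation/detail coefficient series (non-decimated).
coeffs = pywt.swt(load, "db4", level=LEVEL)  # [(cA2, cD2), (cA1, cD1)]

# Train one small NN per coefficient series and extend each by HORIZON steps.
extended = []
for cA, cD in coeffs:
    pair = []
    for series in (cA, cD):
        X, y = windows(series)
        nn = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                          max_iter=2000).fit(X, y)
        ahead = nn.predict(series[-N_LAGS:].reshape(1, -1))[0]
        pair.append(np.concatenate([series, ahead]))
    extended.append(tuple(pair))

# Recombine the extended coefficient series; the tail is the load forecast.
forecast = pywt.iswt(extended, "db4")[-HORIZON:]
print("next-%d-hour forecast:" % HORIZON, forecast)
```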
Abstract: Some meta-schedulers query the information system of individual supercomputers in order to submit jobs to the least busy supercomputer on a computational Grid. However, this information can become outdated by the time a job starts due to changes in scheduling priorities. The MSR scheme, based on Multiple Simultaneous Requests, can take advantage of opportunities resulting from these priority changes. This paper presents the SWARM meta-scheduler, which can speed up the execution of large sets of tasks by minimizing job queuing time through the submission of multiple requests. Performance tests have shown that this new meta-scheduler is faster than an implementation of the MSR scheme and than the gLite meta-scheduler. SWARM has been used through the GridQTL project beta-testing portal during the past year. Usage statistics are provided and demonstrate its capacity to reliably achieve a substantial reduction of execution time under production conditions.
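As a rough illustration of why multiple simultaneous requests help, the following toy simulation (not SWARM's or gLite's actual API, which the abstract does not describe) compares submitting to the advertised least busy site against requesting everywhere and keeping the first replica that starts; priority changes are modelled by letting the actual queue delay differ from the advertised one.

```python
# Toy comparison of "least busy site" scheduling vs. the MSR idea.
import random

class Site:
    def __init__(self, name):
        self.name = name
        # Delay as advertised by the information system vs. as it actually
        # turns out: priority changes can slow the "least busy" site down
        # after submission.
        self.advertised_delay = random.randint(1, 100)
        self.actual_delay = random.randint(1, 100)

def least_busy_first(sites):
    """Classic strategy: trust the information system, pick one site."""
    chosen = min(sites, key=lambda s: s.advertised_delay)
    return chosen.actual_delay

def multiple_simultaneous_requests(sites):
    """MSR strategy: request everywhere, first replica to start wins."""
    winner = min(sites, key=lambda s: s.actual_delay)
    # The remaining requests would be cancelled here in a real meta-scheduler.
    return winner.actual_delay

random.seed(0)
trials = [[Site(f"site-{i}") for i in range(5)] for _ in range(1000)]
print("least-busy mean wait:", sum(least_busy_first(t) for t in trials) / 1000)
print("MSR mean wait:       ",
      sum(multiple_simultaneous_requests(t) for t in trials) / 1000)
```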
Abstract: In recent years everything has been trending toward digitalization, and with the rapid development of Internet technologies, digital media must be transmitted conveniently over the network. Attacks on, misuse of, and unauthorized access to information are of great concern today, which makes the protection of documents transmitted through digital media a priority. This urges us to devise new data hiding techniques to protect and secure data of vital significance. In this respect, steganography often comes to the fore as a tool for hiding information. Steganography is a process that involves hiding a message in an appropriate carrier such as an image or audio file. The term is of Greek origin and means "covered or hidden writing". The goal of steganography is covert communication: the carrier can be sent to a receiver without anyone other than the authenticated receiver knowing that the information exists. A considerable amount of work on steganography has been carried out by different researchers. In this work the authors propose a novel steganographic method for hiding information within the spatial domain of a grayscale image. The proposed approach selects the embedding pixels using a mathematical function, finds the 8-neighborhood of each selected pixel, and maps each bit of the secret message onto a neighboring pixel's coordinate position in a specified manner. Before embedding, a check is performed to determine whether the selected pixel or any of its neighbors lies on the boundary of the image. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
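The abstract does not give the pixel-selection function or the exact bit-to-coordinate mapping, so the sketch below substitutes a simple grid-based selection and plain LSB embedding in the 8-neighborhood, keeping the boundary check the method requires; it illustrates the shape of the scheme rather than the paper's actual algorithm.

```python
# Illustrative stand-in for the 8-neighborhood embedding scheme.
import numpy as np

NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
              (0, 1), (1, -1), (1, 0), (1, 1)]       # the 8-neighborhood

def select_pixels(shape, stride=3):
    """Toy selection function: a stride-3 grid keeps neighborhoods disjoint."""
    h, w = shape
    for r in range(1, h - 1, stride):
        for c in range(1, w - 1, stride):
            yield r, c

def embed(cover, bits):
    stego = cover.copy()
    h, w = cover.shape
    stream = iter(bits)
    for r, c in select_pixels(cover.shape):
        # Boundary check from the paper: the whole 8-neighborhood must lie
        # inside the image before any bit is embedded around this pixel.
        if r < 1 or c < 1 or r >= h - 1 or c >= w - 1:
            continue
        for dr, dc in NEIGHBOURS:                    # one secret bit per neighbour
            bit = next(stream, None)
            if bit is None:
                return stego                          # message fully embedded
            stego[r + dr, c + dc] = (stego[r + dr, c + dc] & 0xFE) | bit
    return stego

cover = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
secret = [int(b) for ch in "hi" for b in format(ord(ch), "08b")]
stego = embed(cover, secret)
print("pixels changed:", int((stego != cover).sum()))
```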
Abstract: One of the determinants of a firm's prosperity is the customers' perceived service quality and satisfaction. While service quality is wide in scope and consists of various dimensions, these dimensions may differ in their relative importance in affecting customers' overall satisfaction with service quality. Identifying the relative rank of the different dimensions of service quality is very important in that it can help managers find out which service dimensions have a greater effect on customers' overall satisfaction. Such an insight will consequently lead to more effective resource allocation, which will finally result in higher levels of customer satisfaction. This issue, despite its criticality, has not received enough attention so far. Therefore, using a sample of 240 bank customers in Iran, an artificial neural network is developed to address this gap in the literature. As customers' evaluation of service quality is a subjective process, artificial neural networks, as a brain metaphor, have the potential to model such a complicated process. Proposing a neural network that is able to predict customers' overall satisfaction with service quality with a promising level of accuracy is the first contribution of this study. In addition, prioritizing the service quality dimensions by their effect on customers' overall satisfaction, using sensitivity analysis of the neural network, is the second important finding of this paper.
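A minimal sketch of the two-step idea follows, assuming five SERVQUAL-style dimension names, synthetic survey scores, scikit-learn's MLPRegressor, and a simple perturbation-based sensitivity measure; the paper's actual network architecture, data, and sensitivity procedure are not reproduced here.

```python
# (1) Fit a small MLP predicting overall satisfaction from dimension scores;
# (2) rank the dimensions by perturbation-based sensitivity.
import numpy as np
from sklearn.neural_network import MLPRegressor

DIMENSIONS = ["tangibles", "reliability", "responsiveness",
              "assurance", "empathy"]                 # assumed dimension names

rng = np.random.default_rng(0)
X = rng.uniform(1, 7, size=(240, len(DIMENSIONS)))    # 240 customers, 7-pt scale
true_w = np.array([0.1, 0.35, 0.2, 0.15, 0.2])        # synthetic ground truth
y = X @ true_w + rng.normal(0, 0.3, 240)              # overall satisfaction

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                     random_state=0).fit(X, y)

def sensitivity(model, X, j, delta=0.5):
    """Mean absolute output change when dimension j is nudged by delta."""
    X_up = X.copy()
    X_up[:, j] += delta
    return np.mean(np.abs(model.predict(X_up) - model.predict(X)))

scores = {d: sensitivity(model, X, j) for j, d in enumerate(DIMENSIONS)}
for d, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{d:15s} {s:.3f}")    # higher = bigger effect on satisfaction
```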
Abstract: This paper presents a source extraction system that can extract only target signals, with constraints on source localization, in on-line systems. The proposed system is a method for enhancing a target signal while suppressing other interference signals. The performance of the proposed system is superior to that of other methods, and the extraction of the target source is comparatively complete. The method is based on a beamforming concept and uses an improved time-frequency (TF) mask-based blind source separation (BSS) algorithm to separate a target signal from multiple noise sources. The target sources are assumed to be in front, and the test data were recorded in a reverberant room. The proposed method was evaluated using the PESQ scores of real-recorded sentences and showed a noticeable speech enhancement.
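The abstract does not detail the improved BSS algorithm, so the sketch below only illustrates the underlying TF-masking idea on a two-microphone toy signal: a frontal target arrives in phase on both channels, so TF bins with near-zero inter-channel phase difference are kept and the rest are suppressed. The signal parameters and mask threshold are illustrative assumptions.

```python
# Toy two-channel TF-mask extraction of a frontal target.
import numpy as np
from scipy.signal import stft, istft

fs = 16000
t = np.arange(fs) / fs                      # one second of audio
rng = np.random.default_rng(0)

target = np.sin(2 * np.pi * 440 * t)        # frontal target: in phase on both mics
noise = rng.normal(0.0, 0.5, fs)            # interferer arriving off-axis
mic1 = target + noise
mic2 = target + np.roll(noise, 8)           # 8-sample inter-mic delay for the noise

_, _, X1 = stft(mic1, fs, nperseg=512)
_, _, X2 = stft(mic2, fs, nperseg=512)

# Binary TF mask: keep bins where the channels are (nearly) in phase,
# i.e. bins dominated by the frontal target.
phase_diff = np.angle(X1 * np.conj(X2))
mask = (np.abs(phase_diff) < 0.2).astype(float)

_, extracted = istft(mask * X1, fs, nperseg=512)
print("TF bins kept: %.1f%%" % (100 * mask.mean()))
```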
Abstract: Deformable active contours are widely used in computer vision and image processing applications for image segmentation, especially in biomedical image analysis. The active contour, or "snake", deforms towards a target object under the control of internal, image and constraint forces. However, if the contour is initialized with too few control points, there is a high probability of surpassing the sharp corners of the object during deformation. In this paper, a new technique is proposed to construct the initial contour by incorporating prior knowledge of the significant corners of the object, detected using the Harris operator. This reconstructed contour then deforms, attracting the snake towards the target object without missing the corners. Experimental results on several synthetic images show the ability of the new technique to handle sharp corners with higher accuracy than traditional methods.
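A minimal sketch of corner-aware snake initialization on a synthetic square follows, assuming scikit-image's Harris detector and active_contour; snapping detected corners onto the nearest point of a circular initial contour is an illustrative choice, since the abstract does not give the exact reconstruction rule.

```python
# Harris corners seed the initial contour so the snake keeps sharp corners.
import numpy as np
from skimage.feature import corner_harris, corner_peaks
from skimage.filters import gaussian
from skimage.segmentation import active_contour

# Synthetic target: a bright square whose four corners a sparse circular
# snake would normally cut across.
img = np.zeros((200, 200))
img[60:140, 60:140] = 1.0
img = gaussian(img, sigma=2)

corners = corner_peaks(corner_harris(img), min_distance=10)   # (row, col)

# Initial contour: a circle around the object, with each detected corner
# snapped onto its nearest contour point.
theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
snake_init = np.column_stack([100 + 70 * np.sin(theta),
                              100 + 70 * np.cos(theta)])      # (row, col)
for corner in corners:
    i = np.argmin(np.linalg.norm(snake_init - corner, axis=1))
    snake_init[i] = corner            # pin a control point on the corner

snake = active_contour(img, snake_init, alpha=0.015, beta=2.0, gamma=0.01)
print("final contour points:", snake.shape)
```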
Abstract: Short term electricity demand forecasts are required
by power utilities for efficient operation of the power grid. In a
competitive market environment, suppliers and large consumers also
require short term forecasts in order to estimate their energy
requirements in advance. Electricity demand is influenced (among
other things) by the day of the week, the time of year and special
periods and/or days such as Ramadhan, all of which must be
identified prior to modelling. This identification, known as day-type
identification, must be included in the modelling stage either by
segmenting the data and modelling each day-type separately or by
including the day-type as an input. Day-type identification is the
main focus of this paper. A Kohonen map is employed to identify the
separate day-types in Algerian data.
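As an illustration of day-type identification with a Kohonen map, the sketch below clusters synthetic daily load profiles (weekday, weekend, and a Ramadhan-like special period) using the MiniSom library; the map size, training length, and profile shapes are assumptions for demonstration, not the Algerian data or the paper's configuration.

```python
# Day-type identification: each day's 24-hour profile is one SOM input vector.
import numpy as np
from collections import Counter
from minisom import MiniSom

rng = np.random.default_rng(0)
hours = np.arange(24)

# Synthetic daily load shapes standing in for real data.
shapes = {
    "weekday": 1.0 + 0.5 * np.sin((hours - 6) * np.pi / 12).clip(0),
    "weekend": 0.8 + 0.2 * np.sin((hours - 9) * np.pi / 12).clip(0),
    "special": 0.9 + 0.4 * np.sin((hours - 14) * np.pi / 12).clip(0),
}
labels = (["weekday"] * 5 + ["weekend"] * 2) * 52   # one year of days
labels[100:130] = ["special"] * 30                   # Ramadhan-like period
profiles = np.stack([shapes[d] + rng.normal(0, 0.03, 24) for d in labels])

# 3x3 Kohonen map trained on the daily profiles.
som = MiniSom(3, 3, 24, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(profiles, 5000)

# Days mapped to the same node share a day-type; compare with true labels.
assignments = [som.winner(p) for p in profiles]
print(Counter(zip(assignments, labels)))
```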
Abstract: The fair share objective has recently been incorporated into the goal-oriented parallel computer job scheduling policy. However, previous work presented only the overall scheduling performance, so an evaluation of the policy's per-user performance is still lacking. In this work, the per-user fair share performance under the Tradeoff-fs(Tx:avgX) policy is evaluated in detail. A basic fair share priority backfill policy, namely RelShare(1d), is also studied. The performance of all policies is collected using an event-driven simulator with three real job traces as input. The experimental results show that high-demand users usually benefit under most policies, either because their jobs are large or because they have many jobs. In the large-job case, the execution of a single job may result in over-share during that period. In the many-jobs case, the jobs may be backfilled, which improves their performance. However, users with a mixture of job sizes may suffer because, while their smaller jobs are executing, the priority of their remaining jobs is lowered. Further analysis shows no significant impact from users with many jobs or from users with large runtime approximation errors.
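For concreteness, here is a minimal sketch of how a fair share priority can order a backfill wait queue: each user's consumed share of recent usage is compared with an equal target share, and jobs from under-share users are placed first. The formula and example numbers are illustrative; the Tradeoff-fs(Tx:avgX) and RelShare(1d) policies studied in the paper are more elaborate than this.

```python
# Fair-share ordering of a wait queue: under-share users go first.
from dataclasses import dataclass

@dataclass
class Job:
    user: str
    cpus: int
    runtime: float        # requested runtime (hours)

def fair_share_order(queue, usage, n_users):
    """Sort the wait queue: most under-share user first, then smaller jobs."""
    total = sum(usage.values()) or 1.0
    target = 1.0 / n_users                      # equal target share per user
    def priority(job):
        consumed = usage.get(job.user, 0.0) / total
        over_share = consumed - target           # > 0 means user is over-share
        return (over_share, job.cpus * job.runtime)
    return sorted(queue, key=priority)

# Usage so far (cpu-hours): "heavy" has consumed far more than its share,
# so "light"'s job jumps ahead even though it arrived later.
usage = {"heavy": 900.0, "light": 100.0}
queue = [Job("heavy", 64, 2.0), Job("heavy", 8, 1.0), Job("light", 16, 3.0)]
for job in fair_share_order(queue, usage, n_users=2):
    print(job)
```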