Abstract: Forecasting electricity load is important for purposes such as planning, operation and control. Accurate forecasts can save operating and maintenance costs, increase the reliability of power supply and delivery systems, and support correct decisions for future development. This paper compares various time series methods for forecasting electricity load 24 hours ahead. The methods considered are Holt-Winters smoothing, SARIMA modeling, an LSTM network, Fbprophet and TensorFlow Probability. The performance of each method is evaluated using two forecasting accuracy criteria, namely the Mean Absolute Error and the Root Mean Square Error. The National Renewable Energy Laboratory (NREL) residential energy consumption data are used to train the models. The results of this study show that the SARIMA model is superior to the others for 24-hour-ahead forecasts. Furthermore, a bagging technique is used to make the predictions more robust. The obtained results show that by bagging multiple time-series forecasts we can improve the robustness of the models for 24-hour-ahead electricity load forecasting.
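The bagging step described above can be illustrated with a minimal sketch. This is not the paper's implementation: the base forecaster here is a deliberately simple placeholder (mean of the last day), and the moving-block bootstrap is one common way to resample a time series while preserving short-range structure.

```python
import random

def naive_forecast(series, horizon):
    # Placeholder base forecaster: repeat the mean of the last 24 points.
    mean = sum(series[-24:]) / min(len(series), 24)
    return [mean] * horizon

def bagged_forecast(series, horizon, n_boot=50, block=24, seed=0):
    """Average forecasts fitted on moving-block bootstrap resamples
    of the history (a minimal sketch of bagging for time series)."""
    rng = random.Random(seed)
    forecasts = []
    for _ in range(n_boot):
        # Resample contiguous blocks until we match the original length.
        resampled = []
        while len(resampled) < len(series):
            start = rng.randrange(0, max(1, len(series) - block))
            resampled.extend(series[start:start + block])
        forecasts.append(naive_forecast(resampled[:len(series)], horizon))
    # Bagged prediction: pointwise mean across the bootstrap forecasts.
    return [sum(f[h] for f in forecasts) / n_boot for h in range(horizon)]

history = [100 + 20 * ((h % 24) >= 8) for h in range(240)]  # toy hourly load
print(bagged_forecast(history, horizon=24)[:3])
```

In practice each bootstrap resample would be fed to a stronger base model (e.g., SARIMA) before averaging; the aggregation step stays the same.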
Abstract: Urban development and the growing need for new infrastructure have increased the importance of deep excavations. In this study, after introducing probability analysis as an important issue, an attempt has been made to apply it to the deep excavation project of the Bangkok Metro as a case study. For this purpose, a numerical probabilistic model has been developed based on the Finite Difference Method and a Monte Carlo sampling approach. The results indicate that disregarding probability in this project would result in an inappropriate design of the retaining structure. Therefore, a probabilistic redesign of the support is proposed and carried out as one application of probability analysis. A 50% reduction in the flexural strength of the structure increases the failure probability by just 8%, which remains within the allowable range, and improves the economics of the design while maintaining mechanical efficiency. Given the lack of efficient design in most deep excavations, an attempt was made, by considering geometrical and geotechnical variability, to develop an optimal practical design standard for deep excavations based on failure probability. On this basis, a practical relationship is presented for estimating the maximum allowable horizontal displacement, which can help improve design conditions without carrying out a full probability analysis.
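The Monte Carlo estimation of a failure probability can be sketched as follows. This is only an illustration of the sampling principle: the limit-state function g = R − S with hypothetical normal resistance and load is a stand-in for the finite-difference model response used in the study, and all numbers are invented.

```python
import random, math

def monte_carlo_pf(n=100_000, seed=1):
    """Estimate the failure probability Pf = P(g < 0) by Monte Carlo
    sampling of a hypothetical limit state g(X) = R - S; in a real
    analysis g would come from the finite-difference model."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        R = rng.gauss(120.0, 15.0)   # resistance (assumed units and moments)
        S = rng.gauss(80.0, 20.0)    # load effect (assumed)
        if R - S < 0:
            failures += 1
    return failures / n

pf = monte_carlo_pf()
# For independent normals the analytic check is Pf = Phi(-beta),
# beta = (muR - muS) / sqrt(sR^2 + sS^2).
beta = (120.0 - 80.0) / math.sqrt(15.0**2 + 20.0**2)
print(round(pf, 4), round(0.5 * math.erfc(beta / math.sqrt(2)), 4))
```

The analytic value exists here only because the toy limit state is linear in normals; with a numerical model one relies on the sampled estimate alone.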
Abstract: This research presents the first constant-factor approximation algorithm for the p-median network design problem with multiple cable types. This problem has previously been addressed only for a single cable type, for which a bifactor approximation algorithm exists. To the best of our knowledge, the algorithm proposed in this paper is the first constant-factor approximation algorithm for the p-median network design problem with multiple cable types. The addressed problem is a combination of two well-studied problems, namely the p-median problem and the network design problem. The introduced algorithm is a constant-factor random sampling approximation algorithm, conceived using random sampling techniques from the literature. It is based on a redistribution lemma from the literature and on a Steiner tree problem as a subproblem. The algorithm is simple, and it relies on the notions of random sampling and probability. The proposed approach gives an approximate solution with a single constant ratio without violating any of the constraints, in contrast to the one proposed in the literature. This paper provides a (21 + 2)-approximation algorithm for the p-median network design problem with multiple cable types using random sampling techniques.
Abstract: High-resolution rain data are very important as input to hydrological models. Among models for generating high-resolution rainfall data, temporal disaggregation was chosen for this study. The paper attempts to generate three different rainfall resolutions (4-hourly, hourly and 10-minute) from daily data for a roughly 20-year record period. The process was carried out with the DiMoN tool, which is based on the random cascade model and the method of fragments. Differences between the observed and simulated rain datasets are evaluated with a variety of statistical and empirical methods: the Kolmogorov-Smirnov (K-S) test, the usual statistics, and exceedance probability. The tool worked well at preserving the daily rainfall values on wet days; however, the generated data are concentrated in shorter time periods, producing stronger storms. It is demonstrated that the difference between the generated and observed cumulative distribution function curves of the 4-hourly datasets passes the K-S test criteria, while for the hourly and 10-minute datasets the p-value had to be employed to show that their differences were reasonable. The results are encouraging considering the overestimation in the generated high-resolution rainfall data.
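The two-sample K-S comparison used above reduces to one number: the maximum vertical gap between the two empirical CDFs. A minimal stdlib sketch (the sample values are invented; in the study the inputs would be the observed and disaggregated rainfall series):

```python
def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the empirical CDFs of the two samples."""
    a, b = sorted(sample_a), sorted(sample_b)
    points = sorted(set(a) | set(b))
    d = 0.0
    for x in points:
        fa = sum(1 for v in a if v <= x) / len(a)
        fb = sum(1 for v in b if v <= x) / len(b)
        d = max(d, abs(fa - fb))
    return d

observed  = [0.0, 1.2, 3.5, 7.8, 12.0, 20.5]   # toy 4-hourly depths (mm)
simulated = [0.0, 1.0, 3.9, 8.1, 11.4, 19.8]
print(round(ks_statistic(observed, simulated), 3))
```

The statistic is then compared against the K-S critical value (or converted to a p-value) for the given sample sizes; `scipy.stats.ks_2samp` performs both steps in one call.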
Abstract: In this paper, we propose a labeling-based RANSAC algorithm for lane detection. Advanced driver assistance systems (ADAS) have been widely researched to avoid unexpected accidents, and lane detection is a necessary component for lane keeping assistance and lane departure prevention. The proposed vision-based lane detection method applies Canny edge detection, inverse perspective mapping (IPM), the K-means algorithm, mathematical morphology operations and 8-connected-component labeling. Next, random samples are selected from each labeled region for RANSAC; this sampling method selects points on the lane with high probability. Finally, the lane parameters of straight-line or curve equations are estimated. Through simulations on video recorded at daytime and nighttime, we show that the proposed method outperforms the existing RANSAC algorithm in various environments.
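The core RANSAC step, fitting a lane model while ignoring outlier edge points, can be sketched for the straight-line case as follows (toy points; the paper's version samples within labeled regions and also handles curve models):

```python
import random

def ransac_line(points, n_iter=200, tol=1.0, seed=0):
    """Fit a line y = m*x + c by RANSAC: repeatedly sample two points,
    fit the exact line through them, and keep the model with the most
    inliers (points within tol of the line)."""
    rng = random.Random(seed)
    best_model, best_inliers = None, -1
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # vertical pair; skip in this simple parameterization
        m = (y2 - y1) / (x2 - x1)
        c = y1 - m * x1
        inliers = sum(1 for (x, y) in points if abs(y - (m * x + c)) < tol)
        if inliers > best_inliers:
            best_model, best_inliers = (m, c), inliers
    return best_model, best_inliers

# Lane-like points on y = 0.5x + 2 plus a few gross outliers.
pts = [(x, 0.5 * x + 2) for x in range(20)] + [(3, 40), (7, -30), (15, 55)]
(m, c), n_in = ransac_line(pts)
print(round(m, 2), round(c, 2), n_in)
```

Restricting the sampling to a single labeled region, as the paper proposes, raises the probability that both sampled points belong to the same lane marking, so fewer iterations are wasted on outlier pairs.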
Abstract: Climate change has both negative and positive effects on agricultural production. For agriculture to be sustainable under adverse climate change conditions, some natural measures are needed. The issue is to produce more food with the available natural resources while reducing the contribution of agriculture to climate change. This study reviewed climate change and sustainable agriculture in southeast Nigeria. The data for the study came from secondary sources: ten scientific papers were consulted, and data for the review were collected from three of them. The objectives of the paper were to review the effect of climate change on one major arable crop in southeast Nigeria (yam; Dioscorea rotundata), the evidence of climate change impact, and methods for sustainable agricultural production under adverse weather conditions. Some climatic parameters, such as sunshine, relative humidity and rainfall, have a negative relationship with yam production, significant at the 10% probability level. Crop production was predicted to decline by 25% per hectare by 2060, while for livestock the increased incidence of diseases and pathogens was identified as the major effect on agriculture. Methods for sustainable agriculture and the damage to natural resources caused by climate change were highlighted. Agriculture needs to be transformed as the climate changes to enable the sector to remain sustainable, and a policy should be put in place to facilitate the integration of sustainability into Nigerian agriculture.
Abstract: The principal purpose of this paper is to determine the influence of the maximum fatigue load on the probabilistic aspects of the fatigue crack propagation life at a specified grown crack length in magnesium alloys. Fatigue crack propagation experiments were carried out in laboratory air under different maximum fatigue loads to obtain data for the statistical analysis. To analyze the probabilistic aspects of the fatigue crack propagation life, a goodness-of-fit test for the probability distribution of the fatigue crack propagation life at a specified grown crack length was implemented using the Anderson-Darling test. The appropriate probability distribution of the fatigue crack propagation life was also verified under the different maximum fatigue load conditions.
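The Anderson-Darling test statistic for normality can be computed from scratch with the standard formula, shown below with invented toy life data (the paper's data and the candidate distributions it actually tests are not reproduced here; `scipy.stats.anderson` offers the same computation with critical values):

```python
import math

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def anderson_darling_A2(data):
    """Anderson-Darling A^2 statistic for normality with mean and
    standard deviation estimated from the sample; smaller values
    indicate a better fit."""
    n = len(data)
    mu = sum(data) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in data) / (n - 1))
    z = sorted(normal_cdf(x, mu, sigma) for x in data)
    # A^2 = -n - (1/n) * sum_{i=1..n} (2i-1) [ln z_(i) + ln(1 - z_(n+1-i))]
    s = sum((2 * i + 1) * (math.log(z[i]) + math.log(1 - z[n - 1 - i]))
            for i in range(n))
    return -n - s / n

lives = [12.1, 13.4, 11.8, 12.9, 13.1, 12.5, 12.2, 13.0]  # toy crack-life data
print(round(anderson_darling_A2(lives), 3))
```

The statistic is then compared with the critical value for the chosen significance level (adjusted when the parameters are estimated from the data, as here).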
Abstract: This study presents a conformational model of the helical structures of globular proteins, particularly ferritin, in the framework of the white noise path integral formulation, using associated Legendre functions, Bessel functions, and convolutions of Bessel and trigonometric functions as modulating functions. The model incorporates the chirality features of proteins and their helix-turn-helix structural sequence motif.
Abstract: The purpose of this study is to determine the relationship between anxiety level and gender among undergraduates at a private university in Malaysia. A convenience sampling method was used, in which students were selected based on the grouping assigned by the faculty. A total of 214 undergraduates who had registered for probability courses participated in the study. The Mathematics Anxiety Rating Scale (MARS) was the instrument used to determine students' anxiety level towards probability. The reliability and validity of the instrument were established before the main study was conducted. In the main study, students were briefed about the study; participation was voluntary, and students were given a consent form to indicate whether they agreed to participate. Two weeks were given for students to complete the online questionnaire. The collected data were analyzed using the Statistical Package for the Social Sciences (SPSS) to determine the level of anxiety. Three anxiety levels were defined: low, average and high. A student's anxiety level was determined by comparing the score obtained with the mean and standard deviation: scores more than one standard deviation below the mean indicated low anxiety, scores within one standard deviation of the mean indicated average anxiety, and scores more than one standard deviation above the mean indicated high anxiety. Results showed that both genders had an average anxiety level. At each of the low, average and high anxiety levels, the frequency of males was found to be higher than that of females, and the mean value obtained for males (M = 3.62) was higher than that for females (M = 3.42). For the difference in anxiety level between genders to be significant, the p-value would have to be less than .05; the p-value obtained in this study was .117, which is greater than .05. Thus, there was no significant difference in anxiety level between the genders; in other words, anxiety level was not related to gender.
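The three-level classification rule described in the abstract (compare each score against the group mean plus or minus one standard deviation) can be written directly; the mean and standard deviation used below are illustrative, not the study's values:

```python
def anxiety_level(score, mean, sd):
    """Classify a MARS score against the group mean and standard
    deviation: more than one SD below -> low, within one SD -> average,
    more than one SD above -> high."""
    if score < mean - sd:
        return "low"
    if score > mean + sd:
        return "high"
    return "average"

# Hypothetical group statistics for illustration.
print([anxiety_level(s, mean=3.5, sd=0.6) for s in (2.5, 3.6, 4.4)])
```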
Abstract: In this research the effect of moisture at three levels (47, 57 and 67% w.b.) on the physical properties of the Pofaki pea variety, including dimensions, geometric mean diameter, volume, sphericity index and surface area, was determined. The influence of the same moisture levels, two loading orientations (longitudinal and transverse) and three loading speeds (4, 6 and 8 mm min-1) on the mechanical properties of the pea, such as maximum deformation, rupture force, rupture energy, toughness and the power required to break the pea, was also investigated. For the physical properties, moisture changes had a significant effect at the 1% probability level on dimensions, geometric mean diameter, volume, sphericity index and surface area. For the mechanical properties, moisture changes had a significant effect at the 1% level on maximum deformation, rupture force, rupture energy, toughness and breaking power. Loading speed had a significant effect on maximum deformation, rupture force and rupture energy at the 1% level, and on toughness at the 5% level. Loading orientation had a significant effect on maximum deformation, rupture force, rupture energy and toughness at the 1% level, and on power at the 5% level. The interaction of speed and orientation was significant for rupture energy at the 1% level and for toughness at the 5% level. The interaction of moisture and speed was significant for rupture force and rupture energy at the 1% level and for toughness at the 5% level. The interaction of orientation and moisture was significant for rupture energy and toughness at the 1% level.
Abstract: The IEEE 802.11a/b/g standards provide multiple transmission rates, which can be changed dynamically according to the channel condition. Cooperative communication was introduced to improve the overall performance of wireless LANs with the help of relay nodes that have higher transmission rates. It is based on the fact that transmission is much faster when data packets are sent to a destination node through a relay node with a higher transmission rate rather than directly to the destination node at a low transmission rate. Several MAC protocols have been proposed to apply cooperative communication in wireless LANs, but some of them can cause collisions among relay nodes in a dense network. To solve this problem, we propose a new protocol in which relay nodes are grouped based on their transmission rates, and only the relay nodes in the highest-rate group attempt to get channel access. A performance evaluation conducted by simulation shows that the proposed protocol significantly outperforms the previous protocol in terms of throughput and collision probability.
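The grouping idea can be sketched in a few lines: partition candidate relays by rate and let only the top group contend, which shrinks the contention set and hence the collision probability. Node names and rates below are invented for illustration.

```python
def highest_rate_group(relays):
    """Group candidate relay nodes by transmission rate and return the
    highest rate together with the members of that group, which alone
    contend for the channel in the proposed scheme."""
    groups = {}
    for node, rate in relays:
        groups.setdefault(rate, []).append(node)
    best_rate = max(groups)
    return best_rate, sorted(groups[best_rate])

# Hypothetical candidate relays with their achievable rates (Mbps).
relays = [("n1", 24), ("n2", 54), ("n3", 36), ("n4", 54), ("n5", 11)]
print(highest_rate_group(relays))
```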
Abstract: One of the most important tasks in risk management is the correct determination of the probability of default (PD) of particular financial subjects. In this paper, the possibility of determining a financial institution's PD with credit-scoring models is discussed. The paper is divided into two parts. The first part is devoted to the estimation of three different models (based on linear discriminant analysis, logit regression and probit regression) from a sample of almost three hundred US commercial banks; these models are then compared and verified on a control sample in order to choose the best one. The second part of the paper applies the chosen model to a portfolio of three key Czech banks to estimate their present financial stability. However, it is no less important to be able to estimate the evolution of PD in the future. For this reason, the second task in this paper is to estimate the probability distribution of the future PD for the Czech banks. The values of the particular indicators are sampled randomly and the distribution of the PDs is estimated, under the assumption that the indicators follow a multidimensional subordinated Lévy model (specifically, the Variance Gamma model and the Normal Inverse Gaussian model). Although the obtained results show that all the banks are relatively healthy, there is still a high chance, at least in terms of probability, that a financial crisis will occur; this is indicated by various quantiles of the estimated distributions. Finally, it should be noted that the applicability of the estimated model (with respect to the data used) is limited to the recessionary phase of the financial market.
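The logit form of a credit-scoring model maps a linear combination of financial indicators to a PD through the logistic function. The sketch below uses entirely hypothetical indicators and coefficients; in the paper the coefficients are estimated from the sample of US commercial banks.

```python
import math

def logit_pd(indicators, coef, intercept):
    """Probability of default from a logit credit-scoring model:
    PD = 1 / (1 + exp(-(b0 + b . x)))."""
    score = intercept + sum(b * x for b, x in zip(coef, indicators))
    return 1.0 / (1.0 + math.exp(-score))

# Hypothetical indicators: capital ratio, ROA, NPL ratio (coefficients invented).
coef, intercept = [-8.0, -15.0, 12.0], 1.0
pd_healthy  = logit_pd([0.15, 0.02, 0.03], coef, intercept)
pd_stressed = logit_pd([0.05, -0.01, 0.12], coef, intercept)
print(round(pd_healthy, 4), round(pd_stressed, 4))
```

A probit model differs only in replacing the logistic function with the standard normal CDF; the forward-looking step in the paper then replaces the fixed indicator values with draws from the fitted Lévy model.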
Abstract: In this study, we estimated the seismic ground motion parameters based on microtremor measurements at Palu City. Several earthquakes have struck along the Palu-Koro Fault during recent years; the USGS epicenter, magnitude Mw 6.3 event that occurred on January 23, 2005 caused several casualties. We conducted a microtremor survey to estimate the strong ground motion distribution during the earthquake, and from this survey we produced maps of the peak ground acceleration, peak ground velocity, seismic vulnerability index and ground shear strain in Palu City. We performed single-station microtremor observations at 151 sites in Palu City, and also conducted microtremor array investigations at 8 sites to gain a representative determination of the subsurface soil conditions. From the array observations, Palu City corresponds to relatively soft soil conditions with Vs ≤ 300 m/s; the predominant periods from the horizontal-to-vertical spectral ratios (HVSRs) are in the range of 0.4 to 1.8 s, and the corresponding frequencies are in the range of 0.7 to 3.3 Hz. Strong ground motions of the Palu area were predicted based on the empirical stochastic Green's function method. The peak ground acceleration and velocity exceed 400 gal and 30 kine in some areas, which would with high probability cause severe damage to buildings. The microtremor survey results showed that the hilly areas had a low seismic vulnerability index and ground shear strain, whereas the coastal alluvium was composed of material indicating high seismic vulnerability and ground shear strain.
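The HVSR computation behind the predominant-period estimate can be sketched as follows. The spectra below are toy values; in practice the inputs are smoothed Fourier amplitude spectra of the two horizontal (NS, EW) and one vertical microtremor components.

```python
def hvsr_predominant(freqs, ns_amp, ew_amp, v_amp):
    """Compute the H/V spectral ratio per frequency,
    HVSR(f) = sqrt(NS(f) * EW(f)) / V(f), and return the predominant
    frequency, the predominant period (1/f0) and the peak HVSR value."""
    hvsr = [(ns * ew) ** 0.5 / v for ns, ew, v in zip(ns_amp, ew_amp, v_amp)]
    k = max(range(len(hvsr)), key=lambda i: hvsr[i])
    f0 = freqs[k]
    return f0, 1.0 / f0, hvsr[k]

# Toy smoothed amplitude spectra with a resonance near 1 Hz.
freqs  = [0.5, 1.0, 1.5, 2.0, 3.0]
ns_amp = [1.0, 4.0, 2.0, 1.5, 1.0]
ew_amp = [1.0, 4.5, 2.2, 1.4, 1.0]
v_amp  = [1.0, 1.0, 1.1, 1.2, 1.0]
f0, t0, peak = hvsr_predominant(freqs, ns_amp, ew_amp, v_amp)
print(f0, round(t0, 2), round(peak, 2))
```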
Abstract: We study the motion of a two-level atom interacting with a time-dependent nonuniform magnetic field using the path integral formalism. The propagator is first written in the standard form by replacing the spin with a unit vector aligned along the polar and azimuthal directions. It is then determined exactly using perturbation methods, and the Rabi formula of the system is thus deduced.
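For reference, the Rabi formula in its standard textbook form gives the spin-flip probability for a two-level system driven near resonance (notation assumed here, since the abstract does not fix it: $\omega_1$ the transverse coupling, $\omega$ the driving frequency, $\omega_0$ the level splitting):

```latex
P_{\uparrow \to \downarrow}(t) \;=\;
\frac{\omega_1^{2}}{\omega_1^{2} + (\omega - \omega_0)^{2}}\,
\sin^{2}\!\left(\frac{t}{2}\sqrt{\omega_1^{2} + (\omega - \omega_0)^{2}}\right)
```

At exact resonance ($\omega = \omega_0$) the prefactor equals one and the population oscillates fully between the two levels at the Rabi frequency $\omega_1$.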
Abstract: This paper presents a finite-buffer renewal-input queue with a single working vacation, vacation interruption, state-dependent services and state-dependent vacations, which has a wide range of applications in several areas including manufacturing and wireless communication systems. The service times during the busy period and the working vacation period, as well as the vacation times, are exponentially distributed and state dependent. As a result of the finite waiting space, state-dependent services and state-dependent vacation policies, the analysis of these queueing models needs special attention. We provide a recursive method, using the supplementary variable technique, to compute the stationary queue-length distributions at pre-arrival and arbitrary epochs. An efficient computational algorithm for the model is presented which is fast, accurate and easy to implement. Various performance measures are discussed. Finally, some special cases and numerical results are presented in the form of tables and graphs.
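The flavor of such a recursive computation can be conveyed with a much simpler model. The sketch below is not the paper's supplementary-variable recursion for the renewal-input vacation queue; it is a minimal Markovian illustration of how a finite-buffer queue-length distribution with state-dependent rates is built recursively and then normalized.

```python
def stationary_dist(arrival, service, N):
    """Stationary queue-length distribution of a finite-buffer
    (capacity N) birth-death queue with state-dependent rates,
    computed recursively: p[n+1] = p[n] * arrival(n) / service(n+1),
    followed by normalization."""
    p = [1.0]
    for n in range(N):
        p.append(p[-1] * arrival(n) / service(n + 1))
    total = sum(p)
    return [x / total for x in p]

# Hypothetical state-dependent service: rate grows with queue length.
dist = stationary_dist(arrival=lambda n: 2.0,
                       service=lambda n: 1.0 + 0.5 * n, N=5)
print([round(x, 3) for x in dist])
```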
Abstract: Signature amortization schemes have been introduced for authenticating multicast streams, in which a single signature is amortized over several packets. The hash value of each packet is computed, and some hash values are appended to other packets, forming what is known as a hash chain. These schemes divide the stream into blocks, each block being a number of packets; the signature packet in these schemes is either the first or the last packet of the block. Amortization schemes are efficient solutions in terms of computation and communication overhead, especially in real-time environments. The main factor affecting an amortization scheme is its hash chain construction. Some studies show that signing the first packet of each block reduces the receiver's delay and prevents DoS attacks, while other studies show that signing the last packet reduces the sender's delay. To our knowledge, there are no studies that show which is better, signing the first or the last packet, in terms of authentication probability and resistance to packet loss. In this paper we introduce another scheme for authenticating multicast streams that is robust against packet loss, reduces the overhead, and at the same time prevents the DoS attacks experienced by the receiver. Our scheme, the Multiple Connected Chain signing the First packet (MCF), appends the hash values of specific packets to other packets, then appends some hashes to the signature packet, which is sent as the first packet in the block. This scheme is especially efficient in terms of the receiver's delay. We discuss and evaluate the performance of our proposed scheme against schemes that sign the last packet of the block.
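The basic hash-chain construction with the signature on the first packet can be sketched as follows. This is a generic illustration, not the MCF scheme itself: the chain step, the choice of hashes covered by the signature, and the placeholder "signature" (a keyed hash standing in for a real digital signature) are all assumptions.

```python
import hashlib

def build_block(payloads, chain_step=2):
    """Sketch of a hash-chained block signed at the FIRST packet: each
    packet carries the hash of the packet chain_step positions later,
    and the signature (placeholder here) covers the leading hashes so
    the receiver can authenticate packets as they arrive."""
    def h(data):
        return hashlib.sha256(data).digest()

    n = len(payloads)
    hashes = [None] * n
    packets = [None] * n
    # Build from the end so each packet can embed a later packet's hash.
    for i in range(n - 1, -1, -1):
        appended = hashes[i + chain_step] if i + chain_step < n else b""
        packets[i] = payloads[i] + appended
        hashes[i] = h(packets[i])
    # Placeholder for signing: a real scheme signs hashes[0], hashes[1], ...
    signature = h(b"signing-key" + hashes[0] + hashes[1])
    return signature, packets

sig, pkts = build_block([b"p0", b"p1", b"p2", b"p3", b"p4"])
print(len(sig), [len(p) for p in pkts])
```

Appending each hash to more than one packet (as MCF does with multiple connected chains) is what buys resistance to packet loss: a packet stays verifiable as long as at least one packet carrying its hash arrives.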
Abstract: An electric utility's main concern is to plan, design, operate and maintain its power supply so as to provide an acceptable level of reliability to its users. This clearly requires that standards of reliability be specified and used in all three sectors of the power system, i.e., generation, transmission and distribution; that is why the reliability of a power system is always a major concern to power system planners. This paper presents a reliability analysis of the Bangladesh Power System (BPS). The reliability index of BPS, the loss of load probability (LOLP), is evaluated using a recursive algorithm, considering no de-rated states of the generators. BPS has sixty-one generators and a total installed capacity of 5275 MW, with a maximum demand of about 5000 MW. The relevant generator data and hourly load profiles were collected from the National Load Dispatch Center (NLDC) of Bangladesh, and the reliability index LOLP was assessed for the last ten years.
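The recursive algorithm for LOLP is the classic capacity-outage probability table: each two-state generating unit (no de-rated states, as in the study) is convolved into the table in turn. The fleet and loads below are invented toy values, not BPS data.

```python
def outage_table(units):
    """Recursively build the capacity-outage probability table: for each
    unit (capacity c, forced outage rate q), convolve its two-state
    availability into the table mapping outage capacity -> probability."""
    table = {0: 1.0}
    for c, q in units:
        new = {}
        for x, p in table.items():
            new[x] = new.get(x, 0.0) + p * (1 - q)       # unit available
            new[x + c] = new.get(x + c, 0.0) + p * q     # unit on outage
        table = new
    return table

def lolp(units, loads):
    """LOLP = average over load points of P(available capacity < load)."""
    total = sum(c for c, _ in units)
    table = outage_table(units)
    risk = 0.0
    for load in loads:
        risk += sum(p for x, p in table.items() if total - x < load)
    return risk / len(loads)

# Hypothetical fleet: four 100 MW units (FOR 2%) and two 200 MW units (FOR 5%).
units = [(100, 0.02)] * 4 + [(200, 0.05)] * 2
print(round(lolp(units, loads=[500, 600, 700]), 6))
```

With hourly load profiles, `loads` would be the 8760 hourly demands of a year, giving LOLP in the usual expected-fraction-of-time sense.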
Abstract: A Bloom filter is a probabilistic, memory-efficient data structure designed to answer rapidly whether an element is present in a set. It can report that an element is definitely not in the set, but can confirm its presence only with a certain probability. The trade-off of using a Bloom filter is a certain configurable risk of false positives; the odds of a false positive can be made very low if the number of hash functions is sufficiently large. For spam detection, a weight is attached to each set of elements: the spam weight of a word is a measure used to rate the e-mail, and each word is assigned to a Bloom filter based on its weight. The proposed work introduces an enhanced Bloom filter concept called the Bin Bloom Filter (BBF). The performance of the BBF over the conventional Bloom filter is evaluated under various optimization techniques. A real data set and synthetic data sets are used for the experimental analysis, and results are demonstrated for bin sizes 4, 5, 6 and 7. The analysis shows that the BBF, using heuristic techniques, performs better than the traditional Bloom filter in spam detection.
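A conventional Bloom filter, the baseline the BBF is compared against, can be sketched with double hashing derived from SHA-256 (an implementation choice assumed here; the BBF additionally maintains one such filter per spam-weight bin):

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k bit positions per item, derived from two
    64-bit halves of a SHA-256 digest via double hashing."""

    def __init__(self, m_bits=1024, k=5):
        self.m, self.k = m_bits, k
        self.bits = [False] * m_bits

    def _indices(self, item):
        digest = hashlib.sha256(item.encode()).digest()
        h1 = int.from_bytes(digest[:8], "big")
        h2 = int.from_bytes(digest[8:16], "big") | 1  # force odd step
        return [(h1 + i * h2) % self.m for i in range(self.k)]

    def add(self, item):
        for i in self._indices(item):
            self.bits[i] = True

    def __contains__(self, item):
        # All k bits set -> "probably present"; any bit clear -> definitely absent.
        return all(self.bits[i] for i in self._indices(item))

bf = BloomFilter()
for word in ["viagra", "lottery", "winner"]:
    bf.add(word)
print("lottery" in bf, "meeting" in bf)
```

Membership tests on added words always return True; a query on an unseen word returns False except with the small false-positive probability set by m and k.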
Abstract: In this paper we are interested in classification problems with a performance constraint on the error probability. In such problems, if the constraint cannot be satisfied, a rejection option is introduced. For binary classification, a number of SVM-based methods with a rejection option have been proposed over the past few years, all of which use two thresholds on the SVM output. However, in previous work we have shown on synthetic data that thresholding the output of the optimal SVM may lead to poor results for classification tasks with a performance constraint. In this paper a new method for supervised classification with a rejection option is proposed. It consists of two different classifiers jointly optimized to minimize the rejection probability subject to a given constraint on the error rate. The method uses a new kernel-based linear learning machine that we have recently presented; this learning machine is characterized by its simplicity and high training speed, which makes the simultaneous optimization of the two classifiers computationally reasonable. The proposed classification method with rejection option is compared to an SVM-based rejection method from the recent literature, and experiments show the superiority of the proposed method.
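The decision rule built from two jointly trained classifiers can be sketched as follows: predict a label only where both decision functions agree in sign, and reject in the disagreement region between them. The two linear score functions below are hypothetical one-dimensional stand-ins, not the paper's learned machines.

```python
def classify_with_reject(x, f1, f2):
    """Reject-option rule from two decision functions f1, f2: output a
    label only when their signs agree, otherwise reject (return 0)."""
    s1, s2 = f1(x), f2(x)
    if s1 > 0 and s2 > 0:
        return +1
    if s1 < 0 and s2 < 0:
        return -1
    return 0  # reject: the point falls between the two decision surfaces

# Two parallel decision surfaces shifted by +/- 0.5 (toy 1-D case).
f_upper = lambda x: x - 0.5
f_lower = lambda x: x + 0.5
print([classify_with_reject(x, f_upper, f_lower) for x in (-2.0, 0.0, 2.0)])
```

Widening the gap between the two surfaces lowers the error rate at the cost of a larger rejection probability, which is exactly the trade-off the joint optimization controls.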