Abstract: Variable channel conditions in underwater networks,
and variable distances between sensors due to water currents, lead to
a variable bit error rate (BER). This variability in BER has a great
effect on the energy efficiency of the error correction techniques
used. In this paper an energy-efficient adaptive hybrid error
correction technique (AHECT) is proposed. AHECT adaptively switches
the error correction technique from pure retransmission (ARQ) in
low-BER conditions to a hybrid technique with variable encoding rates
(ARQ & FEC) in high-BER conditions. An adaptation algorithm is
proposed that depends on a precalculated packet acceptance rate (PAR)
look-up table, the current BER, the packet size and the error
correction technique in use. Based on this adaptation algorithm, a
periodic 3-bit feedback is added to the acknowledgment packet to
indicate which error correction technique is suitable for the current
channel conditions and distance. Comparative studies were carried out
between this technique and other techniques, and the results show
that AHECT is more energy efficient and has a higher probability of
success than all of them.
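A minimal sketch of what such an adaptation step could look like. All names, PAR values and energy costs below are hypothetical illustrations, not figures from the paper: the idea is only that each candidate scheme's expected energy per accepted packet is tx energy divided by PAR, and the cheapest scheme wins.

```python
# Illustrative AHECT-style adaptation step (all numbers are assumed,
# not taken from the paper): pick the error correction scheme whose
# expected energy per accepted packet is lowest, using a precomputed
# packet acceptance rate (PAR) look-up table.

# PAR table: (scheme, BER bucket) -> packet acceptance rate (assumed)
PAR_TABLE = {
    ("ARQ",        1e-5): 0.99, ("ARQ",        1e-3): 0.45,
    ("ARQ+FEC1/2", 1e-5): 0.97, ("ARQ+FEC1/2", 1e-3): 0.90,
}

# Assumed energy cost per transmission attempt under each scheme
# (FEC adds redundancy, so each attempt costs more energy).
TX_ENERGY = {"ARQ": 1.0, "ARQ+FEC1/2": 1.6}

def select_scheme(ber_bucket):
    """Return the scheme minimizing energy per accepted packet:
    expected attempts = 1/PAR, so cost = tx_energy / PAR."""
    return min(TX_ENERGY,
               key=lambda s: TX_ENERGY[s] / PAR_TABLE[(s, ber_bucket)])

print(select_scheme(1e-5))  # low BER: plain ARQ is cheapest
print(select_scheme(1e-3))  # high BER: hybrid ARQ+FEC pays off
```

The selected scheme's index is what a periodic feedback field in the acknowledgment could carry back to the sender.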
Abstract: This paper deals with condition monitoring of electric switch machines for railway points. A point machine, a complex electro-mechanical device, switches the track between two alternative routes. There has been increasing interest in railway safety and in the optimal management of railway equipment maintenance, e.g. of point machines, in order to enhance railway service quality and reduce system failures. This paper explores the development of the Kolmogorov-Smirnov (K-S) test to detect certain point failures external to the machine (slide chairs, fixings, stretchers, etc.) while the point machine itself is in proper condition. Time-domain stator current signatures of normal (healthy) and faulty points are acquired by three Hall effect sensors and analyzed with the K-S test. The test is evaluated by creating three such failures, namely placing a hard stone and a soft stone between the stock rail and the switch blades as obstacles, and also slide-chair friction. The results for these three faults show that the K-S test can effectively be developed for detecting other point failures whose current signatures deviate parametrically from the healthy current signature. The K-S test, as an analysis technique, assumes that any defect has a specific probability distribution. Empirical cumulative distribution functions (ECDFs) are used to differentiate these probability distributions. The test is based on the null hypothesis that the ECDF of the target distribution is statistically similar to the ECDF of the reference distribution. Therefore, by comparing a given current signature (the target signal) from an unknown switch state to a number of template signatures (the reference signals) from known switch states, it is possible to identify the most likely state of the point machine under analysis.
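The core comparison can be sketched in a few lines: the two-sample K-S statistic is the maximum distance between the two empirical CDFs. The signals below are synthetic stand-ins, not real point-machine current data.

```python
# Minimal two-sample Kolmogorov-Smirnov statistic, as used to compare
# a target current signature against a reference signature
# (pure-Python sketch; the sample values are synthetic).

def ks_statistic(sample1, sample2):
    """Maximum distance between the two empirical CDFs."""
    xs = sorted(set(sample1) | set(sample2))
    n1, n2 = len(sample1), len(sample2)
    d = 0.0
    for x in xs:
        # ECDF value = fraction of sample points <= x
        f1 = sum(1 for v in sample1 if v <= x) / n1
        f2 = sum(1 for v in sample2 if v <= x) / n2
        d = max(d, abs(f1 - f2))
    return d

healthy = [1.00, 1.10, 0.90, 1.05, 0.95]   # reference signature
faulty  = [1.50, 1.60, 1.40, 1.55, 1.45]   # an obstacle raises the current
print(ks_statistic(healthy, healthy[:]))   # identical samples -> 0.0
print(ks_statistic(healthy, faulty))       # fully separated ranges -> 1.0
```

Classification then amounts to picking the template (known switch state) whose signature gives the smallest statistic against the target signal.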
Abstract: Network security attacks are violations of the
information security policy and have received much attention from
the computational intelligence community in recent decades. Data
mining has become a very useful technique for detecting network
intrusions by extracting useful knowledge from large volumes of
network data or logs. The naïve Bayesian classifier is one of the
most popular data mining algorithms for classification, providing an
optimal way to predict the class of an unknown example. It has been
observed that a single set of probabilities derived from the data is
not good enough to achieve a good classification rate. In this paper,
we propose a new learning algorithm for mining network logs to detect
network intrusions with a naïve Bayesian classifier: it first
clusters the network logs into several groups based on the similarity
of the logs, and then calculates the prior and conditional
probabilities for each group. To classify a new log, the algorithm
determines the cluster to which the log belongs and then uses that
cluster's probability set to classify it. We tested the performance
of our proposed algorithm on the KDD99 benchmark network intrusion
detection dataset, and the experimental results show that it improves
detection rates as well as reduces false positives for different
types of network intrusions.
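A toy sketch of the cluster-then-classify idea. The log records, features and the similarity rule (grouping by protocol) are invented for illustration; the paper's clustering and feature set are richer.

```python
# Toy sketch of clustered naive Bayes: group logs, estimate per-group
# probabilities, then classify a new log with its own group's
# probability set. Data and the grouping rule are illustrative only.

from collections import defaultdict

# (protocol, flag) -> label; "protocol" also drives the toy clustering
logs = [
    (("tcp", "SYN"), "attack"), (("tcp", "SYN"), "attack"),
    (("tcp", "ACK"), "normal"), (("udp", "DNS"), "normal"),
    (("udp", "DNS"), "normal"), (("udp", "FLOOD"), "attack"),
]

# Step 1: "cluster" by the first feature (a stand-in for a
# similarity-based clustering of the network logs).
clusters = defaultdict(list)
for features, label in logs:
    clusters[features[0]].append((features, label))

# Step 2: per-cluster priors and conditional counts
def train(cluster_logs):
    prior, cond = defaultdict(int), defaultdict(int)
    for features, label in cluster_logs:
        prior[label] += 1
        for i, f in enumerate(features):
            cond[(label, i, f)] += 1
    return prior, cond

models = {cid: train(cl) for cid, cl in clusters.items()}

# Step 3: route a new log to its cluster, then apply naive Bayes there
def classify(features):
    prior, cond = models[features[0]]
    total = sum(prior.values())
    def score(label):
        p = prior[label] / total
        for i, f in enumerate(features):
            # Laplace-smoothed conditional probability
            p *= (cond[(label, i, f)] + 1) / (prior[label] + 2)
        return p
    return max(prior, key=score)

print(classify(("tcp", "SYN")))   # matches the tcp cluster's attacks
print(classify(("udp", "DNS")))   # matches the udp cluster's normals
```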
Abstract: This paper deals with efficient computation of
probability coefficients, which offer computational simplicity
compared to spectral coefficients. The approach eliminates the need
for inner product evaluations in determining the signature of a
combinational circuit realizing a given Boolean function. Methods for
computing probability coefficients using a transform matrix, a fast
transform method and BDDs are given. Theoretical relations for the
achievable computational advantage, in terms of the additions
required to compute all 2^n probability coefficients of an n-variable
function, have been developed. It is shown that for n ≥ 5, only 50%
of the additions are needed to compute all probability coefficients
compared to spectral coefficients. Fault detection techniques based
on spectral signatures can also be used with probability signatures
to gain this computational advantage.
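A brute-force sketch of one family of probability coefficients, to make the object concrete. We assume the definition p(S) = Pr[f = 1 and all inputs in S are 1] under uniformly random inputs; the paper's exact definition and its transform-matrix, fast-transform and BDD methods may differ, and this enumeration is the slow baseline those methods improve on.

```python
# Brute-force probability coefficients from a truth table, under the
# ASSUMED definition p(S) = Pr[f = 1 and all inputs in S are 1] with
# uniformly random inputs (the paper's definition may differ).

from itertools import product

def prob_coeffs(f, n):
    """f: truth table as a dict mapping input tuples to 0/1.
    Returns p(S) for every variable subset S (frozenset of indices)."""
    coeffs = {}
    for mask in range(2 ** n):
        s = frozenset(i for i in range(n) if mask >> i & 1)
        hits = sum(1 for x in product((0, 1), repeat=n)
                   if f[x] and all(x[i] == 1 for i in s))
        coeffs[s] = hits / 2 ** n
    return coeffs

# Example: 2-input AND gate -- every coefficient equals 1/4, since the
# only satisfying input (1, 1) also sets every variable subset to 1.
table = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
c = prob_coeffs(table, 2)
print(c[frozenset()])        # Pr[f = 1]
print(c[frozenset({0, 1})])  # Pr[f = 1 and x0 = x1 = 1]
```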
Abstract: Prediction of fault-prone modules provides one way to
support software quality engineering. Clustering is used to determine
the intrinsic grouping in a set of unlabeled data. Among the various
clustering techniques available in the literature, the K-Means
approach is the most widely used. This paper introduces a
K-Means-based clustering approach for finding the fault proneness of
object-oriented systems. The contribution of this paper is that
metric values of the JEdit open source software are used to generate
rules for categorizing software modules as faulty or non-faulty, and
the rules are then empirically validated. The results are measured in
terms of prediction accuracy, probability of detection and
probability of false alarms.
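A minimal K-Means sketch on made-up module metric vectors. The two features (e.g. size and coupling), the data and the reading of the two clusters as faulty vs. non-faulty are illustrative assumptions, not the JEdit metric values used in the paper.

```python
# Minimal K-Means on invented (size, coupling) metric vectors; the
# data and the k=2 faulty/non-faulty interpretation are illustrative.

def kmeans(points, centers, iters=20):
    for _ in range(iters):
        # assignment step: nearest center by squared Euclidean distance
        groups = [[] for _ in centers]
        for p in points:
            j = min(range(len(centers)),
                    key=lambda i: sum((a - b) ** 2
                                      for a, b in zip(p, centers[i])))
            groups[j].append(p)
        # update step: move each center to the mean of its group
        centers = [tuple(sum(c) / len(g) for c in zip(*g)) if g
                   else centers[i] for i, g in enumerate(groups)]
    return centers, groups

modules = [(120, 3), (130, 4), (110, 2),      # small, loosely coupled
           (900, 15), (950, 17), (880, 14)]   # large, highly coupled
centers, groups = kmeans(modules, centers=[(100, 1), (1000, 20)])
print(sorted(len(g) for g in groups))  # two clusters of three modules
```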
Abstract: This paper proposes a novel spectrum sensing technique
for digital video broadcasting-terrestrial (DVB-T) systems, which
utilizes the periodicity of the pilot signals in the orthogonal
frequency division multiplexing (OFDM) symbols. The proposed scheme
can overcome the effect of timing synchronization error by
re-correlating the correlation values at the same sample distances.
Numerical results demonstrate that the proposed scheme outperforms
the conventional scheme in detection probability when a timing
synchronization error exists.
Abstract: Our study is concerned with the development of an Emergency Medical Services (EMS) ambulance location and allocation model called the Time-based Ambulance Zoning Optimization Model (TAZ_OPT). This paper presents the framework of the study. The model is formulated using goal programming (GP), where the goals are to determine the satellite locations of ambulances and the number of ambulances to be allocated at these locations. The model aims at maximizing the expected demand coverage, based on the probability of reaching the emergency location within a targeted time, and minimizing the ambulance busyness likelihood value. Among the benefits of the model are the increased accessibility and availability of ambulances and, thus, enhanced quality of EMS ambulance services.
Abstract: Many multimedia communication applications require a
source to transmit messages to multiple destinations subject to a
quality of service (QoS) delay constraint. To support
delay-constrained multicast communications, computer networks need to
guarantee an upper bound on the end-to-end delay from the source node
to each of the destination nodes. This is known as the multicast
delay problem. On the other hand, if the same message fails to arrive
at each destination node at the same time, inconsistency and
unfairness problems may arise among users. This is related to the
multicast delay-variation problem. The problem of finding a minimum
cost multicast tree with delay and delay-variation constraints has
been proven to be NP-complete. In this paper, we propose an efficient
heuristic algorithm, the Economic Delay and Delay-Variation Bounded
Multicast (EDVBM) algorithm, based on a novel heuristic function, to
construct an economic delay- and delay-variation-bounded multicast
tree. A noteworthy feature of this algorithm is that it has a very
high probability of finding the optimal solution in polynomial time
with low computational complexity.
Abstract: A two-parameter fatigue model explicitly accounting for the cyclic as well as the mean stress was used to fit static and fatigue data available in the literature concerning carbon fiber reinforced composite laminates subjected to tension-tension fatigue. The model confirms the strength-life equal rank assumption and reasonably predicts the probability of failure under cyclic loading. The model parameters were found by best-fit procedures and required a minimum of experimental tests.
Abstract: There is strong evidence that water channel proteins,
'aquaporins (AQPs)', are central components in plant-water relations
as well as in a number of other physiological parameters. We had
previously reported the isolation of 24 plasma membrane intrinsic
protein (PIP) type AQPs. However, the gene numbers in rice and the
polyploid nature of bread wheat indicated a high probability of
further genes in the latter. The present work focused on the
identification of further AQP isoforms in bread wheat. Using an
altered primer design, we identified five homologous genes,
designated PIP1;5b, PIP2;9b, TaPIP2;2, TaPIP2;2a and TaPIP2;2b.
Sequence alignments indicate that PIP1;5b and PIP2;9b are likely
homeologues of two previously reported genes, while the other three
are new genes and could be homeologues of each other. The results
indicate further AQP diversity in wheat, and the sequence data will
enable physical mapping of these genes to identify their genomes, as
well as genetic mapping to determine their association with any
quantitative trait loci (QTLs) associated with plant-water relations
such as salinity or drought tolerance.
Abstract: We describe a formal specification and verification of the Rabin public-key scheme in the formal proof system Isabelle/HOL. The idea is to combine the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. The analysis presented uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness as well as security properties is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. This yields the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Consequently, we get reliable proofs with a minimal error rate, augmenting the used database. This provides a formal basis for more computer proof constructions in this area.
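For readers unfamiliar with the scheme being verified, a tiny executable sketch: Rabin encryption squares the message modulo n = p*q, and decryption recovers the four modular square roots via the Chinese Remainder Theorem. The primes here are toy values for illustration; real deployments use large primes, and some redundancy in the plaintext picks the right root among the four.

```python
# Toy Rabin scheme: encryption is squaring mod n = p*q; decryption
# takes square roots via the CRT (easy because p, q = 3 mod 4).
# Toy primes for illustration only -- not secure.

p, q = 7, 11
n = p * q

def encrypt(m):
    return m * m % n

def decrypt(c):
    # square roots mod p and mod q (valid because p, q = 3 mod 4)
    mp = pow(c, (p + 1) // 4, p)
    mq = pow(c, (q + 1) // 4, q)
    roots = set()
    for sp in (mp, p - mp):
        for sq in (mq, q - mq):
            # combine the two residues with the CRT
            r = (sp * q * pow(q, -1, p) + sq * p * pow(p, -1, q)) % n
            roots.add(r)
    return roots  # four candidates; plaintext redundancy disambiguates

m = 20
print(m in decrypt(encrypt(m)))  # the original message is recovered
```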
Abstract: This paper treats a discrete-time batch arrival queue with a single working vacation. The main purpose of this paper is to present a performance analysis of this system using the supplementary variable technique. For this purpose, we first analyze the Markov chain underlying the queueing system and obtain its ergodicity condition. Next, we present the stationary distributions of the system length as well as some performance measures at random epochs using the supplementary variable method. Thirdly, still based on the supplementary variable method, we give the probability generating function (PGF) of the number of customers at the beginning of a busy period and give a stochastic decomposition formula for the PGF of the stationary system length at the departure epochs. Additionally, we investigate the relation between our discrete-time system and its continuous counterpart. Finally, some numerical examples show the influence of the parameters on some crucial performance characteristics of the system.
Abstract: In this paper, we extend the compound binomial model to the case where the premium income process, based on a binomial process, is no longer a linear function. First, a recursive formula is derived for the non-ruin probability; then we show that the expected discounted penalty function satisfies a defective renewal equation. Third, an asymptotic estimate for the expected discounted penalty function is given. Finally, we give two examples of ruin quantities to illustrate applications of the recursive formula and the asymptotic estimate for the penalty function.
Abstract: In this work, the propagation of uncertainty during the
calibration process of TRANUS, an integrated land use and transport
model (ILUTM), has been investigated. It has also been examined,
through a sensitivity analysis, which input parameters affect the
variation of the outputs the most. Moreover, a probabilistic
verification methodology for the calibration process, which equates
the observed and calculated production, has been proposed. The model
chosen as an application is the model of the city of Grenoble,
France. For the sensitivity analysis and uncertainty propagation, the
Monte Carlo method was employed, and a statistical hypothesis test
was used for verification. The parameters of the induced demand
function in TRANUS were assumed to be uncertain in the present case.
It was found that, if TRANUS converges during calibration, then with
high probability the calibration process is verified. Moreover, a
weak correlation was found between the inputs and the outputs of the
calibration process. The total effect of the inputs on the outputs
was investigated, and the output variation was found to be dictated
by only a few input parameters.
Abstract: In this paper, link quality in the SHF and EHF ranges
is studied. In order to achieve high data rates, higher frequencies
must be used: centimeter waves (SHF), millimeter waves (EHF) or the
optical range. However, there are significant problems when a radio
link works in these bands: rain attenuation and attenuation in the
Earth's atmosphere. Based on statistical rain rate data for Bulgaria,
the link availability can be determined, depending on the working
frequency, the path length and the power budget of the link. The ITU
recommendations are used for the calculations of rain attenuation and
atmospheric attenuation.
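The ITU-R rain attenuation calculation reduces to a power law: specific attenuation gamma = k * R^alpha in dB/km, multiplied by the path length (in practice an effective path length). The coefficient values below are illustrative placeholders; real k and alpha are read from the ITU-R P.838 tables for the chosen frequency and polarization.

```python
# ITU-R style rain attenuation: gamma = k * R**alpha (dB/km) times the
# path length. k and alpha below are assumed placeholder values; real
# ones come from the ITU-R P.838 tables for frequency/polarization.

def rain_attenuation_db(rain_rate_mm_h, path_km, k, alpha):
    """Total rain attenuation over the path, in dB."""
    gamma = k * rain_rate_mm_h ** alpha   # specific attenuation, dB/km
    return gamma * path_km

# Example: assumed k = 0.075, alpha = 1.1, a 30 mm/h rain rate, 5 km path
loss = rain_attenuation_db(30, path_km=5, k=0.075, alpha=1.1)
print(round(loss, 1))
```

Comparing this loss against the link's power budget margin is what determines the availability for a given rain-rate statistic.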
Abstract: This paper presents a new method for estimating the
non-stationary noise power spectral density given a noisy signal. The
method is based on averaging the noisy speech power spectrum using
time- and frequency-dependent smoothing factors. These factors are
adjusted based on the signal-presence probability in individual
frequency bins. Signal presence is determined by computing the ratio
of the noisy speech power spectrum to its local minimum, which is
updated continuously by averaging past values of the noisy speech
power spectra with a look-ahead factor. This method adapts very
quickly to highly non-stationary noise environments. The proposed
method achieves significant improvements over a system that uses a
voice activity detector (VAD) for noise estimation.
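A condensed sketch of the estimator's core loop for a single frequency bin: track a local minimum of the noisy power, derive a signal-presence decision from the power-to-minimum ratio, and turn it into a time-varying smoothing factor for the noise estimate. The constants and the hard 0/1 presence decision are illustrative simplifications, not the paper's tuned values.

```python
# One-bin sketch of minimum-tracking noise estimation: illustrative
# constants, hard presence decision instead of a soft probability.

powers = [1.0, 1.1, 0.9, 1.0, 8.0, 9.0, 1.05, 0.95]  # speech burst: 8, 9

alpha, ratio_threshold = 0.8, 2.0
noise = local_min = powers[0]
for p in powers[1:]:
    # continuous minimum tracking with a slow upward drift
    local_min = min(1.02 * local_min, p)
    speech_present = (p / local_min) > ratio_threshold
    # speech present -> freeze the noise estimate; absent -> smooth it in
    a = 1.0 if speech_present else alpha
    noise = a * noise + (1 - a) * p

print(round(noise, 2))  # stays near the ~1.0 noise floor despite the burst
```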
Abstract: the objective of this study is to measure the levels of
cellulas activity of ostrich GI microorganisms, and comparing it with
the levels of cellulas activity of rumen-s microorganisms, and also to
estimate the probability of increasing enzyme activity with injecting
different dosages (30%, 50% and 70%) of pure anaerobic goat rumen
fungi. The experiment was conducted in laboratory and under a
complete anaerobic condition (in vitro condition). 40 ml of
“CaldWell" medium and 1.4g wheat straw were placed in incubator
for an hour. The cellulase activity of ostrich microorganisms was
compared with other treatments, and then different dosages (30%,
50% and 70%) of pure anaerobic goat rumen fungi were injected to
ostrich microorganism-s media. Due to the results, cattle and goat
with 2.13 and 2.08 I.U (international units) respectively showed the
highest activity and ostrich with 0.91 (I.U) had the lowest cellulose
activity (p < 0.05). Injecting 30% and 50% of anaerobic fungi had no
significant incensement in enzyme activity, but with injecting 70% of
rumen fungi to ostrich microorganisms culture a significant increase
was observed 1.48 I.U. (p < 0.05).
Abstract: In this study, a framework for the verification of well-known seismic codes is utilized. To verify seismic code performance, the damage quantity of RC frames is compared with the target performance. Due to the randomness of seismic design and earthquake load excitation, fragility curves are developed in this paper. These diagrams are utilized to evaluate the performance level of structures designed according to the seismic codes. They further illustrate the effect of the codes' load combination and reduction factors on the probability of damage exceedance. Two types of structures, very important structures with high ductility and moderately important structures with intermediate ductility, are designed according to different seismic codes. The results reveal that lower damage ratios usually generate a lower probability of exceedance. In addition, the findings indicate that there are buildings with a higher quantity of reinforcement bars that nevertheless have a higher probability of damage exceedance. Life-cycle cost analysis is utilized for comparison and the final decision-making process.
Abstract: In this paper, we propose an effective relay
communication scheme for layered video transmission as an alternative
that makes the most of limited resources in a wireless communication
network where loss often occurs. Relaying brings stable multimedia
services to end clients, compared to multiple description coding
(MDC). In addition, retransmitting only parity data for one or more
video layers, generated by a channel coder, from the relay device to
the end client is paramount to robustness in loss situations. Using
these methods in resource-constrained environments, such as real-time
user created content (UCC) with layered video transmission, can
provide high-quality services even in a poor communication
environment; minimal services also remain possible. The mathematical
analysis shows that the proposed method reduces the GOP loss rate
compared to MDC and a raptor code without relay. The GOP loss rate is
about zero, while MDC and the raptor code without relay have GOP loss
rates of 36% and 70%, respectively, at a 10% frame loss rate.
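A back-of-the-envelope check shows why unprotected GOP loss rates climb so quickly: assuming independent frame losses at rate p, a GOP of N frames is lost whenever any single frame is lost. The GOP length used here is an assumed example value, not one given in the paper.

```python
# GOP loss under independent frame losses: the GOP survives only if
# every one of its N frames survives. N = 12 is an assumed GOP length.

def gop_loss_prob(frame_loss, n_frames):
    return 1 - (1 - frame_loss) ** n_frames

# At a 10% frame loss rate, loss probability grows quickly with GOP size
print(round(gop_loss_prob(0.10, 12), 2))  # about 0.72 for 12 frames
```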
Abstract: Probability-based identity disclosure risk
measurement may give the same overall risk for different
anonymization strategies applied to the same dataset. Some entities
in the anonymized dataset may have higher identification risks than
others. Individuals are more concerned about risks higher than the
average and are interested to know whether they may be exposed to
such higher risk. The notion of overall risk in the above measurement
method does not indicate whether some of the involved entities have a
higher identity disclosure risk than others. In this paper, we
introduce an identity disclosure risk measurement method that not
only conveys the overall risk, but also indicates whether some of the
members have a higher risk than others. The proposed method
quantifies the overall risk based on the individual risk values, the
percentage of records that have a risk value higher than the average,
and how much larger the higher risk values are compared to the
average. We analyze the disclosure risks for different disclosure
control techniques applied to original microdata and present the
results.
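A sketch in the spirit of the proposed measurement, showing how two datasets with the same average risk can be told apart by the above-average fraction and the size of the excess. The summary below is an invented illustration; the paper defines its own aggregation.

```python
# Illustrative risk summary: mean individual risk, fraction of records
# above the mean, and their average relative excess. The aggregation
# is invented for illustration, not the paper's formula.

def risk_summary(risks):
    avg = sum(risks) / len(risks)
    high = [r for r in risks if r > avg]
    frac_high = len(high) / len(risks)
    # average excess of the high-risk records, relative to the mean
    excess = (sum(high) / len(high) / avg - 1) if high else 0.0
    return avg, frac_high, excess

# Two datasets with the same overall (average) risk but different tails
uniform = [0.20, 0.20, 0.20, 0.20]
skewed  = [0.05, 0.05, 0.05, 0.65]
print(risk_summary(uniform))  # same mean, no record above average
print(risk_summary(skewed))   # same mean, one record far above average
```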