Abstract: In this paper, an optical code-division multiple-access (OCDMA) packet network, which offers inherent security in access networks, is considered. Two types of random access protocols are proposed for packet transmission. In protocol 1, all distinct codes are used; in protocol 2, distinct codes as well as shifted versions of all these codes are used. OCDMA network performance using one-dimensional (1-D) optical orthogonal codes (OOCs) and two-dimensional (2-D) wavelength/time single-pulse-per-row (W/T SPR) codes is analyzed. The main advantage of using 2-D codes instead of 1-D codes is the reduction of errors due to multiple-access interference among different users. A correlation receiver is considered in the analysis. Using an analytical model, we compute and compare the packet-success probability for 1-D and 2-D codes in an OCDMA network; the analysis shows improved performance with 2-D codes as compared to 1-D codes.
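As a hedged illustration of how such a packet-success probability can be computed, the sketch below uses the standard binomial interference model often applied to 1-D OOCs (per-interferer hit probability q = w^2/(2F) for code weight w and length F); it is not the paper's exact analysis, and all parameter values are invented.

```python
# Hedged sketch of a packet-success computation under the standard 1-D OOC
# interference model: each interferer causes a "hit" with probability
# q = w^2 / (2F), and a bit error can occur when hits reach the threshold w.
# Code parameters (w, F), user count, and packet length are illustrative.
from math import comb

def packet_success_1d(w, F, users, packet_bits):
    q = w**2 / (2 * F)                   # per-interferer hit probability
    K = users - 1                        # number of interfering users
    p_bit_err = 0.5 * sum(               # error only for half the bits (OOK)
        comb(K, i) * q**i * (1 - q)**(K - i) for i in range(w, K + 1)
    )
    return (1 - p_bit_err) ** packet_bits

print(packet_success_1d(w=4, F=121, users=10, packet_bits=1024))
```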
Abstract: Support vector machines (SVMs) are considered to be
the best machine learning algorithms for minimizing the predictive
probability of misclassification. However, their drawback is that for
large data sets the computation of the optimal decision boundary is a
time-consuming function of the size of the training set. Hence, several
methods have been proposed to speed up the SVM algorithm. Here
three methods used to speed up the computation of the SVM
classifiers are compared experimentally using a musical genre
classification problem. The simplest method pre-selects a random
sample of the data before the application of the SVM algorithm. Two
additional methods use proximity graphs to pre-select data that are
near the decision boundary. One uses k-Nearest Neighbor graphs and
the other Relative Neighborhood Graphs to accomplish the task.
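As a hedged sketch of the simplest of the three methods, the following pre-selects a random sample before SVM training; it assumes scikit-learn, and the synthetic data merely stands in for the musical genre features.

```python
# Minimal sketch of the simplest speed-up: train an SVM on a random
# subsample of the data instead of the full training set.
# Assumes scikit-learn; the dataset and sample fraction are illustrative.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 20))           # stand-in for genre features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

frac = 0.1                                   # keep 10% of the training set
idx = rng.choice(len(X), size=int(frac * len(X)), replace=False)

clf = SVC(kernel="rbf", C=1.0).fit(X[idx], y[idx])
print("accuracy on full set:", clf.score(X, y))
```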
Abstract: The development of aid systems for medical diagnosis is not an easy task, because of the presence of inhomogeneities in the MRI, the variability of the data from one sequence to another, as well as other sources of distortion that accentuate this difficulty. A new automatic, contextual, adaptive, and robust segmentation procedure by MRI brain tissue classification is described in this article. A first phase consists in estimating the probability density of the data by the Parzen-Rosenblatt method. The classification procedure is completely automatic and makes no assumptions about either the number of clusters or the prototypes of these clusters, since the latter are detected automatically by a mathematical morphology operator, the skeleton by influence zones (SKIZ). The problem of initializing the prototypes, as well as their number, is thus transformed into an optimization problem; moreover, the procedure is adaptive, since it takes into consideration the contextual information present in every voxel through an adaptive and robust non-parametric Markov field (MF) model. The number of misclassifications is reduced by the use of the Maximum Posterior Marginal (MPM) criterion.
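A minimal sketch of the first phase, Parzen-Rosenblatt density estimation, is given below; the Gaussian kernel, bandwidth, and toy intensity data are illustrative assumptions, not the paper's settings.

```python
# Sketch of the first phase described above: a Parzen-Rosenblatt (kernel)
# estimate of the data's probability density. The Gaussian kernel and
# bandwidth h are illustrative choices.
import numpy as np

def parzen_density(x, samples, h):
    """Gaussian-kernel density estimate at point(s) x."""
    x = np.atleast_1d(x)[:, None]                  # shape (m, 1)
    u = (x - samples[None, :]) / h                 # shape (m, n)
    k = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)   # Gaussian kernel
    return k.mean(axis=1) / h

# Toy 1-D "intensity" data: two tissue classes as a bimodal sample.
rng = np.random.default_rng(1)
samples = np.concatenate([rng.normal(80, 5, 500), rng.normal(120, 8, 500)])
grid = np.linspace(50, 150, 5)
print(parzen_density(grid, samples, h=3.0))
```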
Abstract: Although there has been much research in cluster analysis on feature weights, little effort has been made on sample weights. Recently, Yu et al. (2011) considered a probability distribution over a data set to represent its sample weights and then proposed sample-weighted clustering algorithms. In this paper, we give a sample-weighted version of generalized fuzzy clustering regularization (GFCR), called sample-weighted GFCR (SW-GFCR). Several experiments are considered. The experimental results and comparisons demonstrate that the proposed SW-GFCR is more effective than most clustering algorithms.
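The abstract does not spell out GFCR itself; purely as an illustration of the sample-weighting idea, the sketch below adds a probability distribution w over the data points to an ordinary fuzzy c-means update. It is a generic stand-in, not the authors' SW-GFCR.

```python
# Illustrative sketch only: a sample-weighted fuzzy c-means iteration,
# where a probability distribution w over the data acts as sample weights.
# A generic stand-in for the idea, not the authors' GFCR/SW-GFCR.
import numpy as np

def sw_fcm(X, w, c=2, m=2.0, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), c, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))             # unnormalized memberships
        u /= u.sum(axis=1, keepdims=True)          # rows of u sum to 1
        wu = w[:, None] * u ** m                   # sample weights enter here
        centers = (wu.T @ X) / wu.sum(axis=0)[:, None]
    return centers, u

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(size=(100, 2)), rng.normal(size=(100, 2)) + 4])
w = np.full(len(X), 1 / len(X))                    # uniform sample weights
centers, u = sw_fcm(X, w)
print(centers)
```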
Abstract: In this paper, a semi-fragile watermarking scheme is proposed for color image authentication. In this scheme, the color image is first transformed from the RGB to the YST color space, which is suitable for watermarking color media. Each channel is divided into 4×4 non-overlapping blocks, and each of its 2×2 sub-blocks is selected. The embedding space is created by setting the two LSBs of the selected sub-block to zero; this space will hold the authentication and recovery information. For verification, authentication and parity bits, denoted by 'a' and 'p', are computed for each 2×2 sub-block. For recovery, the intensity mean of each 2×2 sub-block is computed and encoded into six to eight bits, depending upon the channel selected. The size of the sub-block is important for correct localization and fast computation. For watermark distribution, a 2D Torus Automorphism is implemented using a private key to obtain a secure mapping of blocks. The perceptibility of the watermarked image is quite reasonable, both subjectively and objectively. Our scheme is oblivious, correctly localizes tampering, and is able to recover the original work with probability near one.
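As a hedged sketch of the embedding-space step, the fragment below clears the two LSBs of a 2×2 sub-block and encodes its intensity mean; the 6-bit default and the pixel values are illustrative.

```python
# Hedged sketch of the embedding-space step described above: clearing the
# two least significant bits of a 2x2 sub-block so they can later hold
# authentication/recovery bits. Array values are illustrative.
import numpy as np

def clear_two_lsbs(block):
    """Zero the two LSBs of every pixel (values 0-255)."""
    return block & 0b11111100

def intensity_mean_code(block, bits=6):
    """Encode the sub-block's intensity mean into `bits` bits."""
    mean = int(block.mean())                   # 0..255
    return mean >> (8 - bits)                  # keep the top `bits` bits

sub_block = np.array([[200, 201], [198, 203]], dtype=np.uint8)
print(clear_two_lsbs(sub_block))               # embedding space created
print(bin(intensity_mean_code(sub_block)))     # 6-bit recovery code
```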
Abstract: The objective of this research is to study the people's level of participation in community activities, their satisfaction towards the community, the attachment they have to the community, the factors that influence this attachment, as well as the characteristics of the relationships of military families in the Royal Guards community of Dusit District. The method used was non-probability sampling by quota sampling according to people's age. The determined age group was 18 years or older.
One set of a sample group was done per family. The questionnaires were completed by 287 people. Snowball sampling was also used by interviewing people of the community, starting from the Royal Guards Community's leader, followed by 20 of the community's well-respected persons. The data were analyzed using descriptive statistics, such as the arithmetic mean and standard deviation, as well as inferential statistics, such as the Independent-Samples T test (T-test), One-Way ANOVA (F-test), and Chi-Square. Descriptive analysis according to the structure of the interview content was also used. The results of the research are that the participation of the population of the Royal Guards Community in various activities is at a medium level, with the highest average participation during Mother's Day and Father's Day activities. The people's general level of satisfaction towards the premises of the Royal Guards Community is at the highest level.
The people were most satisfied with the transportation within the community and with contact with people from outside the premises. Access to the community is convenient and there are various entrances. The attachment of the people to the Royal Guards Community, in general and by each category, is at a high level. The feeling that the community is their home rated the highest average. Factors that influence the attachment of the people of the Royal Guards Community are age, status, profession, income, length of stay in the community, membership of social groups, having neighbors they feel close and familiar with, and the benefits they receive from the community. In addition, it was found that people who participate in activities have a high level of positive relationship towards attachment to the Royal Guards Community. Satisfaction with the community has a very high level of positive relationship with the attachment of the people to the Royal Guards Community.
The characteristic of the attachment of military families is that they live in big houses that everyone, starting from the leader of the family down to all members, has to protect and care for. Therefore, they all love the community they live in. These characteristics, showing participation in activities within the community and a high level of satisfaction towards the premises of the community, enable the people to become more attached to the community. The people feel that everyone in the community is a close neighbor, as if they were one big family.
Abstract: Software Development Risks Identification (SDRI), using Fault Tree Analysis (FTA), is a proposed technique to identify not only the risk factors but also the causes of the appearance of the risk factors in the software development life cycle. The method is based on analyzing the probable causes of software development failures before they become problems and adversely affect a project. It uses fault tree analysis (FTA) to determine the probability of particular system-level failures, defined by the Taxonomy for Sources of Software Development Risk, and to deduce a failure analysis in which an undesired state of the system is traced by using Boolean logic to combine a series of lower-level events. The major purpose of this paper is to use the probabilistic calculations of the Fault Tree Analysis approach to determine all possible causes that lead to software development risk occurrence.
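A minimal sketch of the probabilistic calculation FTA rests on is shown below: independent lower-level event probabilities combined through AND/OR gates into a top-event probability. The basic events and their probabilities are invented for illustration.

```python
# Sketch of the probabilistic calculation FTA relies on: combining
# independent lower-level event probabilities through AND/OR gates to
# get the top-event (risk occurrence) probability. Events are illustrative.
from functools import reduce

def and_gate(probs):
    """All child events must occur (independence assumed)."""
    return reduce(lambda a, b: a * b, probs)

def or_gate(probs):
    """At least one child event occurs (independence assumed)."""
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

# Hypothetical basic events: unstable requirements, key staff turnover.
p_requirements_churn = 0.30
p_key_staff_loss = 0.10
p_schedule_slip = or_gate([p_requirements_churn, p_key_staff_loss])
print(f"P(schedule slip) = {p_schedule_slip:.3f}")   # 0.370
```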
Abstract: The inherent iterative nature of product design and development poses a significant challenge to reducing product design and development (PD) time. In order to shorten time to market, organizations have adopted concurrent development, where multiple specialized tasks and design activities are carried out in parallel. The iterative nature of the work, coupled with the overlap of activities, can result in unpredictable time to completion and significant rework. Many products have missed the time-to-market window due to unanticipated, or rather unplanned, iteration and rework. The iterative and often overlapped processes introduce greater amounts of ambiguity in design and development, where the traditional methods and tools of project management provide less value. In this context, identifying critical metrics to understand iteration probability is an open research area where a significant contribution can be made, given that iteration has been the key driver of cost and schedule risk in PD projects. Two important questions that the proposed study attempts to address are: Can we predict and identify the number of iterations in a product development flow? Can we provide managerial insights for better control over iteration? The proposal introduces the concept of decision points and, using this concept, intends to develop metrics that can provide managerial insights into iteration predictability. By characterizing the product development flow as a network of decision points, the proposed research intends to delve further into iteration probability and attempts to provide more clarity.
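As a hedged illustration of the decision-point idea (not the proposal's actual metrics), the sketch below models a PD flow as a chain of decision points, each triggering rework with some probability, and estimates the expected number of passes by simulation.

```python
# Hedged sketch of the decision-point idea: model the PD flow as a chain
# of decision points, each sending work back (rework) with some
# probability, and estimate the expected number of passes through the flow.
# The rework probabilities are illustrative assumptions, not the paper's data.
import numpy as np

def simulate_iterations(rework_probs, runs=10_000, seed=0):
    rng = np.random.default_rng(seed)
    total = 0
    for _ in range(runs):
        for p in rework_probs:                 # each decision point
            passes = 1
            while rng.random() < p:            # rework loop at this point
                passes += 1
            total += passes
    return total / runs                         # mean passes over the flow

print(simulate_iterations([0.2, 0.5, 0.1]))     # analytic: 1/0.8+1/0.5+1/0.9
```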
Abstract: In this article we explore the application of a formal
proof system to verification problems in cryptography. Cryptographic
properties concerning correctness or security of some cryptographic
algorithms are of great interest. Besides some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we prove by computer. We describe formalized probability distributions (σ-algebras, probability spaces, and conditional probabilities). These are given in the formal language of the formal proof system Isabelle/HOL. Moreover, we give a computer proof of Bayes' formula. Besides, we describe an application of the presented
formalized probability distributions to cryptography. Furthermore,
this article shows that computer proofs of complex cryptographic
functions are possible by presenting an implementation of the Miller-
Rabin primality test that admits formal verification. Our achievements
are a step towards computer verification of cryptographic primitives.
They describe a basis for computer verification in cryptography.
Computer verification can be applied to further problems in cryptographic
research, if the corresponding basic mathematical knowledge
is available in a database.
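For orientation, here is an ordinary (unverified) Python sketch of the probabilistic Miller-Rabin test; the article's contribution is the formally verified Isabelle/HOL counterpart, which this is not.

```python
# A plain Python sketch of the probabilistic Miller-Rabin primality test,
# the algorithm whose verified implementation the article describes.
# This is an ordinary (unverified) version for illustration only.
import random

def miller_rabin(n, rounds=20):
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):             # trial-divide small primes
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:                           # write n - 1 = d * 2**s
        d //= 2
        s += 1
    for _ in range(rounds):                     # each round: error <= 1/4
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                        # composite witness found
    return True                                 # probably prime

print(miller_rabin(2**61 - 1))                  # True: a Mersenne prime
```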
Abstract: Software reliability, defined as the probability of a
software system or application functioning without failure or errors
over a defined period of time, has been an important area of research
for over three decades. Several research efforts aimed at developing
models to improve reliability are currently underway. One of the
most popular approaches to software reliability adopted by some of
these research efforts involves the use of operational profiles to
predict how software applications will be used. Operational profiles
are a quantification of usage patterns for a software application. The
research presented in this paper investigates an innovative multiagent
framework for automatic creation and management of
operational profiles for generic distributed systems after their release
into the market. The architecture of the proposed Operational Profile
MAS (Multi-Agent System) is presented along with detailed
descriptions of the various models arrived at following the analysis
and design phases of the proposed system. The operational profile in
this paper is extended to comprise seven different profiles. Further,
the criticality of operations is defined using a new composite metric
in order to organize the testing process as well as to decrease the time
and cost involved in this process. A prototype implementation of the
proposed MAS is included as proof-of-concept and the framework is
considered as a step towards making distributed systems intelligent
and self-managing.
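As a hedged sketch of what an operational profile is, the fragment below represents one as a probability distribution over operations and samples usage-weighted test operations from it; the operation names and probabilities are invented.

```python
# Hedged sketch of an operational profile as the paper defines it: a
# quantification of usage patterns, i.e. a probability distribution over
# the operations of a system. Names and probabilities are illustrative.
import random

operational_profile = {
    "login": 0.40,
    "search": 0.35,
    "checkout": 0.20,
    "admin_report": 0.05,
}

def sample_operations(profile, n, seed=0):
    """Draw n operations according to the profile (usage-based testing)."""
    rng = random.Random(seed)
    ops, weights = zip(*profile.items())
    return rng.choices(ops, weights=weights, k=n)

print(sample_operations(operational_profile, 10))
```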
Abstract: This paper studies ruin probabilities in two discrete-time
risk models with premiums, claims and rates of interest modelled by
three autoregressive moving average processes. Generalized Lundberg
inequalities for ruin probabilities are derived by using a recursive technique. A numerical example is given to illustrate the applications
of these probability inequalities.
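A minimal simulation sketch of a discrete-time risk model of this general kind is given below; for illustration the premiums, claims, and interest rates are drawn i.i.d. rather than from the paper's autoregressive moving average processes.

```python
# Hedged sketch of a discrete-time risk model of the kind studied here:
# the surplus earns interest, collects premiums, and pays claims each
# period; the ruin probability is estimated by simulation. The ARMA
# structure of the paper is simplified to i.i.d. draws for illustration.
import numpy as np

def ruin_probability(u0, periods=50, runs=20_000, seed=0):
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(runs):
        u = u0
        for _ in range(periods):
            r = rng.normal(0.02, 0.005)         # rate of interest
            premium = 1.0
            claim = rng.exponential(0.95)       # mean claim < premium
            u = u * (1 + r) + premium - claim
            if u < 0:
                ruined += 1
                break
    return ruined / runs

print(ruin_probability(u0=5.0))
```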
Abstract: In this paper a fast motion estimation method for
H.264/AVC named Triplet Search Motion Estimation (TS-ME) is
proposed. Similar to some of the traditional fast motion estimation methods and their improved variants, which restrict the search points to some selected candidates to decrease the computational complexity, the proposed algorithm separates the motion search process into several steps, but with some new features. First, the proposed algorithm tries to locate the real motion area using the proposed triplet patterns instead of some selected search points, to avoid dropping into a local minimum. Then, in the localized motion area, a novel 3-step motion search algorithm is performed. The proposed search patterns are categorized into three rings on the basis of the distance from the search center. These three rings are adaptively selected by referencing the surrounding motion vectors so as to terminate the motion search process early. In addition, computation reduction for sub-pixel motion search is also discussed, considering the appearance probability of the sub-pixel motion vector. The simulation results show that motion estimation speed improves by a factor of up to 38 with the proposed algorithm compared with the H.264/AVC reference software, with negligible picture quality loss.
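For background, the sketch below implements the classic three-step block-matching search that such methods refine; it is the textbook procedure, not the authors' TS-ME, and the synthetic frames are illustrative.

```python
# Sketch of the classic three-step block-matching search that fast motion
# estimators like the one above refine; this is the textbook method, not
# the authors' TS-ME. SAD is used as the matching cost.
import numpy as np

def sad(a, b):
    return np.abs(a.astype(int) - b.astype(int)).sum()

def three_step_search(ref, cur, bx, by, bs=8, step=4):
    """Find the motion vector (dy, dx) of the bs x bs block at (by, bx)."""
    block = cur[by:by + bs, bx:bx + bs]
    mv = (0, 0)
    while step >= 1:
        best = None
        for dy in (-step, 0, step):
            for dx in (-step, 0, step):
                y, x = by + mv[0] + dy, bx + mv[1] + dx
                if 0 <= y <= ref.shape[0] - bs and 0 <= x <= ref.shape[1] - bs:
                    cost = sad(ref[y:y + bs, x:x + bs], block)
                    if best is None or cost < best[0]:
                        best = (cost, (mv[0] + dy, mv[1] + dx))
        mv = best[1]
        step //= 2
    return mv

# Smooth synthetic frame, so the SAD surface has a usable gradient.
y, x = np.mgrid[0:64, 0:64]
ref = (128 + 60 * np.sin(y / 5.0) + 60 * np.cos(x / 7.0)).astype(np.uint8)
cur = np.roll(ref, (2, 3), axis=(0, 1))   # content moves down/right by (2, 3)
print(three_step_search(ref, cur, bx=24, by=24))   # source offset: (-2, -3)
```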
Abstract: This paper deals with an application of content-based image retrieval that extracts color features from natural images stored in an image database by segmenting the images through clustering. We employ a class of nonparametric techniques in which
the data points are regarded as samples from an unknown probability
density. Explicit computation of the density is avoided by using the
mean shift procedure, a robust clustering technique, which does not
require prior knowledge of the number of clusters, and does not
constrain the shape of the clusters. A non-parametric technique for
the recovery of significant image features is presented and
segmentation module is developed using the mean shift algorithm to
segment each image. In these algorithms, the only user-set parameter is the resolution of the analysis, and either gray-level or color images
are accepted as inputs. Extensive experimental results illustrate
excellent performance.
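A minimal sketch of the clustering step is shown below, using scikit-learn's MeanShift on pixel colors; only the bandwidth (the "resolution of the analysis") is set, and the synthetic two-region image is illustrative.

```python
# Hedged sketch of the segmentation idea: mean shift clustering of pixel
# colors, which needs no preset number of clusters; only a bandwidth
# (the "resolution of the analysis") is chosen. Uses scikit-learn.
import numpy as np
from sklearn.cluster import MeanShift

rng = np.random.default_rng(0)
h, w = 40, 40
image = np.zeros((h, w, 3))
image[:, : w // 2] = [0.9, 0.2, 0.2]            # reddish region
image[:, w // 2 :] = [0.2, 0.3, 0.9]            # bluish region
image += rng.normal(0, 0.05, image.shape)       # noise

pixels = image.reshape(-1, 3)                   # color feature space
ms = MeanShift(bandwidth=0.3).fit(pixels)
labels = ms.labels_.reshape(h, w)               # per-pixel segment labels
print("clusters found:", len(ms.cluster_centers_))
```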
Abstract: Based on the Lagrangian for the Gross-Pitaevskii equation as derived by H. Sakaguchi and B. A. Malomed [5], we have derived a double-well model for the nonlinear optical lattice. This model explains the various features of nonlinear optical lattices. Further, from this model we obtain and simulate the probability of tunneling from one well to another, which agrees with experimental results [4].
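As a hedged illustration of the kind of tunneling result involved (the paper's Lagrangian-based model may differ), the standard linear two-mode reduction of a double well gives Rabi-like oscillations:

```latex
% Standard linear two-mode (double-well) reduction, shown only as a
% generic illustration; the paper's Lagrangian-based model may differ.
% Amplitudes a_{1,2} in the two wells, coupling K between the wells:
\begin{aligned}
i\,\dot a_1 &= -K a_2, \qquad i\,\dot a_2 = -K a_1,\\
P_{1\to 2}(t) &= |a_2(t)|^2 = \sin^2(Kt) \quad (a_1(0)=1,\ a_2(0)=0).
\end{aligned}
```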
Abstract: The Connection Admission Control (CAC) problem is formulated in this paper as a discrete-time optimal control problem. The control variables account for the acceptance/rejection of new connections and the forced dropping of in-progress connections. These variables are constrained to meet suitable conditions which account for the QoS requirements (link availability, blocking probability, dropping probability). The performance index evaluates the total throughput. At each discrete time, the problem is solved as an integer-valued linear programming problem. The proposed procedure was successfully tested against suitably simulated data.
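As a hedged sketch of one discrete-time step of such a formulation, the toy integer linear program below accepts or rejects new connections to maximize throughput under a capacity constraint; it uses SciPy's milp, and all numbers are invented.

```python
# Toy sketch of one discrete-time CAC step as an integer linear program
# (SciPy's milp). Decision x[i] = accept (1) / reject (0) for each new
# connection; the bandwidths, capacity, and objective are illustrative.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

throughput = np.array([2.0, 1.0, 3.0, 1.5])      # reward per connection
bandwidth = np.array([3.0, 1.0, 4.0, 2.0])       # capacity used if accepted
capacity = 6.0                                    # free link capacity

# Maximize throughput  ==  minimize -throughput @ x.
res = milp(
    c=-throughput,
    constraints=LinearConstraint(bandwidth.reshape(1, -1), 0.0, capacity),
    integrality=np.ones(4),                       # x integer
    bounds=Bounds(0, 1),                          # x in {0, 1}
)
accept = res.x.round().astype(int)
print("accept mask:", accept, "throughput:", throughput @ accept)
```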
Abstract: This paper presents parametric probability density
models for call holding times (CHTs) in an emergency call center
based on the actual data collected for over a week in the public
Emergency Information Network (EIN) in Mongolia. When the set of
chosen candidates of Gamma distribution family is fitted to the call
holding time data, it is observed that the whole area in the CHT
empirical histogram is underestimated due to spikes of higher
probability and long tails of lower probability in the histogram.
Therefore, we provide a parametric model based on a mixture of lognormal distributions (Gaussian in the log domain) with explicit analytical expressions for the modeling of the CHTs of PSNs. Finally, we show that the CHTs for PSNs are fitted reasonably by a mixture of lognormal distributions via the expectation-maximization algorithm. This result
is significant as it expresses a useful mathematical tool in an explicit
manner of a mixture of lognormal distributions.
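A minimal sketch of the fitting idea follows: a mixture of lognormals in the time domain is a Gaussian mixture in the log domain, so EM (here scikit-learn's GaussianMixture) can be run on log-CHTs. The data are synthetic.

```python
# Hedged sketch of the fitting idea: a mixture of lognormals in the time
# domain is a Gaussian mixture in the log domain, so EM (here via
# scikit-learn's GaussianMixture) can fit log-CHTs. Data are synthetic.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic call holding times: short routine calls + long complex calls.
cht = np.concatenate([
    rng.lognormal(mean=3.0, sigma=0.4, size=700),   # ~20 s calls
    rng.lognormal(mean=5.0, sigma=0.6, size=300),   # ~150 s calls
])

gm = GaussianMixture(n_components=2, random_state=0).fit(np.log(cht)[:, None])
print("weights:", gm.weights_)
print("log-means:", gm.means_.ravel(),
      "log-sigmas:", np.sqrt(gm.covariances_).ravel())
```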
Abstract: The most widely used semiconductor memory types
are the Dynamic Random Access Memory (DRAM) and the Static Random Access Memory (SRAM). Competition among memory
manufacturers drives the need to decrease power consumption and
reduce the probability of read failure. A technology that is relatively
new and has not been explored is the FinFET technology. In this
paper, a single cell Schmitt Trigger Based Static RAM using FinFET
technology is proposed and analyzed. The accuracy of the result is
validated by means of HSPICE simulations with 32nm FinFET
technology and the results are then compared with 6T SRAM using
the same technology.
Abstract: Existing experiences indicate that one of the most
prominent reasons that some ERP implementations fail is related to
selecting an improper ERP package. Among those important factors
resulting in inappropriate ERP selections, one is to ignore preliminary
activities that should be done before the evaluation of ERP packages.
Another factor yielding these unsuitable selections is that organizations usually employ selection processes so prolonged and costly that sometimes the process is never finalized, or the evaluation team performs many key final activities in an incomplete or inaccurate way due to exhaustion, lack of interest, or out-of-date data. In this paper, a systematic approach
that recommends some activities to be done before and after the
main selection phase is introduced for choosing an ERP package. Moreover, the proposed approach utilizes some ideas that accelerate the selection process while reducing the probability of an erroneous final selection.
Abstract: Collateralized Debt Obligations are not as widely used nowadays as they were before the 2007 subprime crisis. Nonetheless, there remains an enthralling challenge to optimize the cash flows
associated with synthetic CDOs. A Gaussian-based model is used
here in which default correlation and unconditional probabilities of
default are highlighted. Then numerous simulations are performed
based on this model for different scenarios in order to evaluate the
associated cash flows given a specific number of defaults at different
periods of time. Cash flows are not solely calculated on a single
bought or sold tranche but rather on a combination of bought and
sold tranches. Under some assumptions, the simplex algorithm gives a way to find the maximum cash flow according to the correlation of defaults and the maturities. The Gaussian model used is not realistic in crisis situations. Moreover, the present system does not handle buying or selling a portion of a tranche, but only the whole tranche. However, the work provides the investor with relevant elements on how to know what and when to buy and sell.
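A hedged sketch of the one-factor Gaussian default model described above is given below; the portfolio size, correlation, and default probability are illustrative, and no tranche cash flows are computed.

```python
# Hedged sketch of the Gaussian one-factor default model described above:
# correlated latent variables decide which names default by the horizon.
# Portfolio size, correlation, and default probability are illustrative.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n_names, n_sims = 125, 20_000
rho = 0.3                                        # default correlation input
p_default = 0.05                                 # unconditional PD
threshold = norm.ppf(p_default)

M = rng.standard_normal((n_sims, 1))             # common (market) factor
Z = rng.standard_normal((n_sims, n_names))       # idiosyncratic factors
X = np.sqrt(rho) * M + np.sqrt(1 - rho) * Z      # latent creditworthiness
defaults = (X < threshold).sum(axis=1)           # defaults per scenario

print("mean defaults:", defaults.mean())         # ~ n_names * p_default
print("P(>=10 defaults):", (defaults >= 10).mean())
```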
Abstract: CDMA cellular networks support soft handover,
which guarantees the continuity of wireless services and enhanced
communication quality. Cellular networks support multimedia
services under varied propagation environmental conditions. In this
paper, we have shown the effect of characteristic parameters of the
cellular environments on the soft handover performance. We
consider path loss exponent, standard deviation of shadow fading and
correlation coefficient of shadow fading as the characteristic
parameters of the radio propagation environment. A very useful statistical measure for characterizing the performance of a mobile radio system is the probability of outage. It is shown through numerical results that the above parameters have a decisive effect on the probability of outage and hence on the overall performance of the soft handover algorithm.
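As a hedged sketch of the outage measure, the fragment below computes the probability that received power falls below a threshold under log-distance path loss with lognormal shadowing; all numbers, including the reference path loss, are illustrative.

```python
# Hedged sketch of the outage-probability measure used above: with
# path-loss exponent n and lognormal shadowing of standard deviation
# sigma (dB), outage is the chance the received power falls below a
# threshold. All numbers are illustrative assumptions.
import numpy as np
from scipy.stats import norm

def outage_probability(pt_dbm, d_m, n, sigma_db, p_min_dbm,
                       d0_m=1.0, pl0_db=30.0):
    """P(received power < p_min) with log-distance path loss + shadowing."""
    mean_rx = pt_dbm - (pl0_db + 10 * n * np.log10(d_m / d0_m))
    return norm.cdf((p_min_dbm - mean_rx) / sigma_db)

# A larger path-loss exponent or heavier shadowing raises the outage chance.
print(outage_probability(pt_dbm=30, d_m=500, n=3.5, sigma_db=8, p_min_dbm=-90))
print(outage_probability(pt_dbm=30, d_m=500, n=4.0, sigma_db=10, p_min_dbm=-90))
```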