Abstract: This paper presents a computational methodology based on matrix operations for a computer-based solution to the problem of performance analysis of software reliability models (SRMs). A set of seven comparison criteria has been formulated to rank the various non-homogeneous Poisson process software reliability models proposed during the past 30 years for estimating software reliability measures such as the number of remaining faults, the software failure rate, and software reliability. Selection of the optimal SRM for use in a particular case has long been an area of interest for researchers in the field of software reliability. Tools and techniques for software reliability model selection found in the literature cannot be used with a high level of confidence because they rely on a limited number of model selection criteria. A real data set from a medium-sized software project, taken from published papers, is used to demonstrate the matrix method. The result of this study is a ranking of SRMs based on the permanent of the criteria matrix formed for each model from the comparison criteria: the software reliability model with the highest permanent value is ranked first, and so on.
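The abstract does not give the criteria matrix itself, but the ranking statistic it names, the matrix permanent, is standard. A minimal sketch in Python (the 2x2 score matrices below are hypothetical placeholders, not data from the paper):

```python
import math
from itertools import permutations

def permanent(matrix):
    """Permanent of an n x n matrix: like the determinant,
    but every permutation term is added with a plus sign."""
    n = len(matrix)
    return sum(
        math.prod(matrix[i][p[i]] for i in range(n))
        for p in permutations(range(n))
    )

# Hypothetical criteria matrices: each entry scores one model on one
# comparison criterion; models are ranked by permanent value.
scores_model_a = [[0.9, 0.8], [0.7, 0.95]]
scores_model_b = [[0.5, 0.6], [0.4, 0.55]]
print(permanent(scores_model_a))  # 0.9*0.95 + 0.8*0.7 = 1.415
print(permanent(scores_model_b))
```

Since the permanent adds all permutation products with positive sign, a model scoring higher on every criterion always receives a larger permanent, which is what makes it usable as a ranking statistic.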
Abstract: In this paper, a new approach based on the extent of friendship between nodes is proposed to make nodes cooperate in an ad hoc environment. The extended DSR protocol is tested under different scenarios by varying the number of malicious nodes and the node moving speed; it is also tested by varying the number of nodes in the simulation. The results indicate that the throughput achieved by the extended DSR is greater than that of the standard DSR, and that the percentage of malicious drops over total drops is lower for the extended DSR than for the standard DSR.
Abstract: Response surface methodology (RSM) is a collection of mathematical and statistical techniques useful for modeling and analyzing problems in which a dependent variable is influenced by several independent variables, the aim being to determine the conditions under which these variables should operate to optimize a production process. RSM estimates a first-order regression model and sets the search direction using the method of maximum/minimum slope up/down (MMS U/D). However, this method selects the step size intuitively, which can affect the efficiency of the RSM. This paper assesses how the step size affects the efficiency of the methodology. Numerical examples are carried out through Monte Carlo experiments, evaluating three response variables: the gain-function efficiency, the distance to the optimum, and the number of iterations. The simulation experiments show that the gain-function efficiency and the distance to the optimum are not affected by the step size, whereas the number of iterations is affected by both the step size and the type of test function used.
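The steepest-ascent step the abstract discusses can be sketched as follows; the fitted coefficients, centre point, and step length below are hypothetical illustrations, not values from the study:

```python
import math

def steepest_ascent_path(x0, coeffs, step, n_steps):
    """Points along the path of steepest ascent of a fitted
    first-order model y = b0 + b1*x1 + ... + bk*xk: move from the
    design centre x0 in the direction of the gradient (b1, ..., bk),
    advancing a fixed step length each iteration."""
    norm = math.sqrt(sum(b * b for b in coeffs))
    direction = [b / norm for b in coeffs]
    path = []
    x = list(x0)
    for _ in range(n_steps):
        x = [xi + step * di for xi, di in zip(x, direction)]
        path.append(list(x))
    return path

# Hypothetical fitted model y = 5 + 3*x1 + 4*x2; unit gradient (0.6, 0.8)
print(steepest_ascent_path([0.0, 0.0], [3.0, 4.0], step=0.5, n_steps=3))
```

The choice of `step` here is exactly the intuitive tuning knob the paper studies: a larger step reaches the vicinity of the optimum in fewer iterations but risks overshooting it.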
Abstract: Loess soils are unsaturated soils that undergo a large volume decrease upon saturation, exhibiting sudden settlement, fracture, and structural cracking with increasing humidity. Given the importance of civil projects, including dams, canals, and structures founded on this type of soil, and the problems it causes, further research and study of loess soils is required. This research studies shear strength parameters using grading tests, Atterberg limits, compaction, direct shear, and consolidation tests, and then examines the effect of cement and lime additives on the stability of loess soils. In the tests, lime and cement are separately added to the soil at different mixing percentages, the stabilized samples are cured for different periods, and the effect of these additives on the shear strength parameters of the soil is studied. Results show that with curing time the collapsible potential is greatly decreased, and that with increasing cement and lime content the maximum dry density decreases while the optimum moisture content increases. In addition, the liquid limit and plasticity index decrease, whereas the plastic limit increases. Direct shear test results reveal an increase in the shear strength of the soil due to increases in the cohesion parameter and the soil friction angle.
Abstract: In this paper a new definition of the adjacency matrix of a simple graph, called the fuzzy adjacency matrix, is presented: its elements are either 0 or of the form 1/n, n ∈ N, and therefore lie in the interval [0, 1]. Some characteristics of this matrix are then presented with related examples. This matrix form carries the complete information of the graph.
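The abstract does not specify how the 1/n entries are assigned; as a purely hypothetical illustration, the sketch below weights each edge by the reciprocal of the larger endpoint degree, which guarantees entries of the required form 1/n in [0, 1]:

```python
def fuzzy_adjacency(n_vertices, edges):
    """Sketch of a fuzzy adjacency matrix for a simple graph.
    The weighting rule used here -- edge (i, j) gets weight
    1/max(deg(i), deg(j)) -- is a hypothetical choice for
    illustration only; any rule producing entries 0 or 1/n
    (n a positive integer) would satisfy the stated definition."""
    deg = [0] * n_vertices
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    a = [[0.0] * n_vertices for _ in range(n_vertices)]
    for i, j in edges:
        w = 1.0 / max(deg[i], deg[j])
        a[i][j] = a[j][i] = w
    return a

# Path graph 0-1-2: the middle vertex has degree 2, so both edges get 1/2
for row in fuzzy_adjacency(3, [(0, 1), (1, 2)]):
    print(row)
```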
Abstract: The present work compares the performance of three turbulence modeling approaches (based on the two-equation k-ε model) in predicting erosive wear in multi-size dense slurry flow through a rotating channel. All three turbulence models include a rotation modification to the production term in the turbulent kinetic-energy equation. The two-phase flow field, obtained numerically using a Galerkin finite element methodology, relates the local flow velocity and concentration to the wear rate via a suitable wear model. The wear models for both the sliding wear and impact wear mechanisms account for particle size dependence. Predicted wear rates from the three turbulence models are compared for a large number of cases spanning operating parameters such as rotation rate, solids concentration, flow rate, and particle size distribution. The root-mean-square error between the FE-generated data and the correlation relating maximum wear rate to the operating parameters is found to be less than 2.5% for all three models.
Abstract: Within the domain of Systems Engineering the need
to perform property aggregation to understand, analyze and manage
complex systems is unequivocal. This can be seen in numerous
domains such as capability analysis, Mission Essential Competencies
(MEC) and Critical Design Features (CDF). Furthermore, the need
to consider uncertainty propagation, as well as the sensitivity of
related properties, within such analysis is equally important when
determining a set of critical properties within such a system.
This paper describes this property breakdown in a number of
domains within Systems Engineering and, within the area of CDFs,
emphasizes the importance of uncertainty analysis. As part of this, a
section of the paper describes possible techniques that may be used for uncertainty propagation; in conclusion, an example is described that utilizes one of the techniques for property and uncertainty aggregation within an aircraft system to aid the determination of Critical Design Features.
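One widely used technique for the uncertainty propagation discussed above is Monte Carlo sampling: draw each input property from its distribution, push the samples through the aggregation function, and summarize the spread of the aggregated property. A minimal sketch; the aggregate function and input distributions below are invented for illustration, not taken from the paper:

```python
import random
import statistics

def propagate(aggregate, samplers, n=5000, seed=1):
    """Monte Carlo uncertainty propagation: sample every input
    property, evaluate the aggregation function on each draw, and
    return the mean and standard deviation of the output."""
    rng = random.Random(seed)
    outputs = [aggregate(*(s(rng) for s in samplers)) for _ in range(n)]
    return statistics.mean(outputs), statistics.stdev(outputs)

# Hypothetical aggregate property: lift/drag ratio times a fuel
# factor, with normally distributed input uncertainties.
mean, sd = propagate(
    lambda ld, fuel: ld * fuel,
    [lambda r: r.gauss(15.0, 0.5), lambda r: r.gauss(2.0, 0.1)],
)
print(round(mean, 1), round(sd, 2))
```

The output standard deviation is a direct sensitivity indicator: widening one input distribution and re-running shows how much of the aggregate uncertainty that property is responsible for.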
Abstract: The aim of this study was to analyze the ethnopsychological content of “Aitys", a process of creative competition in Kazakh traditional folklore, by means of Transactional Analysis (the three ego states: Parent, Adult and Child). “Aitys" is a source of Kazakh national self-consciousness and a form of Kazakh oral folk creativity. A comparative psychological analysis of classical and modern “aityses" is carried out. It is empirically shown that victory in “Aitys" is associated with the position of the “Adult" ego state.
Abstract: In this paper, a new probability density function (pdf) is proposed to model the statistics of wavelet coefficients, and a simple Kalman filter is derived from the new pdf using Bayesian estimation theory. Specifically, we decompose the speckled image into wavelet subbands, apply the Kalman filter to the high-frequency subbands, and reconstruct a despeckled image from the modified detail coefficients. Experimental results demonstrate that our method compares favorably to several other despeckling methods on test synthetic aperture radar (SAR) images.
Abstract: Whole genome duplication (WGD) increased the number of chromosomes in the yeast Saccharomyces cerevisiae from 8 to 16. Although the number of chromosomes in this organism's genome has been retained since the WGD, chromosomal rearrangement events have created an evolutionary distance between the current genome and its ancestor. Studies of eukaryotic genomes under evolutionary-based approaches have shown that the rearrangement distance is an approximable problem. In the case of S. cerevisiae, we describe how the rearrangement distance can be obtained using a dedoubled adjacency graph drawn for 55 large paired chromosomal regions originating from the WGD. We then provide a program, extracted from a C program database, to draw a dedoubled genome adjacency graph for S. cerevisiae. From a bioinformatic perspective, using the duplicated blocks of the current S. cerevisiae genome, we infer that the genomic organization of eukaryotes has the potential to provide valuable detailed information about their ancestral genome.
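The dedoubled adjacency graph itself is beyond a short sketch, but a classic ingredient of rearrangement distance, the breakpoint count of a block order relative to a reference, can be illustrated simply (the block order below is hypothetical, not one of the 55 paired regions):

```python
def breakpoints(perm):
    """Number of breakpoints of a permutation of blocks 1..n relative
    to the identity order: positions where consecutive blocks are not
    consecutive integers. The count is a standard lower-bound
    ingredient for genome rearrangement distance."""
    extended = [0] + list(perm) + [len(perm) + 1]
    return sum(
        1 for a, b in zip(extended, extended[1:]) if abs(b - a) != 1
    )

# Hypothetical block order 1,3,2,4: the jumps 1->3 and 2->4 are the
# two breakpoints (3->2 is still an adjacency, just reversed).
print(breakpoints([1, 3, 2, 4]))  # prints 2
```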
Abstract: A steady two-dimensional magnetohydrodynamic flow and heat transfer over a stretching vertical sheet influenced by radiation and porosity is studied. The governing boundary layer partial differential equations are reduced to a system of ordinary differential equations using a similarity transformation. The system is solved numerically by a finite difference scheme known as the Keller-box method for several values of the parameters, namely the radiation parameter N, magnetic parameter M, buoyancy parameter λ, Prandtl number Pr, and permeability parameter K. The effects of these parameters on the heat transfer characteristics are analyzed and discussed. It is found that both the skin friction coefficient and the local Nusselt number decrease as the magnetic parameter M and permeability parameter K increase. The heat transfer rate at the surface decreases as the radiation parameter increases.
Abstract: Optimization of filter banks based on knowledge of the input statistics has long been of interest. Finite impulse response (FIR) compaction filters are used in the design of optimal signal-adapted orthonormal FIR filter banks. In this paper we discuss three different approaches for the design of interpolated finite impulse response (IFIR) compaction filters. In the first method, the magnitude squared response satisfies the Nyquist constraint approximately; in the second and third methods the Nyquist constraint is satisfied exactly. These methods yield FIR compaction filters whose response is comparable with that of existing methods, while IFIR filters enjoy significant savings in the number of multipliers and can be implemented efficiently. Since the eigenfilter approach is used here, the method is less complex. The design of IFIR filters in the least-squares sense is also presented.
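The eigenfilter idea underlying compaction filter design can be sketched as follows: ignoring the Nyquist constraint the paper imposes, the unit-norm FIR filter that maximizes the output variance hᵀRh is the principal eigenvector of the input autocorrelation matrix R. The AR(1)-style autocorrelation below is a hypothetical input, not from the paper:

```python
def principal_eigvec(R, iters=500):
    """Power iteration for the dominant eigenvector of a symmetric
    matrix with positive entries (the Perron vector)."""
    n = len(R)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(R[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

def compaction_filter(autocorr, length):
    """Eigenfilter sketch: build the Toeplitz autocorrelation matrix
    R[i][j] = r(|i-j|) and return its principal eigenvector, the
    unit-norm filter maximizing output variance (Nyquist constraint
    omitted in this simplified illustration)."""
    R = [[autocorr[abs(i - j)] for j in range(length)]
         for i in range(length)]
    return principal_eigvec(R)

# Hypothetical AR(1)-like input statistics: r(k) = 0.9^k
h = compaction_filter([0.9 ** k for k in range(4)], length=4)
print([round(x, 3) for x in h])
```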
Abstract: This case study was conducted to show the effect of a goat milking method called half-day milking on milk production and the growth of kids. Data were collected by interviewing farmers and investigating goat production in communal goat housing from June 2008 to May 2009. The interviews were conducted to collect data about goat management. Observations were conducted on 10 goats, selected on the basis of uniformity of age, number of kids born per goat, and the milking method in practice. The animals were divided into two groups: full three-month nursing, and half-day milked goats (in the latter group the kids were separated from the does during the night before milking and then allowed to suckle during the day). The results showed that the communal goat housing held 138 goats and that 25% of the farmers milked their goats. The implementation of half-day milking increased milk production significantly (P
Abstract: This paper performs a second law analysis of thermodynamics on the laminar film condensation of pure saturated vapor flowing in the direction of gravity over an ellipsoid with variable wall temperature. The analysis provides an understanding of how the geometric parameter (ellipticity) and the non-isothermal wall temperature variation amplitude A affect entropy generation during the film-wise condensation heat transfer process. To identify the irreversibilities involved in this condensation process, we derive an expression for the entropy generation number in terms of the ellipticity and A. The results indicate that entropy generation increases with ellipticity. Furthermore, the irreversibility due to finite-temperature-difference heat transfer dominates over that due to condensate film flow friction, and the local entropy generation rate decreases with increasing A in the upper half of the ellipsoid, while it increases with A around the rear lower half of the ellipsoid.
Abstract: The crossed cube is one of the most notable variations of the hypercube, and some of its properties are superior to those of the hypercube; for example, the diameter of the crossed cube is almost half that of the hypercube. In this paper, we focus on the problem of embedding a Hamiltonian cycle through an arbitrary given edge of the crossed cube. We give a necessary and sufficient condition for determining whether a given permutation of n elements over Zn generates a Hamiltonian cycle pattern of the crossed cube. Moreover, we obtain a lower bound on the number of distinct Hamiltonian cycles passing through a given edge of an n-dimensional crossed cube. Our work extends some recently obtained results.
Abstract: Protective relays are components of the protection system in a power system that provide the decision-making element for correct protection and fault clearing operations. Failure of protection devices may reduce the integrity and reliability of the power system protection and thereby degrade the overall performance of the power system. Hence it is imperative for power utilities to assess the reliability of protective relays to assure that they will perform their intended function without failure. This paper discusses the application of reliability analysis using a statistical method called Life Data Analysis in the Transmission Division of Tenaga Nasional Berhad (TNB), a government-linked power utility company in Malaysia, to assess and evaluate the reliability of numerical overcurrent protective relays from two different manufacturers.
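A common Life Data Analysis workflow is fitting a two-parameter Weibull distribution to observed failure times; one textbook route is median rank regression, sketched below. The failure ages are invented for illustration and are not TNB data:

```python
import math

def weibull_mrr(times):
    """Life Data Analysis sketch: fit a two-parameter Weibull to
    complete failure times by median rank regression. Plotting
    positions use Benard's approximation F_i = (i - 0.3)/(n + 0.4);
    a straight-line fit of ln(-ln(1 - F)) against ln(t) yields the
    shape parameter beta and scale parameter eta."""
    ts = sorted(times)
    n = len(ts)
    xs = [math.log(t) for t in ts]
    ys = [math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4)))
          for i in range(1, n + 1)]
    mx = sum(xs) / n
    my = sum(ys) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    eta = math.exp(mx - my / beta)  # from intercept = -beta * ln(eta)
    return beta, eta

# Hypothetical relay failure ages in years
beta, eta = weibull_mrr([2.1, 3.5, 4.2, 5.0, 6.3, 7.7])
print(round(beta, 2), round(eta, 2))
```

A shape parameter beta above 1 indicates wear-out failures, below 1 infant mortality; comparing fitted (beta, eta) pairs is one way to contrast relays from the two manufacturers.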
Abstract: In this paper we discuss the behaviour of the longitudinal modes of a magnetized collisionless plasma subjected to an external electromagnetic field. We apply a semiclassical formalism in which the electrons are treated quantum mechanically while the electromagnetic field is treated classically. We calculate the dielectric function in order to obtain the modes, and find that, unlike the Bernstein modes, the presence of radiation induces oscillations around the cyclotron harmonics, which are smoothed out as the energy stored in the radiation field becomes small compared with the thermal energy of the electrons. We analyze the influence of the number of photons involved in the electronic transitions between Landau levels, and how parameters such as the external field strengths, plasma density, and temperature affect the dispersion relation.
Abstract: In this paper we present a soft timing phase estimation (STPE) method for wireless mobile receivers operating at low signal-to-noise ratios (SNRs). Discrete polyphase matched (DPM) filters, a log-maximum a posteriori probability (MAP) algorithm, and/or a soft-output Viterbi algorithm (SOVA) are combined to derive a new timing recovery (TR) scheme. We apply this scheme to a wireless cellular communication system model that comprises a raised cosine filter (RCF) and a bit-interleaved turbo-coded multi-level modulation (BITMM) scheme; the channel is assumed to be memoryless. Furthermore, no clock signals are transmitted to the receiver, contrary to classical data-aided (DA) models. This new model ensures that both the bandwidth and the power of the communication system are conserved; however, the computational complexity of ideal turbo synchronization is increased by 50%. Several simulation tests of bit error rate (BER) and block error rate (BLER) versus low SNR reveal that the proposed iterative soft timing recovery (ISTR) scheme outperforms conventional schemes.
Abstract: Bio-chips are used for experiments on genes and contain various kinds of information, such as genes and samples. Two-dimensional bio-chips, in which one axis represents genes and the other represents samples, are widely used these days. Instead of experimenting with real genes, which costs a great deal of money and time, bio-chips are used for biological experiments. Extracting data from bio-chips with high accuracy, and finding patterns or useful information in such data, is very important. Bio-chip analysis systems extract data from various kinds of bio-chips and mine the data to obtain useful information. One of the commonly used mining methods is classification, and the algorithm used to classify the data varies depending on the data types, numerical characteristics, and so on. Considering that bio-chip data are extremely large, an algorithm that imitates an ecosystem, such as the ant algorithm, is suitable for classification. This paper focuses on finding classification rules from bio-chip data using the Ant Colony algorithm, which imitates an ecosystem. The developed system takes into consideration the accuracy of the discovered rules when applying them to bio-chip data in order to predict the classes.
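Ant-colony rule discovery of the kind described above builds classification rules term by term, guided by pheromone levels. A heavily simplified, hypothetical sketch (single-term rules, precision as rule quality, invented toy data), not the paper's actual system:

```python
import random

def ant_rule_mining(data, labels, n_ants=200, seed=0):
    """Ant-Miner-style sketch: each ant builds a one-term rule
    "IF attr == value THEN class", choosing the term with probability
    proportional to its pheromone; the pheromone on a term is then
    reinforced by the precision of the rule it produced."""
    rng = random.Random(seed)
    terms = sorted({(a, row[a]) for row in data for a in row})
    pher = {t: 1.0 for t in terms}
    best, best_q = None, -1.0
    for _ in range(n_ants):
        # roulette-wheel selection of a term by pheromone level
        r = rng.uniform(0, sum(pher.values()))
        acc = 0.0
        chosen = terms[-1]
        for t in terms:
            acc += pher[t]
            if r <= acc:
                chosen = t
                break
        attr, val = chosen
        covered = [lab for row, lab in zip(data, labels)
                   if row[attr] == val]
        if not covered:
            continue
        majority = max(set(covered), key=covered.count)
        q = covered.count(majority) / len(covered)  # rule precision
        pher[(attr, val)] += q                      # reinforcement
        if q > best_q:
            best, best_q = (attr, val, majority), q
    return best, best_q

# Hypothetical toy "bio-chip" samples: expression level -> class
data = [{"gene1": "high"}, {"gene1": "high"},
        {"gene1": "low"}, {"gene1": "low"}]
labels = ["disease", "disease", "normal", "normal"]
print(ant_rule_mining(data, labels))
```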
Abstract: A trend in the agent community and in enterprises is the shift from closed to open architectures composed of a large number of autonomous agents. One implication is that the interface agent framework is becoming more important in multi-agent systems (MAS), so that systems constructed for different application domains can share a common understanding of human-computer interface (HCI) methods, as well as human-agent and agent-agent interfaces. However, the interface agent framework usually receives less attention than other aspects of MAS. In this paper, we propose an interface web agent framework based on our earlier project called WAF and a distributed HCI template. A group of new functionalities and their implications are discussed, such as web agent presentation, off-line agent reference, and a reconfigurable activation map of agents. Enabling techniques and current standards (e.g. an existing ontological framework) are also suggested and illustrated by examples from our own implementation in WAF.