Abstract: Despite the recent surge of research in control of
worm propagation, currently, there is no effective defense system
against such cyber attacks. We first design a distributed detection
architecture called Detection via Distributed Blackholes (DDBH).
Our novel detection mechanism could be implemented via virtual
honeypots or honeynets. Simulation results show that a worm can be
detected with virtual honeypots on only 3% of the nodes. Moreover,
the worm is detected when less than 1.5% of the nodes are infected.
We then develop two control strategies: (1) optimal dynamic
traffic-blocking, for which we determine the condition that
guarantees the minimum number of removed nodes when the worm is
contained, and (2) predictive dynamic traffic-blocking, a realistic
deployment of
the optimal strategy on scale-free graphs. The predictive dynamic
traffic-blocking, coupled with the DDBH, ensures that more than
40% of the network is unaffected by the propagation at the time
when the worm is contained.
Abstract: A secure electronic payment system is presented in this
paper. The system is designed to be secure for clients such as
customers and shop owners. Its security architecture is built around
the RC5 encryption/decryption algorithm. This eliminates the fraud
that occurs today with stolen credit card numbers. The symmetric-key
cryptosystem RC5 can protect conventional transaction data such as
account numbers, amounts and other information. The process is
carried out electronically by an RC5 encryption/decryption program
written in Microsoft Visual Basic 6.0. There is no danger of any
data sent within the system being intercepted and replaced. The
alternative is to use the existing network and to encrypt all data
transmissions. The system with encryption is acceptably secure, but
the level of encryption has to be stepped up as computing power
increases. To secure the system, the communication between modules
is encrypted using the symmetric-key cryptosystem RC5. When the user
first enters the system, it uses a simple authentication mechanism
based on user name, password, user ID, user type and cipher for
identification; this is the most common method of authentication in
most computer systems.
Abstract: We present a method for fast volume rendering using
graphics hardware (GPU). To our knowledge, it is the first
implementation of the Shear-Warp algorithm on the GPU. Our
GPU-based method provides real-time frame rates and outperforms
the CPU-based implementation. When the number of slices is not
sufficient, we add in-between slices computed by interpolation,
which improves the quality of the rendered images. We have also
implemented the ray marching algorithm on the GPU. The results
generated by the three algorithms (CPU-based and GPU-based Shear-
Warp, GPU-based Ray Marching) for two test models show that the
ray marching algorithm outperforms the shear-warp methods in
terms of speed-up and image quality.
Abstract: The aim of this article is to explain how attack features can be extracted from packets, how feature vectors can be built, and how these vectors can then be fed to any analysis stage. For the analysis, the work deploys a feedforward back-propagation neural network acting as a misuse intrusion detection system, using ten types of attacks as examples for training and testing the network. It explains how the packets are analyzed to extract features. The work shows how selecting the right features, building correct vectors, and correctly choosing the training method and the number of nodes in the hidden layer of the neural network affect the accuracy of the system. In addition, the work shows how to obtain optimal weight values and use them to initialize the artificial neural network.
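As a minimal sketch of the approach this abstract outlines, the following trains a tiny feedforward network with backpropagation to label packet-derived feature vectors as attack or normal. The feature set, the synthetic data, and the network size are invented for illustration; the paper's actual features, attack types, and training configuration are not reproduced here. The hidden-node count is the tuning knob the abstract says affects accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented toy dataset: 4 features per packet (e.g. duration, bytes,
# flags, rate), two well-separated classes.
X_attack = rng.normal(loc=1.5, scale=0.3, size=(50, 4))
X_normal = rng.normal(loc=0.0, scale=0.3, size=(50, 4))
X = np.vstack([X_attack, X_normal])
y = np.concatenate([np.ones(50), np.zeros(50)]).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer; n_hidden is the parameter whose choice the
# abstract says influences accuracy.
n_hidden = 6
W1 = rng.normal(scale=0.5, size=(4, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
b2 = np.zeros(1)

def forward(X):
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

def mse(p, y):
    return float(np.mean((p - y) ** 2))

_, p0 = forward(X)
loss_before = mse(p0, y)

lr = 0.5
for _ in range(500):
    h, p = forward(X)
    # Backpropagation of the mean-squared error through both layers.
    d_out = (p - y) * p * (1 - p)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_hid / len(X)
    b1 -= lr * d_hid.mean(axis=0)

_, p1 = forward(X)
loss_after = mse(p1, y)
```

In the paper's setting the learned weights would then serve to initialize the detector, as the abstract describes for its optimal-weight initialization step.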
Abstract: Primary and secondary data from the Bauchi abattoir were utilized to determine the relative contributions of different livestock species to meat supply in Bauchi Metropolis. Daily livestock slaughter figures for five months (June – October 2011) indicated that more goats (64.0) were slaughtered each day than either sheep (47.3) or cattle (41.3) (P
Abstract: The current era is characterised by strengthened interactions among financial markets and increased capital mobility globally. Within this frame, we examine the effects the international financial integration process has on the European bond markets. We perform a comparative study of the interactions of the European and international bond markets and exploit cointegration analysis results on the elimination of stochastic trends and the decomposition of the underlying long-run equilibria and short-run causal relations. Our investigation provides evidence on the relation between the European integration process and that of globalisation, viewed through the bond markets sector. Additionally, the structural formulation applied offers significant implications of the findings. All in all, our analysis offers a number of answers to crucial queries about the European bond markets integration process.
Abstract: Flows over a harmonically oscillating NACA 0012
airfoil are simulated here using a two-dimensional, unsteady,
incompressible Navier-Stokes solver. Both pure-plunging and
pitching-plunging combined oscillations are considered at a Reynolds
number of 5000. Special attention is paid to the vortex shedding and
interaction mechanism of the motions. For all the simulations
presented here, the reduced frequency (k) is fixed at a value of 2.5
and plunging amplitude (h) is selected to be in the range of 0.2-0.5.
The simulation results show that the interaction mechanism between
the leading and trailing edge vortices has a decisive effect on the
values of the resulting thrust and propulsive efficiency.
Abstract: This paper proposes a smart design strategy for a sequential detector to reliably detect the primary user's signal, especially in fast fading environments. We study the computation of the log-likelihood ratio for coping with fast changing received signal and noise sample variances, which are considered random variables. First, we analyze the detectability of the conventional generalized log-likelihood ratio (GLLR) scheme when considering the fast changing statistics of unknown parameters caused by fast fading effects. Secondly, we propose an efficient sensing algorithm for performing the sequential probability ratio test in a robust and efficient manner when the channel statistics are unknown. Finally, the proposed scheme is compared with the conventional method through simulation, in terms of the average number of samples required to reach a detection decision.
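To make the sequential-test idea concrete, here is a textbook Wald sequential probability ratio test (SPRT), not the authors' GLLR-based scheme: choose between H0 (noise only, mean 0) and H1 (signal present, mean 1) for Gaussian samples of known variance by accumulating the per-sample log-likelihood ratio until a threshold is crossed. The means, variance, and error rates are illustrative assumptions.

```python
import math

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # Gaussian log-likelihood ratio log f1(x) - log f0(x).
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "H1", n   # signal declared after n samples
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

# A steady unit-amplitude signal adds 0.5 to the LLR per sample and
# crosses the upper threshold log(99) at the 10th sample.
decision, n_used = sprt([1.0] * 20)  # -> ("H1", 10)
```

The number of samples consumed before a decision, `n_used`, is exactly the quantity the abstract compares across schemes.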
Abstract: Transesterification of candlenut (Aleurites moluccana)
oil with methanol using potassium hydroxide as catalyst was
studied. The objective of the present investigation was to produce
the methyl ester for use as biodiesel. The operating variables
employed were the methanol to oil molar ratio (3:1 – 9:1), catalyst
concentration (0.50 – 1.5%) and temperature (303 – 343 K). An oil
volume of 150 mL and a reaction time of 75 min were fixed as common
parameters in all the experiments. The concentration of methyl ester
was evaluated by mass balance of the free glycerol formed, which was
analyzed using periodic acid. The optimal triglyceride conversion
was attained using a methanol to oil ratio of 6:1 and a potassium
hydroxide concentration of 1%, at room temperature. The methyl ester
formed was characterized by its density, viscosity, cloud and pour
points. The biodiesel had properties similar to those of
diesel oil, except for the viscosity, which was higher.
Abstract: In Peer-to-Peer service networks, where peers offer any kind of publicly available services or applications, intuitive navigation through all services in the network becomes more difficult as the number of services increases. In this article, a concept is discussed that enables users to intuitively browse and use large-scale P2P service networks. The concept extends the idea of creating virtual 3D environments solely based on Peer-to-Peer technologies. Aside from browsing, users shall have the possibility to emphasize services of interest using their own semantic criteria. The appearance of the virtual world shall intuitively reflect network properties that may be of interest to the user. Additionally, the concept comprises options for load and traffic balancing. In this article, the requirements concerning the underlying infrastructure and the graphical user interface are defined. First impressions of the appearance of future systems are presented, and the next steps towards a prototypical implementation are discussed.
Abstract: In this paper, we apply and compare two generalized estimating equation approaches to the analysis of car breakdown data in Mauritius. The number of breakdowns experienced by a machine is a highly under-dispersed count random variable, and its value can be attributed to factors related to the mechanical input and output of that machine. Analyzing such under-dispersed count observations as a function of explanatory factors has been a challenging problem. In this paper, we aim at estimating the effects of various factors on the number of breakdowns experienced by a passenger car based on a study performed in Mauritius over a year. We remark that the number of passenger car breakdowns is highly under-dispersed. These data are therefore modelled and analyzed using a Com-Poisson regression model. We use two types of quasi-likelihood estimation approaches to estimate the parameters of the model: the marginal and joint generalized quasi-likelihood estimating equation approaches. The under-dispersion parameter is estimated to be around 2.14, justifying the appropriateness of the Com-Poisson distribution in modelling the under-dispersed count responses recorded in this study.
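For readers unfamiliar with the distribution the abstract relies on, the following sketches the Com-Poisson (Conway-Maxwell-Poisson) pmf, P(Y=y) proportional to lambda^y / (y!)^nu, and verifies numerically that a dispersion parameter nu > 1 gives under-dispersion (variance below the mean). The paper estimates nu around 2.14; the lambda value here is invented for illustration.

```python
import math

def cmp_pmf(lam, nu, y_max=100):
    # Work in log space so (y!)^nu never overflows a float.
    logs = [y * math.log(lam) - nu * math.lgamma(y + 1)
            for y in range(y_max + 1)]
    mx = max(logs)
    w = [math.exp(v - mx) for v in logs]
    z = sum(w)                       # normalizing constant Z(lambda, nu)
    return [wi / z for wi in w]

def mean_var(pmf):
    m = sum(y * p for y, p in enumerate(pmf))
    v = sum((y - m) ** 2 * p for y, p in enumerate(pmf))
    return m, v

m, v = mean_var(cmp_pmf(lam=4.0, nu=2.14))
# With nu > 1 the variance falls below the mean: under-dispersion,
# matching the breakdown counts described in the abstract.
```

Setting nu = 1 recovers the ordinary Poisson distribution, where mean and variance coincide.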
Abstract: A large number of chemical, bio-chemical and pollution-control processes use heterogeneous fixed-bed reactors. The use of finite hollow cylindrical catalyst pellets can enhance conversion levels in such reactors. The absence of the pellet core can significantly lower the diffusional resistance associated with the solid phase. This leads to better utilization of the catalytic material, which is reflected in higher values of the effectiveness factor and, ultimately, in an enhanced conversion level in the reactor. It is however important to develop a rigorous heterogeneous model for the reactor incorporating the two-dimensional nature of the solid phase owing to the presence of the finite hollow cylindrical catalyst pellet. Presently, heterogeneous models reported in the literature invariably employ one-dimensional solid-phase models meant for spherical catalyst pellets. The objective of the paper is to present a rigorous model of fixed-bed reactors containing finite hollow cylindrical catalyst pellets. The reaction kinetics considered here is the widely used Michaelis–Menten kinetics for liquid-phase bio-chemical reactions. The reaction parameters used are for the enzymatic degradation of urea. Results indicate that increasing the height to diameter ratio helps to improve the conversion level. On the other hand, decreasing the thickness is apparently not as effective. This can however be explained in terms of the higher void fraction of the bed, which causes a smaller amount of the solid phase to be packed in the fixed-bed bio-chemical reactor.
Abstract: The trial was conducted at Omidiyeh, a city located 170
kilometers from the Iranian city of Ahvaz. The main factor in this
project comprised four levels: a control (without hormone) and
application of the hormone at the seed, vegetative and flowering
stages, respectively. The sub-plots comprised three local varieties
of vetch. The study investigated the effects of Auxin applied at
different vegetative and reproductive times on the different
varieties of vetch. The experiment was laid out in plots in a
randomized complete block design with four replications. In order to
study the effects of the hormone Auxin applied at the growth stages
(seed, vegetative and flowering) against the control (no Auxin) on
three local varieties of vetch, plant height, number of pods per
plant, number of seeds per pod, seeds per plant, grain weight, grain
yield, plant dry weight and protein content were measured. Among the
vetch varieties, differences in plant height, number of pods per
plant, seeds per plant, grain weight, grain yield, plant dry weight
and protein content were significant at the 1% level, and the number
of seeds per pod per plant at the 5% level. The interactions for
seeds per plant, grain yield and protein content were significant at
the 1% level, and for the number of seeds per pod and seed weight at
the 5% level, while the interactions for plant height and plant dry
weight showed no significant difference.
Abstract: The traditional Failure Mode and Effects Analysis
(FMEA) uses Risk Priority Number (RPN) to evaluate the risk level
of a component or process. The RPN index is determined by
calculating the product of severity, occurrence and detection indexes.
The most critically debated disadvantage of this approach is that
various sets of these three indexes may produce an identical value of
RPN. This research paper seeks to address the drawbacks in
traditional FMEA and to propose a new approach to overcome these
shortcomings. The Risk Priority Code (RPC) is used to prioritize
failure modes, when two or more failure modes have the same RPN.
A new method is proposed to prioritize failure modes, when there is a
disagreement in ranking scale for severity, occurrence and detection.
An Analysis of Variance (ANOVA) is used to compare means of
RPN values. The SPSS (Statistical Package for the Social Sciences)
statistical analysis package is used to analyze the data. The results
presented are based on two case studies. It is found that the
proposed new methodology resolves the limitations of the traditional
FMEA approach.
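The drawback the abstract cites is easy to see in code. Below, the traditional RPN is computed as the product of the severity, occurrence, and detection indexes (each conventionally rated 1-10); the two example failure modes are invented to show how very different risk profiles can collide on one RPN value, which is what motivates the proposed Risk Priority Code tie-breaking.

```python
def rpn(severity, occurrence, detection):
    # Traditional FMEA: Risk Priority Number is the plain product
    # of the three indexes.
    return severity * occurrence * detection

# A highly severe but moderately detectable failure mode...
mode_a = rpn(severity=9, occurrence=2, detection=4)   # -> 72
# ...and a mild but hard-to-detect one receive the same priority.
mode_b = rpn(severity=3, occurrence=3, detection=8)   # -> 72
```

Because the product discards which index contributed what, a secondary criterion such as the paper's RPC is needed to rank failure modes that tie.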
Abstract: Resins are used in nuclear power plants for water
ultrapurification. Two approaches are considered in this work:
column experiments and simulations. A software package called
OPTIPUR was developed, tested and used. The approach simulates
one-dimensional reactive transport in a porous medium, with
convective-dispersive transport between particles and diffusive
transport within the boundary layer around the particles. The
transfer limitation in the boundary layer is characterized by the
mass transfer coefficient (MTC). The influences on the MTC were
measured experimentally: varying the inlet concentration does not
influence the MTC, whereas the Darcy velocity does. This is
consistent with results obtained using the correlation of
Dwivedi & Upadhyay. With the MTC, and knowing the number of exchange
sites and the relative affinity, OPTIPUR can simulate the column
outlet concentration versus time. The duration of use of the resins
can then be predicted under the conditions of a binary exchange.
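As a hedged sketch of why the Darcy velocity, but not the inlet concentration, drives the MTC, the following evaluates a commonly quoted form of the Dwivedi & Upadhyay packed-bed correlation, eps * jD = 0.765/Re^0.82 + 0.365/Re^0.386 (constants quoted from memory; verify against the original paper). All property values below are invented for illustration, and the concentration never appears: only velocity, particle size, porosity, and fluid properties enter.

```python
def mtc(u, dp=6e-4, eps=0.4, rho=1000.0, mu=1e-3, D=2e-9):
    """Mass transfer coefficient k [m/s] from superficial (Darcy)
    velocity u [m/s]; dp particle diameter, eps bed porosity,
    rho/mu water density/viscosity, D solute diffusivity."""
    re = rho * u * dp / mu                 # particle Reynolds number
    sc = mu / (rho * D)                    # Schmidt number
    # Dwivedi & Upadhyay form for the Chilton-Colburn jD factor.
    jd = (0.765 / re ** 0.82 + 0.365 / re ** 0.386) / eps
    return jd * u * sc ** (-2.0 / 3.0)     # k = jD * u * Sc^(-2/3)

k_slow = mtc(u=1e-3)
k_fast = mtc(u=4e-3)
# k rises with the Darcy velocity, consistent with the column tests.
```

Because both Reynolds-number exponents are below one, jD falls more slowly than u grows, so the product jD * u, and hence the MTC, increases with velocity.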
Abstract: In this paper, the estimation accuracy of multiple moving
talker tracking using a microphone array is improved. The tracking
is achieved by an adaptive method integrating two algorithms, namely
the PAST (Projection Approximation Subspace Tracking) algorithm and
the IPLS (Interior Point Least Square) algorithm. When a talker
begins to speak again after a silent period, an appropriate feasible
region for the evaluation function of the IPLS algorithm might not
be set, and the tracking then fails due to incorrect updating.
Therefore, if an increase in the number of active talkers is
detected, the feasible region must be reset. A low-cost realization
of this resetting is required for high-speed tracking, and a
high-accuracy realization is desired for precise tracking. In this
paper, directions roughly estimated using the delay-and-sum array
method are used for the resetting. Several results of experiments
performed in an actual room environment show the effectiveness of
the proposed method.
Abstract: Software reliability prediction offers a great opportunity to measure the software failure rate at any point throughout system test. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software. In this article, we focus on a software reliability model, assuming that there is a time redundancy whose value (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the given mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be regarded as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of the transmission of an instruction sequence consisting of a random number of basic blocks. We consider the system software unreliable; the time between adjacent failures follows an exponential distribution.
Abstract: The world economic crises and budget constraints
have caused authorities, especially those in developing countries, to
rationalize water quality monitoring activities. Rationalization
consists of reducing the number of monitoring sites, the number of
samples, and/or the number of water quality variables measured. The
reduction in water quality variables is usually based on correlation. If
two variables exhibit high correlation, it is an indication that some of
the information produced may be redundant. Consequently, one
variable can be discontinued, and the other continues to be measured.
Later, the ordinary least squares (OLS) regression technique is
employed to reconstitute information about the discontinued variable
by using the continuously measured one as an explanatory variable. In
this paper, two record extension techniques are employed to
reconstitute information about discontinued water quality variables,
the OLS and the Line of Organic Correlation (LOC). An empirical
experiment is conducted using water quality records from the Nile
Delta water quality monitoring network in Egypt. The record
extension techniques are compared for their ability to predict
different statistical parameters of the discontinued variables. Results
show that the OLS is better at estimating individual water quality
records. However, results indicate an underestimation of the variance
in the extended records. The LOC technique is superior in preserving
characteristics of the entire distribution and avoids underestimation
of the variance. It is concluded from this study that the OLS can be
used for the substitution of missing values, while LOC is preferable
for inferring statements about the probability distribution.
Abstract: In the 3D-wavelet video coding framework, temporal
filtering is done along the trajectory of motion using Motion
Compensated Temporal Filtering (MCTF). Hence, a computationally
efficient motion estimation technique is needed for MCTF. In this
paper, a predictive technique is proposed in order to reduce the
computational complexity of the MCTF framework, by exploiting
the high correlation among the frames in a Group Of Pictures (GOP).
The proposed technique applies coarse and fine searches of any fast
block based motion estimation, only to the first pair of frames in a
GOP. The generated motion vectors are supplied to the next
consecutive frames, even to subsequent temporal levels and only fine
search is carried out around those predicted motion vectors. Hence
coarse search is skipped for all the motion estimation in a GOP
except for the first pair of frames. The technique has been tested for
different fast block based motion estimation algorithms over different
standard test sequences using MC-EZBC, a state-of-the-art scalable
video coder. The simulation result reveals substantial reduction (i.e.
20.75% to 38.24%) in the number of search points during motion
estimation, without compromising the quality of the reconstructed
video compared to non-predictive techniques. Since the motion
vectors of all pairs of frames in a GOP except the first pair lie
within ±1 of the motion vectors of the previous pair, the number of
bits required for motion vectors is also reduced by 50%.
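A back-of-the-envelope count illustrates where the saving in search points comes from: the full coarse-plus-fine search runs only on the first frame pair of a GOP, and every later pair performs only a fine refinement around the inherited vector. The pattern sizes below are assumptions for illustration, not figures from the paper (which reports 20.75% to 38.24% on real coders and sequences).

```python
COARSE_POINTS = 9   # assumed: one step of a 9-point coarse pattern
FINE_POINTS = 9     # assumed: 3x3 refinement (±1) around a candidate

def search_points(n_pairs, predictive):
    if not predictive:
        # Every pair pays for both coarse and fine search.
        return n_pairs * (COARSE_POINTS + FINE_POINTS)
    # Predictive scheme: first pair coarse+fine, the rest fine-only
    # around the motion vector inherited from the previous pair.
    return (COARSE_POINTS + FINE_POINTS) + (n_pairs - 1) * FINE_POINTS

full = search_points(n_pairs=8, predictive=False)   # -> 144
pred = search_points(n_pairs=8, predictive=True)    # -> 81
saving = 100 * (full - pred) / full                 # -> 43.75 (%)
```

The exact percentage depends on the fast-search algorithm and GOP size, which is why the paper's measured range differs from this toy count.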
Abstract: This paper aims to develop an algorithm for a finite
capacity material requirement planning (FCMRP) system for a
multi-stage assembly flow shop. The developed FCMRP system has two
main stages. The first stage is to allocate operations to the first and
second priority work centers and also determine the sequence of the
operations on each work center. The second stage is to determine the
optimal start time of each operation by using a linear programming
model. Real data from a factory is used to analyze and evaluate the
effectiveness of the proposed FCMRP system and also to guarantee a
practical solution to the user. There are five performance measures,
namely, the total tardiness, the number of tardy orders, the total
earliness, the number of early orders, and the average flow-time. The
proposed FCMRP system offers an adjustable solution which is a
compromised solution among the conflicting performance measures.
The user can adjust the weight of each performance measure to
obtain the desired performance. The results show that the combination
of FCMRP NP3 and EDD outperforms other combinations
in terms of the overall performance index. The calculation time for
the proposed FCMRP system is about 10 minutes, which is practical for
the planners of the factory.
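The adjustable compromise the abstract describes can be sketched as a weighted index over the five performance measures, where the planner's weights decide the trade-off. The measure values, weights, and the plain weighted-sum normalization below are invented for illustration; the paper's actual index may be normalized differently.

```python
MEASURES = ["total_tardiness", "tardy_orders", "total_earliness",
            "early_orders", "avg_flow_time"]

def performance_index(solution, weights):
    # Lower is better for every measure, so a weighted sum suffices:
    # the schedule with the smallest index wins.
    return sum(weights[m] * solution[m] for m in MEASURES)

# Two hypothetical schedules with conflicting strengths.
solution_a = {"total_tardiness": 120, "tardy_orders": 4,
              "total_earliness": 30, "early_orders": 2,
              "avg_flow_time": 55}
solution_b = {"total_tardiness": 80, "tardy_orders": 6,
              "total_earliness": 60, "early_orders": 5,
              "avg_flow_time": 50}

# A planner who cares mostly about tardiness weights it heavily.
w_tardy = {"total_tardiness": 1.0, "tardy_orders": 5.0,
           "total_earliness": 0.1, "early_orders": 0.1,
           "avg_flow_time": 0.2}
idx_a = performance_index(solution_a, w_tardy)
idx_b = performance_index(solution_b, w_tardy)
```

Changing the weight vector shifts which schedule is preferred, which is exactly the user-adjustable behavior the abstract claims for the FCMRP system.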