Abstract: Increasing detection rates and reducing false-positive
rates are important problems for Intrusion Detection Systems (IDS).
Although preventative techniques such as access control and
authentication attempt to prevent intruders, these can fail, and as a
second line of defence, intrusion detection has been introduced. Rare
events occur very infrequently, and their detection is a common
problem in many domains. In this paper we
propose an intrusion detection method that combines Rough set and
Fuzzy Clustering. Rough set theory is used to reduce the amount of
data and eliminate redundancy. Fuzzy c-means clustering allows objects to
belong to several clusters simultaneously, with different degrees of
membership. Our approach allows us to recognize not only known
attacks but also to detect suspicious activity that may be the result of
a new, unknown attack. The experimental results on Knowledge
Discovery and Data Mining-(KDDCup 1999) Dataset show that the
method is efficient and practical for intrusion detection systems.
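The membership idea behind fuzzy c-means can be sketched in a few lines. The following is a generic illustration of the clustering stage only, not the paper's implementation; the rough-set reduction step and the KDD Cup features are not modeled, and the parameters (fuzzifier m, iteration count) are illustrative:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=50, seed=0):
    """Minimal fuzzy c-means: every point gets a degree of membership
    in every cluster, rather than a hard assignment."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Random initial membership matrix U; each row sums to 1.
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        Um = U ** m
        # Cluster centers are membership-weighted means of the data.
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Distance from every point to every center (small epsilon
        # avoids division by zero when a point coincides with a center).
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # Standard FCM membership update: closer centers get higher degrees.
        U = 1.0 / (d ** (2.0 / (m - 1.0)))
        U /= U.sum(axis=1, keepdims=True)
    return centers, U
```

In an IDS setting, rows of `X` would be (reduced) connection-feature vectors, and a record whose memberships are spread evenly across clusters is a natural candidate for "suspicious, possibly unknown attack".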
Abstract: The purpose of this paper is to elucidate the unsteady flow behavior of a moving plug in a convergent-divergent variable-thrust nozzle. The compressible axisymmetric Navier-Stokes equations are used to study this physical phenomenon. Different velocities are set for the plug to investigate the effect of plug movement on flow unsteadiness. Variations of mass flow rate and thrust are compared under two conditions: first, the plug is placed at different positions and the flow is simulated until it reaches the steady state (quasi-steady simulation); second, the plug is moved with an assigned velocity and the flow simulation is coupled with the plug movement (unsteady simulation). If the plug speed is high enough and its movement time scale is of the same order as the flow time scale, the variation of mass flow rate and thrust level versus plug position shows a significant discrepancy between the quasi-steady and unsteady conditions. This phenomenon should be considered, especially from the response-time viewpoint, in thruster design.
Abstract: In the past, many uneconomic solutions for the limitation
and interruption of short-circuit currents in low-power applications
have been introduced, especially polymer switches based on the
positive temperature coefficient of resistance (PTCR) concept.
However, there are many limitations in the active material, which
consists of conductive fillers. This paper presents a significantly
improved and simplified approach that replaces the existing current
limiters with faster switching elements. Its elegance lies in the
remarkable simplicity and low-cost processes of producing the device
using polyaniline (PANI) doped with methane-sulfonic acid (MSA).
Samples characterized as lying in the metallic and critical regimes of
metal insulator transition have been studied by means of electrical
performance in the voltage range from 1V to 5 V under different
environmental conditions. Moisture is shown to increase the
resistivity and also to improve the current-limiting performance.
Additionally, the device has also been studied for electrical resistivity
in the temperature range from 77 K to 300 K. The temperature dependence of
the electrical conductivity gives evidence for a transport mechanism
based on variable range hopping in three dimensions.
Abstract: This paper applies fuzzy AHP to evaluate the service
quality of online auctions. Service quality is a composition of various
criteria, among which many intangible attributes are difficult to
measure. This characteristic creates obstacles for respondents when
replying to the survey. To overcome this problem, we introduce
fuzzy set theory into the measurement of performance and use AHP
to obtain the criteria. We found that the dimension of service quality
of greatest concern is Transaction Safety Mechanism and the least is
Charge Item. Other criteria, such as information security, accuracy
and information, are also vital.
Abstract: In the present paper some recommendations for the
use of the software package "Mathematica" in a basic numerical analysis
course are presented. The methods which are covered in the course
include solution of systems of linear equations, nonlinear equations
and systems of nonlinear equations, numerical integration,
interpolation and solution of ordinary differential equations. A set of
individual assignments developed for the course covering all the
topics is discussed in detail.
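As an illustration of the kind of method such a course covers, Newton's method for a single nonlinear equation fits in a few lines. This generic sketch is shown in Python rather than Mathematica, and the target equation and tolerance are illustrative, not taken from the course assignments:

```python
def newton(f, df, x0, tol=1e-10, max_iter=50):
    """Newton's method for f(x) = 0: iterate x <- x - f(x)/df(x)
    until the step size falls below the tolerance."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: solve x^2 - 2 = 0 starting from x0 = 1.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
```

The same scaffold (iterate, check a stopping criterion, return the approximation) underlies most of the assignment topics listed above, from nonlinear systems to ODE solvers.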
Abstract: Modular multiplication is the basic operation
in most public key cryptosystems, such as RSA, DSA, ECC,
and DH key exchange. Unfortunately, very large operands
(on the order of 1024 or 2048 bits) must be used to provide
sufficient security strength. The use of such big numbers
dramatically slows down the whole cipher system, especially
when running on embedded processors.
So far, customized hardware accelerators - developed on
FPGAs or ASICs - have been the best choice for accelerating
modular multiplication in embedded environments. On the
other hand, many algorithms have been developed to speed
up such operations. Examples are the Montgomery modular
multiplication and the interleaved modular multiplication
algorithms. Combining both customized hardware with
an efficient algorithm is expected to provide a much faster
cipher system.
This paper introduces an enhanced architecture for computing
the modular multiplication of two large numbers X
and Y modulo a given modulus M. The proposed design is
compared with three previous architectures based on
carry-save adders and look-up tables. The look-up tables must
be loaded with a set of pre-computed values. Our proposed
architecture uses the same carry-save addition, but replaces
both look up tables and pre-computations with an enhanced
version of sign detection techniques. The proposed architecture
supports higher frequencies than other architectures.
It also has a better overall absolute time for a single operation.
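The interleaved algorithm named above can be sketched in software. This is a generic bit-serial reference model, not the paper's carry-save/sign-detection hardware architecture; it assumes Y < M so that at most two subtractions per step keep the partial product reduced:

```python
def interleaved_modmul(X, Y, M):
    """Interleaved modular multiplication: scan the bits of X from the
    most significant bit down, doubling the partial product and
    conditionally adding Y, reducing modulo M at every step so that
    intermediate values stay bounded (the property hardware exploits)."""
    P = 0
    for i in reversed(range(X.bit_length())):
        P <<= 1                  # shift: P = 2 * P
        if (X >> i) & 1:
            P += Y               # conditional add of the multiplicand
        # After shift-and-add, P < 3M (given Y < M and P < M before),
        # so at most two subtractions restore P < M.
        if P >= M:
            P -= M
        if P >= M:
            P -= M
    return P
```

In the hardware designs compared in the paper, the additions are carry-save and the `P >= M` decision is the expensive part; replacing look-up tables with sign detection is precisely about making that comparison cheap.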
Abstract: This paper discusses a curriculum approach that
emphasizes the practical portions of teaching network security
subjects in information and communication technology courses. As
we are well aware, the need to use a practice and application oriented
approach in education is paramount. Research on active learning and
cooperative groups has shown that students grasp more and have a
greater tendency towards obtaining and realizing soft skills such as
leadership, communication and teamwork, as opposed to the more
traditional theory- and exam-based teaching and learning. While this
teaching and learning paradigm is relatively new in Malaysia, it has
been practiced widely in the West. This paper examines an
approach whereby students learning wireless security are divided
into small, manageable groups, each split into two teams: a
black-hat team and a white-hat team. The former
tries to find and expose vulnerabilities in a wireless network, while
the latter tries its best to prevent such attacks on its wireless
network using hardware, software, design and the enforcement of
security policy. This paper will try to show that the approach
taken, together with the use of relevant and up-to-date software and
hardware in a suitable environment, will expose students
to a more fruitful outcome in terms of their understanding of
concepts and theories and their motivation to learn.
Abstract: In this paper the concept of Q-fuzzification of ideals of Γ-semigroups has been introduced and some important properties have been investigated. A characterization of regular Γ-semigroups in terms of Q-fuzzy ideals has been obtained. Operator semigroups of a Γ-semigroup have been put to use by obtaining various relationships between the Q-fuzzy ideals of a Γ-semigroup and those of its operator semigroups.
Abstract: Most commercial gluten-free products are
nutritionally inferior to their gluten-containing
counterparts, as manufacturers most often use refined flours and
starches, so people on a gluten-free diet may have a low
intake of fibre. Foxtail millet flour and copra meal are
gluten-free and have high fibre and protein contents. The formulation
of fibre-rich gluten-free cookies was optimized by response surface
methodology, with the proportion of foxtail millet
(Setaria italica) flour in the mixed flour, fat content and guar gum
content as independent process variables. Sugar, sodium chloride,
sodium bicarbonate and water were added in fixed proportions of
60, 1.0, 0.4 and 20% of mixed flour weight, respectively. The
optimum formulation for maximum spread ratio, fibre content,
surface L-value and overall acceptability and minimum breaking
strength was 80% foxtail millet flour in the mixed flour, 42.8%
fat content and 0.05% guar gum.
Abstract: The behavior of Radial Basis Function (RBF) networks greatly depends on how the center points of the basis functions are selected. In this work we investigate, for this purpose, the use of instance reduction techniques originally developed to reduce the storage requirements of instance-based learners. Five instance reduction techniques were used to determine the set of center points, and RBF networks were trained using these sets of centers. The performance of the RBF networks is studied in terms of classification accuracy and training time. The results obtained were compared with two radial basis function networks: RBF networks that use all instances of the training set as center points (RBF-ALL) and Probabilistic Neural Networks (PNN). The former achieves high classification accuracy and the latter requires less training time. The results showed that RBF networks trained using sets of centers located by noise-filtering techniques (ALLKNN and ENN), rather than pure reduction techniques, produce the best results in terms of classification accuracy; these networks require less training time than RBF-ALL and achieve higher classification accuracy than PNN. Thus, using ALLKNN and ENN to select center points gives a better combination of classification accuracy and training time. Our experiments also show that using the reduced sets to train the networks is beneficial, especially in the presence of noise in the original training sets.
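The filter-then-train pipeline described above can be sketched generically. This is an illustration, not the paper's implementation: it uses ENN as the noise filter, the filtered instances directly as Gaussian centers, and least-squares output weights; the parameters `k` and `gamma` are illustrative choices:

```python
import numpy as np

def enn_filter(X, y, k=3):
    """Edited Nearest Neighbour (ENN): drop every instance that is
    misclassified by a majority vote of its k nearest neighbours.
    This removes noisy/borderline points rather than compressing data."""
    keep = []
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                       # exclude the point itself
        nn = np.argsort(d)[:k]
        votes = np.bincount(y[nn], minlength=y.max() + 1)
        if votes.argmax() == y[i]:
            keep.append(i)
    return X[keep], y[keep]

def train_rbf(X, y, centers, gamma=1.0):
    """Fit RBF output weights by least squares over Gaussian activations."""
    Phi = np.exp(-gamma * np.linalg.norm(X[:, None] - centers[None, :], axis=2) ** 2)
    Y = np.eye(y.max() + 1)[y]              # one-hot class targets
    W, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    return W

def predict_rbf(X, centers, W, gamma=1.0):
    Phi = np.exp(-gamma * np.linalg.norm(X[:, None] - centers[None, :], axis=2) ** 2)
    return (Phi @ W).argmax(axis=1)
```

Using the ENN-reduced set as the center set gives a smaller hidden layer than RBF-ALL while discarding exactly the noisy points that would otherwise become misleading centers.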
Abstract: The purpose of this paper is to demonstrate the ability
of a genetic programming (GP) algorithm to evolve a team of data
classification models. The GP algorithm used in this work is
"multigene" in nature, i.e. there are multiple tree structures (genes)
that are used to represent team members. Each team member assigns
a data sample to one of a fixed set of output classes. A majority vote,
determined using the mode (highest occurrence) of classes predicted
by the individual genes, is used to determine the final class
prediction. The algorithm is tested on a binary classification problem.
For the case study investigated, compact classification models are
obtained with comparable accuracy to alternative approaches.
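The mode-based team vote described above is simple to state precisely. A minimal generic sketch (the gene predictions here are made-up inputs, not results from the paper):

```python
from collections import Counter

def majority_vote(gene_predictions):
    """Combine per-gene class predictions into the team's prediction:
    for each sample, take the mode (most frequent class) across genes."""
    n_samples = len(gene_predictions[0])
    final = []
    for j in range(n_samples):
        votes = Counter(row[j] for row in gene_predictions)
        final.append(votes.most_common(1)[0][0])
    return final

# Three genes each voting on four samples (binary classes).
genes = [[0, 1, 1, 0],
         [0, 1, 0, 0],
         [1, 1, 1, 0]]
print(majority_vote(genes))
```

With an odd number of genes and two classes, the vote can never tie, which is one practical reason to evolve teams of odd size for binary problems.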
Abstract: In films, visual effects have played the role of
expressing realities more realistically or depicting imagined worlds
as if they were real. Such images are immediated images representing
realism, and the logic of immediation for the reality of images has
been perceived as dominant in visual effects. For immediation to
have an identity as immediation, there must be the opposite concept,
hypermediation.
In the mid-2000s, hypermediated images became established as a code
of mass culture in Asia. Thus, among Asian films highly popular in
those days, this study selected five displaying hypermediated images
(two Korean, two Japanese, and one Thai) and examined the semiotic
meanings of such images using Roland Barthes's directional and
implicated meaning analysis and Metz's paradigmatic analysis
method, focusing on how hypermediated images work in the general
context of the films, how they are associated with spaces, and what
meanings they try to carry.
Abstract: Business rules are widely used within the services
sector. They provide consistency and allow relatively unskilled staff
to process complex transactions correctly. But there are many
examples where the rules themselves have an impact on the costs and
profits of an organisation. Financial services, transport and human
services are areas where the rules themselves can impact the bottom
line in a predictable way. If this is the case, how can we find that set
of rules that maximise profit, performance or customer service, or
any other key performance indicators? The manufacturing, energy
and process industries have embraced mathematical optimisation
techniques to improve efficiency, increase production and so on. This
paper explores several real world (but simplified) problems in the
services sector and shows how business rules can be optimised. It
also examines the similarities and differences between the service
and other sectors, and how optimisation techniques could be used to
deliver similar benefits.
Abstract: By taking advantage of the computer's processing power, an unlimited number of variations and parameters, both spatial and environmental, can be provided while following the same set of rules and constraints. This paper focuses on using the tools of parametric urbanism to move towards a more environmentally responsive and sustainable urban morphology. It presents an understanding of the Parametric Urban Comfort Envelope (PUCE) as an interactive computational urban assessment model. In addition, it investigates the applicability of this model for generating an optimized urban form for Borg El Arab city (a new Egyptian community) with respect to human comfort values, especially wind and solar envelopes. Finally, this paper utilizes the application outcomes, both visual and numerical, to extend the designer's capabilities by decreasing the burden of controlling and manipulating geometry and increasing the designer's awareness of the various potentials of using parametric tools to create relationships that generate multiple geometric alternatives.
Abstract: Graph decompositions are vital in the study of combinatorial design theory. Given two graphs G and H, an H-decomposition of G is a partition of the edge set of G into disjoint isomorphic copies of H. An n-sun is a cycle Cn with an edge terminating in a vertex of degree one attached to each vertex. In this paper we have proved that the complete graph of order 2n, K2n, can be decomposed into n-2 n-suns, a Hamilton cycle and a perfect matching when n is even; for the odd case, the decomposition is n-1 n-suns and a perfect matching. For an odd-order complete graph K2n+1, delete the star subgraph K1,2n and the resultant graph K2n is decomposed as in the even-order case. The method of building n-suns uses Walecki's construction for the Hamilton decomposition of complete graphs. A spanning tree decomposition of even-order complete graphs is also discussed using the labeling scheme of the n-sun decomposition. A complete bipartite graph Kn,n can be decomposed into n/2 n-suns when n/2 is even. When n/2 is odd, Kn,n can be decomposed into (n-2)/2 n-suns and a Hamilton cycle.
Abstract: E-mail has become an important means of electronic
communication, but the viability of its usage is marred by Unsolicited
Bulk e-mail (UBE) messages. UBE comes in many types, such as
pornographic, virus-infected and 'cry-for-help' messages, as well
as fake and fraudulent offers for jobs, winnings and medicines. UBE
poses technical and socio-economic challenges to the usage of e-mail.
To meet this challenge and combat this menace, we need to
understand UBE. Towards this end, the current paper presents a
content-based textual analysis of nearly 3000 winnings-announcing
UBE. Technically, this is an application of Text Parsing and
Tokenization of an unstructured textual document, and we approach
it using Bag Of Words (BOW) and Vector Space Document Model
techniques. We have attempted to identify the most frequently
occurring lexis in the winnings-announcing UBE documents. The
analysis of the top 100 such lexis is also presented. We exhibit the
relationship between the occurrence of a word from the identified
lexis-set in a given UBE and the probability that the given UBE is
one announcing fake winnings. To the best of our knowledge and our
survey of the related literature, this is the first formal attempt at
identifying the most frequently occurring lexis in winnings-announcing
UBE through textual analysis. Finally, this is a sincere
attempt to raise alertness against, and mitigate the threat of,
such luring but fake UBE.
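The bag-of-words frequency analysis described above reduces to tokenizing each message and counting terms across the corpus. A minimal generic sketch (the tokenizer regex and the two sample messages are illustrative, not the paper's corpus):

```python
import re
from collections import Counter

def top_lexis(documents, n=5):
    """Tokenize each document into a bag of words and count term
    frequencies over the whole corpus, returning the n most frequent
    tokens -- the 'most frequently occurring lexis'."""
    counts = Counter()
    for doc in documents:
        tokens = re.findall(r"[a-z']+", doc.lower())  # simple word tokenizer
        counts.update(tokens)
    return counts.most_common(n)

# Two made-up winnings-announcing messages as a toy corpus.
ube = [
    "Congratulations! You have won the lottery. Claim your prize now.",
    "You won a cash prize. Send your details to claim the winnings.",
]
print(top_lexis(ube, 3))
```

In a real pipeline, stop-word removal and document-frequency weighting (the vector space model mentioned above) would follow this counting step.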
Abstract: This paper describes a methodology for remote
performance monitoring of retail refrigeration systems. The proposed
framework starts with monitoring of the whole refrigeration circuit
which allows detecting deviations from expected behavior caused by
various faults and degradations. The subsequent diagnostics methods
drill down deeper in the equipment hierarchy to more specifically
determine root causes. An important feature of the proposed concept
is that it does not require any additional sensors, and thus, the
performance monitoring solution can be deployed at a low
installation cost. Moreover, only a minimum of contextual
information is required, which also substantially reduces time and
cost of the deployment process.
Abstract: Estimating the time and cost of work completion in a
project, and following them up during execution, contributes to the
success or failure of a project and is very important for the project
management team. Delivering on time and within the budgeted cost
requires managing and controlling the project well. To deal with the
complex task of controlling and modifying the baseline project
schedule during execution, earned value management systems have
been set up and widely used to measure and communicate the real
physical progress of a project. However, earned value management
often fails to predict the total duration of the project. In this paper,
data mining techniques are used to predict the total project duration
in terms of the Time Estimate At Completion, EAC(t). For this
purpose, we used a project with 90 activities that was updated day
by day. The regular indexes from the literature were then computed
and the Earned Duration Method was applied to calculate the time
estimate at completion; these were set as input data for prediction,
and the major parameters among them were identified using the
Clem software. By using data mining, the parameters that affect
EAC(t) and the relationships between them can be extracted, which
is very useful for managing a project with minimum delay risk. As
we show, this could be a simple, safe and applicable method for
predicting the completion time of a project during execution.
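A time-based EAC(t) forecast of the kind fed into the prediction above can be sketched with the closely related earned-schedule formulation (not necessarily the paper's exact Earned Duration Method); the planned-value curve, the performance factor SPI(t) and all numbers below are illustrative assumptions, not the paper's project data:

```python
def earned_schedule(pv_curve, ev):
    """Earned schedule ES: the time at which the planned-value curve
    first reached the current earned value, with linear interpolation
    between reporting periods."""
    for t in range(1, len(pv_curve)):
        if pv_curve[t] >= ev:
            prev = pv_curve[t - 1]
            return (t - 1) + (ev - prev) / (pv_curve[t] - prev)
    return float(len(pv_curve) - 1)

def eac_t(pv_curve, ev, actual_duration, planned_duration, pf=None):
    """Time Estimate At Completion: EAC(t) = AD + (PD - ES) / PF.
    With PF = SPI(t) = ES / AD this is the usual earned-schedule
    duration forecast."""
    es = earned_schedule(pv_curve, ev)
    if pf is None:
        pf = es / actual_duration   # schedule performance index SPI(t)
    return actual_duration + (planned_duration - es) / pf

# Planned value per period for a 4-period plan; EV = 30 after 2 periods.
pv = [0, 20, 40, 70, 100]
print(round(eac_t(pv, ev=30, actual_duration=2, planned_duration=4), 2))
```

Indexes like ES, SPI(t) and the resulting EAC(t), recomputed at each daily update, are exactly the kind of features a data-mining model can relate to the project's eventual duration.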
Abstract: Response surface methodology (RSM) is a very
efficient tool for providing good practical insight into developing
new processes and optimizing them. This methodology can help
engineers to build a mathematical model that represents the behavior
of the system as a convincing function of the process parameters.
In this paper, the sequential nature of RSM is surveyed for process
engineers, and its relationship to design of experiments (DOE), regression
analysis and robust design is reviewed. The proposed four-step procedure in
two different phases can help the system analyst to resolve the parameter
design problem involving responses. To check the accuracy of the
designed model, residual analysis and the prediction error sum of squares
(PRESS) are described.
It is believed that the proposed procedure can resolve a
complex parameter design problem with one or more responses. It can be
applied in areas where there are large data sets and a number of
responses are to be optimized simultaneously. In addition, the proposed
procedure is relatively simple and can be implemented easily using
ready-made standard statistical packages.
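The PRESS statistic mentioned above can be computed without refitting the model n times, using the standard hat-matrix identity for leave-one-out residuals. This generic ordinary-least-squares sketch assumes a full-rank design matrix X; it is an illustration of the statistic, not the paper's procedure:

```python
import numpy as np

def press(X, y):
    """Prediction error sum of squares (PRESS) for a linear model
    y ~ X b.  The leave-one-out residual equals e_i / (1 - h_ii),
    where e are the ordinary residuals and h_ii the hat-matrix
    diagonal, so no model needs to be refitted n times."""
    H = X @ np.linalg.pinv(X.T @ X) @ X.T   # hat (projection) matrix
    e = y - H @ y                            # ordinary residuals
    h = np.diag(H)
    return float(np.sum((e / (1.0 - h)) ** 2))
```

A small PRESS relative to the residual sum of squares indicates that the fitted response surface predicts new observations about as well as it fits the design points, which is the accuracy check the abstract refers to.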
Abstract: Earth reinforcing techniques have become useful and economical for solving problems related to difficult grounds and providing satisfactory foundation performance. In this context, this paper uses a radial basis function neural network (RBFNN) for predicting the bearing pressure of a strip footing on a reinforced granular bed overlying weak soil. The inputs for the neural network models included plate width, thickness of the granular bed, number of layers of reinforcement, settlement ratio, water content, dry density, cohesion and angle of friction. The results indicated that the RBFNN model exhibited more than 84% prediction accuracy, thereby demonstrating its applicability to a geotechnical problem.