Abstract: A high-performance computer includes a fast processor and many megabytes of memory. During data processing, huge amounts of information are shuffled between the memory and the processor. Because of its small size and high speed, the cache has become a common feature of high-performance computers. Enhancing cache performance has proved essential to speeding up cache-based computers. Most enhancement approaches can be classified as either software-based or hardware-controlled. Cache performance is quantified in terms of the hit ratio or miss ratio. In this paper, we optimize cache performance by enhancing the cache hit ratio. Optimum cache performance is obtained by modifying the cache hardware so that mismatched line tags are rejected quickly from the hit-or-miss comparison stage, and thus a low hit time for the wanted line in the cache is achieved. In the proposed technique, which we call Even-Odd Tabulation (EOT), the cache lines coming from main memory into the cache are classified into two types, even line tags and odd line tags, depending on their least significant bit (LSB). The EOT technique exploits this division to reject mismatched line tags in much less time than is spent by the main comparator in the cache, giving an optimum hit time for the wanted cache line. Simulation results show the high performance of the EOT technique compared with the familiar FAM mapping technique.
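The parity-based rejection idea behind EOT can be sketched in a few lines of software (a simplified model for illustration only; the function name `eot_lookup` and the toy tag values are invented, not taken from the paper):

```python
def eot_lookup(stored_tags, wanted_tag):
    """Toy model of Even-Odd Tabulation (EOT) tag matching.

    Stored tags are pre-sorted into an 'even' and an 'odd' bank by
    their least significant bit; a lookup runs the (slow) full
    comparison only against the bank whose parity matches the wanted
    tag, rejecting every tag in the other bank immediately.
    """
    banks = {0: [], 1: []}
    for tag in stored_tags:
        banks[tag & 1].append(tag)          # classify by LSB parity
    candidates = banks[wanted_tag & 1]      # the other bank is rejected at once
    comparisons = len(candidates)           # work left for the main comparator
    hit = wanted_tag in candidates
    return hit, comparisons

# With a balanced mix of parities, roughly half the tags are rejected
# before they ever reach the full comparison stage.
hit, cmps = eot_lookup([0b1010, 0b0111, 0b1100, 0b0001], 0b0111)
```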
Abstract: The job shop scheduling problem (JSSP) is well known as one of the most difficult combinatorial optimization problems. This paper presents a hybrid genetic algorithm for the JSSP with the objective of minimizing makespan. The efficiency of the genetic algorithm is enhanced by integrating it with a local search method. The chromosome representation of the problem is based on operations. Schedules are constructed using a procedure that generates full active schedules. In each generation, a local search heuristic based on Nowicki and Smutnicki's neighborhood is applied to improve the solutions. The approach is tested on a set of standard instances taken from the literature and compared with other approaches. The computational results validate the effectiveness of the proposed algorithm.
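The operation-based chromosome representation can be illustrated with a small decoder (a simplified semi-active decoder, not the full active-schedule procedure used in the paper; the two-job instance data are invented):

```python
def decode(chromosome, proc):
    """Decode an operation-based chromosome into a schedule.

    `chromosome` is a permutation with repetition: the k-th occurrence
    of job j stands for the k-th operation of job j.  `proc[j]` is the
    list of (machine, duration) pairs for job j, in processing order.
    Returns the makespan of the resulting (semi-active) schedule.
    """
    next_op = {j: 0 for j in proc}      # next operation index per job
    job_ready = {j: 0 for j in proc}    # finish time of each job's last op
    mach_ready = {}                     # finish time per machine
    for j in chromosome:
        machine, dur = proc[j][next_op[j]]
        start = max(job_ready[j], mach_ready.get(machine, 0))
        job_ready[j] = mach_ready[machine] = start + dur
        next_op[j] += 1
    return max(job_ready.values())

# Two jobs, two machines: job 0 = M0(3) then M1(2); job 1 = M1(2) then M0(4).
proc = {0: [(0, 3), (1, 2)], 1: [(1, 2), (0, 4)]}
makespan = decode([0, 1, 1, 0], proc)
```

A genetic algorithm then searches over such chromosomes, with the decoder supplying the fitness (makespan) of each individual.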
Abstract: Graph-based image segmentation techniques are considered to be among the most efficient segmentation techniques and are mainly used as time- and space-efficient methods for real-time applications. However, there is a need to focus on improving the quality of the segmented images obtained from the earlier graph-based methods. This paper proposes an improvement to the graph-based image segmentation methods already described in the literature. We contribute to the existing method by proposing the use of a weighted Euclidean distance to calculate the edge weight, which is the key element in building the graph. We also propose a slight modification of the segmentation method already described in the literature, which results in the selection of more prominent edges in the graph. The experimental results show the improvement in segmentation quality compared to the existing methods, with a slight compromise in efficiency.
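The proposed edge weight, a weighted Euclidean distance between the feature vectors of neighbouring pixels, can be sketched as follows (the RGB feature layout and the weight values are illustrative assumptions; the paper's exact weighting may differ):

```python
import math

def edge_weight(p, q, w):
    """Weighted Euclidean distance between the feature vectors of two
    pixels p and q (here, (R, G, B) intensities), used as the edge
    weight when building the segmentation graph."""
    return math.sqrt(sum(wi * (pi - qi) ** 2 for wi, pi, qi in zip(w, p, q)))

# With equal weights this reduces to the ordinary Euclidean distance.
d = edge_weight((255, 0, 0), (250, 10, 0), (1.0, 1.0, 1.0))
```

Tuning the weight vector lets some channels influence the graph more than others, which is the lever the proposed method uses to select more prominent edges.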
Abstract: In this article we discuss improving the multi-class classification problem using a multilayer perceptron. The considered approach consists in breaking down the n-class problem into two-class subproblems. Each two-class subproblem is trained independently; in the test phase, the vector to be classified is confronted with all the two-class models, and the elected class is the strongest one, the one that loses no competition against the other classes. Recognition rates obtained with the multi-class approach by two-class decomposition are clearly better than those obtained by the simple multi-class approach.
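The two-class decomposition and the "loses no competition" election rule can be sketched as follows (the pairwise decisions would come from the trained two-class perceptrons; here they are supplied by a stand-in scoring function, and all names are illustrative):

```python
from itertools import combinations

def elect(classes, pairwise_winner):
    """One-vs-one election: the elected class is the one that wins
    every pairwise competition it takes part in.

    `pairwise_winner(a, b)` returns the class (a or b) preferred by
    the two-class model trained to separate a from b.  Returns the
    undefeated class, or None if every class loses at least once.
    """
    defeated = set()
    for a, b in combinations(classes, 2):
        loser = b if pairwise_winner(a, b) == a else a
        defeated.add(loser)
    undefeated = [c for c in classes if c not in defeated]
    return undefeated[0] if undefeated else None

# Toy scores standing in for the two-class perceptron outputs.
scores = {"cat": 0.9, "dog": 0.4, "bird": 0.2}
winner = elect(list(scores), lambda a, b: a if scores[a] > scores[b] else b)
```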
Abstract: This paper demonstrates the results obtained when either the ShiftRows stage or the MixColumns stage, or both stages, are omitted from the well-known block cipher Advanced Encryption Standard (AES) and its modified version, AES with Key-Dependent S-box (AES-KDS), using the avalanche criterion and other tests, namely encryption quality, correlation coefficient, histogram analysis and key sensitivity.
Abstract: Software effort estimation is the process of predicting the most realistic amount of effort required to develop or maintain software based on incomplete, uncertain and/or noisy input. Effort estimates may be used as input to project plans, iteration plans and budgets. Various models, such as the Halstead, Walston-Felix, Bailey-Basili, Doty and GA-based models, have already been used to estimate software effort for projects. In this study, statistical models, a Fuzzy-GA hybrid and Neuro-Fuzzy (NF) inference systems are evaluated for estimating software project effort. The performance of the developed models was tested on NASA software project datasets, and the results were compared with the Halstead, Walston-Felix, Bailey-Basili, Doty and genetic-algorithm-based models from the literature. The NF model shows the best results, with the lowest MMRE and RMSE values, compared with the Fuzzy-GA hybrid inference system and the other existing models used for effort prediction.
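The two comparison criteria, MMRE and RMSE, are standard and can be computed as follows (the effort values below are toy numbers, not the NASA data):

```python
import math

def mmre(actual, predicted):
    """Mean Magnitude of Relative Error: mean of |a - p| / a."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root Mean Squared Error of the predictions."""
    return math.sqrt(sum((a - p) ** 2
                         for a, p in zip(actual, predicted)) / len(actual))

actual = [10.0, 20.0, 40.0]       # e.g. effort in person-months
predicted = [12.0, 18.0, 44.0]
m = mmre(actual, predicted)
r = rmse(actual, predicted)
```

Lower values on both metrics mean better estimates, which is the sense in which the NF model outperforms the others.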
Abstract: In this paper, we start by characterizing the most important and distinguishing features of wavelet-based watermarking schemes. We studied the overwhelming number of algorithms proposed in the literature. The copyright-protection application scenario is considered and, building on the experience gained, two distinctive watermarking schemes were implemented. A detailed comparison and the obtained results are presented and discussed. We conclude that Joo's [1] technique is more robust against standard noise attacks than Dote's [2] technique.
Abstract: The current paper conceptualizes the technique of release consistency, which is indispensable together with the concept of user-defined synchronization. A programming model built on objects and classes is illustrated and demonstrated. The essence of the paper is phases, events and the execution of parallel computing. The technique by which values on shared variables become visible is implemented. The second part of the paper consists of the implementation of user-defined high-level synchronization primitives and a system architecture with memory protocols. Techniques that are central to deciding on validating and invalidating a stale page are proposed.
Abstract: Cyber-physical systems (CPS) for target tracking, military surveillance, human health monitoring and vehicle detection all require maximizing utility while saving energy. Sensor selection is one of the most important parts of a CPS. The sensor selection problem (SSP) concentrates on balancing the tradeoff between the number of sensors used and the utility obtained. In this paper, we propose a performance-constrained slide window (PCSW) based algorithm for the SSP in CPS. We present the results of extensive simulations carried out to test and validate the PCSW algorithm when tracking a target. Experiments show that the PCSW-based algorithm improves performance, including the selection time and the number of communications required for selection.
Abstract: Principal Component Analysis (PCA) has many important applications, especially in pattern detection such as face detection and recognition. For real-time applications, therefore, the response time is required to be as small as possible. In this paper, a new implementation of PCA for fast face detection is presented. The new implementation is designed based on cross correlation in the frequency domain between the input image and the eigenvectors (weights). Simulation results show that the proposed implementation of PCA is faster than the conventional one.
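The core idea, computing the projection onto each eigenvector at every image position as a cross-correlation carried out in the frequency domain, can be illustrated in 1-D (a pure-Python sketch using a naive DFT; a real implementation would use a fast 2-D FFT library, and the signal/template values are invented):

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (O(n^2)); stands in for an FFT."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * f * k / n)
                for k in range(n)) for f in range(n)]

def idft(X):
    """Inverse DFT."""
    n = len(X)
    return [sum(X[f] * cmath.exp(2j * cmath.pi * f * k / n)
                for f in range(n)) / n for k in range(n)]

def cross_correlate(signal, template):
    """Circular cross-correlation via the frequency domain:
    corr = IDFT( DFT(signal) * conj(DFT(template)) ).
    One multiplication per frequency replaces a sliding dot product,
    which is what makes the frequency-domain projection fast."""
    S, T = dft(signal), dft(template)
    corr = idft([s * t.conjugate() for s, t in zip(S, T)])
    return [round(c.real, 6) for c in corr]

# The peak of the correlation marks where the template (eigenvector)
# best matches the signal (image row).
corr = cross_correlate([0.0, 1.0, 2.0, 1.0], [1.0, 2.0, 1.0, 0.0])
```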
Abstract: There are many situations where input feature vectors are incomplete, and methods to tackle the problem have been studied for a long time. A commonly used procedure is to replace each missing value with an imputation. This paper presents a method to perform categorical missing data imputation from numerical and categorical variables. The imputations are based on Simpson's fuzzy min-max neural networks, in which the input variables for learning and classification are numerical only. The proposed method extends the input to categorical variables by introducing new fuzzy sets, a new operation and a new architecture. The procedure is tested and compared with others using opinion-poll data.
Abstract: This paper describes the work we have accomplished in implementing a mobile payment mechanism that enables customers to pay bills for groceries and other purchased items in a store by means of a mobile phone, specifically a smartphone. For the mode of transaction, as far as communication between the customer's handset and the merchant's POS is concerned, we have decided upon NFC (Near Field Communication). This is because, for the most part, Pakistani smartphone users have handsets running the Android mobile OS, which supports the aforementioned platform, whereas iOS does not.
Abstract: The paper presents an approach for handling uncertain information in deductive databases using multivalued logics. Uncertainty means that database facts may be assigned logical values other than the conventional ones, true and false. The logical values represent various degrees of truth, which may be combined and propagated by applying the database rules. A corresponding multivalued database semantics is defined. We show that it extends successful conventional semantics such as the well-founded semantics, and has polynomial-time data complexity.
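The combination and propagation of truth degrees through database rules can be illustrated with the common min/max scheme (one possible choice of multivalued connectives, shown only as an example; the paper's concrete semantics may use different operators):

```python
def rule_value(body_values):
    """Truth degree of a rule body: conjunction taken as the minimum
    of the degrees of the body literals."""
    return min(body_values)

def fact_value(derivations):
    """Truth degree of a derived fact: disjunction over all of its
    derivations, taken as the maximum."""
    return max(derivations)

# Degrees in [0, 1]: a fact derived by two different rule instances,
# e.g. path(a, c) via edge(a, b) & path(b, c).
v1 = rule_value([0.9, 0.6])     # first derivation
v2 = rule_value([0.7, 0.7])     # second derivation
v = fact_value([v1, v2])
```

Iterating this propagation to a fixpoint over all rules is what yields the database semantics, and with a fixed set of logical values each fact's degree can only change a bounded number of times, consistent with polynomial-time data complexity.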
Abstract: RFID (Radio Frequency IDentification) systems have become widely used in our lives, for example in transport systems, passports, automotive applications, animal tracking, human implants, libraries, and so on. However, the RFID authentication protocols between RF (Radio Frequency) tags and RF readers have brought about various privacy problems, such as tag anonymity, tracking, eavesdropping, and so on. Many researchers have proposed solutions to these problems; however, the solutions still have problems of their own, such as location privacy and mutual authentication. In this paper, we show the problems of the previous protocols, and then propose a more secure and efficient RFID authentication protocol.
Abstract: Network management systems play an important role in information systems, and management is essential in any field. There are many kinds of management, such as configuration management, fault management, performance management, security management and accounting management. Among them, configuration, fault and security management are the most important, because they are essential and useful in any field. Configuration management monitors and maintains the whole system or LAN. Fault management detects and troubleshoots the system. Security management controls the whole system. This paper aims to extend network management functionality in configuration management, fault management and security management. In the configuration management system, this paper supports USB ports and devices, detecting and reading device configurations and detecting both hardware and software ports. In the security management system, this paper provides security features for user account settings, user management and a proxy server. All security history, such as the user account and proxy server history, is kept in a Java standard serializable file, so the user can view the security and proxy server history at any time. In the fault management system, the user can ping clients on the network and view the resulting messages; the system also checks the network card and can show the NIC card settings. The system uses RMI (Remote Method Invocation) and JNI (Java Native Interface) technology. The paper implements the client/server network management system using Java 2 Standard Edition (J2SE), and the system can support more than 10 clients. The paper also shows the data and message structure of the client/server and how it works over the TCP/IP protocol.
Abstract: In this paper I have developed a system for evaluating the degree of fear that an intelligent agent-based system may feel when it encounters a persecuting event. I describe the behaviour of emotional agents, modelled on human behaviour, in terms of the way their emotional states evolve over time. I have implemented a fuzzy inference system in a Java environment. As inputs to this system, I have considered three parameters related to human fear. The system outputs can be used in the agent's decision-making process, or for choosing a person for team-working systems by combining the intensity of fear with the intensities of other emotions.
Abstract: During the last years, the genomes of more and more species have been sequenced, providing data for phylogenetic reconstruction based on genome rearrangement measures. A main task in all phylogenetic reconstruction algorithms is to solve the median of three problem. Although this problem is NP-hard even for the simplest distance measures, there are exact algorithms for the breakpoint median and the reversal median that are fast enough for practical use. In this paper, this approach is extended to the transposition median as well as to the weighted reversal and transposition median. Although there is no exact polynomial algorithm known even for the pairwise distances, we will show that it is in most cases possible to solve these problems exactly within reasonable time by using a branch and bound algorithm.
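For the simplest measure, the breakpoint distance, both the pairwise distance and a brute-force median check fit in a few lines (a toy sketch on unsigned linear permutations with invented genomes; the paper's algorithms handle the harder measures and replace the exhaustive search with branch and bound):

```python
from itertools import permutations

def breakpoints(p, q):
    """Breakpoint distance: number of adjacencies of p that are not
    adjacencies of q (unsigned, linear genomes of equal length)."""
    adj_q = {frozenset(pair) for pair in zip(q, q[1:])}
    return sum(frozenset(pair) not in adj_q for pair in zip(p, p[1:]))

def median_score(m, genomes):
    """Total distance from a candidate median m to the three given
    genomes; the median of three problem asks for the m minimising
    this sum."""
    return sum(breakpoints(m, g) for g in genomes)

genomes = [(1, 2, 3, 4), (1, 3, 2, 4), (1, 2, 4, 3)]
# Exhaustive search over all permutations is exponential in the genome
# length, which is why a branch and bound algorithm is needed in practice.
best = min(permutations(range(1, 5)), key=lambda m: median_score(m, genomes))
score = median_score(best, genomes)
```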
Abstract: This paper focuses on testing the database of an existing information system. At the beginning we describe the basic problems of implemented databases, such as data redundancy, poor design of the database's logical structure, or inappropriate data types in the columns of database tables. These problems are often the result of an incorrect understanding of the primary requirements for the database of an information system. We then propose an algorithm to compare the conceptual model created from the vague requirements for a database with a conceptual model reconstructed from the implemented database. The algorithm also suggests steps leading to the optimization of the implemented database. The proposed algorithm is verified by an implemented prototype. The paper also describes a fuzzy system that works with the vague requirements for the database of an information system, a procedure for creating a conceptual model from vague requirements, and an algorithm for reconstructing a conceptual model from the implemented database.
Abstract: Specification-based testing enables us to detect errors in the implementation of functions defined in given specifications. Its effectiveness in achieving high path coverage and its efficiency in generating test cases are always major concerns of testers. The automatic test case generation approach based on formal specifications proposed by Liu and Nakajima is aimed at ensuring high effectiveness and efficiency, but this approach has not been empirically assessed. In this paper, we present an experiment for assessing Liu's testing approach. The result indicates that this testing approach may not be effective in some circumstances. We discuss the result, analyse the specific causes of the ineffectiveness, and describe some suggestions for improvement.
Abstract: The online office is a kind of web application. We can easily use the online office through a web browser on an Internet-connected PC. The online office has the advantage that it can be used regardless of location or time. When users want to use the online office, they access the online office server and use their content. However, recently developed and launched online offices have been designed with insufficient consideration of security. In this paper, we analyze the security vulnerabilities of the online office. In addition, we propose evaluation criteria, based on the Common Criteria, for building a secure online office. These evaluation criteria can be used to establish trust between the online office server and the user, and as a result the online office market will become more active than before.