Abstract: The purpose of this research is to develop a security model that protects voice conversations over digital networks from eavesdropping. The proposed model provides an encryption scheme and a personal secret key exchange between communicating parties, a so-called voice data transformation system, resulting in a truly private conversation. The system operates in two main steps. The first is the exchange of personal secret keys to be used for data encryption during conversation. Key owners are free to choose their keys, so it is recommended that a different key be exchanged with each conversational party, and that the key for each case be recorded in the memory provided in the client device. The second step is to set and record another personal encryption option: whether to encrypt all frames or only some of them, the so-called 1:M figure. Using different personal secret keys and different 1:M settings for different parties, without any intervention by the service operator, poses a substantial problem for any eavesdropper who attempts to discover the key used during a conversation, especially within a short period of time. The system therefore provides safe and effective protection against voice eavesdropping. Implementation results indicate that the system performs its functions accurately as designed. The proposed system is thus suitable for effective voice eavesdropping protection over digital networks without any need to change existing network systems, such as mobile phone networks and VoIP.
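The 1:M partial-frame option described above can be sketched in code. The following is a minimal illustration, assuming a simple XOR keystream derived from the shared personal secret key; the function name, key-derivation scheme, and frame format are illustrative stand-ins, not taken from the paper.

```python
import hashlib

def encrypt_frames(frames, key, m=1):
    """Encrypt every m-th voice frame with a keystream derived from the
    shared personal secret key (XOR here is a stand-in for the paper's
    unspecified cipher). m=1 encrypts all frames; m>1 is the 1:M option."""
    out = []
    for i, frame in enumerate(frames):
        if i % m == 0:  # 1:M policy: only one frame in every m is encrypted
            stream = hashlib.sha256(key + i.to_bytes(4, "big")).digest()
            out.append(bytes(b ^ s for b, s in zip(frame, stream)))
        else:
            out.append(frame)
    return out
```

Because the transformation is a symmetric XOR, the receiving party decrypts by applying the same function with the same key and the same 1:M setting.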
Abstract: The reliability of distributed systems and computer networks has been modeled by a probabilistic network or a graph G. Computing the residual connectedness reliability (RCR), denoted by R(G), under the node fault model is very useful but is an NP-hard problem. Since computing the exact value of R(G) may require time exponential in the network size, it is important to calculate a tight approximation, especially a lower bound, at moderate computational cost. In this paper, we propose an efficient algorithm for a reliability lower bound of distributed systems with unreliable nodes. We also apply the algorithm to several typical classes of networks to evaluate the lower bounds and show its effectiveness.
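For illustration only (the abstract does not specify the proposed bounding algorithm), the quantity R(G) itself can be estimated by Monte Carlo sampling of node failures: each node survives independently with probability p, and a trial succeeds when the surviving nodes induce a connected subgraph. All names below are hypothetical.

```python
import random

def is_connected(nodes, edges):
    """Depth-first connectivity check on the subgraph induced by `nodes`."""
    if not nodes:
        return False  # convention: an empty residual graph counts as failed
    adj = {v: set() for v in nodes}
    for u, v in edges:
        if u in adj and v in adj:
            adj[u].add(v)
            adj[v].add(u)
    start = next(iter(nodes))
    seen, stack = {start}, [start]
    while stack:
        u = stack.pop()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return seen == set(nodes)

def rcr_monte_carlo(n, edges, p, trials=2000, seed=0):
    """Estimate R(G): the probability that surviving nodes stay connected."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        alive = {v for v in range(n) if rng.random() < p}
        if is_connected(alive, edges):
            ok += 1
    return ok / trials
```

Such sampling gives an unbiased point estimate rather than the guaranteed lower bound the paper targets, which is why a deterministic bounding algorithm is of interest.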
Abstract: This paper presents a mathematical model of the electric and magnetic fields in a transmission system, expressed as a second-order partial differential equation. The research analyzes the electromagnetic field radiated into the atmosphere around the transmission line when line transposition is applied for long-distance distribution. Six types of transposed 500 kV double-circuit HV transmission lines are considered. The computer simulation applies the finite element method, implemented in MATLAB. The problem is treated in two dimensions as a time-harmonic system, with graphical presentation of the electric and magnetic fields. The simulations of the six transposition types for long-distance distribution show no change in the electric and magnetic fields surrounding the transmission line.
Abstract: In this paper, the complete rotor system is modeled, including an elastic shaft with distributed mass and allowing for the effects of the oil film in the bearings; the flexibility of the foundation is modeled as well. As a whole, this article is a relatively complete study of the modeling and vibration analysis of a rotor, considering the gyroscopic effect, damping, the frequency dependency of the stiffness and damping coefficients, and the solution of the vibration equations including these parameters. On the basis of the finite element method, a computer program is written in MATLAB using four element types: shaft, disk, bearing, and foundation elements. Responses are obtained for several cases considering different effects, and the results are compared with each other, with exact solutions, and with the results of other papers.
Abstract: Quality control charts are very effective in detecting out-of-control signals, but when a control chart signals an out-of-control condition of the process mean, searching for a special cause in the vicinity of the signal time does not always lead to prompt identification of the source(s) of the condition, because the change point in the process parameter(s) usually differs from the signal time. It is very important for the manufacturer to determine at what point in the past, and in which parameters, the change that caused the signal occurred. Early warning of process change would expedite the search for special causes and enhance quality at lower cost. In this paper, the quality variables under investigation are assumed to follow a multivariate normal distribution with known means and variance-covariance matrix; the process means after a one-step change are assumed to remain at the new level until the special cause is identified and removed; and only one variable is assumed to change at a time. This research applies an artificial neural network (ANN) to identify the time at which the change occurred and the parameter that caused the change or shift. The performance of the approach was assessed through a computer simulation experiment. The results show that the neural network performs effectively, and equally well, over the whole range of shift magnitudes considered.
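The paper itself trains an ANN for this task; as a classical point of reference, the change time of a one-step mean shift with known pre-change mean can also be estimated with a maximum-likelihood statistic, sketched here for the univariate case (the names are illustrative, and this is not the paper's method).

```python
import numpy as np

def mle_change_point(x, mu0):
    """Maximum-likelihood estimate of the step-change index for a sequence
    with known pre-change mean mu0 and constant variance: maximize the
    squared post-change deviation sum, normalized by segment length."""
    n = len(x)
    best_t, best_stat = n, -np.inf
    for t in range(1, n):  # candidate index where the new mean begins
        tail = x[t:] - mu0
        stat = tail.sum() ** 2 / (n - t)
        if stat > best_stat:
            best_t, best_stat = t, stat
    return best_t
```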
Abstract: In this paper, parallelism in the solution of Ordinary Differential Equations (ODEs) to increase computational speed is studied. The focus is the development of a parallel algorithm for the two-point Block Backward Differentiation Formulas (PBBDF) method that can take advantage of parallel computer architectures. Parallelism is obtained using the Message Passing Interface (MPI). Numerical results are given to validate the efficiency of the parallel PBBDF implementation as compared to the sequential implementation.
Abstract: Grid networks provide the ability to perform high-throughput computing by taking advantage of many networked computers' resources to solve large-scale computational problems. As the popularity of Grid networks has increased, there is a need to distribute the load efficiently among the resources accessible on the network. In this paper, we present a stochastic network system that provides a distributed load-balancing scheme by generating almost-regular networks. The system is self-organized and depends only on local information for load distribution and resource discovery. The in-degree of each node reflects its free resources, and the job assignment and resource discovery processes required for load balancing are accomplished by fitted random sampling. Simulation results show that the generated network system provides an effective, scalable, and reliable load-balancing scheme for the distributed resources accessible on Grid networks.
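The capacity-proportional sampling idea can be conveyed with a toy assignment loop in which each job is routed to a node drawn with probability proportional to its free capacity, each assignment consuming one unit. The function and its parameters are assumptions for illustration, not the paper's protocol.

```python
import random

def assign_jobs(free, jobs, seed=0):
    """Assign each job to a node sampled with probability proportional to
    its remaining free capacity, mimicking in-degree-proportional sampling
    for load balancing. Returns the placements and the residual capacities."""
    rng = random.Random(seed)
    free = list(free)  # work on a copy
    placement = []
    for _ in range(jobs):
        total = sum(free)
        if total == 0:
            break  # no free resources left anywhere
        r = rng.uniform(0, total)
        acc = 0.0
        for i, c in enumerate(free):
            acc += c
            if r <= acc:  # node i selected; nodes with c == 0 are skipped
                free[i] -= 1
                placement.append(i)
                break
    return placement, free
```

Nodes advertising zero capacity are never chosen, so load naturally flows toward under-utilized nodes, which is the qualitative behavior the abstract describes.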
Abstract: Automatic segmentation of skin lesions is the first step
towards development of a computer-aided diagnosis of melanoma.
Although numerous segmentation methods have been developed,
few studies have focused on determining the most discriminative
and effective color space for melanoma application. This paper
proposes a novel automatic segmentation algorithm using color space
analysis and clustering-based histogram thresholding, which is able to
determine the optimal color channel for segmentation of skin lesions.
To demonstrate the validity of the algorithm, it is tested on a set of 30
high resolution dermoscopy images and a comprehensive evaluation
of the results is provided, where borders manually drawn by four
dermatologists, are compared to automated borders detected by the
proposed algorithm. The evaluation is carried out by applying three
previously used metrics of accuracy, sensitivity, and specificity and
a new metric of similarity. Through ROC analysis and ranking of the metrics, it is shown that the best results are obtained with the X and XoYoR color channels, which yield an accuracy of approximately 97%. The proposed method is also compared with two state-of-the-art skin lesion segmentation methods, demonstrating the effectiveness and superiority of the proposed segmentation method.
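As a hedged illustration of clustering-based histogram thresholding on a single color channel, the sketch below uses Otsu's between-class-variance criterion, a standard two-cluster histogram method; the paper's exact thresholding rule and its X and XoYoR channels are not reproduced here.

```python
import numpy as np

def otsu_threshold(channel):
    """Pick the intensity threshold that maximizes the between-class
    variance of the histogram -- a classic two-cluster criterion."""
    hist, _ = np.histogram(channel, bins=256, range=(0, 256))
    p = hist.astype(float) / hist.sum()
    levels = np.arange(256)
    w0 = np.cumsum(p)            # probability mass of the lower class
    m = np.cumsum(p * levels)    # cumulative first moment
    mt = m[-1]                   # global mean intensity
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mt * w0 - m) ** 2 / (w0 * (1.0 - w0))
    sigma_b = np.nan_to_num(sigma_b)  # undefined at empty classes
    return int(np.argmax(sigma_b))

def segment(channel):
    """Binary lesion/background mask for one color channel."""
    return channel > otsu_threshold(channel)
```

In a full pipeline, the channel passed in would be the color component found most discriminative for the lesion, which is the channel-selection question the paper addresses.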
Abstract: This article proposes an Ant Colony Optimization (ACO) metaheuristic to minimize the total makespan when scheduling a set of jobs and assigning workers on uniformly related parallel machines. An algorithm based on ACO has been developed and implemented in MATLAB® to solve this problem. The paper explains the steps of applying the Ant Colony approach to minimizing the makespan for the worker assignment and job scheduling problem in a parallel machine model, and it aims to evaluate the strength of ACO compared to other conventional approaches. One data set containing 100 problems (12 jobs, 3 machines, and 10 workers), available on the internet, has been solved with this ACO algorithm. The ACO-based algorithm has shown drastically improved results, especially in terms of the negligible CPU effort required to reach the optimal solution. In our case, the time taken to solve all 100 problems is even less than the average time other conventional approaches, such as a GA algorithm and the SPT-A/LMC heuristics, take to solve one problem in the data set.
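A minimal ACO loop for the simpler setting of assigning jobs to identical parallel machines can convey the idea: ants build schedules guided by pheromone trails, and the best schedule reinforces its trails. The pheromone rule and parameters below are illustrative choices, not the paper's algorithm, which additionally assigns workers on uniformly related machines.

```python
import random

def aco_schedule(times, machines, ants=20, iters=50, rho=0.1, seed=0):
    """Minimal ant colony search assigning jobs to identical parallel
    machines to minimize makespan. tau[j][m] is the pheromone for placing
    job j on machine m; rho is the evaporation rate."""
    rng = random.Random(seed)
    n = len(times)
    tau = [[1.0] * machines for _ in range(n)]
    best_assign, best_make = None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            load = [0.0] * machines
            assign = []
            for j in range(n):
                # desirability: pheromone times a preference for light machines
                w = [tau[j][m] / (1.0 + load[m]) for m in range(machines)]
                r = rng.uniform(0, sum(w))
                acc, pick = 0.0, machines - 1
                for m, wm in enumerate(w):
                    acc += wm
                    if r <= acc:
                        pick = m
                        break
                load[pick] += times[j]
                assign.append(pick)
            make = max(load)
            if make < best_make:
                best_make, best_assign = make, assign
        # evaporate all trails, then reinforce the best schedule found so far
        for j in range(n):
            for m in range(machines):
                tau[j][m] *= 1.0 - rho
            tau[j][best_assign[j]] += 1.0 / best_make
    return best_assign, best_make
```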
Abstract: Ontology is a term used in artificial intelligence with different meanings. Research on ontologies plays an important role in computer science and in practical applications, especially distributed knowledge systems. In this paper we present an ontology called the Computational Object Knowledge Base Ontology. It has been used in designing several knowledge base systems for problem solving, such as a system that supports studying and solving analytic geometry problems, a program for studying and solving problems in plane geometry, and a knowledge system for linear algebra.
Abstract: Combined experimental and computational analysis of
hygrothermal performance of an interior thermal insulation system
applied on a brick wall is presented in the paper. In the experimental
part, the functionality of the insulation system is tested under
simulated differential climate conditions using a semi-scale device. The
measured temperature and relative humidity profiles are used for the
calibration of computer code HEMOT that is finally applied for a
long-term hygrothermal analysis of the investigated structure.
Abstract: Human Computer Interaction (HCI) is an emerging field that draws experts from various disciplines to enhance the design of computer programs and the ease of use for computer users. HCI has much to do with learning and cognition, and an emerging approach to learning and problem-solving is problem-based learning (PBL). The processes of PBL involve important cognitive functions at their various stages. This paper illustrates how the closely related fields of HCI, PBL, and cognitive psychology can benefit from informing one another through the analysis of various cognitive functions. Several cognitive functions from the cognitive function disc (CFD) are presented and discussed in relation to the human-computer interface. The paper concludes with the implications of bridging the gaps among these disciplines.
Abstract: The Unified Theory of Acceptance and Use of Technology (UTAUT) model has demonstrated the factors influencing the use of generic information systems such as tablet personal computers (TPC) and mobile communication. However, in the context of digital library systems, there has been very little effort to determine the factors affecting the intention to use a digital library based on the UTAUT model. This paper investigates factors expected to influence the intention of postgraduate students to use a digital library, based on a modified UTAUT model. The modified model comprises constructs represented by several latent variables, namely performance expectancy (PE), effort expectancy (EE), information quality (IQ) and service quality (SQ), moderated by age, gender and experience in using the digital library. Results show that performance expectancy, effort expectancy and information quality are positively related to the intention to use the digital library, while service quality is negatively related to it. Age and gender show no evidence of any significant interactions, while experience in using the digital library significantly interacts with effort expectancy and the intention to use the digital library. This provides evidence of a moderating effect of experience on the intention to use the digital library. It is expected that this research will shed new light on research into acceptance and intention to use libraries in a digital environment.
Abstract: The paper presents a novel idea for controlling computer mouse cursor movement with the human eyes. The paper describes how the working product helps people with special needs share their knowledge with the world. A number of traditional techniques for cursor control, such as head and eye movement tracking systems, rely on image processing, in which light is the primary source. Electro-oculography (EOG) is a new technology for sensing eye signals, with which the mouse cursor can be controlled. The signals captured by sensors are first amplified, then denoised and digitized, before being transferred to a PC for software interfacing.
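The described chain of amplification, noise removal, and digitization can be sketched as follows; the gain, smoothing window, and quantizer settings are illustrative assumptions, not the paper's hardware values.

```python
import numpy as np

def digitize_eog(raw, gain=1000.0, win=5, levels=256, vmax=5.0):
    """Toy version of the described signal chain: amplify the sensed EOG
    signal, smooth noise with a moving average, then quantize the result
    to integer codes for transfer to a PC."""
    amplified = raw * gain
    kernel = np.ones(win) / win
    smoothed = np.convolve(amplified, kernel, mode="same")
    clipped = np.clip(smoothed, -vmax, vmax)  # keep within the ADC range
    codes = np.round((clipped + vmax) / (2 * vmax) * (levels - 1)).astype(int)
    return codes
```

A real system would map the resulting codes to cursor displacement after calibration against known eye movements.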
Abstract: In this paper, a Selective Adaptive Parallel Interference Cancellation (SA-PIC) technique is presented for the Multicarrier Direct Sequence Code Division Multiple Access (MC DS-CDMA) scheme. The motivation for using SA-PIC is that it gives high performance while reducing the computational complexity required to perform interference cancellation. An upper-bound expression for the bit error rate (BER) of SA-PIC under Rayleigh fading channel conditions is derived. Moreover, the implementation complexities of SA-PIC and Adaptive Parallel Interference Cancellation (APIC) are discussed and compared. The performance of SA-PIC is investigated analytically and validated via computer simulations.
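A conventional (non-selective, non-adaptive) PIC stage for synchronous CDMA with unit amplitudes can be sketched as a baseline; the selective/adaptive weighting that defines SA-PIC, and the multicarrier structure, are omitted, so this illustrates only the cancellation principle.

```python
import numpy as np

def pic_detect(r, codes, stages=2):
    """Parallel interference cancellation for synchronous CDMA with unit
    amplitudes: each stage subtracts every other user's reconstructed
    signal before re-deciding each user's bit."""
    K, N = codes.shape
    b = np.sign(codes @ r)  # stage 0: bank of conventional matched filters
    for _ in range(stages):
        new_b = np.empty_like(b)
        for k in range(K):
            # reconstruct and remove the multiple-access interference seen by user k
            interference = sum(b[j] * codes[j] for j in range(K) if j != k)
            new_b[k] = np.sign(codes[k] @ (r - interference))
        b = new_b
    return b
```

SA-PIC refines this by cancelling only reliably detected users and adaptively weighting the subtraction, which is where its complexity savings come from.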
Abstract: Compensating physiological motion in the context
of minimally invasive cardiac surgery has become an attractive
issue since it outperforms traditional cardiac procedures offering
remarkable benefits. Owing to space restrictions, computer vision
techniques have proven to be the most practical and suitable solution.
However, the lack of robustness and efficiency of existing methods makes physiological motion compensation an open and challenging problem. This work focuses on increasing robustness and efficiency via exploration of the classes of ℓ1- and ℓ2-regularized optimization, emphasizing the use of explicit regularization. Both approaches are based on natural features of the heart and use intensity information. Results point to the ℓ1-regularized optimization class as the best, since it offered the lowest computational cost and the smallest average error, and it proved to work even under complex deformations.
Abstract: This paper is based on a study conducted in 2006 to assess the impact of computer usage on the health of National Institute for Medical Research (NIMR) staff. NIMR being a research institute, most of its staff spend a substantial part of their working time on computers. There was a notion among NIMR staff that prolonged computer usage might pose health hazards, so a study was conducted to establish the facts and possible mitigation measures. A total of 144 NIMR staff were involved in the study, of whom 63.2% were male and 36.8% female, aged between 20 and 59 years. All staff cadres were included in the sample. The functions performed by Institute staff using computers include data management, proposal development and report writing, research activities, secretarial duties, accounting and administrative duties, online information retrieval, and online communication through e-mail services. The interviewed staff had been using computers for 1-8 hours a day and for periods ranging from 1 to 20 years. The study indicated ergonomic hazards for a significant proportion of interviewees (63%), of various kinds ranging from backache to eyesight-related problems. The authors highlight major measures that are substantially applicable to preventing computer-related problems and urge NIMR Management and/or the Government of Tanzania to put them into practice.
Abstract: Computer programming is considered a very difficult
course by many computer science students. The reasons for the
difficulties include cognitive load involved in programming,
different learning styles of students, instructional methodology and
the choice of programming languages. To reduce these difficulties, approaches such as pair programming, program visualization and accommodation of different learning styles have been tried. However, these efforts have produced limited success. This paper reviews the problem and
proposes a framework to help students overcome the difficulties
involved.
Abstract: Rapid Application Development (RAD) addresses the ever-expanding need for speedy development of computer application programs that are sophisticated, reliable, and full-featured. Visual Basic was the first RAD tool for the Windows operating system, and many people still say it is the best. To make Visual Basic 6 applications more visually attractive, this paper proposes the use of VRML scenes within the Visual Basic environment.
Abstract: Geographic Information System (GIS) is a computer-based tool used extensively to solve various engineering problems related to spatial data. In spite of the growing popularity of GIS, its full potential for the construction industry has not been realized. In
this paper, the summary of up-to-date work on spatial applications of
GIS technologies in construction industry is presented. GIS
technologies have the potential to solve space related problems of
construction industry involving complex visualization, integration of
information, route planning, e-commerce, cost estimation, etc. A GIS-based methodology to handle the time and space issues of construction project scheduling is developed and discussed in this paper.