Abstract: In this paper, challenges associated with a new
generation of Computer Science students are examined. The mode of
education in tertiary institutes has progressed slowly while the needs
of students have changed rapidly in an increasingly technological
world. The major learning paradigms and learning theories within
these paradigms are studied to find a suitable strategy for educating
modern students. These paradigms include Behaviourism,
Constructivism, Humanism and Cognitivism. Social Learning theory
and Elaboration theory are two theories that are further examined and
a survey is done to determine how these strategies will be received by
students. The results and findings are evaluated and indicate that
students are fairly receptive to a method that incorporates both Social
Learning theory and Elaboration theory, but that some aspects of all
paradigms need to be implemented to create a balanced and effective
strategy with technology as its foundation.
Abstract: Researchers have been applying artificial/computational intelligence (AI/CI) methods to computer games. In this research field, further research is required to compare AI/CI methods with respect to each game application. In this paper, we report our experimental results on the comparison of an evolution strategy (ES), a genetic algorithm (GA) and their hybrids, applied to evolving controller agents for Mario AI. The GA revealed its advantage in our experiment, whereas the expected ability of ES to exploit (fine-tune) solutions was not clearly observed. The blend crossover operator and the mutation operator of the GA may contribute to exploring the vast search space.
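Since the abstract credits the blend crossover operator, a minimal sketch of that operator follows. This is a generic textbook BLX-α version, not the paper's implementation; all names and parameter values are assumptions.

```python
import random

def blx_alpha(parent1, parent2, alpha=0.5):
    """Blend crossover (BLX-alpha): each child gene is drawn uniformly
    from an interval that extends the parents' range by alpha * span
    on both sides, allowing exploration beyond the parent values."""
    child = []
    for x, y in zip(parent1, parent2):
        lo, hi = min(x, y), max(x, y)
        span = hi - lo
        child.append(random.uniform(lo - alpha * span, hi + alpha * span))
    return child

# Example: crossing two 3-gene controller parameter vectors
random.seed(0)
p1, p2 = [0.0, 1.0, 2.0], [1.0, 3.0, 2.0]
c = blx_alpha(p1, p2)
# each gene of c lies within the alpha-extended range of its parent genes
assert all(min(x, y) - 0.5 * abs(x - y) <= g <= max(x, y) + 0.5 * abs(x - y)
           for g, x, y in zip(c, p1, p2))
```

When both parents carry the same gene value, the interval collapses and the gene is inherited unchanged, which is one reason BLX-α pairs well with a separate mutation operator.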
Abstract: In this paper, a structural genetic algorithm is used to optimize a neural network that controls the joint movements of a robotic arm. The robotic arm has also been modeled in 3D and simulated in real time in MATLAB. It is found that neural networks provide a simple and effective way to control robot tasks. Computer simulation examples are given to illustrate the significance of this method. By combining the genetic algorithm optimization method and neural networks for the given robotic arm with 5 D.O.F., the obtained results show that the overshoot time of the base joint movements was about 0.5 seconds without a controller and about 0.2 seconds with the neural network controller (optimized with the genetic algorithm); a population size of 150 gave the best results.
Abstract: Many multimedia communication applications require a
source to transmit messages to multiple destinations subject to quality
of service (QoS) delay constraint. To support delay constrained
multicast communications, computer networks need to guarantee an
upper bound end-to-end delay from the source node to each of
the destination nodes. This is known as the multicast delay problem.
On the other hand, if the same message fails to arrive at every
destination node at the same time, inconsistency and unfairness
problems may arise among users. This is related to the multicast
delay-variation problem. The problem of finding a minimum-cost
multicast tree with delay and delay-variation constraints has been
proven to be NP-complete. In this paper, we propose an efficient
heuristic algorithm, namely the Economic Delay and Delay-Variation
Bounded Multicast (EDVBM) algorithm, based on a novel heuristic function,
to construct an economic delay and delay-variation bounded multicast
tree. A noteworthy feature of this algorithm is that it has very high
probability of finding the optimal solution in polynomial time with
low computational complexity.
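The two QoS constraints that define the problem above (an end-to-end delay bound and a delay-variation bound) can be stated concretely. The sketch below only checks a candidate tree's leaf delays against both bounds; it is an illustration of the constraints, not part of the EDVBM algorithm itself, and all names are assumptions.

```python
def satisfies_bounds(delays, delay_bound, variation_bound):
    """Check the two QoS constraints from the problem statement:
    every source-to-destination delay is within delay_bound, and the
    spread between the fastest and slowest destination is within
    variation_bound. `delays` maps destination -> end-to-end delay."""
    values = list(delays.values())
    return (max(values) <= delay_bound
            and max(values) - min(values) <= variation_bound)

# A candidate tree whose destination delays are 4, 5 and 6 time units
assert satisfies_bounds({"d1": 4, "d2": 5, "d3": 6},
                        delay_bound=6, variation_bound=2)
# The same tree violates a tighter delay-variation bound of 1
assert not satisfies_bounds({"d1": 4, "d2": 5, "d3": 6},
                            delay_bound=6, variation_bound=1)
```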
Abstract: The main goal of this seminal paper is to introduce the
application of Wireless Sensor Networks (WSN) in long distance
infrastructure monitoring (in particular in pipeline infrastructure
monitoring) – one of the on-going research projects by the Wireless
Communication Research Group at the department of Electronic and
Computer Engineering, Nnamdi Azikiwe University, Awka. The
current sensor network architectures for monitoring long distance
pipeline infrastructures are reviewed: wired sensor networks, RF
wireless sensor networks, and integrated wired and wireless sensor
networks. The reliability of these architectures is discussed.
Three reliability factors are used to compare the architectures in
terms of network connectivity, continuity of power supply for the
network, and the maintainability of the network. The constraints and
challenges of wireless sensor networks for monitoring and protecting
long distance pipeline infrastructure are discussed.
Abstract: The way music is interpreted by the human brain is a very interesting topic, but also an intricate one. Although this domain has been studied for over a century, many gray areas remain in the understanding of music. Recent advances have enabled us to perform accurate measurements of the time taken by the human brain to interpret and assimilate a sound. Cognitive computing provides tools and development environments that facilitate the simulation of human cognition. ACT-R is a cognitive architecture which offers an environment for implementing human cognitive tasks. This project combines our understanding of how a human listener interprets music with the ACT-R cognitive architecture to build SINGER, a computerized simulation for listening to and recalling songs. The results are similar to human experimental data. Simulation results also show that short melodies are easier to remember than long melodies, which require more trials to be recalled correctly.
Abstract: The Norwegian Military Academy (Army) has been
using a tactical simulator for the last two years. During this time
there has been some discussion concerning how to use the simulator
most efficiently and what type of learning one achieves by using the
simulator. The problem that is addressed in this paper is how
simulators can be used as a learning resource for students concerned
with developing their military profession. The aim of this article is to
create a wider consciousness regarding the use of a simulator while
educating officers in a military profession. The article discusses the
use of simulators from two different perspectives. The first
perspective deals with using the simulator as a computer game, and
the second perspective looks at the simulator as a socio-cultural
artefact. Furthermore, the article discusses four different ways the
simulator can be looked upon as a useful learning resource when
educating students of a military profession.
Abstract: Ambient Intelligence (AmI) environments bring
significant potential to exploit sophisticated computer technology in
everyday life. In particular, the educational domain could be
significantly enhanced through AmI, as personalized and adapted
learning could be transformed from paper concepts and prototypes to
real-life scenarios. In this paper, an integrated framework is
presented, named ClassMATE, supporting ubiquitous computing and
communication in a school classroom. The main objective of
ClassMATE is to enable pervasive interaction and context-aware
education in the technologically augmented classroom of the future.
Abstract: This paper discusses an approach to real-time
control of an energy management system using the data
acquisition tool of LabVIEW. The main idea was to interface
the station (PC) with the system and publish the data on the
Internet using LabVIEW. In this venture, the controlling and
switching of 3-phase AC loads are done effectively and
efficiently. The phases are also sensed through devices. In case
of any failure, the attached generator starts functioning
automatically. The computer sends commands to the system,
and the system responds to the requests. The modern feature is
the ability to access and control the system worldwide over the
World Wide Web (Internet). This control can be exercised at
any time from anywhere to use energy effectively, especially
in developing countries where energy management is a big
problem. In this system, fully integrated devices are operated
from a remote location.
Abstract: The loss-of-feedwater accident is one of the most frequent severe accidents in steam boiler facilities. It threatens the system's structural integrity and generates serious hazards and economic losses. The safety analysis of thermal installations relies extensively on numerical simulation. Simulation analysis using realistic computer codes like Relap5/Mod3.2 helps in understanding steam boiler thermal-hydraulic behavior under normal and abnormal conditions. In this study, we are interested in evaluating the radiant steam boiler's response to a loss-of-feedwater accident. Pressure, temperature and flow rate profiles are presented for various steam boiler system components. The obtained results demonstrate the importance and capability of the Relap5/Mod3.2 code in the thermal-hydraulic analysis of steam boiler facilities.
Abstract: This paper describes a newly designed decentralized
nonlinear control strategy for a robot manipulator. The strategy,
based on nonlinear state feedback theory and the decentralized
concept, is developed to improve on the drawbacks of previous
works concerned with complicated intelligent control and low-cost,
effective sensors. The control methodology is derived in the sense
of the Lyapunov theorem, so that the stability of the control system
is guaranteed. The decentralized algorithm does not require angle
and velocity information from the other joints. Each individual
joint controller is implemented on a digital processor placed near
its actuator, making it possible to achieve good dynamics and
modularity. Computer simulations have been conducted to validate
the effectiveness of the proposed control scheme
under the occurrence of possible uncertainties and different reference
trajectories. The merit of the proposed control system is indicated in
comparison with a classical control system.
Abstract: The Paced Auditory Serial Addition Test (PASAT) has
been used as a common research tool for different neurological
disorders such as Multiple Sclerosis. Recently, technology has let
researchers introduce a new version of the test, the paced visual
serial addition test (PVSAT). In this paper, a computerized version
of these two tests is introduced. Besides interpreting the number of
true responses, the software calculates the reaction times of the
subjects. We hypothesize that paying attention to the reaction time
may be valuable. For this purpose, sixty-eight female normal
subjects and fifty-eight male normal subjects were enrolled in the
study. We investigate the similarity between PASAT3 and
PVSAT3 in the number of true responses and in the new criterion
(the average reaction time of each subject). The similarity between
the two tests was rejected (p-value = 0.000), which means that the
two tests differ. No effect of sex was found, since the p-values of
the difference between PASAT3 and PVSAT3 are the same for
both sexes (p-value = 0.000), which means that male and female
subjects performed the tests at no different level of performance.
The new criterion shows a negative correlation with age, which
suggests that aged normal subjects may have the same number of
true responses as young subjects but give delayed responses. This
supports the importance of reaction time.
Abstract: Low power consumption is a major constraint for battery-powered systems such as notebook computers or PDAs. In the past, specialists usually designed both specifically optimized equipment and code to relieve this concern. This worked for quite a long time; in the present era, however, there is another significant constraint: time to market. To meet the power constraint while launching products in a shorter production period, object-oriented programming (OOP) has stepped into this field. Though everyone knows that OOP has much more overhead than assembly and procedural languages, the development trend still heads toward this new world, which contradicts the target of low power consumption. Most prior power-related software research reported that OOP consumed many resources; however, as industry had to accept it for business reasons, no papers until now have discussed how to choose the best OOP practice within this power-limited boundary. This article is a pioneering attempt to specify and propose an optimized strategy for writing OOP software in an energy-concerned environment, based on quantitative real results. The language chosen for the study is C# on the .NET Framework 2.0, one of the trendy OOP development environments. The recommendations obtained from this research form a roadmap that can help developers write code that balances time to market against time on battery.
Abstract: We describe a formal specification and verification of the Rabin public-key scheme in the formal proof system Isabelle/HOL. The idea is to use the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. The analysis presented uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness as well as security properties is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. This yields the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Consequently, we get reliable proofs with a minimal error rate augmenting the used database. This provides a formal basis for more computer proof constructions in this area.
Abstract: The history of money is described in relationship to the history of computing. With the transformation and acceptance of money as information, major challenges to the security of money have involved engineering, computer science, and management. Research opportunities and challenges are described as money continues its transformation into information.
Abstract: Global approximation using metamodel for complex
mathematical function or computer model over a large variable
domain is often needed in sensitivity analysis, computer simulation,
optimal control, and global design optimization of complex, multiphysics
systems. To overcome the limitations of the existing
response surface (RS), surrogate or metamodel modeling methods for
complex models over large variable domain, a new adaptive and
regressive RS modeling method using quadratic functions and local
area model improvement schemes is introduced. The method applies
an iterative and Latin hypercube sampling based RS update process,
divides the entire domain of design variables into multiple cells,
identifies rougher cells with large modeling error, and further divides
these cells along the roughest dimension direction. A small number
of additional sampling points from the original, expensive model are
added over the small and isolated rough cells to improve the RS
model locally until the model accuracy criteria are satisfied. The
method then combines local RS cells to regenerate the global RS
model with satisfactory accuracy. An effective RS cells sorting
algorithm is also introduced to improve the efficiency of model
evaluation. Benchmark tests are presented, and the use of the new
metamodeling method to replace a complex hybrid electric vehicle
powertrain performance model in vehicle design optimization and
optimal control is discussed.
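The cell-refinement idea described above can be illustrated in one dimension: fit a local quadratic on a cell, measure its error at check points, and split the cell where the error is large. The sketch below is a simplified 1-D analogue, not the paper's multi-dimensional, Latin-hypercube-based method; the test function, tolerance and depth cap are assumptions.

```python
import math

def quad_interp(x0, x1, x2, f):
    """Quadratic (Lagrange) interpolant through three samples of f,
    playing the role of the local quadratic response surface."""
    y0, y1, y2 = f(x0), f(x1), f(x2)
    def q(x):
        return (y0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
              + y1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
              + y2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))
    return q

def adaptive_cells(f, lo, hi, tol, depth=0):
    """Split [lo, hi] until the local quadratic model is within tol
    at two interior check points (a 1-D analogue of dividing rough
    cells until the model accuracy criteria are satisfied)."""
    mid = (lo + hi) / 2
    q = quad_interp(lo, mid, hi, f)
    checks = [lo + (hi - lo) * t for t in (0.25, 0.75)]
    if depth >= 10 or all(abs(q(x) - f(x)) <= tol for x in checks):
        return [(lo, hi)]
    return (adaptive_cells(f, lo, mid, tol, depth + 1)
            + adaptive_cells(f, mid, hi, tol, depth + 1))

cells = adaptive_cells(math.sin, 0.0, 2 * math.pi, tol=1e-3)
# rougher regions of sin(x) end up covered by smaller cells
```

A production version would fit regressive quadratics to scattered Latin-hypercube samples and split only along the roughest dimension, as the abstract describes.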
Abstract: Interactive installations for public spaces are a
particular kind of interactive systems, the design of which has been
the subject of several research studies. Sensor-based applications are
becoming increasingly popular, but the human-computer interaction
community is still far from reaching sound, effective large-scale
interactive installations for public spaces. The 6DSpaces project is
described in this paper as a research approach based on studying the
role of multisensory interactivity and how it can be effectively used
to draw people toward digital scientific content. The design of an
entire scientific exhibition is described, and the result was evaluated
in the real-world context of a Science Centre. The conclusions bring
insight into how human-computer interaction should be designed
in order to maximize the overall experience.
Abstract: The purposes of this paper are to (1) promote
excellence in computer science by suggesting a cohesive innovative
approach to fill well documented deficiencies in current computer
science education, (2) justify (using the authors' and others' anecdotal
evidence from both the classroom and the real world) why this
approach holds great potential to successfully eliminate the
deficiencies, (3) invite other professionals to join the authors in proof
of concept research. The authors' experiences, though anecdotal,
strongly suggest that a new approach involving visual modeling
technologies should allow computer science programs to retain a
greater percentage of prospective and declared majors as students
become more engaged learners, more successful problem-solvers,
and better prepared as programmers. In addition, the graduates of
such computer science programs will make greater contributions to
the profession as skilled problem-solvers. Instead of wearily
rememorizing code as they move to the next course, students will
have the problem-solving skills to think and work in more
sophisticated and creative ways.
Abstract: Least Developed Countries (LDCs) like
Bangladesh, which earns 25% of its revenue from textile
exports, need to produce less defective textile to minimize
production cost and time. The inspection processes in these
industries are mostly manual and time-consuming. Reducing
errors in identifying fabric defects requires a more automated
and accurate inspection process. Considering this shortcoming,
this research implements a Textile Defect Recognizer which
uses computer vision methodology in combination with
multi-layer neural networks to identify four classes of textile
defects. The recognizer, suitable for LDCs, identifies fabric
defects at an economical cost and provides a less error-prone
inspection system in real time. To generate the input set for the neural
network, the recognizer first captures digital fabric images with an
image acquisition device and converts the RGB images into binary
images through a restoration process and local threshold techniques.
Later, the outputs of the processed image (the area of the faulty
portion, the number of objects in the image and the sharp factor of
the image) are fed as the input layer to the neural network, which
uses the back-propagation algorithm to compute the weight factors
and generates the desired classifications of defects as output.
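Two of the input features named above, the area of the faulty portion and the number of objects, can be computed from a thresholded image with a connected-components pass. The sketch below is illustrative only; the global threshold, 4-neighborhood and toy image are assumptions, not the paper's code.

```python
def binarize(gray, threshold=128):
    """Global-threshold stand-in for the paper's restoration and local
    thresholding: 1 marks a (dark) defect pixel, 0 marks background."""
    return [[1 if p < threshold else 0 for p in row] for row in gray]

def features(binary):
    """Return (faulty area, number of objects): defect pixel count and
    the number of 4-connected defect components, via flood fill."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    area, objects = 0, 0
    for i in range(h):
        for j in range(w):
            if binary[i][j] and not seen[i][j]:
                objects += 1
                stack = [(i, j)]
                seen[i][j] = True
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return area, objects

img = [[200, 10, 200],
       [200, 10, 200],
       [ 10, 200, 10]]
# a two-pixel vertical defect plus two isolated pixels: area 4, 3 objects
assert features(binarize(img)) == (4, 3)
```

In the recognizer these scalars, together with the sharp factor, would form the input vector fed to the back-propagation network.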
Abstract: The vehicle fleets of public transportation companies are often equipped with intelligent on-board passenger information systems. A frequently used but time- and labor-intensive way of keeping the on-board controllers up to date is a manual update using different memory cards (e.g. flash cards) or portable computers. This paper describes a compression algorithm that enables data transmission over low-bandwidth wireless radio networks (e.g. GPRS) by minimizing the amount of data traffic. In typical cases it reaches a compression rate an order of magnitude better than that of general-purpose compressors. The compressed data can also be easily expanded by the low-performance controllers.