Abstract: In recent years there has been renewed interest in the
relation between Green IT and Cloud Computing. The growing use of
computers in cloud platforms has markedly increased energy
consumption, putting upward pressure on the electricity costs of
cloud data centers. This paper proposes an effective mechanism to
reduce energy utilization in cloud computing environments. We
present initial work on the integration of resource and power
management that aims at reducing power consumption. Our mechanism
relies on recalling virtualization services dynamically according to
users' virtualization requests and temporarily shutting down physical
machines after their tasks finish in order to conserve energy. Given
the estimated energy consumption, the proposed approach has the
potential to reduce power consumption significantly. The experimental
results show that energy can indeed be saved by powering off idling
physical machines in cloud platforms.
Abstract: The dynamics of User Datagram Protocol (UDP) traffic
over Ethernet between two computers are analyzed using nonlinear
dynamics, which shows that there are two clear regimes in the data
flow: free flow and saturated. The two most important variables
affecting this are the packet size and the packet flow rate. However,
this transition is due to a transcritical bifurcation rather than the
phase transitions found in models of vehicle traffic or theorized
large-scale computer network congestion. It is hoped this model will
help lay the groundwork for further research on the dynamics of
networks, especially computer networks.
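The abstract does not give the model equations, but the claimed mechanism can be illustrated with the standard transcritical normal form dx/dt = rx - x^2, whose fixed points x* = 0 and x* = r exchange stability as r crosses zero. In this purely hypothetical reading, r stands in for load (driven by packet size and flow rate) and x for a congestion measure; the sketch below is illustrative only, not the paper's model.

```python
# Transcritical bifurcation sketch: dx/dt = r*x - x**2.
# Fixed points x* = 0 and x* = r exchange stability at r = 0;
# illustrative only -- the abstract does not give its equations.

def simulate(r, x0=0.05, dt=0.01, steps=20000):
    """Integrate dx/dt = r*x - x*x with forward Euler; return the final state."""
    x = x0
    for _ in range(steps):
        x += dt * (r * x - x * x)
    return x

# Below the bifurcation (r < 0) the flow relaxes to x = 0 ("free flow");
# above it (r > 0) it settles on the nonzero branch x = r ("saturated").
for r in (-0.5, 0.5, 1.0):
    print(f"r = {r:+.1f}  ->  x_final = {simulate(r):.4f}")
```

The qualitative signature is the smooth exchange of stability between the two branches, as opposed to the discontinuous jump a first-order phase transition would produce.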
Abstract: The evaluation of the residual reliability of large-sized
parallel computer interconnection systems is not practicable with
existing methods. Under such conditions, one must resort to
approximation techniques that provide upper and lower bounds on this
reliability. In this context, a new approximation method for
providing bounds on residual reliability is proposed here. The
proposed method is supported by two algorithms for simulation
purposes. Bounds on the residual reliability of three different
categories of interconnection topologies are efficiently found using
the proposed method.
Abstract: The purposes of this paper are to (1) promote excellence in computer science by suggesting a cohesive, innovative approach to fill well-documented deficiencies in current computer science education, (2) justify (using the authors' and others' anecdotal evidence from both the classroom and the real world) why this approach holds great potential to successfully eliminate the deficiencies, and (3) invite other professionals to join the authors in proof-of-concept research. The authors' experiences, though anecdotal, strongly suggest that a new approach involving visual modeling technologies should allow computer science programs to retain a greater percentage of prospective and declared majors as students become more engaged learners, more successful problem-solvers, and better-prepared programmers. In addition, the graduates of such computer science programs will make greater contributions to the profession as skilled problem-solvers. Instead of wearily re-memorizing code as they move to the next course, students will have the problem-solving skills to think and work in more sophisticated and creative ways.
Abstract: This paper proposes a "soft systems" approach to
domain-driven design of computer-based information systems. We
propose a systemic framework combining techniques from Soft
Systems Methodology (SSM), the Unified Modelling Language
(UML), and an implementation pattern known as "Naked Objects".
We have used this framework in action research projects that have
involved the investigation and modelling of business processes using
object-oriented domain models and the implementation of software
systems based on those domain models. Within the proposed
framework, Soft Systems Methodology (SSM) is used as a guiding
methodology to explore the problem situation and to generate a
ubiquitous language (soft language) which can be used as the basis
for developing an object-oriented domain model. The domain model
is further developed using techniques based on the UML and is
implemented in software following the "Naked Objects"
implementation pattern. We argue that there are advantages to
combining and using techniques from different methodologies in this
way.
The proposed systemic framework is overviewed and justified as a
multimethodology using Mingers' multimethodology ideas.
This multimethodology approach is being evaluated through a
series of action research projects based on real-world case studies. A
peer-tutoring case study is presented here as a sample of the
framework evaluation process.
Abstract: With the development of the Internet, E-commerce is
growing at an exponential rate, and many online stores have been
built to sell goods online. A major factor influencing the successful
adoption of E-commerce is consumers' trust. For new or unknown
Internet businesses, consumers' lack of trust has been cited as a major
barrier to proliferation. As websites provide the key interface for
consumer use of E-commerce, we investigate the design of websites to
build trust in E-commerce from a design science approach. A
conceptual model is proposed in this paper to describe the ontology of
online transactions and human-computer interaction. Based on this
conceptual model, we provide a personalized webpage design
approach using a Bayesian network learning method. Experimental
evaluations are designed to show the effectiveness of web
personalization in improving consumers' trust in new or unknown
online stores.
Abstract: Internet computer games are becoming more and more
attractive within the context of technology-enhanced learning.
Educational games such as quizzes and quests have gained significant
success in appealing to and motivating learners to study in a different
way, and they provoke steadily increasing interest in new methods of
application. Board games are a specific group of games in which
figures are manipulated in a competitive play mode, with race
conditions, on a surface according to predefined rules. This article
presents a new, formalized model of traditional quizzes, puzzles and
quests, expressed as multimedia board games, which facilitates the
construction process of such games. The authors provide different
examples of quizzes and their models in order to demonstrate that the
model is quite general and supports not only quizzes, mazes and
quests but also any set of teaching activities. The execution process
of such models is explained, as well as how they can be useful for the
creation and delivery of adaptive e-learning courseware.
Abstract: In image processing and visualization, comparing two
bitmapped images requires matching them pixel by pixel.
Consequently, it takes a lot of computational time, while the
comparison of two vector-based images is significantly faster.
Sometimes these raster graphics images can be approximately
converted into vector-based images by various techniques. After
conversion, the problem of comparing two raster graphics images
can be reduced to the problem of comparing vector graphics images.
Hence, the problem of comparing pixel by pixel can be reduced to
the problem of polynomial comparisons. In computer-aided geometric
design (CAGD), vector graphics images are compositions of
curves and surfaces. Curves are defined by a sequence of control
points and their polynomials. In this paper, the control points are
used to compare curves. Curves that have only been translated or
rotated are treated as equivalent, while two curves that differ by
scaling are considered similar. This paper proposes an algorithm for
comparing polynomial curves for equivalence and similarity by using
their control points. In addition, the geometric object-oriented
database used to keep the curve information has also been defined in
XML format for further use in curve comparisons.
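One simple (hypothetical) way to realize such equivalence and similarity tests is to compare the sorted multisets of pairwise distances between control points: these are invariant under translation and rotation, and scale uniformly under scaling. Matching distance multisets is a necessary condition rather than a full congruence proof, so this is only a sketch of the idea, not the paper's actual algorithm; all function names are illustrative.

```python
import math

def pairwise_distances(pts):
    """Sorted distances between all pairs of control points; this multiset
    is invariant under translation and rotation of the control polygon."""
    return sorted(
        math.dist(p, q)
        for i, p in enumerate(pts)
        for q in pts[i + 1:]
    )

def equivalent(a, b, tol=1e-6):
    """Same control polygon up to translation/rotation (distances match)."""
    da, db = pairwise_distances(a), pairwise_distances(b)
    return len(da) == len(db) and all(
        abs(x - y) <= tol for x, y in zip(da, db))

def similar(a, b, tol=1e-6):
    """Same control polygon up to uniform scaling: distance ratios agree."""
    da, db = pairwise_distances(a), pairwise_distances(b)
    if len(da) != len(db) or not db or db[-1] == 0:
        return False
    s = da[-1] / db[-1]          # candidate scale factor
    return all(abs(x - s * y) <= tol for x, y in zip(da, db))

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
moved = [(x + 3, y + 2) for x, y in square]   # translated copy
big = [(2 * x, 2 * y) for x, y in square]     # scaled by 2
print(equivalent(square, moved))   # True
print(similar(square, big))        # True
print(equivalent(square, big))     # False
```

A practical implementation would also need to handle control polygons listed in reversed order and reparameterized curves, which this sketch ignores.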
Abstract: Detection and tracking of the lip contour is an important
issue in speechreading. While there are solutions for lip tracking
once a good contour initialization in the first frame is available,
the problem of finding such a good initialization is not yet solved
automatically, but is done manually. We have developed a new
tracking solution for lip contour detection using only a few landmarks
(15 to 25) and applying the well-known Active Shape Models (ASM).
The proposed method is a new LMS-like adaptive scheme based on
an autoregressive (AR) model that has been fitted to the landmark
variations in successive video frames. Moreover, we propose an extra
motion compensation model to address more general cases in lip
tracking. Computer simulations demonstrate a fair match between
the true and the estimated spatial pixels. Significant improvements
over the well-known LMS approach have been obtained via a
defined Frobenius norm index.
Abstract: In today's era of plasma and laser cutting, machines using an oxy-acetylene flame are still meritorious due to their simplicity and cost effectiveness. The objective of devising a computer-controlled oxy-fuel profile cutting machine arose from the increasing demand for metal cutting with respect to edge quality, circularity and less formation of redeposited material. The system has an 8-bit microcontroller-based embedded system, which assures a stipulated time response. New Windows-based application software was devised which takes a standard CAD (.DXF) file as input and converts it into the numerical data required for the controller. It uses VB6 as a front end and MS-ACCESS and AutoCAD as back ends. The system is designed around the AT89C51RD2, a powerful 8-bit ISP microcontroller from Atmel, and is optimized to achieve cost effectiveness while maintaining the required accuracy and reliability for complex shapes. The backbone of the system is a cleverly designed mechanical assembly along with the embedded system, resulting in an accuracy of about 10 microns while maintaining perfect linearity in the cut. This results in a substantial increase in productivity. The observed results also indicate reduced interlaminar spacing of pearlite with an increase in the hardness of the edge region.
Abstract: Prime factorization based on a quantum approach is
performed in two phases. The first phase is carried out on a quantum
computer and the second phase on a classical computer
(post-processing). In the second phase the goal is to estimate the
period r of the equation x^r ≡ 1 (mod N) and to find the prime
factors of the composite integer N on the classical computer. In this
paper we present a method based on a randomized approach for
estimating the period r with a satisfactory probability, so that the
composite integer N can be factorized; with the randomized approach,
even when the guess of the period is not exactly the real period, at
least one of the prime factors of the composite N can be found.
Finally we present some important points for designing an emulator
for quantum computer simulation.
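For context, the classical post-processing step described here is, in Shor's algorithm, the standard one: given an even period r of x^r ≡ 1 (mod N) with x^(r/2) ≢ -1 (mod N), gcd(x^(r/2) ± 1, N) yields a nontrivial factor. A minimal sketch follows, finding the period by brute force purely for illustration (the quantum phase, or the paper's randomized estimate, would supply r for large N):

```python
from math import gcd

def find_period(x, N):
    """Brute-force the order r of x modulo N. For large N this is what
    the quantum phase would supply (exactly or as an estimate)."""
    r, y = 1, x % N
    while y != 1:
        y = (y * x) % N
        r += 1
    return r

def factor_from_period(x, r, N):
    """Classical post-processing: if r is even and x**(r/2) != -1 (mod N),
    then gcd(x**(r//2) +/- 1, N) yields a nontrivial factor of N."""
    if r % 2:
        return None                  # odd period: retry with another x
    half = pow(x, r // 2, N)
    if half == N - 1:                # x**(r/2) == -1 (mod N): retry
        return None
    for cand in (gcd(half - 1, N), gcd(half + 1, N)):
        if 1 < cand < N:
            return cand
    return None

N, x = 15, 7
r = find_period(x, N)                # 7**4 = 2401 ≡ 1 (mod 15), so r = 4
print(r, factor_from_period(x, r, N))   # prints: 4 3
```

When an estimated period is only a multiple or near-multiple of the true period, the gcd step can still succeed, which is consistent with the abstract's observation that an inexact guess may yield a factor.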
Abstract: Scheduling of diversified service requests in distributed
computing is a critical design issue. A cloud is a type of parallel and
distributed system consisting of a collection of interconnected and
virtualized computers. It comprises not only clusters and grids but
also next-generation data centers. This paper proposes an initial
heuristic algorithm that applies a modified ant colony optimization
approach to the diversified service allocation and scheduling
mechanism in the cloud paradigm. The proposed optimization method
aims to minimize the total time needed to service all the diversified
requests according to the different resource allocators available in
the cloud computing environment.
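The abstract names the technique without giving details; a toy ant colony optimization for assigning service requests to machines so as to minimize the makespan might look like the following. The desirability heuristic and all parameter values are illustrative assumptions, not the paper's algorithm.

```python
import random

def aco_schedule(task_costs, n_machines, ants=20, iters=50,
                 evap=0.5, seed=1):
    """Toy ant colony optimization: assign each task to a machine so the
    makespan (max machine load) is small. Parameters are illustrative."""
    rng = random.Random(seed)
    n = len(task_costs)
    tau = [[1.0] * n_machines for _ in range(n)]   # pheromone trails
    best, best_span = None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            loads = [0.0] * n_machines
            assign = []
            for t in range(n):
                # desirability: pheromone divided by the load the choice
                # would produce, biasing ants toward lighter machines
                w = [tau[t][m] / (1.0 + loads[m] + task_costs[t])
                     for m in range(n_machines)]
                m = rng.choices(range(n_machines), weights=w)[0]
                assign.append(m)
                loads[m] += task_costs[t]
            span = max(loads)
            if span < best_span:
                best, best_span = assign, span
        # evaporate all trails, then reinforce the best-so-far schedule
        for t in range(n):
            for m in range(n_machines):
                tau[t][m] *= (1.0 - evap)
        for t, m in enumerate(best):
            tau[t][m] += 1.0 / best_span
    return best, best_span

tasks = [4, 3, 3, 2, 2, 2]        # service times of diversified requests
assign, span = aco_schedule(tasks, n_machines=2)
print(assign, span)               # the optimal makespan here is 8
```

A real cloud scheduler would replace the cost list with measured VM capacities and request profiles from the resource allocator; the pheromone update structure stays the same.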
Abstract: Decreases in hardware costs and advances in computer
networking technologies have led to increased interest in the use of
large-scale parallel and distributed computing systems. One of the
biggest issues in such systems is the development of effective
techniques/algorithms for distributing the processes/load of a
parallel program over multiple hosts to achieve goals such as
minimizing execution time, minimizing communication delays,
maximizing resource utilization and maximizing throughput.
Substantive research using queuing analysis, and assuming job
arrivals follow a Poisson pattern, has shown that in a multi-host
system the probability of one host being idle while another host
has multiple jobs queued up can be very high. Such imbalances in
system load suggest that performance can be improved by either
transferring jobs from the currently heavily loaded hosts to the
lightly loaded ones or distributing the load evenly/fairly among the
hosts. The algorithms that achieve this, known as load-balancing
algorithms, fall into two basic categories: static and dynamic.
Whereas static load balancing (SLB) algorithms make
task-to-processor assignment decisions at compile time, based on
average estimated values of process execution times and
communication delays, dynamic load balancing (DLB) algorithms are
adaptive to changing situations and make decisions at run time.
The objective of this paper is to identify qualitative parameters for
comparing these algorithms. In the future this work can be extended
to develop an experimental environment to study these load-balancing
algorithms quantitatively, based on the comparative parameters.
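The static/dynamic distinction can be illustrated with a toy dispatcher: a static policy fixes assignments in advance (here, round-robin), while a dynamic policy consults the current load at run time (here, least-loaded). Both functions are hypothetical illustrations, not from the paper.

```python
def static_round_robin(jobs, n_hosts):
    """Static: the assignment is decided at 'compile time' purely from
    the job's position, ignoring how loaded each host actually is."""
    loads = [0.0] * n_hosts
    for i, cost in enumerate(jobs):
        loads[i % n_hosts] += cost
    return loads

def dynamic_least_loaded(jobs, n_hosts):
    """Dynamic: each arriving job is sent to the currently lightest host,
    adapting to the load situation at run time."""
    loads = [0.0] * n_hosts
    for cost in jobs:
        m = loads.index(min(loads))
        loads[m] += cost
    return loads

jobs = [9, 1, 1, 1, 9, 1, 1, 1]          # bursty, uneven service times
print(static_round_robin(jobs, 2))       # [20.0, 4.0] -- badly imbalanced
print(dynamic_least_loaded(jobs, 2))     # [12.0, 12.0] -- evenly spread
```

With uniform job costs the two policies coincide; it is exactly the variance in execution times (and the Poisson-style bursty arrivals mentioned above) that makes the run-time decision pay off.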
Abstract: Finding minimal forms of logical functions has important applications in the design of logical circuits. This task is solved by many different methods but, frequently, they are not suitable for a computer implementation. We briefly summarise the well-known Quine-McCluskey method, which gives a unique computational procedure and thus can be simply implemented but, even for simple examples, does not guarantee an optimal solution. Since the Petrick extension of the Quine-McCluskey method does not give a generally usable method for finding an optimum for logical functions with a high number of values, we focus on the interpretation of the result of the Quine-McCluskey method and show that it represents a set covering problem which, unfortunately, is an NP-hard combinatorial problem. Therefore it must be solved by heuristic or approximation methods. We propose an approach based on genetic algorithms and show suitable parameter settings.
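As a sketch of the proposed direction (not the paper's actual implementation), the prime-implicant chart produced by the Quine-McCluskey method can be encoded as a set covering instance and attacked with a simple genetic algorithm; the example chart and all parameter settings below are hypothetical.

```python
import random

def ga_set_cover(universe, subsets, pop=30, gens=60, seed=0):
    """Toy genetic algorithm for the set covering problem that the
    Quine-McCluskey prime-implicant chart reduces to. A chromosome has
    one bit per subset (prime implicant); the cost penalizes uncovered
    minterms heavily and otherwise prefers fewer implicants."""
    rng = random.Random(seed)
    n = len(subsets)

    def cost(chrom):
        covered = set()
        for i, bit in enumerate(chrom):
            if bit:
                covered |= subsets[i]
        return sum(chrom) + 1000 * len(universe - covered)

    popn = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=cost)
        elite = popn[:pop // 2]                    # keep the better half
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)              # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n)] ^= 1           # point mutation
            children.append(child)
        popn = elite + children
    best = min(popn, key=cost)
    return [i for i, bit in enumerate(best) if bit]

# A hypothetical prime-implicant chart: each set lists the minterms
# that one prime implicant covers (not an example from the paper).
minterms = {0, 1, 2, 5, 6, 7}
implicants = [{0, 1}, {0, 2}, {1, 5}, {5, 7}, {6, 7}, {2, 6}]
chosen = ga_set_cover(minterms, implicants)
print(chosen)
```

The heavy uncovered-minterm penalty makes every feasible cover cheaper than every infeasible chromosome, so selection first drives the population to full covers and then shrinks them.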
Abstract: In our modern world, more and more physical transactions are being replaced by electronic transactions (i.e. banking, shopping, and payments), and many businesses and companies perform most of their operations through the Internet. Instead of engaging in physical commerce, Internet visitors are now adopting electronic commerce (e-commerce). The ability of web users to reach products worldwide can be greatly enhanced by creating friendly and personalized online business portals. Internet visitors will return to a particular website when they can easily find the information they need or want. Dealing with this human conceptualization brings the incorporation of artificial/computational intelligence techniques into the creation of customized portals. Among these techniques, fuzzy-set technologies can make many useful contributions to the development of such a human-centered endeavor as e-commerce. The main objective of this paper is the implementation of a paradigm for the intelligent design and operation of human-computer interfaces. In particular, the paradigm is quite appropriate for the intelligent design and operation of software modules that display information (such as Web pages, graphical user interfaces (GUIs), and multimedia modules) on a computer screen. The human conceptualization of the user's personal information is analyzed through a cascaded fuzzy inference (decision-making) system to generate the User Ascribe Qualities, which identify the user and can be used to customize portals with proper Web links.
Abstract: Diagnosis can be achieved by building a model of a
certain organ under surveillance and comparing it with the real-time
physiological measurements taken from the patient. This paper
presents the benefits of using Data Mining techniques in
computer-aided diagnosis (CAD), focusing on cancer detection, in
order to help doctors make optimal decisions quickly and accurately.
In the field of noninvasive diagnosis techniques, endoscopic
ultrasound elastography (EUSE) is a recent elasticity imaging
technique that allows characterizing the difference between malignant
and benign tumors. Digitizing and summarizing the main features of
the EUSE sample movies in vector form involves the use of
exploratory data analysis (EDA). Neural networks are then trained on
the corresponding EUSE sample movie vector inputs in such a way
that these intelligent systems are able to offer a very precise and
objective diagnosis, discriminating between benign and malignant
tumors. A concrete application of these Data Mining techniques
illustrates the suitability and the reliability of this methodology in
CAD.
Abstract: Over the past decade, mobile technology has experienced a
revolution that will ultimately change the way we communicate. All
these technologies have a common denominator: the exploitation of
computer information systems. Their operation can, however, be
tedious because of problems with heterogeneous data sources. To
overcome these problems, we propose a technique that adds an extra
layer interfacing management or supervision applications with the
different data sources. This layer is materialized by the
implementation of a mediator between the different host applications
and the information systems, which are frequently organized in a
hierarchical or relational manner, so that the heterogeneity is
completely transparent to the VoIP platform.
Abstract: This paper considers the problem of null-steering beamforming using a Neural Network (NN) approach for an antenna array system. Two cases are presented. First, unlike in other work, the estimated Directions Of Arrival (DOAs) are used for NN-based determination of the antenna array weights, and the imprecision of the DOA estimates is taken into account. Second, blind null-steering beamforming is presented. In this case the antenna array outputs are presented at the input of the NN without DOA estimation. The results of computer simulations show much better relative mean error performance for the first NN approach compared to the NN-based blind beamforming.
Abstract: Sorting has received the most attention among all computational tasks over the past years because sorted data are at the heart of many computations. Sorting is of additional importance to parallel computing because of its close relation to the task of routing data among processes, which is an essential part of many parallel algorithms. Many parallel sorting algorithms have been investigated for a variety of parallel computer architectures. In this paper, three parallel sorting algorithms have been implemented and compared in terms of their overall execution time. The algorithms implemented are odd-even transposition sort, parallel merge sort and parallel rank sort. A cluster of workstations running Windows Compute Cluster Server has been used to compare the implemented algorithms. The C# programming language is used to develop the sorting algorithms, and the MPI (Message Passing Interface) library has been selected to establish the communication and synchronization between processors. The time complexity of each parallel sorting algorithm is also given and analyzed.
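Of the three algorithms, odd-even transposition sort is the easiest to sketch. The following sequential simulation (in Python rather than the paper's C#/MPI setting) runs the n compare-exchange phases that, in the parallel version, become neighbor exchanges between processes:

```python
def odd_even_transposition_sort(a):
    """Sequential simulation of odd-even transposition sort: n phases,
    alternating compare-exchange on (even, odd) and (odd, even) index
    pairs. In the MPI version each pair becomes an exchange between
    neighboring ranks, each holding one element (or a sorted block)."""
    a = list(a)
    n = len(a)
    for phase in range(n):
        start = 0 if phase % 2 == 0 else 1
        for i in range(start, n - 1, 2):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]   # compare-exchange
    return a

print(odd_even_transposition_sort([5, 1, 4, 2, 8, 0, 3]))
# -> [0, 1, 2, 3, 4, 5, 8]
```

With one element per process the algorithm is guaranteed to sort in n phases, and within each phase all compare-exchanges are independent, which is what makes it attractive on a cluster despite its O(n^2) total work.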
Abstract: The main mission of Ezilla is to provide a friendly
interface for accessing virtual machines and quickly deploying a
high-performance computing environment. Ezilla has been developed
by the Pervasive Computing Team at the National Center for
High-performance Computing (NCHC). Ezilla integrates Cloud
middleware, virtualization technology, and a Web-based Operating
System (WebOS) to form a virtual computer in a distributed
computing environment. In order to scale the dataset and improve
speed, we propose a sensor observation system to deal with a huge
amount of data in the Cassandra database. The sensor observation
system is based on Ezilla and stores raw sensor data in the
distributed database. We adopt the Ezilla Cloud service to create
virtual machines and log in to them to deploy the sensor observation
system. Integrating the sensor observation system with Ezilla makes
it possible to quickly deploy the experiment environment and access
a huge amount of data in a distributed database whose replication
mechanism protects data security.