Abstract: With the development of ubiquitous computing,
current user interaction approaches with keyboard, mouse, and pen
are no longer sufficient; the limitations of these devices also limit
the usable command set. Direct use of the hands as an input device is
an attractive method for providing natural Human Computer
Interaction which has evolved from text-based interfaces through 2D
graphical-based interfaces, multimedia-supported interfaces, to fully
fledged multi-participant Virtual Environment (VE) systems.
Imagine the human-computer interaction of the future: a 3D application
where you can move and rotate objects simply by moving
and rotating your hand, all without touching any input device. In this
paper a review of vision-based hand gesture recognition is presented.
The existing approaches are categorized into 3D model-based
approaches and appearance-based approaches, highlighting their
advantages and shortcomings and identifying the open issues.
Abstract: The study of proteomics has reached unexpected levels of
interest as a direct consequence of its discovered influence over some
complex biological phenomena, such as problematic diseases like
cancer. This paper presents the authors' latest achievements regarding
the analysis of the networks of proteins (interactome networks), by
computing more efficiently the betweenness centrality measure. The
paper introduces the concept of betweenness centrality, and then
describes how betweenness computation can help interactome
network analysis. Current sequential implementations of betweenness
computation do not perform satisfactorily in terms of execution
times. The paper's main contribution is the introduction of
a speedup technique for the betweenness computation, based on
modified shortest path algorithms for sparse graphs. Three optimized
generic algorithms for betweenness computation are described and
implemented, and their performance tested against real biological
data, which is part of the IntAct dataset.
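The paper's specific speedup technique is not detailed in the abstract; as a point of reference, the standard baseline that such optimizations build on is Brandes' algorithm. A minimal sketch for sparse, unweighted graphs (the adjacency-dict representation and function name are assumptions for illustration, not the paper's implementation) might look like:

```python
from collections import deque

def betweenness(adj):
    """Brandes-style betweenness centrality for an unweighted graph
    given as an adjacency dict: node -> list of neighbors."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        # Phase 1: BFS from s, counting shortest paths (sigma) and predecessors.
        stack, pred = [], {v: [] for v in adj}
        sigma = {v: 0 for v in adj}; sigma[s] = 1
        dist = {v: -1 for v in adj}; dist[s] = 0
        q = deque([s])
        while q:
            v = q.popleft(); stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        # Phase 2: back-propagate dependencies in reverse BFS order.
        delta = {v: 0.0 for v in adj}
        while stack:
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc
```

For an undirected graph each pair is counted from both endpoints, so scores may be halved for normalization; the modified shortest-path algorithms described in the paper replace the BFS phase for their sparse-graph speedup.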
Abstract: The goal of Gene Expression Analysis is to understand the processes that underlie the regulatory networks and pathways controlling inter-cellular and intra-cellular activities. In recent times, microarray datasets have been used extensively for this purpose, and the scope of such analysis has broadened towards the reconstruction of gene networks and other holistic approaches of Systems Biology. Evolutionary methods are proving to be successful in such problems, and a number of them have been proposed. However, all these methods are based on the processing of genotypic information. There is therefore a need to develop evolutionary methods that address phenotypic interactions together with genotypic interactions. We present a novel evolutionary approach, called the Phenomic algorithm, whose focus is on phenotypic interaction. We use the expression profiles of genes to model the interactions between them at the phenotypic level. We apply this algorithm to the yeast sporulation dataset and show that it can identify gene networks with relative ease.
Abstract: Fine-grained data replication over the Internet allows the duplication of frequently accessed data objects, as opposed to entire sites, at certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit, since they are driven by different goals such as minimizing their communication costs, latency, etc. In this paper, we use game-theoretical techniques, and in particular auctions, to identify a bidding mechanism that encapsulates the selfishness of the agents while maintaining a controlling hand over them. In essence, the proposed game-theory-based mechanism is the study of what happens when independent agents act selfishly and how to control them to maximize the overall performance. A bidding mechanism asks how one can design systems so that agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality while maintaining fast execution time. The comparisons are recorded against some well-known techniques such as greedy, branch and bound, game-theoretical auctions, and genetic algorithms.
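The abstract does not specify which auction rule the bidding mechanism uses; as a generic illustration of how an auction can align selfish bids with system-wide goals, a sealed-bid second-price (Vickrey) allocation, in which truthful bidding is a dominant strategy, can be sketched (this is a textbook example, not the paper's mechanism):

```python
def second_price_auction(bids):
    """Sealed-bid second-price auction.

    bids: dict mapping agent id -> bid value.
    The highest bidder wins but pays the second-highest bid, which
    removes any incentive for selfish agents to misreport their value.
    """
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price
```

Here the "controlling hand" over selfish agents is the payment rule itself: since the winner's payment does not depend on their own bid, bidding one's true valuation maximizes each agent's expected benefit.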
Abstract: A minimal complexity version of component mode
synthesis is presented that requires simplified computer
programming, but still provides adequate accuracy for modeling
lower eigenproperties of large structures and their transient
responses. The novelty is that a structural separation into components
is done along a plane/surface that exhibits rigid-like behavior, so that
only the normal modes of each component are needed, without
computing any constraint, attachment, or residual-attachment modes.
The approach requires as input only a few (lower) natural frequencies
and the corresponding undamped normal modes of each component.
A novel technique for formulating the equations of motion is
presented, in which a double transformation to generalized
coordinates is employed and the formulation of a nonproportional
damping matrix in generalized coordinates is shown.
Abstract: This article is devoted to the numerical solution of
large-scale quadratic eigenvalue problems. Such problems arise in
a wide variety of applications, such as the dynamic analysis of
structural mechanical systems, acoustic systems, fluid mechanics,
and signal processing. We first introduce a generalized second-order
Krylov subspace based on a pair of square matrices and two initial
vectors and present a generalized second-order Arnoldi process for
constructing an orthonormal basis of the generalized second-order
Krylov subspace. Then, by using the projection technique and the
refined projection technique, we propose a restarted generalized
second-order Arnoldi method and a restarted refined generalized
second-order Arnoldi method for computing some eigenpairs of
large-scale quadratic eigenvalue problems. Theoretical results are also
presented, and numerical examples illustrate the effectiveness of the
proposed methods.
Abstract: Avalanche release of snow is modeled in the present studies. Snow is assumed to behave as a semi-solid, and the governing equations are derived from a continuum approach. The dynamical equations are solved for two different zones (the starting zone and the track zone) using appropriate initial and boundary conditions. The effects of density (ρ), eddy viscosity (η), slope angle (θ), and slab depth (R) on the flow parameters are examined. Numerical methods are employed for computing the nonlinear differential equations. One of the most interesting and fundamental innovations in the present studies is the derivation, by a numerical approach, of the initial condition for the computation of velocity; this information on the velocity is obtained through concepts of fracture mechanics applicable to snow. The results on the flow parameters are found to be in qualitative agreement with published results.
Abstract: This paper may be considered a combination of pervasive computing and Differential GPS (global positioning system), applied to the control of automatic traffic signals in such a
way as to pre-empt normal signal operation and permit life-saving vehicles to pass. Knowing of the arrival of a life-saving vehicle in
advance gives the signal a chance to clear the traffic. The traffic
signal preemption system includes a vehicle equipped with an onboard computer system capable of capturing diagnostic information and
the estimated location of the life-saving vehicle, using the information provided by a GPS receiver connected to the onboard computer system,
and transmitting this information with a wireless transmitter via a
wireless network. A fleet management system connected to a
wireless receiver receives the information transmitted
by the life-saving vehicle. A computer located at the
intersection uses corrected vehicle position, speed, and direction
measurements, in conjunction with previously recorded data defining
approach routes to the intersection, to determine the optimum time to
switch a traffic light controller to preemption mode so that life-saving
vehicles can pass safely. We also suggest a solution for the case when
an ambulance needs to take a U-turn in a heavy traffic area: a
computerized median that uses removable linked blocks.
Abstract: This paper presents the design and implementation of CASTE, a Cloud-based automatic software test environment. We first present the architecture of CASTE and then describe its main packages and classes in detail. CASTE is built upon a private Infrastructure as a Service platform. Through concentrated resource management of the virtualized testing environment and automatic execution control of test scripts, we obtain a better solution to the problems of testing resource utilization and test automation. Experiments on CASTE give very appealing results.
Abstract: This paper develops an unscented grid-based filter
and a smoother for accurate nonlinear modeling and analysis
of time series. The filter uses unscented deterministic sampling
during both the time and measurement updating phases, to approximate
directly the distributions of the latent state variable. A
complementary grid smoother is also derived to enable computation
of the likelihood. This allows us to formulate an expectation
maximisation algorithm for maximum likelihood estimation of
the state noise and the observation noise. Empirical investigations
show that the proposed unscented grid filter/smoother compares
favourably to other similar filters on nonlinear estimation tasks.
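The core sampling step the filter relies on is the unscented transform; a minimal sketch of propagating a Gaussian through a nonlinearity with 2n+1 sigma points (a generic textbook form with scaling parameter κ, not the paper's grid filter itself) is:

```python
import numpy as np

def unscented_transform(f, mean, cov, kappa=1.0):
    """Propagate (mean, cov) through a nonlinear map f using 2n+1
    deterministic sigma points; returns the transformed mean and
    covariance. kappa is the usual scaling parameter."""
    n = mean.size
    # Sigma points: the mean plus/minus scaled columns of a matrix
    # square root of the covariance.
    L = np.linalg.cholesky((n + kappa) * cov)
    pts = ([mean]
           + [mean + L[:, i] for i in range(n)]
           + [mean - L[:, i] for i in range(n)])
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    # Push every sigma point through f and re-estimate the moments.
    y = np.array([f(p) for p in pts])
    m = w @ y
    d = y - m
    P = (w[:, None] * d).T @ d
    return m, P
```

In the proposed filter, this deterministic sampling replaces random (particle) sampling in both the time update and the measurement update, which is what allows the state distributions to be approximated directly on a grid.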
Abstract: Modeling product configurations requires large amounts of knowledge about technical and marketing restrictions on the product. Previous attempts to automate product configuration concentrate on the representation and management of this knowledge for specific domains in fixed and isolated computing environments. Since knowledge about product configurations is subject to continuous change and is hard to express, these attempts often failed to efficiently manage and exchange the knowledge in collaborative product development. In this paper, XML Topic Maps (XTM) are introduced to represent and exchange knowledge about product configurations in collaborative product development. A product configuration model based on XTM, along with its merging and inference facilities, enables configuration engineers in collaborative product development to manage and exchange their knowledge efficiently. A prototype implementation is also presented to demonstrate that the proposed model can be applied to engineering information systems to exchange product configuration knowledge.
Abstract: The leading topic of this article is the description of Lorentz
forces in containers of cuboid and cylindrical shape. Inside the
container is an electrically conductive melt, driven by a
rotating magnetic field. Input data for comparing the Lorentz forces in
the container of cuboid shape were obtained from the computing
program NS-FEM3D, which uses the DDS method. Values
of the Lorentz forces for the container of cylindrical shape were
obtained from a derived analytical formula.
Abstract: This paper will present the initial findings of a
research into distributed computer rendering. The goal of the
research is to create a distributed computer system capable of
rendering a 3D model into an MPEG-4 stream. This paper outlines
the initial design, software architecture and hardware setup for the
system.
Distributed computing means designing and implementing
programs that run on two or more interconnected computing systems.
Distributed computing is often used to speed up the rendering of
graphical imaging. Distributed computing systems are used to
generate images for movies, games and simulations.
A topic of interest is the application of distributed computing to
the MPEG-4 standard. During the course of the research, a
distributed system will be created that can render a 3D model into an
MPEG-4 stream. It is expected that applying distributed computing
principles will speed up rendering, thus improving the usefulness and
efficiency of the MPEG-4 standard.
Abstract: Much has been written about the difficulties students
have with producing traditional dissertations. This includes both
native English speakers (L1) and students with English as a second
language (L2). The main emphasis of these papers has been on the
structure of the dissertation, but in all cases, even when electronic
versions are discussed, the dissertation is still in what most would
regard as a traditional written form.
Master of Science Degrees in computing disciplines require
students to gain technical proficiency and apply their knowledge to a
range of scenarios. The premise of this paper is this: if a dissertation
is a means of showing that such a student has met the criteria for a
pass, criteria which should be based on the learning outcomes of the
dissertation module, does meeting those outcomes require the student
to demonstrate their skills in a solely text-based form, particularly in
a highly technical research project? Could a student instead produce a
series of related artifacts that form a cohesive package meeting the
learning outcomes of the dissertation?
Abstract: One of the most basic functions of control engineers is
the tuning of controllers; a plant always has several process loops
that require tuning. Auto-tuned Proportional Integral
Derivative (PID) controllers are designed for applications where
large load changes are expected or where extreme accuracy and
fast response times are needed. The algorithm presented in this paper
tunes a PID controller, obtaining its parameters with
minimum computational complexity. It requires continuous analysis
of the variation in a few parameters, and lets the program perform the
plant test and calculate the controller parameters to adjust and
optimize the variables for the best performance. The algorithm
developed needs less time than a normal step response test for
continuous tuning of the PID through gain scheduling.
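The tuning algorithm itself depends on the plant test described in the paper; the discrete PID control law whose parameters it produces can be sketched in its generic textbook form (an illustration, not the paper's implementation):

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt,
    evaluated at fixed sample time dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None  # no derivative term on the first sample

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        deriv = (0.0 if self.prev_error is None
                 else (error - self.prev_error) / self.dt)
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv
```

An auto-tuner such as the one described supplies the gains (kp, ki, kd), and gain scheduling amounts to switching between precomputed gain sets as the operating point changes.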
Abstract: In this paper, a two-dimensional mathematical model is developed for estimating the extent of inland inundation due to the Indonesian tsunami of 2004 along the coastal belts of Peninsular Malaysia and Thailand. The model consists of the shallow water equations together with open and coastal boundary conditions. In order to route the water wave towards the land, the coastal boundary is treated as a time-dependent moving boundary. For the computation of tsunami inundation, the initial tsunami wave is generated in the deep ocean with the strength of the Indonesian tsunami of 2004. Several numerical experiments are carried out in which the slope of the beach is varied to examine its effect on the extent of inundation. The simulated inundation is found to decrease as the slope of the orography increases. Inundation and recession are found to be directly proportional to run-up.
Abstract: Location-aware computing is a type of pervasive
computing that utilizes the user's location as a dominant factor in
providing urban services and applications. One of the important
urban services is navigation instruction for wayfinders in a city,
especially when the user is a tourist. Services presented to tourists
should provide adapted, location-aware instructions. The main
challenge in achieving this goal is finding spatially relevant objects
and location-dependent information. The aim of this paper is the
development of a reusable location-aware model to handle spatial
relevancy parameters in urban location-aware systems. We utilize an
ontology as an approach that can manage spatial relevancy through a
generic model. Our contribution is the introduction of an ontological
model based on the principles of directed interval algebra. Indeed, it
is assumed that the basic elements of our ontology are the spatial
intervals for the user and his/her related contexts; the relationships
between them model the spatial relevancy parameters. The
implementation language for the model is OWL, the Web Ontology
Language. The achieved results show that our proposed location-aware
model and the application adaptation strategies provide appropriate
services for the user.
Abstract: The incorporation of computational fluid dynamics in the design of modern hydraulic turbines appears to be necessary in order to improve their efficiency and cost-effectiveness beyond the traditional design practices. A numerical optimization methodology is developed and applied in the present work to a Turgo water turbine. The fluid is simulated by a Lagrangian mesh-free approach that can provide detailed information on the energy transfer and enhance the understanding of the complex, unsteady flow field, at very small computing cost. The runner blades are initially shaped according to hydrodynamics theory, and parameterized using Bezier polynomials and interpolation techniques. The use of a limited number of free design variables allows for various modifications of the standard blade shape, while stochastic optimization using evolutionary algorithms is implemented to find the best blade that maximizes the attainable hydraulic efficiency of the runner. The obtained optimal runner design achieves considerably higher efficiency than the standard one, and its numerically predicted performance is comparable to a real Turgo turbine, verifying the reliability and the prospects of the new methodology.
Abstract: Cloud computing is an approach that provides computation and storage services on demand to clients over the network, independent of device and location. In the last few years, cloud computing has become a trend in information technology, with many companies transferring their business processes and applications to the cloud. Cloud computing with service-oriented architecture has contributed to the rapid development of Geographic Information Systems. The Open Geospatial Consortium, with its standards, provides the interfaces for hosted spatial data and GIS functionality to integrated GIS applications. Furthermore, with their enormous processing power, clouds provide an efficient environment for data-intensive applications, which can be performed efficiently, with higher precision and greater reliability. This paper presents our work on geospatial data services within the cloud computing environment and its technology. A cloud computing environment is introduced, together with the strengths and weaknesses of running a geographic information system within it. The OGC standards that address our application's interoperability are highlighted. Finally, we outline our system architecture, with utilities for requesting and invoking our developed data-intensive applications as a web service.
Abstract: Technology of thin film deposition is of interest in
many engineering fields, from electronic manufacturing to corrosion
protective coating. A typical deposition process, like that developed
at the University of Eindhoven, considers the deposition of a thin,
amorphous film of C:H or of Si:H on the substrate, using the
Expanding Thermal arc Plasma technique. In this paper a computing
procedure is proposed to simulate the flow field in a deposition
chamber similar to that at the University of Eindhoven and a
sensitivity analysis is carried out in terms of precursor mass flow
rate, electrical power supplied to the torch, and fluid-dynamic
characteristics of the plasma jet, using different nozzles. To this
purpose a deposition chamber similar in shape, dimensions and
operating parameters to the above mentioned chamber is considered.
Furthermore, a method is proposed for a very preliminary evaluation
of the film thickness distribution on the substrate. The computing
procedure relies on two codes working in tandem; the output from
the first code is the input to the second one. The first code simulates
the flow field in the torch, where Argon is ionized according to
Saha's equation, and in the nozzle. The second code simulates the
flow field in the chamber. Due to high rarefaction level, this is a
(commercial) Direct Simulation Monte Carlo code. The gas is a
mixture of 21 chemical species, and 24 chemical reactions of the
Argon and Acetylene plasma are implemented in both codes. The effects of the
above mentioned operating parameters are evaluated and discussed
by 2-D maps and profiles of some important thermo-fluid-dynamic
parameters, such as Mach number, velocity, and temperature. The
intensity, position, and extension of the shock wave are evaluated,
and the influence of the above mentioned test conditions on the film
thickness and uniformity of distribution is also evaluated.