Abstract: Faced with social and health system capacity
constraints and rising and changing demand for welfare services,
governments and welfare providers are increasingly relying on
innovation to help support and enhance services. However, the
evidence reported by several studies indicates that the realization of
that potential is not an easy task. Innovations can be deemed
inherently complex to implement and operate, because many of them
involve a combination of technological and organizational renewal
within an environment featuring a diversity of stakeholders. Many
public welfare service innovations are markedly systemic in their
nature, which means that they emerge from, and must address, the
complex interplay between political, administrative, technological,
institutional and legal issues. This paper suggests that stakeholders
engaged in systemic innovation in welfare services must handle
ambiguous and incomplete information in circumstances of
uncertainty. Employing a literature review methodology and a case
study, this paper identifies, categorizes and discusses different
aspects of the uncertainty of systemic innovation in public welfare
services, and argues that uncertainty can be classified into eight
categories: technological uncertainty, market uncertainty,
regulatory/institutional uncertainty, social/political uncertainty,
acceptance/legitimacy uncertainty, managerial uncertainty, timing
uncertainty and consequence uncertainty.
Abstract: In this paper we discuss the effect of an unbounded particle interaction operator on particle growth and study how this informs the choice of appropriate time steps for the numerical simulation. We also provide rigorous mathematical proofs showing that large particles become dominant with increasing time while small particles contribute negligibly. We then assess the efficiency of the algorithm by performing numerical simulation tests and by comparing the simulated solutions with some known analytic solutions of the Smoluchowski equation.
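The numerical setting can be illustrated with a minimal sketch (this is not the paper's algorithm, which concerns unbounded kernels): explicit Euler integration of the discrete Smoluchowski coagulation equation with the constant kernel K(i, j) = 1, for which an analytic solution is known and can serve as the comparison baseline the abstract mentions.

```python
# Minimal sketch: explicit Euler integration of the discrete Smoluchowski
# coagulation equation with the constant kernel K(i, j) = 1, truncated at
# k_max cluster sizes. For monodisperse initial data n_k(0) = delta_{k,1}
# the analytic solution is n_k(t) = (t/2)**(k-1) / (1 + t/2)**(k+1).

def simulate(t_end, dt, k_max=60):
    n = [0.0] * (k_max + 1)          # n[k] = concentration of size-k clusters
    n[1] = 1.0                       # monodisperse initial condition
    for _ in range(int(round(t_end / dt))):
        total = sum(n)
        new = n[:]
        for k in range(1, k_max + 1):
            # gain: coagulation of pairs (i, k-i); loss: k coagulating with anything
            gain = 0.5 * sum(n[i] * n[k - i] for i in range(1, k))
            loss = n[k] * total
            new[k] = n[k] + dt * (gain - loss)
        n = new
    return n

def analytic(k, t):
    """Known exact solution for K = 1 and monodisperse initial data."""
    return (t / 2.0) ** (k - 1) / (1.0 + t / 2.0) ** (k + 1)
```

The time step `dt` must be small enough that the Euler error stays below the truncation error from `k_max`; with fast-growing (unbounded) kernels this constraint tightens, which is the issue the paper studies.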
Abstract: The emergence of blended learning has been
influenced by the rapid changes in Higher Education within the last
few years. However, there is a lack of studies that look into the future
of blended learning in the Saudi context. The most likely explanation
is that blended learning is relatively new and, with respect to learning
in general, under-researched. This study addresses this gap and
explores the views of lecturers and students towards the future of
blended learning in Saudi Arabia. This study was informed by the
interpretive paradigm that appears to be most appropriate to
understand and interpret the perceptions of students and instructors
towards a new learning environment. While globally there has been
considerable research on the perceptions of e-learning and blended
learning with their different models, there is ample room for further
research, specifically in the Arab region and in Saudi Arabia, where
blended learning is now being introduced.
Abstract: The use of machine vision to inspect the outcome of
surgical tasks is investigated, with the aim of incorporating this
approach in robotic surgery systems. Machine vision is a non-contact
form of inspection, i.e., no part of the vision system is in direct contact
with the patient, and is therefore well suited for surgery, where
sterility is an important consideration. As a proof-of-concept, three
primary surgical tasks for a common neurosurgical procedure were
inspected using machine vision. Experiments were performed on
cadaveric pig heads to simulate the two possible outcomes, i.e.,
satisfactory or unsatisfactory, for tasks involved in making a burr
hole, namely incision, retraction, and drilling. We identify low level
image features to distinguish the two outcomes, as well as report on
results that validate our proposed approach. The potential of using
machine vision in a surgical environment, and the challenges that
must be addressed, are identified and discussed.
Abstract: Recent advances in the field of computing have
massively increased the use of web-based electronic documents. Current
Copyright protection laws are inadequate to prove ownership of
electronic documents and do not provide strong safeguards against
copying and manipulating information from the web. This has
opened many channels for securing information and significant
evolutions have been made in the area of information security.
Digital Watermarking has developed into a very dynamic area of
research and has addressed challenging issues for digital content.
Watermarking can be visible (logos or signatures) or invisible
(encoding and decoding). Many visible watermarking techniques
have been studied for text documents, but there are very few for
web-based text. XML files are used to exchange information on the internet
and contain important information. In this paper, two invisible
watermarking techniques using Synonyms and Acronyms are
proposed for XML files to prove intellectual ownership and to
achieve security. An analysis is made for different attacks, and the
embedding capacity of the XML file is also measured.
A comparative analysis for capacity is also made for both methods.
The system has been implemented in C#, and all tests were carried
out in practice to obtain the results.
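The synonym-substitution idea can be illustrated with a toy sketch (the paper's actual embedding rules, synonym dictionary, and XML schema are not reproduced here; the word pairs below are invented, and the code operates on plain text such as would appear in an XML text node). Each synonym pair encodes one bit: the first variant stands for 0, the second for 1.

```python
# Toy illustration of synonym-based invisible watermarking. Bits are embedded
# by substituting each recognised word with the synonym variant that encodes
# the next watermark bit; extraction reads the bits back from the variants.

import re

# Hypothetical synonym pairs: pair[0] encodes bit 0, pair[1] encodes bit 1.
SYNONYMS = [("big", "large"), ("quick", "fast"), ("buy", "purchase")]
LOOKUP = {w: (i, b) for i, pair in enumerate(SYNONYMS) for b, w in enumerate(pair)}

def embed(text, bits):
    """Replace each synonym occurrence with the variant encoding the next bit."""
    bits = list(bits)
    def repl(m):
        w = m.group(0)
        if w in LOOKUP and bits:
            pair = SYNONYMS[LOOKUP[w][0]]
            return pair[bits.pop(0)]
        return w
    return re.sub(r"[a-z]+", repl, text)

def extract(text):
    """Recover the bit sequence from the synonym variants present in the text."""
    return [LOOKUP[w][1] for w in re.findall(r"[a-z]+", text) if w in LOOKUP]
```

The embedding capacity is simply the number of dictionary words present in the document, which is why the abstract reports capacity separately for the synonym and acronym methods.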
Abstract: In this paper, we present a backup and recovery
technique for Peer-to-Peer applications, such as the distributed
asynchronous Web-Based Training system that we have previously
proposed. In order to improve the scalability and robustness of this
system, all contents and functions are realized as mobile agents.
These agents are distributed to computers, and they can be located
using a Peer-to-Peer network based on a modified Content-Addressable
Network. In the proposed system, the entire service does not become
unavailable even if some computers break down, but contents are lost
when their agents disappear. As a solution to this issue, backups of
agents are distributed to computers. If a failure of a computer is
detected, other computers continue the service using the backups of
the agents that belonged to the failed computer.
Abstract: Aerial and satellite images are information rich. They are also complex to analyze. Many GIS applications require fast and reliable extraction of roads and intersections. In this paper, we study efficient and reliable automatic extraction algorithms to address some difficult issues that are commonly seen in high-resolution aerial and satellite images but not well addressed by existing solutions, such as blurring, broken or missing road boundaries, lack of road profiles, heavy shadows, and interfering surrounding objects. The new scheme is based on a new method, the reference circle, to properly identify the pixels that belong to the same road and use this information to recover the whole road network. This feature is invariant to the shape and direction of roads and tolerates heavy noise and disturbances. Road extraction based on reference circles is much more noise-tolerant and flexible than previous edge-detection-based algorithms. The scheme is able to extract roads reliably from images with complex contents and heavy obstructions, such as the high-resolution aerial/satellite images available from Google Maps.
Abstract: In this paper, we explore a new scheme for filtering spoofed packets (DDoS attacks) that combines the path fingerprint and client puzzle concepts. In this scheme, a unique fingerprint is embedded in each IP packet, representing the route the packet has traversed. The server maintains a mapping table which contains each client IP address and its corresponding fingerprint. A client puzzle is placed at the ingress router. For each request, the puzzle issuer provides a puzzle which the source has to solve. Our design has the following advantages over prior approaches: 1) it reduces network traffic, as we place the client puzzle at the ingress router; 2) the mapping table at the server is lightweight and of moderate size.
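The client-puzzle component can be sketched with a standard hash-based construction (illustrative only; the abstract does not specify the exact puzzle used). The issuer sends a random nonce and a difficulty d; the client must find a value x such that SHA-256(nonce || x) begins with d zero bits, which costs roughly 2**d hash trials to solve but one hash to verify.

```python
# Sketch of a hash-based client puzzle: cheap to issue and verify at the
# ingress router, expensive (tunably so) for the requesting client to solve.

import hashlib, os

def leading_zero_bits(digest):
    """Count leading zero bits of a byte string."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        for i in range(7, -1, -1):
            if byte >> i & 1:
                return bits
            bits += 1
    return bits

def issue_puzzle(difficulty=8):
    """Issuer side: a fresh random nonce plus the required difficulty."""
    return os.urandom(8), difficulty

def solve(nonce, difficulty):
    """Client side: brute-force a solution (~2**difficulty hash trials)."""
    x = 0
    while True:
        digest = hashlib.sha256(nonce + x.to_bytes(8, "big")).digest()
        if leading_zero_bits(digest) >= difficulty:
            return x
        x += 1

def verify(nonce, difficulty, x):
    """Router side: a single hash suffices to check the solution."""
    digest = hashlib.sha256(nonce + x.to_bytes(8, "big")).digest()
    return leading_zero_bits(digest) >= difficulty
```

The asymmetry between `solve` and `verify` is what throttles spoofed floods: the ingress router spends one hash per request, while each source must pay the full search cost before its packets are forwarded.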
Abstract: Information and communication technology (ICT) is
essential to the operation of business and creates many employment
opportunities. High volumes of students graduate in ICT; however,
many struggle to find job placement. A discrepancy exists between
graduate skills and industry skill requirements. To address the need
for ICT skills required, universities must create programs to meet the
demands of a changing ICT industry. This requires a partnership
between industry, universities and other stakeholders. This situation
may be viewed as a critical systems thinking problem situation as
there are various role players each with their own needs and
requirements. Jackson states that typical critical systems methods
have a pluralistic nature. This paper explores the applicability and suitability
of Maslow and Dooyeweerd to guide understanding and make
recommendations for change in ICT WIL, to foster an all-inclusive
understanding of the situation by stakeholders. The above methods
provide tools for understanding softer issues beyond the skills
required. The study findings suggest that, beyond skills
requirements, the transition from student to professional needs to be
understood more deeply, and students need to be supported in making it.
Abstract: A study was conducted to formally characterize
notebook computer performance under various environmental and
usage conditions. Software was developed to collect data from the
operating system of the computer. An experiment was conducted to
evaluate the performance parameters' variations, trends, and
correlations, as well as the extreme values they can attain in various
usage and environmental conditions. An automated software script
was written to simulate user activity. The variability of each
performance parameter was addressed by establishing the empirical
relationship between performance parameters. These equations were
presented as baseline estimates for performance parameters, which
can be used to detect system deviations from normal operation and
for prognostic assessment. The effect of environmental factors,
including different power sources, ambient temperatures, humidity,
and usage, on performance parameters of notebooks was studied.
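The baseline idea described above can be sketched as follows (illustrative; the study's actual empirical equations are not given in the abstract): fit a least-squares line relating one performance parameter to another, then flag observations whose residual from the baseline exceeds a threshold as deviations from normal operation.

```python
# Sketch of baseline estimation and deviation detection between two
# performance parameters, using a closed-form least-squares line fit.

def fit_line(xs, ys):
    """Closed-form least squares fit y ~ a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def deviations(xs, ys, a, b, threshold):
    """Indices of observations whose residual from the baseline exceeds threshold."""
    return [i for i, (x, y) in enumerate(zip(xs, ys))
            if abs(y - (a + b * x)) > threshold]
```

Fitting on data collected under normal operation gives the baseline; applying `deviations` to fresh observations supports the prognostic use the abstract describes.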
Abstract: Distributed power generation has gained a lot of
attention in recent times due to constraints associated with
conventional power generation and new advancements in DG
technologies. The need to operate the power system economically
and with optimum levels of reliability has further increased
interest in Distributed Generation. However, it is important to place
a Distributed Generator at an optimal location so that the purpose of
loss minimization and voltage regulation is duly served on the
feeder. This paper investigates the impact of DG units installation on
electric losses, reliability and voltage profile of distribution networks.
In this paper, our aim is to find the optimal distributed
generation allocation for loss reduction subject to the constraint of
voltage regulation in the distribution network. The system is further
analyzed for increased levels of reliability. Distributed Generation
offers the additional advantage of an increase in reliability levels, as
suggested by the improvements in various reliability indices such as
SAIDI, CAIDI and AENS. Comparative studies are performed and
related results are addressed. An analytical technique is used in order
to find the optimal location of Distributed Generator. The suggested
technique is programmed under MATLAB software. The results
clearly indicate that DG can reduce the electrical line loss while
simultaneously improving the reliability of the system.
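The loss-reduction mechanism can be illustrated on a simplified radial feeder (a generic sketch, not the paper's analytical technique or MATLAB implementation; voltage constraints and reliability indices are omitted). Each feeder section carries the current of all downstream loads, losses are the sum of I²R over sections, and a DG placed at a bus cancels part of the current on every upstream section.

```python
# Brute-force DG placement on a radial feeder: section i connects bus i to
# bus i+1, loads[i] is the load current drawn at bus i+1, r[i] the section
# resistance. A DG at bus b injects dg_output, reducing current upstream of b.

def feeder_losses(loads, r, dg_bus=None, dg_output=0.0):
    """Total I^2 * R line losses, optionally with a DG at dg_bus (1-indexed)."""
    losses = 0.0
    for sec in range(len(loads)):
        current = sum(loads[sec:])            # downstream load current
        if dg_bus is not None and dg_bus > sec:
            current -= dg_output              # DG supplies part of it locally
        losses += current ** 2 * r[sec]
    return losses

def best_dg_bus(loads, r, dg_output):
    """Exhaustively pick the bus (1-indexed) minimizing total line losses."""
    return min(range(1, len(loads) + 1),
               key=lambda bus: feeder_losses(loads, r, bus, dg_output))
```

For a uniform feeder the optimum tends toward the far end, since the injected current then offsets losses on every section; non-uniform loads shift it, which is why placement must be solved rather than assumed.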
Abstract: During recent years wind turbine technology has
undergone rapid developments. Growth in size and the optimization
of wind turbines has enabled wind energy to become increasingly
competitive with conventional energy sources. As a result, today's
wind turbines participate actively in the power production of several
countries around the world. These developments raise a number of
challenges to be dealt with now and in the future. The penetration of
wind energy in the grid raises questions about the compatibility of the
wind turbine power production with the grid. In particular, the
contribution to grid stability, power quality and behavior during fault
situations therefore plays as important a role as reliability. In the
present work, we address two fault situations that influence the
generator; the behavior of the wind turbine during these faults is
briefly discussed based on simulation results.
Abstract: This paper describes a UDP over IP based, server-oriented redundant host configuration protocol (RHCP) that can be used by collaborating embedded systems in an ad-hoc network to acquire a dynamic IP address. The service is provided by a single network device at a time and will be dynamically reassigned to one of the other network clients if the primary provider fails. The protocol also allows all participating clients to monitor the dynamic makeup of the network over time. So far the algorithm has been implemented and tested on an 8-bit embedded system architecture with a 10Mbit Ethernet interface.
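The failover behaviour described above can be sketched as pure bookkeeping logic (the actual RHCP wire format, timeout values, and reassignment rule are not specified here; the lowest-ID election rule and 3-second timeout below are assumptions for illustration). Each client records the last heartbeat time of every peer; the live peer with the lowest ID acts as the address-assignment server, and the role migrates automatically when that peer stops responding.

```python
# Sketch of provider failover: track heartbeats per peer and deterministically
# agree on the current provider as the lowest-ID peer seen recently.

TIMEOUT = 3.0  # assumed: seconds without a heartbeat before a peer is presumed dead

def heartbeat(table, peer_id, now):
    """Record that peer_id was heard from at time `now`."""
    table[peer_id] = now

def current_provider(table, now):
    """Lowest-ID peer whose heartbeat is still fresh, or None if none are."""
    live = [pid for pid, last in table.items() if now - last <= TIMEOUT]
    return min(live) if live else None
```

Because every client applies the same deterministic rule to the same heartbeat view, no explicit election messages are needed; clients also get the network-membership monitoring the abstract mentions for free, since the table is exactly the dynamic makeup of the network.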
Abstract: Hierarchical Mobile IPv6 (HMIPv6) was designed to
support IP micro-mobility management in the Next Generation
Networks (NGN) framework. The main design principle behind this
protocol is the use of a Mobility Anchor Point (MAP), located at any
router level of the network, to support hierarchical mobility
management. However, the distance-based MAP selection in HMIPv6
causes MAP overload and increasingly frequent binding updates as the
network grows. Therefore, to address this issue in designing a MAP
selection scheme, we propose a dynamic load control mechanism
integrated with a speed detection mechanism (DMS-DLC). The
experimental results show that the proposed scheme gives a better
distribution of MAP load and increases handover speed.
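A combined speed/load selection rule of this kind can be sketched as follows (illustrative; DMS-DLC's exact thresholds and metrics are assumptions here). Fast mobiles prefer a high, distant MAP to avoid frequent re-binding; slow mobiles prefer a nearby one; and an overloaded MAP is skipped in favour of the next candidate, which is the load-control part.

```python
# Sketch of MAP selection combining speed detection with dynamic load control.

def select_map(maps, speed, speed_threshold, load_limit):
    """maps: list of dicts with 'level' (higher = farther up the hierarchy)
    and 'load'. Returns the chosen MAP dict, or None if all are overloaded."""
    # Fast mobiles: farthest MAP first, to reduce binding updates;
    # slow mobiles: nearest MAP first, to shorten the registration path.
    order = sorted(maps, key=lambda m: m["level"],
                   reverse=speed >= speed_threshold)
    for m in order:
        if m["load"] < load_limit:   # dynamic load control: skip overloaded MAPs
            return m
    return None
```

Skipping overloaded MAPs is what spreads binding load across the hierarchy, while the speed-dependent ordering is what cuts binding-update frequency for fast mobiles.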
Abstract: A novel interpolation scheme to extend the usable
spectrum and to upconvert in high-performance D/A converters is
addressed in this paper. By adjusting the pulse width of the cycle and
the code production circuit, the expansion code inserted by the
interpolation process is either a null code or a complementary code.
The number of interpolation steps and the codes used determine
whether the DAC works in normal mode or in multi-mixer mode,
converting the input digital data into a normal signal or into a mixed
analog signal with a mixer frequency higher than the data frequency.
Simulation results show that the novel scheme and apparatus can
extend the usable frequency spectrum into the fifth to sixth Nyquist
zones, beyond conventional DACs.
Abstract: This paper presents a formalisation of the different existing code mutation techniques (polymorphism and metamorphism) by means of formal grammars. While very few theoretical results are known about the detection complexity of viral mutation techniques, we exhaustively address this critical issue by considering the Chomsky classification of formal grammars. This enables us to determine which families of code mutation techniques are likely to be detected or, on the contrary, are bound to remain undetected. As an illustration we then present, on a formal basis, a proof-of-concept metamorphic mutation engine denoted PB MOT, whose detection has been proven to be undecidable.
Abstract: A new generation of manufacturing machine control,
the so-called MIMCA (modular and integrated machine control
architecture), capable of handling much increased complexity in
manufacturing control systems, is presented. The requirement for more
flexible and effective control systems for manufacturing machine
systems is investigated and dimensioned, which highlights a need for
improved means of coordinating and monitoring production
machinery and equipment used to transport material. MIMCA, which
supports simulation based on machine modeling, was conceived by
the authors to address these issues. Essentially, MIMCA comprises an
organized unification of selected architectural frameworks and
modeling methods, which include NIST RCS, UMC and Colored
Timed Petri nets (CTPN). The unification supports the design and
construction of hierarchical and distributed machine control, realizes
the concurrent operation of reusable and distributed machine control
components, handles growing complexity, and supports the
requirements of real-time control systems. Thus MIMCA enables
mapping between 'what a machine should do' and 'how the machine
does it' in a well-defined but flexible way, designed to facilitate the
reconfiguration of machine systems.
Abstract: The introduction of haptic elements into graphical user interfaces is becoming more widespread. Since haptics are being introduced rapidly into computational tools, investigating how these models affect Human-Computer Interaction would help define how to integrate and model new modes of interaction. The interest of this paper is to discuss and investigate the issues surrounding Haptic and Graphic User Interface (GUI) designs as separate systems, as well as to understand how these work in tandem. The development of these systems is explored from a psychological perspective, based on how usability is addressed through learning and affordances, as defined by J.J. Gibson. Haptic design can be a powerful tool, aiding in intuitive learning. The problem discussed within the text is how haptic interfaces can be integrated within a GUI without a sense of frivolity. Juxtaposing haptics and graphic user interfaces raises issues of motivation; GUIs tend to rely on a performatory process, while haptic interfaces use affordances to learn tool use. In a deeper view, it is noted that two modes of perception, foveal and ambient, dictate perception. These two modes were once thought to work in tandem; however, it has been discovered that these processes work independently from each other. Foveal modes interpret orientation in space, which provides for posture, locomotion, and motor skills with variations of the sensory information, which instructs perceptions of object-task performance. It is contended here that object-task performance is a key element in the use of haptic interfaces, because exploratory learning uses affordances in order to use an object without mediating the experience cognitively. It is a direct experience that, through iteration, can lead to skill-sets. It is also indicated that object-task performance will not work as efficiently without the use of exploratory or kinesthetic learning practices. Therefore, object-task performance is not as congruently explored in GUIs as it is practiced in haptic interfaces.
Abstract: Optimal routing in communication networks is a
major issue to be solved. In this paper, the application of Tabu Search
(TS) to the optimum routing problem is addressed, where the aim is
to minimize the computational time and to improve the quality of the
solution. The goal is to minimize the average delays in the
communication network. The effectiveness of the Tabu Search
method is shown by simulation results for solving the shortest path
problem. Through this approach, the computational cost can
be reduced.
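A Tabu Search for a minimum-delay path can be sketched generically as follows (this is an illustrative formulation, not the paper's exact implementation; the move definition and tabu tenure are assumptions). A move replaces one intermediate node of the current path with another node adjacent to both of its neighbours; nodes removed recently are tabu for a few iterations to prevent cycling back.

```python
# Generic Tabu Search over paths: neighbourhood = single-node substitutions,
# tabu list = recently removed nodes, objective = total link delay.

def path_delay(path, delay):
    return sum(delay[(u, v)] for u, v in zip(path, path[1:]))

def tabu_search(path, delay, adj, iters=50, tenure=3):
    best, tabu = path[:], {}
    for it in range(iters):
        # Enumerate moves: replace path[i] with w adjacent to both neighbours.
        candidates = []
        for i in range(1, len(path) - 1):
            a, old, b = path[i - 1], path[i], path[i + 1]
            for w in adj[a]:
                if w != old and w not in path and b in adj[w]:
                    candidates.append((i, old, w))
        scored = []
        for i, old, w in candidates:
            if tabu.get(w, -1) >= it:      # w was removed recently: tabu move
                continue
            cand = path[:i] + [w] + path[i + 1:]
            scored.append((path_delay(cand, delay), cand, old))
        if not scored:
            break
        d, path, removed = min(scored, key=lambda s: s[0])
        tabu[removed] = it + tenure        # forbid re-inserting for `tenure` iters
        if d < path_delay(best, delay):
            best = path[:]
    return best
```

Unlike a greedy descent, the tabu list allows the search to accept non-improving moves and escape local optima, which is where the solution-quality gains the abstract reports come from.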
Abstract: In this paper, Fabless Prototyping Methodology is
introduced for the design and analysis of MEMS devices.
Conventionally Finite Element Analysis (FEA) is performed before
system level simulation. In our proposed methodology, system level
simulation is performed earlier than FEA, as it is computationally less
intensive and lower in cost. System level simulations are based on
equivalent behavioral models of the MEMS device. An electrostatically
actuated MEMS microgripper is chosen as a case study to
implement this methodology. This paper addresses the behavioral
model development and simulation of actuator part of an
electrostatically actuated Microgripper. Simulation results show that
the actuator part of Microgripper works efficiently for a voltage range
of 0-45V with the corresponding jaw displacement of 0-4.5425μm.
With some minor changes in design, this range can be enhanced to
15μm at 85V.