Abstract: As notifications become more common on mobile devices, it is important to understand the impact of wearable devices on the user experience of man-machine interfaces. This study examined the use of a wearable device in a real-time system, using a computer-simulated petrochemical system. The key research question was to determine how information provided by the wearable device can improve human performance, through measures of situational awareness and decision making. Results indicate that there was a reduction in response time when using the watch and no difference in situational awareness. Perception of the watch was positive, with 83% of users finding value in using the watch and receiving haptic feedback.
Abstract: Visually impaired people face daily struggles and spatial barriers because the built environment is often designed with an extreme focus on the visual element, causing what is called architectural visual bias, or ocularcentrism. The aim of the study is to holistically understand the world of the visually impaired, as an attempt to extract the qualities of space that accommodate their needs, and to show the importance of multi-sensory, holistic designs for the blind. Within the framework of existential phenomenology, common themes are reached through "intersubjectivity": experience descriptions by blind people and blind architects, observations of how blind children learn to perceive their surrounding environment, and a personal lived blindfolded experience are analyzed. The extracted themes show how visually impaired people filter out and prioritize tactile (active, passive, and dynamic touch), acoustic, and olfactory spatial qualities, in that order, and how this occurred during the personal lived blindfolded experience. The themes clarify that haptic and aural inclusive designs are essential to create environments suitable for the visually impaired, empowering them towards an independent, safe, and efficient life.
Abstract: This paper presents a home-based robot-rehabilitation instrument, called "MAGNI Dynamics", that utilizes a vision-based kinematic/dynamic module and an adaptive haptic feedback controller. The system is expected to provide personalized rehabilitation by adjusting its resistive and supportive behavior according to a fuzzy intelligence controller that acts as an inference system, correlating the user's performance to different stiffness factors. The vision module uses the Kinect's skeletal tracking to monitor the user's effort in an unobtrusive and safe way, by estimating the torque that affects the user's arm. The system's torque estimations are validated by capturing electromyographic data from primitive hand motions (Shoulder Abduction and Shoulder Forward Flexion). Moreover, we present and analyze how the Barrett WAM generates a force field with a haptic controller to support or challenge the users. Experiments show that shifting the proportional value, which corresponds to different stiffness factors of the haptic path, can potentially help the user improve his/her motor skills. Finally, potential areas for future research are discussed, addressing how a rehabilitation robotic framework may include multisensing data to improve the user's recovery process.
Abstract: The haptic modality has brought a new dimension to human-computer interaction by engaging the human sense of touch. However, designing appropriate haptic stimuli, and in particular tactile stimuli, for various applications is still challenging. To tackle this issue, we present an intuitive system that facilitates the authoring of tactile gestures for various applications. The system transforms a hand gesture into a tactile gesture that can be rendered using a home-made haptic jacket. A case study is presented to demonstrate the ability of the system to develop tactile gestures that are recognizable by human subjects. Four tactile gestures are identified and tested to intensify the following four emotional responses: high valence – high arousal, high valence – low arousal, low valence – high arousal, and low valence – low arousal. A usability study with 20 participants demonstrated high correlation between the selected tactile gestures and the intended emotional reaction. Results from this study can be used in a wide spectrum of applications ranging from gaming to interpersonal communication and multimodal simulations.
Abstract: Stiffness sensing is an important issue in medical diagnostics, robotic surgery, safe handling, and safe grasping of objects in production lines. In surgery, it is necessary to detect and characterize lumps embedded in soft tissue and to remove and handle the detected lumps safely. Likewise in industry, grasping and handling an object without damage, in places a human operator cannot reach, is very important. In this paper, a method for object handling is presented. It is based on an intelligent gripper that detects the object's stiffness and then sets a programmable grasping force to move the object. The main components of the system include sensors (for measuring force and displacement), electronics (electrical and electronic circuits, tactile data processing, and the force control system), mechanics (the gripper mechanism and its driving system), and a display unit. The system uses a rotary potentiometer to measure gripper displacement. Using feedback from a load cell mounted on the gripper finger, a microcontroller calculates the stiffness and then commands the gripper motor to apply a certain force to the object. Experiments on samples of different stiffness show that the gripper works successfully. The gripper can be used in haptic interfaces or in robotic systems for object handling.
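The stiffness-then-grasp logic this abstract describes can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the function names, the least-squares slope estimate, and the safety factor and force bounds are all assumed for the example.

```python
# Hypothetical sketch of stiffness-based grasping: estimate stiffness as the
# slope of force over displacement while the gripper closes, then command a
# grasp force proportional to it. Gains and limits are illustrative only.

def estimate_stiffness(forces, displacements):
    """Least-squares slope of force vs. displacement (N per mm)."""
    n = len(forces)
    mean_f = sum(forces) / n
    mean_d = sum(displacements) / n
    num = sum((d - mean_d) * (f - mean_f) for d, f in zip(displacements, forces))
    den = sum((d - mean_d) ** 2 for d in displacements)
    return num / den

def grasp_force(stiffness, safety_factor=0.5, f_min=0.2, f_max=10.0):
    """Map estimated stiffness to a bounded grasp-force command (N)."""
    f = safety_factor * stiffness  # softer objects get a gentler grip
    return max(f_min, min(f_max, f))

# Example: load-cell force (N) sampled at successive gripper displacements (mm)
disp = [0.0, 0.5, 1.0, 1.5, 2.0]
force = [0.0, 1.1, 2.0, 3.1, 4.0]
k = estimate_stiffness(force, disp)  # slope of the force-displacement data
cmd = grasp_force(k)
```

In a real gripper the slope would be re-estimated continuously while the fingers close, so the force command adapts before the object is fully gripped.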
Abstract: Robotic surgery is used to enhance minimally invasive surgical procedures. It provides greater degrees of freedom for surgical tools but lacks a haptic feedback system to give the surgeon a sense of touch. Surgical robots work in master-slave operation, where the user is the master and the robotic arms are the slaves. Current surgical robots provide precise control of the surgical tools but rely heavily on visual feedback, which sometimes causes damage to inner organs. The goal of this research was to design and develop a real-time Simulink-based robotic system to study the force feedback mechanism during instrument-object interaction. The setup includes a three-VelmexXSlide assembly (XYZ stage) for three-dimensional movement, an end-effector assembly for forceps, an electronic circuit for four strain gages, two Novint Falcon 3D gaming controllers, and a microcontroller board with linear actuators, together with MATLAB and Simulink toolboxes. The strain gages were calibrated using an Imada Digital Force Gauge and tested with a hard-core wire to measure instrument-object interaction in the range of 0-35 N. The designed Simulink model successfully acquires 3D coordinates from the two Novint Falcon controllers and transfers the coordinates to the XYZ stage and forceps. The Simulink model also reads the strain gage signals in real time through the microcontroller's 10-bit analog-to-digital converter, converts voltage into force, and feeds the output signals back to the Novint Falcon controllers for the force feedback mechanism. The experimental setup allows the user to change the forward kinematics algorithms to achieve the best desired movement of the XYZ stage and forceps. This project combines haptic technology with a surgical robot to provide a sense of touch to the user controlling the forceps through a machine-computer interface.
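The voltage-to-force step in this pipeline can be sketched as below. This is an assumed illustration, not the authors' Simulink model: only the 10-bit resolution and the 0-35 N range come from the abstract, while the reference voltage and the linear calibration constants are invented for the example.

```python
# Illustrative sketch (not the authors' code) of the described signal path:
# a 10-bit ADC count from the strain-gage amplifier is converted to volts,
# then to newtons via a linear calibration. V_REF, GAIN, and OFFSET are
# assumed values; a real system would use its own calibration.

ADC_BITS = 10
V_REF = 5.0    # assumed ADC reference voltage (V)
GAIN = 7.0     # assumed calibration slope (N per V): 35 N over 5 V
OFFSET = 0.0   # assumed zero-load offset (N)

def adc_to_voltage(raw):
    """Convert a raw 10-bit ADC count (0..1023) to volts."""
    return raw * V_REF / (2 ** ADC_BITS - 1)

def voltage_to_force(v):
    """Linear calibration from volts to newtons, clamped to the 0-35 N range."""
    f = GAIN * v + OFFSET
    return max(0.0, min(35.0, f))

# Example: a mid-scale reading maps to roughly half of full-scale force
raw = 511
force_n = voltage_to_force(adc_to_voltage(raw))
```

The clamp mirrors the stated 0-35 N measurement range, so a saturated amplifier cannot command an out-of-range force to the haptic controller.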
Abstract: With the recent launch of various portable devices, conducting smart business on them has become common. Since smart business allows company-internal resources to be used from an external remote place, user authentication that can identify authentic users is an important factor. The most commonly used user authentication is a method based on a user ID and password. In authentication using an ID and password, the user must see and enter the authentication information him- or herself. In such a user authentication system, which depends on the user's vision, there is the threat of password leaks through snooping while the user enters his or her authentication information. This study designed and produced a user authentication module that uses an actuator to counter this snooping threat.
Abstract: Aim: To investigate students’ perceptions of using e-models in an inquiry-based curriculum. Approach: 52 second-year dental students completed a pre- and post-test questionnaire relating to their perceptions of e-models and their use in inquiry-based learning. The pre-test occurred prior to any learning with e-models. The follow-up survey was conducted after one year's experience of using e-models. Results: There was no significant difference between the two sets of questionnaires regarding students’ perceptions of the usefulness of e-models and their willingness to use e-models in future inquiry-based learning. Most students preferred using both plaster models and e-models in tandem. Conclusion: Students did not change their attitude towards e-models and most of them agreed or were neutral that e-models are useful in inquiry-based learning. Whilst recognizing the utility of 3D models for learning, students' preference for combining these with solid models has implications for the development of haptic sensibility in an operative discipline.
Abstract: This paper discusses our preliminary experiences in the design of a user interface of a computerized content-rich vocational training courseware meant for users with little or no computer experience. In targeting a growing population with limited access to skills training of any sort, we faced numerous challenges, including language and cultural differences, resource limits, gender boundaries and, in many cases, the simple lack of trainee motivation. With the size of the unskilled population increasing much more rapidly than the numbers of sufficiently skilled teachers, there is little choice but to develop teaching techniques that will take advantage of emerging computer-based training technologies. However, in striving to serve populations with minimal computer literacy, one must carefully design the user interface to accommodate their cultural, social, educational, motivational and other differences. Our work, which uses computer based and haptic simulation technologies to deliver training to these populations, has provided some useful insights on potential user interface design approaches.
Abstract: Disabled people should be efficiently reintegrated into family and society; hence there is a strong need to assist their diminished functions or to replace functions that are totally lost. Assistive technology helps in neutralizing the impairment. Recent advancements in embedded systems have opened up a vast area of research and development for affordable and portable assistive devices for the visually impaired. Although there are many assistive devices on the market that can detect obstacles, and numerous research and development efforts are in progress, the cost, size, intrusiveness, and steep learning curve of these devices prevent the visually impaired from taking advantage of them. This project aims at the design and implementation of a detachable unit that is robust, low cost, and user friendly, augmenting the functionality of the existing white cane to provide above-knee obstacle detection. The designed obstruction detector uses ultrasound sensors to detect obstructions before direct contact. It provides haptic feedback to the user in accordance with the position of the obstacle.
Abstract: Current advancements in nanotechnology depend on capabilities that enable nano-scientists to extend their eyes and hands into the nano-world. For this purpose, a haptics-based system (haptics: devices capable of recreating tactile or force sensations) for the AFM (Atomic Force Microscope) is proposed. The system enables nano-scientists to touch and feel sample surfaces viewed through the AFM, in order to give them a better understanding of the physical properties of the surface, such as roughness, stiffness, and the shape of the molecular architecture. At this stage, the proposed work uses offline images produced by the AFM and performs image analysis to create virtual surfaces suitable for haptic force analysis. The work is being extended from an offline to an online process, where interaction will be done directly on the material surface for realistic analysis.
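The offline image-to-virtual-surface step could look something like the sketch below. This is a minimal assumed illustration, not the system described in the abstract: treating pixel intensity as height and the `z_range_nm` scale are both invented for the example, since real AFM data carries its own height calibration.

```python
# A minimal sketch of an offline pipeline: treat an AFM grayscale image as a
# height field so a haptic renderer can probe it. The intensity-to-height
# scale is an assumed parameter, not a property of real AFM output.
import numpy as np

def image_to_height_map(image, z_range_nm=50.0):
    """Normalize pixel intensities to surface heights in nanometres."""
    img = np.asarray(image, dtype=float)
    span = np.ptp(img)                       # intensity range; guard flat images
    img = (img - img.min()) / (span if span else 1.0)
    return img * z_range_nm

def sample_height(height_map, x, y):
    """Bilinear interpolation so a haptic probe sees a smooth surface."""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, height_map.shape[1] - 1)
    y1 = min(y0 + 1, height_map.shape[0] - 1)
    tx, ty = x - x0, y - y0
    top = height_map[y0, x0] * (1 - tx) + height_map[y0, x1] * tx
    bot = height_map[y1, x0] * (1 - tx) + height_map[y1, x1] * tx
    return top * (1 - ty) + bot * ty

# Example: a 2x2 "image" ramping from dark to bright
hm = image_to_height_map([[0, 128], [128, 255]])
```

A haptic loop would then compare the probe's z-position against `sample_height` at its (x, y) to compute a contact force.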
Abstract: Haptics has been used extensively in many applications, especially in human-machine interaction and virtual reality systems. Haptic technology allows the user to perceive virtual reality as in the real world. However, commercially available haptic devices are expensive and may not be suitable for educational purposes. This paper describes the design and development of a low-cost haptic knob, with only one degree of freedom, for use in rehabilitation or in training hand pronation and supination. End-effectors can be changed to suit different applications or variations in hand size and orientation.
Abstract: A Cable-Driven Locomotion Interface provides a low-inertia haptic interface and is used as a way of enabling the user to walk and interact with virtual surfaces. These surfaces generate Cartesian wrenches which must be optimized for each motorized reel in order to reproduce a haptic sensation in both feet. However, wrench control requires a measure of the cable tensions applied to the moving platform. This measure may be inaccurate if it is based on sensors located near the reel. Moreover, friction hysteresis from the reel's moving parts needs to be compensated for with an evaluation of the low angular velocity of the motor shaft. Also, the pose of the platform is not known precisely due to cable sagging and mechanical deformation. This paper presents a non-ideal motorized reel design with a corresponding control strategy that aims at overcoming the aforementioned issues. A transfer function of the reel, based on frequency responses as a function of cable tension and cable length, is presented along with an optimal adaptive PIDF controller. Finally, a hybrid position/tension control is discussed with a stability analysis for achieving complete functionality of the haptic platform.
Abstract: This paper presents an authoring tool that lets a user easily and intuitively design vibrotactile sensations. A mobile hardware platform powered by Android, a multi-purpose haptic driver, and a linear resonance actuator are used to implement the presented authoring tool. The tool allows users to easily and simply create a vibrotactile sensation by drawing vibrotactile images, and to feel the sensation by rubbing the drawn images on the touch screen of a mobile device. The tool supports a graphical interface for designing, editing, and playing vibrotactile images, as well as a pre-defined file format for saving and opening them.
Abstract: In this paper, the construction of a detailed spine model using the LifeMOD Biomechanics Modeler is presented. The detailed spine model is obtained by refining the spine segments in the cervical, thoracic, and lumbar regions into individual vertebra segments, using bushing elements to represent the intervertebral discs, and building the various ligamentous soft tissues between vertebrae. In the sagittal plane of the spine, a constant posterior-to-anterior force is applied during simulation to determine the dynamic characteristics of the spine, and its magnitude is gradually increased in subsequent simulations. Based on the recorded dynamic properties, displacement-force relationship graphs are established as polynomial functions using the least-squares method and imported into a haptic-integrated graphic environment. A thoracolumbar spine model with complex vertebral geometry, digitized from a resin spine prototype, is utilized in this environment. Using the haptic technique, surgeons can touch the spine model as well as apply forces to it through haptic devices, observing the locomotion of the spine computed from the displacement-force relationship graphs. This study provides a preliminary picture of our ongoing work towards building and simulating bio-fidelity scoliotic spine models, whose dynamic properties are obtained from LifeMOD, in a haptic-integrated graphic environment. These models can help surgeons examine the kinematic behaviors of scoliotic spines and propose possible surgical plans before spine correction operations.
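The least-squares fitting step described above can be sketched as follows. This is a hedged illustration rather than the authors' pipeline: the polynomial degree and the synthetic displacement-force data are assumptions made for the example.

```python
# Minimal sketch of fitting displacement-force pairs (as recorded from
# simulation) with a least-squares polynomial, which a haptic loop can then
# evaluate to get the displacement for a given applied force.
# Degree and sample data below are illustrative assumptions.
import numpy as np

def fit_displacement_force(forces, displacements, degree=3):
    """Least-squares polynomial displacement(force); returns coefficients."""
    return np.polyfit(forces, displacements, degree)

def displacement_at(coeffs, force):
    """Evaluate the fitted polynomial at a given applied force."""
    return np.polyval(coeffs, force)

# Example: a made-up stiffening response where displacement saturates
f = np.linspace(0.0, 100.0, 20)        # applied force (N)
d = 12.0 * (1.0 - np.exp(-f / 40.0))   # displacement (mm), synthetic curve
coeffs = fit_displacement_force(f, d)
```

At runtime the haptic device only needs the stored coefficients, so evaluating the spine's response reduces to one cheap `polyval` call per haptic frame.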
Abstract: This paper describes a proposed support system which enables application designers to effectively create VR applications using multiple haptic APIs. When VR designers create applications, it is often difficult to handle and understand the many parameters and functions that have to be set in the application program using documentation manuals alone. This complication may disrupt creative imagination and result in inefficient coding. We therefore propose a support application that improves the efficiency of VR application development and provides interactive components for confirming operations with the haptic sense in advance. In this paper, we describe improvements to our previously proposed support application, making it applicable to multiple APIs and haptic devices, and evaluate the new application by having participants complete a VR program. Results from a preliminary experiment suggest that our application facilitates the creation of VR applications.
Abstract: In this paper, we present a new type of pointing interface for computers which provides mouse functionality with near-surface haptic feedback. Further, it can be configured as a haptic display where users can feel basic geometric shapes in the GUI by moving a finger over the device surface. These functionalities are achieved by tracking the three-dimensional position of a neodymium magnet using a grid of Hall-effect sensors and generating like-polarity haptic feedback using an electromagnet array. This interface brings haptic sensations into 3D space, whereas previously they were felt only on top of the buttons of haptic mouse implementations.
Abstract: One of the essential requirements of a realistic surgical simulator is to reproduce the haptic sensations arising from interactions in the virtual environment. However, the interaction needs to be performed in real time, since a delay between the user's action and the system's reaction reduces the sensation of immersion. In this paper, a prototype of a coronary stent implant simulator is presented; this system allows real-time interaction with an artery by means of a specific haptic device. To improve the realism of the simulation, the virtual environment is built from real patients' images, and a Web Portal is used to search geographically remote medical centres for a virtual environment with specific features in terms of pathology or anatomy. The functional architecture of the system defines several Medical Centres in which virtual environments built from real patients' images, together with related metadata on their specific pathological or anatomical features, are stored. The searched data are downloaded from the Medical Centre to the Training Centre, which is provided with a specific haptic device and with the software necessary to manage the interaction in the virtual environment. After the integration of the virtual environment into the simulation system, it is possible to perform training on the specific surgical procedure.
Abstract: Since the 1940s, many promising telepresence research results have been obtained. However, telepresence technology has still not reached industrial usage. Although human intelligence is necessary for the successful execution of most manual assembly tasks, the human's ability is limited in some cases, such as the assembly of heavy parts in small/medium lots or prototypes; in such cases of manual assembly, the help of industrial robots is mandatory. Telepresence technology can be considered a solution for performing assembly tasks where human intelligence and the haptic sense are needed to identify and minimize errors during the assembly process and a robot is needed to carry heavy parts. In this paper, preliminary steps to integrate telepresence technology into industrial robot systems are introduced. The system described here combines the human haptic sense and the industrial robot's capability to perform a manual assembly task remotely using a force feedback joystick. Mappings between the joystick's degrees of freedom (DOF) and the robot's are introduced. Simulation and experimental results are shown and future work is discussed.