Abstract: Stiffness sensing is an important issue in medical diagnostics, robotic surgery, safe handling, and safe grasping of objects on production lines. Detecting lumps embedded in soft tissue, obtaining their characteristics, and safely removing and handling them are needed in surgery. In industry, likewise, grasping and handling an object without damaging it, in places inaccessible to a human operator, is very important. In this paper, a method for object handling is presented. It is based on an intelligent gripper that detects the object's stiffness and then sets a programmable grasping force to move the object. The main components of this system include sensors (for measuring force and displacement), electrical parts (electrical and electronic circuits, tactile data processing, and the force control system), mechanical parts (the gripper mechanism and its driving system), and the display unit. The system uses a rotary potentiometer to measure gripper displacement. Using feedback from a load cell mounted on the finger of the gripper, a microcontroller calculates the stiffness and then commands the gripper motor to apply a certain force to the object. Experimental results on samples of different stiffness show that the gripper works successfully. The gripper can be used in haptic interfaces or robotic systems for object handling.
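The abstract does not give the stiffness-estimation formula; a minimal sketch, assuming stiffness is taken as the slope of a linear least-squares fit of load-cell force against gripper displacement (function names and sample data are illustrative, not from the paper):

```python
import numpy as np

def estimate_stiffness(displacement_mm, force_n):
    """Estimate object stiffness (N/mm) as the slope of a linear
    least-squares fit of measured force against gripper displacement."""
    slope, _intercept = np.polyfit(displacement_mm, force_n, 1)
    return slope

# Illustrative readings: a sample compressing 2 mm under 4 N (~2 N/mm).
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
f = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
stiffness = estimate_stiffness(x, f)
```

The controller could then map the estimated stiffness to a grasp-force setpoint, e.g. commanding a lower force for softer objects.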
Abstract: A Cable-Driven Locomotion Interface provides a low
inertia haptic interface and is used as a way of enabling the user
to walk and interact with virtual surfaces. These surfaces generate
Cartesian wrenches which must be optimized for each motorized
reel in order to reproduce a haptic sensation in both feet. However,
the use of wrench control requires a measure of the cable tensions
applied to the moving platform. The latter measure may be inaccurate
if it is based on sensors located near the reel. Moreover, friction
hysteresis from the reel moving parts needs to be compensated
for with an evaluation of low angular velocity of the motor shaft.
Also, the pose of the platform is not known precisely due to cable
sagging and mechanical deformation. This paper presents a non-ideal
motorized reel design with its corresponding control strategy that
aims at overcoming the aforementioned issues. A transfer function
of the reel, based on frequency responses as a function of cable tension
and cable length, is presented with an optimal adaptive PIDF
controller. Finally, a hybrid position/tension control is discussed with
an analysis of its stability for achieving complete functionality of
the haptic platform.
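The abstract names an adaptive PIDF controller but gives no equations; a minimal, non-adaptive sketch of a discrete PID loop with a first-order low-pass filter on the derivative term (the "F"), driving an illustrative first-order reel/cable tension model (all gains, time constants, and the plant model are assumptions, not values from the paper):

```python
class PIDF:
    """Discrete PID controller with a low-pass-filtered derivative term."""
    def __init__(self, kp, ki, kd, tau, dt):
        self.kp, self.ki, self.kd, self.tau, self.dt = kp, ki, kd, tau, dt
        self.integral = 0.0
        self.d_filt = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt
        raw_d = (err - self.prev_err) / self.dt
        alpha = self.dt / (self.tau + self.dt)  # derivative filter coefficient
        self.d_filt += alpha * (raw_d - self.d_filt)
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * self.d_filt

# Illustrative closed loop: first-order reel/cable model with a 50 ms
# time constant, regulated to a 50 N tension setpoint.
dt = 0.001
ctrl = PIDF(kp=2.0, ki=20.0, kd=0.01, tau=0.01, dt=dt)
tension = 0.0
for _ in range(2000):  # 2 s of simulated time
    u = ctrl.update(50.0, tension)
    tension += dt * (u - tension) / 0.05
```

An adaptive variant, as described in the abstract, would additionally reschedule the gains as functions of measured cable tension and cable length.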
Abstract: In this paper, the construction of a detailed spine
model is presented using the LifeMOD Biomechanics Modeler. The
detailed spine model is obtained by refining spine segments in
cervical, thoracic and lumbar regions into individual vertebra
segments, using bushing elements representing the intervertebral
discs, and building various ligamentous soft tissues between
vertebrae. In the sagittal plane of the spine, a constant force will be
applied from the posterior to the anterior direction during simulation to determine
dynamic characteristics of the spine. The force magnitude is
gradually increased in subsequent simulations. Based on these
recorded dynamic properties, graphs of displacement-force
relationships will be established in terms of polynomial functions by
using the least-squares method and imported into a haptic integrated
graphic environment. A thoracolumbar spine model with complex
geometry of vertebrae, which is digitized from a resin spine
prototype, will be utilized in this environment. By using the haptic
technique, surgeons can touch as well as apply forces to the spine
model through haptic devices to observe the locomotion of the spine
which is computed from the displacement-force relationship graphs.
This current study provides a preliminary picture of our ongoing
work towards building and simulating bio-fidelity scoliotic spine
models in a haptic integrated graphic environment whose dynamic
properties are obtained from LifeMOD. These models can be helpful
for surgeons to examine kinematic behaviors of scoliotic spines and
to propose possible surgical plans before spine correction operations.
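The least-squares polynomial fitting of displacement-force relationships described above can be sketched as follows; the polynomial degree and the displacement-force samples are illustrative assumptions, not data from the paper:

```python
import numpy as np

# Illustrative samples: posterior-to-anterior force (N) and resulting
# displacement (mm), standing in for values recorded from LifeMOD.
force_n = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
disp_mm = np.array([0.0, 1.8, 3.2, 4.1, 4.7, 5.1])

# Least-squares cubic fit of displacement as a function of force.
coeffs = np.polyfit(force_n, disp_mm, deg=3)
poly = np.poly1d(coeffs)

# The haptic environment can then evaluate the polynomial to obtain the
# displacement produced by any applied force within the fitted range.
residuals = np.abs(poly(force_n) - disp_mm)
```

Evaluating such a polynomial is cheap, which suits the real-time update rates a haptic rendering loop requires.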
Abstract: The introduction of haptic elements into graphical user interfaces is becoming more widespread. Since haptics are being introduced rapidly into computational tools, investigating how these models affect Human-Computer Interaction would help define how to integrate and model new modes of interaction. The interest of this paper is to discuss and investigate the issues surrounding haptic and graphical user interface (GUI) designs as separate systems, as well as to understand how they work in tandem. The development of these systems is explored from a psychological perspective, based on how usability is addressed through learning and affordances, as defined by J.J. Gibson. Haptic design can be a powerful tool, aiding intuitive learning. The problem discussed in this text is how haptic interfaces can be integrated within a GUI without a sense of frivolity. Juxtaposing haptic and graphical user interfaces raises issues of motivation; GUIs tend to follow a performatory process, while haptic interfaces use affordances to learn tool use. In a deeper view, it is noted that two modes of perception, foveal and ambient, dictate perception. These two modes were once thought to work in tandem; however, it has been discovered that these processes work independently of each other. Foveal modes interpret orientation in space, which provides for posture, locomotion, and motor skills with variations of the sensory information that instruct perceptions of object-task performance. It is contended here that object-task performance is a key element in the use of haptic interfaces, because exploratory learning uses affordances in order to use an object without mediating the experience cognitively. It is a direct experience that, through iteration, can lead to skill-sets. It is also indicated that object-task performance will not work as efficiently without the use of exploratory or kinesthetic learning practices.
Therefore, object-task performance is not as congruently explored in GUIs as it is practiced in haptic interfaces.
Abstract: One of the essential requirements in order to have a
realistic surgical simulator is real-time interaction by means of a
haptic interface. In fact, reproducing haptic sensations increases
the realism of the simulation. However, the interaction needs to be
performed in real-time, since a delay between the user action and the
system reaction reduces the user immersion. In this paper, we present
a prototype of the coronary stent implant simulator developed in the
HERMES Project; this system allows real-time interactions with an
artery by means of a specific haptic device; thus the user can
interactively navigate in a reconstructed artery and force feedback is
produced when contact occurs between the artery walls and the
medical instruments.