ABSTRACT
“Haptics” is a technology that adds the sense of touch to virtual environments. Haptic interfaces allow the user to feel as well as see virtual objects on a computer, so we can give the illusion of touching surfaces, shaping virtual clay, or moving objects around.
The sensation of touch is one of the brain’s most effective learning mechanisms, arguably more effective than seeing or hearing, which is why the new technology holds so much promise as a teaching tool.
Haptic technology is like exploring the virtual world with a stick. If you push the stick into a virtual balloon, the balloon pushes back. The computer communicates sensations through a haptic interface: a stick, scalpel, racket or pen connected to force-exerting motors.
With this technology we can now sit down at a computer terminal and touch objects that exist only in the "mind" of the computer. By using special input/output devices (joysticks, data gloves, or other devices), users can receive feedback from computer applications in the form of felt sensations in the hand or other parts of the body. In combination with a visual display, haptics technology can be used to train people for tasks requiring hand-eye coordination, such as surgery and spacecraft maneuvers.
In this paper we explain how sensors and actuators are used for tracking the position and movement of the haptic device moved by the operator, and we describe the different types of force-rendering algorithms. Then we move on to a few applications of haptic technology. Finally, we conclude by mentioning a few future developments.
Introduction
What is Haptics?
Haptics refers to sensing and manipulation through touch. The word comes from the Greek ‘haptesthai’, meaning ‘to touch’.
The history of the haptic interface dates back to the 1950s, when a master-slave system was proposed by Goertz (1952). Haptic interfaces grew out of the field of tele-operation, which was then employed in the remote manipulation of radioactive materials. The ultimate goal of the tele-operation system was "transparency": a user interacting with the master device in a master-slave pair should not be able to distinguish between using the master controller and manipulating the actual tool itself. Early haptic interface systems were therefore developed purely for telerobotic applications.
Working of Haptic Devices
Architecture for Haptic feedback:
The basic architecture for a virtual reality application incorporating visual, auditory, and haptic feedback consists of the following components:
• Simulation engine:
Responsible for computing the virtual environment’s behavior over time.
• Visual, auditory, and haptic rendering algorithms:
Compute the virtual environment’s graphic, sound, and force responses toward the user.
• Transducers:
Convert visual, audio, and force signals from the computer into a form the operator can perceive.
• Rendering:
The process by which desired sensory stimuli are imposed on the user to convey information about a virtual haptic object.
The human operator typically holds or wears the haptic interface device and perceives audiovisual feedback from audio (computer speakers, headphones, and so on) and visual displays (a computer screen or head-mounted display, for example).
Audio and visual channels feature unidirectional information and energy flow (from the simulation engine toward the user), whereas the haptic modality exchanges information and energy in both directions, from and toward the user. This bidirectionality is often referred to as the single most important feature of the haptic interaction modality.
System architecture for haptic rendering:
An avatar is the virtual representation of the haptic interface through which the user physically interacts with the virtual environment.
Haptic-rendering algorithms compute the correct interaction forces between the haptic interface representation inside the virtual environment and the virtual objects populating the environment. Moreover, haptic rendering algorithms ensure that the haptic device correctly renders such forces on the human operator.
1.) Collision-detection algorithms detect collisions between objects and avatars in the virtual environment and yield information about where, when, and ideally to what extent collisions (penetrations, indentations, contact area, and so on) have occurred.
2.) Force-response algorithms compute the interaction force between avatars and virtual objects when a collision is detected. This force approximates as closely as possible the contact forces that would normally arise during contact between real objects.
Hardware limitations prevent haptic devices from applying the exact force computed by the force-response algorithms to the user.
3.) Control algorithms command the haptic device in such a way as to minimize the error between ideal and applicable forces. The discrete-time nature of the haptic-rendering algorithms often makes this difficult.
The force response algorithms’ return values are the actual force and torque vectors that will be commanded to the haptic device.
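To illustrate how these three blocks fit together, here is a minimal Python sketch of one servo tick. The function names, the callback style, and the simple magnitude clamp standing in for the control algorithm are assumptions made for this example only, not part of any particular haptic API.

    import numpy as np

    MAX_FORCE = 6.0  # N, assumed actuator limit of the device

    def servo_tick(read_position, detect_collision, force_response, send_force):
        """One pass of the haptic-rendering pipeline described above."""
        position = read_position()                  # device sensors -> avatar position
        contact = detect_collision(position)        # where, when and how deep did we collide?
        force = force_response(contact, position)   # ideal interaction force (3-vector)
        # Stand-in for the control algorithm: keep the command within what the
        # hardware can actually render by clamping the force magnitude.
        magnitude = np.linalg.norm(force)
        if magnitude > MAX_FORCE:
            force = force * (MAX_FORCE / magnitude)
        send_force(force)                           # command the device actuators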
Existing haptic rendering techniques are currently based upon two main principles: "point interaction" and "ray-based" rendering.
In point interaction, a single point, usually the distal point of a probe, thimble or stylus used for direct interaction with the user, is employed in the simulation of collisions. The point penetrates the virtual objects, and the depth of indentation is calculated between the current point and a point on the surface of the object. Forces are then generated according to physical models, such as spring stiffness or a spring-damper model.
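A minimal sketch of such a point-interaction force model is given below. It assumes a single spherical virtual object and a spring-damper (penalty) law; the stiffness and damping values are arbitrary illustrative choices, not recommended settings.

    import numpy as np

    K = 500.0  # N/m, assumed contact stiffness
    B = 2.0    # N*s/m, assumed contact damping

    def point_contact_force(probe_pos, probe_vel, center, radius):
        """Spring-damper force for a haptic interaction point against a sphere."""
        offset = probe_pos - center
        dist = np.linalg.norm(offset)
        depth = radius - dist              # penetration depth of the point
        if depth <= 0.0 or dist == 0.0:
            return np.zeros(3)             # no contact (or degenerate case): no force
        normal = offset / dist             # outward surface normal at the contact
        v_normal = np.dot(probe_vel, normal)
        # Spring pushes the point out along the normal; damper resists penetration velocity.
        return (K * depth - B * v_normal) * normal

    # Example: probe 2 mm inside a 5 cm sphere centred at the origin.
    f = point_contact_force(np.array([0.0, 0.0, 0.048]), np.zeros(3),
                            np.zeros(3), 0.05)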
In ray-based rendering, the user interface mechanism, for example, a probe, is modeled in the virtual environment as a finite ray. Orientation is thus taken into account, and collisions are determined between the simulated probe and virtual objects. Collision detection algorithms return the intersection point between the ray and the surface of the simulated object.
Computing contact-response forces:
Humans perceive contact with real objects through sensors (mechanoreceptors) located in their skin, joints, tendons, and muscles. We make a simple distinction between the information these two types of sensors can acquire.
1. Tactile information refers to the information acquired through sensors in the skin, with particular reference to the spatial distribution of pressure or, more generally, tractions across the contact area.
To handle flexible materials like fabric and paper, we sense the pressure variation across the fingertip. Tactile sensing is also the basis of complex perceptual tasks like medical palpation, where physicians locate hidden anatomical structures and evaluate tissue properties using their hands.
2. Kinesthetic information refers to the information acquired through the sensors in the joints. Interaction forces are normally perceived through a combination of these two.
To provide a haptic simulation experience, systems are designed to recreate the contact forces a user would perceive when touching a real object.
There are two types of forces:
1. Forces due to object geometry.
2. Forces due to object surface properties, such as texture and friction.
Geometry-dependent force-rendering algorithms:
The first type of force-rendering algorithms aspires to recreate the force interaction a user would feel when touching a frictionless and textureless object.
Force-rendering algorithms are also grouped by the number of Degrees-of-freedom (DOF) necessary to describe the interaction force being rendered.
Surface property-dependent force-rendering algorithms:
All real surfaces contain tiny irregularities or indentations, and modeling them exactly is computationally expensive. Higher accuracy, however, sacrifices speed, a critical factor in real-time applications. Any choice of modeling technique must consider this tradeoff. Keeping the tradeoff in mind, researchers have developed more accurate haptic-rendering algorithms for friction.
In computer graphics, texture mapping adds realism to computer-generated scenes by projecting a bitmap image onto surfaces being rendered. The same can be done haptically.
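A simple way to do this, sketched below, is to treat a grayscale image as a height field and perturb the contact force with its local gradient. The nearest-pixel lookup and the gain value are assumptions chosen for brevity, not a standard algorithm.

    import numpy as np

    TEXTURE_GAIN = 0.4  # scales the height-field gradient into force (assumed value)

    def texture_force(height_map, u, v, normal_force):
        """Lateral force derived from a grayscale height field at texture coords (u, v).

        height_map: 2D array with values in [0, 1]; u, v: coordinates in [0, 1].
        """
        rows, cols = height_map.shape
        i = min(int(v * (rows - 1)), rows - 2)
        j = min(int(u * (cols - 1)), cols - 2)
        # Finite-difference gradient of the height field at the contact point.
        dh_du = height_map[i, j + 1] - height_map[i, j]
        dh_dv = height_map[i + 1, j] - height_map[i, j]
        # Steeper local slope -> larger in-plane force opposing the rise.
        return -TEXTURE_GAIN * normal_force * np.array([dh_du, dh_dv])

    bumps = (np.indices((64, 64)).sum(axis=0) % 8) / 8.0   # toy corrugated texture
    lateral = texture_force(bumps, 0.3, 0.7, normal_force=2.0)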
Controlling forces delivered through haptic interfaces:
Once such forces have been computed, they must be applied to the user. Limitations of haptic device technology, however, have sometimes made applying the force’s exact value as computed by force-rendering algorithms impossible. They are as follows:
• Haptic interfaces can exert forces only of limited magnitude, and not equally well in all directions.
• Haptic devices aren’t ideal force transducers. An ideal haptic device would render zero impedance when simulating movement in free space, and any finite impedance when simulating contact with an object featuring such impedance characteristics. The friction, inertia, and backlash present in most haptic devices prevent them from meeting this ideal.
• Haptic-rendering algorithms operate in discrete time, whereas users operate in continuous time.
• Haptic device position sensors have finite resolution. Consequently, attempting to determine where and when contact occurs always results in a quantization error, which can create stability problems.
All of these issues can limit a haptic application’s realism. High servo rates (that is, short servo periods) are a key requirement for stable haptic interaction.
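The sketch below shows what a high servo rate means in practice: a fixed-period loop (here 1 kHz, a commonly quoted target; the callback name is a placeholder) that must finish reading, computing and commanding within each period.

    import time

    SERVO_RATE_HZ = 1000           # commonly quoted target for stable haptic interaction
    PERIOD = 1.0 / SERVO_RATE_HZ   # 1 ms time budget per tick

    def run_servo_loop(servo_tick, duration_s=1.0):
        """Call servo_tick() at a fixed rate, sleeping away whatever budget is left."""
        next_deadline = time.perf_counter()
        end_time = next_deadline + duration_s
        while time.perf_counter() < end_time:
            servo_tick()               # read position, render force, command the device
            next_deadline += PERIOD
            slack = next_deadline - time.perf_counter()
            if slack > 0:
                time.sleep(slack)      # an overrun (slack <= 0) means the rate cannot be held

    run_servo_loop(lambda: None, duration_s=0.01)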
Haptic Devices
Types of Haptic devices:
There are two main types of haptic devices:
• Devices that allow users to touch and manipulate three-dimensional virtual objects.
• Devices that allow users to "feel" the textures of two-dimensional objects.
Another distinction between haptic interface devices is their intrinsic mechanical behavior.
Impedance haptic devices simulate mechanical impedance—they read position and send force. Simpler to design and much cheaper to produce, impedance-type architectures are most common.
Admittance haptic devices simulate mechanical admittance—they read force and send position. Admittance-based devices are generally used for applications requiring high forces in a large workspace.
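The contrast can be summarized in code: an impedance-type loop samples position and outputs force, while an admittance-type loop samples force and outputs a position set-point. This is only a schematic sketch; the virtual-wall and virtual-mass parameters and the function names are assumptions, not any vendor's API.

    def impedance_step(read_position, send_force, wall_x=0.0, stiffness=600.0):
        """Impedance device: read position, send force (here a one-sided virtual wall)."""
        x = read_position()
        penetration = wall_x - x
        send_force(stiffness * penetration if penetration > 0.0 else 0.0)

    class AdmittanceController:
        """Admittance device: read force, move a simulated mass, send a position set-point."""
        def __init__(self, mass=2.0, dt=0.001):
            self.mass, self.dt = mass, dt
            self.x, self.v = 0.0, 0.0

        def step(self, read_force, send_position):
            f = read_force()
            self.v += (f / self.mass) * self.dt   # integrate acceleration to velocity
            self.x += self.v * self.dt            # integrate velocity to position
            send_position(self.x)                 # the device is driven to this position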
LOGITECH WINGMAN FORCE FEEDBACK MOUSE
It is attached to a base that replaces the mouse mat
and contains the motors used to provide forces back to
the user.
This interface is used to aid computer users who are blind or visually disabled, or who are tactile/kinesthetic learners, by providing a slight resistance at the edges of windows and buttons so that the user can "feel" the Graphical User Interface (GUI). This technology can also provide resistance to textures in computer images, which enables computer users to "feel" pictures such as maps and drawings.
PHANTOM:
The PHANTOM provides single point, 3D force-
feedback to the user via a stylus (or thimble) attached to a
moveable arm. The position of the stylus point/fingertip is
tracked, and resistive force is applied to it when the device
comes into 'contact' with the virtual model, providing accurate, ground referenced force feedback. The physical working space is determined by the extent of the arm, and a number of models are available to suit different user requirements.
The PHANTOM system is controlled by direct current (DC) motors that have sensors and encoders attached to them. The number of motors corresponds to the number of degrees of freedom of a particular PHANTOM system, although most systems produced have three motors.
The encoders track the user’s motion or position along the x, y and z coordinates, while the motors exert forces on the user along the x, y and z axes. From each motor a cable connects to an aluminum linkage, which connects to a passive gimbal that attaches to the thimble or stylus. A gimbal is a device that permits a body freedom of motion in any direction or suspends it so that it remains level at all times.
The PHANTOM is used in surgical simulations and in the remote operation of robots in hazardous environments.
CyberGlove:
The CyberGlove can sense the position and movement of the fingers and wrist.
The basic Cyber Glove system includes one CyberGlove, its instrumentation unit, serial cable to connect to your host computer, and an executable version of VirtualHand graphic hand model display and calibration software.
The CyberGlove has a software programmable switch and LED on the wristband to permit the system software developer to provide the CyberGlove wearer with additional input/output capability. With the appropriate software, it can be used to interact with systems using hand gestures, and when combined with a tracking device to determine the hand's position in space, it can be used to manipulate virtual objects.
CyberGrasp:
The CyberGrasp is a full-hand force-feedback exoskeletal device worn over the CyberGlove. It consists of a lightweight mechanical assembly, or exoskeleton, that fits over a motion-capture glove. About 20 flexible semiconductor sensors sewn into the fabric of the glove measure hand, wrist and finger movement. The sensors send their readings to a computer that displays a virtual hand mimicking the real hand’s flexes, tilts, dips, waves and swivels.
The same program that moves the virtual hand on the screen also directs machinery that exerts palpable forces on the real hand, creating the illusion of touching and grasping. A special computer called a force control unit calculates how much the exoskeleton assembly should resist movement of the real hand in order to simulate the onscreen action. Each of five actuator motors turns a spool that rolls or unrolls a cable. The cable conveys the resulting pushes or pulls to a finger via the exoskeleton.
Applications
Medical training applications:
Such training systems use the Phantom’s force
display capabilities to let medical trainees
experience and learn the subtle and complex
physical interactions needed to become skillful in their art.
A computer based teaching tool has
been developed using haptic technology to train veterinary students to examine the bovine reproductive tract, simulating rectal palpation. The student receives touch feedback from a haptic device while palpating virtual objects. The teacher can visualize the student's actions on a screen and give training and guidance.
Collision Detection:
Collision detection is a fundamental problem in computer animation, physically-based modeling, geometric modeling, and robotics. In these fields, it is often necessary to compute distances between objects or find intersection regions.
In particular, researchers have investigated the computation of global and local penetration depth, distance fields, and multiresolution hierarchies for perceptually driven fast collision detection. These proximity queries have been applied to haptic rendering and rigid-body dynamics simulation.
Minimally Invasive Surgery:
The main goal of this project is to measure forces and torques exerted by the surgeon during minimally-invasive surgery in order to optimize haptic feedback. A standard da Vinci tool affixed with a 6 DOF force/torque transducer will be used to perform basic surgical procedures and the forces applied by the tool will be recorded and analyzed. This will help determine in which degrees of freedom forces are most commonly applied.
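As a rough sketch of the kind of analysis described, assuming the recording arrives as an N x 6 array of forces and torques (the random array below is only a stand-in for real data), per-axis RMS values indicate which degrees of freedom carry the most load:

    import numpy as np

    # Columns: Fx, Fy, Fz (N), Tx, Ty, Tz (N*m); rows: samples over time.
    recording = np.random.default_rng(0).normal(size=(5000, 6))  # stand-in for real data

    rms_per_dof = np.sqrt(np.mean(recording ** 2, axis=0))
    labels = ["Fx", "Fy", "Fz", "Tx", "Ty", "Tz"]
    for label, rms in sorted(zip(labels, rms_per_dof), key=lambda pair: -pair[1]):
        print(f"{label}: RMS = {rms:.3f}")   # largest values = most heavily used DOFs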
Stroke patients:
Stroke patients who face months of tedious rehabilitation to regain the use of impaired limbs may benefit from new haptics systems -- interfaces that add the sense of touch to virtual computer environments -- in development at the University of Southern California's Integrated Media Systems Center (IMSC).
The new systems, being designed by an interdisciplinary team of researchers from the Viterbi School of Engineering and the Annenberg School for Communication, are challenging stroke patients to grasp, pinch, squeeze, throw and push their way to recovery.
Prostate Cancer:
Prostate cancer is one of the leading causes of cancer death among American men, resulting in approximately 31,000 deaths annually. A common treatment method is to insert needles into the prostate to distribute radioactive seeds, destroying the cancerous tissue. This procedure is known as brachytherapy.
The prostate itself and the surrounding organs are all soft tissue. Tissue deformation makes it difficult to distribute the seeds as planned. In our research we have developed a device to minimize this deformation, improving brachytherapy by increasing the seed distribution accuracy.
Removal of lens segment:
Surgeons complete removal of the lens segments in the same way: by holding them at the mouth of the laser/aspiration probe using vacuum and firing the laser to fragment them for aspiration. However, several surgeons have developed different techniques for nuclear disassembly. These include:
Nuclear prechop. This technique, developed by Dr. Dodick himself, involves inserting two Dodick-Kallman Choppers under the anterior capsulotomy, 180° apart and out to the equator of the lens. The surgeon rotates the choppers downward and draws them towards each other, bisecting the lens inside the capsular bag. A similar maneuver then bisects each half. Using the irrigation probe to support the segments during removal is helpful.
Settings: Aspiration: 275 to 300 mmHg; Air infusion: 80 to 100 mmHg; Laser pulses: 1 Hz.
Wehner backcracking. This technique, developed by Wolfram Wehner, M.D., uses the Wehner Spoon, an irrigating handpiece that resembles a shovel at the tip. The surgeon lifts the nucleus using the laser/aspiration probe, inserts the Wehner spoon underneath, and uses the two probes to backcrack the nucleus. The Wehner spoon provides support during removal of the lens segments.
Settings: Aspiration: 275 mmHg; Air infusion: 95 mmHg; Laser pulses: 3 Hz.
Intelligent machines:
The Centre for Intelligent Machines is an inter-departmental inter-faculty research group which was formed to facilitate and promote research on intelligent systems.
Intelligent systems and machines are capable of adapting their behavior by sensing and interpreting their environment, making decisions and plans, and then carrying out those plans using physical actions. The mission of CIM is to excel in the field of intelligent machines, stressing basic research, technology development, and education. CIM seeks to advance the state of knowledge in such domains as robotics, automation, artificial intelligence, computer vision, systems and control theory, and speech recognition.
This is being achieved by collaborative efforts involving researchers with very different interests - CIM faculty and students come from the School of Computer Science, Department of Electrical and Computer Engineering, and the Department of Mechanical Engineering. It is this diversity of interests along with the spirit of collaboration which forms the driving force behind this dynamic research community.
Tactile slip display:
Human fingers are able to manipulate delicate objects without either dropping or breaking them, but lose this ability to a certain degree when using a tele-operated system. One reason for this is that human fingers are equipped with sensors that tell us when our fingerprints at the edge of the contact area start to come off the object we are holding, allowing us to apply the minimum force necessary to hold the object. While several other researchers have built synthetic skins for their robot fingers that work in a similar way to human fingerprints, a tactile haptic device is needed to display these sensations to a human using a tele-operated system. For this purpose we have designed the 2 degree of freedom Haptic Slip Display. We have conducted psychophysical experiments validating the device design and demonstrating that it can improve user performance in a delicate manipulation task in a virtual environment.
Gaming technology:
Flight Simulations: Motors and actuators push, pull, and shake the flight yoke, throttle, rudder pedals, and cockpit shell, replicating the tactile and kinesthetic cues of real flight. Some examples of the simulator’s haptic capabilities include resistance in the yoke when pulling out of a hard dive, the shaking caused by stalls, and the bumps felt when rolling down a concrete runway. These flight simulators look and feel so real that a pilot who successfully completes training on a top-of-the-line Level 5 simulator can immediately start flying a real commercial airliner.
Today, all major video consoles have built-in tactile feedback capability. Various sports games, for example, let you feel bone-crushing tackles or the different vibrations caused by skateboarding over plywood, asphalt, and concrete. Altogether, more than 500 games use force feedback, and more than 20 peripheral manufacturers now market in excess of 100 haptics hardware products for gaming.
Mobile Phones: Samsung has made a phone which vibrates differently for different callers. Motorola has also made haptic phones.
Cars: For the past two model years, the BMW 7 Series has included iDrive (based on Immersion Corp.'s technology), which uses a small wheel on the console to give haptic feedback so the driver can control peripherals such as the stereo, heating, and navigation system through menus on a video screen.
Alps Electric introduced haptic technology for an X-by-Wire system, which was showcased at the Alps Show 2005 in Tokyo. The system consisted of a "cockpit" with a steering wheel, a gearshift lever and pedals that embed haptic technology, and a remote-control car. Visitors could drive the remote-control car by operating the steering wheel, gearshift lever and pedals in the cockpit while watching a screen in front of the cockpit, fed by a camera mounted on the remote-control car.
Robot Control: For navigation in dynamic environments or at high speeds, it is often desirable to provide a sensor-based collision avoidance scheme on-board the robot to guarantee safe navigation. Without such a collision avoidance scheme, it would be difficult for the (remote) operator to prevent the robot from colliding with obstacles. This is primarily due to (1) limited information from the robots' sensors, such as images within a restricted viewing angle without depth information, which is insufficient for the user's full perception of the environment in which the robot moves, and (2) significant delay in the communication channel between the operator and the robot.
Experiments on robot control using haptic devices have shown the effectiveness of haptic feedback in a mobile robot tele-operation system for safe navigation in a shared autonomy scenario.
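One simple form of such feedback, sketched here under the assumption of a single forward range sensor and a one-axis force display, maps the measured obstacle distance to a repulsive force on the operator's device that grows as the robot approaches the obstacle.

    def repulsive_feedback(distance_m, d_safe=1.5, f_max=3.0):
        """Map obstacle distance (m) to a repulsive force (N) felt by the operator.

        No force beyond d_safe; the force rises linearly to f_max as distance -> 0.
        """
        if distance_m >= d_safe:
            return 0.0
        return f_max * (1.0 - distance_m / d_safe)

    # Example: an obstacle 0.5 m ahead produces 2 N pushing back against the stick.
    force = repulsive_feedback(0.5)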
Future Enhancements:
Force Feedback Provided In Web Pages:
This underlying technology automatically assigns "generic touch sensations" to common Web page objects, such as hyperlinks, buttons, and menus.
Virtual Braille Display:
The Virtual Braille Display (VBD) project was created to investigate the possibility of using the lateral skin stretch technology of the STReSS tactile display for Braille. The project was initially conducted at VisuAide inc. and is now being continued in McGill's Haptics Laboratory.
Haptic torch for the blind: The device, housed in a torch, detects the distance to objects, while a turning dial on which the user puts his thumb indicates the changing distance to an object. The pictured device was tested and found to be a useful tool.
CONCLUSION:
Haptics is a promising direction for online computing and e-commerce: it could enhance the shopping experience by letting online shoppers feel merchandise without leaving their homes. As applications of haptics increase, the cost of haptic devices is expected to drop, and this will be one of the major drivers for commercializing haptics.
With many new haptic devices being sold to industrial companies, haptics will soon be a part of a person’s normal computer interaction.
REFERENCES:
http://www.sensable.com/products/datafiles/phantom_ghost/Salisbury_Haptics95.pdf
http://www.wam.umd.edu/~prmartin/3degrees/HAPTIC%20TECHNOLOGY1.doc
http://www.computer.org/cga/cg2004/g2024.pdf
http://www.dcs.gla.ac.uk/~stephen/papers/EVA2001.pdf
http://cda.mrs.umn.edu/~lopezdr/seminar/spring2000/potts.pdf
http://www.sensable.com
http://www.immersion.com
http://www.logitech.com
http://www.technologyreview.com
Motivation and Application of Haptic Systems
1.1 The Meaning of Haptics from a Philosophical and Social Viewpoint
Haptics describes the sense of touch and movement. An engineer tends to describe
haptics in terms of forces, elongations, frequencies, mechanical tensions and shear forces.
This of course makes sense and is important for the technical design process.
However haptics is more than that. Haptic perceptions range from minor interactions
in everyday life, e.g., drinking from a glass or writing this text, to a means of social
communication, e.g. shaking hands or giving someone a pat on the shoulder,
and very personal and private interpersonal experiences. This chapter deals with
the spectrum and influence haptics has on the human being beyond technological
descriptions. It is also a reminder for the development engineer to be responsible and conscious when considering the capability to fool the haptic sense.
1.1.1 Haptics as a Physical Being’s Boundary
Haptics is derived from the Greek term “haptios” and describes “something which
can be touched”. In fact the consciousness about and understanding of the haptic
sense has changed many times in the history of humanity. ARISTOTELES puts the
sense of touch in the last place when naming the five senses:
1. sight
2. hearing
3. smell
4. taste
5. touch
Nevertheless, he attests to this sense a high importance concerning its indispensability [3].
The social estimation of the sense of touch has gone through all imaginable phases. Frequently it was afflicted with the blemish of squalor, as lust is transmitted by it: “Sight differs from touch by its virginity, just as hearing differs from smell and taste: and in the same way their lust-sensation differs [289].”
It was called the sense of excess [78]. In a general subdivision between lower and higher senses, touch was almost constantly ranked within the lower class. In
western civilization the church once stigmatized this sense as forbidden due to the
pleasure which can be gained by it. However, in the 18th century public opinion changed, and KANT is cited with the following statement [126]:
“This sense is the only one with an immediate exterior perception; due to this it is the most important and the most instructive one, but also the roughest. Without this sensing organ we would not be able to grasp our physical shape, to whose perception the other two first-class senses (sight and hearing) have to be referred in order to generate some knowledge from experience.”
KANT thus emphasizes the central function of the sense of touch. It is capable of
teaching the spatial perception of our environment. Only touch enables us to feel and
classify impressions collected with the help of other senses, put them into context
and understand spatial concepts. Although stereoscopic vision and hearing develop early, the first-time interpretation of what we see and hear requires the connection
between both impressions perceived independently and information about distances
between objects. This can only be provided by a sense, which can bridge the space
between a being and an object. Such a sense is the sense of touch. The skin, being a
part of this sense, covers a human’s complete surface and defines his or her physical
boundary, the physical being.
Wearing glasses is another fascinating example of the effect of the relationship between
distance and perception. Short-sightedness requires glasses that demagnify
the picture of the environment on the retina due to the distance between eyeball and
lenses. Shortsighted people have a different view of size, e.g. concerning their own
body height, dependent on whether they wear glasses or contact lenses. At every
change between both optical aids the perception of their body has to adapt. Dependent
on a person’s kind of defective vision this is a consciously perceivable process.
It can be performed within seconds by using the well-known references of one’s own arms which touch things or one’s legs which walk.
Especially in the 20th century, art deals with the sense of touch and plays with its meaning. The furry cup (fig. 1.1) drastically makes you aware of the significance of haptic texture for the perception of surfaces and surface structures. Whereas the general form of the cup remains visible and recognizable, the originally plane ceramic surface is covered by fur. “Fighting the mud” (fig. 1.2) reminds you that
not only hands and fingers are relevant for haptic perception, but that the whole body
surface is able to touch and feel. In 1968 the “Pad- and Touch-Cinema” (fig. 1.3)
allowed visitors to touch VALIE EXPORT’s naked skin for 12 seconds through a box
being covered by a curtain all the time. According to the artist this was the only
valid approach to experience sexuality without the aspect of voyeurism [70]. These are just a few examples of how art and artists have played with the various aspects of haptic perception and artistry. This is repeatedly demonstrated by expositions during the Worldhaptics
Conferences. At the same time, Prof. ISHII from the MIT Media Laboratory and the group of Prof. IWATA at the Graduate School of Systems and Information Engineering of the University of Tsukuba demonstrate startling exhibits (fig. 1.4) of “tangible user interfaces” (TUIs). These interfaces couple visual displays with haptically reconfigurable
objects to provide intuitive human-machine interfaces. There is much
more to find when the senses are sharpened to search for it.
The sense of touch can be a lot of things, e.g. a limitation of the physical being,
which helps to assess distances and calibrate other senses like vision, as well as a
means of social communication and a mediator of very personal experiences. Additionally
it is - like all the other senses - a target of art which makes us aware of
the importance of haptic experiences by fooling, distorting and emphasizing them.
Besides these facets of the haptic sense, its function and its dynamic mechanical
Fig. 1.2 KAZUO SHIRAGA: Doro ni idomu (Fighting the mud) 1955 [70][217].
Fig. 1.3 VALIE EXPORT: Pad- and Touch-Cinema 1968 [70].
properties are also very impressive. Haptic perception in all its aspects is presented
in the following section.
Formation of the Sense of Touch
As shown in the prior section, the sense of touch has numerous functions. The
knowledge of these functions enables the engineer to formulate demands on the
technical system. It is helpful to consider the whole range of purposes the haptic
sense serves. However, at this point we do not yet choose an approach by measuring
its characteristics, but observe the properties of objects discriminated by it.
Fig. 1.4 Example for “Tangible Bits”, with different data streams accessible by opening bottles.
In this case single instrumental voices are combined into a trio [109].
The sense of touch is not only specialized on the perception of the physical
boundaries of the body, as said before, but also on the analysis of surface properties.
Human beings and their predecessors had to be able to discriminate, e.g., the structure of fruits and leaves by touch, in order to identify their ripeness or whether they were edible or not, like e.g. a furry berry among smooth ones. The haptic sense enables us to identify a potentially harmful structure, like e.g. a spiny seed, and to be careful when touching it, in order to obtain its content despite its dangerous
needles. For this reason, the sense of touch has been optimized for the perception
and discrimination of surface properties like e.g. roughness. Surface properties may range from smooth ceramic-like or lacquered surfaces with structural widths of a few μm, to somewhat structured surfaces like coated tables, and to rough surfaces like coarsely woven cord textiles with mesh apertures in the range of several millimeters. Humans have developed a very typical way of interacting with these
surfaces enabling them to draw conclusions based on the underlying perception
mechanism. A human being moves his or her finger along the surface (fig. 1.5),
allowing shear forces to be coupled to the skin. The level of the shear forces is dependent
on the quality of the frictional coupling between the object surface and the
skin. It combines the tangential elasticity of the skin, the normal pre-load F_norm resulting from the touch, the exploration velocity v_explr of the movement, and the quality of the coupling factor μ.
Everyone who has ever designed a technical frictional coupling mechanism
knows that without additional structures or adhesive materials viscous friction between
two surfaces can hardly reach a factor of μr = 0.1. Nevertheless nature, in order to couple shearing forces more efficiently into the skin, has “invented” a special structure at the most important body part for touching and exploration: the fingerprint. The epidermal ridges couple shearing forces efficiently to the skin, since the ridges transmit a bending moment into the upper skin layers. Additionally, these ridges allow form closure with structures of similar widths, which means nothing else but interlocking between the handled object and the hand’s skin. At
first glance this is a surprising function of this structure. When one looks again, it
just reminds you of the fact that nature does not introduce any structure without
a deeper purpose. Two practical facts result from this knowledge: First of all the
understanding of shear-forces’ coupling to the skin has come into focus of current
research [65] and has resulted in an improvement of the design process of tactile
devices. Secondly, this knowledge can be applied to improve the measuring accuracy
of commercial force sensors by building ridge-like structures [275]. Additional
details of the biological basics of tactile perception are given in chapter 3.
Consequently the sense of touch, as said before, has been developed for the discrimination
of surface structures. Although the skin may be our most sensitive organ,
it is still not the only haptically relevant one. Additional receptors are located
within muscles and joints, which enable us to get an impression of acting forces.
Anyone who has ever lifted a four pound weight (e.g. a well filled pitcher) with
an outstretched arm in a horizontal position, will have little recollection of the tactile
surface properties of the handle. The much more impressive experience of such
an experiment is the tensing up of the muscles, their slowly increasing fatigue and
the resulting change in the lifting angles of the joints. This is called “kinaesthetic
perception”. Whereas tactile perception describes forces (≈ 5mN..5N) and elongations
between skin and object which are low in amplitudes (≈1μm..1mm) and high
in frequencies (≈ 10Hz..1000Hz), kinaesthetic perception happens within muscles
and joints at higher forces but with lower dynamics (≈ static..10Hz). This enables
the human being and every other biological system with a firm supportive structure
- may it be bones or shells of chitin - to perform coordinated movements and targeted
interactions with its environment. While tactile perception generates similar
impressions during passive (e.g. a relative movement between a static finger-tip and
a moving surface) and active (e.g. a relative movement between a static surface and
a moving finger-tip) movement, kinaesthetic perception is more complex and influenced by additional factors. The human being is able to change deliberately his or
her mechanical properties. A handshake of the same person can be firm and rigid,
but it may be also loose and amicable. The coupling between muscles, joint position
and perception enables us to consciously influence the kinaesthetics of ourselves,
and to influence the intensity of our kinaesthetic perception in one and the same
situation. This makes us capable of blocking a blow with the same hands we use to
rock a baby to sleep. It gives us the ability to touch a structure carefully before we
grasp it firmly. The borders between action and reaction, active and passive become
blurred in the perspective of kinaesthetics. The awareness of this fact is important
for the requirements on systems with closed-loop control, which are important for
the design of haptic devices (chapter 5). At the same time this adaptability of the
human being and the connected ambiguity of the system’s borders are a significant
challenge for the design of a technical device.
Special Aspects of the Design Process
The design of any technical system always includes a long chain of compromises.
The achievement of the engineer lies in the selection of those compromises which
ensure that existing requirements are still fulfilled. Often these compromises are financially
motivated - a product should be inexpensive during the production process
without losing performance. Concerning these demands, an optimization of systems
with interfaces to other purely technical systems is often elegantly possible. The
technical systems are quite exactly known as to their characteristics and a technical
design can anticipate these characteristics with a certain security margin. Thus the
interpretation of a sensor capturing the rotation of a wheel, e.g. a speedometer, is
a relatively clear task. The necessary speeds are known, and disturbance variables like temperature ranges as well as humidity can be identified; alternatively they can be measured with high exactness. It is also relatively easy to identify the requirement
of measuring a two-dimensional movement of a human operated device on a
level surface - e.g. a computer mouse. The temperature range of the appliance is
known; the disturbance variables are limited to the optical measurement path and
the mechanical surface state and can easily be investigated. Only the speed is not
given as precisely as by a technical system. It results from the consideration about
the maximum speeds a human hand can reach. Here uncertainties soon become evident. Although the dynamics of human movement can technically be measured, a high variance between different people will be observed. This variance also concerns the technical requirements of any object used by humans, be it only the physical dimensions of tables and chairs. Dealing with such variances, matching
measuring methods and statistical analysis methods have entered anthropometric
modeling up to the ergonomic design of workplaces [153] as well as ergonomic standardization (ISO 9241 / DIN 33402). The science of anthropometrics applies to static (lengths, dimensions) and dynamic (speeds) cases. As a matter of fact, every characteristic value applicable to humans is affected by such a wide variance that ergonomic or anthropometric data only allow a proportional estimation. These estimations are called percentiles (fig. 1.6). A percentile
is a percentage of the totality of the data subject to analysis (e.g. European female
children between 10 and 15 years) and, depending on the context, encloses all people
who exceed or are below the percentage.
Fig. 1.6 Anthropometric design for sitting and standing work places considering the 5% and the
95% percentile according to DIN 33406.
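As a small numerical illustration of the percentile idea (the hand-length sample below is invented for the example, not measured anthropometric data), the 5th and 95th percentiles of a data set can be computed directly:

    import numpy as np

    hand_lengths_mm = np.array([168, 172, 175, 178, 180, 183, 185, 188, 192, 199])
    p5, p95 = np.percentile(hand_lengths_mm, [5, 95])
    # A design covering the 5th to 95th percentile fits all users whose hand length
    # lies between these two values, i.e. the central 90 % of the population sampled.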
The concept of percentiles introduced above is well established, as it fits the natural variance of people quite well. With regard to the description of senses and their performance, average values are more common, e.g. when using a threshold. Thresholds themselves
are a key parameter in finding physical values to quantify human performance. Derived
from such values the technical system’s requirements like amplitude, amplitude
change or dynamics can be employed for deceiving a human sense and for
generating a “realistic” or “sufficient” haptic impression. The choice of words already
shows that requirements seldom comprise a concrete verifiable measurement.
They mostly represent a well-known structure, so that a group of people - or just
the superior or the board of directors - is content with its haptic impression. For the
design engineer this is an unsatisfying benchmark. Alternatives will be discussed to a large extent in the course of this book, especially in chapter 6.
The Significance of Haptics in Everyday Professional Life
The importance of haptics for professional life differs dependent on the profession
considered. In handcraft or manual trades the word ‘hand’ already implies the
relevance of haptics for performing these jobs. No bricklayer, carpenter, butcher,
plumber or barber would be able to do his job if the sense of touch did not give them important information about the material they work on. Be it the hair
they hold between their fingers, the humidity of the wall (as a change of heat transmission),
the cable core within the insulation, the difference between tendons and
muscles, the graining of pine and beech trees, the consistency of mortar. Even with
today’s state-of-the-art technology the involvement of man increases with the required
complexity and carefulness of a manual work. With this involvement and
the use of sense of touch the tools usually become less complex. Whereas during
archaeological excavation a first layer of earth is removed with an excavator, when
approaching a hidden structure a shovel will be used or maybe a spatula or for precision
work a brush or even the bare hands. However even in handcraft jobs machines
of increased flexibility made people turn away from the workpiece and its haptic
properties. Today master craftsmen criticize apprentices either for no longer having a feel for materials and their properties or for lacking the information-based technological know-how for the control of machines. By optimizing the interface
between manual work and machine-programming, engineers try to overcome
this gap. But in other areas of professional life, not only in jobs carrying the word
“manual” in their name, the loss of the sense of touch for everyday work has already
taken place.
The Sense of Touch in Everyday Medical Life
In many medical disciplines high manual skills are required. The capabilities of the
sense of touch are necessary for diagnostics and therapy, be it for the identification
of skin diseases, the diagnosis of joints, and the palpation of inner organs from the
outside or via natural openings; or for a direct surgical application like the transplantation
of a heart, the sawing of the cranium or the punctuation of the spinal
cord. The sense of touch transmits a plurality of information about texture, elasticity
and temperature to the medical professional - information which would either
be inaccessible or not so easily accessible in other ways. Nevertheless, in certain
situations it is necessary to substitute the sense of touch in diagnosis and therapy.
Via magnetic resonance imaging e.g. tendons and menisci of the knee can be visualized.
Thus a demanding manual examination of the joints’ movement range is not
necessary; especially as performing the procedure and interpreting the haptically
felt data requires experience and still leaves room for misinterpretation. Additionally
the results of a manual investigation are harder to explain to the patient than
the distinctiveness of a real image. However, when comparing the expenses of both diagnostic procedures, the precedence should be given to the haptic diagnosis. A
compromise can be seen in devices like the “Wristalyzer” [77]. This device either
puts varying loads on a moving joint - the wrist - or actively moves it, while dynamically
measuring the angle vs. displacement curves. Additionally it acquires a
complete electro-myography of the muscles. Besides for diagnosis, devices of this
kind are already planned for therapy. By actively generating forces and torques,
they can be used for the training of all joints of our extremities, of the cervical spine
and of the pelvis. Considering all these factors, there seems to be a tendency for the
mechanization in diagnostics and therapy. In orthopedic areas there is, however, still
some room to discuss its necessity, whereas in surgery there is an urgent need for
mechanization which, however, leads to a loss of haptic impressions. After surgical
interventions like e.g. an appendectomy, the wish for small wounds and scars for
medical and cosmetic reasons has therefore led to the design of laparoscopic instruments
(fig. 1.7). Simply by their length, mass and stiffness they also act as a filter
for the haptic information. This decoupling between patient and surgeon has found
its temporary climax in the DaVinci system (fig. 1.8) - a laparoscopic telemanipulation
system without force feedback. This loss of the sense of touch during surgical
(or any other internistic) interventions is obvious and regrettable. As a result numerous
research projects were and still are focusing on an adequate substitute for the
direct haptic interaction by alternative technologies [73] or improved instruments
with integrated force-feedback [209] (fig. 1.9).
Fig. 1.7 Rigid laparoscopic instrument by Karl Storz.
The Sense of Touch in the Cockpit
Besides the aim of getting information which is already mechanically available
(elasticity, surface structure, etc.), there is the necessity to provide artificially generated
tactile data in addition to overloaded visual or auditory senses in information-loaded working places.
Fig. 1.8 Surgical telemanipulator DaVinci by Intuitive Surgical, installation in Munich.
Fig. 1.9 Functional prototype of a hand-held laparoscopic telemanipulator with an increased number of degrees of freedom at the instrument’s tip, as well as a prepared intracorporal force measurement with haptic feedback on the control unit [209].
Such working places can be found in control stations where
the human has to make time critical and responsible decisions, e.g. within a jet, airplane
or at the steering wheel of a common car. The designers of a cockpit typically
choose between visual, acoustic and haptic transmission paths. Even the choice of
a scroll-wheel with hard stops instead of a pure incremental sensor is influenced by
the knowledge that a selection within a certain range can be much faster done if
the limits of this range are explicitly given [14]. Control knobs like the i-drive in
a BMW allow a reconfiguration of its haptic properties during operation. Warning
signals are already given via vibrating motors or so called “tactons”. Especially in
the military area a complex spatial orientation based on vibrating clothing (fig. 1.10)
for marines and flight personnel is the subject of current research [267, 115], whereas
active sidesticks in military and civil airplanes and vibrating braking assistance or
in-lane guidance in cars are already established.
Fig. 1.10 Vest equipped with vibration motors for the spatial coding of positioning and bearing data
(TNO, Netherlands) [267].
The Sense of Touch at the Desk
There is hardly any other job where the sense of touch has lost as much of its significance as in the office. Just a few decades ago the use of paper, pens in a large
variety, rulers, folders and files was a joyful source of haptic information for the
sense of touch. Today the haptic interface to an office working place is defined by
a keyboard and a mouse. Due to this extreme focus on a single type of haptic interface
for a variety of things, the ergonomics of a keyboard is of extraordinarily
high importance. Besides the switching characteristics of the key itself, the surface
structure and the tactile markers on the letters F and J (fig. 1.11) and the size of
the key are necessary and considerable design criteria. ISO 9241-400 defines clear decision paths for both, the designer and the buyer of keyboards. Nevertheless it is
beyond doubt that major ergonomic improvements are not done by the optimization
of keyboard and mouse, but by improvements of office software ergonomics. Contrary
to many cases where the term “interface” refers only to the graphical interface,
RASKIN’s “The Humane Interface” [203] is a decided and enjoyable collection of
software with unergonomic graphical interfaces offering methods and design criteria
for their improvement.
The Sense of Touch in Music
If regarded from an abstract standpoint, haptic sense and acoustic perception have
multifarious parallels. Both are sensitive to the perception of mechanical oscillations
and cover a comparable frequency range. Thereby the haptic sense rather perceives
frequencies covering two decades below 1 kHz, whereas the acoustic sense rather
perceives frequencies up to two decades above 100 Hz. Music quite often makes
use of these parallels which may be used to perceive the oscillations of the string of
a valuable violin or harp; or to touch the soft vibration of a wind instrument giving
a low A. They are even to be found in studio technology. Devices like the “ButtKicker”
(fig. 1.12) from The Guitammer Company are electrodynamic actuators
which are used as tactile feedback devices during concerts. They transmit the lower
frequency range to the drummer giving the rhythm of the band without drowning out his own instrument. Additionally the acoustic pressure for the musicians is reduced, as
they may not necessarily want to be exposed to the same loudness as their excited
audience. These kinds of actuators are also suitable for e.g. the couch in a home cinema or chairs in front of gaming PCs to increase the perception of bass-intense
effects. Here again, the tactile effect is of similar intensity as the perception of a
bass impulse, connected with the advantage that little acoustic pressure is emanated
resulting in almost no disturbing noise for people around.
Fig. 1.12 Electrodynamic actuator “ButtKicker” for generating low-frequency oscillations on a
drum-stool.
Viewpoint
Haptics describes the sense of touch and movement. An engineer tends to describe
haptics in terms of forces, elongations, frequencies, mechanical tensions and shearforces.
This of course makes sense and is important for the technical design process.
However haptics is more than that. Haptic perceptions range from minor interactions
in everyday life, e.g., drinking from a glass or writing this text, to a means of social
communication, e.g. shaking hands or giving someone a pat on the shoulder,
and very personal and private interpersonal experiences. This chapter deals with
the spectrum and influence haptics has on the human being beyond technological
descriptions. It is also a hint for the development engineer, to be responsible and
conscious when considering the capabilities to fool the haptic sense.
1.1.1 Haptics as a Physical Being’s Boundary
Haptics is derived from the Greek term “haptios” and describes “something which
can be touched”. In fact the consciousness about and understanding of the haptic
sense has changed many times in the history of humanity. ARISTOTELES puts the
sense of touch in the last place when naming the five senses:
1. sight
2. hearing
3. smell
4. taste
5. touch
Nevertheless he attests this sense a high importance concerning its indispensability
[3]:
The social estimation of the sense of touch experienced all imaginable phases.
Frequently it was afflicted with the blemish of squalor, as lust is transmitted by it:
“Sight differs from touch by its virginity, such as hearing differs from smell and
taste: and in the same way their lust-sensation differs [289].”
It was called the sense of excess [78] . In a general subdivision between lower
and higher senses, touch was almost constantly ranged within the lower class. In
western civilization the church once stigmatized this sense as forbidden due to the
pleasure which can be gained by it. However in the 18th century the public opinion
changed and KANT is cited with the following statement [126]:
“This sense is the only one with an immediate exterior perception; due to this it is
the most important and the most teaching one, but also the roughest. Without this
sensing organ we would be able to grasp our physical shape, whose perception the
other two first class senses (sight and hearing) have to be refered to, to generate
some knowledge from experience.”
KANT thus emphasizes the central function of the sense of touch. It is capable of
teaching the spatial perception of our environment. Only touch enables us to feel and
classify impressions collected with the help of other senses, put them into context
and understand spatial concepts. Although stereoscopic vision and hearing develop
early, the first-time interpretation of what we see and hear, requires the connection
between both impressions perceived independently and information about distances
between objects. This can only be provided by a sense, which can bridge the space
between a being and an object. Such a sense is the sense of touch. The skin, being a
part of this sense, covers a human’s complete surface and defines his or her physical
boundary, the physical being.
Wearing glasses is another fascinating example of the effect of the relationship between
distance and perception. Short- sightedness requires glasses that demagnify
the picture of the environment on the retina due to the distance between eyeball and
lenses. Shortsighted people have a different view of size, e.g. concerning their own
body height, dependent on whether they wear glasses or contact lenses. At every
change between both optical aids the perception of their body has to adapt. Dependent
on a person’s kind of defective vision this is a consciously perceivable process.
It can be performed within seconds by using the well known references of one’s own
arms which touch things or one’s legs which walk
Especially in the 20th century art deals with the sense of touch and plays with
its meaning. Drastically the furry-cup (fig. 1.1) makes you aware of the significance
of haptic texture for the perception of surfaces and surface structures. Whereas the
general form of the cup remains visible and recognizable, the originally plane ceramic
surface is covered by fur. “Fighting the mud” (fig. 1.2) remembers you that
not only hands and fingers are relevant for haptic perception, but that the whole body
surface is able to touch and feel. In 1968 the “Pad- and Touch-Cinema” (fig. 1.3)
allowed visitors to touch VALIE EXPORT’s naked skin for 12 seconds through a box
being covered by a curtain all the time. According to the artist this was the only
valid approach to experience sexuality without the aspect of voyeurism [70]. These
are just a few examples of how art and artists played with the various aspects of
haptic perception
artistry. This is repeatedly demonstrated by expositions during the Worldhaptics
Conferences. At the same time, Prof. ISHII from MIT Media Laboratory or the
Graduate School of Systems and the Information Engineering group of the University
of Tsukuba of Prof. IWATA demonstrate startling exhibits (fig. 1.4) of “tangible
user interfaces” (TUI). These interfaces couple visual displays with haptically reconfigurable
objects to provide intuitive human-machine interfaces. There is much
more to find when the senses are sharpened to search for it.
The sense of touch can be a lot of things, e.g. a limitation of the physical being,
which helps to assess distances and calibrate other senses like vision, as well as a
means of social communication and a mediator of very personal experiences .Additionally
it is - like all the other senses - a target of art which makes us aware of
the importance of haptic experiences by fooling, distorting and emphasizing them.
. Besides these facets of the haptic sense, its function and its dynamic mechanical
Fig. 1.2 KAZUO SHIRAGA: Doro ni idomu (Fighting the mud) 1955 [70][217].
Fig. 1.3 VALIE EXPORT: Pad- and Touch-Cinema 1968 [70].
properties are also very impressive. Haptic perception in all its aspects is presented
in the following section.
Formation of the Sense of Touch
As shown in the prior section, the sense of touch has numerous functions. The
knowledge of these functions enables the engineer to formulate demands on the
technical system. It is helpful to consider the whole range of purposes the haptic
sense serves. However, at this point we do not yet choose an approach by measuring
its characteristics, but observe the properties of objects discriminated by it.
Fig. 1.4 Example for “Tangible Bits”, with different data streams accessible by opening bottles.
In this case single instrumental voices are combined to a trio [109] .
The sense of touch is not only specialized in the perception of the physical boundaries of the body, as said before, but also in the analysis of surface properties. Human beings and their predecessors had to be able to discriminate, e.g., the structure of fruits and leaves by touch, in order to identify their ripeness or whether they were edible or not - like a furry berry among smooth ones. The haptic sense enables us to identify a potentially harmful structure, like e.g. a spiny seed, and to be careful when touching it, in order to obtain its content despite its dangerous needles. For this reason, the sense of touch has been optimized for the perception and discrimination of surface properties such as roughness. Surface properties may range from smooth, ceramic-like or lacquered surfaces with structural widths of a few micrometers, to somewhat structured surfaces like coated tables, and to rough surfaces like coarsely woven cord textiles with mesh apertures in the range of several millimeters. Humans have developed a very typical way of interacting with these surfaces that enables them to draw conclusions based on the underlying perception mechanism. A human being moves his or her finger along the surface (fig. 1.5), allowing shear forces to be coupled into the skin. The level of the shear forces depends on the quality of the frictional coupling between the object surface and the skin. It results from the tangential elasticity of the skin, the normal pre-load F_norm resulting from the touch, the exploration velocity v_expl of the movement, and the coupling (friction) coefficient μ.
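As a rough illustration (not taken from the source text), the shear force coupled into the skin can be sketched with a simple friction model; the coefficients below, including the small velocity-dependent term, are assumptions for the sake of the example.

# Minimal sketch (assumed model): shear force coupled into the skin,
# approximated as Coulomb friction plus a small velocity-dependent term.
def shear_force(f_norm, v_expl, mu=0.5, b_viscous=0.2):
    """f_norm: normal pre-load in N, v_expl: exploration velocity in m/s,
    mu: assumed skin-surface friction coefficient,
    b_viscous: assumed viscous coefficient in N*s/m."""
    return mu * f_norm + b_viscous * abs(v_expl)

# Example: a light touch of 0.5 N explored at 5 cm/s
print(shear_force(0.5, 0.05))  # -> 0.26 N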
Everyone who has ever designed a technical frictional coupling mechanism knows that, without additional structures or adhesive materials, the friction between two surfaces can hardly reach a coefficient of μ_r = 0.1. Nevertheless nature, in order to couple shear forces more efficiently into the skin, has "invented" a special structure at the most important body part for touching and exploration: the fingerprint. The epidermal ridges couple shear forces efficiently into the skin, as the ridges transmit a bending moment into its upper layers. Additionally, these ridges allow form closure with structures of similar widths, which means nothing else but an interlocking between the handled object and the hand's skin. At first glance this is a surprising function of the structure; at second glance it just reminds you of the fact that nature does not introduce any structure without a deeper purpose. Two practical facts result from this knowledge: First, the understanding of how shear forces couple into the skin has come into the focus of current research [65] and has improved the design process of tactile devices. Second, this knowledge can be applied to improve the measuring accuracy of commercial force sensors by adding ridge-like structures [275]. Additional details of the biological basics of tactile perception are given in chapter 3.
Consequently the sense of touch, as said before, has been developed for the discrimination
of surface structures. Although the skin may be our most sensitive organ,
it is still not the only haptically relevant one. Additional receptors are located
within muscles and joints, which enable us to get an impression of acting forces.
Anyone who has ever lifted a four-pound weight (e.g. a well-filled pitcher) with an outstretched arm in a horizontal position will have little recollection of the tactile surface properties of the handle. The much more impressive experience of such
an experiment is the tensing up of the muscles, their slowly increasing fatigue and
the resulting change in the lifting angles of the joints. This is called “kinaesthetic
perception". Whereas tactile perception describes forces (≈ 5 mN..5 N) and elongations between skin and object which are low in amplitude (≈ 1 μm..1 mm) and high in frequency (≈ 10 Hz..1000 Hz), kinaesthetic perception happens within muscles and joints at higher forces but with lower dynamics (≈ static..10 Hz). This enables the human being - and every other biological system with a firm supportive structure, be it bones or chitin shells - to perform coordinated movements and targeted interactions with its environment. While tactile perception generates similar
impressions during passive (e.g. a relative movement between a static finger-tip and
a moving surface) and active (e.g. a relative movement between a static surface and
a moving finger-tip) movement, kinaesthetic perception is more complex and influenced by additional factors. The human being is able to deliberately change his or her mechanical properties. A handshake from the same person can be firm and rigid, but it may also be loose and amicable. The coupling between muscles, joint position
and perception enables us to consciously influence the kinaesthetics of ourselves,
and to influence the intensity of our kinaesthetic perception in one and the same
situation. This makes us capable of blocking a blow with the same hands we use to
rock a baby to sleep. It gives us the ability to touch a structure carefully before we
grasp it firmly. The borders between action and reaction, active and passive become
blurred from the perspective of kinaesthetics. Awareness of this fact matters when formulating requirements for systems with closed-loop control, which are central to the design of haptic devices (chapter 5). At the same time this adaptability of the human being, and the resulting ambiguity of the system's borders, is a significant challenge for the design of a technical device.
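To make the tactile and kinaesthetic bands quoted above concrete for a requirements analysis, the short sketch below encodes their approximate ranges; the helper function, the force thresholds of the kinaesthetic band and the band limits themselves are illustrative assumptions, not values from the source text.

# Sketch only: approximate perception bands, with assumed limits.
BANDS = {
    # force range [N], frequency range [Hz]
    "tactile":      {"force": (0.005, 5.0), "freq": (10.0, 1000.0)},
    "kinaesthetic": {"force": (5.0, 100.0), "freq": (0.0, 10.0)},  # upper force bound assumed
}

def primary_channel(force_n, freq_hz):
    """Return which perception channel a stimulus mainly addresses."""
    for name, band in BANDS.items():
        f_lo, f_hi = band["force"]
        q_lo, q_hi = band["freq"]
        if f_lo <= force_n <= f_hi and q_lo <= freq_hz <= q_hi:
            return name
    return "outside the modeled bands"

print(primary_channel(0.5, 200.0))  # -> 'tactile'
print(primary_channel(20.0, 2.0))   # -> 'kinaesthetic'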
Special Aspects of the Design Process
The design of any technical system always includes a long chain of compromises.
The achievement of the engineer lies in the selection of those compromises which
ensure that existing requirements are still fulfilled. Often these compromises are financially
motivated - a product should be inexpensive during the production process
without losing performance. Concerning these demands, an optimization of systems with interfaces to other, purely technical systems is often elegantly possible. The characteristics of technical systems are known quite exactly, and a technical design can anticipate them with a certain safety margin. Thus the interpretation of a sensor capturing the rotation of a wheel, e.g. a speedometer, is a relatively clear task. The necessary speeds are known, and disturbance variables such as temperature ranges and humidity can be identified or measured with high accuracy. It is also relatively easy to identify the requirement of measuring the two-dimensional movement of a human-operated device on a
level surface - e.g. a computer mouse. The temperature range of the appliance is
known; the disturbance variables are limited to the optical measurement path and
the mechanical surface state and can easily be investigated. Only the speed is not
given as precisely as by a technical system. It results from considerations about the maximum speeds a human hand can reach. Here uncertainties soon become evident. Although the dynamics of human movement can technically be measured, a high variance between different people will be observed. This variance also concerns the technical requirements of any object used by humans, be it only the physical dimensions of tables and chairs. To deal with such variances, suitable measuring methods and statistical analysis methods have entered anthropometric modeling, the ergonomic design of workplaces [153], and ergonomic standardization (ISO 9241 / DIN 33 402). The science of anthropometrics covers static (lengths, dimensions) as well as dynamic (speeds) quantities. As a matter of fact, every characteristic value applicable to humans is affected by such a wide variance that, given ergonomic or anthropometric data, only a proportional estimation can be made. These estimations are called percentiles (fig. 1.6). A percentile is a percentage of the totality of the data subject to analysis (e.g. European female children between 10 and 15 years) and, depending on the context, encloses all people who exceed or are below that percentage.
Fig. 1.6 Anthropometric design for sitting and standing work places considering the 5% and the
95% percentile according to DIN 33406.
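As a minimal illustration (hypothetical sample data, not from the source), percentiles of an anthropometric measurement can be computed directly with NumPy:

# Sketch: 5th/50th/95th percentile of a hypothetical sample of hand lengths [mm].
import numpy as np

hand_lengths_mm = np.array([165, 172, 178, 180, 183, 185, 188, 190, 195, 204])
p5, p50, p95 = np.percentile(hand_lengths_mm, [5, 50, 95])
# A grip designed for the 5th..95th percentile range covers roughly 90 % of this sample.
print(f"5th: {p5:.1f} mm, 50th: {p50:.1f} mm, 95th: {p95:.1f} mm")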
The use of the percentiles introduced above is well established, as it fits the natural variance of people quite well. With regard to the description of the senses and their performance, average values are more common, e.g. when using a threshold. Thresholds themselves are a key parameter in finding physical values to quantify human performance. Derived from such values, the technical system's requirements - such as amplitude, amplitude change or dynamics - can be employed for deceiving a human sense and for generating a "realistic" or "sufficient" haptic impression. The choice of words already shows that such requirements seldom comprise a concrete, verifiable measurement. They mostly represent a well-known structure, so that a group of people - or just the superior or the board of directors - is content with its haptic impression. For the design engineer this is an unsatisfying benchmark. Alternatives will be discussed at length in the course of this book, especially in chapter 6.
The Significance of Haptics in Everyday Professional Life
The importance of haptics for professional life differs depending on the profession
considered. In handcraft or manual trades the word ‘hand’ already implies the
relevance of haptics for performing these jobs. No bricklayer, carpenter, butcher,
plumber or barber would be able to do his job, if the sense of touch did not give
them important information about the material they work on - be it the hair they hold between their fingers, the humidity of a wall (perceived as a change of heat transmission), the cable core within its insulation, the difference between tendons and muscles, the graining of pine and beech wood, or the consistency of mortar. Even with today's state-of-the-art technology, human involvement increases with the required complexity and care of a manual task, and with this involvement and the use of the sense of touch the tools usually become less complex. Whereas during an archaeological excavation the first layer of earth is removed with an excavator, a shovel is used when approaching a hidden structure, then perhaps a spatula, and for precision work a brush or even the bare hands. However, even in handcraft jobs, increasingly flexible machines have made people turn away from the workpiece and its haptic properties. Today master craftsmen criticize apprentices either for no longer having a feeling for materials and their properties, or for lacking the information-technology know-how to control the machines. By optimizing the interface between manual work and machine programming, engineers try to close this gap. But in other areas of professional life, not only in jobs carrying the word "manual" in their name, the loss of the sense of touch for everyday work has already taken place.
The Sense of Touch in Everyday Medical Life
In many medical disciplines high manual skills are required. The capabilities of the
sense of touch are necessary for diagnostics and therapy, be it for the identification
of skin diseases, the diagnosis of joints, or the palpation of inner organs from the outside or via natural openings; or for a direct surgical application like the transplantation of a heart, the sawing of the cranium or the puncture of the spinal cord. The sense of touch transmits a plurality of information about texture, elasticity
and temperature to the medical professional - information which would either
be inaccessible or not so easily accessible in other ways. Nevertheless, in certain
situations it is necessary to substitute the sense of touch in diagnosis and therapy.
Via magnetic resonance imaging e.g. tendons and menisci of the knee can be visualized.
Thus a demanding manual examination of the joint's range of movement is no longer necessary, especially as performing the procedure and interpreting the haptically felt data requires experience and still leaves room for misinterpretation. Additionally, the results of a manual investigation are harder to explain to the patient than the distinctiveness of a real image. However, when comparing the expenses of both diagnostic procedures, precedence should be given to the haptic diagnosis. A
compromise can be seen in devices like the “Wristalyzer” [77]. This device either
puts varying loads on a moving joint - the wrist - or actively moves it, while dynamically
measuring the angle vs. displacement curves. Additionally it acquires a
complete electromyography of the muscles. Besides diagnosis, devices of this kind are already being planned for therapy. By actively generating forces and torques, they can be used for the training of all joints of our extremities, of the cervical spine and of the pelvis. Considering all these factors, there seems to be a tendency towards mechanization in diagnostics and therapy. In orthopedic areas there is still some room to discuss its necessity, whereas in surgery there is an urgent need for mechanization, which, however, leads to a loss of haptic impressions. For surgical interventions such as an appendectomy, the wish for small wounds and scars for medical and cosmetic reasons has led to the design of laparoscopic instruments (fig. 1.7). Simply by their length, mass and stiffness they also act as a filter
for the haptic information. This decoupling between patient and surgeon has found
its temporary climax in the DaVinci system (fig. 1.8) - a laparoscopic telemanipulation
system without force feedback. This loss of the sense of touch during surgical
(or any other internal) interventions is obvious and regrettable. As a result, numerous
research projects were and still are focusing on an adequate substitute for the
direct haptic interaction by alternative technologies [73] or improved instruments
with integrated force-feedback [209] (fig. 1.9).
Fig. 1.7 Rigid laparoscopic instrument by Karl Storz.
The Sense of Touch in the Cockpit
Besides the aim of getting information which is already mechanically available (elasticity, surface structure, etc.), there is the necessity to provide artificially generated tactile data in addition to the overloaded visual or auditory senses at information-loaded working places.
Fig. 1.8 Surgical telemanipulator DaVinci by Intuitive Surgical, installation in Munich.
Fig. 1.9 Functional model of a hand-held laparoscopic telemanipulator with an increased number of degrees of freedom at the instrument's tip, including a provision for intracorporal force measurement with haptic feedback on the control unit [209].
Such working places can be found in control stations where
the human has to make time critical and responsible decisions, e.g. within a jet, airplane
or at the steering wheel of a common car. The designers of a cockpit typically
choose between visual, acoustic and haptic transmission paths. Even the choice of
a scroll-wheel with hard stops instead of a pure incremental sensor is influenced by the knowledge that a selection within a certain range can be made much faster if the limits of this range are explicitly given [14]. Control knobs like the iDrive in a BMW allow a reconfiguration of their haptic properties during operation. Warning signals are already given via vibration motors or so-called "tactons". Especially in the military area, complex spatial orientation based on vibrating clothing (fig. 1.10) for marines and flight personnel is the subject of current research [267, 115], whereas
active sidesticks in military and civil airplanes and vibrating braking assistance or
in-lane guidance in cars are already established.
Fig. 1.10 Vest equipped with vibration motors for the spatial coding of position and bearing data (TNO, Netherlands) [267].
The Sense of Touch at the Desk
There is hardly any other job where the sense of touch has lost as much of its significance as in the office. Just a few decades ago the use of paper, a large variety of pens, rulers, folders and files was a joyful source of information for the sense of touch. Today the haptic interface to an office working place is defined by a keyboard and a mouse. Due to this extreme focus on a single type of haptic interface for a variety of tasks, the ergonomics of the keyboard is of extraordinarily high importance. Besides the switching characteristics of the key itself, the surface structure, the tactile markers on the letters F and J (fig. 1.11) and the size of the keys are important design criteria. ISO 9241-400 defines clear decision paths for both the designer and the buyer of keyboards. Nevertheless it is beyond doubt that major ergonomic improvements are achieved not by the optimization of keyboard and mouse, but by improvements in office software ergonomics. Contrary to many cases where the term "interface" refers only to the graphical interface, RASKIN's "The Humane Interface" [203] is a decided and enjoyable collection of software with unergonomic graphical interfaces, offering methods and design criteria for their improvement.
The Sense of Touch in Music
If regarded from an abstract standpoint, haptic sense and acoustic perception have
multifarious parallels. Both are sensitive to the perception of mechanical oscillations
and cover a comparable frequency range. The haptic sense mainly perceives frequencies in the two decades below 1 kHz, whereas the acoustic sense mainly perceives frequencies up to two decades above 100 Hz. Music quite often makes use of these parallels, which can be experienced when perceiving the oscillations of the string of a valuable violin or harp, or when touching the soft vibration of a wind instrument playing a low A. They are even to be found in studio technology. Devices like the "ButtKicker" (fig. 1.12) from The Guitammer Company are electrodynamic actuators used as tactile feedback devices during concerts. They transmit the lower frequency range to the drummer, who gives the rhythm of the band, without drowning out his own instrument. Additionally the acoustic pressure on the musicians is reduced, as they may not necessarily want to be exposed to the same loudness as their excited audience. These kinds of actuators are also suitable, e.g., for the couch in a home cinema or for chairs in front of gaming PCs, to increase the perception of bass-intense effects. Here again, the tactile effect has an intensity similar to the perception of a bass impulse, with the advantage that little acoustic pressure is emitted, resulting in almost no disturbing noise for people nearby.
Fig. 1.12 Electrodynamic actuator “ButtKicker” for generating low-frequency oscillations on a
drum-stool.
Preface to Haptics
The term "haptics", unlike the terms "optics" or "acoustics", is not so well known to the majority of people, not even to those who buy and use products related to haptics. The words "haptics" and "haptic" refer to everything concerning the sense of touch. "Haptics" is everything and everything is "haptic", because the term does not only describe pure mechanical interaction, but also includes thermal and pain (nociceptive) perception. The sense of touch makes it possible for humans and other living beings to perceive the "borders of their physical being", i.e. to identify where their own body begins and where it ends. With regard to this aspect, the sense of touch is much more efficient than the sense of vision, both in resolution and in the covered solid angle. In the heat of a basketball match, for example, a light touch on our
back immediately makes us aware of an attacking player we do not see. We notice
the intensity of contact, the direction of the movement by a shear on our skin or a
breeze moving our body hairs - all this is perceived without catching a glimpse of
the opponent.
"Haptic systems" can be divided into two classes. There are time-invariant systems (the keys of my keyboard), which generate a more or less unchanging haptic effect whether they are pressed today or in a year's time. Structures like surfaces, e.g. the wooden surface of my table, are also part of this group; these haptically interesting surfaces are often called "haptic textures". Furthermore, there are active, reconfigurable systems, which change their haptic properties partly or totally depending on a preselection - e.g. from a menu. Finally, there are combinations and hybrid forms of such systems, which are presented and explained in the corresponding chapters. The focus of this book is on the technological design criteria for active, reconfigurable systems providing a haptic coupling of user and object in a mainly mechanical understanding. Thermal and nociceptive perceptions are mentioned according to their significance but are not discussed in detail. The same holds for passive haptic systems.
The fact that you have bought this book suggests that you are interested in haptics.
You might have already tried to sketch a technical system meant to fool haptic perception.
And this attempt may have been more or less successful, e.g. concerning
your choice of the actuators. Maybe, you are just planning a project as part of your
studies or as a commercial product aimed at improving a certain manual control or
at introducing a new control concept. Approaches of this kind are quite frequent.
Many of the first active haptic systems were used in airplanes, to make the pilot aware of critical situations by a vibrating control handle. Nowadays, the most widespread active haptic system is surely the vibration alarm of a mobile phone. It enables its user to notice the reception of a message without visual or auditory contact, whereby even the type of message - SMS or phone call - is coded in this buzzing haptic signal. More complex haptic systems can be found in automotive technology, such as reconfigurable haptic control knobs. They are typically located in the center of the control console and are usually part of luxury limousines. Today, multidimensional haptic interaction is no longer limited to the navigation or modeling purposes of professional users, but has also found its way into computer gaming.
Maybe you are a member of the large group of doctors and surgeons actively using haptics in medical technology. There has been a continuous increase in the complexity of the tools for minimally invasive surgery - elongated instruments with a limited number of degrees of freedom to inspect and manipulate human tissue through small artificial or natural openings in the human body. This automatically results in the loss of direct contact between the surgeon and the manipulated tissue. For decades, the wish to improve the haptic feedback in such applications and/or to realize training methods for minimally invasive surgery has been a strong motivation for researchers in haptic device design; although a satisfactory commercial breakthrough is still missing, significant improvements in telemanipulation and simulation have been achieved.
Despite, or even because of, the great variety of projects in industry and research working with haptic systems, the common understanding of "haptics" and of the terms directly related to it, like "kinaesthetic" and "tactile", is by no means as unambiguous and indisputable as it should be. In this book, we, the authors, intend to offer you help so that you can act with more confidence in the area of designing haptic devices. This book
will begin with the presentation of the terminology and its usage according to what
we regard as appropriate. Then it will provide a deeper understanding of haptics and
a simplified engineering description, and will finally lead to concrete instructions
and recommendations for the design of technologically complex haptic systems.
Besides the intention to generate real hardware designs, there is another reason for dealing with haptic device design: a continuing ambition to extend one's knowledge of haptic perception. This discipline, named "psychophysics", is a "fuzzy", non-deterministic science that formulates hypotheses and systematically checks them with the help of experiments and observations. These experiments are paramount to any progress. Consequently, special attention has to be paid to their quality and to the parameters observed. As a by-product of this important branch of haptic research
a plurality of devices and technical systems have been built. In fact psychophysics
uses expertise in many different disciplines to solve its problems. As a consequence,
important and creative engineers and scientists like Prof. HONG TAN and Prof. VINCENT
HAYWARD have not only been designing high fidelity and very efficient haptic
devices, but are also heavily involved in the research on psychophysical parameters.
Psychophysics with an emphasis on haptic questions is a very dynamic science. Every year, countless results and experiments are published at conferences and in journals. Recently, MARTIN GRUNWALD [79] published a notable summary of the latest state of knowledge. The book you are holding in your hands does not claim to keep up with every detail of this psychophysical progress. However, it tries to incorporate as many of its findings as possible into the design of haptic devices. This book has been written by, and is addressed to, engineers of all the disciplines mentioned before: design engineers representing mechanical engineering, hardware-oriented electrical engineering, control engineering, software engineering, or a synergy of expertise in all disciplines of mechatronics.
As said before, the haptic sense is doubtlessly gaining in importance. This can be
concluded from the great number of scientific publications on this subject and from
the fact that all relevant distal senses like the senses of sight and hearing have already
been provided with synthetic information of almost perfect quality in everyday life. "Perfect quality" may have different meanings depending on the actual context. A realistic rendering of a sensory experience can be an important requirement: the resolution of a 3D monitor has to be finer than the resolution capability of the human eye, both in color dynamics and in the spatial distances between the picture elements (pixels), and sounds have to be traceable in space and must not be corrupted by artifacts of the storage or transmission medium. In other circumstances, attracting attention can be the measure of "perfect quality"; warning signals on the dashboard of a car are typical visual examples, as are acoustic signals in the cockpit of an airplane. Yet another demand on "perfect quality" can be the simultaneous requirement of high discriminability and large range - just think of navigational signals for ships. Both areas - optics and acoustics - have been subject to intense research for decades and have been provided with numerous intelligent device designs. In many cases the limits of human perception of the information provided have nowadays been reached or even crossed. At this point it is an obvious step to make use of another human sense to transmit information. Another motivation is the true-to-life simulation of virtual environments. Now that visual and auditory presentation have reached a high quality, the focus is directed to the haptic sense as the next important one. Only this sense enables us to experience our physical borders and the synergy of interaction and perception.
Further areas of haptic research are telepresence and telemanipulation systems. In
these cases, an intuitive and immediate feedback is a prerequisite for a safe handling
of, e.g., dangerous and/or valuable materials. There are thus reasons enough for dealing with the design of haptic devices, which are demanded by the market. However, experts are rare and access to this subject is difficult. The design of haptic devices demands interdisciplinary knowledge, which should include the basics of haptic perception and its dependence on amplitude and frequency. Furthermore, an overview of technological solutions - such as the designs of actuators, kinematics or complete systems, including software solutions and the interfaces to simulations and virtual reality systems - may be extremely helpful. For designing virtual reality systems it is also necessary to know the concepts of haptic rendering, to enhance communication between software and hardware engineers. The authors of this book regard their task as fulfilled as soon as this book helps to fascinate more design engineers with the development of haptic devices, thus speeding up the creation of more and better haptic systems on the market.
Abstract—A haptic interface is a kinesthetic link between a
human operator and a virtual environment. This paper addresses
fundamental stability and performance issues associated with
haptic interaction. It generalizes and extends the concept of a
virtual coupling network, an artificial link between the haptic
display and a virtual world, to include both the impedance and
admittance models of haptic interaction. A benchmark example
exposes an important duality between these two cases. Linear
circuit theory is used to develop necessary and sufficient conditions
for the stability of a haptic simulation, assuming the human
operator and virtual environment are passive. These equations
lead to an explicit design procedure for virtual coupling networks
which give maximum performance while guaranteeing stability.
By decoupling the haptic display control problem from the design
of virtual environments, the use of a virtual coupling network
frees the developer of haptic-enabled virtual reality models from
issues of mechanical stability.
Index Terms—Absolute stability, force feedback, haptic interface,
impedance control, two-port network, unconditional stability,
virtual reality.
I. INTRODUCTION
A haptic interface conveys a kinesthetic sense of presence
to a human operator interacting with a computer generated
environment. Historically, human-computer interaction
has taken place through one-directional channels of information.
Visual and audio information is sent from the computer
to the operator. Keyboard, mouse, and joystick inputs transfer
human inputs to the machine. Human neuromuscular and
decision responses close this information loop. Oscillatory
behavior is possible in this configuration, for example, when
attempting to track a moving target with the mouse in the
presence of delay in the rendering of graphics. Since there
is no kinesthetic energy flow to the operator, such an event
is at worst annoying, but never physically threatening. Haptic
interaction is fundamentally different in that physical energy
flows bi-directionally, from and to the human operator. The
haptic display, typically some form of robotic manipulator,
creates a feedback loop which includes not only the human
neuromuscular and decision responses, but also the biomechanical
impedance characteristics of the operator’s contact
with the device. The human grasp may stabilize an otherwise
unstable system by absorbing mechanical energy. Conversely,
the human grasp may destabilize an otherwise stable system by
reflecting energy back into the system. Since the haptic device
actively generates physical energy, instabilities can damage
hardware and even pose a physical threat to the human.
A number of authors have considered issues of stability
in haptic simulation. Minsky et al. [1] explored stability
problems in the haptic display of simple virtual environments.
They noted a critical tradeoff between simulation rate, virtual
wall stiffness, and device viscosity and provided insights
into the role of the human operator in stability concerns.
A more rigorous examination of the stability problem was
performed by Colgate et al. [2]. They used a simple benchmark
problem to derive conditions under which a haptic display
would exhibit passive behavior. Salcudean and Vlaar [3]
studied the stability properties of a discrete, proportional-plus-derivative virtual wall implementation for a magnetically
levitated force feedback joystick. They found very low device
friction significantly limited the achievable stiffness of the
virtual environment. A much higher perceived stiffness was
achieved using a braking pulse at the moment of impact with
the virtual surface. While each of these works is a significant contribution to the field, their analyses are limited to specific
assumptions about the type of haptic display used and the
type of virtual environment being simulated. The problem lies
in the fact that no distinction is made between the virtual
environment and the control law for the haptic device. In fact,
in the above examples, the virtual environment is the control
law. It is encumbered with twin roles of creating realistic force
feedback cues to render a virtual scene and ensuring the haptic
device remains stable.
One way of decoupling the haptic device control problem
from virtual scene generation is the introduction of an artificial
coupling between the haptic display and the virtual
environment. Colgate et al. [4] introduced the idea of a virtual
coupling for haptic displays which guarantees stability for
arbitrary passive human operators and environments. Zilles
and Salisbury [5] presented a heuristically motivated "god-object"
approach which greatly simplifies control law design.
Ruspini et al. [6] use a virtual “proxy” extension of the
god-object to couple a Phantom device to a three degree-of-freedom constraint-based simulation. These implementations
can be grouped together as special cases of a virtual coupling
network, a two-port interface between the haptic display and
the virtual environment. This network can play the important
role of making the stability of the haptic simulation independent
of both human grasp impedance and the details of virtual
environment design. All of the above-mentioned work focuses
on one particular class of haptic display, those which render
impedance. No similar work on virtual couplings has appeared
for the complementary case of haptic displays which render
admittance and very little exists in explicit criteria for the
design of virtual coupling networks.
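To make the idea concrete, a virtual coupling for the impedance case is often realized as a spring-damper between the measured device position and the simulated object (proxy) position. The sketch below is an illustrative assumption, not the implementation used in any of the cited works; the gains and names are placeholders.

# Sketch of an impedance-type virtual coupling: a spring-damper links the
# measured device position x_d to the virtual proxy position x_p. The same
# coupling force (with opposite sign) acts on device and proxy, which bounds
# the stiffness ever displayed to the user regardless of the environment.
K_C = 500.0   # assumed coupling stiffness [N/m]
B_C = 2.0     # assumed coupling damping [N*s/m]

def coupling_force(x_d, v_d, x_p, v_p):
    return K_C * (x_p - x_d) + B_C * (v_p - v_d)

# Per servo tick: f = coupling_force(...); send +f to the haptic device and
# apply -f to the proxy inside the virtual environment's dynamics update.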
This paper extends the concept of a virtual coupling to
admittance displays and attempts to treat the problem of
stable haptic interaction in a more general framework which
encompasses any combination of haptic display and virtual
environment causality. Llewellyn's criteria for "unconditional stability" are introduced as a tool in the design and evaluation
of virtual coupling networks. A benchmark example illustrates
some fundamental stability and performance tradeoffs and
brings to light an important duality between the impedance
and admittance models of haptic interaction.
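As a rough illustration of how such a criterion can be checked numerically (this is not code from the paper; the two-port parameters and the frequency grid are assumptions), Llewellyn's conditions can be evaluated frequency by frequency for a linear two-port described by immittance parameters p11, p12, p21, p22:

# Sketch only: frequency-wise check of Llewellyn's unconditional-stability
# conditions. The additional requirement that p11 and p22 have no
# right-half-plane poles is assumed to be verified separately.
import numpy as np

def llewellyn_stable(p11, p12, p21, p22, omegas):
    for w in omegas:
        a, d = p11(w), p22(w)
        bc = p12(w) * p21(w)
        if a.real < 0 or d.real < 0:
            return False
        if 2 * a.real * d.real - bc.real - abs(bc) < 0:
            return False
    return True

# Example: a weakly coupled resistive two-port (assumed values) passes the test.
omegas = 2 * np.pi * np.logspace(-1, 3, 200)
print(llewellyn_stable(lambda w: 1.0 + 0j, lambda w: 0.5 + 0j,
                       lambda w: 0.5 + 0j, lambda w: 1.0 + 0j, omegas))  # True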
II. PRELIMINARIES
A. Terminology
The following terms are used throughout this paper.
Haptic display: a mechanical device configured to convey kinesthetic cues to a human operator.
Haptic displays vary greatly in kinematic structure, workspace,
and force output. They can be broadly classified into two
categories, those which “measure motion and display force”
and those which “measure force and display motion” [7]. The
former will be referred to as impedance displays, the latter
as admittance displays. Impedance displays typically have
low inertia and are highly back-drivable. The well known
Phantom [8] family of haptic displays, the McGill University
Pantograph [9], and the University of Washington Pen-Based
Force Display [10] fall into this class, along with many
others. Admittance displays are often high-inertia, non-backdrivable
manipulators fitted with force sensors and driven by a
position or velocity control loop. Examples include Carnegie
Mellon University’s WYSIWYF Display [11] and the Iowa
State/Boeing virtual aircraft control column [12], both of
which are based upon PUMA 560 industrial robots.
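The duality between the two classes can be summarized in sketch form; the device handles and method names below are hypothetical placeholders, intended only to show which signal is measured and which is commanded in each case.

# Illustrative only: the two causalities of a haptic display servo loop.
def impedance_display_step(device, environment):
    # measure motion, display force
    x, v = device.read_motion()
    f = environment.force_response(x, v)
    device.command_force(f)

def admittance_display_step(device, environment):
    # measure force, display motion
    f = device.read_force()
    x, v = environment.motion_response(f)
    device.command_motion(x, v)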
Haptic interface: everything that comes between the human operator and the virtual environment.
This always includes the haptic device, control software,
and analog-to-digital/digital-to-analog conversion. It may also
include a virtual coupling network which links the haptic
display to the virtual world. The haptic interface characterizes
the exchange of energy between the operator and the virtual
world and thus is important for both stability and performance
analysis.
Virtual environment: a computer-generated model of some physically motivated scene.
The virtual world may be as elaborate as a high-fidelity walkthrough
simulation of a new aircraft design, or as simple as
a computer air hockey game. Regardless of its complexity,
there are two fundamentally different ways in which a physically
based model can interact with the haptic interface. The
environment can act as an impedance, accepting velocities (or
positions) and generating forces according to some physical
model. This class includes all so-called penalty based approaches
and to date has been the most prevalent [1]–[3], [8].
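As a minimal example of the penalty-based (impedance) class described above - an illustrative sketch, not taken from the cited works - a virtual wall returns a spring force proportional to the penetration depth; the stiffness value is assumed.

# Sketch of a penalty-based virtual environment acting as an impedance:
# it accepts a position and returns a force.
K_WALL = 1000.0   # assumed virtual-wall stiffness [N/m]
WALL_X = 0.0      # wall surface at x = 0, free space for x > 0

def wall_force(x):
    penetration = WALL_X - x
    return K_WALL * penetration if penetration > 0 else 0.0

print(wall_force(-0.002))  # 2 mm penetration -> 2.0 N pushing the user out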