In the introduction a number of terms originating from the context of
haptic science and device design have already been used. This chapter begins a systematic
introduction to the area of designing haptic devices. The following sections
explain the scientific and industrial disciplines participating in the research and development of haptic devices. Afterwards, terms and their definitions are introduced
and illustrated with examples showing how to characterize haptic systems based on concrete
technical devices.
2.1 Scientific Disciplines as Part of Haptic Research:
In haptic science there are three groups of interest (fig. 2.1) with quite fluid borders between them: Scientists working within the area of “haptic perception” proceed
according to strictly deductive scientific principles: From an observation a
hypothesis is derived. For this hypothesis an experiment is designed, testing the
point of the hypothesis while excluding other varying parameters. As a result the
hypothesis is verified or falsified, leading to a new and improved hypothesis.
Research in the area of “haptic perception” is done by two scientific disciplines: psychophysics and neurobiology. Psychophysics deals with the analysis of the impression of physical stimuli - in the case of haptic perception this mainly refers to oscillations and forces of different spatial orientation. The aim of psychophysics is to create a model explaining perception. Neurobiology observes biologically measurable connections and analyzes the direct conversion of physical stimuli into neuronal signals and their processing within the brain. Both disciplines complement each other, so that the neuronal observation should be able to explain a part of the psychophysical model and vice versa. These scientific disciplines formulate technical tasks for the preparation of experiments, which are processed by two groups interested in “haptic synthesis” or “haptic measurement”, respectively.
Fig. 2.1 Overview of the disciplines participating in haptic research.
On an alternative track both groups get assignments from industry, which itself makes use of the knowledge gathered by research on haptic perception. These groups work according to engineering solution strategies: An assumption of requirements is derived from a technical question based on the current state of knowledge. A functional prototype and later a product fulfilling the requirements is designed in a development process accompanied by a continuous tracking of the prior assumptions and their implications. The resulting product can then be used for the analysis of psychophysical questions, or, respectively, as a product of the gaming, automotive or aviation industry.
In the case of the generation of haptic impressions for Virtual-Reality (VR) applications the technical requirements typically ask for tactile, kinaesthetic or combined feedback systems. In that area the emphasis is on the correct choice of actuators, control and driver electronics and on the processing and transmission of signals. Due to the coupling of devices and time-discrete simulation systems, a consideration of discretization effects and their influence on the haptic quality of the impression is necessary. In the case of telemanipulation systems the technical challenges are comparable. The main difference lies in the necessary measurement technology for the acquisition of haptic object properties. Additionally, the control engineering questions are more complex, as this area typically deals with closed-loop systems with unknown loads on both ends.
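To illustrate why these discretization effects matter, the following minimal sketch simulates a mass bouncing off a virtual spring wall whose reaction force is sampled and held at the haptic update rate. All numbers (mass, stiffness, approach speed, update rates) are arbitrary example values, not data of a real device; the point is only that a slowly updated wall typically returns more energy than it received, which is felt as an unnaturally "active" or buzzing contact.

```python
def bounce_energy(K=2000.0, T=0.001, m=0.1, v0=0.5, t_end=0.5, dt=1e-6):
    """Mass hitting a virtual wall at x = 0 whose force is sampled and held every T seconds.

    K: virtual stiffness in N/m, m: moving mass in kg, v0: approach speed in m/s
    (all values are arbitrary examples). Returns kinetic energy before/after contact.
    """
    x, v = -0.05, v0            # start 5 cm in front of the wall
    f_held, t_next = 0.0, 0.0
    for step in range(int(t_end / dt)):
        t = step * dt
        if t >= t_next:         # haptic controller tick: sample penetration, hold force
            f_held = -K * x if x > 0.0 else 0.0
            t_next += T
        v += (f_held / m) * dt  # integrate the continuous device/hand dynamics
        x += v * dt
    return 0.5 * m * v0 ** 2, 0.5 * m * v ** 2

for T in (0.0001, 0.001, 0.01):                 # 10 kHz, 1 kHz and 100 Hz update rates
    e_in, e_out = bounce_energy(T=T)
    print(f"T = {T * 1e3:5.1f} ms: kinetic energy in {e_in:.4f} J, out {e_out:.4f} J")
```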
2.2 Terms and Terminology Used for the Description of Haptic Systems:
The definition of the terminology within the context of haptic systems is subject
to the current ISO 9241-910 standard. Many of the definitions used in this book follow
the terminology presented there. According to the author’s experience, all these
terms have the status of recommendations shared by a large number of researchers associated with the haptic area. However, there is no binding consensus on their usage within the haptic community, so that many current and future papers differ from the definitions presented above. The nomenclature mentioned here is based on publications prior to this material, especially by HAYWARD [90], COLGATE [176], HANNAFORD [85], BURDEA [34], ADAMS [2] and many papers by other authors.
2.2.1 Basic Concepts of Haptics:
Haptics means the combined sensation of mechanical, thermal and noci-perception
(fig. 2.2). It is more or less defined by the exclusion of the optical, acoustic, olfactory and gustatory perception from the sum of sensory perceptions. As a result
haptics consists of nociceptive, thermoceptive, kinaesthetic and tactile perceptions. The sense of balance takes an exceptional position, as it is not counted among the five human senses having receptors of their own. Yet it really exists, making use of all other senses’ receptors, especially the haptic ones.
Haptics describes the sensory as well as the motor capabilities within the skin, joints, muscles and tendons.
Tactile means the mechanical interaction with the skin. Therefore tactile perception is the sensation of exclusively mechanical interaction. Please note that tactile perception is not exclusively bound to forces or movements.
Kinaesthetics describes both actuatory and sensory capabilities of muscles and joints. It refers to their forces, torques, movements, positions and angles. As a result any kinaesthetic interaction has a tactile component due to this definition.
Fig. 2.2 Distribution of senses.
2.2.2 Definition of Haptic Systems:
The technical terminology is listed from the special to the general and illustrated
by block diagrams. The arrows between the components of the block diagrams may
represent different kinds of information depending on the devices they refer to. They
remain unlabeled. Haptic devices are capable of transmitting elongations, forces and
temperature differences and in a few realizations they also stimulate pain receptors.
The terms “system”, “device” and “component” are not defined on an interdisciplinary
basis. Depending on one’s point of view, the same object can be e.g. “a
device” for a hardware designer, “a system” for the software engineer, or “just a
component” for another hardware engineer. These terms are nevertheless part of any
engineering discipline and are used accordingly here, but should be read with
this knowledge in mind.
A haptic device is a system generating an output which can be perceived haptically.
It has (fig. 2.3) at least one output, but not necessarily any input. The tactile
markers on the keys F and J of a keyboard represent information for the positioning
of the index finger. By these properties the keys are already tactile devices. At
a closer look the key itself shows a haptically notable point of actuation, the haptic
click. This information is transmitted in a kinaesthetic and tactile way by the
interaction of the key’s mechanics with the muscles and joints and the force being
transmitted through the skin. Such a key is a haptic device without a changing input
but with two outputs.
A user (in the context of haptic systems) is a receiver of haptic information.
A haptic controller describes a component of a haptic system for processing
haptic information flows and improving transmission. Quite pragmatically, in the case
of telemanipulation systems these controllers are frequently either a spring-damper coupling element between end-effector and operating element or a local
abstraction model of the area of interaction to compensate transmission delays. In
the case of a haptic simulator it is quite frequently a simple LTI model with a high
in- and output rate. The LTI model itself is then updated at a lower frequency than
the actual rate of the haptic in- and output.
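The following sketch illustrates this multirate idea only in principle: a local spring-damper (LTI) model is evaluated at the full haptic rate while a slower simulation refreshes its parameters. The rates, parameter values and the two stub functions standing in for the device and the slow simulation are illustrative assumptions, not the API of any concrete system.

```python
import math
from dataclasses import dataclass

@dataclass
class LocalModel:
    """Simple LTI proxy of the contact: F = k*(x0 - x) + d*(v0 - v)."""
    k: float = 500.0    # stiffness in N/m (placeholder value)
    d: float = 2.0      # damping in N*s/m (placeholder value)
    x0: float = 0.0     # reference position delivered by the slow simulation
    v0: float = 0.0     # reference velocity delivered by the slow simulation

    def force(self, x: float, v: float) -> float:
        return self.k * (self.x0 - x) + self.d * (self.v0 - v)

HAPTIC_RATE = 1000   # Hz, fast inner loop (typical order of magnitude)
SIM_RATE = 60        # Hz, slow simulation or network update

def read_device(t):
    """Stub for the device position/velocity (here: a slow sinusoidal hand motion)."""
    return 0.01 * math.sin(2 * math.pi * t), 0.01 * 2 * math.pi * math.cos(2 * math.pi * t)

def query_slow_simulation(t):
    """Stub for the slow physics engine; returns refreshed model parameters."""
    return 500.0, 2.0, 0.0, 0.0

model = LocalModel()
for tick in range(HAPTIC_RATE):                     # one second of operation
    t = tick / HAPTIC_RATE
    x, v = read_device(t)
    if tick % (HAPTIC_RATE // SIM_RATE) == 0:       # slow update of the local model
        model.k, model.d, model.x0, model.v0 = query_slow_simulation(t)
    f = model.force(x, v)                           # fast evaluation every haptic tick
    # f would be commanded to the device's actuators here
```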
Fig. 2.3 Haptic device, user and controller.
Haptic interaction describes the haptic transmission of information. This transmission
can be bi- or unidirectional (fig. 2.4). Moreover, specifically tactile (unidirectional)
or kinaesthetic (uni- or bidirectional) interaction may happen. A tactile
marker like embossed printing on a bill can communicate tactile information (the
bill’s value) as a result of haptic interaction.
Fig. 2.4 Haptic interaction.
The addressability of haptic systems refers to the subdivision (spatial or temporal)
of an output signal of a device (frequently a force) or of the user (frequently a
position).
The resolution of a haptic system refers to the capability to detect a subdivision
(spatial or temporal) of an input signal. With reference to a device this is in accordance with the measuring accuracy. With respect to the user this corresponds to his
perceptual resolution.
A haptic marker refers to a mark communicating information about the object
carrying the marker by way of a defined code of some kind. Examples are markers
in Braille on bills or road maps. Frequently these markers are just tactile, but there
are also kinaesthetically effective ones marking sidewalks and road crossings for
visually handicapped people.
A haptic display is a haptic device permitting haptic interaction, whereby the
transmitted information is subject to change (fig. 2.5). There are purely tactile as
well as kinaesthetic displays.
A tactor is a purely tactile haptic display generating a dynamic, oscillating
output. Tactors usually provide a translatory output (e.g. fig. 9.19), but could
also be rotatory (e.g. fig. 2.14).
Fig. 2.5 Haptic display.
A haptic interface is a haptic device permitting a haptic interaction, whereby the
transmitted information is subject to change and a measure of the haptic interaction
is acquired (fig. 2.6). A haptic interface always refers to data and device.
Force-Feedback (FFB) refers to the information transmitted by kinaesthetic interaction
(fig. 2.6). It is a term coined by numerous commercial products like FFB-joysticks,
FFB-steering wheels and FFB-mice. Due to its usage in advertising, the
term Force Feedback (FFB) is seldom consistent with the other terminology given
here.
A haptic manipulator is a system interacting mechanically with objects, whereby
information about positions in space and about the forces and torques of the interaction
is continuously acquired.
Fig. 2.6 Haptic interface.
A telemanipulation system refers to a system enabling a spatially separated haptic
interaction with a real physical object. There are purely mechanical telemanipulation
systems (fig. 2.7), scaling forces and movements via a lever-cable system. In
the area of haptic interfaces, mainly electromechanical telemanipulation systems
according to figure 2.8 are relevant. These systems allow an independent scaling
of forces and positions and an independent closed-loop control of haptic interface
and manipulator.
Fig. 2.7 Mechanical telemanipulator for handling dangerous goods (CRL model L).
A haptic assistive system is a system adding haptic information to a natural interaction
(fig. 2.9). For this purpose object or interaction properties are measured via
a sensor and used to add valuable information in the interaction path. An application
would be a vibrating element indicating lane departure in a driver-assistance
system.
A haptic simulator is a system enabling interaction with a virtual object (fig. 2.10).
It always requires a computer for the calculation of the object’s physical properties.
Haptic simulators and simulations are important incentives for the development of haptic devices. They can be found in serious training applications, e.g. for surgeons,
as well as in gaming applications for private use (see also chapter 13).
2.2.3 Parameters of Haptic Systems:
In [156] LAWRENCE defines the transparency T as the factor between the impedance Zin at
the input of the haptic interface and the actually felt output impedance
Zout of the device.
The principle of transparency is mainly a tool for control engineering purposes
analyzing stability and should be within the range of ±3 dB. T may be regarded as
the sole established, frequency-dependent characteristic value of haptic interfaces.
Frequently only the transparency’s magnitude is considered. A transparency close
to “one” shows that the input impedance is not altered by the technical system. The
user of the haptic device, being the end of the transmission chain, experiences the
haptic input data in a pristine way. The concept of transparency can be applied to
telemanipulation systems as well as to haptic simulators.
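Written out as a frequency-dependent ratio (this follows the usual reading of the definition above), the transparency is

\[
T(j\omega) = \frac{Z_{out}(j\omega)}{Z_{in}(j\omega)}, \qquad |T(j\omega)| \approx 1 \;(\pm 3\,\mathrm{dB}) \text{ for a transparent system.}
\]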
In [39] COLGATE describes the impedance width (Z-width) of a haptic system

Z-width = Zmax − Zmin

as the difference between the maximum displayable load Zmax and the perceivable friction
and inertia at free-space movement Zmin. The Z-width describes the potential of devices
and enables a comparison between them, also after technical changes, e.g. by
the integration of a closed-loop control and a force measurement.
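A minimal sketch of how this figure of merit could be evaluated from measured data follows; the frequency vector and impedance magnitudes below are invented placeholder values, not measurements of a real device, and the dB form is only one common way of quoting the result.

```python
import numpy as np

# Hypothetical measured impedance magnitudes |Z| in N*s/m over frequency:
f = np.array([1.0, 10.0, 100.0])         # Hz
z_min = np.array([0.5, 0.8, 2.0])        # device in free-space motion (friction, inertia)
z_max = np.array([400.0, 350.0, 120.0])  # device rendering its stiffest virtual wall

z_width = z_max - z_min                     # Z-width = Zmax - Zmin, per frequency
z_width_db = 20 * np.log10(z_max / z_min)   # sometimes quoted as a dynamic range in dB

for fi, zw, zdb in zip(f, z_width, z_width_db):
    print(f"{fi:6.1f} Hz: Z-width = {zw:7.1f} N*s/m  ({zdb:5.1f} dB)")
```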
Active haptic devices are systems requiring an external energy source for the
display of haptic information. Usually, these are at least haptic displays. Passive
haptic devices, on the contrary, are systems transmitting haptic information solely
by their shape. This may lead to a false conclusion: A passive system in a control
engineering sense is a system with a negative energy flow at its input, i.e. a system
not emitting net energy into the outside world. This concept of passive control is an
important stability criterion which will be discussed in detail in subsection 7.3.3.
For the moment, it should be noted that a passive haptic system is not necessarily
identical with a haptic system designed according to the criterion of passivity.
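For a one-port with collocated force F and velocity v at its interaction point, this control-engineering passivity condition is commonly stated as the requirement that the system never outputs more energy than it has received:

\[
\int_0^t F(\tau)\, v(\tau)\, d\tau \;\geq\; -E(0) \quad \text{for all } t \geq 0,
\]

where E(0) is the energy initially stored in the system and power flowing into the port is counted as positive.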
The mechanical impedance Z is the complex coefficient between force F and velocity
v, respectively torque M and angular velocity Ω. Impedance and its reciprocal
value - the mechanical admittance Y - are used for the mathematical description of
dynamic technical systems. High impedance means that a system is “stiff” or “inert”
and “grinds”. Low impedance describes a “light” or “soft” and “sliding” system.
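Written as formulas, following directly from the definition above:

\[
\underline{Z} = \frac{\underline{F}}{\underline{v}} \;\text{(translation)}, \qquad
\underline{Z} = \frac{\underline{M}}{\underline{\Omega}} \;\text{(rotation)}, \qquad
\underline{Y} = \frac{1}{\underline{Z}}.
\]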
The concept of impedance is applied to haptic systems by way of the terms display-impedance
or interface-impedance ZD. It describes the impedance a system shows
when it is moved at its mechanical output (e.g. its handle). The concept of impedance
can be applied not only to technical systems, but also to a simplified model of the
user and his mechanical properties. This is described by the term user-impedance
ZH. The user-impedance - how stiff a user tends to be - can be influenced at will up
to a certain point. Shaking hands can either be hard or soft depending on its frequency:
the mechanical resistance of a handshake is lower at low frequencies and higher at high frequencies, resulting simply from the inertia of the hand’s material.
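This frequency dependence can be sketched with a very simple mass-damper stand-in for the hand; the parameter values below are rough placeholders, not measured data, and the model deliberately ignores the elasticity of tissue.

```python
import math

m, d = 0.5, 5.0   # placeholder hand/arm parameters: mass in kg, damping in N*s/m

def user_impedance(f_hz: float) -> float:
    """|Z_H(jw)| of a simple mass-damper model of the hand: Z = j*w*m + d."""
    w = 2 * math.pi * f_hz
    return abs(complex(d, w * m))

for f in (1.0, 10.0, 100.0):
    print(f"{f:6.1f} Hz: |Z_H| = {user_impedance(f):7.1f} N*s/m")  # rises with frequency
```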
Detailed descriptions of the building of models and the application of the concept
of user-impedance are given in section 4.2. An introduction into calculating with a
complex basis and mechanical systems is given in appendix 16. Understanding complex
calculation rules and the mechanical impedances are fundamental to the design
of haptic devices in the context of this book. Therefore it is recommended to update
one’s knowledge by self-studies of the relevant literature of electromechanics [158]
and control-engineering [167].
2.2.4 Characterization of Haptic Object Properties:
Besides the terminology for haptic systems, there is another group of terms describing
solely haptic objects and their properties:
Haptic texture refers to those object properties which can exclusively be felt by
touch. The roughness of a surface, the structure of leather, even the haptic markers
already mentioned are haptic textures of the objects they are located on. In some
cases a differentiation is made between tangential and normal textures, whereby
the directional information refers to the skin’s surface. This specific differentiation
is more a result of technical limitations than of a specialty of tactile perception,
as tactile displays are frequently unable to generate a feedback covering a two- or
three-dimensional movement.
Haptic shape refers to object properties which can mainly be felt kinaesthetically.
This can be the shape of a cup held in one’s hand. But it can also be the shape
and geometric design of a table rendered to be touched in a virtual environment.
In fact terms like texture and shape are used analogously to their meaning in
graphical programming and software techniques for 3D objects, where meshes provide
shape and surface textures give color and fine structures. However, in comparison
with graphical texture, haptic texture mainly describes three-dimensional
surface properties incorporating properties like adhesion or friction, i.e. a realistic
haptic texture is much more complex in its parameters than a typical graphical
texture, even when considering bump-, specular- or normal-maps. Therefore numerous
haptic surface properties, i.e. specific haptic surface effects, are defined and
described from the perspective of a software engineer. These surface effects are
partly derived from physical equivalents of real objects, narrowed down to software-motivated
concepts in order to increase the degree of realism of haptic textures (a small sketch combining these effects follows the list below):
• Surface friction describes the viscous (velocity-proportional) friction of a contact
point on a surface.
• Surface adhesion describes a force binding the movement of
a contact point to a surface. This concept allows simulating magnetic or sticking
effects.
• Roughness describes a uniform, sinusoidal structure of a small, defined amplitude
making the movement of a contact point on a surface appear rough.
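As announced above, the following minimal sketch combines the three effects into a force pair acting on a contact point. The force laws and all parameter values are illustrative assumptions in the spirit of the descriptions above, not the API or the effect model of a particular haptics library.

```python
import math

def surface_force(v_t: float, penetration: float, x_t: float,
                  friction: float = 0.8,       # viscous coefficient, N*s/m (assumed)
                  adhesion: float = 1.5,       # pull towards the surface, N (assumed)
                  rough_amp: float = 0.3,      # roughness force amplitude, N (assumed)
                  rough_period: float = 0.002):  # spatial period of the ripple, m
    """Tangential and normal force felt by a contact point sliding on a surface.

    v_t         : tangential velocity of the contact point in m/s
    penetration : how far the point is pressed into the surface in m (>= 0 while in contact)
    x_t         : tangential position on the surface in m
    """
    f_friction = -friction * v_t                                        # surface friction
    f_rough = rough_amp * math.sin(2 * math.pi * x_t / rough_period)    # sinusoidal roughness
    f_tangential = f_friction + f_rough
    f_normal = -adhesion if penetration <= 0.0 else 0.0                 # adhesion pulls back
    return f_tangential, f_normal

print(surface_force(v_t=0.05, penetration=0.0, x_t=0.0005))
```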
A tacton refers to a sequence of stimuli addressing the tactile sense. It usually encodes
an event within the sequence’s pattern. The stimuli vary in intensity and frequency.
Both stimuli and tacton may even be overlaid with a time-dependent
amplitude modulation, such as a fade-in or fade-out.
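The sketch below generates such a tacton as a plain sample buffer: two bursts of different frequency and intensity, each with a short fade-in and fade-out. All numbers (frequencies, durations, sample rate) are made-up examples chosen only to illustrate the pattern idea.

```python
import math

SAMPLE_RATE = 8000  # Hz, assumed drive-signal rate for a tactor

def burst(freq_hz, amp, dur_s, fade_s=0.02):
    """One vibration burst with a linear fade-in/out applied to its amplitude."""
    n = int(dur_s * SAMPLE_RATE)
    n_fade = int(fade_s * SAMPLE_RATE)
    samples = []
    for i in range(n):
        env = min(1.0, i / n_fade, (n - 1 - i) / n_fade)   # trapezoidal envelope
        samples.append(amp * env * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE))
    return samples

# A tacton for "incoming message": soft 150 Hz burst, pause, stronger 250 Hz burst.
tacton = burst(150, 0.4, 0.15) + [0.0] * int(0.05 * SAMPLE_RATE) + burst(250, 0.9, 0.15)
```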
2.2.5 Technical Examples:
2.2.5.1 PHANTOM Devices:
There are several commercial haptic control units available on the market for applications
in design, CAD and modeling. One major player on the market is SensAble
with their PHANTOM®-series and the currently most low-cost product PHANTOM
Omni (fig. 2.11a). The PHANTOM-series can most easily be identified by the free
positioning of a pen-like handle in a three-dimensional space. The position and
orientation of this handle is measured in three translational and three rotational degrees
of freedom. Depending on the model of the series, the tip force can act on the
handle in at least three translational dimensions. The generation of forces is done
via electrodynamic actuators; depending on the model these are either mechanically
or electronically commutated. The actuators are located within the device’s
base and transmit their mechanical energy via levers and Bowden cables to the
corresponding joints. As a result of changing lever lengths the transmission ratio
of the PHANTOM devices is nonlinear. For the static situation these changes are
compensated within the software driver. The PHANTOM devices are connected to
common PCs. The electrical interface used depends largely on the device’s product
generation and ranges from parallel ports to IDE cards and FireWire connectors.
The PHANTOM devices from SensAble are haptic devices (fig. 2.11c) primarily
addressing the kinaesthetic perception of the whole hand and arm. As the force
transmission happens via a hand-held pen, tactile requirements are automatically relevant
for the design, too. This bidirectional haptic display is a haptic interface, transmitting
force information of a software application on a PC to the user and feeding
positioning information back to the application.
The network model of one degree of freedom (fig. 2.11b) shows the electronically
commutated electrodynamic motor as an idealized torque source M0 with the inertia
Θ of the rotor and a rotary damping dR resulting from bearings and links. By the
use of a converter representing the levers, the rotary movement is transformed into a linear
movement with a force F0 and a velocity v0. An inertia m describes the mass of
the hand-held pen. The portion of the generated force Fout reaching the user depends on the ratio
between the sum of all display impedances and the user impedance ZH.
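Reading the network as a force divider between the display impedance ZD (the sum of the device’s own impedances seen at the handle) and the user impedance ZH, this dependency can be written as (a common network interpretation, not a formula quoted from the figure):

\[
F_{out} = F_0 \cdot \frac{Z_H}{Z_D + Z_H}.
\]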
2.2.5.2 Reconfigurable Keyboard:
The reconfigurable keyboard (fig. 2.12a) is made of a number of independent actuators
arranged in a matrix. The actuators are electrodynamic linear motors with
a moving magnet. Each actuator can be controlled individually, either as an open-loop
controlled force source or as a positioning actuator with a closed-loop control.
When being used as a force source, the primary purpose of the actuator is to follow
a configurable force/displacement curve of a typical key. The application of this reconfigurable
keyboard [46] is an alternative to the classical touchscreen - a surface
providing different haptically accessible functions depending on a selection within
a menu. For this purpose single actuators can be combined to larger keys and may
change in size and switching characteristics.
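Such a configurable force/displacement curve could, for instance, be stored as a few breakpoints and interpolated at the haptic rate. The snap-through shape below is a made-up example of what a key curve might look like, not data from the device described in [46].

```python
# Breakpoints (displacement in mm, force in N) of a hypothetical key with a snap point:
CURVE = [(0.0, 0.0), (1.5, 0.6), (2.0, 0.25), (3.5, 0.3), (4.0, 1.2)]

def key_force(x_mm: float) -> float:
    """Piecewise-linear interpolation of the configured force/displacement curve."""
    if x_mm <= CURVE[0][0]:
        return CURVE[0][1]
    for (x0, f0), (x1, f1) in zip(CURVE, CURVE[1:]):
        if x_mm <= x1:
            return f0 + (f1 - f0) * (x_mm - x0) / (x1 - x0)
    return CURVE[-1][1]          # bottomed out: hold the last force value

# The open-loop force source is then simply commanded with key_force(measured_travel).
print([round(key_force(x / 2), 2) for x in range(9)])   # sample the curve at 0.5 mm steps
```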
The reconfigurable keyboard is a haptic device (fig. 2.12c) mainly addressing the
kinaesthetic sensation, but it has strong tactile properties, too. The user of the device
operates the keyboard, receiving haptic information in the form of the changing
shape of keys and their switching characteristics during interaction. The keyboard is at least a haptic display. As it communicates with another unit about the switching
event and the selection, it is also a haptic interface.
The network model (fig. 2.12b) of a single key shows the open-loop controlled
force source F0 of the electrodynamic actuator, the mass of the moving magnet m
and the friction in the linear guides d. Elasticity does not exist, as the design does
not contain any spring. This is in contrast to what could be expected from the typical
designs of electrodynamic speakers and their membranes. The actuator is capable
of generating a force Fout dependent on the ratio between the complex impedance
of the haptic display ZD = sm+d and the user’s impedance ZH.
2.2.5.3 Tactile Pin-Array:
Tactile pin-arrays are the archetype of all systems generating spatially coded information
for the haptic sense. Conceptually they are based on Braille displays, whose
psychophysical impression has been studied comprehensively since the middle of
the 20th century, e.g. by BÉKÉSY [23]. Many approaches have been made, ranging from
electromagnetic actuators of dot-matrix printers [232] to piezoelectric bending actuators
[149] and pneumatic [292], hydraulic [231], electrostatic [290] and thermal [5]
actuators. Tactile pin-arrays mainly focus on the skin’s stimulation in the normal direction.
Only lately have spatially resolved arrays with lateral force generation received
increased interest [142].
A tactile pin-array with excitation in the normal skin direction is a haptic device
(fig. 2.13c) mainly addressing the tactile perception. The user is in continuous haptic
interaction with the device and receives haptic information coded in changing
pin heights. A tactile pin-array is a haptic display. In contrast to the systems examined
before, this time no measure of the user’s interaction is acquired. As a
result the device is not necessarily a haptic interface.
In the mechanical network model (fig. 2.13) a tactile pin-array corresponds to
a position or velocity source v with a mechanical stiffness k in series with it (a
combination of actuator and kinematics). In a stiff design the mechanical admittance
of the technical system is small, resulting in the elongation being totally unaffected
by the user’s touch. The system is open-loop position controlled.
2.2.5.4 Vibration-Motor:
Vibration-motors are used to direct attention to a certain event. There is a vibration
motor similar to figure 2.14a within every modern mobile phone, made of a rotary
actuator combined with a mass located eccentrically on its axis. Its rotation speed is
controlled by the voltage applied. It typically ranges from 7000 to 12000 rotations per minute (117 to 200 Hz). It is possible to encode information into the felt vibration
by varying the control voltage. This is often done with mobile phones in order
to make the ring tone haptically perceptible.
A vibration-motor is a haptic device (fig. 2.14c) addressing tactile perception.
The user is haptically interacting with the device and receives haptic information
in the form of oscillation coded in frequency and amplitude. A vibration-motor is a
pure form of a haptic display, or more precisely a purely tactile display.
With vibration motors the relevant force effect is the centripetal force. Assuming
a rotational speed of ω = 2π · 10000 rpm / 60 s ≈ 1047 rad/s and a moving mass of 0.5 g at a radius
of 2 mm, a force amplitude of F = mω²r ≈ 1.1 N is generated, producing a sinusoidal
force with a peak-to-peak amplitude of 2.2 N. This is an extraordinary value for an
actuator with a length of only 20 mm. Considering the network model (fig. 2.14b),
the vibration motor can be regarded as a force source with a sinusoidal output. It has to
accelerate a housing (e.g. the phone) with a mass m which is coupled to the user
via an elastic material, e.g. clothes. It is important for the function of the device
that the impedance of the branch with the spring/damper coupling and the user-impedance
ZH is large compared with the impedance of the mass m. This guarantees that most of the vibration energy is
directed to the user, thus generating a perception.
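The force amplitude quoted above can be checked with a few lines, using exactly the values stated in the text:

```python
import math

rpm = 10000              # rotation speed assumed above
m = 0.5e-3               # eccentric mass in kg (0.5 g)
r = 2e-3                 # eccentricity in m (2 mm)

omega = 2 * math.pi * rpm / 60        # angular velocity in rad/s, about 1047 rad/s
force_amplitude = m * omega**2 * r    # F = m * omega^2 * r, about 1.1 N
print(f"omega = {omega:.0f} rad/s, F = {force_amplitude:.2f} N, "
      f"peak-to-peak = {2 * force_amplitude:.2f} N")
```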