
A Conceptual Virtual Reality Model

John N. Latta
4th Wave

David J. Oberg
Global Outpost

This conceptual model isolates and describes the human and technical elements that create the participatory environments of virtual reality systems.

Virtual reality is an advanced human-computer interface that simulates a realistic environment and allows participants to interact with it. Experimental psychology provides a foundation for applying systems concepts to human participation in an environment. The complexity of the human interface makes it impossible to completely characterize this participation, whether the environment is real or computer generated. However, we use this foundation here to propose a conceptual model that examines both the human and technical elements of a VR system.

We feel that the current fixation on immersive VR systems fails to encompass the potential of the technology. The core of VR is the human-computer interface. A VR system might range from a flight simulator to a synthetic environment to telepresence. The definition and conceptual model for the technology should be robust enough to encompass all three forms, as well as others that develop as the technology evolves.

Definition of virtual reality

Virtual reality involves the creation and experience of environments. Its central objective is to place the participant in an environment that is not normally or easily experienced. This objective is satisfied by establishing a relationship between the participant and the created environment.

Accordingly, we developed a three-tiered definition of VR, shown in Figure 1, that addresses respectively what VR is, how it is accomplished, and its effect, both of VR on the participant and of the participant on the environment. The distinguishing what of VR is its extension of the human-computer interface. Engineering discipline requires precise and consistent definitions of each interfaced system. However, despite numerous studies,[1-3] there are no such definitions of human perceptual systems. We must therefore rely on more qualitative definitions. As a basis for our conceptual VR model, we adopted J.J. Gibson's descriptions of human perceptual and muscle systems[4] (see Tables 1 and 2). Gibson began as an experimental psychologist and throughout his career developed a systems view of perception. In one of his last works,[5] he described the visual system in the context of what he called ecological psychology. This contextual concept helps in understanding a participant's interaction with VR environments.

Comparing the list of perceptual systems in Table 1 with the list of muscle systems in Table 2 suggests the degree of integration possible between them. We believe that VR interface technology must integrate both systems. Again, the complexity of the human interface makes it impossible to define an environment that fully interfaces to all the perceptual and muscle systems. Our model will therefore examine VR systems for only some perceptual systems, not all.

Figure 1. VR objective and definition. The objective is to place the participant in an environment that is not normally or easily experienced; the how is a computer-based interface to human perceptual and muscle systems; the effect is experiential (to have a significant personal experience while participating in an environment) or operational (to perform operations while in an environment).
The how tier in Figure 1 addresses the creation of a realistic interactive sensory environment. This requires linking the sensory stimulation defined in tier 1 with the effect of VR participation defined in tier 3. Realism is a qualitative judgment, achieved when the participant perceives equivalence between interactions with an artificially created environment and interactions with a real environment. Complete perceptual equivalence is impossible, but there is a subjective threshold at which interactions within the VR system feel natural and the environment seems real.

The compelling aspect of VR is the experience it creates, yet many VR applications have no experiential element. They involve simply accomplishing a task; the feeling from the process is secondary at best. We have therefore divided the effect tier into two components, experiential and operational. We can define an operational VR system as one that provides a computer interface to specific human perceptual and muscle systems for the purpose of allowing the participant to perform operations that would not normally be possible. Flight simulators are an excellent illustration of this.
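One way to make the three tiers concrete is to write them down as a small data structure. The sketch below is ours, not part of the article's definition; the type and field names are illustrative assumptions.

from dataclasses import dataclass
from enum import Enum, auto


class EffectKind(Enum):
    """The two components of the effect tier."""
    EXPERIENTIAL = auto()  # a significant personal experience while participating
    OPERATIONAL = auto()   # performing operations while in an environment


@dataclass
class VRDefinition:
    """Three-tiered definition of VR: what it is, how it is done, and its effect."""
    what: str              # extension of the human-computer interface
    how: str               # computer-based interface to perceptual and muscle systems
    effect: set            # a given application may emphasize one or both components


# A flight simulator is cited in the article as a primarily operational system.
flight_simulator = VRDefinition(
    what="place the pilot in an environment not normally or easily experienced",
    how="computer-based interface to visual, auditory, and orienting systems",
    effect={EffectKind.OPERATIONAL},
)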

Table 1. Perceptual systems. (Adapted from J.J. Gibson, The Senses Considered as Perceptual Systems, Houghton Mifflin, Boston, 1966.)

Orienting. Mode of activity: posturing and orienting. Receptive units: mechanical and gravity receptors. Organ anatomy: vestibular organs. Activity: equilibrium and balance. External stimulus: forces of gravity and acceleration. External information: direction of gravity or acceleration.

Auditory. Mode of activity: listening. Receptive units: mechanical receptors. Organ anatomy: cochlear organs. Activity: orienting to sounds. External stimulus: vibrations in the air. External information: nature and location of vibratory events.

Haptic. Mode of activity: touching. Receptive units: mechanical, thermal, and kinesthetic receptors. Organ anatomy: skin, joints, muscles, and tendons. Activity: exploration of many kinds, including with appendages, skin, and tongue. External stimulus: deformation of tissues, configuration of joints, stretching of muscle fibers. External information: contact with object surfaces and shapes, material states, solidity and viscosity, heat and cold.

Taste-smell (tasting). Mode of activity: tasting. Receptive units: chemical and mechanical receptors. Organ anatomy: oral cavity (mouth). Activity: savoring. External stimulus: chemistry of ingested objects. External information: nutritive and biochemical values.

Taste-smell (smelling). Mode of activity: smelling. Receptive units: chemical receptors. Organ anatomy: nasal cavity (nose). Activity: sniffing. External stimulus: chemistry of vapors. External information: nature of odors.

Visual. Mode of activity: seeing. Receptive units: photoreceptors. Organ anatomy: ocular mechanism, including eyes and whole-body movement. Activity: accommodation, pupillary adjustment, fixation, convergence, scanning. External stimulus: light. External information: size, shape, distance, location, color, texture, and movement.



Table 2. Muscle systems. (Adapted from J.J. Gibson, The Senses Considered as Perceptual Systems, Houghton Mifflin, Boston, 1966.)

Postural. Purpose: orientation with gravity and acceleration forces. Application: maintain body equilibrium. Other systems: vestibular organs.

Orienting-investigating. Purpose: movement of body parts to obtain external stimulus. Application: sense information or explore. Other systems: all other senses.

Locomotor. Purpose: movement of body or body parts to other parts of environment. Application: go from one location to another. Other systems: orienting-investigative and postural.

Appetitive. Purpose: movement of body parts to take from or give to environment. Application: ingest or relieve. Other systems: taste, ingestion, and other body functions.

Performatory. Purpose: movement beneficial to the individual. Application: take action, move items, protect self. Other systems: locomotor and others.

Expressive. Purpose: movement to express self, display emotion, or identify self. Application: make postural, facial, and vocal movements. Other systems: voice, hearing, and facial muscles.

Sematic. Purpose: movements to signal action, state, or expression. Application: voice expression. Other systems: any other system based on signal intents.
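For readers who want to carry the two taxonomies into software, the lists in Tables 1 and 2 translate directly into enumerations. The encoding below is only an illustration; the system names come from the tables, but the mapping dictionary is a partial example based on relationships the article mentions (shells for the orienting system, and the muscle systems Gibson associates with the haptic system).

from enum import Enum


class PerceptualSystem(Enum):
    """Gibson's perceptual systems (Table 1)."""
    ORIENTING = "orienting"
    AUDITORY = "auditory"
    HAPTIC = "haptic"
    TASTE_SMELL = "taste-smell"
    VISUAL = "visual"


class MuscleSystem(Enum):
    """Gibson's muscle systems (Table 2)."""
    POSTURAL = "postural"
    ORIENTING_INVESTIGATING = "orienting-investigating"
    LOCOMOTOR = "locomotor"
    APPETITIVE = "appetitive"
    PERFORMATORY = "performatory"
    EXPRESSIVE = "expressive"
    SEMATIC = "sematic"


# Illustrative, partial view of the integration the article points to: which
# muscle systems a VR interface would need to sense to close the loop for a
# given perceptual system.
RELATED_MUSCLE_SYSTEMS = {
    PerceptualSystem.ORIENTING: {MuscleSystem.POSTURAL,
                                 MuscleSystem.ORIENTING_INVESTIGATING},
    PerceptualSystem.HAPTIC: {MuscleSystem.POSTURAL,
                              MuscleSystem.ORIENTING_INVESTIGATING},
}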

Obtaining precision in the implementation of experiential VR systems is an art form, comparable, for example, to the ability of theater or film directors to evoke audience emotions. VR systems for entertainment rely on the experiential effect as the objective of the participation. It is likely that the lessons learned from game design will provide some of the best guidelines for VR experiential design.[6]

Our conceptual model examines both a human and a technical view of a VR system, as shown in Figure 2. The human perspective centers on the physical and psychological issues of providing the stimulus to and detecting the actions of the participant. The technical view refers to the physical space, functions, processes, equipment, and concepts that define the environment. The environment provides the stimulus that creates sensation while the individual takes action through movement. The environment, in the VR context of this model, is the total space, both real and artificial.

Figure 2. VR system: Participant and environment. The participant (human view) interacts with the environment (technical view).

Human view

Figure 3 shows the participant within an environment, interfacing with the VR system through effectors and sensors. The VR system uses effectors to stimulate the receptors of the human perceptual system. These effectors can include, for example, a helmet-mounted display, a force-feedback device, or even an odor emitter. The system detects the participant's actions, as expressed by the muscle systems, with sensors.

This human-centered view characterizes the participant in the center of the environment. It highlights both the perceptual and muscle systems as functional flow elements, showing cognitive and psychological processes operating on output from the perceptual systems and generating input to the muscle systems. This schematic also includes emotional responses that could be reflected in actions. Further, because no individual is as simple as a series of functional flow blocks, we have listed other factors on the left, including individual experience, ergonomics, and sensual sensitivity. Our ability to quantify the relationship between stimulation from effectors and responses from sensors with either an individual's personal experience of participation or their performance is limited, except in special cases. Without such quantification, the design of VR systems with specific objectives will be heuristic.

A participant might consciously or unconsciously use different criteria to evaluate or evoke the experience of participation. We categorize these criteria as either psychological, anatomical, or individual. The anatomical criteria include, for example, individual ergonomics. Such criteria are the easiest to define, because they involve measuring the participant's body and fitting it with sensors and effectors. The psychological criteria include the general foundations for defining human interactions (for example, experimental, cognitive, and ecological psychology). These criteria and those addressing the specific profile and background of individual participants are much more difficult issues. Csikszentmihalyi[7] described an optimal human experience, which he called flow. It is based on six factors that address such things as the sense a person has of time passing while engaged in a flow activity. However, flow is a post-activity assessment. It does not provide clear rules to design a potential experience that will meet the flow criteria. It might nevertheless prove useful in VR development. Heeter[8] used flow to describe the responses of participants to BattleTech, a location-based entertainment attraction.

Figure 3. VR system: Human view. The participant, within the environment, is coupled to the virtual reality system through effectors and sensors.

Sensors and effectors

VR systems use sensors and effectors to create the participatory environment. We define three classes of effectors:

1. Shell, a stationary enclosure designed with an environmental and participatory objective.
2. Fixture, an environment-fixed device used to create sensory stimulations.
3. Appliance, a participant-fixed component worn on or near the body to stimulate one or more senses.

Shells are usually large and special-purpose. Fixtures can range from CRT displays to a room-size tracking system. Appliances can range from a body suit to a pair of goggles. Although appliances close the distance between the participant and the stimulation, they are intrusive and people generally do not like wearing them. Currently, the so-called immersive VR systems are based on personal appliances that block stimulation from the actual environment.

The immersive experience is but one class of VR systems that meet the definition in Figure 1. VR could fittingly be called environment interface technology. However, computer science today emphasizes object technology and thus tends to classify VR systems on the basis of the interface equipment. We feel that this classification masks the fundamental contribution of the technology. Our definitions of appliances, fixtures, and shells focus on providing an interface to perceptual systems, not on describing what the interface looks like.

Sensors work in conjunction with effectors to enable interaction with the environment. For example, tracking devices interface with a muscle system to detect participant actions. Fixtures and appliances often include both sensors and effectors operating independently. Shells provide a common interface between the orienting perceptual system and the postural and orienting-investigating muscle systems (see Tables 1 and 2). The degree to which the shell represents the actual environment varies, as it does, for example, in flight simulators.

To understand the relationship between the participant and the sensors and effectors, we must define the participatory environment.

Space definition and environment management

The definition and integration of the real and artificial environments defines the participatory environment. The artificial environment comprises two elements:

1. Space definition, including volume, spatial, object, and time definitions; physical laws; and material flow; and
2. Environment management, including dynamics, elimination, creation, scaling, sampling, and time control.

Space definition elements exist independent of the experience, and environment management elements exist dynamically during the experience. The definition of the artificial environment includes all space definition elements; however, it might also include the rules for environment management. The flight simulator industry has developed technology to define worlds as databases. The scope of VR worlds can include structures, such as buildings; the human body; and environments of varying scales. Worlds associated with different scientific disciplines will likely be quantified by different methods. For example, the tools used to describe CAD objects from 3D cross-sectional scans will have limited application in describing the human body.

Both the quality of the experience in VR and the participant's ability to function effectively are determined by how well the individual perceives the participatory environment. The participatory environment is created through a spectrum of stimulations to perceptual systems and actions by muscle systems. This spectrum of stimulations and responses may cross between real and artificial environments. For example, visual displays use a combiner to superimpose an artificial image onto the real field of view.
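The effector classes and the two elements of the artificial environment defined in this section lend themselves to a simple structural sketch. The class and field names below follow the terms used here; the particular encoding, and the example entries, are illustrative assumptions rather than part of the model.

from dataclasses import dataclass, field
from enum import Enum


class EffectorClass(Enum):
    """The three classes of effectors defined in this section."""
    SHELL = "shell"          # stationary enclosure with an environmental and participatory objective
    FIXTURE = "fixture"      # environment-fixed device used to create sensory stimulation
    APPLIANCE = "appliance"  # participant-fixed component worn on or near the body


@dataclass
class ArtificialEnvironment:
    """The two elements that define an artificial environment."""
    # Space definition: exists independent of the experience.
    space_definition: dict = field(default_factory=dict)
    # Environment management: rules that operate dynamically during the experience.
    environment_management: dict = field(default_factory=dict)


# Examples drawn from the text: goggles are an appliance, a room-size tracker
# is a fixture, and a flight simulator cab is a shell.
effectors = {
    "stereo goggles": EffectorClass.APPLIANCE,
    "room-size tracking system": EffectorClass.FIXTURE,
    "flight simulator cab": EffectorClass.SHELL,
}

world = ArtificialEnvironment(
    space_definition={"objects": ["terrain", "runway"], "physical_laws": ["gravity"]},
    environment_management={"time_control": "jump ahead to a desired point"},
)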

The interaction of perceptual and muscle systems is especially complex with haptic stimulation. As Table 1 showed, the organ anatomy for the haptic perceptual system includes the postural and orienting-investigating muscle systems. Cutaneous sensation comes from the skin, while proprioceptive sensation originates with muscles, tendons, and other internal organs. The complexity of this interaction is revealed in a recent driving simulator that claimed VR attributes. The simulator generated acceleration motion cues by delivering force impulses to pads within the seat. However, the resulting proprioceptive sensation has little relation to the G forces normally experienced during driving. Thus, this implementation does not interface with the muscle systems in a natural fashion and does not achieve sensory equivalence.

It is unrealistic to expect perceptual equivalence. The human sensory system is too complex. For years, developers have tried to create a full-field-of-view display with either a fixed installation or head-mounted display. Such a display must match the resolution of the eye in the context of head and eye movements. A number of compromises are required to achieve effective results that are affordable and, in the case of personal appliances, comfortable. Thus, the VR engineering task requires an understanding of the human perceptual systems to guide the design. Certainly the literature on the visual system is rich, and the foundation for the human orienting system is well defined for flight simulation.[9,10] However, that same level of understanding does not exist for the haptic perceptual system.[11]

Technical view

Robinett[12] developed a taxonomy for synthetic experience. This taxonomy is the first thorough classification of experiences that uses the concept of technical mediation. Zeltzer[13] has used an autonomy, interaction, and presence (AIP) cube to describe the taxonomy of graphic simulation systems. Our VR model builds on the work of Robinett. It includes not only mediation, but also creation and interaction within a participatory environment.

Figure 4. VR system: Technical view. The virtual reality system, including the technical confection, links the participant to the real environment.

Figure 4 illustrates the technical view of a VR system. It includes the real environment and a new component called the technical confection. Confection is the process of preparing or making, especially by combining. In VR we are confecting a participatory environment by combining a real environment with an artificial one. The technical confection includes a confection model that achieves interface control, defines the artificial environment, and mediates between the participant and the real environment.

Technical confection also establishes the relationship between the sensors and effectors for the participant and the environment. Some VR implementations, such as telepresence, can be seen more clearly from a technical view that highlights the intermediary function between the participant and the environment. Also, the technological view reveals another participant sensor, a body state sensor. It is not a part of the muscle systems; rather, it monitors involuntary body conditions, such as temperature, pulse rate, and blood pressure. Certain high-stress or critically functional VR environments may require such sensors, which are commonly used with astronauts, for example.

Figure 5. Functional components of the confection model: interface control, environment control (combining the artificial and real environments), mediation, and representation definition.

Figure 5 presents the functional elements of the confection model. There are two control functions: one for interfaces (at the top of the figure) and one for environments (in the center). Environment control implements the combination of the real and artificial environments. The functions under environment control include space definition and environment management, which operate as described earlier. Mediation functions apply when the confection model operates as an interface between the participant and the real environment. (Mediation can apply to artificial environments, but we see little practical use for such functions there.) Representation functions refer to different forms the participant can take within the artificial or real environment. These include agents, robots, or personifications with such design parameters as autonomy, intelligence, and quantity. We feel that the VR model must support representation. However, the scope of representation and its impact on participation await further research.
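As with the earlier structures, the functional elements of the confection model can be sketched as a composition of parts. The part names follow the description above; the types, fields, and numeric ranges are our own illustrative choices, not a specification from the article.

from dataclasses import dataclass, field


@dataclass
class EnvironmentControl:
    """Combines the real and artificial environments."""
    space_definition: dict = field(default_factory=dict)
    environment_management: dict = field(default_factory=dict)


@dataclass
class Representation:
    """A form the participant can take in the artificial or real environment."""
    kind: str            # e.g. "agent", "robot", "personification"
    autonomy: float      # design parameter, assumed 0..1 here
    intelligence: float  # design parameter, assumed 0..1 here
    quantity: int = 1


@dataclass
class ConfectionModel:
    """Functional components of the technical confection (Figure 5)."""
    interface_control: dict = field(default_factory=dict)  # sensor/effector mappings
    environment_control: EnvironmentControl = field(default_factory=EnvironmentControl)
    mediation: bool = False  # active when interfacing the participant to the real environment
    representations: list = field(default_factory=list)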


The confection model can support independent models for each perceptual system. It also supports independent models for each muscle system, but the participant's detection of action is usually correlated with perceptual systems. The confection model is the central element in determining the participatory environment. The parameters that define its properties are interface mapping, model source, and time and space.

Figure 6 presents example interface mappings for each of the perceptual and muscle systems identified in Tables 1 and 2, listed in order of most common usage. For the model, we have listed example effectors and sensors. We do not intend this list to be limiting; rather, it illustrates types of shells, fixtures, and appliances. For the environment, we have listed sensors and effectors that interface to the real environment. Interface mapping establishes the functional connection between these elements. Most mappings are as simple as a horizontal line connecting the elements.

Figure 6. Interface mapping. Participant perceptual systems (visual, auditory, orienting, haptic, taste-smell, others) are driven by participant effectors in the model (displays, speakers, motion platform, switches, pressure applicator, others), and environmental state is determined via sensors (camera, microphone, accelerometer, pressure, thermometer, infrared imaging, others). Participant muscle systems (postural, orienting/investigating, locomotor, appetitive, performatory, expressive, sematic) are detected by participant sensors (trackers, glove, foot pedal, body suit, microphone, switch, others), and environmental state is modified via effectors (remote manipulator, motion platform, lighting, sound, material flow, others).

Figure 7 describes confection model sources. In artificial space, the model is either dynamic, constructed, or recorded. A dynamic model describes the space changes based on the participant's action, while a constructed model is a static model database. Recorded models capture a specific sequence of actions for later replay by the participant; as the figure shows, they apply to both artificial and real environments. Real environment models also include direct, sampled, and modified models. The transparency shown between the artificial and real model sources signifies that the sources can be combined. So long as an effector delivering components of an artificial model source does not fully mask the input to a perceptual system, then the experienced model will be a combination of the artificial and real environments. For example, a pilot's heads-up display might show images and information on a transparent mask that overlays the pilot's real vision. The combination may also occur inadvertently, as in a tactile feedback device that feels mostly like a tight and cumbersome glove.

Figure 7. Model source. Artificial sources: dynamic (the model changes during the participation based on the actions of the participant or other events, so the model database changes dynamically), constructed (the model is defined a priori as a fixed space and objects, so the model database is static), and recorded (a time recording of the space or interface parameters). Real sources: direct (1:1 mapping of the space or interface parameters as experienced by the participant), sampled (limited spatial or interface parameter resolution), modified (modified space or interface parameters, such as gain, frequency response, or time variable), and recorded (a time recording of the space or interface parameters). Transparency indicates the relative contribution of the artificial and real space components in creating the environment.
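The transparency idea, in which the experienced model is a mixture of artificial and real contributions to a perceptual channel, can be pictured as a per-channel blend. The article does not prescribe any particular blending rule; the convex combination below is only one simple way to illustrate it, and all names are ours.

from dataclasses import dataclass


@dataclass
class ChannelSource:
    """Contributions of the model sources to a single perceptual channel."""
    artificial: float    # stimulus level delivered by the artificial model source
    real: float          # stimulus level arriving from the real environment
    transparency: float  # 0.0 = artificial source fully masks the real input, 1.0 = fully transparent


def experienced_stimulus(src: ChannelSource) -> float:
    """Combine artificial and real contributions for one perceptual channel."""
    return (1.0 - src.transparency) * src.artificial + src.transparency * src.real


# A heads-up display overlaying the pilot's real field of view:
hud = ChannelSource(artificial=0.4, real=0.8, transparency=0.9)
print(experienced_stimulus(hud))  # combined visual stimulus experienced by the pilot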


Figure 8 lists the parameters that allow the confection model to change time and space properties of real or artificial environments. An example of time alteration is a flight simulator that lets participants jump ahead to a desired point without flying the whole way there. Manipulation of the space parameter is illustrated by microteleoperation, where the participant's perspective and operations are reduced to the size of molecules.

Figure 8. Time and space parameters. Time: direct (1:1 correlation between time in the environment and the participant involvement), multiple (nt, a time modification between participant space and the environment), fixed (T, a fixed time between participant space and the environment), and remapped (f(t), a functional remapping of time between participant and the environment). Space: direct ((x, y, z) matching of the participant space and the environment), distance (distance scaling of the participant space and the environment), scaled ((nx, my, oz) scaling of distance for the spatial dimensions between the participant and the environment), and functional (f(x, y, z), a functional remapping of distance for the spatial dimensions between participant and the environment).

The direct parameters reflect the participant's natural experience of space and time. There is a natural hierarchy in managing and controlling a VR system based on the parameters of the technical confection model. First, the interface mapping supports definition of the participant's perception of the environment and actions on it. Next, the model source defines the static and dynamic aspects of the environment. Finally, space and time have equal importance; they are independent of each other, but dependent on the first two levels of the confection model.
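A minimal way to picture the time and space parameters is as remapping functions applied between participant space and environment space. The functions below illustrate the direct, multiple, remapped, and scaled cases listed in Figure 8; the function names and the microteleoperation example values are our own.

from typing import Callable, Tuple

Vec3 = Tuple[float, float, float]

# Time parameters: map participant time t to environment time.
def time_direct(t: float) -> float:
    return t                      # 1:1 correlation

def time_multiple(t: float, n: float) -> float:
    return n * t                  # nt time modification

def time_remapped(t: float, f: Callable[[float], float]) -> float:
    return f(t)                   # f(t) functional remapping


# Space parameters: map participant coordinates to environment coordinates.
def space_direct(p: Vec3) -> Vec3:
    return p                      # (x, y, z) matching

def space_scaled(p: Vec3, n: float, m: float, o: float) -> Vec3:
    x, y, z = p
    return (n * x, m * y, o * z)  # per-axis (nx, my, oz) scaling


# Microteleoperation example: shrink the participant's workspace to molecular scale.
print(space_scaled((0.1, 0.2, 0.0), 1e-9, 1e-9, 1e-9))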
Conclusion

The technical confection model has interesting implications for networked VR systems. We feel that exploring these implications will benefit from the context of a conceptual VR model. The model proposed here includes a human model to show the experiential side of VR systems and a technical confection model for the creation of a participatory environment. Together these models define a conceptual VR system from a systems perspective that includes the human participant.

References
1. K.R. Boff and J.E. Lincoln, eds., Engineering Data Compendium: Human Perception and Performance, Vols. 1, 2, and 3, Human Eng. Div., Harry G. Armstrong Aerospace Medical Research Lab., Wright-Patterson AFB, Ohio, 1988.
2. K. Boff, J.P. Thomas, and L. Kaufman, eds., Handbook of Perception and Human Performance, Vols. I and II, John Wiley, New York, 1986.
3. E.G. Boring, Sensation and Perception in the History of Experimental Psychology, Irvington, New York, 1977.
4. J.J. Gibson, The Senses Considered as Perceptual Systems, Houghton Mifflin, Boston, 1966.
5. J.J. Gibson, The Ecological Approach to Visual Perception, Lawrence Erlbaum Assoc., Hillsdale, N.J., 1986.
6. T.W. Malone, "What Makes Computer Games Fun?" Byte, Vol. 6, No. 12, Dec. 1981, pp. 258-277.
7. M. Csikszentmihalyi, Flow: The Psychology of Optimal Experience, Harper Perennial, New York, 1990.
8. C. Heeter, "BattleTech Masters: Emergence of the First U.S. Virtual Subculture," Multimedia Review, Vol. 3, No. 4, Winter 1992, pp. 65-70.
9. S.R. Bussolari et al., "The Use of Vestibular Models for Design and Evaluation of Flight Simulator Motion," AIAA Paper 89-3274, presented at the AIAA Flight Simulation Technologies Conf., 1989.
10. J.M. Rolfe and K.J. Staples, eds., Flight Simulation, Cambridge Univ. Press, New York, 1988.
11. M.M. Taylor, "Tactual Perception of Texture," in Handbook of Perception, Vol. 3: Biology of Perceptual Systems, E.C. Carterette and M.P. Friedman, eds., Academic Press, New York, 1973, pp. 251-272.
12. W. Robinett, "Synthetic Experience: A Proposed Taxonomy," Presence: Teleoperators and Virtual Environments, Vol. 1, No. 2, Spring 1992, pp. 229-247.
13. D. Zeltzer, "Autonomy, Interaction, and Presence," Presence: Teleoperators and Virtual Environments, Vol. 1, No. 1, Winter 1992, pp. 127-132.

John N. Latta is president of 4th Wave, where he is responsible for several state-of-the-art multimedia projects, including a business television system. He is also actively involved in the analysis, design, and integration of projects that include personal computers, multimedia, and virtual reality. Latta has a BES from Brigham Young University and an MS and PhD from the University of Kansas, all in electrical engineering.

David J. Oberg is a principal engineer with Global Outpost, where he is developing teleoperations and telescience capabilities for a commercial space platform. His professional interests include space systems development, GPS-based attitude determination and rendezvous and docking, and space station propulsion systems. Oberg received BS and MS degrees in aeronautics and astronautics from the Massachusetts Institute of Technology in 1985 and 1986, respectively.

Readers can contact Latta at 4th Wave, PO Box 6547, Alexandria, VA 22306; e-mail jnl@fourthwave.com.

