Workshop on Virtual Reality Interaction and Physical Simulation VRIPHYS (2018), pp. 1–9
F. Jaillet, G. Zachmann, K. Erleben, and S. Andrews (Editors)
UnrealHaptics: A Plugin-System for High Fidelity Haptic Rendering in the Unreal Engine

Jeffery Bissel, Max Crass, Sinan Demirtas, Johannes Ganser, Tushar Garg, Sylvia Jürgens, Ralf Morawe, Marc O. Rüdel, Umesh Sevra,
Saeed Zahedi, Rene Weller, Gabriel Zachmann
University of Bremen

Abstract

We present UnrealHaptics, a novel set of plugins that enables both 3-DOF and 6-DOF haptic rendering in the Unreal Engine 4. At its core, it combines a state-of-the-art collision detection library, with support for very fast and stable force and torque computations, with a general haptics library for the communication with different haptic hardware devices. Our modular and lightweight architecture makes it easy for other researchers to adapt our plugins to their own requirements. As a use case, we have tested our plugin in a new asymmetric collaborative multiplayer game for blind and sighted people. The results show that our plugin easily meets the requirements for haptic rendering even in complex scenes.

CCS Concepts
• Human-centered computing → Haptic devices; Virtual reality; • Software and its engineering → Software libraries and repositories;

1. Introduction

With the rise of affordable consumer devices such as the Oculus Rift or the HTC Vive there has been a large increase in interest and development in the area of virtual reality (VR). The new display and tracking technologies of these devices enable high fidelity graphics rendering and natural interaction with virtual environments. Modern game engines like Unreal or Unity have simplified the development of VR applications dramatically. They almost hide the technological background from the content creation process, so that today everyone can click their way to their own VR application in a few minutes. However, consumer VR devices are primarily focused on outputting information to the two main human senses: seeing and hearing. Game engines, too, are mainly limited to visual and audio output. The sense of touch is widely neglected. This lack of haptic feedback can disturb the immersion in virtual environments significantly. Moreover, the concentration on visual feedback excludes a large number of people from the content created with the game engines: those who cannot see this content, i.e. blind and visually impaired people.

The main reasons why the sense of touch is widely neglected in the context of games are that haptic devices are still comparatively bulky and expensive. Moreover, haptic rendering is computationally and algorithmically very challenging. Although many game engines have a built-in physics engine, they are usually limited to simple convex shapes and they are relatively slow: for the visual rendering loop it is sufficient to provide 60-120 frames per second (FPS) to guarantee smooth visual feedback. Our sense of touch is much more sensitive with respect to the temporal resolution; here, a frequency of preferably 1000 Hz is required to provide acceptable force feedback. This requirement for haptic rendering requires a decoupling of the physically-based simulation from the visual rendering path.

In this paper, we present UnrealHaptics to enable high-fidelity haptic rendering in a modern game engine. Following the idea of decoupling the simulation part from the core game engine, UnrealHaptics consists of three individual plugins:

• A plugin that we call Haptico: it realizes the communication with the haptic hardware.
• The computational bottleneck during the physically-based simulation is the collision detection. Our plugin called Collette builds a bridge to an external collision detection library that is fast enough for haptic rendering.
• Finally, ForceComp computes the appropriate forces and torques from the collision information.

This modular structure of UnrealHaptics allows other researchers to easily replace individual parts, e.g. the force computation or the collision detection, to fit their individual needs. We have integrated UnrealHaptics into the Unreal Engine 4 (UE4). We use a fast, lightweight, highly maintainable and adjustable event system to handle the communication in UnrealHaptics.

As a use case we present a novel asymmetric collaborative multiplayer game for sighted and blind players. In our implementation, Haptico integrates the CHAI3D library that offers support for a

submitted to Workshop on Virtual Reality Interaction and Physical Simulation VRIPHYS (2018)
wide variety of available haptic devices. For the collision detection we use the state-of-the-art collision detection library CollDet [Zac01], which supports complexity-independent volumetric collision detection at haptic rates. Our force calculation relies on a penalty-based approach with both 3- and 6-degrees-of-freedom (DOF) force and torque computations. Our results show that UnrealHaptics is able to compute stable forces and torques for different 3- and 6-DOF devices in Unreal at haptic rates.

Figure 1: A typical haptic integration without UnrealHaptics. Left: different haptic devices available with their libraries. Right: scheme of UE4, into which we want to integrate the devices.

2. Related Work

Game engines make rapid development with high-end graphics and an easy extension to VR available to a broad pool of developers. Hence, they are usually the first choice when designing demanding 3D virtual environments. Obviously, this is also true for haptic applications. Consequently, many (research) projects have already integrated haptics into such game engines, e.g. [AMLL06], [dPECF16], [MJS04], to name but a few. However, they usually spent a lot of time developing single-use approaches that are hardly generalizable and thus not applicable to other programs.

Actually, there exist only very few approaches that provide comfortable interfaces for the integration of haptics into modern game engines. We only found [Kol17] and [Use16], which provide plugins for UE4 that serve as interfaces to the 3D Systems Touch (formerly SensAble PHANToM Omni) [PHA] via the OpenHaptics library [3D 18]. OpenHaptics is a proprietary library that is specific to 3D Systems' devices, which means that other devices cannot be used with these plugins. Furthermore, the plugins are not actively maintained and seem not to work with the current version of UE4 (version 4.18 at the time of writing). Another example is a plugin for the PHANToM device presented in [The14], also based on the OpenHaptics library. Like the other plugins, it is no longer maintained and was even removed from Unity's asset store [The18]. During our research, we could not find any actively maintained plugin for a commonly used game engine that supports 3- or 6-DOF force feedback.

Outside the context of game engines, there are a number of libraries that provide force calculations for haptic devices; a general overview is given in [KK11]. One example is the CHAI3D library [CHA18b]. It is an open-source library written in C++ that supports a variety of devices by different vendors. It offers a common interface for all devices that can be extended to implement custom device support. For its haptic rendering, CHAI3D accelerates the collision detection with mesh objects by using an axis-aligned bounding box (AABB) hierarchy. The force rendering is based on a finger-proxy algorithm: the device position is proxied by a second, virtual position that tries to track the device position. When the device position enters a mesh, the proxy stays on the mesh's surface and tries to minimize the distance to the device position locally by sliding along the surface. Finally, the forces are computed by exerting a spring force between the two points [CHA18a]. Due to this method's simplicity, it only returns 3-DOF force feedback, even though the library generally also allows passing torques and grip forces to devices. Nevertheless, we use CHAI3D in our use case, but only for the communication with haptic devices.

A comparable, slightly older library is the H3DAPI library [H3D]. Like CHAI3D, it is extensible in both the device and the algorithm domain. However, by default H3DAPI supports fewer devices and likewise does not provide 6-DOF force feedback.

A general haptic toolkit with a focus on web development was presented by Ruffaldi et al. [RFB∗06]. It is based on the eXtreme Virtual Reality (XVR) engine and utilises the CHAI3D library in order to allow rapid application development independent of the specific haptic interface. Unfortunately, the toolkit has not been developed further, and no documentation can be found since its homepage went down.

All approaches mentioned above are limited to 3-DOF haptic rendering. Sagardia et al. [SSS14] present an extension to the Bullet physics engine for faster collision detection and force computation. Their algorithm is based on the Voxmap-Pointshell algorithm [MPT99]: objects are encoded both in a voxmap that stores distances to the closest points of the object and in point-shells on the object surface that are clustered to generate optimally wrapped sphere trees. The penetration depth from the voxmap is then used to calculate the forces and torques. In contrast to Bullet's built-in algorithms, this approach offers full 6-DOF haptic rendering for complex scenes. However, the Voxmap-Pointshell algorithm is known to be very memory intensive and susceptible to noise [WSM∗10].

3. UnrealHaptics

The goal of our work was to develop an easy-to-use and simultaneously adjustable and generalizable system for haptic rendering in modern game engines. It can be used in games, research or business-related contexts, either as a whole or in parts. We decided to use the Unreal Engine for development for several reasons:

• it is one of the most popular game engines with a large community, regular updates and good documentation,
• it is free to use in most cases, especially in a research context where it is already heavily used [RTH∗17], [MJC08],
• it is fully open-source, thus can be examined and adapted,


• it offers programmers access at the source code level, while game designers can use a comfortable graphical editor in combination with a graphical scripting system called Blueprints; thus, it combines the advantages of open class libraries and extensible IDEs,
• it is extendable via plugins,
• and finally, it is built on C++, which makes it easy to integrate external C++ libraries. This is convenient because C++ is still the first choice for high-performance haptic rendering libraries.

Our goals directly imply a modular design for our system. The main challenges when including haptics into programs are fast collision detection, stable force computation and communication with hardware devices. Figure 1 presents the previous state before our plugins: on the one side, there are different haptic devices available with their libraries; on the other side, there is UE4, into which we want to integrate the devices. Consequently, our system consists of three individual plugins, each of which realizes one of these tasks. In detail these are:

• A plugin called Haptico, which realizes the communication with haptic hardware, i.e. it initializes haptic devices and during runtime receives positions and orientations and sends forces and torques back to the hardware.
• A plugin called Collette that communicates with an (external) collision detection library. Initially, it passes geometric objects from Unreal to the collision library (to enable it to potentially compute acceleration data structures etc.). During runtime, it updates the transformation matrices of the objects and collects collision information.
• ForceComp, a force rendering plugin which receives collision information and computes forces and torques that are finally sent to Haptico. The force calculation is closely related to the collision detection method because it depends on the provided collision information. However, we decided to separate the force and torque computation from the actual collision detection into separate plugins because this allows an easy replacement, e.g. if the simulation is switched from penalty-based to impulse-based.

This list of plugins already suggests that communication plays an important role in the design of our plugin system. Hence, we will start with a short description of this topic before we detail the implementations of the individual plugins.

3.1. Integration into Unreal

UE4 is a game engine that comprises the engine itself as well as a 3D editor for creating applications with the engine. We start with a short recap of UE4's basic concepts.

UE4 follows the component-based entity system design. Every object in the scene (3D objects, lights, cameras, etc.) is at its core a data-less, logic-less entity (in the case of UE4 called an actor). The different behavior between objects stems from components that can be attached to these actors. For example, a StaticMeshActor (which represents a 3D object) has a mesh component attached, while a light source will have different components attached. These components contain the data used by UE4's internal systems to implement the behavior of the composed objects (e.g. the rendering system uses the mesh components, the physics system uses the physics components, etc.).

UE4 allows its users to attach new components to actors in the scene graph, which allows extending objects with new behavior. Furthermore, if a new class is created using UE4's C++ dialect, variables of that class can be exposed to the editor. By doing so, users can easily change values of an instance of the class from within the editor itself, which minimizes programming effort.

UE4 not only provides a C++ interface, but also a visual programming language called Blueprints. Blueprints abstract functions and classes from the C++ interface and present them as "building blocks" that can be connected by execution lines. They serve as a straightforward way to minimize programming effort and even allow people without programming experience to create game logic for their projects.

When extending UE4 with custom classes, the general idea is noted in [Epi18]: programmers extend the existing systems and expose the changes via Blueprints. These can then be used by other users to create game behaviour. We followed this idea as well when implementing our plugins.

Furthermore, to make the code more reusable and easier to distribute, UE4 allows developers to bundle their code as plugins [Epi17]. Plugins can be managed easily within the editor, and all their classes and Blueprints are then directly accessible in the editor. We implemented our work as a set of three plugins to make distribution effortless and to allow users to choose which features they need for their projects.

Finally, UE4 programs can be linked against external libraries at compile time, or load them dynamically at runtime, just like regular C++ applications. We use this technique to base our plugins on already existing libraries. This ensures a time-tested and actively maintained base for our plugins.

3.2. Design of the Plugin Communication

As described above, our system consists of three individual plugins that exchange data. Hence, communication between the plugins plays an important role. Following our goal of flexibility, this communication has to meet two major requirements:

• The plugins need to communicate with each other without knowledge of the others' implementation, because users of our plugins should be able to use them individually or combined; they could even be replaced by the users' own implementations. Thus, the communication has to run on an independent layer.
• Users of the plugins should be able to access the data produced by the plugins for their individual needs. This means that it must be possible to pass data outside of the plugins.

To fulfill both requirements, we implemented a messaging approach based on delegates. A delegator is an object that represents an event in the system. The delegator can define a certain function signature by specifying parameter types. Delegates are functions of said signature that are bound to the delegator. The delegator can then issue a broadcast, which calls all bound delegates.
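The delegator/delegate scheme just described can be sketched in a few lines of plain C++. This is only an illustrative sketch, not the actual UnrealHaptics code; the names addDelegate and broadcast follow the description in the paper, everything else is an assumption.

```cpp
#include <functional>
#include <utility>
#include <vector>

// A delegator represents one event; the template parameters fix the
// signature that every bound delegate must match.
template <typename... Args>
class Delegator {
public:
    using Delegate = std::function<void(Args...)>;

    // Bind any common C++ callable (free function, lambda, bound member).
    void addDelegate(Delegate d) { delegates_.push_back(std::move(d)); }

    // Issue a broadcast: call every bound delegate with the given arguments.
    void broadcast(Args... args) {
        for (auto& d : delegates_) d(args...);
    }

private:
    std::vector<Delegate> delegates_;
};
```

An event passing, say, a 3-D position could then be declared as `Delegator<float, float, float> moveOnHapticTick;` and fired with `moveOnHapticTick.broadcast(x, y, z);`, with any number of listener functions bound beforehand.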


Figure 2: Unreal's editor view of the game. On the left side, you see the Phantom player in the virtual environment. In front of him are the virtual tool (pen) and a ColletteStaticMeshActor to be recognized (crown). On the right, the scene graph is displayed with our custom classes.

Figure 3: The structure of the system. Right: the UE4 game thread. Left: the haptic thread, which is separated from the game thread. Data is passed between the plugins by using a delegate system. The game thread updates its visual representation at a low frequency.

Effectively, the delegates are functions reacting to the event represented by the delegator. A delegator can pass data to its delegates when broadcasting, completing the messaging system.

While UE4 provides the possibility to declare different kinds of delegates out of the box, we opted for a custom C++ solution. The details behind that decision are explained in Section 3.2.1.

The setup of the delegates between the plugins can be handled, for example, in a custom controller class within the users' projects. We describe the implementation details for such a controller in Section 3.2.5.

In the next few sections we give an overview of UnrealHaptics and its parts in more detail.

3.2.1. Our Light Delegate System

UE4 provides the possibility to declare different kinds of delegates out of the box. However, these delegates have a few drawbacks. Only Unreal objects (declared with the UOBJECT macro etc.) can be passed around with such delegates, limiting their use for more general C++ applications. They also introduce several layers of calls in the call stack, since they are implemented around UE4's reflection system. This may influence performance when many delegates are used. Finally, we experienced problems at runtime: UE4 delegators temporarily forgot their bound functions, which led to crashes when trying to access the addresses of these functions.

To overcome these problems we implemented our own Delegator class. It is a pure C++ class that can take a variable number of template arguments which represent the parameter types of its delegates. A callable can be bound with the addDelegate(...) function. Our solution supports all common C++ callables (free functions, member functions, lambdas etc.). The delegates can be executed with the broadcast() function, which executes the delegates one after another with just a single additional step in the call stack. The data is always passed around as references internally, preventing any additional copies.

3.2.2. Haptico Plugin — Haptic Device Interface

Haptico enables game developers to use haptic devices directly from UE4 without manually implementing a connection to the device. Thanks to the underlying CHAI3D library, it automatically detects a connected haptic device and allows retrieving data from the device and applying forces and torques to it, from Blueprints or C++ code.

Haptico consists of mainly three parts: the haptic manager, the haptic thread and the haptic device interface. The haptic manager is the only user interface and is represented as a UE4 actor in the scene. It provides functions to apply forces and torques to the device and to get information such as the position and rotation of the end effector. To be usable for haptic rendering, the execution loop of the plugin must be separated from UE4's game thread, which runs at a low frequency. The plugin therefore uses its own haptic thread internally. The haptic thread reads the positional and rotational data from the device, provides it to the haptic manager and applies the new forces and torques retrieved from the haptic manager to the device in every tick. When new haptic data is available, a delegator-event MoveOnHapticTick is broadcasted, which passes the device data to the haptic manager in every tick. Users of the plugin can easily hook their own functions to this event, allowing them to react to the moved device. A second delegator-event ForceOnHapticTick is broadcasted, which allows users to hook force calculation functions into the haptic thread. Our own ForceComp plugin uses this mechanism, which is further described in Section 3.2.5.

3.2.3. Collette — Collision Detection Plugin

The physics module included in UE4 has two drawbacks that make it unsuitable for haptic rendering:


1. It runs on the main game thread, which means it is capped at 120 FPS.
2. Objects are approximated by simple bounding volumes, which is very efficient for game scenarios but too imprecise to compute the collision data needed for haptic rendering.

This leads to the realization that for haptic rendering, UE4's physics module has to be bypassed.

Our Collette plugin does exactly that. We do not implement collision detection in this plugin, but provide a flexible wrapper to bind external libraries. In our use case we show, as an example, how to integrate the CollDet library (see Section 4.2.2). Like Haptico, Collette can run in its own thread. Thus, the frequency needed for haptic rendering can be achieved.

Figure 4: The detailed structure of the system. Right: the collision detection thread. Left: the haptic thread, which is separated from the game thread and the collision detection thread.

The plugin uses a ColletteStaticMeshActor for representing collidable objects. This is an extension of UE4's StaticMeshActor. It supports loading additional pre-computed acceleration data structures into the actor's mesh component when the 3D asset is loaded. For instance, in our example application we load a pre-generated sphere tree asset from the hard drive, which is used as the internal representation of the underlying algorithm.

The collision pipeline is represented by a ColletteVolume, which extends the UE4 VolumeActor. We decided to use a volume actor because it allows limiting the collision pipeline to defined areas in the level. This is especially useful for asymmetric multiplayer scenarios as described in Section 4. Collidable objects can be registered with the pipeline via an AddCollisionWatcher(...) Blueprint function. The function takes references to the ColletteVolume as well as to two ColletteStaticMeshActors.

During runtime, the collision thread checks registered pairs with their current positions and orientations. If a collision is determined, the class ColletteCallback broadcasts an OnCollision delegator-event. Users of the plugin can easily hook their own functions to this event, allowing reactions to the collision. Blueprint events cannot be used here, as they are also executed on the game thread and thus run at a low frequency. The event also transmits references to the two actors involved in the collision, as well as the collision data generated by the underlying algorithm. This data can then be used, for example, to compute collision response forces.

3.2.4. ForceComp Plugin

The force calculation is implemented as a free-standing function which accepts the data from two ForceComponents that can be attached, in particular, to ColletteStaticMeshActors, and depends on the current transform of the ColletteStaticMeshActor. The ForceComponent provides the UE4 editor properties needed for the physical simulation of the forces: for instance the mass of the objects, a scaling factor or a damper (see Section 4.2.3). We have separated the force data from the collision detection. This allows users to use the Collette plugin without the force computation.

3.2.5. Controlling Data Flow Between the Plugins

Before running the plugin system, the delegator-events and their respective delegates need to be set up to organize the data flow between the individual plugins as well as the core game engine (and subsequently between the different threads).

Figures 3 and 4 show an overview of the plugin communication. Assume there are two ColletteStaticMeshActors in the scene, one of which is controlled by the haptic device (the virtual tool) while the other is a static 3D object in the scene. Due to our flexibility requirement, we want to avoid that the Haptico plugin works only with a specific implementation of the actor. To solve this challenge we place a BridgingController in the scene. It has a reference to the virtual tool. The reference is exposed to the UE4 editor as a property, so that it can easily be set by dragging-and-dropping the HapticManager instance onto the controller instance in the editor window. The controller binds a function to the MoveOnHapticTick event that is broadcasted by the haptic thread. The position and orientation data transmitted by this event is forwarded to the virtual tool. This has the same effect as if the virtual tool were updated directly in the haptic thread. With this solution, however, we keep the concrete implementations of the plugins separate from each other.

The force computation is executed on the haptic thread after updating the virtual tool's transform. The collision detection, which provides the necessary data for the force computation, is executed on the separate collision thread (see Figure 3), so that in case of deep collisions the haptic thread is not slowed down. In order to realize the communication between the two threads, a ForceController is placed in the scene which has a reference to both the ColletteVolume and the HapticsManager. These references can also be easily set in the editor. The controller first binds a function to the OnCollision delegator-event of the ColletteVolume. This function receives the collision data transmitted by that event and stores it in shared variables. The controller also binds a second function as a delegate to the ForceOnHapticTick delegator-event of the HapticsManager. By doing this, the haptic thread will execute the delegate after it has updated the virtual tool's transform. The delegate itself reads the data from the shared variables and computes the collision forces based on it. Afterwards, it passes the forces back to the HapticsManager, which in turn applies them to the associated haptic device.

By following this approach we have ensured that even though the different plugins require data from each other, they are modularized

(a) Half-section of the crown (b) Half-section of the phantom

Figure 5: Meshes from our game application and their ISTs. The
left object is a crown that has to be detected by the Phantom player.
The right object is the virtual tool controlled by the haptic device.

Figure 6: In-game screenshot of our implemented game. The Phan-


tom player sits at the table recognizing objects. A guard (right) is
and can be customized through user-defined behavior (via delegates patrolling the room.
and UE4 properties).

4. Use Case not to get spotted or make too much noise as these guards are highly
We applied the U NREAL H APTICS to a real-world application with sensitive to sounds. Vive’s job is to break the displays, collect the
support for haptic rendering. The main idea was the realization of artifacts while distracting the guards and bring them to Phantom.
an asymmetric virtual reality multiplayer game [Juu10] where a vi- Phantom’s job on the other hand is to recognize the right artifact
sually impaired and a seeing player can interact collaboratively in based on Falcon’s description using his shape recognition exper-
the same virtual environment. While the seeing person uses a head tise. The goal of the game is to steal and identify all the specified
mounted display (HMD) and tracked controllers like the HTC Vive artifacts before the time runs out.
hand controllers, the blind person operates a haptic force feedback In order to identify objects and the differences between fake and
device, like the PHANToM Omni. We will start with the descrip- real objects in the game, the Phantom player uses a haptic force
tion of the basic game idea before we explain the integration of ap- feedback device to sweep over the virtual collected objects. As soon
propriate collision detection, force rendering and communication as the virtual representation of the haptic device collides with an
libraries into our plugin system. object, U NREAL H APTICS detects these collisions and renders the
resulting forces back to the haptic device. It is therefore possible
4.1. Game Idea for visually impaired people to perceive the object similarly to how
they would in real life. Adding realistic sounds to this sampling
Extensive research involving interviews with visually impaired people was done to understand their perspective on a good game before going into the development phase. It turned out that most people we interviewed attach great importance to a captivating storyline and ambiance. Therefore, we included believable recordings and realistic sound effects to achieve an exciting experience.

The game takes place in a museum owned by a dubious relics collector called Earl Lazius. A team of three professional thieves, Phantom, Vive and Falcon, attempts to break into the museum in order to steal various valuable artifacts. The blind player takes control of Phantom, a technician who is particularly skilled in compromising security systems and an expert in forgeries. Vive is played by the sighted player using an HMD. He is a professional pickpocket and a master of deceiving people. Falcon, the operator of the heist, is a non-player character (NPC) in the game who acts as an assistant, provides valuable intel over a voice communication channel, and displays images of the items to be stolen via a device located on Vive's arm.

For every exhibit in the museum, there are several fake artifacts that look exactly the same as the real ones. Since Vive is incapable of differentiating between real and fake artifacts, it is the job of Phantom to apply his skills here. Also, several guards patrol the premises for possible intruders (see Figure 6). Vive has to be careful

could further improve this experience.

Even if the gameplay is in the foreground in our current use case, it is obvious that almost the same setup can easily be extended to perform complex object recognition tasks, or to combine HMD and haptic interaction for the sighted player.

4.2. Implementation Details

The concept behind UNREALHAPTICS is explained in Section 3. The following sections give an insight into our concrete implementations of the individual plugins.

4.2.1. Device Communication via CHAI3D

The basis for HAPTICO is the CHAI3D library. As already mentioned in Section 2, this library supports a wide variety of haptic devices, including the PHANToM and the Haption Virtuose [Hap], which we used for testing. CHAI3D is linked by HAPTICO as a third-party library at compile time. For the most part, the usage of CHAI3D is limited to its Devices module to interface with the devices, especially to set and retrieve positions and rotations. We did not use CHAI3D's force rendering algorithms, as they do not support 6-DOF force calculation. We also skipped CHAI3D's scene graph capabilities, as that is already handled by UE4 in our case.
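The device communication described in Section 4.2.1 boils down to a thin polling layer: once per haptic tick, the plugin reads the device pose and writes a force/torque pair back. The sketch below illustrates only this pattern; IHapticDevice, MockDevice, hapticTick and runHapticLoop are hypothetical names introduced here for illustration (the interface is loosely modeled after CHAI3D's device API, which cannot be exercised without hardware attached), not HAPTICO's actual classes.

```cpp
#include <array>
#include <cstddef>

// Hypothetical stand-in for a CHAI3D-style device interface: the plugin
// only needs to read the pose and write a wrench on each haptic tick.
struct IHapticDevice {
    virtual void getPosition(std::array<double, 3>& pos) = 0;
    virtual void getRotation(std::array<double, 9>& rot) = 0;
    virtual void setForceAndTorque(const std::array<double, 3>& force,
                                   const std::array<double, 3>& torque) = 0;
    virtual ~IHapticDevice() = default;
};

// Mock device so the loop can run without hardware.
struct MockDevice : IHapticDevice {
    std::size_t reads = 0, writes = 0;
    void getPosition(std::array<double, 3>& pos) override {
        pos = {0.0, 0.0, 0.0};
        ++reads;
    }
    void getRotation(std::array<double, 9>& rot) override {
        rot = {1, 0, 0, 0, 1, 0, 0, 0, 1};  // identity rotation
    }
    void setForceAndTorque(const std::array<double, 3>&,
                           const std::array<double, 3>&) override {
        ++writes;
    }
};

// One haptic tick: read the pose, compute a response (zero here),
// and write it back to the device.
void hapticTick(IHapticDevice& dev) {
    std::array<double, 3> pos;
    std::array<double, 9> rot;
    dev.getPosition(pos);
    dev.getRotation(rot);
    std::array<double, 3> force{0, 0, 0}, torque{0, 0, 0};
    dev.setForceAndTorque(force, torque);
}

void runHapticLoop(IHapticDevice& dev, std::size_t ticks) {
    for (std::size_t i = 0; i < ticks; ++i)
        hapticTick(dev);  // the real loop paces itself to the ~1000 Hz budget
}
```

In the actual plugin this loop body runs on a dedicated thread decoupled from the visual rendering path, exchanging poses and wrenches with the game thread.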

submitted to Workshop on Virtual Reality Interaction and Physical Simulation VRIPHYS (2018)
Bissel et al. / U NREAL H APTICS: A Plugin-System for High Fidelity Haptic Rendering in the Unreal Engine 7
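The volumetric collision response scheme of Section 4.2.3 (Equations 1 and 2) can be sketched in plain C++. The SpherePair and Wrench records below are illustrative assumptions mirroring the per-pair data (collision normal, overlap volume, relative normal velocity, collision center) that COLLETTE delivers; they are not the plugin's actual interface.

```cpp
#include <algorithm>
#include <array>
#include <vector>

using Vec3 = std::array<double, 3>;

static Vec3 add(const Vec3& a, const Vec3& b) { return {a[0] + b[0], a[1] + b[1], a[2] + b[2]}; }
static Vec3 sub(const Vec3& a, const Vec3& b) { return {a[0] - b[0], a[1] - b[1], a[2] - b[2]}; }
static Vec3 scale(const Vec3& a, double s) { return {a[0] * s, a[1] * s, a[2] * s}; }
static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]};
}

// One overlapping inner-sphere pair (i, j), as the collision detection
// would report it (hypothetical record, for illustration only).
struct SpherePair {
    Vec3 normal;      // n_ij, unit collision normal
    double volume;    // vol_ij, overlap volume of the pair
    double velocity;  // vel_ij, relative velocity along n_ij
    Vec3 center;      // C_ij, collision center
};

struct Wrench { Vec3 force{}; Vec3 torque{}; };

// Penalty-based response following Eq. (1) and (2): per-pair force
// magnitudes are clamped to be non-negative, then summed; torques use the
// lever arm from the center of mass to each collision center.
Wrench computeResponse(const std::vector<SpherePair>& pairs,
                       const Vec3& centerOfMass, double epsC, double epsD) {
    double volTotal = 0.0;
    for (const auto& p : pairs) volTotal += p.volume;

    Wrench w;
    if (volTotal <= 0.0) return w;
    for (const auto& p : pairs) {
        // max(vol_ij * eps_c - vel_ij * eps_d / Vol_total, 0):
        // only repulsive contributions are kept.
        double magnitude =
            std::max(p.volume * epsC - p.velocity * epsD / volTotal, 0.0);
        Vec3 fi = scale(p.normal, magnitude);
        w.force = add(w.force, fi);
        w.torque = add(w.torque, cross(sub(p.center, centerOfMass), fi));
    }
    return w;
}
```

Note how the damping term suppresses force spikes: a pair approaching quickly enough contributes nothing instead of a negative (attracting) force.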

(a) Original Mesh (b) 70 spheres (c) 670 spheres (d) 10000 spheres

Figure 7: Stanford bunny filled with inner spheres. (a) shows the full 3D object imported as a StaticMeshActor in Unreal. (b)–(d) show the corresponding pre-computed spheres.

4.2.2. Collision Detection With CollDet

CollDet is a collision detection library written in C++ that implements a complete collision detection pipeline with several layers of filtering [Zac01]. This includes broad-phase collision detection algorithms like a uniform grid or convex hull pre-filtering, as well as several narrow-phase algorithms like a memory-optimized version of an AABB-tree called the Boxtree [Zac95], and DOP-trees [Zac98]. For haptic rendering, the Inner Sphere Trees (IST) data structure fits best. Unlike other methods, ISTs define hierarchical bounding volumes of spheres inside the object. These spheres should fill the object as accurately and completely as possible, yet without overlapping, as shown in Figures 7 and 5. This approach is independent of the object's triangle count, and it has been shown to be applicable to haptic rendering. The main advantage, beyond the performance, is the collision information provided by the ISTs: they do not simply deliver a list of overlapping triangles but give an approximation of the objects' overlap volume. This guarantees stable and continuous forces and torques [WSM∗10]. The source code is available under an academic-free license.

COLLETTE's ColletteVolume is, at its core, a wrapper around CollDet's pipeline class. Instead of adding CollDet objects to the pipeline, the plugin abstracts this process by registering the ColletteStaticMeshActors with the volume. Internally, a ColletteStaticMeshActor is assigned a ColID from the CollDet pipeline through its ColletteStaticMeshComponent, so that each actor represents a unique object in the pipeline. When the volume moves the objects and checks for collisions in the pipeline, it passes the IDs of the respective actors to the CollDet functions which implement the collision checking. Like with CHAI3D, COLLETTE links to the CollDet library at compile time.

4.2.3. Force Calculation

Force and torque computations for haptics usually rely on penalty-based approaches because of their performance. The actual force computation method is closely related to the collision information that is delivered by COLLETTE. In the case of the ISTs, this is a list of overlapping inner spheres for a pair of objects. In our implementation we apply a slightly modified volumetric collision response scheme as reported by [WZ09]: For an object A colliding with an object B, we compute the restitution force \vec{F}_A by

\vec{F}_A = \sum_{j \cap i \neq \emptyset} \vec{F}_{A_i} = \sum_{j \cap i \neq \emptyset} \vec{n}_{i,j} \cdot \max\left( vol_{i,j} \cdot \varepsilon_c - \frac{vel_{i,j} \cdot \varepsilon_d}{Vol_{total}},\; 0 \right)    (1)

where (i, j) is a pair of colliding spheres, \vec{n}_{i,j} is the collision normal, vol_{i,j} is the overlap volume of the sphere pair, Vol_{total} is the total overlap volume of all colliding spheres, and vel_{i,j} is the magnitude of the relative velocity at the collision center in the direction of \vec{n}_{i,j}. Additionally, we added an empirically determined scaling factor \varepsilon_c for the forces and applied some damping with \varepsilon_d to prevent unwanted increases of the forces in the system.

Only positive forces are considered, to prevent an increase in the overlapping volume of the objects. The total restitution force is then computed simply by summing up the restitution forces of all colliding sphere pairs.

Torques for full 6-DOF force feedback can be computed by

\vec{\tau}_A = \sum_{j \cap i \neq \emptyset} \left( C_{i,j} - A_m \right) \times \vec{F}_{A_i}    (2)

where C_{i,j} is the center of collision for sphere pair (i, j) and A_m is the center of mass of the object A. Again, the total torque of one object is computed by summing the torques of all colliding sphere pairs [WZ09].

4.3. Performance

We have evaluated the performance of our implementation in the game on an Intel Core i7-6700K (4 cores) with 64 GB of main memory and an NVIDIA GeForce GTX 1080 Ti running Microsoft Windows 10 Enterprise.

We almost always achieved a frequency of 1 kHz for the force rendering and haptic communication thread. It only dropped slightly in situations with many intersecting sphere pairs. The same applies to the collision detection, which dropped slightly to 200–500 Hz in situations of heavy interpenetration. This corresponds to the results reported in [WSM∗10].

5. Conclusions and Future Work

We have presented a new plugin system for integrating haptics into modern plugin-orientated game engines. Our system consists of

three individual plugins that cover the complete requirements for haptic rendering: communication with different hardware devices, collision detection, and force rendering. We intentionally used an abstract design for our plugins. This abstract and modular setup makes it easy for other developers to exchange parts of our system to adjust it to their individual needs. In our use case, a collaborative multiplayer VR game for blind and sighted people, we have demonstrated the simplicity of integrating external C++ libraries with our plugins, namely CHAI3D for the communication with the hardware and the collision detection library CollDet. Our results show that our plugin system works stably and that the performance is well suited for haptic rendering even for complex non-convex objects.

Future projects now have an easy way to provide haptic force feedback in haptics-enabled games, serious games, and business-related applications. Even though other developers may decide to use different libraries for their work, we are confident that our experiences reported here, in combination with our high-level UE4 plugin system, will simplify their integration effort enormously. Moreover, our system is not limited to haptic rendering but can also be used to integrate general physically-based simulations.

However, our system and the current CHAI3D- and CollDet-based implementation also have some limitations that we want to address in future developments: Currently, our system is restricted to rigid body interaction. The inclusion of deformable objects would be desirable. In this case, a rework of the interfaces would be necessary, because the amount of data to be exchanged between the plugins would increase significantly; instead of transferring simple matrices that represent the translation and orientation of an object, we would have to transfer complete meshes. Direct access to UE4's mesh memory could be helpful to solve this challenge. Moreover, it would be desirable to directly access deformable meshes on the GPU, because there are a lot of appealing GPU-based collision detection methods.

Also, our use case offers interesting avenues for future work. Currently, we plan a user study with blind video game players to test their acceptance of haptic devices in 3D multiplayer environments. Moreover, we want to investigate different haptic object recognition tasks, for instance with respect to the influence of the degrees of freedom of the haptic device, or bimanual vs. single-handed interaction. Finally, other haptic interaction metaphors could also be interesting, e.g. the use of the haptic device as a virtual cane to enable orientation in 3D environments for blind people.

References

[3D 18] 3D SYSTEMS: Geomagic OpenHaptics Toolkit, 2018. Website. URL: https://www.3dsystems.com/haptics-devices/openhaptics. 2

[AMLL06] ANDREWS S., MORA J., LANG J., LEE W.-S.: Hapticast: A physically-based 3D game with haptic feedback. 2

[CHA18a] CHAI3D: CHAI3D Documentation — Haptic Rendering, 2018. URL: http://www.chai3d.org/download/doc/html/chapter17-haptics.html. 2

[CHA18b] CHAI3D: Website, 2018. URL: http://www.chai3d.org/. 2

[dPECF16] DE PEDRO J., ESTEBAN G., CONDE M. A., FERNÁNDEZ C.: HCore: a game engine independent OO architecture for fast development of haptic simulators for teaching/learning. In Proceedings of the Fourth International Conference on Technological Ecosystems for Enhancing Multiculturality (2016), ACM, pp. 1011–1018. 2

[Epi17] EPIC GAMES: Plugins, 17.11.2017. URL: https://docs.unrealengine.com/latest/INT/Programming/Plugins/index.html. 3

[Epi18] EPIC GAMES: Introduction to C++ Programming in UE4, 2018. Website. URL: https://docs.unrealengine.com/en-US/Programming/Introduction. 3

[H3D] H3DAPI: Website. URL: http://h3dapi.org/. 2

[Hap] HAPTION SA: Virtuose 6D Desktop. URL: https://www.haption.com/pdf/Datasheet_Virtuose_6DDesktop.pdf. 6

[Juu10] JUUL J.: The game, the player, the world: Looking for a heart of gameness. PLURAIS-Revista Multidisciplinar 1, 2 (2010). 6

[KK11] KADLEČEK P., KMOCH S. P.: Overview of current developments in haptic APIs. In Proceedings of CESCG (2011). 2

[Kol17] KOLLASCH F.: SirrahErydya/Phantom-Omni-Plugin, 11.12.2017. URL: https://github.com/SirrahErydya/Phantom-Omni-Plugin. 2

[MJC08] MÓL A. C. A., JORGE C. A. F., COUTO P. M.: Using a game engine for VR simulations in evacuation planning. IEEE Computer Graphics and Applications 28, 3 (May 2008), 6–12. doi:10.1109/MCG.2008.61. 2

[MJS04] MORRIS D., JOSHI N., SALISBURY K.: Haptic Battle Pong: High-degree-of-freedom haptics in a multiplayer gaming environment. URL: https://www.microsoft.com/en-us/research/publication/haptic-battle-pong-high-degree-freedom-haptics-multiplayer-gaming-environment-2/. 2

[MPT99] MCNEELY W. A., PUTERBAUGH K. D., TROY J. J.: Six degree-of-freedom haptic rendering using voxel sampling. In Proceedings of the 26th Annual Conference on Computer Graphics and Interactive Techniques (New York, NY, USA, 1999), SIGGRAPH '99, ACM Press/Addison-Wesley Publishing Co., pp. 401–408. doi:10.1145/311535.311600. 2

[PHA] PHANTOM OMNI: SensAble Technologies, Inc. URL: http://www.sensable.com. 2

[RFB∗06] RUFFALDI E., FRISOLI A., BERGAMASCO M., GOTTLIEB C., TECCHIA F.: A haptic toolkit for the development of immersive and web-enabled games. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology (2006), ACM, pp. 320–323. 2

[RTH∗17] REINSCHLUESSEL A. V., TEUBER J., HERRLICH M., BISSEL J., VAN EIKEREN M., GANSER J., KOELLER F., KOLLASCH F., MILDNER T., RAIMONDO L., REISIG L., RUEDEL M., THIEME D., VAHL T., ZACHMANN G., MALAKA R.: Virtual reality for user-centered design and evaluation of touch-free interaction techniques for navigating medical images in the operating room. In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems (New York, NY, USA, 2017), CHI EA '17, ACM, pp. 2001–2009. doi:10.1145/3027063.3053173. 2

[SSS14] SAGARDIA M., STOURAITIS T., SILVA J. L. E.: A new fast and robust collision detection and force computation algorithm applied to the physics engine Bullet: Method, integration, and evaluation. In EuroVR 2014 – Conference and Exhibition of the European Association of Virtual and Augmented Reality (2014), Perret J., Basso V., Ferrise F., Helin K., Lepetit V., Ritchie J., Runde C., van der Voort M., Zachmann G., (Eds.), The Eurographics Association. doi:10.2312/eurovr.20141341. 2

[The14] THE GLASGOW SCHOOL OF ART: Haptic demo in Unity using OpenHaptics with Phantom Omni, 2014. Online video. URL: https://www.youtube.com/watch?v=nmrviXro65g. 2

[The18] THE GLASGOW SCHOOL OF ART: Unity Haptic Plugin for Geomagic OpenHaptics (HLAPI/HDAPI), 2018. Website. URL: https://assetstore.unity.com/packages/templates/unity-haptic-plugin-for-geomagic-openhaptics-hlapi-hdapi-19580. 2

[Use16] USER ZEONMKII: ZeonmkII/OmniPlugin, 17.3.2016. URL: https://github.com/ZeonmkII/OmniPlugin. 2

[WSM∗10] WELLER R., SAGARDIA M., MAINZER D., HULIN T., ZACHMANN G., PREUSCHE C.: A benchmarking suite for 6-DOF real time collision response algorithms, 2010. URL: http://dl.acm.org/ft_gateway.cfm?id=1889874&type=pdf, doi:10.1145/1889863.1889874. 2, 7

[WZ09] WELLER R., ZACHMANN G.: A unified approach for physically-based simulations and haptic rendering. In Sandbox 2009 (New York, NY, 2009), Davidson D., (Ed.), ACM, p. 151. doi:10.1145/1581073.1581097. 7

[Zac95] ZACHMANN G.: The Boxtree: Exact and fast collision detection of arbitrary polyhedra. In SIVE Workshop (July 1995), pp. 104–112. 7

[Zac98] ZACHMANN G.: Rapid collision detection by dynamically aligned DOP-trees. In Proc. of IEEE Virtual Reality Annual International Symposium; VRAIS '98 (Atlanta, Georgia, Mar. 1998), pp. 90–97. 7

[Zac01] ZACHMANN G.: Optimizing the collision detection pipeline. In Proceedings of the First International Game Technology Conference (GTEC) (2001). 2, 7
