VR Programming: Electrical and Computer Engineering Dept
VR PROGRAMMING
VR Toolkits
System architecture
VR Programming Toolkits:
Are extensible libraries of object-oriented functions designed to help the VR developer;
Support the common i/o devices used in VR (so drivers need not be written by the developer);
Allow import of CAD models (saves time), editing of shapes, specification of object hierarchies, collision detection, multiple levels of detail, shading and texturing, and run-time management;
Have built-in networking functions for multi-user interactions, etc.
VR Toolkits can be classified by:
Whether they use text-based or graphical programming;
The type of language used and the library size;
The types of i/o devices supported;
The type of rendering supported;
Whether they are general-purpose or application-specific;
Whether they are proprietary (more functionality, better documented) or public domain (free, but with less documentation and functionality).
VR Toolkits in the Early ‘90s:
RenderWare (Canon), VRT3/Superscape (Dimension Ltd.), Cyberspace Developer Kit (Autodesk), Cosmo Authoring Tool (SGI/Platinum/CA), Rend386, and others;
They allowed either text-based programming (RenderWare, CDK and Rend386) or graphical programming (Superscape and Cosmo);
They were platform-independent and generally did not require graphics acceleration hardware;
As a result they tended to use “low-end” i/o devices (mouse) and flat shading in order to maintain fast rendering.
(Figure: a Rend386 scene.)
VR Toolkits discussed in this chapter:

Name                                    | Application         | Prgm. mode | Proprietary | Language
Java 3D (Sun Microsystems)              | general purpose     | text       | no          | implemented in C, programming in Java
Vizard Toolkit and PeoplePak (WorldViz) | general purpose     | text/graph | yes         | OpenGL-based Python scripting language
GHOST (SensAble Technologies)           | haptics for PHANToM | text       | yes         | C++
H3D                                     | haptics/graphics    | text       | no          | C++
PeopleShop (Boston Dynamics)            | military/civilian   | graph      | yes         | C/C++
Unity 3D                                | game engine         | text/graph | yes         | JavaScript, C#, and Boo (Python-like)
The scene graph:
Is a hierarchical organization of the objects (visible or not) in the virtual world (or “universe”), together with the view of that world;
Scene graphs are represented by a tree structure, with nodes connected by branches;
Visible objects are represented by external nodes, called leaves (they have no children) – for example nodes F, G, H, I;
Internal nodes represent transformations, which apply to all their children.
(Figure: example tree with root node A, internal nodes such as B and C, and external nodes such as F, G, H, I.)
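As a minimal sketch of how such a tree is assembled in code – using the Java 3D classes discussed later in this chapter, with illustrative class and variable names – an internal TransformGroup node is placed between the root and a visible leaf object:

    import javax.media.j3d.BranchGroup;
    import javax.media.j3d.Transform3D;
    import javax.media.j3d.TransformGroup;
    import javax.vecmath.Vector3f;
    import com.sun.j3d.utils.geometry.ColorCube;
    import com.sun.j3d.utils.universe.SimpleUniverse;

    public class SceneGraphSketch {
        public static void main(String[] args) {
            // Root of the content sub-graph (an internal node).
            BranchGroup root = new BranchGroup();

            // Internal node: a transformation applied to all of its children.
            Transform3D shift = new Transform3D();
            shift.setTranslation(new Vector3f(0.0f, 0.3f, 0.0f));
            TransformGroup xform = new TransformGroup(shift);

            // External (leaf) node: a visible object.
            xform.addChild(new ColorCube(0.2));
            root.addChild(xform);

            // The "universe" holds the view side of the scene graph.
            SimpleUniverse universe = new SimpleUniverse();
            universe.getViewingPlatform().setNominalViewingTransform();
            universe.addBranchGraph(root);
        }
    }

Moving the internal TransformGroup moves every leaf below it, which is exactly the parent–child semantics of the tree above.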
Scene graphs are not static:
(Figure: two snapshots of the scene graph of a virtual hand – a palm node with Thumb, Index, Middle and Ring children – interacting with a Panel containing a Button and a Knob; the Ball node changes its place in the graph between the two snapshots, illustrating run-time re-parenting.)
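A sketch of such a run-time change in Java 3D terms (node names and capability setup are illustrative): the Ball's BranchGroup is detached from its current parent and re-attached under the palm node when the hand grasps it.

    import javax.media.j3d.BranchGroup;
    import javax.media.j3d.Group;

    public class GraspBall {
        // Called when the virtual hand grasps the ball.
        static void attachToPalm(BranchGroup ballBranch, Group palm) {
            // Capabilities that must be set before the graph goes live:
            //   ballBranch.setCapability(BranchGroup.ALLOW_DETACH);
            //   (old parent).setCapability(Group.ALLOW_CHILDREN_WRITE);
            //   palm.setCapability(Group.ALLOW_CHILDREN_EXTEND);
            ballBranch.detach();         // remove the Ball sub-graph from its current parent
            palm.addChild(ballBranch);   // re-parent it under the palm node
            // The Ball's transform should also be re-expressed in palm coordinates.
        }
    }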
General flow of a VR simulation program: model the geometry → define networking → start the simulation → render the scene (graphics, audio, haptics) → exit the simulation.
VR Toolkits discussed in this chapter (see the table earlier in this section).
Java and Java 3D
Java:
object-oriented programming language;
developed for network applications;
platform-independent;
slower than C/C++.
Java 3D:
a Java hierarchy of classes that serves as an interface to 3D graphics and sound rendering systems;
tightly integrated with Java;
strong object-oriented architecture;
powerful 3D graphics API.
Java 3D geometry:
Geometry can be imported from various file formats (e.g. 3DS, DXF, LWS, NFF, VRML, …) using loaders, e.g. loader.load("Hand.wrl");
An Appearance is then attached to the imported geometry, e.g. Geom.setAppearance(Appr).
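A hedged sketch of such an import using the Wavefront OBJ loader that ships with the Java 3D utilities (the file name is illustrative; a VRML loader like the one implied by "Hand.wrl" is used the same way):

    import java.io.FileNotFoundException;
    import javax.media.j3d.Appearance;
    import javax.media.j3d.BranchGroup;
    import javax.media.j3d.Material;
    import javax.media.j3d.Shape3D;
    import com.sun.j3d.loaders.Scene;
    import com.sun.j3d.loaders.objectfile.ObjectFile;

    public class LoadHand {
        static BranchGroup loadHand() throws FileNotFoundException {
            // Import the geometry from a file (resized to fit a unit cube).
            ObjectFile loader = new ObjectFile(ObjectFile.RESIZE);
            Scene scene = loader.load("hand.obj");        // illustrative file name
            BranchGroup geomRoot = scene.getSceneGroup();

            // Attach an Appearance (material, shading, texture) to the first shape.
            Appearance appearance = new Appearance();
            appearance.setMaterial(new Material());
            Shape3D shape = (Shape3D) geomRoot.getChild(0);
            shape.setAppearance(appearance);
            return geomRoot;
        }
    }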
Java3D node types (examples):
BranchGroup – a compilable sub-graph;
Switch – selects which of its children are visible (useful for level of detail, LOD).
Loading objects from files and assembling the hand hierarchy:

Palm.addChild(IndexProximal);
IndexProximal.addChild(IndexMiddle);
IndexMiddle.addChild(IndexDistal);
Palm.addChild(MiddleProximal);
MiddleProximal.addChild(MiddleMiddle);
MiddleMiddle.addChild(MiddleDistal);
Palm.addChild(RingProximal);
RingProximal.addChild(RingMiddle);
RingMiddle.addChild(RingDistal);
Palm.addChild(SmallProximal);
SmallProximal.addChild(SmallMiddle);
SmallMiddle.addChild(SmallDistal);
Input devices in Java3D:
The only input devices supported natively by Java3D are the mouse and the keyboard;
Integrating the input devices currently used in VR applications (position sensors, trackballs, joysticks, sensing gloves, …) is left entirely to the developer;
Device drivers are usually written in C/C++, so one must either re-write the driver in Java or use JNI (the Java Native Interface) to call the C/C++ version of the driver; the latter solution is usually preferable;
Java3D provides a general-purpose input device interface that can be used to integrate sensors; however, developers often prefer custom-made approaches.
Java3D general-purpose sensor interface:
class PhysicalEnvironment – stores information about all the input devices and sensors involved in the simulation;
class InputDevice – the interface implemented by an input device driver;
class Sensor – the class for objects that provide real-time data.
(Figure: the PhysicalEnvironment aggregates InputDevices and their Sensors.)
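A hedged sketch of this interface in use: the InputDevice implementation below wraps a C/C++ tracker driver reached through JNI (the native library name and function are assumptions, not part of Java 3D) and writes each reading into its Sensor.

    import javax.media.j3d.InputDevice;
    import javax.media.j3d.Sensor;
    import javax.media.j3d.Transform3D;
    import javax.vecmath.Vector3d;

    public class TrackerDevice implements InputDevice {
        static { System.loadLibrary("trackerdriver"); }   // assumed native driver library
        private native double[] readTrackerPosition();    // assumed C/C++ entry point, returns {x, y, z}

        private final Sensor sensor = new Sensor(this);
        private final Transform3D reading = new Transform3D();

        public boolean initialize() { return true; }
        public void close() { }
        public int getProcessingMode() { return InputDevice.DEMAND_DRIVEN; }
        public void setProcessingMode(int mode) { }
        public int getSensorCount() { return 1; }
        public Sensor getSensor(int index) { return sensor; }
        public void setNominalPositionAndOrientation() { }
        public void processStreamInput() { }

        // Called by Java 3D to refresh the sensor reading.
        public void pollAndProcessInput() {
            double[] p = readTrackerPosition();
            reading.setTranslation(new Vector3d(p[0], p[1], p[2]));
            sensor.setNextSensorRead(System.currentTimeMillis(), reading);
        }
    }

The device would then be registered with the view's PhysicalEnvironment (physicalEnvironment.addInputDevice(new TrackerDevice())) so that behaviors can read the Sensor at run time.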
Java3D - Animating the simulation
Java3D offers Behavior objects for controlling the simulation
A Behavior object contains a set of actions performed when the object receives
a stimulus
A stimulus is sent by a WakeupCondition object
Some wakeup classes:
WakeupOnCollisionEntry
WakeupOnCollisionExit
WakeupOnCollisionMovement
WakeupOnElapsedFrames
WakeupOnElapsedTime
WakeupOnSensorEntry
WakeupOnSensorExit
WakeupOnViewPlatformEntry
WakeupOnViewPlatformExit
Java3D - Behavior usage
We define a behavior Bhv that rotates the sphere by 1 degree;
We want this behavior to be called each frame, so that the sphere keeps spinning.
(Figure: scene graph with the Universe root, the sphere branch, and two view platforms.)
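A sketch of such a behavior (class and variable names are illustrative): it asks to be woken on every frame and adds one degree of rotation to the TransformGroup above the sphere.

    import java.util.Enumeration;
    import javax.media.j3d.Behavior;
    import javax.media.j3d.Transform3D;
    import javax.media.j3d.TransformGroup;
    import javax.media.j3d.WakeupOnElapsedFrames;

    public class SpinBehavior extends Behavior {
        private final TransformGroup target;              // TransformGroup above the sphere
        private final Transform3D rotation = new Transform3D();
        private double angle = 0.0;

        public SpinBehavior(TransformGroup target) { this.target = target; }

        public void initialize() {
            wakeupOn(new WakeupOnElapsedFrames(0));       // stimulus on every rendered frame
        }

        public void processStimulus(Enumeration criteria) {
            angle += Math.toRadians(1.0);                 // rotate by 1 degree per frame
            rotation.rotY(angle);
            target.setTransform(rotation);
            wakeupOn(new WakeupOnElapsedFrames(0));       // re-arm for the next frame
        }
    }

The behavior still needs scheduling bounds (setSchedulingBounds), and the TransformGroup must have the ALLOW_TRANSFORM_WRITE capability set before the graph goes live.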
Java3D - Networking
Java3D does not provide a built-in solution for networked virtual environments;
However, its tight integration with the Java language lets the developer use the powerful networking features offered by Java;
Java3D applications can run as stand-alone applications or as applets in a web browser.
(Figure: client–server configuration.)
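A hedged sketch of one way to do this with plain java.net sockets (host, port, and message format are assumptions): each participant sends the shared object's position, and the receiver applies it to the corresponding TransformGroup in its own scene graph.

    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.net.Socket;

    public class PositionSender {
        private final DataOutputStream out;

        public PositionSender(String host, int port) throws IOException {
            // Connect to the other participant's simulation (assumed host and port).
            out = new DataOutputStream(new Socket(host, port).getOutputStream());
        }

        // Send the shared object's new position as three doubles.
        public void sendPosition(double x, double y, double z) throws IOException {
            out.writeDouble(x);
            out.writeDouble(y);
            out.writeDouble(z);
            out.flush();
        }
    }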
Java3D (Release 1.3 Beta 1) has lower system latency than WTK Release 9;
However, Java3D shows more variability in scene rendering time;
WTK does not have spikes in its scene rendering time.
VR Toolkits discussed in this chapter (see the table earlier in this section).
Vizard characteristics:
Uses Python, which is scalable and cross-platform;
Python is object-oriented and simple to integrate with C/C++;
It runs on Unix, Windows, Mac and other platforms;
Uses a four-window “workbench” that allows programmers to write and execute code, inspect 3D models, drag-and-drop objects, and issue commands while the scene is being rendered.
The four workbench windows:
Resource window – a text list of world assets;
Script stack – errors are highlighted as you type;
3D window – explore individual objects;
Interactive window – input commands.
Workbench use:
(Figure: icon menu, scene exploration with the mouse, importing objects.)
Vizard virtual hand:

import viz
import hand
viz.go()
#Identify the 5DT glove's port.
PORT_5DT_USB = 0

(Figure: the rendered virtual hand follows the glove action; User 1 and User 2 share the networked scene.)
Vizard networking example:

import viz
viz.go()

Ball = viz.add('ball.wrl')  #create a Ball object that is controlled by the other user
#add the world that will be displayed on your computer

#Use a prompt to ask the other user for the network name of his computer.
target_machine = viz.input('Enter the name of the other machine').upper()

#Add a mailbox from which to send messages to the other user. This is your outbox.
target_mailbox = viz.add(viz.NETWORK, target_machine)

#Add an id for the timer.
BROADCAST = 1
The GHOST toolkit (SensAble Technologies):
Provides an extensible environment for extending haptic interaction technologies.
(Figure: GHOST rendering pipeline – after haptic scene creation, collision detection, collision response and haptic state update run in the haptic servo loop at 1000 Hz, while the graphics scene is rendered at about 30 fps.)
GHOST geometric and force-field node classes include gstBoundaryCube, gstCone, gstCube, gstCylinder, gstSphere, gstTorus and gstTriPolyMeshHaptic, as well as gstForceField and gstConstantForceField.
(Figure: scene graph of an articulated body – a Body node with Torso, Head, Left/Right Shoulder and Left/Right Elbow nodes, each with its own x-y-z coordinate frame.)
Static scene graph – separator nodes as internal nodes, geometry nodes as leaves.
GHOST code example:

#include <stdlib.h>
#include <gstBasic.h>
#include <gstSphere.h>
#include <gstPHANToM.h>
#include <gstSeparator.h>
#include <gstScene.h>

int main()
{
    gstScene *scene = new gstScene;
    gstSeparator *root = new gstSeparator;
    gstSphere *sphere = new gstSphere;
    sphere->setRadius(20.0);                            // 20 mm haptic sphere
    root->addChild(sphere);
    root->addChild(new gstPHANToM("Default PHANToM"));  // the haptic interface node
    scene->setRoot(root);
    scene->startServoLoop();   // 1000 Hz haptic servo loop; graphics loop runs concurrently
    // ... application runs ...
    scene->stopServoLoop();
    return 0;
}
Surface haptic rendering: the normal force depends on the surface spring and damper coefficients, while the friction force depends on the static and dynamic friction coefficients.
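As a generic illustration of those dependences (a common spring–damper contact model, not GHOST-specific notation), with penetration depth d and penetration velocity \dot{d}:

    F_{normal} = K_{spring}\, d + K_{damper}\, \dot{d}
    F_{friction} \le \mu_{static}\, F_{normal}   (sticking),    F_{friction} = \mu_{dynamic}\, F_{normal}   (sliding)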
Dynamic nodes (e.g. gstButton) – behavioral example:
Updating the application: (Figure: the application process and the haptic process run concurrently and exchange state.)
(Figure: mapping the PHANToM workspace to the camera view – the phantomSep separator is transformed by a rotation matrix, the camera matrix and a z-axis offset so that the haptic workspace matches the view frustum; the scaling relates the display extent Dxmax to the workspace extent Dphantomxmax.)
When the workspace is scaled to the view frustum by a factor Sfrustum, the surface damping is rescaled accordingly:
SurfaceKdamping_new = SurfaceKdamping_current / Sfrustum
http://www.youtube.com/watch?v=mrMsb71ZJ1I
Other Applications and Projects
Why H3DAPI over GHOST?
PeopleShop initiation: scene geometry → define character path → define sensors → define behavior → define networking.
PeopleShop characters:
Character level-of-detail segmentation based on distance to the virtual camera improves real-time performance (up to about 100 characters can be in a scene); the figure showed 38-polygon and 2,500-polygon versions of the same character.
(Supplemental video: bdi.DI-Guy-LOD.mpg)
PeopleShop path specification:
Paths are laid out graphically using waypoints and slope adjusters; action beads and an end bead are placed along the path, action beads can be stacked, and their actions can be edited.
(Video clips VC 6.5 and VC 6.6 on the book CD.)
BDI Toolkits
PeopleShop sensors:
Sensors are user-defined volumes in space that detect when a character enters them (PeopleShop User’s Manual);
Example: when soldier A crosses the sensor boundary and enters the sensor volume, the system is notified – this triggers soldier B’s shooting at A.
(Supplemental video: bdi.farmhouse.mpg)
PeopleShop Behaviors
Behaviors can be reflex (triggered by signals received from sensors);
Behaviors can also be specified with decision beads;
Decision beads are placed on the character’s path (colored red);
A decision bead is characterized by two parameters, distance and length:
distance specifies how far from the start of the path the decision bead is placed;
length indicates over what distance from the beginning of the decision region the decision bead remains active;
Decisions can be converted to script:
BDI Toolkits
Decision clauses (IF/THEN/ELSE)
PeopleShop run-time (desktop interface):
The user interacts in real time with the simulation using a joystick or a mouse and menus; control and immersion are limited, and natural speeds should not be exceeded (from Koechling et al., 1997).
(Video clip VC 6.7 on the book CD.)
PeopleShop run-time (immersive interface):
The user interacts in real time with the simulation using trackers and sensors (a sensorized weapon and an omni-directional treadmill); control is at the joint level and immersion is increased (BDI, 1997).
PeopleShop Networking
Networked simulations exchange task-level changes (action, orientation, velocity).
Game console deployment (licenses required): Xbox 360 + Xbox Live Arcade, Wii + WiiWare, PS3, Sony NGP.
Disadvantages
Scene manipulation is done strictly through source code, which leads to slow turnaround;
Higher-level control is up to the programmer;
3D sound is very buggy;
Community support only – no longer any commercial support.
Why Unity?
The free edition offers a robust development environment, and educational licenses are available;
Unity supports multiple programming languages to design and manipulate the scene;
Price: free / $1,500 (Pro); graphical editing: yes; plugin required: for web deployment only.
Unity – Project Panel
This panel shows all of the available game assets in the
current project directory.
Game assets can include scripts, prefabs, 3D models,
texture images, sound files, fonts, etc…
New assets can be added by simply dragging them into
the project panel or placing them into the project
directory.
These files can be edited and saved using external
programs and the scene will be updated automatically.
Unity supports a number of 3D model formats and converts them to the Autodesk FBX format when they are added to the project.
Unity - Scene Hierarchy
Provides a means of adding new game objects and
managing existing ones.
Game objects can contain a combination of transforms,
graphics objects, physics colliders, materials, sounds,
lights, and scripts.
Each game object in the hierarchical tree represents a node
in the scene graph.
Similarly to VRML and Java3D, the scene graph
represents the relative spatial relationship of objects.
Example: A finger connected to a hand will translate
accordingly when the hand is moved.
Unity - Simple Hierarchy Example
Unity - Inspector
Shows the components attached to the currently selected
game object and their properties.
Manual control over an object’s transform allows precise
placement of objects.
Variables exposed by scripts attached to the object can be
viewed and set through this panel, allowing parameters to
be changed without the need to edit source.
These changes can be done while the project is live and
the scene will update immediately.
Unity - Simple Game Object
(Figure: the game object’s Transform component defines its spatial properties – its transformation matrix; the Console panel shows script output.)
Unity - Scene Editor
Allows graphical monitoring and manipulation of scene
objects.
Switch between various orthographic and perspective views.
Objects can be translated, rotated, and scaled graphically
with the mouse.
When live, the editor works like a sandbox in which you
can play around with objects without actually modifying
the scene.
Shows “Gizmos” for invisible scene objects, such as light
sources, colliders, cameras, and sound effects.
Unity - Simple JavaScript Example
Public variables are exposed to the editor, allowing monitoring and editing of
the live scene. This also allows for communication between objects.
The Update() method is called at every frame.
In this example, every time the left mouse button is clicked, a copy of the input object is created and added to the scene in front of the camera, the cube counter is increased, a randomly colored material is applied, and a force is applied to the new object, so that it appears to be launched away from you.
Unity - Complex Scene
Unity - Asset Store
A marketplace to buy
and sell assets used
within Unity.
This includes 3D
models, textures, scripts,
etc…
Can be used to
drastically reduce
development time, or
sell assets you have
created.
Unity - Union Marketplace
Similar to Apple’s App Store, this is a
marketplace in which games can be sold for
various platforms.
Allows developers to reach out to markets that
would be otherwise inaccessible.
70% of profits go to the developer while 30%
goes to Union.
Unity - VR Applications
Unity is able to use .Net libraries and external
shared libraries.
This enables the use of nearly any hardware
device within Unity applications.
Cameras can be used to create augmented reality.
Unity - AR on iPhone
Unity on iPhone