Camera Tracking

Summary

Camera tracking has been available in disguise for some time. This page gives an overview of how camera tracking is achieved, with a focus on some of the upgrades brought in r25.0.

 

The main changes are:

  • Under the hood refactoring to reduce code complexity

  • Added receive time smoothing to object tracking sources, rather than just cameras

  • Additional graphs and logging

  • Added the ability to disable particular axes in tracking sources

  • Added lens focus parameters (focus distance and aperture) for tracking sources which support them

  • General UI improvements.

General UI Improvements

We now highlight values in the camera that have been overridden when using external tracking sources or spatial calibrations, and we have added tooltips that indicate to the user that each of these values has been overridden.

We have also added an Open Driver button to go straight to the driver from the tracking source and, in the other direction, an Open Tracking Sources button in the driver to go to the tracking source, to improve navigation.

Added receive time smoothing to tracking sources

Within the tracking source, we have added a number of new options. The first is the Smooth receive time box, which, when turned on, smooths out the timing of the tracking data received by disguise but may add a small amount of latency. This behaviour was previously always on; it can now be turned on or off by the user.
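
To illustrate the idea (this is a simplified sketch, not the actual d3 implementation; all names below are invented for the example), receive time smoothing can be thought of as replacing the jittery arrival times of tracking packets with an evenly spaced timeline derived from a running estimate of the packet interval:

    // Simplified sketch of receive time smoothing; not the d3 implementation.
    // Raw packet arrival times jitter around a nominal interval. We keep a running
    // estimate of that interval and derive an evenly spaced "smoothed" receive time.
    struct ReceiveTimeSmoother
    {
        double estimatedIntervalMs = 0.0; // running estimate of packet spacing
        double smoothedTimeMs = 0.0;      // last smoothed receive time
        double lastRawTimeMs = 0.0;
        bool hasSample = false;

        double addSample(double rawReceiveTimeMs)
        {
            if (!hasSample)
            {
                hasSample = true;
                smoothedTimeMs = lastRawTimeMs = rawReceiveTimeMs;
                return smoothedTimeMs;
            }
            const double rawDelta = rawReceiveTimeMs - lastRawTimeMs;
            lastRawTimeMs = rawReceiveTimeMs;

            // Exponential moving average of the interval absorbs network jitter.
            const double alpha = 0.05;
            estimatedIntervalMs = (estimatedIntervalMs == 0.0)
                ? rawDelta
                : (1.0 - alpha) * estimatedIntervalMs + alpha * rawDelta;

            // Advance the smoothed timeline by one interval, but never let it fall
            // more than one interval behind the raw arrival times.
            smoothedTimeMs += estimatedIntervalMs;
            if (smoothedTimeMs < rawReceiveTimeMs - estimatedIntervalMs)
                smoothedTimeMs = rawReceiveTimeMs - estimatedIntervalMs;
            return smoothedTimeMs;
        }
    };

In d3 itself this logic lives in the TrackingDataHistory class (see the Technical description below) and is shared by camera and object tracking sources.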

 

Added the ability to disable particular axes in tracking sources

The Values box can be expanded to show all of the values being provided by the tracking source, and these will be highlighted in green. There are also check boxes next to each value to allow users to untick values that they don’t want passed through from the tracking source to the camera. Values that are unticked will change within the camera and can then be edited manually by the user within the camera.

 

Add a tracking source to a camera

  1. Under Devices, add a PositionReceiver.

  2. In the PositionReceiver, under Drivers, add the relevant tracking driver and set it up to receive tracking data.

  3. Engage the PositionReceiver.

  4. Create a Camera in the Stage.

  5. Under the Camera’s Settings, select the tracking source. This should have been automatically generated by the driver once tracking data started to be received.

  6. Data from the Tracking Source should start overriding the values in the camera.

Toggle values applied from the tracking source

  1. From the Camera, open the Tracking Source editor.

  2. Expand the Values section.

    • Values highlighted in green are being received from the tracking source.

    • Values with a dark grey background are not being received.

    • Values with a light grey background are received but toggled off.

     

  3. Untick the checkbox to stop the value being applied in the camera.

  4. Tick the checkbox to apply values again.

Toggle smooth receive time

To toggle receive time smoothing, check or uncheck the ‘Smooth receive time’ box in the tracking source. Turning this on should result in smoother tracking data, but may add a small amount of latency.

 

To use lens focus parameters in Unreal Engine

  1. Set up a Renderstream layer with an Unreal asset, with a channel created from a CineCamera.

    Map the channel to a Camera or MR set’s backplate or frontplate.

  2. Start the Renderstream workload.

  3. Adjust the Focus mode under Camera->Physical->Lens->Focus.

    There are 3 options:

    1. Disabled: No depth of field.

    2. Manual: Set the focus distance and aperture manually to adjust the depth of field. These can also be received from tracking sources which provide them.

    3. Object tracking: Select an object in Designer to calculate the focus distance from. The aperture can be set manually or received from tracking data. A minimal sketch of this distance calculation is shown after these steps.

  4. Adjusting the lens focus parameters in the camera should cause the depth of field effect to change in Unreal Engine.
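
In Object tracking mode the focus distance is, conceptually, just the distance from the camera to the selected object. A minimal sketch of that calculation is shown below; Vec3 and the function name are invented for the example and are not part of the d3 or Unreal API:

    // Illustrative sketch: deriving a focus distance from a tracked object.
    // Vec3 and focusDistanceFromObject are example names, not real API.
    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Distance from the camera position to the tracked object, used as the focus distance.
    float focusDistanceFromObject(const Vec3& cameraPosition, const Vec3& objectPosition)
    {
        const float dx = objectPosition.x - cameraPosition.x;
        const float dy = objectPosition.y - cameraPosition.y;
        const float dz = objectPosition.z - cameraPosition.z;
        return std::sqrt(dx * dx + dy * dy + dz * dz);
    }

The aperture is independent of this calculation: it controls how shallow the depth of field is around whichever focus distance is supplied.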

 

Troubleshooting

FAQs

What should I do if the logs contain ‘Attempted to access tracking data beyond end of buffer’?

We dynamically adjust the point at which we look back in the buffer to find smooth data, based on the history of tracking data receive times. You may occasionally see this message if tracking data comes in later than expected. You should find that the message stops after a second or two as the buffer lookup updates dynamically to account for the latest data. If this error appears constantly, there is either a bug in d3 or the tracking data is not reliable.
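
Conceptually the lookup behaves something like the sketch below (all names are invented for illustration; this is not the actual d3 code): each d3 frame we look back in the history buffer by a delay derived from recent receive times, and if the requested time is newer than anything we have received we fall back to the latest sample, which is the situation the log message describes.

    // Illustrative sketch of looking up smoothed tracking data in a history buffer.
    // TrackingSample, lookup and the parameter names are invented for this example.
    #include <deque>

    struct TrackingSample { double tReceivedMs; /* pose data ... */ };

    // Find the sample closest to (now - smoothDataDelay). If that time is beyond the
    // newest sample we have, this is the "beyond end of buffer" case: return the latest
    // sample and rely on the delay estimate adapting over the next frames.
    const TrackingSample* lookup(const std::deque<TrackingSample>& buffer,
                                 double nowMs, double smoothDataDelayMs)
    {
        if (buffer.empty())
            return nullptr;

        const double targetMs = nowMs - smoothDataDelayMs;
        if (targetMs > buffer.back().tReceivedMs)
            return &buffer.back(); // lookup has run past the received data

        for (auto it = buffer.rbegin(); it != buffer.rend(); ++it)
            if (it->tReceivedMs <= targetMs)
                return &*it;
        return &buffer.front();
    }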

 

How do I read the tracking smoothing graph?


New tracking smoothing graphs have been added to help debug tracking issues. The lines in the graph can be interpreted as follows (a rough sketch of how the first three quantities relate is shown after the list):

  • Lookup offset is the time in ms between the smooth tracking data we are looking up and the latest data received. This should almost never go below zero, as that would mean we received tracking too late to be able to smooth it. If it is consistently much greater than zero, that means we are adding unnecessary additional latency while waiting to receive smooth data.

  • Index delta is the change in index of the tracking data we look up in the buffer each d3 frame, e.g. if the tracking frame rate is twice the d3 frame rate this would be around 2. The smoothness of this line shows the success of the receive time smoothing - we are aiming for a consistent index delta between the packets of tracking data we look up each frame.

  • tReceived delta is the change in the receive time of the tracking data each frame. This shows us how smooth the tracking data would be without receive time smoothing - it can be compared against the index delta line to check that receive time smoothing is working.

  • Instantaneous smooth data delay tells us how far back we would have to look in the buffer to ensure we have received tracking data, based on the latest received packet. This is aggregated over time to give a stable lookup time.

  • Local smooth data delay is a maximum over time of the instantaneous smooth data delay, for this machine. It shouldn’t change often.

  • Global smooth data delay is the maximum smooth data delay over all machines, which is particularly relevant when we are redistributing tracking data across machines. It should provide a consistent period to look back in the buffer to ensure we have smooth tracking data.
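
As a rough illustration of how the first three lines relate to each other (the struct and field names below are invented for the example, not the actual graphing code), each d3 frame the graph effectively plots something like:

    // Illustrative sketch of the per-frame quantities plotted in the graph.
    // All names are invented for this example.
    struct FrameSample
    {
        int    lookupIndex;        // index of the buffer entry looked up this frame
        double lookupTReceivedMs;  // receive time of that entry
        double latestTReceivedMs;  // receive time of the newest entry in the buffer
    };

    struct GraphPoint
    {
        double lookupOffsetMs;     // latest received time minus looked-up time
        int    indexDelta;         // buffer entries advanced since the previous frame
        double tReceivedDeltaMs;   // change in looked-up receive time since the previous frame
    };

    GraphPoint plotFrame(const FrameSample& prev, const FrameSample& curr)
    {
        GraphPoint p;
        p.lookupOffsetMs   = curr.latestTReceivedMs - curr.lookupTReceivedMs;
        p.indexDelta       = curr.lookupIndex - prev.lookupIndex;
        p.tReceivedDeltaMs = curr.lookupTReceivedMs - prev.lookupTReceivedMs;
        return p;
    }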

Requirements

General Requirements

  • Tracking sources only override values which they are actually receiving - values which they don’t provide can still be edited in the camera or object.

  • The user can toggle tracking values on or off to stop them being applied to the object/camera.

  • The user can turn on and off receive time smoothing in both object and camera tracking sources.

  • The UI clearly indicates which tracked values are being overridden, and where from.

  • Aperture and focus distance can be set on the camera and used to control depth of field in UE assets.

  • Aperture and focus distance can be set by a tracking source which provides them.

  • Focus distance can be set to automatically track an object in the stage.

Limitations

Non-functional requirements

  • This work builds upon existing camera and object tracking workflows in d3.

  • Object tracking now shares some of the features of camera tracking, such as receive-time smoothing.

  • Previous Renderstream assets should work as they did before (with no focus parameters).

Technical description

The main part of the refactor for r25.0 involved consolidating the logic used for object and camera tracking sources. Receive time smoothing was previously implemented on the camera; it has now been moved to the TrackingDataHistory class, which is used by the ObjectTrackingSource.

 

There is now an inheritance hierarchy of tracking data types, based on ObjectTracking, allowing different types of tracking to be treated in the same way where appropriate. These include CameraTracking, CameraRenderTracking and SkeletonTracking.
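
As a sketch of the shape of this hierarchy (the members are examples, and the exact parent of each type is not spelled out on this page, so the relationships shown are only indicative):

    // Indicative sketch of the tracking data hierarchy; members are examples only.
    struct ObjectTracking
    {
        // position, rotation and receive time common to all tracked objects
    };

    struct CameraTracking : ObjectTracking
    {
        // adds camera lens data such as field of view, focus distance and aperture
    };

    struct CameraRenderTracking : CameraTracking // assumed parent, for illustration
    {
        // adds per-render data for a tracked camera driving a render
    };

    struct SkeletonTracking : ObjectTracking
    {
        // adds joint data for skeleton tracking
    };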

 

To achieve the toggling of specific pieces of tracking data, and allow tracking sources to define which values they send, the TrackingFlags struct was created. This can be set in a tracking source where the values sent differ from the defaults.
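
A minimal sketch of what such a flags struct could look like (the field names and defaults below are invented for the example, not the real definition):

    // Illustrative sketch of per-value tracking flags; field names are examples only.
    struct TrackingFlags
    {
        bool position      = true;   // x / y / z
        bool rotation      = true;   // heading / pitch / roll
        bool fieldOfView   = true;
        bool focusDistance = false;  // only some tracking sources provide lens focus data
        bool aperture      = false;

        // A tracking source sets only the flags for values it actually sends, so the
        // camera keeps user-editable values for everything else.
    };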

 

UI elements

Object and camera tracking source widgets have been modified, with various new options:

  • Added tick boxes to the values in the tracking source widgets, to toggle the values on and off

  • Added buttons to move between tracking sources and the corresponding drivers

  • Some of the tracking delays (e.g. lens delay offset, rotation delay offset) have moved from the camera to the tracking source.

  • An additional set of ‘Focus’ parameters have been added under the Lens parameters in the Camera widget

 

Tests added

Added various unit tests to ensure the reliability of the tracking code. In particular, test_cameratracking.cpp contains some high-level tests of typical camera tracking functionality. Other unit tests have been added for specific objects and structs, e.g. test_cameratrackingsource.cpp, test_cameratrackingstructs.cpp etc.

 

Additional information

Integration (with existing software)

Focus parameters are supported for CineCameras in Unreal Renderstream assets. Assets must use Renderstream plugin version RS2.0 or above to achieve this.

 

Hardware

External tracking systems are integrated as before.

 

Compatibility

Backwards compatibility with previous Renderstream versions has been maintained.