
# Virtual Camera Test

This sample shows how you can use camera tracking data to achieve an MR effect (e.g. virtual set extension on an LED wall). Although you would not use an iPhone camera in a real production, you can in principle replicate this setup with any LiveCapture input device.

It tests the following features:

- Component property replication
- Tracked perspective projection policy (see the sketch after this list)
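
For reference, the following is a minimal sketch of a tracked (off-axis) perspective projection, using the standard generalized perspective projection construction. The actual policy in the package may be implemented differently, and the corner transforms here are hypothetical scene objects:

```csharp
using UnityEngine;

// Sketch of a tracked ("off-axis") perspective projection: the frustum is
// rebuilt every frame so that it passes exactly through the corners of a
// fixed physical surface, seen from the tracked camera position. The three
// corner transforms are hypothetical scene objects.
[RequireComponent(typeof(Camera))]
public class TrackedPerspectiveSketch : MonoBehaviour
{
    public Transform lowerLeft;   // pa: surface corner, world space
    public Transform lowerRight;  // pb
    public Transform upperLeft;   // pc

    Camera cam;

    void Awake() => cam = GetComponent<Camera>();

    void LateUpdate()
    {
        Vector3 pa = lowerLeft.position, pb = lowerRight.position, pc = upperLeft.position;
        Vector3 pe = transform.position; // tracked eye/camera position

        // Orthonormal basis of the surface plane (left-handed, Unity style).
        Vector3 vr = (pb - pa).normalized;             // surface right
        Vector3 vu = (pc - pa).normalized;             // surface up
        Vector3 vn = Vector3.Cross(vr, vu).normalized; // surface normal

        // Keep the camera parallel to the surface; the asymmetric frustum
        // below carries the entire off-axis shear.
        transform.rotation = Quaternion.LookRotation(vn, vu);

        Vector3 va = pa - pe, vb = pb - pe, vc = pc - pe;
        float d = Vector3.Dot(va, vn); // eye-to-surface distance
        float n = cam.nearClipPlane, f = cam.farClipPlane;

        // Near-plane extents, found by projecting the corner vectors onto
        // the surface basis and scaling them to the near plane.
        float l = Vector3.Dot(vr, va) * n / d;
        float r = Vector3.Dot(vr, vb) * n / d;
        float b = Vector3.Dot(vu, va) * n / d;
        float t = Vector3.Dot(vu, vc) * n / d;

        cam.projectionMatrix = Matrix4x4.Frustum(l, r, b, t, n, f);
    }
}
```

Keeping the camera parallel to the surface means the entire off-axis shear lives in the projection matrix, which is the usual approach for LED-wall and CAVE-style rendering.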

## Camera and Screen Space Setup

The MR effect requires the camera poses to be tracked with respect to a known origin in the real world. An image marker defines the origin of the tracking space in the real world, and the "ClusterDisplay" object in the scene hierarchy defines the tracking space in the game.
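
For illustration, here is a minimal sketch of that mapping, assuming the pose already arrives expressed in the marker's tracking space (the component and field names are hypothetical):

```csharp
using UnityEngine;

// Minimal sketch (hypothetical names): the "ClusterDisplay" root maps the
// marker-relative tracking space into the scene, so a pose reported in
// tracking space becomes a world-space pose through that root transform.
public class TrackingSpaceCamera : MonoBehaviour
{
    public Transform trackingSpaceRoot; // the "ClusterDisplay" object
    public Transform virtualCamera;     // the in-game camera to drive

    public void ApplyPose(Vector3 trackingSpacePosition, Quaternion trackingSpaceRotation)
    {
        virtualCamera.position = trackingSpaceRoot.TransformPoint(trackingSpacePosition);
        virtualCamera.rotation = trackingSpaceRoot.rotation * trackingSpaceRotation;
    }
}
```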

## Virtual Camera Companion App

A customization of the Virtual Camera Companion App can be found in the VirtualProduction git repository on the branch wen/experimental/vcam-marker-origin. This version of the app shows the ARKit camera feed and reports camera poses in the tracking space (i.e. with respect to the image marker).
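
As a rough illustration, that reporting could look like the sketch below, assuming a standard ARFoundation image-tracking setup (the class and field names are illustrative, not the branch's actual code):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch (not the branch's actual code): once ARFoundation
// detects the reference image, the camera pose is re-expressed in the
// marker's frame, which is the tracking space the cluster expects.
public class MarkerRelativePoseSketch : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager imageManager;
    [SerializeField] Transform arCamera; // the ARKit-driven camera

    Transform marker;

    void OnEnable() => imageManager.trackedImagesChanged += OnTrackedImagesChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnTrackedImagesChanged;

    void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs args)
    {
        if (marker == null && args.added.Count > 0)
            marker = args.added[0].transform; // first detected marker becomes the origin
    }

    void Update()
    {
        if (marker == null)
            return;

        // Camera pose expressed in marker (tracking) space.
        Vector3 position = marker.InverseTransformPoint(arCamera.position);
        Quaternion rotation = Quaternion.Inverse(marker.rotation) * arCamera.rotation;
        // ...send (position, rotation) to the emitter host here.
    }
}
```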

You can customize the image marker by modifying the XR/ReferenceImageLibrary asset. Print out the image used by this asset.

IMPORTANT: The "Physical Size" setting in the reference image library must match the physical dimensions of the printed image.

## Scene Setup

Open the SampleScene in the VirtualCameraTest project. Place the printed image marker near the displays. Note the orientation of the marker's local axes (x: right, y: out of the page, z: up).
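
To double-check the orientation in the Editor, a small hypothetical helper like this can draw the marker's local axes in the Scene view:

```csharp
using UnityEngine;

// Hypothetical helper: draws the marker object's local axes in the Scene
// view so the printed marker can be aligned with its in-scene counterpart.
// With the marker lying flat, x (red) points right along the print,
// y (green) points out of the page, and z (blue) points up along the print.
public class MarkerAxesGizmo : MonoBehaviour
{
    public float axisLength = 0.1f; // meters

    void OnDrawGizmos()
    {
        Gizmos.color = Color.red;
        Gizmos.DrawRay(transform.position, transform.right * axisLength);   // x
        Gizmos.color = Color.green;
        Gizmos.DrawRay(transform.position, transform.up * axisLength);      // y
        Gizmos.color = Color.blue;
        Gizmos.DrawRay(transform.position, transform.forward * axisLength); // z
    }
}
```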

TIP: Place the marker somewhere convenient so that you can measure the positions of the displays relative to it. To avoid measuring angles, keep the image axis-aligned with the display surfaces.

Use the ClusterRenderer editor to enter the physical sizes and positions of the projection surfaces (in meters) relative to the image marker. Note that "Local Position" specifies the center of each surface. Also note the local axes of the "ClusterDisplay" game object; these should align with the axes of the printed image marker.
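
For reference, here is a small sketch (a hypothetical helper, not the ClusterRenderer API) showing how a measured center position and physical size could be expanded into the world-space corners that an off-axis projection, like the sketch above, consumes:

```csharp
using UnityEngine;

// Hypothetical helper (not the ClusterRenderer API): expands a surface
// defined by its measured center, orientation, and physical size into the
// world-space corners used by an off-axis projection.
public static class SurfaceCorners
{
    // root:          the "ClusterDisplay" tracking-space object
    // localPosition: surface center relative to root, in meters
    // localRotation: surface orientation relative to root
    // size:          physical width (x) and height (y), in meters
    public static (Vector3 pa, Vector3 pb, Vector3 pc) FromCenter(
        Transform root, Vector3 localPosition, Quaternion localRotation, Vector2 size)
    {
        Vector3 right = localRotation * Vector3.right * (size.x * 0.5f);
        Vector3 up    = localRotation * Vector3.up    * (size.y * 0.5f);

        Vector3 pa = root.TransformPoint(localPosition - right - up); // lower-left
        Vector3 pb = root.TransformPoint(localPosition + right - up); // lower-right
        Vector3 pc = root.TransformPoint(localPosition - right + up); // upper-left
        return (pa, pb, pc);
    }
}
```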

TIP: You can specify a single projection surface if you don't want to run the demo in a cluster (run fullscreen in the Editor, or run the standalone player without arguments).

The game is now ready to build and run.

## Running the Demo

Run the standalone player using Mission Control.

Build and run the customized Virtual Camera app on a supported iOS device. Connect to the emitter host, then point the camera at the marker image until the 3D axes appear (this can take a few seconds). You should now see the game responding to the movements of the phone.

NOTE: The axes may be clipped (not drawn) if the camera is very close to the marker. Pull back a bit if they don't appear.