XR-Geometry Calibration

Lets Smode know the positions of cameras and LED screens in the real world

Video Tutorial

A video tutorial that uses a simulator, so you can learn the calibration process without a real Stage.
You can download the project file here: calibration-tuto-simulator

1) Theory


How do you let Smode know the positions of cameras and screens?
That is the job of the geometry calibration. This calibration step enables Smode to detect the positions of the screens on the real stage and adjust the virtual one to match.
The calculation relies on April Tags and the April Tag detector modifier, called Locators.
The geometry calibration consists of taking a sufficient number of shots, called Single Frames, from different points of view of your real stage while April Tags are broadcast on the screens. During this step, you will, in short:
  • Take a shot (= store a Single Frame)
  • Move the camera
  • Wait for the tracker Position & Orientation deviation to be at 0 (or a very low value)
  • Repeat the steps above until you have enough Single Frames

You don't have to store a frame that displays all the screens, but be sure to detect enough April Tags. Adjust the focus of your camera if needed.
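The store/move/wait cycle can be sketched as a simple loop. This is a minimal Python sketch with hypothetical function names; in practice these actions go through the Calibrator Editor UI:

```python
DEVIATION_THRESHOLD = 1e-3  # "0 or a very low value"

def capture_frames(count, deviation_readings, store):
    """Store `count` Single Frames, waiting each time until the
    tracker deviation readings have settled near zero.

    deviation_readings: scripted Position/Orientation deviation values,
    standing in for the live tracker readout.
    """
    readings = iter(deviation_readings)
    stored = []
    for i in range(count):
        # Wait for the camera to settle before each shot
        while next(readings) > DEVIATION_THRESHOLD:
            pass  # in practice: keep waiting while the camera moves
        store(i)           # stand-in for the "Store Single Frame" trigger
        stored.append(i)   # the operator then moves the camera again
    return stored

# Scripted deviations: camera moving (high values), then settled (~0)
readings = [0.8, 0.2, 0.0, 0.5, 0.0, 0.0]
print(capture_frames(3, readings, store=lambda i: None))  # [0, 1, 2]
```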

2) Before Starting a geometric Calibration

Before launching the geometry calibration, you must ensure that the camera position and orientation in the Stage are approximately the same as in the real stage.
For this you can:
  1. Ask the people who set up the tracking device where the 0 (origin) is on the real stage, and place the Tracking System at this point.
  2. Move the camera along the up-down, front-back, and left-right axes, and rotate the Tracking System according to your observations if needed.
  3. Pan and tilt the camera to verify that it looks in the same direction; if not, you need to offset the orientation of the Tracker.
Once you have verified these points, you can start a geometry calibration.

2.1) For Stype tracking system

  1. [On Stype world]: Ensure that the camera position is correctly covered by Stype (seen by all cameras). The example below is not good:
  2. [On Stype world]: Calibrate the min and max zoom on the Stype computer.

2.2) For FreeD tracking system

You need to report the maximum and minimum zoom values into your Camera model. Zoom in and out until the maximum and minimum values are reached. Verify that the values set in the Custom Zoom Interval are correct; if not, you can set them manually.
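As a minimal sketch (with hypothetical raw values), the Custom Zoom Interval amounts to recording the extreme zoom readings seen during a full zoom sweep, then mapping raw readings into that interval:

```python
def update_zoom_interval(samples):
    """Return the (min, max) raw zoom values seen during a zoom sweep."""
    return min(samples), max(samples)

def normalize_zoom(raw, interval):
    """Map a raw zoom reading into the 0..1 range of the camera model."""
    lo, hi = interval
    return (raw - lo) / (hi - lo)

# Hypothetical raw FreeD zoom counts collected while zooming fully in and out
readings = [1200, 4800, 9600, 15000, 7400]
interval = update_zoom_interval(readings)   # (1200, 15000)
print(normalize_zoom(7400, interval))       # ~0.449
```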

3) UI Check-out

Go to the Geometry tab of the Calibrator Editor
  1. Viewport: displays the stream of the XR Calibrator Video Input.
  2. Enable Detector: enables April Tag detection and displays helpers in the viewport.
  3. Detection count: number of April Tags detected (result of the April Tag detector modifier).
  4. Tracker information: displays the current position and orientation of the Tracker of the Physical Camera, as well as their deviation. A positive Position or Orientation Deviation means that your tracker is currently moving.
  5. Send Locators: displays an April Tag Grid on each LED screen.
  6. Store Single Frame: trigger to store a frame for calibration.
  7. List of single frames: every stored Single Frame appears in this list.
  8. Single frame information: displays the number of April Tags detected for each screen and the pixel gap between their position in the video input stream and in the stage simulation.
  9. Evaluate: computes the average pixel gap.
  10. Calibrate: starts a calibration.
  11. Console output.
  12. Save as Calibration State: saves the calibration results as a calibration state.
  13. Calibration States list: every calibration result can be recalled as a state; they appear in this list.
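The Position and Orientation Deviation readouts in item 4 reflect jitter in the tracker signal. A minimal sketch (hypothetical sample format, one axis shown) of such a metric, computed as a standard deviation over a sliding window of tracker samples:

```python
import math
from collections import deque

class DeviationMonitor:
    """Standard deviation of recent tracker readings (one axis shown).

    A value near 0 means the camera is stable enough to store a frame.
    """
    def __init__(self, window=30):
        self.samples = deque(maxlen=window)

    def push(self, value):
        self.samples.append(value)

    def deviation(self):
        n = len(self.samples)
        if n < 2:
            return 0.0
        mean = sum(self.samples) / n
        return math.sqrt(sum((s - mean) ** 2 for s in self.samples) / n)

mon = DeviationMonitor()
for x in [1.00, 1.00, 1.00, 1.00]:  # perfectly still camera
    mon.push(x)
print(mon.deviation())  # 0.0
```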

4) Calibration process


Enable "Send Locators" (1).
Wait until the "Standard deviation" parameter for position and orientation reaches 0 (2). This value represents the "jitter" in the signal output by your tracking system: either the camera is not yet stable, or there is a problem, in which case check with the people who set up the tracking system.
Verify that a sufficient number of tags are detected (3). If necessary, adjust the focus and use the Enable Detector function (4), without being on air, to view the detected tags in the viewport.

Then you can store a Single Frame (5).
Move the camera for the next frame, wait until it is stable again (position/orientation deviation), then store another Single Frame (5).
Repeat these last steps several times.

When you have multiple frames, you can press Evaluate (6) and delete or mute frames whose error is way above the average.
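A minimal sketch (with hypothetical frame data) of this evaluation step: compute the average pixel gap over all stored Single Frames and flag the ones to delete or mute before calibrating:

```python
def evaluate(frames, outlier_factor=2.0):
    """frames: dict of frame name -> mean pixel gap for that frame.

    Returns the average pixel gap and the frames whose error is way
    above the average (here: more than outlier_factor times the mean).
    """
    average = sum(frames.values()) / len(frames)
    outliers = [name for name, gap in frames.items()
                if gap > outlier_factor * average]
    return average, outliers

# Hypothetical pixel gaps per stored Single Frame
frames = {"frame_01": 1.2, "frame_02": 0.9, "frame_03": 8.5, "frame_04": 1.1}
avg, bad = evaluate(frames)
print(round(avg, 3), bad)  # 2.925 ['frame_03']
```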

Once you have verified your frames, press Calibrate (7).
Wait until the toggle is automatically unset at the end of the calibration.

4.1) Stype Calibration process

When using Stype, you need to choose, with the Mode parameter of your Physical Camera, whether to use the optic data from Stype or to calibrate a prime lens optic in Smode.

4.2) FreeD Calibration process (Beta)

FreeD calibration takes time (30 minutes to 1 hour).
1) Roughly place the position and orientation of the tracking system in the stage (corresponding to the position and direction of the rail).
2) Scan the zoom to get the min and max zoom values (cf. the "Custom Zoom Interval" display in the Physical Camera).
3a) Go to a wide zoom level and capture frames at both ends of the rail, in high and low positions, with different orientations, so that the April Tags cover the four corners of the camera image (2x2x2 => 8 frames in total).
3b) Set the FoV, K1, K2, ShiftX and ShiftY polynomials to degree 1 and initialize the coefficients with fov = 1.5 (radians), k1 = 0 and k2 = 0. Then calibrate: tracking system position/orientation + FoV, K1, K2, Shift, aspect ratio.
4a) Staying in the current position, engage different new zoom levels, covering the whole sensor of the detection camera, especially in the wide shots (=> 8 frames in total).
4b) Calibrate with degree 2.
4c) Calibrate with degree 3.
5a) Look for problematic areas and capture more frames there.
5b) Calibrate with degree 4.
5c) Go back to 5a if there are still problematic areas.
To save your Camera model, check: Physical Camera.
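A minimal sketch (with hypothetical coefficients) of the polynomial lens model described above, where FoV and the distortion coefficients K1 and K2 are each a polynomial of the normalized zoom value, whose degree is raised from 1 to 4 across the calibration passes:

```python
def poly_eval(coeffs, z):
    """Evaluate c0 + c1*z + c2*z^2 + ... at normalized zoom z (0..1)."""
    return sum(c * z ** i for i, c in enumerate(coeffs))

# Degree-1 initialization, as in step 3b: fov = 1.5 rad, k1 = k2 = 0.
# Later passes append higher-degree coefficients (degree 2, 3, 4).
fov_coeffs = [1.5, 0.0]
k1_coeffs  = [0.0, 0.0]
k2_coeffs  = [0.0, 0.0]

def lens_params(z):
    """Lens parameters of the camera model at normalized zoom z."""
    return {
        "fov": poly_eval(fov_coeffs, z),
        "k1": poly_eval(k1_coeffs, z),
        "k2": poly_eval(k2_coeffs, z),
    }

print(lens_params(0.5))  # {'fov': 1.5, 'k1': 0.0, 'k2': 0.0}
```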

5) Export Calibration

If you are satisfied with the calibration, export it. This will create a .geocal file directly in the Smode project.

6) Troubleshooting

  1. Try calibrating with only one frame enabled.
  2. Verify that the screens are connected to the right outputs.
  3. Verify your UVs if you are using an FBX file.
  4. Verify that the orientation of the screens is correct (with a test pattern, for example).
Next step: XR-Color Calibration (blend the wall colors perfectly with the virtual surroundings)

See Also: