For Extended and Augmented Reality to work in an XR studio, the colors of the video input must be as close as possible to the colors of the overlaid image.
However, between the screens and the camera, the image passes through several colorimetric profiles.
In the same way, the LEDs composing the screens are oriented and soldered in a specific way (defined by the manufacturer).
The red, green, blue and white of an image do not have the same intensity depending on the camera's viewing angle.
Because the screens emit light onto each other, the angles of incidence between the displays cause the luminous and colorimetric intensity to differ from one extremity of the setup to another.
The objective of color calibration is to change the colors of the screens according to the camera angle and their positioning, to intelligently erase the angles of the setup and make the screens "transparent".
The picture below was taken before applying a color calibration of the screens. You can see that the LEDs display too much blue and green. The junction between the wall and the floor is also clearly visible. The set-extension trick does not work at all.
1) How it Works
Just like the Geometric Calibration, Smode needs several viewpoints of the setup to determine the color correction to apply to the screens.
These viewpoints are not images but sequences of grids of different colors cast onto the screens. Smode then compares each color sent in each square with the one received, and determines the color model of your setup.
From the calculated color model, Smode will, in a second step, derive an inverted color model. The inverted color model corresponds to the color correction to apply so that the desired colors appear on the screen.
Color model
: The simulation of the colorimetric profile of your setup.
Inverse color model
: Define which color must be displayed at which location on which screen to match the requested color.
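As a rough illustration of the two models, here is a toy per-channel gain model in Python. It is only a sketch under strong simplifying assumptions: the real color model is position- and angle-dependent, and none of the function names below come from Smode.

```python
# Toy illustration of a color model and its inverse.
# Assumption: each RGB channel is distorted by a simple gain, estimated
# from (emitted, received) sample pairs. The real Smart Lut is far richer
# (position- and angle-dependent); this only shows the round-trip idea.

def fit_color_model(samples):
    """Least-squares per-channel gain so that received ~= gain * emitted."""
    gains = []
    for ch in range(3):
        num = sum(r[ch] * e[ch] for e, r in samples)
        den = sum(e[ch] * e[ch] for e, r in samples)
        gains.append(num / den)
    return gains  # the "color model"

def invert_color_model(gains):
    """The inverse model: what to emit so the camera sees the wished color."""
    return [1.0 / g for g in gains]

def apply(gains, color):
    return [g * c for g, c in zip(gains, color)]

# Example: a wall that over-drives green and blue (as in the picture above).
true_gains = [0.9, 1.2, 1.3]
emitted = [(1, 0, 0), (0, 1, 0), (0, 0, 1), (0.5, 0.5, 0.5)]
samples = [(e, apply(true_gains, e)) for e in emitted]

model = fit_color_model(samples)          # simulation of the setup's profile
inverse = invert_color_model(model)       # correction to apply
wished = [0.5, 0.5, 0.5]
corrected = apply(inverse, wished)        # what is sent to the screen
seen = apply(true_gains, corrected)       # what the camera receives: ~wished
```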
2) Before Starting a Color Calibration
Your Smode project is fully optimized for graphical performance. Use the Profile feature if needed.
The frame rate is stable. If not, it might be because your setup isn't fully genlocked.
You are filming every impacted screen.
The camera does not move.
Every point to check before starting an XR-Geometry Calibration has been verified.
Turn off every light on the stage.
3) UI Check-out
Video Input
: Display the stream of the XR Calibrator.
: Enable the detection of April Tags and display helpers in the viewport.
: Number of April Tags detected (result of the April Tag detector modifier).
: Display the current position and orientation of the Tracker of the Physical Camera, as well as their deviation. A positive Position Deviation or Orientation Deviation value means that your tracker is currently moving.
: Display an April Tag Grid in each LED screen.
: Start the shoot of a viewpoint for calibration
List of viewpoints
: Every viewpoint appears in this list, with the number of April Tags detected for each screen and the pixel gap between their position in the video input stream and the stage simulation.
: Make an average evaluation of the differences between emitted colors and received colors.
: Start a calibration. Calculation time depends on the number of viewpoints shot.
Save as Calibration State
: Save the calibration results as a calibration state.
Calibration States list
: Every calibration result can be recalled as a state. They appear in this list.
4) Calibration process
In the XR Calibrator, Color tab, enable the Enable locator.
Try to detect the maximum number of April Tags, especially at the junctions of the walls and the corners, as these are the places where color correction is most needed.
Play with the focus of the camera to detect more of them.
There is also the possibility to change the "Quad Decimate" parameter (In -> Detector: April Tag) to adjust the number of tags displayed in the screens. The more you decimate, the faster the calibration will go, because there will be fewer tags to analyze.
Take a viewpoint shoot.
Remove the locators before each viewpoint shoot to optimize performance.
When a viewpoint is taken, several different colors are sent to the screens at the location of each tag. For each detected tag, Smode records both colors: the one emitted at this place of the screen and the one received.
It can then deduce the difference between a color sent at a given place on the screen and the color received at the same place.
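The per-tag bookkeeping described above can be pictured with a toy data structure. The tag ids and color values below are invented for illustration (Smode's internal format is not public); averaging the differences gives the kind of color-difference figure shown in the UI:

```python
# Sketch: aggregate the (emitted, received) color pairs recorded for each
# detected tag of one viewpoint, and compute an average color error.
# All tag ids and RGB values here are made-up example data.

viewpoint = {
    # tag_id: list of (emitted RGB, received RGB) pairs
    7:  [((1.0, 0.0, 0.0), (0.92, 0.05, 0.04))],
    12: [((0.0, 1.0, 0.0), (0.03, 1.10, 0.06)),
         ((0.5, 0.5, 0.5), (0.48, 0.57, 0.55))],
}

def average_color_error(viewpoint):
    """Mean absolute per-channel difference over all tags and samples."""
    total, count = 0.0, 0
    for pairs in viewpoint.values():
        for emitted, received in pairs:
            total += sum(abs(r - e) for e, r in zip(emitted, received))
            count += 3  # three channels per sample
    return total / count

err = average_color_error(viewpoint)  # one number summarizing the viewpoint
```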
You can visualize the data of a viewpoint by unfolding "Viewpoints" at the bottom of the Color parameters of the XR-Calibrator:
Once your viewpoint has been shot, move the camera to another point of view, wait for the camera to be stable, and take another one.
You don't have to frame every screen for a viewpoint shoot. Look at the interesting parts of the stage to be calibrated.
We advise taking at least these viewpoints:
Try as much as possible to take viewpoints facing the screens. Feel free to shoot twice from the same position: since the colors sent are randomized, this can improve the quality of the measurements.
In some cases, it can be interesting to "merge" the color models of the screens that make up the walls of your setup.
Select the corresponding XR Display information, then go to Color Model -> General -> Type and switch it to Merge.
Do the same with the inverse color model.
A panel warns you that several target parameters will be deleted. Press "YES".
In the XR Display information, merge the color model of one screen with the other one.
You are ready for starting a calibration. Press Calibrate.
It's time for you to take a break, because the calculation will take around 20 minutes.
The calculation generates a collection of LUTs for your setup called a "Smart Lut". They are automatically saved in a LUT folder associated with your project.
You can activate and fine-tune the color correction thanks to the following parameters in the XR Display information:
Per screen, modify the impact of the correction on luminance, chrominance, and its overall intensity (Overall).
To avoid banding phenomena, use Ditherize. This will smooth the correction areas.
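The idea behind such smoothing is classic ordered dithering: instead of snapping the correction to the nearest displayable step (which creates visible bands), a small repeating threshold offset is added per pixel so the local average of the quantized output stays close to the true value. The sketch below is a generic illustration, not Smode's actual implementation:

```python
# Sketch of ordered dithering on a flat correction value that falls
# between two quantization steps. Without dithering the whole area snaps
# to one band; with dithering the local average approximates the value.

PATTERN = [-0.375, 0.125, -0.125, 0.375]  # zero-mean 1D threshold pattern

def quantize(v, levels):
    """Snap v in [0, 1] to the nearest of `levels` equal steps."""
    step = 1.0 / (levels - 1)
    return min(1.0, max(0.0, round(v / step) * step))

def dither_quantize(v, x, levels):
    """Quantize after adding a position-dependent threshold offset."""
    step = 1.0 / (levels - 1)
    return quantize(v + PATTERN[x % 4] * step, levels)

levels, v = 4, 0.4               # a flat value between two displayable steps
plain = [quantize(v, levels) for x in range(64)]
dithered = [dither_quantize(v, x, levels) for x in range(64)]

plain_mean = sum(plain) / 64      # stuck on one band (1/3)
dither_mean = sum(dithered) / 64  # local average much closer to 0.4
```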
5) Visualize the Result
You will certainly need a better view of the result of your color calibration.
You can see the result calculated live by Smode simply by visualizing the "…" parameter bank:
This visualization can be created in a compo or processor thanks to the Visualize Smart Lut.
1) RGB LUT behaviour curves
: View how the LUT responds to an RGB signal (linear curve = perfection).
2) Grayscale LUT Behaviour curves
: View how the LUT responds to a grayscale signal (linear curve = perfection).
3) RGB/Grayscale gradient comparison
: Visualize a Red/Green/Blue gradient as emitted (on the top) and received (on the bottom).
4) Angle RGB behaviour
: The emitted color is outside the circle. The circle is a representation of how this color responds on the horizontal and vertical axes.
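Why is a linear curve "perfection"? If the LUT is the exact inverse of the screen-plus-camera response, composing the two gives the identity line. A toy check, assuming a simple gamma-type response (the exponent value is an assumption for illustration, not a Smode parameter):

```python
# If correction LUT == inverse of the display+camera response, then
# response(lut(x)) == x for every x, i.e. the behaviour curve shown in
# the visualization is a straight line. Toy gamma-type response:

GAMMA = 2.2  # hypothetical combined screen+camera response exponent

def response(x):
    return x ** GAMMA            # what the camera receives for emitted x

def correction_lut(x):
    return x ** (1.0 / GAMMA)    # inverse model baked into the LUT

xs = [i / 255 for i in range(256)]
composed = [response(correction_lut(x)) for x in xs]
max_dev = max(abs(c - x) for c, x in zip(composed, xs))  # distance to linear
```

If the calibration were imperfect (LUT not the exact inverse), `max_dev` would grow and the displayed curves would visibly bend away from the diagonal.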
6) Export Calibration
Just like the XR-Geometry Calibration, you have the possibility to save your color calibration into a .smartlut file. Those files can be reimported later.
Remember that every 3D Scene can be placed in a classical Scene, which means that you can apply any other color 2D Modifier for specific content or over your entire show.
The Display Angle Mask can also be helpful, because its role is to mask any 2D Modifier or 2D Generator according to the angle of the selected Stage Element.