
PHYSLIGHT

ANDERS LANGLANDS & LUCA FASCIONE


WETA DIGITAL

physlight is Weta's system for reproducing the entire imaging chain of a physical scene.
PHYSLIGHT

- Lighting in physical units

It consists of two orthogonal components: using physical units in light transport…


- Modelling camera response


…and simulating the response of digital stills and cinema cameras to be able to image a digital scene as if it was shot with a real camera.
- In use in production since 2015

It's also production proven: we've been using it on all our shows for the last five years.
Luca Fascione Luke Millar

Kimball Thurston Dan Lemmon

Johannes Meng Guy Williams

Sehera Nawaz Erik Winquist

Johannes Hanika Ross McWhannell

Before we begin: this is mostly Luca's work, but many people at Weta have contributed on both the technical and production sides, a few of whom we'd like to thank here.

To understand the problem we’re trying to solve, let’s consider a simple scene.
Light leaves a light source, bounces around a bit, and is captured by a camera…
…and the camera generates an image of the scene.


The bit in the middle we can account for:


- we use Manuka, a state-of-the-art spectral pathtracer
- we think we can calculate how light energy is transported through the scene with a high degree of accuracy (or at least, with known inaccuracy)

But what about either end?


- How do we know how much light is entering the scene?
- Given some amount of light reaching the camera, how do we know what the pixel value will be?

In visual effects we spend a lot of time trying to make CG objects feel like part of the photography…

…we do a lot of work on top to make shots sing, but making our CG ape feel like a physical object there in the scene, shot with the picture camera, is the foundation on which the rest of the lighting is built.
So knowing these values would be hugely beneficial.
TRADITIONAL METHOD

- Assume we can’t know absolute values of scene


- Ignore camera response
- Light to match a calibration object

Traditionally (since the film days), we assumed those functions were unknowable or too hard to measure. So instead we'd ignore the camera response and set our lighting to match a known calibration object.
- Capture HDRI + grey ball reference

You start by capturing the HDRI on set with a DSLR…


…along with a "balls pass" with the picture camera.


- Use HDRI to light CG grey ball

Then you light a CG grey ball using that HDRI…


- Tweak light until they match

…and you tweak the exposure and tint parameters of your IBL until the CG ball matches the reference ball. And you hope that your CG ball BRDF matches the real one, even after it's been out in the elements for several weeks.
- Lights often in "stops"

The lighting is now in some arbitrary space relative to the grey ball reflectance, so we use "stops" as a unit of relative exposure when talking about the brightness of lights, since we don't know what the actual values in the scene were.
- PROBLEM: camera response and lighting mixed together

‣ What if we change camera between shots?

‣ Have to adjust lighting, even if lighting is unchanged

The trouble is that our camera response and our lighting are now baked into a single tint value on the IBL. If we want to share light rigs between shots where the camera settings have changed, we can't, because there's no way to separate the contribution of the camera from that of the lights.
- SOLUTION: neutral-grade sequence?

‣ Now lighting and camera baked into neutral grade

‣ Lighting is consistent, but subtly off

A common solution is to grade a series of shots assumed to have "similar" lights and camera settings so that they match visually, then light to that.
This doesn't solve the problem, but at least it makes similar shots consistent. They will still be subtly wrong, though, as variation that should be present has been graded out. All this grading and tweaking is a lot of time that would be better spent making great-looking images.
- PROBLEM: how to represent known physical values

‣ Sun & sky

‣ Measured sources

‣ Fire etc.

100,000 lx = ??? stops

We also have no way of representing known physical values. How bright should the sky be in stops? How bright should the illumination from a 2K or a practical fire appear on a character in the shade?
- SOLUTION: fudge it!

‣ Have to throw away useful information

We can calculate, or measure, the correct absolute physical values for all these sources, but we have to throw that information away because we don't have a framework to use it in. We have to figure out our "correct" value in stops for each new plate and light-source combination by trial and error, because the brightness of our lights implicitly depends on the plate exposure settings.

- No way to solve this for film

‣ Too many variable chemical processes


- But we’re mostly digital now.

‣ We can do better!

In the film days this relative matching to reference was the best we could do. But nowadays most shoots are digital; we don't have to deal with chemical baths and film scanners any more, so there must be some function we can measure that relates exposure to pixel values.

So…

So given a radiance entering the camera system we first want to find the exposure or energy density, H, at the camera sensor…

…and then we want to find some function, Wcam, that tells us how each camera converts that to an RGB pixel value, P.
DIGITAL SOLUTION

- Measure camera response function, Wcam

- Convert exposure, H (in W⋅m⁻²⋅s), to pixel value, PRGB

- Do the reverse to find L from PRGB

Conversely, if we have captured an HDRI with a measured camera, we apply the inverse to find the exposure, and hence the radiance that arrived at each photosite. We can then use that to reconstruct the absolute physical values of the lighting in the scene.
IMAGING

H = πtS/(CN²) ⋅ L        PRGB = Wcam(H, λ)

(Imaging Ratio)          (Sensor Response Function)

We call the first part the imaging ratio.




IMAGING
IMAGING RATIO

- H depends on:

‣ t - exposure time in seconds

‣ S - ISO sensitivity

‣ N - T-stop (note T, not f-number)

‣ C - calibration constant (312.5 × scale)

‣ L - radiance entering camera system

H = πtS/(CN²) ⋅ L

The parameters are mostly your standard camera settings…


We use T-stop rather than f-number for aperture, as this is the value we get from cine cameras, and we can then ignore the lens transmission in this formula.
The calibration constant, C, is basically the overall sensitivity scale of the sensor. We start with a baseline value of 312.5 for historical reasons, and then measure
each camera’s sensor relative to that.
Note that we're also ignoring the angle at which light hits the sensor. Basically this means we're ignoring some portion of vignetting (since we're just going to apply it to taste in comp anyway).
IMAGING
https://github.com/wetadigital/physlight/blob/master/physlight_imaging.ipynb

There's a notebook working through an example of this in the physlight repo on the wetadigital GitHub.

Once we’ve got some amount of energy, H, at the sensor, the sensor response function, converts spectral exposure to RGB pixel values.
IMAGING
SENSOR RESPONSE

- Wcam is the spectral sensor response function

‣ Converts spectral focal-plane exposure, H (W⋅m⁻²⋅s), to RGB pixel values

‣ Equivalent to using the CIE Standard Observer, but to Camera RGB space rather than XYZ

Wcam(H, λ) = [⟨r̄, H⟩, ⟨ḡ, H⟩, ⟨b̄, H⟩]

You can think of it like the CIE standard observer functions, except that instead of going to XYZ we go first to camera RGB space (and from there to XYZ, and then to whatever output space we want).
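As a sketch (not Manuka's actual implementation), the weighting above is three inner products of the exposure spectrum against the measured response curves, all assumed to be sampled on the same wavelength grid:

```python
import numpy as np

def w_cam(H, r_bar, g_bar, b_bar, spacing):
    """Integrate spectral exposure H(lambda) against the measured camera
    response curves to get a camera RGB pixel value. All four arrays are
    sampled on the same wavelength grid with the given bin spacing."""
    return spacing * np.array([np.dot(r_bar, H),
                               np.dot(g_bar, H),
                               np.dot(b_bar, H)])
```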
So that just looks like your typical tristimulus weighting. Simple!


MEASUREMENT

But how do we get those weighting functions? How do we go about measuring the spectral sensitivity of a camera that we're interested in?
Double-check how even the illuminance landing on the chart is.


Then take a number of exposures, 1/3 of a stop apart.


Then stitch all the data together into curves.


DESIGN
DESIGN 1: AN "EMISSIVE MACBETH CHART"

- Difficult to build well

‣ LED brightness 30mcd to 8000mcd

‣ Camera doesn’t have that much range

‣ 30mcd: f/16 @ ISO800

‣ 8000mcd: f/22 @ ISO100

‣ Unclear effect of manufacturing spread and ageing on peak wavelength and brightness

‣ Big gap between 505nm and 430nm

[Plot: candidate LED emission spectra, brightness not to scale]
Based on data from Lite-ON data sheets: http://www.liteon.com

Approximately one LED every 10nm from 655nm down to 505nm, but note the gap between 505 and 430nm.
DESIGN
DESIGN 2: A PRISM

SCHOTT SF66 - high-dispersion flint

Lining up the image to wavelength registration is very tricky. There's lots of buckling on one side (one ray every 10nm, 380–720nm).
Problem 1: blurry edges

Very few bits in the data, fixed precision!
Problem 2: uneven spacing
Problem 3: unknown sensitivity

As the sensitivity is still unknown, you don't have features in the image to register to.
DESIGN
DESIGN 3: A "TRANSMISSIVE MACBETH CHART"

- Use narrow-band bandpass filters

[Diagram: 264mm × 66mm filter plate; grey represents cut-through holes; circular holes have a retention lip of 0.5mm]
- Incoming light field is uneven

‣ Slits help compensate
- CamSPECS XL

https://www.image-engineering.de/products/equipment/measurement-devices/588-camspecs-express
- Filters are VERY reflective

‣ This is how they avoid transmitting
- Note spill on chair is just red and green

Excuse the horrid auto-colour correction from the cellphone camera.


- Filters only work accurately for angles < 3°
- Greater angles shift the bandpass region to lower wavelengths

[Plot for near-infrared filter; the "kink" is a polarisation effect]
DESIGN
DESIGN 4: THE LIGHTSABER

[Photo: rig with light source, filter holder, and camera mount]
MEASUREMENT

Using this device you get a series of pictures like these…


…which come from spectral radiance distributions shaped like the plot on the left-hand side.
R = ⟨p, r̄⟩
G = ⟨p, ḡ⟩
B = ⟨p, b̄⟩

Given a pixel, we'll have a readout from the camera that is simply the scalar product of the incoming spectral power distribution, p, and the response of the device: r̄, ḡ, or b̄. We will do this for a bunch of different narrow-band filters, so we'll actually get a family of these readouts.
⟨p, r̄⟩ = ∫ p(λ) r̄(λ) dλ
       = Σ_i p_i r̄_i = (p_1 r̄_1 + … + p_n r̄_n)
       = dot(p, r) * spacing

The scalar product has a few different expressions depending on whether you are in an analytical or a discrete context; in numpy you can use the dot() operator. In discrete numerics it is the process of multiplying two vectors element by element and adding it all together.
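For instance, a tiny numpy check that the explicit sum and the dot() form agree (the SPD and response curve here are made up purely for illustration):

```python
import numpy as np

lam = np.arange(380.0, 721.0, 10.0)        # wavelength bins, 10nm spacing
p = np.exp(-((lam - 550.0) / 60.0) ** 2)   # a made-up SPD
r = np.exp(-((lam - 600.0) / 40.0) ** 2)   # a made-up response curve
spacing = 10.0

explicit = sum(pi * ri for pi, ri in zip(p, r)) * spacing
vectorised = np.dot(p, r) * spacing
assert np.isclose(explicit, vectorised)
```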
R_0 = p_0,0 r̄_0 + … + p_0,n r̄_n
R_1 = p_1,0 r̄_0 + … + p_1,n r̄_n
  ⋮
R_n = p_n,0 r̄_0 + … + p_n,n r̄_n

where p_i,j = p_i(λ_j),  λ_j ∈ {380, …, 720}

So, for example, for the red channel we'd have a few of these dot products, one per filter. The p_ij's are samples of the lighting through each filter at a few wavelength bins; we'll pick as many bins as we have filters. So now we have a linear system with as many unknowns as we have equations.
P = [ p_0,0 … p_0,n
        ⋮   ⋱   ⋮
      p_n,0 … p_n,n ]

R = {R_0, …, R_n}    r̄ = {r̄_0, …, r̄_n}

But if you think about it for a second, you can easily see that if we put our p_ij's into a square matrix, the readouts into a vector, and our unknown r̄ into another vector…
R = P r̄            r̄ = P⁻¹ R
G = P ḡ      ⇒     ḡ = P⁻¹ G
B = P b̄            b̄ = P⁻¹ B

…our linear system reduces to a simple matrix-vector product. All we need to do is invert the square matrix P. Inverting matrices becomes increasingly delicate the more rows they have, so it's good to check whether this operation is well behaved.
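A sketch of the recovery step in numpy; the condition number of P is one quick way to check that the inversion is well behaved:

```python
import numpy as np

def recover_curves(P, R, G, B):
    """Solve P @ r_bar = R (and likewise for G and B) to recover the
    camera's spectral response curves from the filter readouts."""
    print("condition number of P:", np.linalg.cond(P))
    r_bar = np.linalg.solve(P, R)
    g_bar = np.linalg.solve(P, G)
    b_bar = np.linalg.solve(P, B)
    return r_bar, g_bar, b_bar
```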

The plot on the right is P: you can see the matrix is strongly diagonally dominant, which makes it very stable for numerical inversion. Effectively these two plots show the same data; on the right-hand side, each row of pixels is one of the curves from the left.
MEASUREMENT

[Measured spectral response curves: Canon 1D Mk III]
IMAGING
CAMERA RGB

- The imaging function gives us a value in Camera RGB space


- Need to go from here to XYZ, and then to wherever (sRGB, ACES etc.)


So now we’ve taken a spectral radiance and converted it to a pixel value as seen by our measured camera model. The final step is to take this from camera RGB
space to CIE XYZ, from where we can convert to anything we want.
- Can get the matrices from either:

‣ Published data, e.g. dcraw

‣ Manual solve

To do so we'll need a camera RGB to XYZ matrix. Unfortunately camera manufacturers don't publish these, so we can either rely on someone else finding them for us (e.g. dcraw), or derive them ourselves.

- Convert set of SPDs to Camera RGB and to XYZ

‣ e.g. Macbeth chart


- Solve for a matrix to convert from one to the other

[camera RGB] × [3×3 matrix of unknowns] = [XYZ]

Fortunately the process is fairly simple: just convert some set of spectral reflectances to both camera RGB and XYZ, then solve for a 3×3 matrix that converts from one to the other.

‣ Linear least squares is fine

‣ More training data = better?


We’re still experimenting with different combinations of colours and solvers. In general though, a large data set like that in rawtoaces combined with a linear least
squares solver is pretty good.
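A minimal sketch of that solve: with the same training patches expressed in both spaces, one row per patch, numpy's least-squares routine gives the 3×3 matrix directly:

```python
import numpy as np

def solve_rgb_to_xyz(cam_rgb, xyz):
    """cam_rgb, xyz: (n_patches, 3) arrays of the same training colours
    (e.g. Macbeth patches) in Camera RGB and CIE XYZ.
    Returns M such that xyz ~= cam_rgb @ M."""
    M, residuals, rank, sv = np.linalg.lstsq(cam_rgb, xyz, rcond=None)
    return M
```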
IMAGING
WHITE BALANCE

- Can either:

1. Use single matrix and divide by whitepoint in Camera RGB

2. Solve matrix for specified whitepoint at runtime

3. Solve extremes and interpolate the matrices

4. Use Bradford/CAT02
- All valid but consider what you’re matching to

There are many ways of handling white balance. You can either:

- convert the desired white point spectrum to Camera RGB and then divide your colour by that
- derive a new Camera RGB to XYZ matrix at render time for the chosen illuminant
- solve one matrix for warm whites and another for cool whites, and interpolate between them
- use a standard chromatic adaptation method such as Bradford or CAT02

All of these methods are valid, and all of them are used somewhere by some combination of camera and raw conversion software. Really, what matters is matching whatever your practical raw processing pipeline does.
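As an illustration of the first option (a sketch; white_cam_rgb here is assumed to be the chosen white point spectrum pushed through Wcam):

```python
import numpy as np

def white_balance_single_matrix(pixel_cam_rgb, white_cam_rgb, M_rgb_to_xyz):
    """Option 1: divide by the whitepoint in Camera RGB, then apply one
    fixed Camera RGB -> XYZ matrix."""
    balanced = np.asarray(pixel_cam_rgb) / np.asarray(white_cam_rgb)
    return balanced @ M_rgb_to_xyz
```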
IMAGING
https://github.com/wetadigital/physlight/blob/master/physlight_camera_model.ipynb

Again, there's a notebook in the physlight repository showing how to use measured camera data and solve matrices for Camera RGB to XYZ using different methods for white balance.

So now we’ve defined how to take radiance arriving at the camera and turn that into a pixel value as generated by our camera model.
H = πtS/(CN²) ⋅ L        PRGB = Wcam(H, λ)

That just leaves us with how we want to define the light entering the scene.
UNITS
WHY PHOTOMETRIC UNITS?

- Standard in industry

‣ Light meters

‣ Published data for fixtures

‣ How many Watts of visible light does a 100W bulb emit? Not 100W!

‣ Single value for artists to use


We use photometric units for lights as that’s what’s common in the photographic and film industry for talking about brightness. Your light meter works in
photometric units. If you look up data for a fixture it will be in photometric units.
LIGHTS
LIGHT UNITS

- Area Light (rectangle, sphere etc.)

‣ Defined in terms of luminous power, Φv

- Image-based Light (IBL)

‣ Defined in terms of illuminance, Ev

We'll consider area lights and IBLs (environment lights) here. We use lumens for area lights and lux for environment lights. Manuka works in spectral radiance rather than photometric units, so we'll need to convert from one to the other when we render.
LIGHTS
AREA LIGHTS

L↑ = T(λ) ⋅ L̂(λ) ⋅ D(ω)    [W / (m²⋅sr⋅m)]

Exitant radiance from an area light depends on a few factors:


- L̂(λ) - spectral distribution

‣ Illuminant D, e.g. D65

‣ Blackbody

‣ Tabulated data

First, a spectral distribution. Most commonly this will be standard illuminant D65, but we also support blackbody specified by some temperature, and more
recently tabulated spectral data measured from real light sources on set.
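For the blackbody option, the spectral distribution is just Planck's law; a small self-contained sketch:

```python
import numpy as np

def planck_radiance(lam_nm, temperature):
    """Blackbody spectral radiance, W/(m^2*sr*m), for wavelengths in nm
    and temperature in kelvin."""
    h = 6.62607015e-34  # Planck constant, J*s
    c = 2.99792458e8    # speed of light, m/s
    k = 1.380649e-23    # Boltzmann constant, J/K
    lam = lam_nm * 1e-9
    return (2.0 * h * c**2) / (lam**5 * (np.exp(h * c / (lam * k * temperature)) - 1.0))
```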
- T(λ) - tint function

‣ Constant colour

‣ Texture map

A tint function: normally this would be a texture. These simulate gels or an emission pattern on the source, such as an LED array, or an HDRI of a practical fixture.
- D(ω) - angular distribution

‣ Lambertian, D(ω) = 1

‣ Powered cosine, D(ω) = cosᵖ(nl ⋅ ω)

‣ IES profile

…and an angular distribution. We use powered cosine distributions a lot, for focussing area lights for example.
L = Φv ⋅ T(λ) ⋅ L̂(λ) ⋅ D(ω) / ke    [W / (m²⋅sr⋅m)]

- Φv - luminous power in lm

We want the user to be able to specify the lighting in terms of the total output power, so we need to find some factor ke that normalizes the radiometric output based on those parameters and converts it to lumens.
…and ke is just the integral of each term in the numerator. It's trivial to compute this scale factor at render startup, then just multiply it in when sampling the light.
LIGHTS
ENVIRONMENT LIGHT

L = Ev⁺ ⋅ T(ω, λ) ⋅ L̂(λ) / ke    [W / (m²⋅sr⋅m)]

- Ev⁺ - illuminance from upper hemisphere

For an IBL, the illuminance as measured on an upward-facing patch is a more natural and easier-to-use parameter than power. It also neatly corresponds to measuring the illuminance of a real scene using a light meter pointed straight up. Here again we can select an Illuminant D or a measured spectrum for the light, which is multiplied by the tint function; this will almost always be an HDRI panorama captured on set.

To normalize the illuminance from the HDRI image, we precalculate the illuminance from the upper hemisphere and store that in the EXR metadata for easy retrieval later.
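A sketch of that precalculation for a lat-long map of luminance values (cd/m²); the row layout, zenith at the top, is an assumption for illustration rather than Weta's actual convention:

```python
import numpy as np

def upper_hemisphere_illuminance(luminance):
    """Illuminance Ev+ (lux) on an upward-facing patch from a lat-long
    environment map of luminance values. Rows run from theta = 0 (up)
    at the top of the image to theta = pi at the bottom."""
    h, w = luminance.shape
    theta = (np.arange(h) + 0.5) / h * np.pi                    # polar angle per row
    d_omega = np.sin(theta) * (np.pi / h) * (2.0 * np.pi / w)   # solid angle per pixel
    weights = np.clip(np.cos(theta), 0.0, None) * d_omega       # cos term, upper hemisphere only
    return float(np.sum(luminance * weights[:, None]))
```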
L↑ = L̂(λ) ⋅ Y(PRGB) ⋅ CN² / (πtS)

The final piece of the puzzle is knowing how to tie a pixel luminance back to the incoming radiance: that's simply inverting the imaging ratio we saw earlier. Again, we can precalculate this and store it in the image header, to be multiplied in at render time.
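As a sketch, the inversion is just the imaging ratio turned around:

```python
import math

def radiance_scale_from_luminance(Y, t, S, N, C=312.5):
    """Invert the imaging ratio: recover the scene radiance scale from a
    pixel luminance Y, given the capture settings of the HDRI camera."""
    return Y * (C * N * N) / (math.pi * t * S)
```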
So with all that in place, let’s shoot a test.
RESULTS


Here we’ve got a macbeth chart illuminated by a pair of blondes and we’ve shot it with an Alexa LF. We’ve also captured an HDRI from the chart position with a
5D that we’ll use to illuminate the scene.
First of all, we’ll generate correct radiance values from the HDRI using the precalculation we just described and use that to do a render but *without* considering
the camera at all
raw radiance

As you can see that comes out a little bit too bright. To fix that we’ll…
H = πtS/(CN²) ⋅ L

…add in the imaging ratio using the settings from the Alexa…
H = (π ⋅ 0.02 ⋅ 800) / (312.5 ⋅ 11.9²) ⋅ L

with imaging ratio

That brings the pixel values into the correct exposure range, as we'd expect. It's pretty close in brightness, but the colours on the chart are a fair bit off because we're just using the CIE standard observer here; we're not considering the LF's spectral response.
with sensor response

Now if we apply the sensor response function using our measured curves for the Alexa, we get a much closer match.

Here's a crop of the plate focusing on the chart, and here's the matching render with our model. The colours aren't an exact match, but they feel pretty close.

Here's the render just using the imaging ratio. I hope you can see on the video, if I compare it to the plate, that the colours are significantly warmer and more saturated, as this is just using the CIE standard observer to go from spectral to XYZ.

If you were there on the day, this is closer to how you would have seen the chart, because that's what the standard observer is designed to do after all: model human vision. But we're not interested in that; we want to image the scene as the camera saw it.

The effects of metamerism in the sensor are more pronounced using measured spectra for the lights. In this example from Gemini Man, Junior is illuminated by an area light with the HDRI texture you see bottom-left.

The right-hand side of Junior, the grey ball, and each patch is rendered using a D65 spectrum multiplied by the uplifted values from the HDRI texture, while the left-hand side uses the full measured spectrum you see top-left, recorded on set.

With a spiky spectrum like this fluorescent exam light, there is a significant difference in colour rendition that we can capture with the spectral sensor model.

For smooth, incandescent spectra like this Arrimax 300, the two sides are almost identical: uplifting the RGB light map to a spectrum works well here, since the underlying emission spectrum of the light is a similar shape.

Similarly, a slightly spikier spectrum like this HMI is pretty close, as the overall shape of the measured spectrum is still fairly smooth.

Physlight really shines in naturalistic scenes. In this shot from War for the Planet of the Apes, the fortress environment is lit by an HDRI captured on location. Due
to the large exposure difference between the sky and the torchlight, the interior lighting is augmented with hidden fixtures captured from similar lighting setups
on real-world set pieces.

The torches dotted around the interior have their brightness and colour derived from torches captured in the HDRI in a different scene. We then tuned the fire
simulation to give a blackbody emission matching the colours and brightness in the HDRI.
Because we know how the camera responds to physical radiance, we can now use it as a calibration device.

We can then put that same simulation into shots with very different camera and lighting setups and everything behaves correctly…
…without needing to tune the fire shader for each shot.


CONCLUSIONS

- So it works perfectly and we never have to light a grey ball again?


So, everything works perfectly and we can stop shooting ball passes now?
- Almost…
BENEFITS

- When the data is good, it’s great

‣ Easier to share light rigs between shots

‣ Less time setting up, more time beauty lighting


- Consistent way to talk about brightness and colour

‣ “Think like a DOP”

When you've got good data it works very well indeed, and even without good data, working in a consistent set of units gives you a framework for talking and thinking about real-world lighting values, which is hugely beneficial when deciding how to approach a lighting setup.

The things that stop us being able to just match a given shot out of the box are purely practical:
LIMITATIONS

- Can’t always capture a matching HDRI

‣ Sets are busy, sun and clouds move a lot

The biggest one is that it's not always possible to get an HDRI exactly matching your setup. You might not be able to get to the right location, or even capture one at all. Overzealous sparks might move lights while you're not looking, and the sky has a nasty habit of changing continuously. All this means your IBL will often need a little bit of tweaking, so we still have exposure and tint parameters on our lights for that.

- Can’t always get good camera data

‣ Metadata can be patchy

‣ Film still exists


Depending on the camera kit, you may not get a full set of metadata for the camera settings, and manually recorded data is prone to human error and is difficult
to parse automatically. There are as many different ways of recording a T-stop value as there are data wranglers in the world.
And of course some people still do occasionally shoot on film.
STILL TO DO

- Measure, measure, measure

‣ Neutral density filters are not neutral

‣ Neither are lenses

‣ Neither is vignetting…


And there’s still lots we need to measure and refine.


For example, ND filters from the same model and manufacturing batch can vary wildly in their transmission spectra. We need to measure each of them to account
for this.
Lenses have “character” that affects their colour rendition due to their construction and the anti-reflective coatings on each element. Again, we need to figure out
a low-footprint process for measuring the lens kit during a shoot.
The more you drill into this stuff, the less you trust anything.

Here’s a fun one that Erik Winquist found recently: this is a series of brackets on aperture priority in an integrating sphere with the sigma 8mm that I think a lot of
us use for capturing HDRI. Since the camera is adjusting shutter speed to maintain exposure, the EV100 is roughly the same value…

… but there’s over a stop difference between f/22 and f/3.5. I include this as an amusing example of how it often feels like the closer you get to getting a
“correct” answer on something, the more variables you discover that you hadn’t even considered needing to account for.
SUMMARY

So to summarize…

- Specify lighting in photometric units

‣ We should all do this!

Specifying lighting in physical, photometric units is great; we should all be doing this.

- Spectral camera model is great

‣ But if you’re stuck with RGB can just use imaging ratio

The spectral camera model can also give compelling results and capture effects of metamerism that a simpler model can't… but if you're stuck with an RGB renderer, you can just use the imaging ratio part in order to work with physical units.
LINKS

- https://github.com/wetadigital/physlight

‣ Notebooks, slides
- https://github.com/mmp/pbrt-v4

‣ Camera sensor model implemented in PBRT v4


- https://github.com/anderslanglands/pbrt-v4

‣ Fork with full implementation

The Python notebooks with examples of the imaging function and camera model are available on the wetadigital GitHub, alongside these slides.

Also, the upcoming version 4 of PBRT includes an implementation of our camera model in the base renderer, and I have a fork on my GitHub with an implementation of everything we've talked about here, so you can see how simple it is to add it to an existing renderer.
PHYSLIGHT
ANDERS LANGLANDS & LUCA FASCIONE
WETA DIGITAL
