The Sensor Post
In 2024 I was generously lent a pre-production version of the SL3-S by Leica to test and give feedback on before release. I’ve been using Small Form Factor Cameras™ for different purposes ever since the HDSLR Revolution™, but they were always lacking one thing or the other. What could the Leica bring to the table? Could it be the Swiss Army knife I had always been looking for? What first caught my eye was the onboard Apple ProRes recording (which was also offered on the SL2-S, several Panasonic models, and probably others). I had been carrying the excellent Sony A7-series cameras for years, but they were still limited to the compressed H.264 and H.265 formats. The beta test perfectly coincided with a new feature film I was prepping to shoot in the fall of 2024, a movie (now) called “The Last Resort” by director Maria Sødahl, so I brought the Leica with me to try and find a good use case.
What’s the difference?
When I say I’ve been using what are essentially prosumer mirrorless hybrids, I should add that I’ve never actually shot a full feature film or a drama series on any of them. They’ve been tools in my kit: for pre-production, for B-roll, for 2nd unit work or small splinter unit assignments, and so on. As exciting as these cameras are, they are also limited in a few ways that make most cinematographers prefer higher end systems. Most of my digital work has been shot on ARRI Alexa cameras, sprinkled with some Sony Venice and RED.
One of the reasons why productions rely on sturdy high end cinema cameras by default. From the feature film “The Snow Sister” for Netflix.
The high end represents a legacy of reliability in all kinds of conditions. These are robust cinema making machines, built to endure humidity and temperature extremes as well as any computer can. They offer compatibility with a wide range of accessories and established workflows in the camera department, and feature well documented pipelines into editing and post production. In terms of size they are mostly small enough to allow comfortable handheld work, but big enough to avoid the jitters that smaller cameras suffer from, which is actually a big reason to want a larger camera for handheld work. They have standardized lens mounts for access to premium cinema glass, and sensors that to this day feature industry leading latitude.
It’s important to keep this in mind, as there are otherwise many appealing aspects of the smaller siblings. They are obviously cheaper, but also offer a smaller footprint for productions where agility is an advantage. There are few directors who wouldn’t want to work with smaller crews and lighter logistics. And lately we have seen many high end productions embrace low end systems for artistic or other reasons, for example “28 Years Later” by director Danny Boyle and cinematographer Anthony Dod Mantle, shot on iPhones. The term “better” is often used about cameras, but I guess the real question is “better, for what?”.
A fully rigged ARRI Alexa 35 used to shoot “The Last Resort”.
A barebones Sony A7C II used to shoot “Fata Morgana”.
Anyway, the ARRI Alexa 35 is the current king of the hill among the high end cameras, with a claimed 17 stops of dynamic range on a 4.6K chip at up to 120 frames per second (660 on the newly released Xtreme). And the famously cinematic ARRI colors.
This is a picture from the ARRI website demonstrating how their Reveal color science excels at representing extreme colors in a pleasing and cinematic way that feels true to reality.
ARRI has always touted their color science as being the best there is. And they’re not wrong. But they’re also not the only ones doing “color science” in the process of making a movie. Which led me down a different rabbit hole as I was pondering how to get some good use out of the Leica SL3-S I was borrowing for our movie.
Any digital sensor is basically built the same way (yes and no, but let’s keep it simple). Each photo site on the sensor captures a certain range of light intensities in either red, green or blue. Sensors don’t actually see color; each photo site is filtered to capture greyscale values within a specific color channel, arranged in 2x2 blocks of one red, two green and one blue site. ARRI then apply their custom made color science to interpolate full RGB values for each pixel, and map these values to the specific colors and brightness you see on screen. Their magic sauce is the math that calculates what the color of each pixel should be, based on the combined information from neighboring photo sites. But the process is more or less the same for all modern cameras.
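To make that interpolation step concrete, here is a minimal sketch in Python of the idea, assuming the common RGGB Bayer layout and plain bilinear averaging. All names are mine, and real in-camera debayering (ARRI’s included) is far more sophisticated than this:

```python
import numpy as np

def demosaic_bilinear(raw):
    """Toy demosaic of an RGGB Bayer mosaic: H x W greyscale photo-site
    values in, H x W x 3 RGB image out."""
    h, w = raw.shape
    # Mark which photo sites carry which color (RGGB layout: red and
    # blue on alternating rows, green filling the rest).
    r = np.zeros((h, w), dtype=bool); r[0::2, 0::2] = True
    b = np.zeros((h, w), dtype=bool); b[1::2, 1::2] = True
    g = ~(r | b)

    def fill(mask):
        # Estimate the channel everywhere by averaging the known
        # samples of that channel within each pixel's 3x3 neighborhood.
        vals = np.pad(np.where(mask, raw, 0.0), 1)
        counts = np.pad(mask.astype(float), 1)
        num = sum(vals[i:i + h, j:j + w] for i in range(3) for j in range(3))
        den = sum(counts[i:i + h, j:j + w] for i in range(3) for j in range(3))
        estimate = num / np.maximum(den, 1e-9)
        return np.where(mask, raw, estimate)  # keep measured values as-is

    return np.dstack([fill(r), fill(g), fill(b)])
```

Everything we call “color science” happens on top of values like these: the matrices that take camera RGB into a wide gamut, and the tone mapping down to what you see on screen.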
Introducing: The Colorist
For most narrative filmmaking, there is also a third process, which layers a creative interpretation of color and contrast on top of the camera color science.
Nurali Kushkov working on the creative coloring of our movie “The Last Resort”.
The creative coloring of a movie mainly happens at the very end of post production, in a collaborative effort between three artists: me, the director and our colorist. This is an essential part of crafting an image that resonates emotionally and tells the story. The secret sauce ARRI applies to convert greyscale light intensities into color is an important intermediary step, and when it is well made, it can make a difference for how easily we’re able to portray certain colors. But we have so many other tools at our disposal in the coloring suite that almost anything can be fixed, given time and a skillful colorist. If you watch a movie shot on ARRI, you’re not watching ARRI color science, you’re watching the creative color output of the colorist, cinematographer and director, built on top of the color science.
The process begins already in pre-production, when we make a first draft of the look. Because it affects the choices made on set, and by other departments, we want to display this preview on our monitors while we are filming. We also want to embed this vision into our offline editing files, so the director and editor see something as close as possible to a final look while cutting the movie.
The file containing the data about how to shift luminance and color is called a LUT (Look-Up Table), and is custom made for whatever camera system we are shooting on.
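Under the hood, a 3D LUT is just a lattice of output colors indexed by input color, and everything in between is interpolated. Here is a minimal sketch of the idea, assuming an N×N×N lattice and trilinear interpolation (grading tools typically use this or a tetrahedral variant); the function name and sanity check are mine:

```python
import numpy as np

def apply_lut_3d(rgb, lut):
    """Apply a 3D LUT by trilinear interpolation.

    rgb: (..., 3) array of colors in [0, 1].
    lut: (N, N, N, 3) lattice where lut[r, g, b] holds the output color
         for the input (r, g, b) / (N - 1). A .cube file is essentially
         this lattice written out as text.
    """
    n = lut.shape[0]
    idx = np.clip(rgb, 0.0, 1.0) * (n - 1)  # continuous lattice coords
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    frac = idx - lo                          # position inside the cell

    out = np.zeros(np.shape(rgb))
    for bits in range(8):  # blend the 8 corners of the enclosing cell
        weight = np.ones(idx.shape[:-1])
        corner = []
        for c in range(3):
            use_hi = (bits >> c) & 1
            corner.append((hi if use_hi else lo)[..., c])
            weight = weight * (frac[..., c] if use_hi else 1.0 - frac[..., c])
        out += weight[..., None] * lut[tuple(corner)]
    return out

# Sanity check: a 33-point identity LUT leaves colors untouched.
grid = np.linspace(0.0, 1.0, 33)
identity = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(apply_lut_3d(np.array([0.2, 0.5, 0.8]), identity))  # -> [0.2 0.5 0.8]
```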
Making a LUT
And this is where my ramblings about color science and the loan of a pre-production Leica SL3-S finally converge. Because there is an eternal conundrum in creating said LUT, one I hadn’t solved on my previous projects.
When we are creating the final color of the movie, we are seated in a projection suite with the actual edited movie on the screen, and then manipulate the colors in real time using for example DaVinci Resolve or Baselight. It’s an amazing process where the film comes to life in front of our eyes. But when we make the LUT, we haven’t shot the film yet. So what material are we going to grade to create the look?
This is one of several test shots we used when crafting a LUT for the movie “Narvik”. It contains costumes and a color chart, and a high contrast scenario, shot at the rental house with the correct camera and lens combination.
For the material to be relevant, it should be shot on the camera and lens we have selected for the project. We need to correctly judge the unique color cast created by the glass and the interpretation of color from the sensor. And we’d want to shoot in environments that are relevant for the film, preferably with the skin tones of our actors and the colors of the clothes they are going to wear. Usually most of these things are either not accessible, too far away, not decided or built yet, or demanding too much in terms of resources like crew and extra equipment.
These might be trivial problems for productions with the right amount of resources, but in a small industry like the Norwegian one, even high budget movies struggle to make this process as relevant as I’d wish.
We do have test shoots, but they are usually a very technical exercise. Art department will want to try out color samples, fabrics, practical lights, and so on. Costume and make-up have similar needs to try out samples or techniques and see how they appear in camera. I’ll have optical filtration I want to test, lighting techniques, maybe even specific colors I need to treat in a very particular way, like I detailed in my blog post on the LUT-colored coat in the movie “Narvik”. Sometimes we’ll have access to a few of the actors, but we’re stuck in Oslo in the fall, while the movie will be shot in the winter in Northern Norway. Or the summer in the Canary Islands.
I’m not saying that fabric samples or color charts are not useful, but if that’s all you’ve got to lean on, you’re not crafting color from emotion or even the reality you’ll be shooting in. And what choices you are able to make will be severely limited by that.
As I was pondering this process for the movie “The Last Resort”, I was also traveling with the heads of departments to scout and recce potential locations in Fuerteventura. I got to see spectacular places in amazing light, but did not have access to our ARRI Alexa 35 with the Cooke Anamorphics we’d decided to shoot on. I wished I could bring these experiences into my LUT sessions, but acquiring the material would require a lightweight intermediate solution I could carry in my backpack.
An idea started forming in my head.
The Leica shoots Apple ProRes 422 HQ, which is quite a robust codec carrying a lot of color data. And the camera is small enough to live in my backpack. Reduced to its very basics, it’s a sensor with photo sites that capture light intensities in either red, green or blue. Just like the ARRI Alexa 35, but with slightly less dynamic range.
What if I could add a layer of “ARRI color science” on top of the log material from the Leica, mapping the colors to mimic the Alexa 35 instead of the color science Leica provides? Maybe even mimic what the Alexa 35 looks like with a specific lens on? In theory I could then shoot material from relevant locations and in real light conditions on the Leica, and use that material to grade the creative LUT I would use for principal photography on the Alexa 35. Because the LUT would be created on top of what might as well be an actual Alexa 35 (minus a bit of latitude), it should be the same as if I’d brought the Alexa 35 and my camera crew on the location scouts.
I reached out to three colorists I’ve previously collaborated with to test my theory. The question I asked was this: Imagine I set up an ARRI Alexa 35 with a Cooke Anamorphic /i S35 lens next to a Leica SL3-S with a Leica Summicron SL 35mm f/2 APO lens. And then shot the exact same scenario from the same angle, with some color charts and grey scales, some highlights and shadows, and some skin tones. A technical test shot in a controlled environment. Could we use the color charts to map the Leica SL3-S colors to the exact hues of the ARRI Alexa 35 with the Cooke lens? Because we were shooting on the Cooke, I wanted to map the Leica to look as if it were an Alexa with a Cooke lens mounted. Of course, we could never match the distortions of a Cooke Anamorphic, but the goal was to emulate whatever color cast was embedded in the glass in addition to the ARRI color science.
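The simplest version of what I was asking for can actually be written down in a few lines: sample the chart patches from both cameras in a linearized state, then solve for the 3x3 matrix that maps one set of patch values onto the other. This is only a hedged sketch of the principle, with made-up numbers standing in for real patch readings; the colorists’ actual tools go far beyond a single matrix:

```python
import numpy as np

def fit_match_matrix(source_patches, target_patches):
    """Least-squares 3x3 matrix M such that source @ M.T ~ target.

    Both inputs are (num_patches, 3) arrays of linearized RGB values
    sampled from the same chart, shot side by side on both cameras.
    """
    M, *_ = np.linalg.lstsq(source_patches, target_patches, rcond=None)
    return M.T

# Hypothetical stand-in data: 24 patches as the Alexa renders them, and
# the same patches seen through a slightly different "Leica" response.
rng = np.random.default_rng(7)
alexa = rng.uniform(0.02, 0.9, size=(24, 3))
skew = np.array([[1.08, -0.04, 0.00],
                 [0.03,  0.96, 0.02],
                 [0.00, -0.03, 1.05]])
leica = alexa @ skew.T

M = fit_match_matrix(leica, alexa)
print(np.abs(leica @ M.T - alexa).max())  # ~1e-15: the patches now agree
```

A single matrix like this only holds as far as the two cameras’ responses are linearly related; the nonlinear hue twists left over are exactly why the colorists reached for more surgical tools, as described below.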
Christian Wieberg-Nielsen is senior colorist at Storyline Studios in Norway. Didrik Bråthen is senior colorist at TMLS in Norway. And Nurali Kushkov is senior colorist at Chemistry in Denmark. Nurali was the actual colorist for the movie I was working on, but I wanted a few different perspectives on the problem. They all agreed it should work in theory, and agreed to try it out if I could send them the material. In addition to my first suggestion of color charts, grey scales and skin, they requested some over- and underexposed side-by-sides, to showcase where the Leica started clipping, and to see how colors behaved in the highlights and in the shadows. My three expert colorist friends attacked the problem in three different ways, and very soon came back with some exciting results.
In modern colour-managed workflows, like DaVinci Resolve’s RCM, Baselight’s TrueCAM or the vendor-agnostic Academy Color Encoding System (ACES), we bring each camera into a common scene-referred, wide-gamut working space using the appropriate input colour space transform (ACES calls these IDTs). These input transforms are built from camera characterisation data (measured spectral sensitivities, published colorimetry, log curves). If the input transform is well made, images from different cameras will look similar in colour grading software.
Well-made input transforms are a good starting point for matching cameras.
But even if we use input transforms, images from different cameras will perceptually differ for several reasons. Generic input transforms don’t always perfectly represent a given camera’s RAW or in-camera encoding. Different models that share the same recording colour space can decode differently with the same input transform.
For example, using the same S-Log3/S-Gamut3.Cine input transform on both a Sony VENICE 2 and a Sony A7S III will not, in my experience, produce a good enough match. Even two units of the same model won’t be identical: sensor age, manufacturing tolerances, and internal processing all contribute. That said, two high-end cinema bodies of the same model (e.g., ARRI ALEXA, Sony VENICE) generally track each other better than lower-priced lines.
When I approach camera matching, I first bring each source through the proper input transform into a scene-linear, wide-gamut working space. There I normalise exposure and white balance and apply flare compensation with the Base Grade. When matching lower-end SLR bodies whose log decodes arrive with a non-matching contrast ratio or low-end misalignment under the input transform, I’ll use the curve tool to set the white and black points first. Then I use the X-Grade tool to address the residual hue and saturation differences. The principle is to use broad, robust tools, so the matching transform holds under a wide variety of scene illuminants and exposures (day, night, coloured LEDs, etc.). Any issues from the colour matching will be amplified later during look development and the final grade.
Another consideration is sharpness. Many smaller/consumer bodies ship with stronger in-camera sharpening than higher end cameras. In my experience you generally want to lower the sharpness setting on the lower end camera to get a good match in the end, even if post production tools can now mitigate this somewhat.
— Christian Wieberg-Nielsen
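To illustrate the first step Christian describes, normalising exposure and white balance in scene-linear space, here is a hedged sketch. The function name and the grey-card shortcut are my simplification; Baselight’s Base Grade does considerably more:

```python
import numpy as np

def normalise_to_grey(linear_rgb, grey_patch, ref_grey=0.18):
    """Scale each channel so a known grey-card patch lands on neutral
    18% grey, fixing overall exposure and white balance in one step.

    linear_rgb: (..., 3) scene-linear image, after the input transform.
    grey_patch: (3,) mean RGB sampled from the grey card in that image.
    """
    gains = ref_grey / np.asarray(grey_patch)
    return linear_rgb * gains

# Hypothetical reading: the grey card comes back slightly bright and warm.
grey_as_shot = np.array([0.23, 0.21, 0.19])
print(normalise_to_grey(grey_as_shot, grey_as_shot))  # -> [0.18 0.18 0.18]
```

Only once both cameras agree on neutral like this does it make sense to chase the remaining hue differences with broader tools.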
Christian would use the X-grade tool and curves in Baselight, Didrik would utilize MatchGrade in Nuke, while Nurali did a manual match using his own tools developed for DaVinci Resolve. They all came to the same conclusion: The colors could become almost indistinguishable for most purposes, but there was a noticeable difference in dynamic range. I first did a session with Christian and later with Didrik to see their process live and discuss the implications of this.
Quick flowchart of how the LUT process worked on “The Last Resort” with Nurali Kushkov coloring the movie. Featuring my excellent 1st AC Kjetil Fodnes as model.
I then had Nurali create an actual LUT file for the Leica SL3-S that would transform it into a (sort of) mini ARRI camera. With the ARRI-mimicking LUT loaded in the Leica, I could point my lens at the world and see it as if I were carrying the Alexa 35 with the Cooke lens mounted, and it was quite marvelous.
I brought the Leica to scout locations in Fuerteventura, Spain, where we would end up shooting the movie a few months later. I used it both to take pictures of locations as part of the scouting process and to shoot video using the ARRI-mimicking LUT. I would film landscapes, towns, houses, interiors and exteriors, people, sunsets, daylight, how the sun would shine into buildings, how the sun would bounce off the brightly colored sand dunes, nights, practical lights, etc. I ended up with a 30 minute timeline of little snippets from the real environments of our movie.
Back in Oslo, I also shot the usual technical tests, trying out different optical filtration and exposures on the Alexa 35, as well as color charts. But because the movie was produced in Denmark, I could once again utilize the Leica to capture costume tests in Copenhagen, to see how the colors of the textiles would play with our color mix, and to test make-up on the real actors who were there for rehearsals. This material was all brought into the coloring suite where Nurali was patiently waiting.
Using in-house know-how and our own toolset allowed us to get a very close match while keeping the transformation delicate and smooth. I left some minor misalignment in favor of smoothness, since I knew this would only serve as an intermediate preview.
I approached this task with a simple mindset:
I was actually building a show look for the Alexa 35, using the Leica as a temporary stand-in to give a first impression of what the look would be once principal photography started. Think of it as a very good preview. The transform we built aligned the color reproduction between the two cameras across the most useful exposure levels with fairly high accuracy. There were some constraints during the capture phase, because of limited time and resources, but for our purposes it worked fine.
Later, I made minor adjustments to specific colors while reviewing actual captured material. Overall, this process proved to be a valid way of getting a fair representation of the desired look on the go, while location scouting.
I think it’s impressive how far modern stills cameras have come, and it certainly opens up new possibilities for creativity.
— Nurali Kushkov
Together with director Maria Sødahl we would sit in Copenhagen and craft our project specific “color science” using the Fuerteventura material and the technical tests from Oslo and Copenhagen. The ProRes from the Leica in L-Log was loaded into Resolve and transformed using the ARRI-mimicking LUT to look as if it was shot on the Alexa. We then worked creatively with the material as if I had brought the real thing to Spain and Copenhagen. I was looking at actual skin tones in the low Fuerteventura afternoon sunlight. We were judging the saturation of the sand dunes. The greens of the palm trees. The red and yellow colors on our action vehicles. We could finesse our look based on a whole frame that might as well have been a scene from our movie. But I was also constantly cross-referencing with color charts and technical tests shot on the Alexa 35 with Cooke Anamorphics in Norway. When we felt satisfied, we stripped away the layer of Leica-to-ARRI color mapping and exported the remaining corrections as an ARRI camera look file (using their proprietary ALF format). This would load straight into our ARRI Alexa 35 production camera and give us the exact colors we had crafted in the grading suite in Copenhagen.
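The reason this stripping step works is that the transforms compose cleanly: the creative look we graded only ever saw LogC-encoded input, so removing the Leica-to-ARRI stage leaves a transform that applies directly to real Alexa footage. A sketch of the bookkeeping, where every function is a made-up placeholder for the LUTs Nurali actually built:

```python
import numpy as np

def leica_to_arri(llog):
    """Stand-in for the mapping from Leica L-Log into Alexa LogC."""
    return np.clip(llog * 1.04 - 0.01, 0.0, 1.0)

def creative_look(logc):
    """Stand-in for the graded show look: LogC in, display image out."""
    return np.clip(logc ** 0.9 * 1.1 - 0.05, 0.0, 1.0)

def preview_scout_footage(leica_llog):
    # In the suite, the scout material was always viewed through both
    # stages stacked: map to "Alexa" first, then apply the look.
    return creative_look(leica_to_arri(leica_llog))

def preview_production_footage(alexa_logc):
    # Because creative_look was built strictly on top of LogC, it can be
    # exported on its own (in our case as an ARRI look file) and loaded
    # straight into the Alexa 35.
    return creative_look(alexa_logc)
```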
These are a few frame grabs from the Fuerteventura material we used for the production LUT, with the LUT applied. Originally shot on the Leica SL3-S.
Bottom line
The implications of this are wider than the specific use case on “The Last Resort”. I feel like sensors are increasingly getting to a point where we can almost disregard the brand as a dominant component in the aesthetic of the final image. When everything was shot on analog film, we didn’t choose an ARRI or an Aaton because they did anything to the look of the image. The lens and the film stock affected the image; the metal in between was just a mechanical device fusing light with chemistry. In the digital world, the film stock might as well be the work of the dynamic trio in the grading suite rather than the camera manufacturer, because the general quality of sensors and debayering mathematics is at such a high level. The look is something we shape independently through optical tools (for example lenses and filters) and through digital manipulation (LUTs, textures, and so on). With high end codecs becoming available in the prosumer lineups, like the new Nikon Zr with R3D recorded in body, and Panasonic offering ARRI LogC3 as a purchasable license on their LUMIX cameras, this future is now.
I hope camera manufacturers will see this as an opportunity for new types of innovation, like form factor or functions that could make a camera suitable for very specific tasks. I feel like the DJI Ronin 4D is an example of this, fusing a camera with a gimbal to create a stabilized handheld device. Maybe we’ll see high quality sensors in smaller formats, like s16? Aaton once created a digital camera that physically moved the sensor half a pixel between each frame to subtly create an analog feeling to the digital image. They sadly went bankrupt shortly after, but these are the kinds of ideas I’d be excited to see manufacturers explore in the future.
TL;DR
Sensors are now extremely high quality across the board. With high end codecs also appearing in low end devices, cameras become almost interchangeable. I’ve explored using one camera to mimic another for the purpose of gathering material for grading a project LUT. But you could also use these techniques to successfully blend manufacturers and formats in a project shot on several cameras. In the end, I’m arguing that this is a form of creative freedom filmmakers can utilize, and that we could start discussing cameras differently when the brand is not so much about the look anymore.