# Too much dynamic range?



## nightbreath (Nov 20, 2012)

An interesting thought came to me before I went to bed. Below you'll find an assumption that came suddenly to my head, so please don't take it too seriously.

So... Let's assume there are two cameras with similar color-tone reproduction, but different abilities to capture lightness levels. For example:
- sensor of camera A has 12 stops of DR, 16 billion tones it can distinguish
- sensor of camera B has 10 stops of DR, 16 billion tones it can distinguish

Shooting a flat scene (i.e. a low-DR scene of, say, 8 stops) with both sensors, we'll then edit both images in post to restore the missing contrast. So we need to add:
- 4 stops for 12-stop camera
- 2 stops for 10-stop camera

So my point is: with the lower-DR camera we'll have a lower tone delta (the difference between the initial color tone in the scene and the tone reproduced by the sensor) when processing the low-DR shot made with the lower-DR sensor. That happens because fewer modifications are made to the file to achieve the required result.

What do you guys think about that?


----------



## Hector1970 (Nov 20, 2012)

I was a bit bamboozled by the question.
Having read it, my thinking is: the camera with the higher dynamic range could have gone two more stops if the dynamic range of the scene were wider, but since that's not required, the extra dynamic range is redundant, so the two cameras are equal for that scene.
The conclusion being: if you own the lower-dynamic-range camera, don't shoot high-contrast scenes, or else you should have bought the camera with the higher dynamic range (which is probably more expensive). This would lead to severe buyer's regret and a need to sell the purchased camera at a loss. This would be sad.


----------



## Edwin Herdman (Nov 21, 2012)

nightbreath said:


> So my point is: with the lower-DR camera we'll have a lower tone delta (the difference between the initial color tone in the scene and the tone reproduced by the sensor) when processing the low-DR shot made with the lower-DR sensor.


I understand the question.

It's similar to the Adobe RGB versus sRGB space question - at least to hear Ken Rockwell tell it, aRGB sacrifices tonal gradations for gamut. sRGB should allow for finer gradations in color change.

I think that before sensors will reach the limit of the current color space in terms of usable bit depth, the recorded bit depth will be increased. If I am up to date with my reading, the analog-digital converter (ADC) is often claimed to be the stumbling block in this, but it is probably really the sensors themselves. The ADC has to provide more precision than the original sensor capture data in order to preserve quality, so the actual bit depth ends up being a bit arbitrary (a best tradeoff).


----------



## Policar (Nov 21, 2012)

It's a valid concern, but someone did some posterization tests and found that noise is still the limit by a very big margin. Even 8 bit JPEGs are good enough for most purposes.

I have worked with Alexa footage (14 stops DR quoted, but in practice it feels like dramatically more than any dSLR) and it's compressed into a 10 bit wrapper... almost no posterization no matter how you grade it. I wouldn't worry.


----------



## tim (Nov 21, 2012)

nightbreath said:


> So my point is: with the lower-DR camera we'll have a lower tone delta



It doesn't work like that. Suppose you have a sensor with a dynamic range of 10 stops. That's (roughly) equivalent to saying that the noise is 1 part in 1000 of the maximum signal (2^10 = 1024), so you can only distinguish 1000 different shades. Even if you put a 16-bit ADC on the sensor, you're still limited to 1000 shades.

Now I've glossed over a lot of things in the paragraph above. For one thing noise is not constant over the tonal range: it gets larger at higher signal amplitudes, so in the example you'll actually be able to distinguish less than 1000 shades.

There's also a reason why you might use an ADC with a larger dynamic range than your sensor. If you used a 10-bit ADC on a sensor capable of distinguishing 1000 shades, every shade in a scene would be assigned a definite value in the photograph. Subtle gradients in the scene would be rendered as stepwise increments in the photograph, and under heavy processing this could become visible as a posterisation-type effect. But with a higher bit-depth ADC the steps would be blurred out by the electronics noise so that they are not visible. There's no extra information in the photo, but it looks nicer.
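tim's last point is essentially dithering: noise larger than the quantization step blurs hard ADC steps into smooth averages. A toy sketch of this (arbitrary units, not tied to any real sensor):

```python
import random
import statistics

def quantize(signal, step):
    """Ideal ADC: snap the signal to the nearest quantization step."""
    return round(signal / step) * step

def average_reading(signal, step, noise_sd, n=20000):
    """Average many quantized readings of the same signal, each with
    Gaussian electronics noise added before the ADC."""
    random.seed(1)
    return statistics.mean(
        quantize(signal + random.gauss(0, noise_sd), step) for _ in range(n)
    )

STEP = 10.0          # coarse ADC step
TRUE_SIGNAL = 503.0  # sits between two ADC codes (500 and 510)

clean = quantize(TRUE_SIGNAL, STEP)                      # always snaps to 500
dithered = average_reading(TRUE_SIGNAL, STEP, noise_sd=10.0)

print(clean)     # 500.0 -- the step is visible, i.e. posterized
print(dithered)  # close to 503 -- noise blurs the step away
```

As tim says, there's no extra information in any single noisy reading; the noise just prevents the quantization steps from lining up into visible bands.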


----------



## helpful (Nov 21, 2012)

This question is totally relevant, and mathematically justified. A camera's sensor at the hardware level is analog. If it records a total dynamic range of 14 stops (let's say) and this data is converted to a 14-bit digital signal, then one could say that 2^14 = 16,384 tones are being recorded per channel, ignoring noise.

If the camera's sensor records a total dynamic range of 10 stops (let's say), and this data is also converted to a 14-bit digital signal, then there are still 16,384 tones being recorded per color channel, ignoring noise.

However, those tones are representing a range of bright to dark which has 4 stops (16 times) less variation in tone. Ignoring noise, tones are recorded with 4 stops (16 times) as much sensitivity to tiny shifts in color/contrast. This is true only for tones within that limit of 10 stops of dynamic range. Tones outside that range are lost, which is the drawback.

A camera sensor X could be designed with the same amount of noise as a camera sensor Y, but a lower dynamic range for X and a higher dynamic range for Y. If the signal were accurately converted to the same 14-bit RAW digital output, then under the given assumption that both sensors had the same amount of noise, then there would always be better gradation between tones in the output from camera sensor X, but better resistance to blown highlights / lost shadows in the output from camera sensor Y.

Nothing in the world can increase without a trade-off, and that is definitely true for dynamic range as well. If all other factors are held constant, increasing dynamic range has the drawback of decreasing gradation in tone. In the limiting case of infinite dynamic range, all levels of signal would be rendered as a single flat tone, just as a delta "spike" function in Fourier analysis corresponds to a signal of all wavelengths.


----------



## tim (Nov 21, 2012)

No really, it doesn't work like that. The sensor noise determines both the dynamic range *and* the number of tones which can be distinguished. They're inextricably linked. If you want better accuracy of tones you need to reduce the noise, which automatically increases the dynamic range.


----------



## tim (Nov 21, 2012)

Here's a simplified example. Suppose you have two sensors:

Sensor-A gives a signal between 0 and 1000mV, and has a noise of 10mV.
Sensor-B gives a signal between 0 and 1000mV, and has a noise of 1mV.

Sensor-A can just distinguish signals corresponding to 500mV and 510mV, but it cannot distinguish signals corresponding to 500mV and 501mV. In other words Sensor-A can distinguish 100 levels. And the ratio between the maximum and (average) minimum signals is 100, which is the same as saying it has a dynamic range of 100 (= 6.6 stops).

Sensor-B meanwhile can distinguish 1000 levels and has a dynamic range of 1000 (= 10 stops).

The noise determines _both_ the dynamic range _and_ the precision of the intermediate gradations.
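One way to see that the two figures are the same quantity is to compute both from the same signal-to-noise ratio. A quick sketch using tim's numbers:

```python
import math

def sensor_stats(max_signal_mv, noise_mv):
    """Distinguishable levels and dynamic range both follow from the
    ratio of the full-scale signal to the noise."""
    levels = max_signal_mv / noise_mv   # distinguishable shades
    stops = math.log2(levels)           # dynamic range in stops
    return levels, stops

# tim's two hypothetical sensors
print(sensor_stats(1000, 10))  # Sensor-A: (100.0, ~6.6 stops)
print(sensor_stats(1000, 1))   # Sensor-B: (1000.0, ~10 stops)
```

Reducing the noise is the only knob: it raises the shade count and the stop count together.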


----------



## Sporgon (Nov 21, 2012)

Interesting to read what helpful has to say. Is this why the tonal gradation of the 5D MkIII and 1D X is so good? Many pictures from these cameras have a "film"-like quality, which looks to me like the way the chip handles gradation into highlights and shadows. Photographers I know who have the MkIII have really noticed this: it is superior to the MkII. That's why I'm so surprised reading the criticisms directed at the MkIII/1D X vs. the D800. From what I have seen so far, the D800 does not have this film-like quality, and surely the best of film is the holy grail of digital?


----------



## jukka (Nov 21, 2012)

Show me one picture each taken with the 5D Mk2 and Mk3 and the difference you are talking about (I have them).
Show me one picture each from the 5D Mk3 and D800 and the difference you are talking about (I also have a D800).


----------



## jukka (Nov 21, 2012)

tim said:


> No really, it doesn't work like that. The sensor noise determines both the dynamic range *and* the number of tones which can be distinguished. They're inextricably linked. If you want better accuracy of tones you need to reduce the noise, which automatically increases the dynamic range.



And this is Canon's big problem with their old readout circuits: the readout noise of the 5D Mk3 (as one example) is 12 times higher than the D800's at base ISO, so the 5D Mk3 has about 11 stops of DR and the Nikon 14 stops.


----------



## nightbreath (Nov 21, 2012)

I didn't want to focus only on DR (possible lightness levels) in my initial post; my main idea was to draw attention to a sensor's color-tone reproduction abilities.

When there are X stops of DR available, that is the lightness range captured, i.e. no matter whether each individual pixel has a purple / yellow / gray tone. So the idea of the topic is to ask everyone whether "there's life outside DR".

As I see it: DR is longitude (or the Y coordinate), color tone is latitude (or X in a two-dimensional representation).

The more gradation steps there are in each dimension the better a sensor is.

So the questions are:
- Should we look at both dimensions instead of referencing DR only?
- Does the initial question about contrast improvement hold? (contrast amplification affects lightness levels and color tones at the same time, so the initial colors can be thrown away when editing an image)
- Why is there a difference in the color-tone reproduction of the 1D vs. 5D lines? What are we missing?
- Is the noise floor really so important, or are there other things that take part in the game?


----------



## nightbreath (Nov 21, 2012)

I'm waiting for someone smart to chime in and explain whether my initial thought can affect real-life shooting, or whether there's something important we don't pay much attention to. One of the things that may be related to the topic, and was confirmed by several photographers, is:



nightbreath said:


> difference in color tones reproduction of 1D vs. 5D lines


Even if you take the 1D Mark IV, its pixels are better suited to color modification than those from the 5D Mark III. A similar thing is mentioned in a 1D X review here:



> One odd thing I’ve noticed before, now definitely in this test- there’s a big difference in how the 5D series and 1D series interprets shadows. The 5D2 notoriously expresses shadow detail with a purple hue and the 5D3 shows this trend still continues; whereas the 1D4 has to be pushed to its limits before the purple shows up in the shadows. I thought this was mostly due to the crop sensor excluding the lens edges, but the 1Dx- full frame- follows the trend of the 1D4 by not degrading shadows with purple hues. I’ve asked a Canon rep why this is and he’s forwarded it on up to a uber-geeky tech, so hopefully we’ll get an explanation about this. It certainly stands to reason that color-integrity in the shadows is a perk of paying for the higher model, but I’d still like to know what’s the difference. I’ll let you know if we get an answer.


----------



## jukka (Nov 21, 2012)

The 1D series has a more expensive electronic chain, better shielding etc., and is probably also better matched in terms of RGB.
The old 1Ds Mk3 has a better mid-tone response than the 5D Mk2/Mk3 series, which can be seen on an evenly colored surface.
There is also a different CFA in the old 5D compared to the 5D Mk2/Mk3, and some find the colors better in the old 5D.
Canon changed their color filters (less dense) in order to gain more light / increase sensitivity.


----------



## helpful (Nov 22, 2012)

One extremely theoretical way to view dynamic range is as the ratio of the sensor's white point (in photons / quantized energy units) to the sensor's noise level (in the same units).

This extremely theoretical way of treating noise and dynamic range as equivalent is useless in the real world.

Viewing dynamic range as a number of stops that can be represented in an image, as I did, is much more practical and less theoretical.

The engineering truth is that noise exists also at each point between the black level and the white level of the sensor.

Consider a camera whose sensor is exposed to the 10 brightness levels of Ansel Adams's zone system, in proportion to the camera's actual dynamic range (arbitrarily scaled to begin with 25 "units" for level 0):

Level 0 = black level = less than or equal to 25 units of true light, which is lost within 25 units of random noise
Level 1 = 50 units of true light energy +/- a different amount of random noise, which varies considerably from sensor to sensor and between levels; it is not necessarily 25.
Level 2 = 100 units of true light energy +/- yet another amount of random noise, etc.
...
Level 10 = 25,600 units of true light energy +/- zero noise, because at this point the sensor site is fully saturated and is desensitized to any further signal as well as any noise

Clearly there is fluctuation going on in the interval between level 0 and level 10. The number of gradations in tone that the camera can actually distinguish depends on integrating/summing the variable noise across all brightness levels to get an average noise level, and then dividing the white-point energy level by that average noise level separating distinguishable gradations in tone.

To say that dynamic range and the number of gradations in tone are equivalent, or to say that either one of them can be determined simply from the ratio between the black noise level and the white level, is as stupid as defining people's adult heights merely by their weight at birth.


----------



## pwp (Nov 22, 2012)

Great to read all the highly technical responses. You guys know your stuff! Brilliant...

At a working man's level, I'd regard having too much DR as a non-issue, unless of course it goes too far and delivers ridiculously flat images. But it's always going to be a simpler matter to crunch down the DR to achieve suitable output than to struggle to increase DR in post-production.

-PW


----------



## MarkII (Nov 22, 2012)

helpful said:


> To say that dynamic range and the number of gradations in tone are equivalent, or to say that either one of them can be determined simply from the ratio between the black noise level and the white level, is as stupid as defining people's adult heights merely by their weight at birth.


Since you always have the ability to control exposure to exactly catch the highlights in an image without clipping (by aperture, shutter-speed or ND filters), the dynamic range is effectively limited by the weakest signal that you can measure relative to the 'almost-but-not-quite-clipped' highlights.

Today, that weakest signal is effectively limited by the noise floor in the A/D conversion - a combination of quantisation noise (limited precision ADC) plus all the other noise gunk added by the electronics. Hence the resolution (number of effective bits after noise) of the sensor is its dynamic range.

Eventually, you will have sufficiently good sensors that the dynamic range will be ultimately limited by the quantum nature of light itself (photon shot noise). All of this is also affected by things like the Bayer colour filter array and things like micro-lens design (how much light actually reaches the sensor). In a 5D, I think that the tonal resolution (meaning colour discrimination) is more a property of the Bayer filters than the ADC.

There is a nice article about all this here: http://theory.uchicago.edu/~ejm/pix/20d/tests/noise/

All that really matters is that you can improve image quality by exposing-to-the-right, averaging frames, HDR stacking or simply downsizing images. And you only need to do any of this seriously if you have a non-theoretical problem with an image you are trying to take.


----------



## Nathaniel Weir (Nov 22, 2012)

Don't worry about it... just start taking pictures and stop blabbing on about sensor designs, when it has little impact on your photography. As the great Ken Rockwell states, "You need to learn to see and compose. The more time you waste worrying about your equipment the less time you'll have to put into creating great images. Worry about your images, not your equipment." 
And...
"Your equipment DOES NOT affect the quality of your image. The less time and effort you spend worrying about your equipment the more time and effort you can spend creating great images. The right equipment just makes it easier, faster or more convenient for you to get the results you need."


----------



## MarkII (Nov 22, 2012)

Quite - none of this matters unless you are doing something comparatively unusual.

Most photographs that I have seen - and taken - need better composition, lighting and subject rather than a better sensor.


----------



## NormanBates (Nov 22, 2012)

True, but the geek inside me still enjoys these theoretical discussions.

From my point of view, as long as your ADC has significantly more gradations than the DR of the camera (e.g. "16 bits" for "13 stops at pixel level"), this is a non-issue: you have an ADC that has enough gradations to actually capture the read-out noise of your image, so that is your limiting factor.

Say you have a sensor with a full-well capacity of 20,000e- and read-out noise of 2e-. Your DR is 20*log10(10000) = 80dB, or about 13.3 stops. I guess the D800 sensor is pretty similar to that.
Tie that up with a 16-bit ADC and you have absolutely no "lack of gradation" issues whatsoever: you have to count electrons, the most you'll find is 20K, and you have 65K gradations at your disposal. Even with a 14-bit ADC you wouldn't have terrible issues: 20Ke- to count (max), 16K gradations to use; the 2e- read-out noise is still your bigger problem.
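The arithmetic above can be checked in a couple of lines (same hypothetical 20,000e- / 2e- sensor):

```python
import math

def dynamic_range(full_well_e, read_noise_e):
    """DR from full-well capacity and read-out noise, in dB and stops."""
    ratio = full_well_e / read_noise_e
    return 20 * math.log10(ratio), math.log2(ratio)

db, stops = dynamic_range(20_000, 2)   # the hypothetical sensor above
print(round(db, 1), "dB /", round(stops, 2), "stops")   # 80.0 dB / 13.29 stops
```

Note log2(10000) works out to about 13.29 stops, since one stop is 20*log10(2) ≈ 6.02 dB.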


----------



## NormanBates (Nov 22, 2012)

In the old days, you did have a problem.

Consider a sensor with a max well capacity of 13Ke- and read-out noise of 13e-. DR is 60dB, or 10 stops.
Pair that with a 12-bit ADC. Should be enough, right? Well, yes and no. You have 4096 gradations and you have to count up to 13,000 electrons, so 892 and 895 will look the same to you. No big deal, since read-out noise means you can't really distinguish between 892 and 905 anyway, but if you can't reduce that read-out noise, there's a small benefit in going for a 14-bit ADC: you're getting better information about the image, and you'll be in a better position to try to average out that noise. Small, I know, but it's an improvement. If the 892 comes from a very unlucky 891 and the 905 from a very unlucky 906, you're in better shape if you can say there's a 3e- difference between them (when the real-world difference is 5e-) than if all you can say is that they look the same to you.

OTOH, if you stick to a 10-bit ADC, then you clearly have a problem: your ADC stepping will be added to your read-out noise. 892 and 904 electrons are the same to your ADC, but that 904 can come from a very unlucky 910, and that 892 from a very unlucky 886, and if 886 and 910 can look the same to you then you're in bad shape.
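One common way to put numbers on "ADC stepping adds to read-out noise" is to model quantization as an extra noise source of step/sqrt(12) and add it in quadrature with the read noise. A sketch using the numbers from this example (an idealization; real converters add further gunk):

```python
import math

def effective_noise(read_noise_e, full_well_e, bits):
    """Read noise combined (in quadrature) with ADC quantization noise,
    modeled as step / sqrt(12) -- a standard idealization."""
    step = full_well_e / 2 ** bits        # electrons per ADC code
    return math.sqrt(read_noise_e ** 2 + step ** 2 / 12)

FWC, READ = 13_000, 13.0                  # the old-sensor example above
for bits in (10, 12, 14):
    print(bits, "bits ->", round(effective_noise(READ, FWC, bits), 2), "e-")
```

Under this model the 10-bit converter pushes the effective noise to about 13.5e-, while the 14-bit one leaves it essentially at the 13e- read-noise floor.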


----------



## NormanBates (Nov 22, 2012)

(pun: "old days", or "Canon world", however you want to put it...)


----------



## jukka (Nov 22, 2012)

NormanBates said:


> True, but the geek inside me still enjoys these theoretical discussions.
> 
> From my point of view, as long as your ADC has significantly more gradations than the DR of the camera (e.g. "16 bits" for "13 stops at pixel level"), this is a non-issue: you have an ADC that has enough gradations to actually capture the read-out noise of your image, so that is your limiting factor.
> 
> ...



D800: 2.7e- read noise, FWC 44972 = 14.0 stops


----------



## NormanBates (Nov 22, 2012)

jukka said:


> NormanBates said:
> 
> 
> > True, but the geek inside me still enjoys these theoretical discussions.
> ...



Nice. And the ADC is 14-bit, right?

So it has 16K values to count up to 45K electrons, and read noise is close to 3 electrons. Not ideal (a 16-bit ADC would be better), but not bad at all.
What's certain is that you can't say "I wish it had higher read noise or lower FWC, so I could make better use of my 14-bit ADC".


----------



## Radiating (Nov 22, 2012)

nightbreath said:


> An interesting thought came to me before I went to bed. Below you'll find an assumption that came suddenly to my head, so please don't take it too seriously.
> 
> So... Let's assume there are two cameras with similar color-tone reproduction, but different abilities to capture lightness levels. For example:
> - sensor of camera A has 12 stops of DR, 16 billion tones it can distinguish
> ...



Yeah, it doesn't work like that at all, whatsoever. The dynamic range is not determined by the camera, but by the data format.

Both Canon CR2 and Nikon NEF files have 14-bit depth, or 14 stops.

When you measure a CAMERA'S dynamic range, that has nothing to do with how much data it can record from maximum to minimum; that is going to be 14 stops either way. It has to do with taking those 14 stops you start with and subtracting the NOISE floor. So you take your original 14 stops, subtract however many stops are going to be noise, say 4.5, and you get a 9.5-stop camera.

Having more dynamic range is never bad, because it means there is less noise from the get-go. The tone delta is always identical.


----------



## NormanBates (Nov 22, 2012)

Radiating said:


> nightbreath said:
> 
> 
> > An interesting thought came to me before I went to bed. Below you'll find an assumption that came suddenly to my head, so please don't take it too seriously.
> ...



Dynamic range is not how many shades you have in your color space.
It is related to real-world things: how much brighter one object can be than another while the camera still captures them both correctly at the same time.


----------



## nightbreath (Nov 22, 2012)

NormanBates said:


> Dynamic range is not how many shades you have in your color space.
> It is related to real-world things: how much brighter one object can be than another while the camera still captures them both correctly at the same time.


And that's why I ask. Why would I need DR for portraits? That is mainly what I do with my cameras as a wedding photographer, so the number of shades the sensor produces is more important to me than DR.


----------



## bdunbar79 (Nov 22, 2012)

Nathaniel Weir said:


> Don't worry about it... just start taking pictures and stop blabbing on about sensor designs, when it has little impact on your photography. As the great Ken Rockwell states, "You need to learn to see and compose. The more time you waste worrying about your equipment the less time you'll have to put into creating great images. Worry about your images, not your equipment."
> And...
> "Your equipment DOES NOT affect the quality of your image. The less time and effort you spend worrying about your equipment the more time and effort you can spend creating great images. The right equipment just makes it easier, faster or more convenient for you to get the results you need."



You have GOT to be kidding me!


----------



## NormanBates (Nov 23, 2012)

nightbreath said:


> NormanBates said:
> 
> 
> > Dynamic range is not how many shades you have in your color space.
> ...



There are some usage models for which DR is important, there are many usage models for which it doesn't matter at all. If the scene you have in front of you doesn't require more than 8 stops of DR, you're fine with a camera that can capture that, no point in going for one that is the same in every respect but will record 14 stops of DR.

If your portraits are in a studio, with a standard backdrop, your DR needs will probably be pretty modest. If your portraits happen in other less-controlled locations, you may have very high DR needs (e.g. if you want to take a portrait of someone in their bedroom, and there's a window in an interesting area). Wedding photographers take lots of portraits, and, not having a lot of control over their shooting scenarios, they usually need a lot of DR (for this reason, a friend of mine was still using his Fuji S3 pro as his backup body up until the D800 came out: a 12 mpix camera from 2005... with 13.5 stops of DR as measured by dxomark).

In any case, ADC precision will only be a problem if the manufacturer screws up the sensor-ADC matching. No current camera has that issue AFAIK.


----------



## Radiating (Nov 23, 2012)

NormanBates said:


> Radiating said:
> 
> 
> > nightbreath said:
> ...



Face palm. No. No. No.

Raw images are captured in bits by intensity at the photosite. The simplest version would be a 1-bit photosite that registers either full of photons or empty.

So with a simple 2-bit system we can have:

00 = 0-100 photons in a pixel
01 = 100-200 photons in a pixel
10 = 200-300 photons in a pixel
11 = 300-infinity photons in a pixel

Then for different ISO settings we multiply or divide the photons to produce different exposures.

This gives us 2 stops of dynamic range, from 100 photons to 400 (or multiples of that). A stop is a doubling of light, so 2x2 = 4.

This is how cameras work. A camera's dynamic range rating is essentially the theoretical dynamic range minus however many stops in the shadows are unreadable information. So in our 2-stop example, if counts from 0-200 photons were too noisy to determine what is supposed to be there, then our theoretical camera has 1 stop of DR. You can think of noise as a random number generator that's added to the photon count. So our count of 0-400+ would have a number from 0-100 randomly added to or subtracted from it. This is the noise you see when you push fill light to max. Anyway, if a random number from 0-100 is added or subtracted, it is mathematically impossible to determine how many photons were in our pixel in the 1st stop. Literally all you'd see is something resembling TV static if you tried to make a picture from it.

So cameras with more dynamic range have the static come in at a lower stop.
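The "random number generator added to the photon count" picture can be sketched directly (a toy model with uniform noise, for simplicity):

```python
import random

def read_pixel(photons, noise=100):
    """Toy model: reading adds a random number in [-noise, +noise]
    to the true photon count (clipped at zero)."""
    return max(0, photons + random.randint(-noise, noise))

random.seed(0)
# Two tones inside the noisy bottom stops (0-200 photons): the readings
# overlap completely -- TV static, the true values can't be told apart
print([read_pixel(50) for _ in range(5)])
print([read_pixel(150) for _ in range(5)])
# Two tones far above the noise floor stay clearly distinguishable
print(read_pixel(3000), read_pixel(6000))
```

Lower the noise constant and the static retreats to a deeper stop, which is exactly "more dynamic range".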

<---- is an engineer.


----------



## NormanBates (Nov 23, 2012)

We are saying the same, I guess I didn't make myself sufficiently clear.

What I mean is that if you add 1 bit to your ADC and instead of having

00 = 0-100 photons in a pixel
01 = 100-200 photons in a pixel
10 = 200-300 photons in a pixel
11 = 300-infinity photons in a pixel

You now have:

000 = 0-42 photons in a pixel
001 = 42-84 photons in a pixel
010 = 84-126 photons in a pixel
011 = 126-168 photons in a pixel
100 = 168-210 photons in a pixel
101 = 210-252 photons in a pixel
110 = 252-294 photons in a pixel
111 = 294-infinity photons in a pixel

...and you have the same read-out noise, you still have the same DR, because neither your full-well capacity nor your read-out noise have changed.

<----- is writing the VHDL code for the FPGA of a motion picture camera


----------



## nightbreath (Nov 23, 2012)

NormanBates said:


> There are some usage models for which DR is important, there are many usage models for which it doesn't matter at all. If the scene you have in front of you doesn't require more than 8 stops of DR, you're fine with a camera that can capture that, no point in going for one that is the same in every respect but will record 14 stops of DR.
> 
> If your portraits are in a studio, with a standard backdrop, your DR needs will probably be pretty modest. If your portraits happen in other less-controlled locations, you may have very high DR needs (e.g. if you want to take a portrait of someone in their bedroom, and there's a window in an interesting area). Wedding photographers take lots of portraits, and, not having a lot of control over their shooting scenarios, they usually need a lot of DR (for this reason, a friend of mine was still using his Fuji S3 pro as his backup body up until the D800 came out: a 12 mpix camera from 2005... with 13.5 stops of DR as measured by dxomark).
> 
> In any case, ADC precision will only be a problem if the manufacturer screws up the sensor-ADC matching. No current camera has that issue AFAIK.


I have only felt a lack of DR in my camera a few times, and that was before I really got used to getting pictures I won't delete later. There are several techniques to get the shot you need the way you want to see it.

Is there a web-site where I can look at your friend's photos to check why someone chooses high DR cameras?


----------



## jukka (Nov 24, 2012)

You don't need to look at any photos. With 14 stops of DR you have more exposure options, and with the D800, for example, you have no banding or pattern noise in the lower levels. You can take one raw file, develop it once for the highlights and once for the shadows, and mix them together. With a Canon you must take two or more exposures, with the camera on a tripod and no moving objects; or you can develop one raw file for the highlights and lift the shadow areas of a contrasty subject, but not without pattern noise or banding.


----------



## bycostello (Nov 24, 2012)

can you have too much...


----------



## @!ex (Nov 24, 2012)

I hope that someday I can get a shot with this much dynamic range in a single exposure. I bracketed 7 shots at 3 EV spacing per bracketed shot. That is an 18 EV spread, but each shot has its own total EV range (minus clipping), so the DR spans almost from pure black to pure white. My eyes saw these scenes like this, but with a single exposure (even using ND filters) I could never get these shots without increasing the DR of the camera via multiple exposures. Sorry, not trying to go off topic, just thought it was relevant to the subject.




End of the Road by @!ex, on Flickr




Everything Peels... by @!ex, on Flickr


----------



## helpful (Nov 24, 2012)

NormanBates said:


> We are saying the same, I guess I didn't make myself sufficiently clear.
> 
> What I mean is that if you add 1 bit to your ADC and instead of having
> 
> ...



Right on! This is exactly what I'm trying to say, and you explained it much more clearly. DR is not the same as the number of gradations, and also not the same as the bit depth (which actually just counts the number of "possible" gradations, whether or not the camera is actually capable of resolving all of them).

If the number of gradations accurately recorded within a 10 stop dynamic range is the same as the number of gradations accurately recorded within a 14 stop dynamic range, then the 10-stop camera has more precision and better image quality _within that 10-stop interval of light intensity_ versus the 14-stop camera. But outside that 10-stop range, the 10-stop camera has zero image quality, and so the 14-stop camera wins hands-down.

DR is not something to get angry about, just a trade-off between obtaining either greater differentiation between subtle shades of colors (like slide film with lower DR) or greater exposure latitude (like negative film with higher DR).


----------



## nightbreath (Nov 24, 2012)

helpful said:


> If the number of gradations accurately recorded within a 10 stop dynamic range is the same as the number of gradations accurately recorded within a 14 stop dynamic range, then the 10-stop camera has more precision and better image quality _within that 10-stop interval of light intensity_ versus the 14-stop camera. But outside that 10-stop range, the 10-stop camera has zero image quality, and so the 14-stop camera wins hands-down.


And that's why I've started the topic. To my understanding, my near-12-stop DR camera is perfect for my work, and I would think twice before getting the next camera Canon releases, which might have more DR with the same gradation-resolving power.


----------



## NormanBates (Nov 24, 2012)

nightbreath said:


> Is there a web-site where I can look at your friend's photos to check why someone chooses high DR cameras?



He's a wedding photographer; I don't think he publishes his pictures online, he gives them to his customers.
But the usual scenario he was referring to was: very sunny day, bride in shiny white, groom in a matte black suit with subtle stripes, and anything except his Fuji (or, now, D800) will render said suit as a black blotch, and there's nothing he can do about it.


Now, back to the technical discussion...


Let me add a twist: the ADC works linearly, but what you see is log

So, if you have a 14-bit ADC (it can count from 0 up to 16383) and can record 14 stops of DR, here is how those values will be distributed:

14th stop: 8192 to 16383
13th stop: 4096 to 8191
12th stop: 2048 to 4095
11th stop: 1024 to 2047
10th stop: 512 to 1023
9th stop: 256 to 511
8th stop: 128 to 255
7th stop: 64 to 127
6th stop: 32 to 63
5th stop: 16 to 31
4th stop: 8 to 15
3rd stop: 4 to 7
2nd stop: 2 to 3
1st stop: 0 to 1

So you may actually have very serious issues in the shadows... which I see in the Canons, but not in the D800!

* if you're going to have issues with "too much DR, not enough gradation", they'll be in the very deep shadows, which you wouldn't see anyway if you were shooting with a camera with the same ADC but less DR; your skin tones are unlikely to land anywhere below the 5th stop from the top, so for them you have way more values than you need (anything above 50 gradations per stop is usually smooth even after heavy grading)

* how come I don't see this in samples from the D800?
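For anyone who wants to play with the arithmetic, the stop table above can be generated directly. A minimal Python sketch (the function name is mine):

```python
def values_per_stop(bits):
    """Codes available in each stop of a linear ADC with `bits` bits.

    Each stop down halves the available codes, which is why the deep
    shadows of a linear capture get so few gradations.
    """
    top = 2 ** bits
    table = []
    for stop in range(bits, 0, -1):
        lo = top // 2 if stop > 1 else 0   # bottom stop runs down to 0
        table.append((stop, lo, top - 1, top - lo))
        top = lo
    return table

for stop, lo, hi, n in values_per_stop(14):
    print(f"stop {stop:2d}: {lo:5d} to {hi:5d}  ({n} values)")
```

Running it reproduces the table: the top stop gets 8192 codes, the bottom stop only 2.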


----------



## nightbreath (Nov 24, 2012)

NormanBates said:


> nightbreath said:
> 
> 
> > Is there a web-site where I can look at your friend's photos to check why someone chooses high DR cameras?
> ...


Something like the one I've attached? Shot in the middle of the day. It's another reason why I've started the discussion: I don't understand why everyone is so excited about DR possibilities when so much depends on technique.

P.S. It's not one of the best shots from that day; I've just used one with harsh shadows.




NormanBates said:


> * if you're going to have issues with "too much DR, not enough gradation", they'll be in the very deep shadows, which you wouldn't see anyway if you were shooting with a camera with the same ADC but less DR; your skin tones are unlikely to land anywhere below the 5th stop from the top, so for them you have way more values than you need (anything above 50 gradations per stop is usually smooth even after heavy grading)


As far as I understand, each camera applies its own tone curve to the image, or am I wrong?

Initially I wanted to be brand-agnostic: instead of discussing specific sensors, I want to identify what really matters for my needs (and maybe many others'). I'm not able to tell what it is right now, so everyone's input is appreciated


----------



## nightbreath (Nov 24, 2012)

@!ex said:


> I hope that someday I could get a shot with this much dynamic range in a single exposure. I bracketed 7 shots at 3 EV spacing per bracketed shot. That is an 18 EV spread, but each shot has its total EV range (minus clipping), so the DR spans almost from pure black to pure white. My eye saw these images like this, but with a single exposure (even using ND filters) I could never get these shots without increasing the DR of the camera via multiple exposures. Sorry, not trying to go off topic, just thought it was relevant to the subject.
> 
> Shot #1
> Shot #2


No offense, but these scenarios look uninspiring to me. And I believe it's not about how you or I see it; it's everyone's way of thinking about DR that makes HDR overused by lots of photographers around the world.

I believe that HDR imaging has its own niche, but it should be used when the result doesn't tell you whether it's HDR or not. So better scenes are what really matter to me (rather than increased DR):


----------



## Danmix (Nov 24, 2012)

Touché nightbreath.... Great shot


----------



## nightbreath (Nov 24, 2012)

Danmix said:


> Touché nightbreath.... Great shot


Sorry, I didn't credit the last photo. It's not mine; I found it on a social network, where "Baktiar Sontani" was named as the author of the photo


----------



## NormanBates (Nov 24, 2012)

nightbreath said:


> As far as I understand each camera applies it's own tone curve to the image, or am I wrong?
> 
> Initially I wanted to be brand-agnostic and instead of discussing specific sensors, I want to identify what really matters for my needs (and maybe many others). I'm not able to tell what it is right now, so everyone's input is appreciated



Not really: each camera applies its own tone curve, but that happens *after* the ADC has done its job, and, as far as I know, all cameras have linear ADC (ok, some have piecewise linear, but that's actually a change for exposure, not for the ADC, which is still linear; and I'm not sure any of the ones we're talking about actually does that).

The fact that light is linear but you see it as log makes this very inefficient, and it is the reason that, for example, some Nikon cameras (low to mid-end) apply a log curve even in their RAW files: with linear encoding, you have way more gradation than you need in the highlights, and may still be struggling in the shadows. But even this curve is applied after the ADC, so it's not what we are talking about, I think.



As for the need for DR, as I said, it depends on what you do, how much time you have to do it, and how much margin of error you have. My friend was carrying an extra 5-year-old 12 Mpix camera just in case, because he thought he needed to. If you don't think you need it, well, good for you, what can I say?

Now go ask any cinematographer if they think 11 or 12 stops of DR is enough.
(hint: I hardly ever shoot stills, I'm a vidiot)


----------



## molnarcs (Nov 24, 2012)

I don't quite understand all the technical background, but I can attest to the advantages of high DR in my work. I do a lot of interiors, and most of them are high-contrast situations, often necessitating trips to PS for better noise control (I bought Noiseware Professional, which used to be better than LR, and maybe still is), occasionally blending in different exposures (I always do bracketed shots for interior work), etc.

With the d7000 and now the d800, my trips to PS have noticeably decreased. In fact, I don't recall any situation in the past year where I had to use blending. My only comparison is the 5D MK II and Rebel T2i, and my gut feeling is that you can push the d7k at least a stop more without image degradation (colour shift, noise, etc.). In Lightroom terms, this is about 30-40 points on the shadows slider, or being able to push both blacks and shadows on the Tone Curve significantly more. For me, it means staying in Lightroom for 99% of my workflow. You can just feel how much more malleable NEF files are than CR2 files.

I'm not sure any of this matters if you're shooting JPEG, except maybe if you have ALO enabled. With Canons, I never used ALO (and I didn't use ADL with the d7000), but when I bought the d800, I just left ADL (the Nikon equivalent of ALO) on the Auto setting. I'm a raw shooter, but lately I've been experimenting with JPEG+RAW because I've started shooting events (not much interior work lately). As far as I can tell, ADL works well; I don't see any image degradation, and the change is actually quite subtle, but for the better as far as I can tell. Where it truly matters is when you shoot RAW.

I heard a good description of RAW somewhere: RAW is like a box of light. When you look at a RAW shot from either Canon or Nikon, you basically see the same image and DR. The difference becomes apparent when you start messing with it, changing exposure or using curves. With a high DR camera, you can push shadows more without significant image degradation. It is as simple as that. You have a bigger box of light with a high DR camera 

By the way, your photos are magnificent! I don't even know what most of my favourite photographers shoot. I know some shoot Canons, others shoot Nikons - and the end results are all magnificent. Once you have invested in either system, I don't see much reason to change. I didn't have a huge investment (a Sigma 10-24 and a Rebel) when I bought the Nikon d7000 (and the new Sigma 8-16, which was to replace the 10-24 anyway). The 5D MK II wasn't mine. For the things I shot back then, the high DR of the d7k did make a difference and made my life easier. But for the things you shoot, it might make zero difference. As I said, what you get with high DR is a bigger box of light, but if you don't see the limits of your current box, then you don't need a high DR camera  And I think for JPEG shooters it doesn't matter at all.


----------



## noisejammer (Nov 24, 2012)

NormanBates said:


> ...Let me add a twist: the ADC works linearly, but what you see is log
> 
> So, if you have a 14-bit ADC (you can count up to 16384)and can record 14 stops of DR, here is how those values will be distributed:
> 
> ...



This is incorrect - if you replaced "bit" with "stop" then it is correct (and it's then obvious why ETTR works too.)

It's easier to consider a 3- or 4-bit digitiser. Let's say it offers 4-bit resolution; then the possible counts are 0000 through 1111, which translates to 2^4 or 16 levels. Written the way Norman stated it, there would only be four distinct levels - this is incorrect (but I understand he meant there would be 16 levels.)

Some things appear to have been glossed over in the discussion. First, the ADC operates on a per-pixel basis.

I've read the DxO tests on various sensors. It is important to remember that the notional dynamic range is referred back to an 8 MP standard. This means that the D800's quoted 14 stops of dynamic range is significantly less than 14 stops at a per-pixel level. The 36 MP > 8 MP conversion gains the sensor a sqrt(4.5) = 2.1x notional improvement in dynamic range, which is slightly more than 1 stop of quoted dynamic range. This means that the true _per-pixel_ dynamic range is about 12.9 stops.

In order to read those 12.9 stops, the ADC needs a bit more resolution than the 13 bits required by the pixel. I'm quite surprised, because the 14 bits in the ADC suggest that the entire detection chain has ~1 bit of noise... That sounds improbable.

Photon shot noise has been commented on briefly. If we assume 13 bits of dynamic range and (say) 10% quantum efficiency (pidooma), then the number of photons required to fill a pixel is 10 x 2^13, or about 80k. Since shot noise varies with the square root of the number of photons, the pixel could have shot noise of up to ~280 photons (rms).

Since 1 LSB translates to 80k/8192 = 10 photons, we must have about 5 bits of photon noise (~28 counts rms) at the upper end of the sensor's dynamic range. At the bottom end, the quantum efficiency sets the performance, and there must be ~3 bits of noise.

Shot noise alone suggests that the true dynamic range of an image cannot be more than about 8-10 bits. It seems that the only way to improve on this is by greatly enhancing the sensor's quantum efficiency.

To answer the OP's question: photon noise alone suggests that there's not a whole lot of benefit to a high-resolution ADC. It does allow for more sophisticated noise filtering - presumably at the expense of resolution.

<---- physicist / astronomer
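The back-of-envelope figures above can be checked in a few lines. This is a sketch that just reproduces the post's own assumptions (10% quantum efficiency, a 13-bit per-pixel range, DxO's 8 MP normalization), not measurements of any real sensor:

```python
import math

# DxO normalizes DR to an 8 MP output; downsampling 36 MP to 8 MP
# averages ~4.5 pixels, gaining sqrt(4.5) in SNR.
mp_sensor, mp_norm = 36, 8
gain = math.sqrt(mp_sensor / mp_norm)      # ~2.12x
gain_stops = math.log2(gain)               # ~1.08 stops
per_pixel_dr = 14 - gain_stops             # ~12.9 stops per pixel

# Shot-noise arithmetic under the post's 10% QE assumption.
full_well = 10 * 2 ** 13                   # ~80k photons to fill a pixel
shot_noise = math.sqrt(full_well)          # ~286 photons rms at saturation
photons_per_count = full_well / 2 ** 13    # 10 photons per ADC count
noise_counts = shot_noise / photons_per_count

print(f"downsampling gain: {gain:.2f}x ({gain_stops:.2f} stops)")
print(f"per-pixel DR:      {per_pixel_dr:.1f} stops")
print(f"shot noise:        {shot_noise:.0f} photons rms "
      f"(~{noise_counts:.0f} ADC counts)")
```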


----------



## NormanBates (Nov 24, 2012)

* I definitely meant stops. I chose an example with a 14-bit ADC and a sensor with 14 stops of DR to make my life easier, but I meant stops. "One stop brighter" means "twice as many photons", and the ADC is linear (it counts electrons on a linear scale); that's what makes both sides match.

* I'm talking about per-pixel DR. Matching resolution makes a lot of sense when you're comparing cameras, since it's the final image that you care about, not each individual pixel. But it makes the technical discussion a lot more complex, because you have to take into account how downsampling reduces noise. I'd rather leave that out right now.

* The per-pixel DR measurement that dxomark got was 13.44 stops: http://www.dxomark.com/index.php/Cameras/Compare-Camera-Sensors/Compare-cameras-side-by-side/(appareil1)/834%7C0/(brand)/Nikon/(appareil2)/795%7C0/(brand2)/Canon
I'm also puzzled at how the D800 can manage that with a 14-bit ADC. As I said above, I'd expect to see a lot of posterized noise in the shadows, but it's not there.

* No idea about how quantum efficiency affects all this. I never thought about it.


----------



## LetTheRightLensIn (Nov 24, 2012)

nightbreath said:


> An interesting thought came to me before I went to bed. Below you'll find an assumption that came suddenly to my head, so please don't take it too seriously.
> 
> So... Let's assume there are two cameras with similar color tones reproduction abilities, but with different possible lightness level capturing ability. For example:
> - sensor of camera A has 12 stops of DR, 16 billion tones it can distinguish
> ...



No. You are thinking about it wrong. It doesn't work like that at all.

All it means is that the camera with more DR has less noise in the lower tones than the other camera. There is no way you can ever lose tones because of that. Whatever you are trying to do, you can always exactly match what the other camera can accomplish (plus more). In fact, since you captured with less noise, you have captured MORE distinguishable tones; and since the captures are linear, there is no different expansion you need to do with one camera vs the other. As you compress it to a screen, maybe you don't use the extra tones, but you won't end up with fewer, and you might end up with more.


----------



## nightbreath (Nov 24, 2012)

LetTheRightLensIn said:


> nightbreath said:
> 
> 
> > An interesting thought came to me before I went to bed. Below you'll find an assumption that came suddenly to my head, so please don't take it too seriously.
> ...


Ok. But it doesn't explain to me the difference in color integrity between the 1D series and 5D series when you push image colors, so I'm still left with a feeling of incompleteness 


I've already got one valuable comment on that:


jukka said:


> The 1D series has a more expensive electronic chain, better shielding etc., and the 1D series is probably also better matched in terms of RGB.
> The old 1Ds Mk III has a better response in the middle tones than the 5D Mk II / Mk III series, which can be seen on an evenly colored surface.
> There's also a different CFA in the old 5D compared to the 5D Mk II / Mk III, and some find the colors better in the old 5D.
> Canon changed their color filters (made them less dense) in order to gain more light / increase sensitivity


But there were no references to official resources / tests that could tell more.


----------



## @!ex (Nov 24, 2012)

nightbreath said:


> @!ex said:
> 
> 
> > I hope that someday I could get a shot with this much dynamic range in a single exposure. I bracketed 7 shots at 3 EV spacing per bracketed shot. That is an 18 EV spread, but each shot has its total EV range (minus clipping), so the DR spans almost from pure black to pure white. My eye saw these images like this, but with a single exposure (even using ND filters) I could never get these shots without increasing the DR of the camera via multiple exposures. Sorry, not trying to go off topic, just thought it was relevant to the subject.
> ...



That is what I'm talking about. We are all going to have a different eye for what compositions touch us the most, but my argument was about the utility of high DR and of fusion/HDR. There are a lot of scenarios where they are indispensable in getting the shot we (as artists/photographers) want. I was just showing a few extreme examples of where extreme DR can be useful. If you don't like those particular shots, that's fine, as I was mainly using them to illustrate a point. I agree with you that HDR as a technique is overused, often as an effect to over-emphasize details, rather than used to achieve dynamic range that would otherwise be unachievable without unwieldy supplemental lighting or impossible (non-linear), creativity-inhibiting ND filters. Here are a few more examples; maybe one of them will resonate with you more...




TiVo by @!ex, on Flickr




They breath profits; they eat the interest on money... by @!ex, on Flickr




Home on the range... by @!ex, on Flickr




Electric Sunset at City Park by @!ex, on Flickr




Drought by @!ex, on Flickr


----------



## Sporgon (Nov 25, 2012)

I've enjoyed following this discussion even though I haven't understood any of the technical stuff. As far as I am aware, you press the shutter, magic happens, and the picture appears on the back of the camera. 

But some interesting points have been raised in relation to the dynamic range of digital cameras. When we take a photo, if our subject is lit only by incident light - the EV level falling on the subject - then the EV range is not as great as you might think. In England, bright summer sun has an EV value of about 14.5, and in that situation the luminosity in even the darkest shadows would struggle to be less than 5, so the dynamic range in terms of stops is about 9. 

With regard to the classic wedding picture - bride's white and groom's black - there's a big difference in reflected light, but "correct" exposure should still capture detail in both black and white, eased substantially by the use of a reflector or fill-in flash. In the film days, the reason any wedding photographer worth their salt used medium format was to have the higher sync speed. This need was reduced when focal plane shutters reached 1/200 sec sync. 

The situation changes dramatically once you start to include the light source in your frame: bright skies, the sun itself, a bride lit by a bright window with the window included in the frame. Then the EV range goes off the scale. 

The OP's wedding pic is lit by incident light, and his camera has DR to spare; when Canon technology's DR is pushed in this kind of situation, I've certainly found the 5D MkI to be fine.

The first pic attached is a snap at a friend's wedding, and I've used it as an example because it was taken in mid-afternoon bright sun: the little boy has a plain bright white shirt, and the guy a dark(ish) suit. I wasn't exposing for the highlights, so part of the boy's shirt has gone to 255, but it was reflecting the sun straight back at me, so it's going to be bright white. This first pic is straight off the camera. 

The second is 200% of the shadow on the guy's leg. The third is the pic how I would produce it, lifting the dark shadows. The fourth is 200% of that. There is no noise in the lightened area at all (even on the full-size data). 

I've then lifted the shadow so it's almost gone - and we have noise coming in, but who would want a pic like that? Can't post it anyway - I've had my four!

As I have said, once you try to include the light source itself in your frame, the situation changes. At the present time it is impossible for a camera to record in one exposure what the eye can see, because the eye doesn't see; your brain does. Your eye just gathers and focuses the light - it is the (camera) lens - and our brains deal with interpreting it: perspective, field of view, dynamic range. (This is why people can "see" things that aren't really there.) 

So our brains do instantly what @!ex has spent some time doing with his beautiful pictures - the best of HDR technique - the kind that makes you believe this is how we would have seen it. The dynamic range in this type of situation will be well over 14 stops. A picture produced from a 14-stop DR camera in one frame, with the majority of the picture vastly underexposed due to exposing for the light source, and then pulling back the shadows, will never have the colour, luminosity and general "brio" of the technique that @!ex has used. (Well, I say never, but not in the near future.) 

So if you have a camera with 11 stops of DR and really good colour, tonal graduation from black to white etc., your pictures won't be held back by the technology. Having a further 3 stops of DR would be no disadvantage as long as it doesn't compromise any of the other factors that are more critical to picture quality - the OP's original question, I believe.

Incidentally, I still maintain that the image quality I have seen from the 5D MkIII and D1X shows that these cameras can be superb. And an older camera that I always thought could produce very good tonal graduation was the D200 - with its 10 stops of DR.

Anyway, I have probably bored the pants off anyone who has read through this, but I've got nothing better to do on a very wet afternoon!

( I forgot to change the pictures to sRGB for the web )


----------



## nightbreath (Nov 25, 2012)

@!ex said:


> That is what I'm talking about. We are all going to have a different eye for what compositions touch us the most, but
> 
> 
> 
> ...


I really like these two


----------



## Neutral (Nov 25, 2012)

DR is like USD - the more you have, the more you want 
More DR, like more USD, gives more freedom and reduces dependence on circumstances, and this in turn lets you get what you want more easily and much quicker.


----------



## NormanBates (Nov 26, 2012)

Yes, that's a good example. The sun may be brighter here in southern Spain, and you may not want a wedding dress to blow out like that kid's shirt, and the groom's suit may be black instead of gray (look at the shoe; that's black, the suit is not), and then you may be in trouble if your camera is DR-challenged.

Not the most common scenario, by a long shot, but I never said you always need lots of DR. Just that you may sometimes need it.


----------



## serendipidy (Nov 26, 2012)

DR is like USD - the less you have, the more you need


----------

