# Stop exposure at overexposed pixel threshold



## alfredo (Jul 16, 2014)

Hi all,

A few days ago I was trying to make a timelapse of a fireworks show. I went with a fixed exposure for every shot: well, it turns out fireworks can have a huge and sudden dynamic range, so half of the pics are exposed just right, but the other half is way overexposed.

Now, I think even switching to automatic exposure for every shot wouldn't actually help: besides the exposure flickering, the problem I see is that exposure is measured before shooting, and since fireworks are sudden, the dynamics of the scene can change dramatically between the metering time and the actual shooting time.

Hence, I was thinking that maybe an exposure time based on how many pixels reach their overexposure point during the actual shot would be a workable approach. For example, I set the ISO (or a range thereof), the aperture, and a desired exposure time: the camera tries to take such a shot, but if overexposure (or a certain exposure value) is reached on a percentage of the sensor pixels before the desired exposure time, it stops taking the picture as if the shutter had been released earlier.
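To make the stop-at-threshold idea concrete, here is a toy simulation of it (all rates, sizes, and thresholds below are made-up illustrative values, not any camera's actual behavior):

```python
import numpy as np

def expose_until_clip(scene_rate, max_time, clip_frac=0.01, full_well=1.0, dt=0.001):
    """Simulate an exposure that stops early once a fraction of pixels saturates.

    scene_rate: per-pixel light arrival rate (hypothetical linear units/sec)
    max_time:   desired shutter time in seconds
    clip_frac:  stop once this fraction of pixels reaches full well
    """
    accum = np.zeros_like(scene_rate)
    t = 0.0
    while t < max_time:
        # Integrate light for one time step, clamping at the full-well level.
        accum = np.minimum(accum + scene_rate * dt, full_well)
        t += dt
        if np.mean(accum >= full_well) >= clip_frac:
            break  # "release the shutter" early
    return accum, t

# A dim scene with a sudden, firework-like bright burst in one corner
rate = np.full((100, 100), 0.05)
rate[:10, :10] = 50.0
img, t_stop = expose_until_clip(rate, max_time=2.0)
```

With these numbers the bright corner saturates almost immediately, so the exposure is cut far short of the requested 2 seconds and the dim region stays well below clipping.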

Maybe fancy DSLRs already have this feature? I'm currently using a P&S with CHDK installed: I can do great things (with unsatisfactory IQ and speed performance), but I didn't find such a feature. Technically, it doesn't seem impossible to implement, assuming each pixel can raise an interrupt with an overexposure signal.

Am I dreaming of non-existing features? Does it sound useful/useless to you? It won't be useful in all situations, but at least in those where the subject is suddenly and unpredictably bright (just like fireworks, or lightning).

C&C are very welcome, indeed exactly what I'm asking for ;-)


----------



## KeithBreazeal (Jul 16, 2014)

The approach to this concept is something I have pondered over the years. The imaging device would need to be passively sampled several times during the exposure. This would require a new concept in design and outlying processors to do it. Power requirements and size would be the biggest concerns. I worked on this concept, not for photography but for sonar design and software at the Naval Air Development Center.

The alternate method is less complex and works with today's chip designs. Broadcast-quality video cameras used a 3-chip design to generate the RGB image. This was accomplished with a prism-like design that split the light from the lens into three different directions. The video design could be adopted, but with three full-spectrum imagers. The exposure value would be correct for one imager and the other two would be bracketed. As you may have concluded by now, I'm talking about doing HDR for every image captured. The three chips are required to ensure motion is captured without any timing difference. This is the secret to success. The body would need to be designed around the three chips.
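The three-imager merge can be sketched numerically: all three chips see the same instant through the prism at different effective exposures, and the merge keeps whichever samples are unclipped. This is a generic HDR-merge sketch with an invented weighting function, not any broadcast camera's actual pipeline:

```python
import numpy as np

def merge_three_chip(frames, evs, full_well=1.0):
    """Merge three simultaneous exposures of the same instant.

    frames: list of linear images, one per chip
    evs:    exposure offsets in stops relative to the middle chip, e.g. [-2, 0, +2]
    """
    num = np.zeros_like(frames[0])
    den = np.zeros_like(frames[0])
    for img, ev in zip(frames, evs):
        # Zero weight for clipped samples; a simple hat weight favors mid-tones.
        w = np.where(img >= full_well, 0.0,
                     1.0 - np.abs(2.0 * img / full_well - 1.0))
        radiance = img / (2.0 ** ev)  # normalize back to the middle exposure
        num += w * radiance
        den += w
    return num / np.maximum(den, 1e-9)

# Simulated scene radiance, captured at -2 / 0 / +2 stops at the same moment
scene = np.array([0.02, 0.3, 0.9, 3.0])   # the last value clips on two chips
frames = [np.minimum(scene * (2.0 ** ev), 1.0) for ev in (-2, 0, 2)]
hdr = merge_three_chip(frames, [-2, 0, 2])
```

The brightest sample clips on the middle and overexposed chips, but the underexposed chip still holds it, so the merged result recovers the full scene range.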


----------



## Marsu42 (Jul 16, 2014)

alfredo said:


> Hence, I was thinking that maybe an exposure time based on how many pixels are reaching their overexposed status during the actual shooting would be a working approach.



For that, you'd need the ability to read information from the sensor before the actual data readout; afaik that's not possible with the current design. But I've already wished for this myself: "why not just engage bulb mode and let the camera figure out the rest?"

So for fireworks, you basically have to resort to trial & error and stay on the safe side to prevent white clipping, or the colors are gone. This is tricky for short fireworks w/o previous experience; when in doubt, look at shots around the net and check what expo settings they were using. Last New Year's Eve, I was using 60sec @ iso200 with f8.



alfredo said:


> Few days ago I was trying to make a timelapse of a firework show. I went for a fixed exposure in every shot: well, it turns out fireworks can have a great and sudden dynamic range, so half of the pics are exposed just right, but the other half is way overexposed.



In general, for timelapses use Magic Lantern: it's got built-in expo ramping and deflickering for this very purpose.

For fireworks, it depends on whether you also want part of the dark background visible. In that case, you want as much dynamic range as you can grab, so use Magic Lantern's dual_iso module; if exposing @ iso100, it should also result in a nice blur effect for the fireworks.
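Deflickering in post can also be done generically, by scaling each frame toward a smoothed brightness curve. This is a minimal sketch of that idea, not Magic Lantern's actual algorithm:

```python
import numpy as np

def deflicker(frames, window=5):
    """Scale each linear frame so its mean brightness follows a moving median."""
    means = np.array([f.mean() for f in frames])
    pad = window // 2
    padded = np.pad(means, pad, mode="edge")      # extend edges for the window
    target = np.array([np.median(padded[i:i + window])
                       for i in range(len(frames))])
    return [f * (t / m) for f, m, t in zip(frames, means, target)]

# Five uniform test frames; the middle one is a 1-stop brightness spike
frames = [np.full((4, 4), m) for m in (0.5, 0.5, 1.0, 0.5, 0.5)]
smooth = deflicker(frames)
```

The spike frame gets pulled back to the median brightness of its neighbors, which is exactly what removes exposure flicker from a timelapse.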


----------



## KeithBreazeal (Jul 16, 2014)

This photo is the result of numerous exposure adjustments. I actually found the 'sweet spot' setting of 13 seconds, f11, iso 800. Crazy. I used DxO Pro 9 to dig out the dark areas. I exported to Lightroom and did some dodging & burning with the brush tool.



*Fireworks 3 July 2014* © Keith Breazeal, by Keith Breazeal Photography, on Flickr


----------



## StudentOfLight (Jul 16, 2014)

Pixels can overexpose due to factors other than true exposure: e.g. image noise and hot pixels could cause your camera to underexpose.


----------



## Marsu42 (Jul 16, 2014)

StudentOfLight said:


> Pixels can overexpose due to factors other than true exposure: e.g. image noise and hot pixels could cause your camera to underexpose.



Of course the software can account for that: Magic Lantern does it with the ettr function, where you can set a threshold of allowed clipping. The problem is that it only works after reading the data from the sensor (i.e. in live view, or by analyzing a shot just taken) and thus cannot cope with fireworks.
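That kind of post-readout threshold check amounts to counting clipped samples while discarding a few outliers so hot pixels don't count as real overexposure. A toy version (the threshold and outlier count are illustrative settings, not ML's defaults):

```python
import numpy as np

def exceeds_clip_threshold(raw, white_level, allowed_frac=0.001, ignore_top_n=16):
    """Post-readout clipping check in the spirit of an ETTR threshold.

    ignore_top_n: discard the brightest few samples so isolated hot
    pixels aren't mistaken for genuine overexposure.
    """
    flat = np.sort(raw.ravel())
    if ignore_top_n:
        flat = flat[:-ignore_top_n]  # drop likely hot pixels
    clipped = np.count_nonzero(flat >= white_level)
    return clipped / flat.size > allowed_frac

# A frame with a few hot pixels, then one with a genuinely blown region
rng = np.random.default_rng(0)
frame = rng.uniform(0, 4000, size=(200, 200))
frame[0, :5] = 4095          # isolated hot pixels, ignored by the check
blown = frame.copy()
blown[:50, :50] = 4095       # a real clipped highlight region
```

The hot pixels alone don't trip the threshold, while the blown region does, which is the behavior you'd want from an automatic clipping guard.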


----------



## alfredo (Jul 16, 2014)

Thanks all for the thoughtful and instructive comments.

So yes, it appears we'd need the ability to gather successive readout data while the sensor is actually taking the picture. It seems we agree this is currently neither available in sensors nor supported by camera CPUs. Pity!

Such partial readouts would indeed open the path to a different way of taking pictures. First, one could watch the histogram evolve while the picture is being taken (instructive at the very least). A form of "native" HDR would also become possible, where near-to-clipping pixels would be stored before they clip, while the rest of the pixels keep being exposed until the shutter time is reached (of course this would compress contrast a bit, but in practice no more than HDR is supposed to do, as far as I understand it). One application would be... fireworks indeed!
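The "native HDR" idea can be simulated too: latch each pixel's value and timestamp just before it clips, then after the shutter closes extrapolate the latched value to the full exposure time. Again, every name and number here is hypothetical; no sensor works this way today:

```python
import numpy as np

def native_hdr_exposure(scene_rate, shutter, full_well=1.0, latch_at=0.9, dt=0.001):
    """Simulate latching near-clipping pixels mid-exposure, then rescaling
    their latched values to the full shutter time."""
    accum = np.zeros_like(scene_rate)
    latched = np.full_like(scene_rate, np.nan)
    latch_time = np.zeros_like(scene_rate)
    t = 0.0
    while t < shutter:
        accum += scene_rate * dt
        t += dt
        # Latch pixels crossing the near-clipping level, recording when.
        hit = np.isnan(latched) & (accum >= latch_at * full_well)
        latched[hit] = accum[hit]
        latch_time[hit] = t
    # Unlatched pixels keep the full-length exposure; latched ones are
    # extrapolated linearly to the full shutter time.
    out = np.where(np.isnan(latched), accum, 0.0)
    lat = ~np.isnan(latched)
    out[lat] = latched[lat] * (shutter / latch_time[lat])
    return out

rate = np.array([0.1, 5.0])   # dim sky vs. a firework burst
img = native_hdr_exposure(rate, shutter=1.0)
```

The burst pixel comes out far above the single-exposure full well, yet still in correct linear proportion to the dim pixel, which is the extra dynamic range the partial readouts would buy.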

I'm curious to see if Magic Lantern or CHDK offer at least some ways of partially implementing such ideas. And of course, somebody could find creative uses of this.

BTW: great pictures!


----------

