# Image averaging to reduce noise?



## YellowJersey (Jan 26, 2017)

Anyone else tried this? Was thinking it would be really handy for astro. (or I'm just late to the party) 

https://www.youtube.com/watch?v=4SDgfB9I4As


----------



## Mt Spokane Photography (Jan 26, 2017)

I used software called PhotoAcute, starting in 2008. It combines images to reduce noise and increase sharpness. Like all similar techniques, it has limited use: moving subjects do not work. 

I have not used it recently, and now some of the popular software can do similar things. Originally, I submitted photos taken with various lenses and received a free license after contributing many samples the developer needed. My license does not work with current versions. 

They have stopped supporting it, so only a preview version is available. It is very slow to load and process, and there is probably much better software now. It does reduce noise, but it is slower than DxO.


----------



## Sharlin (Jan 26, 2017)

Yes, for deep sky astro people routinely stack dozens or even hundreds of exposures. Every doubling of the number of frames gives you about half a stop of noise reduction. Newer Canon bodies include a feature called "multi-shot noise reduction" which averages together four frames taken in quick succession, gaining you about a stop.
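The averaging described above can be sketched with a few lines of numpy. This is a toy simulation with made-up numbers (a flat 100-count field with Gaussian noise of 10 counts); it just demonstrates that the residual noise of an N-frame average falls as 1/sqrt(N):

```python
import numpy as np

rng = np.random.default_rng(42)

def stacked_noise(n_frames, signal=100.0, read_noise=10.0, n_pixels=100_000):
    # Simulate n_frames noisy exposures of the same flat field,
    # average them, and return the residual noise of the stack.
    frames = signal + rng.normal(0.0, read_noise, size=(n_frames, n_pixels))
    return frames.mean(axis=0).std()

for n in (1, 2, 4, 16):
    print(f"{n:2d} frames -> noise ~ {stacked_noise(n):.2f}")
```

With these numbers the noise drops from about 10 counts for a single frame to about 5 for four frames and 2.5 for sixteen.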


----------



## LDS (Jan 26, 2017)

YellowJersey said:


> Anyone else tried this? Was thinking it would be really handy for astro. (or I'm just late to the party)
> https://www.youtube.com/watch?v=4SDgfB9I4As



These techniques have been in use in astrophotography for years, ever since digital imaging allowed for better photon capture than film, and easier registration and processing of images. Many amateurs also adapted webcams to telescopes (especially for planetary imaging), stacking thousands of frames to obtain very good images. Of course, you need no changes between the frames - especially changes that cannot be corrected in software.
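The registration step mentioned above (aligning frames before they can be stacked) can be sketched with phase correlation. This is a toy numpy example on a synthetic star field with an exact integer-pixel shift; real tools handle sub-pixel shifts, rotation, and distortion:

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_shift(reference, frame):
    # Phase correlation: the peak of the inverse FFT of the normalized
    # cross-power spectrum gives the integer (dy, dx) translation of
    # `frame` relative to `reference`.
    cross = np.fft.fft2(frame) * np.conj(np.fft.fft2(reference))
    cross /= np.abs(cross) + 1e-12
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peaks past the midpoint to negative shifts.
    h, w = corr.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx

# Synthetic star field, then a copy shifted by (3, -5) pixels.
ref = np.zeros((64, 64))
ref[rng.integers(8, 56, 20), rng.integers(8, 56, 20)] = 1000.0
shifted = np.roll(ref, (3, -5), axis=(0, 1))

dy, dx = estimate_shift(ref, shifted)
aligned = np.roll(shifted, (-dy, -dx), axis=(0, 1))  # undo the shift before stacking
```

Once each frame is aligned to the reference this way, the stack can simply be averaged.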


----------



## rfdesigner (Jan 26, 2017)

In astro there are multiple advantages of using many images.

1. You can stack and reduce noise
2. You can dither the pointing of the telescope a bit, then align the frames onto a finer grid and get better resolution
3. You can deselect artefacts such as satellite trails or aircraft, either by just removing poor frames, or by using statistical stacking: work out the average and standard deviation for each pixel across the set of images, then reject any pixel value beyond, say, +/-3 sigma. Thus you don't throw away any frames, just the few pixels in a frame that have been contaminated.
4. You can do "RGB" imaging with colour filters and a mono or B&W camera.
5. You can do LRGB imaging with colour filters and a mono or B&W camera to get much faster images.
6. You can also do "lucky imaging", mainly for planetary work, which can be used to eliminate "twinkle".
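The statistical stacking in point 3 can be sketched in a few lines of numpy. This is a toy example with invented numbers: twenty frames of a flat 100-count field, one of which carries a bright "satellite trail" across a row:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigma_clip_stack(frames, kappa=3.0):
    # Per-pixel mean and standard deviation across the stack; pixel
    # values more than kappa sigma from the mean (e.g. a satellite
    # trail) are masked out, and the survivors are averaged.
    frames = np.asarray(frames, dtype=float)
    mu = frames.mean(axis=0)
    sigma = frames.std(axis=0)
    mask = np.abs(frames - mu) <= kappa * sigma + 1e-12
    return (frames * mask).sum(axis=0) / mask.sum(axis=0)

# 20 synthetic frames; frame 7 gets a bright trail across row 10.
frames = 100.0 + rng.normal(0.0, 5.0, size=(20, 32, 32))
frames[7, 10, :] += 5000.0

clean = sigma_clip_stack(frames)
```

A plain mean of these frames leaves the trail pixels at several hundred counts, while the clipped stack recovers roughly 100 everywhere, without discarding any whole frame.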

example: RGB: you image 10 frames of Red, 10 of Green and 10 of Blue, then "glue" them together.
example: LRGB: you image 15 frames of B&W (much more sensitive than the colour frames, as you have 3 times the light and 50% more integration time), then you image 5 each of Red, Green and Blue with the camera set to "bin" 3x3 - that is, on the sensor you effectively create much larger pixels before converting and reading out. Because there is now 9x the signal per pixel, the SNR is much better despite the shorter time. (You can't do this with a CMOS chip; you need a CCD.)
Now you make an RGB image from the RGB frames, then you convert to HSI. Now the trick: you throw away the "I" channel (the brightness image) and replace it with the high-resolution, high-SNR "I" frame you built from the 15 B&W frames. Then you convert back to RGB and process as normal.
Notice both the LRGB and RGB examples take the same imaging time, but the LRGB will have substantially better SNR.
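The intensity-swap trick can be sketched without an explicit HSI round trip: if intensity is defined as I = (R+G+B)/3, then replacing I with a new luminance L reduces to rescaling each pixel's RGB triple by L/I, which leaves hue and saturation untouched. A toy numpy sketch with invented stand-in data (the function name and the random arrays are mine, not an established API):

```python
import numpy as np

rng = np.random.default_rng(7)

def lrgb_combine(lum, rgb, eps=1e-9):
    # Luminance replacement with I = (R + G + B) / 3: scaling each
    # pixel's RGB triple by L / I swaps in the high-SNR mono luminance
    # while preserving the colour ratios (hue and saturation).
    intensity = rgb.mean(axis=-1, keepdims=True)
    return rgb * (lum[..., None] / (intensity + eps))

# Hypothetical stand-ins: a noisy binned colour stack, a sharp mono stack.
noisy_rgb = rng.uniform(10.0, 100.0, size=(16, 16, 3))
sharp_lum = rng.uniform(50.0, 200.0, size=(16, 16))

result = lrgb_combine(sharp_lum, noisy_rgb)
```

After the swap, the per-pixel intensity of the result equals the mono luminance exactly, while each pixel keeps the colour ratios of the RGB data.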

Also, once you use filters and a mono-chipped camera you can take full advantage of narrowband filters and image just particular gases, such as hydrogen, then add that data into your image:

google for "hubble palette"

You also take various calibration frames which again you stack.

Take a glance at http://www.astronomie.be/registax/


----------



## YellowJersey (Jan 29, 2017)

I'll just file this one under "late to the party," then.


----------



## rfdesigner (Jan 29, 2017)

YellowJersey said:


> I'll just file this one under "late to the party," then.



Don't beat yourself up about it, we're all late to the party:

http://www.astrosurf.com/buil/us/story/story3.htm


----------



## Don Haines (Jan 29, 2017)

YellowJersey said:


> Anyone else tried this? Was thinking it would be really handy for astro. (or I'm just late to the party)



Yes, you are late to the party, but far more importantly, WELCOME TO THE PARTY!!!!!!!!


----------



## AlanF (Jan 29, 2017)

Sharlin said:


> Yes, for deep sky astro people routinely stack dozens or even hundreds of exposures. Every doubling of the number of frames gives you about half a stop of noise reduction. Newer Canon bodies include a feature called "multi-shot noise reduction" which averages together four frames taken in quick succession, gaining you about a stop.



Naively, I would have thought that stacking two would give a stop, not half a stop, of noise reduction. An extra stop doubles the exposure and so increases S/N by sqrt 2; doubling the number of frames also increases S/N by sqrt 2. Or have I got it wrong?


----------



## scyrene (Jan 29, 2017)

Yes, I use it for astro as it really makes all the difference - it is the number one thing that improves such images imho. You don't *need* any particular camera or lens, or a tracking mount or even a tripod (though obviously they can all help) - but taking a series of images of the same subject, aligning, and stacking reduces noise so as to allow for much more aggressive processing, bringing out faint details.

I have tinkered with it for other purposes. It can actually work very well with static scenes, even if you're handholding - Affinity now has an autostack routine that does a very good job of aligning everything, even when there's some slight discrepancy between frames. The most marginal scenes seem to give the greatest improvement - so using a mobile phone to photograph very dark static scenes, you can produce usable results with stacking, way beyond what would otherwise be possible (press and hold the 'shutter button' and take 20 or more in a very quick burst, and the results can be surprisingly good). The penalty with this tends to be sharpness, as the alignment won't be pixel perfect unless you use a tripod and timer/remote shutter. This is mostly for fun, of course.

I've considered doing it with low light bird shots, but no matter how similar I think two frames are (a static bird, short exposure time, good IS), there's always too much movement (feathers ruffling in the wind etc) to make the result worthwhile. Perhaps with ever faster burst rates, it might start to be possible under some circumstances, however.


----------



## Sharlin (Jan 30, 2017)

AlanF said:


> Sharlin said:
> 
> 
> > Yes, for deep sky astro people routinely stack dozens or even hundreds of exposures. Every doubling of the number of frames gives you about half a stop of noise reduction. Newer Canon bodies include a feature called "multi-shot noise reduction" which averages together four frames taken in quick succession, gaining you about a stop.
> ...



Wait, you're probably right. I was thinking SNR and how it grows as sqrt(num-of-frames) and made an incorrect mental leap. Disregarding thermal noise, you should get the same SNR whether taking one long exposure or dividing it into subframes and stacking afterwards.
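The equivalence claimed above (ignoring read noise) is easy to check with a toy Poisson simulation; the photon rate and times here are made up:

```python
import numpy as np

rng = np.random.default_rng(3)

rate = 5.0           # hypothetical photon rate per pixel (photons/s)
total_time = 160.0   # total integration time in seconds
n_pixels = 200_000

def snr(counts):
    return counts.mean() / counts.std()

# One long exposure: Poisson shot noise over the full integration.
long_exposure = rng.poisson(rate * total_time, size=n_pixels)

# The same total time split into 16 subframes, then summed. Read noise
# is ignored here, matching the caveat above.
stacked = rng.poisson(rate * total_time / 16, size=(16, n_pixels)).sum(axis=0)

print(f"long: SNR ~ {snr(long_exposure):.1f}, stacked: SNR ~ {snr(stacked):.1f}")
```

Both come out at about sqrt(800) ≈ 28, as expected for shot-noise-limited imaging; in practice each subframe also adds its own read noise, which is where the long exposure wins.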


----------

