# DPReviewTV: What is diffraction in photography?



## Canon Rumors Guy (Jan 25, 2021)

> It’s Monday, so it’s time to head to class and today the always educational Don Komarechka shows us what’s behind the dreaded “diffraction” when you’re stopped down significantly with your lens.
> Why do the images lose sharpness? Don shows you a couple of simple experiments to help you understand what is actually a pretty complex phenomenon.
> Check out the video above and give Don a follow on Twitter or Instagram.





----------



## Chaitanya (Jan 25, 2021)

Really good demonstration, especially covering it for macro, where diffraction becomes a big headache very quickly.


----------



## dilbert (Jan 25, 2021)

This will make for interesting reviews of the upcoming high-megapixel Canon RF that is aimed at a branch of photography that typically uses narrow apertures for increased depth of field: landscape photography.


----------



## jolyonralph (Jan 25, 2021)

dilbert said:


> This will make for interesting reviews of the upcoming high-megapixel Canon RF that is aimed at a branch of photography that typically uses narrow apertures for increased depth of field: landscape photography.



It's more of an issue for macrophotography than for landscape photography.


----------



## privatebydesign (Jan 25, 2021)

dilbert said:


> This will make for interesting reviews of the upcoming high-megapixel Canon RF that is aimed at a branch of photography that typically uses narrow apertures for increased depth of field: landscape photography.


No it won't. I have two really big issues with the video. First, he is demonstrating visible diffraction between f/17 and f/96, at which point f/17 looks pretty darn good. Besides, I don't know of any Canon lenses that stop down past f/32.

Second, he implies that more pixels make diffraction more apparent, and that simply isn't true; more magnification makes diffraction more apparent. If you have a 30mp image and retake that image with the same settings with your new 90mp camera, the diffraction in both is the same.


----------



## SteveC (Jan 25, 2021)

privatebydesign said:


> Second, he implies that more pixels make diffraction more apparent, and that simply isn't true; more magnification makes diffraction more apparent. If you have a 30mp image and retake that image with the same settings with your new 90mp camera, the diffraction in both is the same.



Before this turns into a gigantic talking-past-each-other fest, I'll suggest that you are assuming that the images are viewed at the same final size...rather than being 100 percent cropped or something like that. 

But we have quite a number of pixel peepers here.

I believe that at 100 percent crop you will see a difference because you are (effectively) blowing up the image, making the diffraction more apparent. It's the same absolute size, but won't be the same displayed size under those circumstances.


----------



## privatebydesign (Jan 25, 2021)

SteveC said:


> Before this turns into a gigantic talking-past-each-other fest, I'll suggest that you are assuming that the images are viewed at the same final size...rather than being 100 percent cropped or something like that.
> 
> But we have quite a number of pixel peepers here.
> 
> I believe that at 100 percent crop you will see a difference because you are (effectively) blowing up the image, making the diffraction more apparent. It's the same absolute size, but won't be the same displayed size under those circumstances.


Of course, but by the same token, if you made a larger print of the smaller resolution file you would see more diffraction in the lower resolution file. If you don't 'compare' things at the same size, speed, magnification etc. then that is not a *comparison*. It is like saying my car goes faster than yours because mine goes 150 km/h and yours only does 100 mph; it simply isn't true.


----------



## dilbert (Jan 25, 2021)

privatebydesign said:


> Second, he implies that more pixels make diffraction more apparent, and that simply isn't true; more magnification makes diffraction more apparent. If you have a 30mp image and retake that image with the same settings with your new 90mp camera, the diffraction in both is the same.



Yes, the diffraction is the same on both sensors as it is defined by the lens aperture. What changes is the ability for the diffraction to be recorded by the sensor.

A web page with only old cameras is here: Diffraction Limited Photography: Pixel Size, Aperture and Airy Disks (www.cambridgeincolour.com)

... using the 7D, which is the closest to the R5 (4.3µm pixel), setting f/22 shows a very large Airy disc. Compare that to f/1.4. Then there's the interference pattern (not shown by that web page).

The ability to record fine detail drops as the aperture narrows.
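The Airy disc comparison above can be roughed out numerically. A minimal sketch (the 550 nm wavelength and 4.3 µm pixel pitch are illustrative values, not taken from the linked page):

```python
# Back-of-the-envelope Airy disc size vs. pixel pitch.
# Diameter to the first dark ring: d = 2.44 * wavelength * f-number.

def airy_disk_diameter_um(f_number, wavelength_nm=550):
    """Airy disc diameter (first minimum) in micrometres."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

pixel_pitch_um = 4.3  # approximate pitch of the 7D / R5 sensors mentioned above

for n in (1.4, 8, 22):
    d = airy_disk_diameter_um(n)
    print(f"f/{n}: Airy disc ≈ {d:.1f} µm ({d / pixel_pitch_um:.1f}× the pixel pitch)")
```

At f/1.4 the disc is smaller than one pixel; by f/22 it spans several pixels, which is the loss of fine detail being described.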


----------



## usern4cr (Jan 26, 2021)

Could somebody please "illuminate" me on this particular issue?
Look down the barrel of the RF 800mm f11 lens (which has NO adjustable iris blades as it is always "wide open").
It looks like (to me) the light bundle is around 5/8" or so wide, visually.
Now I don't know for a fact how narrow the smallest light bundle is at the circular fixed iris, or pupil. Does anyone know the size of it?
If the fixed pupil is in the range of 5/8" or so, then how could light be appreciably affected by diffraction since that's such a big hole?


----------



## smshore (Jan 26, 2021)

Several problems with this demonstration. First and foremost, light does NOT bend. Any discussion of light that justifies some light property by saying that light "bends" is automatically incorrect. Period. Second, using liquid wave tables or sound waves to describe what the action of electromagnetic radiation is like is not really representative.


----------



## privatebydesign (Jan 26, 2021)

usern4cr said:


> Could somebody please "illuminate" me on this particular issue?
> Look down the barrel of the RF 800mm f11 lens (which has NO adjustable iris blades as it is always "wide open").
> It looks like (to me) the light bundle is around 5/8" or so wide, visually.
> Now I don't know for a fact how narrow the smallest light bundle is at the circular fixed iris, or pupil. Does anyone know the size of it?
> If the fixed pupil is in the range of 5/8" or so, then how could light be appreciably affected by diffraction since that's such a big hole?


The ‘fixed pupil’, apparent aperture opening, is 800/11 in mm; rounded, that is 73mm, or 2.87 inches.

Looking down a lens and trying to estimate anything is an exercise in futility, as the lenses you are looking through change the very view of what it is you think you are seeing.

As for how light is affected by such a ‘large hole’, well, that is just the nature of waves and edges. But as I pointed out previously, the example shown was as extreme as it is possible to get even in a specialized situation and really can’t be replicated in non-macro real world situations. I think few would argue the example at f/2.8 was not sharp, yet that equates to a non-macro real world f/17, more closed down than your 800 f/11!
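The pupil arithmetic above is just the definition of the f-number, sketched here as a one-liner:

```python
# Apparent aperture (entrance pupil) diameter from the f-number definition:
# D = focal length / N.

def entrance_pupil_mm(focal_length_mm, f_number):
    return focal_length_mm / f_number

d = entrance_pupil_mm(800, 11)
print(f"{d:.1f} mm ≈ {d / 25.4:.2f} in")  # ≈ 72.7 mm ≈ 2.86 in
```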


----------



## privatebydesign (Jan 26, 2021)

smshore said:


> Several problems with this demonstration. First and foremost, light does NOT bend. Any discussion of light that justifies some light property by saying that light "bends" is automatically incorrect. Period. Second, using liquid wave tables or sound waves to describe what the action of electromagnetic radiation is like is not really representative.


Gravity bends light.


----------



## AlanF (Jan 26, 2021)

usern4cr said:


> Could somebody please "illuminate" me on this particular issue?
> Look down the barrel of the RF 800mm f11 lens (which has NO adjustable iris blades as it is always "wide open").
> It looks like (to me) the light bundle is around 5/8" or so wide, visually.
> Now I don't know for a fact how narrow the smallest light bundle is at the circular fixed iris, or pupil. Does anyone know the size of it?
> If the fixed pupil is in the range of 5/8" or so, then how could light be appreciably affected by diffraction since that's such a big hole?


I don’t know if this helps but I did post a thread on diffraction.




Diffraction, Airy Disks and implications (www.canonrumors.com): "Two points of light are clearly resolved when they are separated by distances that are much larger than the radius of the disk. As the separation decreases, the disks start overlapping and the resolution decreases. When the separation is the same as the radius, the disks have coalesced. Closer..."


----------



## smshore (Jan 26, 2021)

privatebydesign said:


> Gravity bends light.



The force of gravity on light is so weak that the only observation of light bending is when it travels past a star or black hole in space. There is nowhere on earth that anyone or any instrumentation will ever see light bend. Also, the Einsteinian prediction of the bending of light is actually a confirmation of the curvature of space predicted by Einstein. You should read about Einstein and Eddington and the extent that Eddington, Dyson, and Crommelin had to go to in 1919 to see the effect of light bending around our sun. So, light bending inside your camera is something you will never see.


----------



## Joules (Jan 26, 2021)

smshore said:


> There is nowhere on earth that anyone or any instrumentation will ever see light bend.


What exactly do you mean when you write of light bending? Are you saying that light behaves like a particle, traveling in a straight line from its source until it strikes a surface?

The common conceptual explanation of how light is not (just) a particle and how diffraction works is the double-slit experiment. You can easily find tons of material on it, often with pictures that show the difference between 'particle' (no bending) and 'wave' (bending, in the sense that there does not need to be a straight line of uninterrupted medium between the source of the light and the surface it hits), like this:

What you would expect from a particle: [image omitted]

What the experiment actually produces: [interference pattern image omitted]

So, are you saying the lower picture is misleading, or is the effect it shows not what you refer to with the term bending? In the former case, how is the diffraction pattern explained? Do you have any good material for an alternate explanation?

Edit: I hadn't watched the video previously. The laser beam producing a disk instead of a point is not a double slit, but it shows the same kind of 'bending'.


----------



## Del Paso (Jan 26, 2021)

To me, the only question which really matters is:
Will an f/16 picture taken with a 90MP camera be VISIBLY better than the same picture taken at f/16 with a 30MP camera?
DLA of the 90MP camera should be around f/4.5, if I'm not mistaken.
Any thoughts would be welcome!
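The 4.5 figure can be sanity-checked. One common rule of thumb (the exact threshold varies between DLA calculators) puts the diffraction-limited aperture where the Airy disc radius reaches the pixel pitch; the megapixel counts below are the hypothetical ones from this thread:

```python
import math

# Rough DLA estimate: the N at which the Airy disc radius
# (1.22 * wavelength * N) equals the pixel pitch. Treat these as
# ballpark numbers only; calculators use slightly different thresholds.

def pixel_pitch_um(megapixels, width_mm=36.0, height_mm=24.0):
    """Pixel pitch in micrometres for a full-frame sensor."""
    return math.sqrt(width_mm * height_mm / (megapixels * 1e6)) * 1000.0

def dla(megapixels, wavelength_um=0.55):
    return pixel_pitch_um(megapixels) / (1.22 * wavelength_um)

for mp in (30, 90):
    print(f"{mp} MP: pitch {pixel_pitch_um(mp):.2f} µm, DLA ≈ f/{dla(mp):.1f}")
```

Under these assumptions a 90 MP full-frame sensor works out to roughly f/4.6, consistent with the ~4.5 quoted above, versus roughly f/8 for 30 MP.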


----------



## privatebydesign (Jan 26, 2021)

smshore said:


> The force of gravity on light is so weak that the only observation of light bending is when it travels past a star or black hole in space. There is nowhere on earth that anyone or any instrumentation will ever see light bend. Also, the Einsteinian prediction of the bending of light is actually a confirmation of the curvature of space predicted by Einstein. You should read about Einstein and Eddington and the extent that Eddington, Dyson, and Crommelin had to go to in 1919 to see the effect of light bending around our sun. So, light bending inside your camera is something you will never see.


You said, and I quote “_Any discussion of light that justifies some light property by saying that light "bends" is automatically incorrect. Period._” That is inaccurate, period.

But that was referring to life in general; now you limit that statement to “_So, light bending inside your camera is something you will never see._”, and of course this again is false. What effect is there on light when it goes from one medium to another? Say, air to a piece of glass, or two pieces of glass with different refractive properties? The kind of stuff you’d find inside a lens..... It bends.


----------



## dilbert (Jan 26, 2021)

Del Paso said:


> To me, the only question which really matters is:
> Will an f/16 picture taken with a 90MP camera be VISIBLY better than the same picture taken at f/16 with a 30MP camera?



How will you be looking at the image?

90MP reduced to HD screen size (1920x1080) is going to be much the same as 30MP reduced to HD.

90MP will give you more room to crop to get that 1920x1080 but few crop.

90MP will give you bigger prints at the same DPI, but few print.

If you zoom in for some 1:1 pixel peeping, then what you find will depend on the interpolation algorithm used by the raw image conversion that tries to work out what color a given pixel should be based on known sensor pixel size, whether it is red/green/blue being processed, focal length, neighbors, etc.


----------



## Joules (Jan 26, 2021)

Del Paso said:


> To me, the only question which really matters is:
> Will an f/16 picture taken with a 90MP camera be VISIBLY better than the same picture taken at f/16 with a 30MP camera?
> DLA of the 90MP camera should be around f/4.5, if I'm not mistaken.
> Any thoughts would be welcome!


They will look identical when you view them at full size. Once you start zooming in (the same amount on both images), the lower MP one will start to look worse, as it will show pixels and debayering artifacts or moiré more easily.

The sensor resolution does not affect how apparent diffraction is. It just is one of the two limits to the detail you can capture - the other is the diffraction, which is a property of the lens (+ settings).

The 'concern' about using a high MP sensor and a narrow aperture is simply that you aren't getting the full amount of detail the sensor on its own is capable of. But in comparison to a lower MP sensor, that is absolutely not a disadvantage, as that sensor does not have the capability to sample as much detail in the first place.
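The point that a higher-MP sensor is never a disadvantage can be illustrated with a crude blur model. Combining diffraction blur and pixel blur in quadrature is a common approximation, not an exact optics result, and the pixel pitches below are illustrative:

```python
import math

# Crude system-blur model: combine Airy disc diameter and pixel pitch
# in quadrature (a rule-of-thumb approximation, not exact optics).

def system_blur_um(f_number, pixel_pitch_um, wavelength_um=0.55):
    diffraction = 2.44 * wavelength_um * f_number  # Airy disc diameter, µm
    return math.hypot(diffraction, pixel_pitch_um)

# The same f/16 shot on a ~30 MP FF sensor (~5.4 µm pitch)
# and a ~90 MP one (~3.1 µm pitch):
for pitch in (5.4, 3.1):
    print(f"{pitch} µm pixels at f/16: total blur ≈ {system_blur_um(16, pitch):.1f} µm")
```

In this model the finer-pitched sensor always produces equal or less total blur, so stopping down costs the high-MP body some of its advantage but never makes it worse than the low-MP one.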


----------



## privatebydesign (Jan 26, 2021)

Del Paso said:


> To me, the only question which really matters is:
> Will an f/16 picture taken with a 90MP camera be VISIBLY better than the same picture taken at f/16 with a 30MP camera?
> DLA of the 90MP camera should be around f/4.5, if I'm not mistaken.
> Any thoughts would be welcome!


Yes, it will.


----------



## dilbert (Jan 26, 2021)

privatebydesign said:


> But that was referring to life in general; now you limit that statement to “_So, light bending inside your camera is something you will never see._”, and of course this again is false. What effect is there on light when it goes from one medium to another? Say, air to a piece of glass, or two pieces of glass with different refractive properties? The kind of stuff you’d find inside a lens..... It bends.



If "camera" is used here to refer to "camera body" then it is technically correct (except if the rear of the lens has protruded back into the cavity.) Otherwise, yes, you're right. Light bends, and different colours bend by different amounts. Hence rainbows.


----------



## gruhl28 (Jan 26, 2021)

smshore said:


> Several problems with this demonstration. First and foremost, light does NOT bend. Any discussion of light that justifies some light property by saying that light "bends" is automatically incorrect. Period. Second, using liquid wave tables or sound waves to describe what the action of electromagnetic radiation is like is not really representative.


If light doesn't bend, then what do lenses do, what is refraction? What would you say diffraction is caused by if not light bending? How would you explain interference patterns if light doesn't bend? For many situations, electromagnetic waves do behave analogously to liquid waves and sound waves.


----------



## Joules (Jan 26, 2021)

dilbert said:


> If "camera" is used here to refer to "camera body" then it is technically correct (except if the rear of the lens has protruded back into the cavity.)


That would be cherry picking the meaning though. Keep in mind the poster previously said this:



smshore said:


> There is nowhere on earth that anyone or any instrumentation will ever see light bend.


Which makes it quite clear that there must be some misunderstanding when it comes to the use of the word 'bend'. Unless the poster also temporarily forgot about all the devices we use day to day to bend light.


----------



## usern4cr (Jan 26, 2021)

AlanF said:


> I don’t know if this helps but I did post a thread on diffraction.
> 
> 
> 
> ...


Thanks for the post, AlanF, but after reading the thread I didn't see any of the comments that would get into what the pupil size is for a long telephoto lens. If there was a link to a formula that would show this then I missed it. As the focal length becomes extremely long such as on a 10" wide 2500mm f10 telescope (without an adjustable iris/aperture blades), it must have a very large pupil diameter, on the order of an inch or so. I can't imagine that light going through a 1" wide hole can have any appreciable diffraction at all since the vast majority of the light is nowhere near the iris edge. That leads me to believe that the f# that causes appreciable diffraction should increase with the focal length of the lens. That's what I want to know as it would make all the difference when people talk about diffraction limits on long lenses.


----------



## privatebydesign (Jan 26, 2021)

dilbert said:


> If "camera" is used here to refer to "camera body" then it is technically correct (except if the rear of the lens has protruded back into the cavity.) Otherwise, yes, you're right. Light bends, and different colours bend by different amounts. Hence rainbows.


Actually even technically it isn't correct, as all sensors in cameras are stacks with some form of glass on top of them. Yes, the light is traveling in a pretty collimated path by then, but the edges have to bend more. Hmm, I wonder if that is why the performance of all lenses drops off the further from center you go.....

Almost like we are discovering stuff that has been known for years!

And you can never have too many rainbows


----------



## privatebydesign (Jan 26, 2021)

usern4cr said:


> Thanks for the post, AlanF, but after reading the thread I didn't see any of the comments that would get into what the pupil size is for a long telephoto lens. If there was a link to a formula that would show this then I missed it. As the focal length becomes extremely long such as on a 10" wide 2500mm f10 telescope (without an adjustable iris/aperture blades), it must have a very large pupil diameter, on the order of an inch or so. I can't imagine that light going through a 1" wide hole can have any appreciable diffraction at all since the vast majority of the light is nowhere near the iris edge. That leads me to believe that the f# that causes appreciable diffraction should increase with the focal length of the lens. That's what I want to know as it would make all the difference when people talk about diffraction limits on long lenses.


I don't understand what you are calling a "pupil diameter"; the diameter of the apparent aperture is the focal length in mm divided by the f-number. Your 2,500mm f/10 telescope has an apparent aperture opening of 2500/10 = 250mm diameter.

But my understanding is that diffraction is a function of f-value (and magnification), not apparent aperture size. Well, obviously it is more complicated than that: it is a function of Airy disc size in relation to magnification, which is why diffraction changes with the same lens on a different sized sensor.


----------



## usern4cr (Jan 26, 2021)

privatebydesign said:


> The ‘fixed pupil’, apparent aperture opening, is 800/11 in mm; rounded, that is 73mm, or 2.87 inches.
> 
> Looking down a lens and trying to estimate anything is an exercise in futility, as the lenses you are looking through change the very view of what it is you think you are seeing.
> 
> As for how light is affected by such a ‘large hole’, well, that is just the nature of waves and edges. But as I pointed out previously, the example shown was as extreme as it is possible to get even in a specialized situation and really can’t be replicated in non-macro real world situations. I think few would argue the example at f/2.8 was not sharp, yet that equates to a non-macro real world f/17, more closed down than your 800 f/11!


The "entrance pupil" is 800/11 = 73mm. The entrance pupil is the diameter of the central unobstructed light bundle coming in _without considering _any bending of the lens. But the lens bends/focuses the light so that it is much smaller as it reaches the iris/aperture blades. So the diameter of the pupil at the iris (which I called the "pupil" or "fixed pupil") is much smaller than the "entrance pupil". It is the diameter of this "pupil" that I'm interested in.

I agree that looking down the lens can be misleading, but I wouldn't call it futile since it is to be expected to act that way.

What I am trying to get at is how f11 on a 15mm lens will act vs on a 100mm lens or on a 2500mm lens, focused at infinity for simplicity of comparison. I would expect it to behave drastically differently as far as the amount of noticeable diffraction it causes. I noticed on your additional post that diffraction is a function of f# and _magnification_. _That's_ getting into what I'm talking about.


----------



## Ian K (Jan 26, 2021)

privatebydesign said:


> No it won't. I have two really big issues with the video. First, he is demonstrating visible diffraction between f/17 and f/96, at which point f/17 looks pretty darn good. Besides, I don't know of any Canon lenses that stop down past f/32.


Pretty much all of the Canon lenses that can take a doubler will become f/64 with the 2x extender attached.


----------



## Joules (Jan 26, 2021)

usern4cr said:


> What I am trying to get at is how f11 on a 15mm lens will act vs on a 100mm lens or on a 2500mm lens, focused at infinity for simplicity of comparison. I would expect it to behave drastically differently as far as the amount of noticeable diffraction it causes. I noticed on your additional post that diffraction is a function of f# and _magnification_. _That's_ getting into what I'm talking about.


My understanding is that f-number (as in, 24-70 mm f/4.0) is all you need to be concerned with when you want to judge how apparent diffraction will be at a given digital magnification. The last part there is just saying that it gets more apparent as you zoom into your picture.

But the magnification that is actually affecting the Airy disk size is already baked into the f-number, as that combines physical aperture and focal length.

The other magnification that can make things confusing is the distance to your subject, but you were talking about focus at infinity. At which, based on my understanding, you should expect no difference in the amount of diffraction blur you see regardless of focal length. Only f-number matters. 

At close distances, the aperture on the lens is not the effective aperture anymore, which means distance comes into play as well. That was part of the video in the OP.
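Both points above can be sketched roughly: the Airy disc size at the sensor depends only on the f-number regardless of focal length, and close focus raises the effective f-number. The second formula is a thin-lens simplification that assumes a pupil magnification of 1:

```python
# Airy disc diameter at the sensor depends only on the f-number:
# d = 2.44 * wavelength * N, regardless of focal length.
WAVELENGTH_UM = 0.55  # green light

def airy_um(n):
    return 2.44 * WAVELENGTH_UM * n

for focal_mm in (15, 100, 2500):
    print(f"{focal_mm} mm at f/11: Airy disc ≈ {airy_um(11):.1f} µm")  # same for all

# At close focus the effective f-number grows with magnification m
# (simplified model: N_eff = N * (1 + m), pupil magnification assumed 1).
def effective_f_number(n, m):
    return n * (1 + m)

print(effective_f_number(8, 1.0))  # a lens set to f/8 at 1:1 acts like f/16
```

This is why extreme macro reaches effective apertures like the f/96 demonstrated in the video even though no lens is marked with such a value.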


----------



## privatebydesign (Jan 26, 2021)

Ian K said:


> Pretty much all of the canon lenses that can take a doubler will become f/64 with the 2x extender attached.


Good point, though I wonder how many people are buying teles, especially fast and expensive ones, and then using them at f/32?


----------



## Nemorino (Jan 26, 2021)

privatebydesign said:


> I don't know of any Canon lenses that stop down past f/32


Afaik the 100-400 stops down to f/38 at the long end, and the two TS-E macro lenses (90mm and 135mm) go even further, down to f/45.


----------



## usern4cr (Jan 26, 2021)

Joules said:


> My understanding is that f-number (as in, 24-70 mm f/4.0) is all you need to be concerned with when you want to judge how apparent diffraction will be at a given digital magnification. The last part there is just saying that it gets more apparent as you zoom into your picture.
> 
> But the magnification that is actually affecting the Airy disk size is already baked into the f-number, as that combines physical aperture and focal length.
> 
> ...


Wow - after looking it up online, it does seem that you're right - it's just so surprising to me. I guess the future R5s might not see much better photos than the R5 at high f-numbers for any focal length after all. I guess that might give more "value" to high-IQ wide-open lenses like the 85mm f/1.2 than they currently have for the R5, since you might really need exceptionally high IQ and low f-number lenses to see a big benefit from the R5s's doubled MP count. I wonder if it'll come with quad pixel AF, or if they'll only have that (initially) in the R1 (at lower MP counts) as a big selling point for the R1?


----------



## Joules (Jan 26, 2021)

usern4cr said:


> Wow - After looking it up online, it does seem that you're right - it's just so surprising to me. I guess the future R5s will not see much better photos than the R5 at high f#'s for any focal length lenses after all. I guess that might give more "value" to the high IQ wide open lenses like the 85mm f1.2 than they currently have for the R5 since you might really need exceptionally high IQ & low f# lenses to really see a big benefit. I wonder if it'll come with quad pixel AF, or if they'll only have that (initially) in their R1 (at lower MP's) as a big selling point for the R1?


The really wide apertures like f/2.0 and wider are typically not used in conditions that you would associate with a desire for maximum detail though. So I don't know if the 85mm f/1.2 is such a great example.

But I don't feel like the R5s (high-res R) is all that threatened yet. It is essentially just the same pixel density as the 90D, if it actually is ~90 MP. The R5 _begins_ to show softening at f/9.0 and a 90 MP FF body at f/6.3, according to the PhotoPills calculator. That's still an absolutely common aperture for wildlife, where all that extra detail and room for cropping will be very much appreciated.

And in landscape photography, if you are concerned about maximizing detail, you should focus stack anyway, making it okay to shoot with a wider aperture.

Also worth noting is that you can actually reduce the blur quite a bit using deconvolution techniques - for example with Photoshop's Smart Sharpen or Canon's DLO. Yes, it can't restore detail actually lost due to overlapping Airy disks. But it can make details more visible that were no longer visually identifiable.

Another thing to consider is that it would actually be ideal to be diffraction limited all the time. As it is right now, AA (or low-pass) filters are essentially necessary to introduce some blur artificially. And there's false detail in our images anyway due to having sensors that are not capable of fully resolving what the lenses do. This is again mostly relevant for wildlife, especially birds, where fine detail and patterns in the feathers tend to get messed up with aliasing and artifacts.

The other way to think about high resolution is like this: for each expensive lens you buy, as long as you are not using a body that is diffraction limited with this lens, you are not getting the most value for your money. Not a big deal if detail isn't all that important, but for those expensive big whites, I think diffraction is something worth keeping in the back of your head.


----------



## AlanF (Jan 26, 2021)

Joules said:


> The really wide apertures like f/2.0 and wider are typically not used in conditions that you would associate with a desire for maximum detail though. So I don't know if the 85mm f/1.2 is such a great example.
> 
> But I don't feel like the R5s (high-res R) is all that threatened yet. It is essentially just the same pixel density as the 90D, if it actually is ~90 MP. The R5 _begins_ to show softening at f/9.0 and a 90 MP FF body at f/6.3, according to the PhotoPills calculator. That's still an absolutely common aperture for wildlife, where all that extra detail and room for cropping will be very much appreciated.
> 
> ...


In theory you can restore detail lost to overlapping Airy discs if you know the point spread function. The higher the pixel density, the better my 400 f/4 performs for resolution relative to the 100-400 f/5.6 or 100-500mm f/7.1. As the resolution of the sensor gets higher and higher, what determines resolution ends up being the diameter of the entrance pupil of the lens and not the focal length or f number.
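The point about the entrance pupil setting the ultimate resolution limit is the Rayleigh criterion. A sketch, with pupil diameters chosen to roughly match the lenses mentioned above (the 70 mm figure is an approximation):

```python
import math

# Rayleigh criterion: the smallest resolvable angle depends on the
# entrance pupil diameter D, not the f-number: theta = 1.22 * wavelength / D.

def rayleigh_arcsec(pupil_mm, wavelength_nm=550):
    theta_rad = 1.22 * (wavelength_nm * 1e-9) / (pupil_mm * 1e-3)
    return math.degrees(theta_rad) * 3600.0

# 400mm f/4 (100 mm pupil) vs. the 100-500mm at 500mm f/7.1 (~70 mm pupil):
for pupil in (100, 70):
    print(f"{pupil} mm pupil: ≈ {rayleigh_arcsec(pupil):.2f} arcsec")
```

Once the sensor out-resolves both lenses, the larger pupil resolves finer angular detail on a distant subject, whatever the focal lengths involved.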


----------



## Joules (Jan 26, 2021)

AlanF said:


> In theory you can restore detail lost to overlapping Airy discs if you know the point spread function.


But in practice, we don't know it and there are other, more practical techniques of imaging past the diffraction limit from what I understand. Though the ones I'm thinking of are also not practical for everyday photography.



AlanF said:


> what determines resolution ends up being the diameter of the entrance pupil of the lens and not the focal length or f number.


What's the difference between these? Isn't f-number just the result of focal length/entrance pupil?

Edit: Do you mean resolution in the sense of the smallest physical detail you can make out from a given distance, for example details on the moon? Then of course you are right, physical aperture is the key factor there.


----------



## usern4cr (Jan 26, 2021)

Joules said:


> But in practice, we don't know it and there are other, more practical techniques of imaging past the diffraction limit from what I understand. Though the ones I'm thinking of are also not practical for everyday photography.
> 
> 
> What's the difference between these? Isn't f-number just the result of focal length/entrance pupil?
> ...


From what I've read, the entrance pupil size limits the possible angular detail (e.g. for astronomy), but it is the f-number that limits the Airy disc size, not the entrance pupil alone.

As far as getting the most resolution past the Airy disc size, I don't use Adobe products so I can't take advantage of the techniques you mention, but the DxO PL4 Prime output does give me some remarkable output for almost no effort - but I don't know how it compares with what you can do.


----------



## dilbert (Jan 27, 2021)

privatebydesign said:


> Actually even technically it isn't correct, as all sensors in cameras are stacks with some form of glass on top of them. Yes, the light is traveling in a pretty collimated path by then, but the edges have to bend more. Hmm, I wonder if that is why the performance of all lenses drops off the further from center you go.....



Ah, I forgot about that. Yes, the part of the sensor directly under the lens would perform better than the parts at the edges.


----------



## AlanF (Jan 27, 2021)

Joules said:


> But in practice, we don't know it and there are other, more practical techniques of imaging past the diffraction limit from what I understand. Though the ones I'm thinking of are also not practical for everyday photography.
> 
> 
> What's the difference between these? Isn't f-number just the result of focal length/entrance pupil?
> ...


Yes, I mean the smallest detail that can be resolved, as in your edit.


----------



## AlanF (Jan 27, 2021)

usern4cr said:


> From what I've read, the entrance pupil size limits the possible angular detail (e.g. for astronomy), but it is the f-number that limits the Airy disc size, not the entrance pupil alone.
> 
> As far as getting the most resolution past the Airy disc size, I don't use Adobe products so I can't take advantage of the techniques you mention, but the DxO PL4 Prime output does give me some remarkable output for almost no effort - but I don't know how it compares with what you can do.


I find that DxO PL4 gives me better resolution than the Adobe products.


----------



## usern4cr (Jan 27, 2021)

AlanF said:


> I find that DxO PL4 gives me better resolution than the Adobe products.


Really? Wow - that's sooo great to hear, since I'm a fellow PL4 user.

I've always been completely amazed at how the "Prime" output could take my grainy (but sharp) Olympus M43 photos and output them with super smooth beautiful backgrounds with great detail. Now I can do the same with my even better R5 photos, and not feel like I'm missing out since I avoid the Adobe products.


----------



## AlanF (Jan 27, 2021)

usern4cr said:


> Really? Wow - that's sooo great to hear, since I'm a fellow PL4 user.
> 
> I've always been completely amazed at how the "Prime" output could take my grainy (but sharp) Olympus M43 photos and output them with super smooth beautiful backgrounds with great detail. Now I can do the same with my even better R5 photos, and not feel like I'm missing out since I avoid the Adobe products.


Unfortunately, DxO doesn't have a module for the 100-500mm yet and so you can't use their lens sharpness tool. I am using Topaz to sharpen as it's much better than Adobe's tools. The 100-400mm II, 400mm DO II DxO modules work with the R5 and sharpen up nicely. DxO are a pain being slow with new models, but it makes you appreciate it when you can use it again.


----------



## stevelee (Jan 27, 2021)

privatebydesign said:


> No it won't. I have two really big issues with the video, first, he is demonstrating a visual diffraction between f17 and f96, at which point f17 looks pretty darn good. Besides, I don't know any Canon lenses that stop down past f32.


Just yesterday I was taking some shots of the woods with the 100–400mm II and messing around with Av mode to vary depth of field. I took one shot at f/36 for grins. The variable aperture works on both ends, so you can get f/36 at 400mm but not 100mm.


----------



## privatebydesign (Jan 27, 2021)

stevelee said:


> Just yesterday I was taking some shots of the woods with the 100–400mm II and messing around with Av mode to vary depth of field. I took one shot at f/36 for grins. The variable aperture works on both ends, so you can get f/36 at 400mm but not 100mm.


And how much did the diffraction destroy your image?


----------



## stevelee (Jan 28, 2021)

privatebydesign said:


> And how much did the diffraction destroy your image?


Not nearly as much as the noise did at ISO 12,800. The result, when resized for posting, was good enough that I put it in the winter thread.

A 100% crop, however, shows a lot of noise.




For comparison, here is a 100% crop from a shot at f/5. The much smaller depth of field is the reason for what is unsharp in it. It was shot at ISO 640.


----------



## stevelee (Jan 28, 2021)

These and two others are posted at https://www.canonrumors.com/forum/threads/winter-2020-2021.39629/page-6#post-880544


----------



## dilbert (Jan 28, 2021)

privatebydesign said:


> But my understanding is diffraction is a function of f value (and magnification) not apparent aperture size. Well obviously it is more complicated than that, it is a function of airy disc size in relation to magnification, that is why diffraction changes with the same lens on a different sized sensor.



The Airy disc ("Airy" is a proper noun, not an adjective, as it is named after a person) is a result of diffraction. The size of the Airy disc is a function (or result) of diffraction, which is related to the f-number.


----------



## Sporgon (Jan 28, 2021)

dilbert said:


> The size of the Airy Disc is a function (or result) of diffraction, which is related to the f-number.


The size is also affected by the wavelength of light: red, being longer, produces a larger disc and slightly worse diffraction. That's an unfortunate aspect of physics, as landscape photography is often at its best during the 'golden hour'.
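To put rough numbers on that, the first-minimum diameter of the Airy disc is approximately 2.44 × wavelength × f-number. A quick sketch (the wavelengths below are just illustrative values for green and red light):

```python
# Rough Airy disc size (first minimum): d = 2.44 * wavelength * f-number.
# Wavelengths below are illustrative values for green (~550 nm) and red (~650 nm) light.
def airy_disc_diameter_um(wavelength_nm, f_number):
    """Approximate Airy disc diameter in micrometres."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

for f in (5.6, 11, 22):
    green = airy_disc_diameter_um(550, f)
    red = airy_disc_diameter_um(650, f)
    print(f"f/{f}: green ~{green:.1f} um, red ~{red:.1f} um")
```

At f/11 the disc is already around 15 µm for green light and larger still for red, which is several pixels wide on a high-resolution sensor.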


----------



## johnhenry (Jan 28, 2021)

Armchair scientists like him REALLY gloss over the math behind the optical effect of diffraction.


First off, the math is truly awful. You need to dig into complex math concepts like Fourier Transforms before you even begin to understand the hows and whys of this effect.


In the simplest of cases, using a drastically simplified model, your subject consists of white and black areas. The wavefront originates there, with all subjects at infinity for the sake of argument.

As the lens focuses this light and it passes through the aperture, the aperture imposes certain characteristics on the light that passes through to become an image at the focal plane.

The sharp edges of the diaphragm diffract light from the bright regions into the dark ones, but because this is a wave phenomenon, there ends up being more light in some regions and less in others.

With the lens wide open, lens aberrations are more pronounced. As the lens is stopped down more and more, this diffraction becomes more noticeable, leading to unsharp transitions from light to dark areas.
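The wave behaviour described above is exactly what a Fourier transform captures: the far-field diffraction pattern is (up to scaling) the Fourier transform of the aperture. A toy 1-D sketch, with a hand-rolled DFT so it needs nothing beyond the standard library, showing that a narrower (more stopped-down) slit spreads light into a broader pattern:

```python
import cmath

def diffraction_pattern(aperture):
    """Far-field intensity |DFT(aperture)|^2 of a 1-D aperture.
    A toy illustration that the diffraction pattern is (up to scaling)
    the Fourier transform of the aperture function."""
    n = len(aperture)
    out = []
    for k in range(n):
        s = sum(aperture[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                for j in range(n))
        out.append(abs(s) ** 2)
    return out

def slit(width, n=64):
    """A centred 1-D slit of the given width, padded to n samples."""
    a = [0.0] * n
    start = (n - width) // 2
    for j in range(start, start + width):
        a[j] = 1.0
    return a

wide = diffraction_pattern(slit(16))    # wide open
narrow = diffraction_pattern(slit(4))   # stopped down
# The narrow slit pushes far more energy into higher spatial frequencies,
# i.e. a broader, blurrier diffraction pattern.
```

This is only a 1-D sketch; a real circular aperture gives the 2-D Airy pattern, but the inverse relationship between aperture size and pattern width is the same.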


----------



## AlanF (Jan 29, 2021)

johnhenry said:


> Armchair scientists like him REALLY gloss over the math behind the optical effect of diffraction.
> 
> 
> First off, the math is truly awful. You need to dig into complex math concepts like Fourier Transforms before you even begin to understand the hows and whys of this effect.
> ...


Although the Fourier transform is the standard procedure for analysing diffraction patterns, you don't need it to understand diffraction. Sir Lawrence Bragg, the founding father of solving the structures of crystals, including proteins, by X-ray diffraction, for which he got the Nobel Prize in 1915, formulated Bragg's law from the simplest of equations https://en.wikipedia.org/wiki/Lawrence_Bragg (As an aside, I met him in the 1970s, and isn't it remarkable that people alive today could have met someone who won a Nobel prize 106 years ago?) Airy solved the equations for Airy disc diffraction using just differential calculus, without Fourier transforms, and published the result in the Transactions of the Cambridge Philosophical Society in 1834. I recommend downloading the volume https://archive.org/details/transactionsofca05camb as it contains some remarkable papers.
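Bragg's law really is that simple: n·λ = 2d·sin θ. A minimal sketch solving it for the diffraction angle (the wavelength and spacing below are just illustrative values, roughly Cu K-alpha X-rays and a hypothetical 0.5 nm lattice spacing):

```python
import math

# Bragg's law: n * wavelength = 2 * d * sin(theta).
# Illustrative values only: roughly Cu K-alpha X-rays (~0.154 nm)
# and a hypothetical 0.5 nm lattice spacing.
def bragg_angle_deg(wavelength_nm, d_spacing_nm, order=1):
    """Diffraction angle theta, in degrees, from Bragg's law."""
    return math.degrees(math.asin(order * wavelength_nm / (2.0 * d_spacing_nm)))

theta = bragg_angle_deg(0.154, 0.5)
```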


----------



## Joules (Jan 29, 2021)

johnhenry said:


> Armchair scientists like him REALLY gloss over the math behind the optical effect of diffraction.
> 
> First off, the math is truly awful. You need to dig into complex math concepts like Fourier Transforms before you even begin to understand the hows and whys of this effect.


r/iamverysmart


----------



## AlanF (Jan 29, 2021)

Joules said:


> r/iamverysmart


Perhaps we could get Craig's phrase substitution to incorporate your interpretation. There are two extreme types of physicists: those who develop theories by sheer force of mathematics, and those who think their way into a problem using intuition and visualisation, and then find the maths to continue. Einstein and Feynman are examples of the latter. Schwinger, who shared the Nobel Prize with Feynman, is an example of the former. I need to understand in my mind what is going on, and I am a pretty poor mathematician.


----------



## Joules (Jan 29, 2021)

AlanF said:


> Perhaps we could get Craig's phrase substitution to incorporate your interpretation. There are two extreme types of physicists: those who develop theories by sheer force of mathematics, and those who think their way into a problem using intuition and visualisation, and then find the maths to continue. Einstein and Feynman are examples of the latter. Schwinger, who shared the Nobel Prize with Feynman, is an example of the former. I need to understand in my mind what is going on, and I am a pretty poor mathematician.


I can't say that I quite understand your first sentence. But just in case somebody took offence, I was just poking fun at the poster for treating glossing over the 'truly awful' maths as a criticism, essentially ignoring that there is value in understanding the effect something has, even if its origin and inner workings remain a mystery. After all, Newtonian physics is still highly relevant despite not accurately describing why things like gravity do what they do. Nothing too serious, I just found it funny.

On another note, isn't Einstein's work on relativity more an example of predictions beginning in math and only making their way to experimental and visual validation later on? That's just what I thought based on some tidbits of information I've heard over time.


----------



## AlanF (Jan 29, 2021)

Joules said:


> I can't say that I quite understand your first sentence. But just in case somebody took offence, I was just poking fun at the poster for treating glossing over the 'truly awful' maths as a criticism, essentially ignoring that there is value in understanding the effect something has, even if its origin and inner workings remain a mystery. After all, Newtonian physics is still highly relevant despite not accurately describing why things like gravity do what they do. Nothing too serious, I just found it funny.
> 
> On another note, isn't Einstein's work on relativity more an example of predictions beginning in math and only making their way to experimental and visual validation later on? That's just what I thought based on some tidbits of information I've heard over time.


It was the way he set about it. For example, for Special Relativity he began by imagining what would happen to time etc. if he were sitting on a beam of light, observing events, then thinking his way through the consequences, and only then getting involved in more and more complex mathematics and General Relativity. He once said, "In physics, imagination is more important than knowledge." I think this was a riposte to Max Planck, who had earlier written that "experiment is the only means of knowledge at our disposal, the rest is poetry, imagination" (both quoted from memory). The opposite approach is to start with equations and solve them and see where it leads.


----------



## Mt Spokane Photography (Jan 29, 2021)

AlanF said:


> It was the way he set about it. For example, for Special Relativity he began by imagining what would happen to time etc. if he were sitting on a beam of light, observing events, then thinking his way through the consequences, and only then getting involved in more and more complex mathematics and General Relativity. He once said, "In physics, imagination is more important than knowledge." I think this was a riposte to Max Planck, who had earlier written that "experiment is the only means of knowledge at our disposal, the rest is poetry, imagination" (both quoted from memory). The opposite approach is to start with equations and solve them and see where it leads.


That is my understanding. Some people actually develop a theory by intuition and then set out to do the math. I find answers to simple problems pop into my head, then I do the math and I'm pretty close. My math ability is pretty poor. I managed to make it thru advanced math in engineering school in the mid 1960's. We used slide rules and mechanical calculators. It was difficult and a lengthy process to solve some of the problems that required multiple iterations. I never want to do that again.


----------



## SteveC (Jan 29, 2021)

Mt Spokane Photography said:


> That is my understanding. Some people actually develop a theory by intuition and then set out to do the math. I find answers to simple problems pop into my head, then I do the math and I'm pretty close. My math ability is pretty poor. I managed to make it thru advanced math in engineering school in the mid 1960's. We used slide rules and mechanical calculators. It was difficult and a lengthy process to solve some of the problems that required multiple iterations. I never want to do that again.



All that happened when calculators became common was that the problems became longer (because you could work them faster), and you were expected to have more digits of precision.

I had a cheap slide rule with me in exams in case my calculator died, but I would have choked anyway: I could have set the problem up (to at least get partial credit), but all I ever really learned to do on a slip stick was multiplication (and the cheapass slide rule I just mentioned could probably do little more than that anyway).


----------



## Sporgon (Jan 29, 2021)

I was honestly expecting the conclusion to be that Sonys suffer less from diffraction.


----------



## usern4cr (Jan 29, 2021)

SteveC said:


> All that happened when calculators became common is the problems became longer (because you could work them faster), and you were expected to have more digits of precision.
> 
> I had a cheap slide rule with me in exams in case my calculator died; but I would have choked anyway; I would have had to set the problem up (to at least get partial credit) but all I ever really learned to do on a slip stick was multiplication (and the cheapass slide rule I just mentioned could probably do little more than that anyway).


Slide rules taught me a lot more than just low precision multiplication. They showed the beauty of logarithms, and how they could apply to all sorts of things in real life. Like the intensity of sound or perceived brightness - it's all logarithmic. And why just mention multiplication without mentioning division? And what about the "old school" way of teaching us how to do long division on paper? Or finding square roots on paper? It taught more than just a method to forget once obsolete, but how interesting tricks could be figured out to solve difficult problems.

Oh well, it's time to go back to typing on my multi-core Apple computer with more abilities than probably existed when they put a man on the moon!
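The trick the slide rule's scales exploit is just the log identity log(a) + log(b) = log(a·b): adding two lengths proportional to logarithms multiplies the numbers. A minimal sketch:

```python
import math

# A slide rule multiplies by adding lengths proportional to logarithms,
# using the identity log(a) + log(b) = log(a * b).
def slide_rule_multiply(a, b):
    """Multiply two positive numbers the way a slide rule does."""
    return 10 ** (math.log10(a) + math.log10(b))

product = slide_rule_multiply(3.0, 4.0)  # ~12, limited only by float precision
```

On a physical rule the precision is limited by how finely you can read the scale, typically three significant figures or so.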


----------



## stevelee (Jan 30, 2021)

usern4cr said:


> Slide rules taught me a lot more than just low precision multiplication. They showed the beauty of logarithms, and how they could apply to all sorts of things in real life. Like the intensity of sound or perceived brightness - it's all logarithmic. And why just mention multiplication without mentioning division? And what about the "old school" way of teaching us how to do long division on paper? Or finding square roots on paper? It taught more than just a method to forget once obsolete, but how interesting tricks could be figured out to solve difficult problems.
> 
> Oh well, it's time to go back to typing on my multi-core Apple computer with more abilities than probably existed when they put a man on the moon!


Your cell phone may have more computing power than NASA had in 1969.


----------



## Mt Spokane Photography (Jan 30, 2021)

usern4cr said:


> Slide rules taught me a lot more than just low precision multiplication. They showed the beauty of logarithms, and how they could apply to all sorts of things in real life. Like the intensity of sound or perceived brightness - it's all logarithmic. And why just mention multiplication without mentioning division? And what about the "old school" way of teaching us how to do long division on paper? Or finding square roots on paper? It taught more than just a method to forget once obsolete, but how interesting tricks could be figured out to solve difficult problems.
> 
> Oh well, it's time to go back to typing on my multi-core Apple computer with more abilities than probably existed when they put a man on the moon!


I hated things like Bessel Functions where we had to do calculations on those huge old mechanical calculators that could multiply and divide to many decimal places but we did it over and over and had to get every one of the many digits put in correctly. They were needed for wave propagation theory. In my case, it wasn't light waves, but electromagnetic waves.

Since slide rules were a critical tool for engineers at the time, we all had good ones. I still have a couple lying around; they're 59 years old this year, as I bought them in the fall of 1961. I still use the triangular architect's scale I bought then; it's right here in my desk. When calculators came out, a simple four-function one cost something like $450. HP was king until TI came out with much less expensive ones that did not use Reverse Polish notation. I bought an SR-50 instead of an HP-35.


----------



## SteveC (Jan 30, 2021)

usern4cr said:


> Slide rules taught me a lot more than just low precision multiplication. They showed the beauty of logarithms, and how they could apply to all sorts of things in real life. Like the intensity of sound or perceived brightness - it's all logarithmic. And why just mention multiplication without mentioning division? And what about the "old school" way of teaching us how to do long division on paper? Or finding square roots on paper? It taught more than just a method to forget once obsolete, but how interesting tricks could be figured out to solve difficult problems.
> 
> Oh well, it's time to go back to typing on my multi-core Apple computer with more abilities than probably existed when they put a man on the moon!



Much of that I learned to appreciate anyway without a slide rule!


----------



## SteveC (Jan 30, 2021)

Mt Spokane Photography said:


> I hated things like Bessel Functions where we had to do calculations on those huge old mechanical calculators that could multiply and divide to many decimal places but we did it over and over and had to get every one of the many digits put in correctly. They were needed for wave propagation theory. In my case, it wasn't light waves, but electromagnetic waves.
> 
> Since slide rules were a critical tool for engineers at the time, we all had good ones. I still have a couple lying around; they're 59 years old this year, as I bought them in the fall of 1961. I still use the triangular architect's scale I bought then; it's right here in my desk. When calculators came out, a simple four-function one cost something like $450. HP was king until TI came out with much less expensive ones that did not use Reverse Polish notation. I bought an SR-50 instead of an HP-35.



Reverse Polish is one of those things that make no damned sense (why in the hell would someone design a calculator to work that way?) unless you understand it's working on a stack. Then it not only makes perfect sense, but if you're clever enough you can do more complex things on it than you could with a "regular" calculator. 

I had used plenty of non RPN calculators before entering college, but IN college, an HP41 became my companion. I still have it, but it's broken (there are shops out there that will repair them; it's on my to do list). I've told people that it should be put in my coffin with me.


----------



## AlanF (Jan 30, 2021)

Mt Spokane Photography said:


> I hated things like Bessel Functions where we had to do calculations on those huge old mechanical calculators that could multiply and divide to many decimal places but we did it over and over and had to get every one of the many digits put in correctly. They were needed for wave propagation theory. In my case, it wasn't light waves, but electromagnetic waves.
> 
> Since slide rules were a critical tool for engineers at the time, we all had good ones. I still have a couple lying around; they're 59 years old this year, as I bought them in the fall of 1961. I still use the triangular architect's scale I bought then; it's right here in my desk. When calculators came out, a simple four-function one cost something like $450. HP was king until TI came out with much less expensive ones that did not use Reverse Polish notation. I bought an SR-50 instead of an HP-35.


Put them in a display cabinet with your film cameras.


----------



## stevelee (Jan 30, 2021)

I wondered if RPN might have been an overreaction from people who used to program in Lisp.

I use Casio calculators that use algebraic notation. They seem to follow it better than other brands I have tried that supposedly do the same thing.


----------



## SteveC (Jan 31, 2021)

stevelee said:


> I wondered if RPN might have been an overreaction from people who used to program in Lisp.
> 
> I use Casio calculators that use algebraic notation. They seem to follow it better than other brands I have tried that supposedly do the same thing.



I hated my first experience with RPN, in junior high school. It was maddening and I had no idea why anyone would want to do it. In college though, I had been taught about stack architectures, and when I realized there was a four-layer-deep stack on HP calculators, it suddenly made perfect sense to me--I had a mental picture of how it worked--and four levels lets you do a LOT of algebraically complicated things without the use of a parenthesis key. A very simple example: 

4 * (12+3) comes out as 4 (enter), 12 (enter) [don't do anything just yet, the stack has 4, 12 on it] 3 + [adding the 3 to the 12 and leaving the stack as 4, 15], then with the stack containing 4 and 15, you hit multiply. 

Note, though, that any stack-based system has to have keys that swap the top two things on the stack, and also one that does a "roll" of the stack (moving everything one position, taking what's at one end, about to be shoved off by the roll, and sticking it into the other).
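The stack mechanics described above can be sketched in a few lines. This toy evaluator (using an unbounded Python list rather than HP's four registers) keys in the same 4 × (12 + 3) example:

```python
def eval_rpn(tokens):
    """Evaluate a Reverse Polish expression using a plain stack
    (an unbounded list here, versus HP's four-register stack)."""
    ops = {'+': lambda a, b: a + b, '-': lambda a, b: a - b,
           '*': lambda a, b: a * b, '/': lambda a, b: a / b}
    stack = []
    for tok in tokens:
        if tok in ops:
            b = stack.pop()   # top of stack is the second operand
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack[-1]

# The 4 * (12 + 3) example keyed in as described: 4, 12, 3, +, *
result = eval_rpn(['4', '12', '3', '+', '*'])  # 60.0
```

Because operands wait on the stack until an operator consumes them, no parenthesis key is ever needed.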


----------



## stevelee (Jan 31, 2021)

If I had encountered a pocket calculator in junior high, I wouldn’t have cared if I had to stand on my head to use it. As it was, I was in graduate school in Dallas when I was at a prof’s home when I saw one. His wife worked for a TV station, and she had a calculator from the station at home with her. It cost over $300 and would add, subtract, multiply, and divide in the simplest manner. I vowed that one day I would own one.


----------



## Valvebounce (Jan 31, 2021)

Hi PBD, usern4cr.
Head over and have a look at Roger’s post to see just how misleading this can be! I found it enlightening and entertaining! 








The Secret of the Broken Element: A Canon RF 100-500mm f4.7-7.1 Teardown
www.lensrentals.com





Cheers, Graham. 



privatebydesign said:


> Looking down a lens and trying to estimate anything is an exercise in futility as the lenses you are looking through change the very view of what it is you think you are seeing.


----------



## usern4cr (Feb 1, 2021)

Valvebounce said:


> Hi PBD, usern4cr.
> Head over and have a look at Roger’s post to see just how misleading this can be! I found it enlightening and entertaining!
> 
> 
> ...


Thanks, Valvebounce. I had read that post previously, and it's pretty amazing how many steps and photos they took.


----------

