# DxO tests Canon/Nikon/Sony 500mm lenses



## Apop (Jul 10, 2013)

http://www.dxomark.com/index.php/Publications/DxOMark-Reviews/Nikon-AF-S-Nikkor-500mm-and-600mm-f-4G-ED-VR-lens-reviews-legendary-performers-in-the-range/Nikon-AF-S-NIKKOR-500mm-f-4G-ED-VR-fights-off-both-Canon-and-Sony



When you put a D600 on the Nikon 500:
http://www.dxomark.com/index.php/Lenses/Compare-Camera-Lenses/Compare-lenses/%28lens1%29/1174/%28lens2%29/393/%28lens3%29/891/%28brand1%29/Nikkor/%28camera1%29/834/%28brand2%29/Canon/%28camera2%29/795/%28brand3%29/Sony/%28camera3%29/831


----------



## Apop (Jul 10, 2013)

Also the 600s:

http://www.dxomark.com/index.php/Publications/DxOMark-Reviews/Nikon-AF-S-Nikkor-500mm-and-600mm-f-4G-ED-VR-lens-reviews-legendary-performers-in-the-range/Nikon-600mm-f-4G-ED-VR-vs-Canon-EF-600mm-f-4L-IS-II-USM-duel-it-out-but-it-s-a-tie


I guess the 'tests' kind of back up what was already reported.
The difference looks quite substantial; Nikon must be close to updating some of their lenses (400/500/600).


----------



## bdunbar79 (Jul 10, 2013)

This is why DxOmark sucks:

Although the two lenses are extraordinary performers, the Nikon can’t quite match the new $11,999 triple fluorite Canon in sharpness or in lateral chromatic aberration, however overall the two perform very similarly. Both have homogenous sharpness at maximum aperture and possess low distortion and vignetting and excellent transmission, but the reason why the DxOMark scores are the same is due to the excellent noise and dynamic range of the Nikon D800 sensor.

The LENS tied the Canon lens because of the CAMERA used!!! HAHAHAHHAHAHAHAHAHAHAHAHHAHAHAHAHA.


----------



## Apop (Jul 10, 2013)

bdunbar79 said:


> This is why DxOmark sucks:
> 
> Although the two lenses are extraordinary performers, the Nikon can’t quite match the new $11,999 triple fluorite Canon in sharpness or in lateral chromatic aberration, however overall the two perform very similarly. Both have homogenous sharpness at maximum aperture and possess low distortion and vignetting and excellent transmission, but the reason why the DxOMark scores are the same is due to the excellent noise and dynamic range of the Nikon D800 sensor.
> 
> The LENS tied the Canon lens because of the CAMERA used!!! HAHAHAHHAHAHAHAHAHAHAHAHHAHAHAHAHA.




Well, I just put the D600 in the comparison to get something closer to reflecting the lenses.
But it's indeed funny that they tie the lenses because the sensor behind one outperforms the other.


Also when you look at 
http://www.dxomark.com/index.php/Lenses/Compare-Camera-Lenses/Compare-lenses/%28lens1%29/1174/%28lens2%29/393/%28lens3%29/891/%28brand1%29/Nikkor/%28camera1%29/834/%28brand2%29/Canon/%28camera2%29/795/%28brand3%29/Sony/%28camera3%29/831

I wonder how they conclude 24 for the Nikon if you compare it to the Sony (also on a 24 MP body?).
The Sony has a better T-value, better sharpness, and lower CA, yet it scores only 22?

There the Canon has 19 P-Mpix to the Nikon's 15,
0% distortion vs. 0.2%,
and 3µm CA vs. 7µm.

I don't know if their tests are really scientific. If they are, it's easy enough to draw your own conclusions from their values (just forget the overall score and compare per lens/body).
And ignore their conclusions, I guess.

Better to hold on to those lenses and wait for better bodies; those cycles are much shorter and less expensive. Imagine what those Nikon sensors would be like behind the Canon lenses!


----------



## neuroanatomist (Jul 10, 2013)

I guarantee that any shot I take handholding my 600 II with the 1D X would be sharper than an equivalent shot handholding the Nikon 600/4 with the D800.


----------



## Apop (Jul 10, 2013)

neuroanatomist said:


> I guarantee that any shot I take handholding my 600 II with the 1D X would be sharper than an equivalent shot handholding the Nikon 600/4 with the D800.



Ah yes, but that could say more about your physical state than the quality of the camera/lens (sorry, hehe).

The 600 IS II is, for me, still something to work hard for and hopefully own in the future!


----------



## jrista (Jul 11, 2013)

This comparison of DXO's results is what makes me EXTREMELY SUSPICIOUS of them. The Nikkor 500mm scores 25, when every single trait that factors into that score is worse than the Canon 500mm...which also scores 25. That is just plain wrong. The Canon has zero distortion, higher sharpness, less vignetting, less CA, and the same transmission...on a LOWER RESOLUTION BODY! It should have a higher score than the Nikon.


----------



## neuroanatomist (Jul 11, 2013)

jrista said:


> This comparison of DXO's results is what makes me EXTREMELY SUSPICIOUS of them. The Nikkor 500mm scores 25, when every single trait that factors into that score is worse than the Canon 500mm...which also scores 25. That is just plain wrong. The Canon has zero distortion, higher sharpness, less vignetting, less CA, and the same transmission...on a LOWER RESOLUTION BODY! It should have a higher score than the Nikon.



+1000

I have been critical of DxOMark's Scores, but this takes the cake for the BS factor. Even the Sony 500/4 bests the Nikon on every measurement, but the Nikon lens outscores the Sony.

[quote author=DxOMark]
However, while the Canon 500mm mounted on 5D Mark III is sharper optically than the Nikon model mounted on D800, at the light levels used for DxOMark score (1/60, 150 Lux), *the excellent dynamic range of the Nikon D800 sensor* helps it improve the DxO Mark Score and accounts for the level-pegging. 
[/quote]

Sorry, but what the heck does sensor-based DR have to do with scoring a lens?!? Nothing. This stinks to high heaven of bias.

At least this time, they have apparently come right out and said that the "DxOMark Score" is utterly meaningless.


----------



## jrista (Jul 11, 2013)

neuroanatomist said:


> jrista said:
> 
> 
> > This comparison of DXO's results is what makes me EXTREMELY SUSPICIOUS of them. The Nikkor 500mm scores 25, when every single trait that factors into that score is worse than the Canon 500mm...which also scores 25. That is just plain wrong. The Canon has zero distortion, higher sharpness, less vignetting, less CA, and the same transmission...on a LOWER RESOLUTION BODY! It should have a higher score than the Nikon.
> ...



BS cake indeed!



neuroanatomist said:


> [quote author=DxOMark]
> However, while the Canon 500mm mounted on 5D Mark III is sharper optically than the Nikon model mounted on D800, at the light levels used for DxOMark score (1/60, 150 Lux), *the excellent dynamic range of the Nikon D800 sensor* helps it improve the DxO Mark Score and accounts for the level-pegging.
> [/quote]
> 
> Sorry, but what the heck does sensor-based DR have to do with scoring a lens?!? Nothing. This stinks to high heaven of bias.
> 
> At least this time, they have apparently come right out and said that the "DxOMark Score" is utterly meaningless.

Aye! _Smells_ like a beached whale that just deathfarted. 

About time they ousted themselves for their insane bias. What a ridiculous joke. It is more than clear that these tests are most definitely not "lens" tests...they are camera tests, and throwing the D800 into the mix with its extremely high resolution sensor completely invalidates any scalar score DXO tries to push as a means of comparing lenses against each other.


----------



## RGF (Jul 11, 2013)

jrista said:


> This comparison of DXO's results is what makes me EXTREMELY SUSPICIOUS of them. The Nikkor 500mm scores 25, when every single trait that factors into that score is worse than the Canon 500mm...which also scores 25. That is just plain wrong. The Canon has zero distortion, higher sharpness, less vignetting, less CA, and the same transmission...on a LOWER RESOLUTION BODY! It should have a higher score than the Nikon.



If you look at the test results, they are measuring the combination of lens and body. It's hard to do otherwise, but it makes comparisons between manufacturers difficult and less than meaningful.


----------



## jrista (Jul 11, 2013)

RGF said:


> jrista said:
> 
> 
> > This comparison of DXO's results is what makes me EXTREMELY SUSPICIOUS of them. The Nikkor 500mm scores 25, when every single trait that factors into that score is worse than the Canon 500mm...which also scores 25. That is just plain wrong. The Canon has zero distortion, higher sharpness, less vignetting, less CA, and the same transmission...on a LOWER RESOLUTION BODY! It should have a higher score than the Nikon.
> ...



Oh, I know. I've long stated that DXO's "lens tests" are really "camera system tests". That doesn't change the fact that lenses can be compared on camera bodies with similar or identical pixel densities in order to provide a more meaningful basis for comparison. 

As it stands, in the example above, one can only assume their Nikkor 500mm f/4 lens will score 25 IF used on a D800 body. I know they test against multiple bodies, but usually the spread isn't broad enough to support useful cross-brand comparisons.


----------



## neuroanatomist (Jul 11, 2013)

RGF said:


> If you look at the test results, they are measuring the combination of lens and body. It's hard to do otherwise, but it makes comparisons between manufacturers difficult and less than meaningful.



Yes, and sorry, but I think you are missing the point. For every measurement, all generated with a body attached, the Canon lens comes out on top, in some cases by a significant margin. Yet, the Score is a tie. So...the score is fabricated, pulled from their nether orifices, etc.


----------



## JohanCruyff (Jul 11, 2013)

jrista said:


> This comparison of DXO's results is what makes me EXTREMELY SUSPICIOUS of them. The Nikkor 500mm scores 25, when every single trait that factors into that score is worse than the Canon 500mm...which also scores 25. That is just plain wrong. The Canon has zero distortion, higher sharpness, less vignetting, less CA, and the same transmission...on a LOWER RESOLUTION BODY! It should have a higher score than the Nikon.



The hidden row (see my attachment) includes a technical parameter which determines the overall score.


----------



## bdunbar79 (Jul 12, 2013)

So far, this thread's remained relatively mild.


----------



## Rienzphotoz (Jul 12, 2013)

jrista said:


> Aye! _Smells _like a beached whale that just deathfarted.


My apologies to all animal lovers but the expression "a beached whale that just *deathfarted*" just got me LOL ;D ;D ;D ... "*deathfarted*"? LMAO ;D ;D ;D


----------



## Rienzphotoz (Jul 12, 2013)

I read Nikon Rumors every day (because I also use Nikkor gear), and when I read this article two days ago, the first thought that came to mind was "what a load of chicken 5h!t".


----------



## jrista (Jul 12, 2013)

Rienzphotoz said:


> jrista said:
> 
> 
> > Aye! _Smells _like a beached whale that just deathfarted.
> ...



Haha. Sadly, I've had the unpleasant experience... one of the few childhood memories that will never go away... a beached whale in California let a big one go. 

It can be a lot worse, though! THIS is a deathfart (time index 4:55):

http://www.videobash.com/video_show/apparently-if-you-shank-a-beached-whale-it-erupts-its-guts-like-a-volcano-286339

Oh, and for an even worse experience... time index 3:10. I think my favorite quote is at 5:35: "It's the whale fart symphony!"


----------



## ahab1372 (Jul 12, 2013)

And then there is the story of the guy who wanted to remove a beached whale from the beach by blowing it up ...


----------



## dtaylor (Jul 12, 2013)

neuroanatomist said:


> Sorry, but what the heck does sensor-based DR have to do with scoring a lens?!?



What the heck does DxO have to do with scoring DR or a lens?!? ;D

The hilarity that is DxO continues.


----------



## Don Haines (Jul 12, 2013)

neuroanatomist said:


> For every measurement, all generated with a body attached, the Canon lens comes out on top, in some cases by a significant margin. Yet, the Score is a tie. So...the score is fabricated, pulled from their nether orifices, etc.


Give it up Neuro.... Give it up jrista....

You are just jealous that my 50 f/1.8, which I paid $100 for, outscores your $10,000 lenses, and you try to justify it by attacking DXO. Everyone knows how impartial and accurate DXO scoring is, just like how all birders would rather use a 50 f/1.8 than a 500 or 600 f/4.


----------



## jrista (Jul 12, 2013)

Don Haines said:


> neuroanatomist said:
> 
> 
> > For every measurement, all generated with a body attached, the Canon lens comes out on top, in some cases by a significant margin. Yet, the Score is a tie. So...the score is fabricated, pulled from their nether orifices, etc.
> ...



Damn. If only there were a "for" after each of those "up"s. ;P

The irony here is so thick you could swim in it, though! That a 50/1.8 outscores a Canon Great White is about the perfect testimony to how overweighted and invalid transmission is in DXO's score, and how ludicrous the scores therefore are.


----------



## garyknrd (Jul 13, 2013)

I always found their sensor scores pretty accurate, but this is??? These guys are smoking some pretty potent stuff. I shoot with a guy who has a Nikon 500 VR II, and it is scary good; the VR on that thing is unreal. But he constantly comments on how sharp the pics from a 500 II are, like every time we shoot together. I damn sure would not trade.
Another fellow I shoot with every once in a while has the Nikon 600. That thing is not even close to the new Canon 600 II, IMO.


----------



## Woody (Jul 13, 2013)

The Canon 500mm lens mounted on a 22 MP FF sensor OUTRESOLVES the Nikon 500mm lens mounted on a 36 MP FF sensor? And they are scored equally? Incredible.


----------



## mb66energy (Jul 13, 2013)

The Nikon 500mm rules @ f/32 in terms of acutance:
http://www.dxomark.com/index.php/Lenses/Compare-Camera-Lenses/Compare-lenses/%28lens1%29/1174/%28lens2%29/393/%28lens3%29/891/%28brand1%29/Nikkor/%28camera1%29/792/%28brand2%29/Canon/%28camera2%29/795/%28brand3%29/Sony/%28camera3%29/831

This might lead to a higher summed-up score if DxO produces the score from data over the whole range. If that is NOT WEIGHTED with typical use scenarios... it gives misleading scores even while the measurements themselves are correct AND helpful.

This might explain why Canon's and Nikon's 500/4 have the same scores. I would choose Canon's lens because it should deliver cleaner results @ f/4-f/8, the typical f-stops I would use.


----------



## neuroanatomist (Jul 13, 2013)

mb66energy said:


> The Nikon 500mm rules @ f/32 in terms of acutance:
> This might lead to a higher summed up score because DxO produces the score from data over the whole range. If this is NOT WEIGHTED with typical use scenarios ... this gives misleading scores while the measurements are correct AND helpful.
> 
> This might explain why Canons and Nikons 4/500 have the same scores - I would choose Canon's lens because it should deliver cleaner results @ f/4-f/8 - the typical f-stops I would use.



I'd like to ask where DxOMark states the Score is summed over the whole range...because instead, I read: "_DxOMark Score corresponds to an optimal focal length/aperture combination. The score corresponds to the quantity of information that can be captured by the camera. Each focal length/aperture combination provides a numerical value. The highest value is the DxOMark Score._". They're saying the score is based on the one best focal length/aperture combo. They indicate that right next to their Score, and for all three lenses that optimum is 500mm f/4. 
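The rule in that quote (one numerical value per focal length/aperture combination, with the published Score being the single highest of them) can be sketched as follows; the per-combination values here are invented purely to illustrate the mechanism, they are not DxO measurements:

```python
# DxO's stated rule: each focal length/aperture combination yields a value,
# and the published DxOMark Score is the single highest of them.
# The numbers below are invented purely to illustrate the max rule.
combo_scores = {
    ("500mm", "f/4"): 25.0,
    ("500mm", "f/5.6"): 23.5,
    ("500mm", "f/8"): 21.0,
    ("500mm", "f/32"): 9.0,   # strong acutance here still cannot raise the Score
}

best_combo = max(combo_scores, key=combo_scores.get)
dxomark_score = combo_scores[best_combo]
print(best_combo, dxomark_score)  # ('500mm', 'f/4') 25.0
```

Under a max rule like this, behavior at f/32 cannot influence the Score at all.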

Sorry, but I believe acutance at f/32 is irrelevant to the Scores, which are still BS...


----------



## msm (Jul 13, 2013)

mb66energy said:


> The Nikon 500mm rules @ f/32 in terms of acutance:
> http://www.dxomark.com/index.php/Lenses/Compare-Camera-Lenses/Compare-lenses/%28lens1%29/1174/%28lens2%29/393/%28lens3%29/891/%28brand1%29/Nikkor/%28camera1%29/792/%28brand2%29/Canon/%28camera2%29/795/%28brand3%29/Sony/%28camera3%29/831
> 
> This might lead to a higher summed up score because DxO produces the score from data over the whole range. If this is NOT WEIGHTED with typical use scenarios ... this gives misleading scores while the measurements are correct AND helpful.
> ...



That is interesting, considering the Nikon has a minimum aperture of f/22 ;D


----------



## bdunbar79 (Jul 13, 2013)

mb66energy said:


> The Nikon 500mm rules @ f/32 in terms of acutance:
> http://www.dxomark.com/index.php/Lenses/Compare-Camera-Lenses/Compare-lenses/%28lens1%29/1174/%28lens2%29/393/%28lens3%29/891/%28brand1%29/Nikkor/%28camera1%29/792/%28brand2%29/Canon/%28camera2%29/795/%28brand3%29/Sony/%28camera3%29/831
> 
> This might lead to a higher summed up score because DxO produces the score from data over the whole range. If this is NOT WEIGHTED with typical use scenarios ... this gives misleading scores while the measurements are correct AND helpful.
> ...


WAT?!


----------



## Dylan777 (Jul 13, 2013)

Don Haines said:


> neuroanatomist said:
> 
> 
> > For every measurement, all generated with a body attached, the Canon lens comes out on top, in some cases by a significant margin. Yet, the Score is a tie. So...the score is fabricated, pulled from their nether orifices, etc.
> ...



LOL..... ;D


----------



## jrista (Jul 14, 2013)

bdunbar79 said:


> mb66energy said:
> 
> 
> > The Nikon 500mm rules @ f/32 in terms of acutance:
> ...



LOL. This should become the standard response to all DXO threads. "WAT?!"


----------



## Pi (Jul 14, 2013)

jrista said:


> RGF said:
> 
> 
> > If you look at the test results, they are measuring the combination of lens and body. It's hard to do otherwise, but it makes comparisons between manufacturers difficult and less than meaningful.
> ...



DXO never said that they were "lens tests" only, so I do not see the problem.


----------



## Woody (Jul 14, 2013)

Pi said:


> DXO never said that they were "lens tests" only, so I do not see the problem.



I was surprised by your comment, so I decided to investigate the definition of DXOMark Score. This is what is said on http://www.dxomark.com/index.php/Publications/DxOMark-Insights/DxOMark-Score/DxOMark-Score-design:

"DxOMark Score can be interpreted as the maximum print size of the average quality. Obviously, any photo can be printed at any size, but beyond a certain point, a larger print does not reveal any additional details to an observer at close distance...

While the Sensor Overall Score describes the results of measurements only on sensors and is essentially related to image noise (for example, a difference of one f-stop offsets the Overall Sensor Score by approximately 15 points), the DxOMark Score is both proportional to resolution (taking optical aberrations into account) and to sensor dynamic range."

It appears that DXOMark considers both sensor and lens performances when assigning a score. So, I guess you are right.
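That definition (a score proportional to resolution and to sensor dynamic range) does make it arithmetically possible for a sharper lens on a lower-DR body to tie a softer lens on a higher-DR body. A minimal sketch with an invented linear weighting: the P-Mpix figures are the ones cited in this thread, the DR values are roughly the DxO sensor scores for the 5D Mark III and D800, and the weight is tuned by hand, since DxO does not publish its actual formula:

```python
# Illustrative only: a linear composite of lens sharpness (P-Mpix) and
# sensor dynamic range (EV). The dr_weight is tuned by hand to reproduce
# a tie; DxO's real formula and weights are unpublished.
def composite_score(p_mpix, dr_ev, dr_weight=1.48):
    return p_mpix + dr_weight * dr_ev

canon_500 = composite_score(p_mpix=19, dr_ev=11.7)  # sharper lens, lower-DR 5DIII
nikon_500 = composite_score(p_mpix=15, dr_ev=14.4)  # softer lens, higher-DR D800

print(round(canon_500, 1), round(nikon_500, 1))  # 36.3 36.3: a "tie"
```

With a large enough weight on sensor DR, the D800's advantage cancels the Canon lens's 4 P-Mpix sharpness lead, producing exactly the kind of "tie" this thread is complaining about.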

But I still can't get over the fact that the Canon 500mm lens mounted on a 22 MP FF sensor can OUTRESOLVE the Nikon 500mm lens mounted on a 36 MP FF sensor. Wow, just wow! It clearly shows how awful the new Nikon telephoto lens is... especially when the prices are taken into account.


----------



## iMagic (Jul 14, 2013)

Pi said:


> DXO never said that they were "lens tests" only, so I do not see the problem.

Uh, the title says "lens reviews". Semantics and misunderstanding? When I see "lens reviews" I expect an apples-to-apples comparison and a "lens test" tone to the article. As it stands, the whole article is meaningless. 


"Nikon AF-S Nikkor 500mm and 600mm f/4G ED VR lens reviews: legendary performers in the range"


----------



## bdunbar79 (Jul 14, 2013)

Pi said:


> jrista said:
> 
> 
> > RGF said:
> ...



It's a problem because it's presented as a lens score, and nobody cares about a 5D3 + 500 f/4L II combo score; they want to know which lens is better. DxOMark presents it as a lens score, so an average observer will take it at face value. It's deceiving, and an objective reviewing body should never be deceiving if it is truly objective. Clearly they are not.


----------



## Pi (Jul 15, 2013)

Woody said:


> But I still can't get over the fact the Canon 500 mm lens when mounted on a 22 MP FF sensor can OUTRESOLVE the Nikon 500 mm lens mounted on a 36 MP FF sensor. Wow, just wow! Clearly shows how awful the new Nikon telephoto lens is... especially when these prices are taken into account.



DXO decided to dumb down their lens+body measurements and report the results in some metric that they keep secret. From other discussions here, I can guess that it is heavily weighted towards higher MTFs, i.e., it measures mainly what we call "contrast". Then 36 vs. 22 MP does not matter much.


----------



## Pi (Jul 15, 2013)

iMagic said:


> Uh, the title says "lens reviews". Symantics and misunderstanding? When i see lens reviews i would expect apples to apples comparison and a "lens test" tone to the article. As it stands, the whole article is meaningless.
> 
> "Nikon AF-S Nikkor 500mm and 600mm f/4G ED VR lens reviews: legendary performers in the range"



You cannot judge an article by its title alone. The text clearly talks about combos. Also, you cannot get "pure lens" data anywhere on their site; you always choose a body.


----------



## Pi (Jul 15, 2013)

bdunbar79 said:


> It's a problem because it's a lens score and nobody cares about a 5D3/500 f/4L II combo score. They want to know which lens is better. And, DxOmark scores it as a lens score, so an average observer would take it at face value. It's deceiving, and an objective reviewing body should never be deceiving, if they are being truly objective. Clearly they are not.



I disagree. I would say that nobody cares how the 500/4 II performs on the D800, but how it performs on the 5D3 is really interesting. Even if DXO were to publish pure resolution numbers for the lens, nobody would know what to do with those numbers to see how it performs on a particular body (well, some would know). 

Their "score" is nonsense, but the numbers for each aperture and FL (for zooms) are useful.


----------



## mb66energy (Jul 15, 2013)

neuroanatomist said:


> mb66energy said:
> 
> 
> > The Nikon 500mm rules @ f/32 in terms of acutance:
> ...



I cannot find the text I BELIEVE I read; perhaps I should read English texts more carefully, because it isn't my mother tongue. Perhaps I misread the explanation of the METRIC tests, which averages the results of different FOCAL LENGTHS, not different apertures.
Sorry for that.

Now I have seen the Use Case Scores (Mid-Light Score), whatever that is (clicking on the "?" doesn't provide information):
Nikon 48 @ f/5.6
Canon 48 @ f/4.0
Sony 44 @ f/4.0
Perhaps that is a weighted version of scoring...
EDIT(added): http://www.dxomark.com/index.php/Lenses/Compare-Camera-Lenses/Compare-lenses/%28lens1%29/1174/%28lens2%29/393/%28lens3%29/891/%28brand1%29/Nikkor/%28camera1%29/792/%28brand2%29/Canon/%28camera2%29/795/%28brand3%29/Sony/%28camera3%29/831

But perhaps you are right: The scores ARE BS ...


----------



## neuroanatomist (Jul 15, 2013)

Pi said:


> DXO decided to dumb down their lens+body measurements and report the results in some metric that they keep secret. From other discussions here, I can guess that it is heavily weighted towards higher MTFs, i.e., it measures mainly what we call "contrast". Then 36 vs. 22 MP does not matter much.



The most important factor in their BS Score is transmission, which is why the cheap 50/1.8 lenses from both Canon and Nikon score several points higher than any of these 500/4 lenses. It's only when you have lenses of identical max aperture that the other stuff has any influence. BTW, while the 500/4s score 25, the Canon 50/1.8 on a 5DIII gets 28 and the Nikon 50/1.8 on a D800 gets 31, and the main measurement difference between the 50/1.8s is that the Nikon is 1 P-Mpix sharper (put it on the D3X, and its sharpness and Score tie the Canon's). So, the Nikon 50/1.8 is 1 P-Mpix sharper and gets a Score 3 points higher; the Canon 500/4 is 3 P-Mpix sharper, but the Scores are equal. 

BS. 

I begin to wonder if the 'secret metric' you mention is sponsorship...


----------



## weixing (Jul 15, 2013)

Hi,
Lens performance results measured through different camera bodies are basically useless, since every camera sensor performs differently and every vendor processes their RAW files differently.

DxO should come out with a standard test camera for lens testing... a mirrorless camera would be ideal, since it has the shortest flange focal distance and can use adapters for different vendors' lenses... then the only variable would be the lens, and the results would be valid for comparing different lenses.

Have a nice day.


----------



## Woody (Jul 15, 2013)

weixing said:


> Lens performance results measured through different camera bodies are basically useless, since every camera sensor performs differently and every vendor processes their RAW files differently.
> 
> DxO should come out with a standard test camera for lens testing... a mirrorless camera would be ideal, since it has the shortest flange focal distance and can use adapters for different vendors' lenses... then the only variable would be the lens, and the results would be valid for comparing different lenses.



But they won't do that because a truly impartial score is not acceptable to them.


----------



## agierke (Jul 15, 2013)

> DxO should come out with a standard test camera for lens testing... a mirrorless camera would be ideal, since it has the shortest flange focal distance and can use adapters for different vendors' lenses... then the only variable would be the lens, and the results would be valid for comparing different lenses.



That wouldn't really be more valid, as you still have to mount those lenses on their own brand's camera for real-world purposes. Why would I care what the test results of a lens are when it's mounted to a camera I would never shoot? I would rather see test results from a lens/body combo that I could actually use.

Not that I really care about DxO, that is...


----------



## Plainsman (Jul 15, 2013)

In defence of DxO: at least they publish test info like MTF-50 sharpness graphs for you to read and interpret as you want.

Nothing I dislike more than wishy-washy "reviews" with nothing to justify the claims of the reviewer, like juzaphoto stating that the Canon 70-200/2.8 II is "pretty poor with teleconverters"! At least try another lens before you make a statement like that, Juza.

BTW, Nikon will not like DxO's tests, which show that their expensive new 80-400 appears to be inferior to Sony's cheaper 70-400 offering, certainly at the top end.

Anybody investing in an expensive new lens needs as much info as possible, so carry on, DxO. 

The Sony 500/4 is more expensive than the Canon 500/4 II. Ridiculous; it won't tempt many pros over to their system.


----------



## jthomson (Jul 15, 2013)

neuroanatomist said:


> The most important factor in their BS Score is transmission, which is why the cheap 50/1.8 lenses from both Canon and Nikon score several points higher than any of these 500/4 lenses. It's only when you have lenses of identical max aperture that the other stuff has any influence. BTW, while the 500/4s score 25, the Canon 50/1.8 on a 5DIII gets 28 and the Nikon 50/1.8 on a D800 gets 31, and the main measurement difference between the 50/1.8s is that the Nikon is 1 P-Mpix sharper (put it on the D3X, and its sharpness and Score tie the Canon's). So, the Nikon 50/1.8 is 1 P-Mpix sharper and gets a Score 3 points higher; the Canon 500/4 is 3 P-Mpix sharper, but the Scores are equal.
> 
> BS.
> 
> I begin to wonder if the 'secret metric' you mention is sponsorship...

If transmission really were the most important factor, then the Sony should be outscoring the Nikon. The Sony equals or beats the Nikon in all the listed categories and has the lowest transmission of the three lenses.

Basically, I think DxO are a bunch of Nikon fanboys. The individual ratings are fine, but the composite scores are just ridiculous.


----------



## Orangutan (Jul 15, 2013)

agierke said:


> why would i care what the test results of a lens would be mounted to a camera i would never shoot. i would rather see test results from a lens/body combo that i could actually use.



Because lenses, especially one like this, will probably be used on new bodies for the next 10 years or so. It will far outlast the current "best" body of a brand, possibly by several generations. If I'm going to lay out that much money for a lens, I want to know not only how well it will do on the current bodies, but get a sense of its longevity.

So my answer is yes: to the greatest degree feasible, a "lens test" should isolate the lens, even mounting the competitors on the same body if possible. (Of course that's difficult, but that's what I'd like to see.)


----------



## weixing (Jul 15, 2013)

agierke said:


> > DxO should come out a standard testing camera for testing lens... a mirrorless camera should be idea since it'll have the shortest flange focal distance and can use adapter for different vendor lens... then the only variables will be the lens and the result can be valid to compare between different lens.
> 
> 
> 
> ...


Hmm... it might not mean anything to Nikon DSLR users, but Canon users can mount Nikon lenses on a Canon DSLR... 

Have a nice day.


----------



## neuroanatomist (Jul 15, 2013)

jthomson said:


> If transmission really were the most important factor, then the Sony should be outscoring the Nikon. The Sony equals or beats the Nikon in all the listed categories and has the lowest transmission of the three lenses.



Transmission on the macro scale, not micro. In fact, probably not even measured transmission, but rather the specified max aperture. That's why the 50/1.8 lenses outscore the 500/4 lenses. 



dilbert said:


> neuroanatomist said:
> 
> 
> > RGF said:
> ...



The point is that they measure several parameters of optical image quality from the lens, such as sharpness, transmission, distortion, vignetting, and CA. They _could_ generate a Lens Score based on those parameters, but they don't. Had they done so, the Canon 500/4 II would have soundly trounced the Nikon 500/4. 

For those who argue that it's reasonable for DxO to consider the camera in the 'Lens Score', note that the sensor is already factored into the measurements themselves. P-Mpix measures the sharpness of camera + lens, pixel size affects CA, etc. Even their transmission measurement changes with different cameras. So factoring the camera directly into the measurements (reasonable) and then factoring it in again in the overall score, where it's given an undisclosed (but evidently very significant) weighting, means their 'Lens Score' is as much if not more a camera score than a lens score. 

One more point: considering just their P-Mpix measure of sharpness, by their definition the Nikon 500/4 results in a loss of more than 55% of the resolution the D800 sensor is capable of, whereas the Canon 500/4 II decreases the 5DIII's potential resolution by less than 14%. Of course, the Canon lens also outresolves the Nikon in absolute terms, even taking the higher-resolution D800 sensor into account. But they get the same 'Lens Score'. Right. 
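Those two percentages can be reproduced from the P-Mpix figures cited earlier in the thread (15 P-Mpix for the Nikon 500/4 on the 36 MP D800, 19 P-Mpix for the Canon 500/4 II on the 22 MP 5DIII):

```python
# Fraction of the sensor's nominal resolution lost through the lens,
# using the P-Mpix sharpness figures quoted in this thread.
nikon_loss = 1 - 15 / 36   # Nikon 500/4 on the 36 MP D800
canon_loss = 1 - 19 / 22   # Canon 500/4 II on the 22 MP 5D Mark III

print(f"Nikon loses {nikon_loss:.1%}, Canon loses {canon_loss:.1%}")
# Nikon loses 58.3%, Canon loses 13.6%
```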

They should rename their Overall Scores to a Camera Basic Score and a Lens Basic Score, so we could abbreviate them for what they really are: BS. Actually, that's probably giving them too much credit, because real Bovine Scat makes good fertilizer, whereas DxOMark's BS has no real-world utility.


----------



## msm (Jul 15, 2013)

neuroanatomist said:


> whereas DxOMark's BS has no real-world utility.



It does precisely what it is supposed to do: generate a lot of views for the DXO site, where they advertise and sell their software. This is evident from this and other threads. It costs them very little, since they need to test the gear to make their software anyway.


----------



## Woody (Jul 16, 2013)

neuroanatomist said:


> For those who argue that it's reasonable that DxO consider the camera in the 'Lens Score', note that the sensor is already factored into the measurements themselves. P-Mpix measures sharpness of camera + lens, pixel size affects CA, etc. Even their transmission measurement changes with different cameras...
> 
> Actually, that's probably giving them too much credit, because real Bovine Scat makes good fertilizer, whereas DxOMark's BS has no real-world utility.



Good point.


----------



## neuroanatomist (Jul 16, 2013)

dilbert said:


> neuroanatomist said:
> 
> 
> > ...
> ...



Except I'm not scoring lenses without disclosing my methods. In my scientific publishing, I must fully disclose my methods such that another scientist can duplicate my experiment. DxOMark is clearly not held to the standards of peer-reviewed scientific publications. (Which is reasonable, as they're not publishing in scientific journals, but only to their own website - so, they can make up 'data' or 'scores' if they choose, change those data post hoc without explanation, whatever they want.)

What's in their black box? They don't say. Why not? If someone is hiding something, it's usually for a reason. It could be as simple as maintaining a competitive advantage (against whom, I have no idea).

Clearly, I don't buy into it. Sadly, I suspect many people do...


----------



## Woody (Jul 16, 2013)

dilbert said:


> Your assertions of DxO publishing BS are just as BS as their results because you don't know how they calculate their "lens score" and thus you have to make up reasons as to why it is so.



Actually, based on their description, we have some idea of how the lens score is computed. For example, we know they take the sensor score into account. However, this method of computation makes no sense.


----------



## Pi (Jul 16, 2013)

Orangutan said:


> agierke said:
> 
> 
> > why would i care what the test results of a lens would be mounted to a camera i would never shoot. i would rather see test results from a lens/body combo that i could actually use.
> ...



But there is still a problem: if you have a pure lens test, are you sure you know how to compute how it will perform on a future, say, 50mp body?


----------



## jrista (Jul 16, 2013)

Pi said:


> Orangutan said:
> 
> 
> > agierke said:
> ...



MTF charts are, for all intents and purposes, "pure lens tests". They already give us a way to compare lenses across the board, brands be damned. The simple fact of the matter is that a better lens will perform better on ALL sensors, 20mp, 30mp, or 50mp. The problem with DXO's tests is they quite simply don't give you a reasonable camera-agnostic basis from which to compare lenses. The Nikon 500/4 performs "on par" (tongue in cheek) with the Canon 500/4 solely because of the higher resolution sensor. That sort of tells you that the Canon lens is particularly good, because it is performing so well on a worse sensor...but you don't really have any exact way of comparing. You only get a "feeling" that it performs so well.


----------



## Orangutan (Jul 16, 2013)

Pi said:


> But there is still a problem: if you have a pure lens test, are you sure you know how to compute how it will perform on a future, say, 50mp body?



You don't; however, you do know that a better lens will perform no worse, and likely better, on any future camera than would a lens that performed worse in a "pure" lens test. Though not perfect, it's superior to the tainted lens+body test.




jrista said:


> MTF charts are, for all intents and purposes, "pure lens tests". They already give us a way to compare lenses across the board, brands be damned.



If memory serves me well (no guarantee here), most manufacturers publish computed (theoretical) MTF charts derived from the design of the optics. A few (Zeiss?) actually test production copies. I hope someone will correct me if I'm mistaken. Yes, it would be great to see MTF charts using data from real, production copies.


----------



## TexasBadger (Jul 16, 2013)

Since you can mount the Nikon lens on a Canon body with an adapter, why not compare apples to apples?


----------



## sandymandy (Jul 16, 2013)

i didnt read it...let me guess: nikon lenses the best as always? except in sales...


----------



## applecider (Jul 16, 2013)

This test should be added to a permanent FAQ, so that any time a DxO discussion comes up we can all refer to this chart to show that the tests are meaningless, done only to make Nikon look good.


----------



## Orangutan (Jul 16, 2013)

sandymandy said:


> i didnt read it...let me guess: nikon lenses the best as always? except in sales...



Nope.



jrista said:


> This comparison of DXO's results is what makes me EXTREMELY SUSPICIOUS of them. The Nikkor 500mm scores 25, when *every single trait that factors into that score is worse than the Canon 500mm*...which also scores 25. That is just plain wrong. The Canon has zero distortion, higher sharpness, less vignetting, less CA, and the same transmission...on a LOWER RESOLUTION BODY! It should have a higher score than the Nikon.


----------



## Pi (Jul 16, 2013)

Orangutan said:


> Pi said:
> 
> 
> > But there is still a problem: if you have a pure lens test, are you sure you know how to compute how it will perform on a future, say, 50mp body?
> ...



But a better lens on, say, the 5D3 will perform better on the 5D4 as well; you know that for sure. Again, where is the problem exactly?


----------



## Pi (Jul 16, 2013)

TexasBadger said:


> Since you can mount the Nikon lens on a Canon body with an adapter, why not compare apples to apples?



You mean, an apple on an orange to an orange on an orange?


----------



## Pi (Jul 16, 2013)

jrista said:


> Simple fact of the matter is a better lens will perform better on ALL sensors, 20mp, 30mp, or 50mp. The problem with DXO's tests is they quite simply don't give you a reasonable camera-agnostic basis from which to compare lenses.




Actually, they do. There is a way to extract the pure lens resolution from the data they used to publish (full MTF curves, not the nonsense they publish now). 



> The Nikon 500/4 performs "on par" (toung in cheek) with the Canon 500/4 solely because of the higher resolution sensor. That sort of tells you that the Canon lens is particularly good, because it is performing so well on a worse sensor...but you don't really have any exact way of comparing. You only get a "feeling" that it performs so well.



Why in the world would you want to know how a Canon compares to a Nikon without a body? For bragging rights? They tell you what is achievable with the current bodies on which the lens works, the way it is designed to work. A better lens on one body will be better on future bodies as well.


----------



## bdunbar79 (Jul 16, 2013)

This is pretty straightforward. The Canon LENS is better than the Nikon LENS. They got the same LENS score. Yep, pretty much sums it up.


----------



## jrista (Jul 16, 2013)

Pi said:


> jrista said:
> 
> 
> > Simple fact of the matter is a better lens will perform better on ALL sensors, 20mp, 30mp, or 50mp. The problem with DXO's tests is they quite simply don't give you a reasonable camera-agnostic basis from which to compare lenses.
> ...



Umm, no...sorry. The final image is a convolved result...one could not extract a "pure" lens resolution; you could only approximate it. (For the very same reason one cannot perfectly extract noise from a noisy image...it is part of a convolution produced by a complex real-world system. Too much uncertainty and loss of information prevent perfect noise removal.) A mathematically generated MTF that takes into account the real point spread function of the entire lens is really the only way to get any realistic idea of how a lens will actually perform. The moment that convolution is further convolved by a sensor, you lose the ability to "perfectly" (or purely) revert to the prior result...there is too much uncertainty and loss of information. 



Pi said:


> > The Nikon 500/4 performs "on par" (toung in cheek) with the Canon 500/4 solely because of the higher resolution sensor. That sort of tells you that the Canon lens is particularly good, because it is performing so well on a worse sensor...but you don't really have any exact way of comparing. You only get a "feeling" that it performs so well.
> 
> 
> 
> Why in the world would you want to know how a Canon compares to a Nikon without a body? For bragging rights? They tell you what is achievable with the current bodies on which the lens works, the way it is deigned to work. A better lens on one body will be better on future bodies as well.



One wouldn't, necessarily. But you're missing the point. The point is to call out DXO's BS approach to performing lens tests. The point is to clearly note that those tests are "camera system" tests...they are neither lens tests nor sensor tests. I wouldn't go so far as to say that is 100% useless, but it is certainly biased the way DXO does it, and there is a suspiciously long-term bias towards a particular manufacturer by DXO. (Not just away from Canon, either...even the Sony lens, which actually has better transmission, should have scored better...but it was limited by a sensor!)


----------



## Pi (Jul 16, 2013)

jrista said:


> Pi said:
> 
> 
> > jrista said:
> ...



You are wrong on that. I am not saying that you can remove the AA filter/sensor blur from the image. I am saying that you can find (estimate, if you wish) the strength of the sensor blur. If you are interested in the math, go to my profile, click on the link, etc. Deconvolution is a very different process, very unstable, but you do not need to deconvolve to estimate the effect of the sensor blur. You get instability only if you use sensors with such low resolution that the lenses you want to compare look the same (and they are not). 

The problem with all that is that even if you get the pure lens resolution somehow, you still need to consider the blurring effect of a future sensor and compute the combined resolution again. So my question stands: are you sure you know how to do that?



> One wouldn't, necessarily. But your missing the point. The point is to call out DXO's BS approach to performing lens tests. The point is to clearly note that those tests are "camera system" tests...they are neither lens tests nor sensor tests. I wouldn't go so far as to say that is 100% useless, but it is certainly biased the way DXO does it, and there is a suspiciously long-term bias towards a particular manufacturer by DXO. (Not just away from Canon, either...even the Sony lens, which actually has better transmission, should have scored better...but it was limited by a sensor!)



Of course those are lens+camera tests, and DXO never said otherwise. They sometimes write funny articles, and I simply stop reading them. But the numbers are meaningful; or should I say were meaningful before they decided that we are too stupid to understand what MTF meant and decided to use an undocumented metric.


----------



## neuroanatomist (Jul 16, 2013)

Pi said:


> But the numbers are meaningful; or should I say were meaningful before they decided that we are too stupid to understand what MTF meant and decided to use an undocumented metric.



Now, now...be fair. P-Mpix is just as well documented as many of the other values they publish.


----------



## Orangutan (Jul 17, 2013)

Pi said:


> But a better lens on, say, the 5D3, will preform better on the 5D4, you know that for sure. Again, where is the problem exactly?



1. That assumes I'll get a 5D4. Maybe I'll re-evaluate my entire system if data suggest that another brand is better. Good data allows me to make the right decision for my needs.

2. Wait! SOMEONE is WRONG on the Internet!!! (http://xkcd.com/386/)

It's irrelevant to me at this point since I can't justify the cost of any of these lenses for my amateur needs. I simply like to see honest, accurate information. Misleading data (or misleading presentation of data) makes me feel cheated.

Cheers.


----------



## jrista (Jul 17, 2013)

Pi said:


> jrista said:
> 
> 
> > Pi said:
> ...



It doesn't matter what kind of sensor you have: low resolution, high resolution, or tomorrow's resolution. A convolved result is a convolved result, and in this case stability (or the lack thereof) doesn't really apply like it might when trying to denoise or deblur. You are talking about reverse engineering the *actual lens PSF* from an image produced by a grid of spatially incongruent red, green, and blue pixels (likely covered by additional microlenses), then further interpolated by software to produce the kind of RGB color pixels we see on a screen and analyze with tools like Imatest (or DXO's software). The moment you bring the sensor into play, there are significant enough losses of data that you can only, at best, guess at what those losses are (unless you have some detailed inside knowledge about whatever sensor you're testing with). Your article is an interesting start, but you are assuming a Gaussian PSF. An actual PSF is most definitely not Gaussian, nor is it constant across the area of the lens (i.e. it changes as you leave the center and approach the corners...do a search for "spot diagram" to see actual lens PSFs produced mathematically from detailed and accurate lens specifications...even for the best of lenses, outside of the most central on-axis results, a PSF can be wildly complicated). Not to mention the fact that you have to guess the kernel in the first place, so whatever your result, it is immediately affected by _what you think_ the lens is capable of in the first place.

Personally, I wouldn't trust any site that provided "lens resolution" results reverse engineered from an image produced by any sensor. I would actually rather take the "camera system" tests than have someone telling me what their best guess is for lens performance. 



Pi said:


> > One wouldn't, necessarily. But your missing the point. The point is to call out DXO's BS approach to performing lens tests. The point is to clearly note that those tests are "camera system" tests...they are neither lens tests nor sensor tests. I wouldn't go so far as to say that is 100% useless, but it is certainly biased the way DXO does it, and there is a suspiciously long-term bias towards a particular manufacturer by DXO. (Not just away from Canon, either...even the Sony lens, which actually has better transmission, should have scored better...but it was limited by a sensor!)
> 
> 
> 
> Of course those are lens+camera tests, and DXO never said otherwise.



Hmm, DXO's own description on the lens tests page begs to differ:



> DxOMark's comprehensive camera *lens test result* database allows you to browse and select *lenses* for comparison based on its *characteristics, brand, type, focal range, aperture and price*.



Nowhere in there do they state that the camera sensor is a factor in your ability to select and compare *lenses*. They only state that the lens characteristics, brand, type, focal range, aperture, and price are the applicable factors.


----------



## Pi (Jul 17, 2013)

jrista said:


> It doesn't matter what kind of sensor you have, low resolution, high resolution, or tomorrows resolution. A convolved result is a convolved result, and in this case stability (or the lack thereof) doesn't really apply like it might when trying to denoise or deblur.



Let me repeat. The question here is NOT to reconstruct the image before the convolution. It is to reconstruct the convolution kernel, knowing what the original image was and what the convolved image is. This is a very different problem, and a well-posed one. In the case under discussion, we have two kernels but more than two bodies, so you get a system, etc. 
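A quick numerical sketch of that distinction (a random array stands in for a known test chart; the kernel recovery is just a regularized division in the Fourier domain, which is stable, unlike deconvolving the image itself):

```python
import numpy as np

rng = np.random.default_rng(0)
target = rng.standard_normal((64, 64))     # stand-in for a known test chart

# "True" kernel: a small Gaussian blur, normalized to sum to 1
x = np.arange(-3, 4)
g = np.exp(-x**2 / 2.0)
kernel = np.outer(g, g)
kernel /= kernel.sum()

# Simulate the capture as a (circular) convolution of target and kernel
K = np.fft.fft2(kernel, s=target.shape)
blurred = np.real(np.fft.ifft2(np.fft.fft2(target) * K))

# Recover the kernel: regularized spectral division, well-posed here
T, B = np.fft.fft2(target), np.fft.fft2(blurred)
eps = 1e-8                                  # guards near-zero frequency bins
kernel_est = np.real(np.fft.ifft2(B * np.conj(T) / (np.abs(T)**2 + eps)))

print(np.allclose(kernel_est[:7, :7], kernel, atol=1e-5))
```

With the target known, the kernel drops out almost exactly; the instability people associate with deconvolution belongs to the inverse problem of recovering the image, not the kernel.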



> You are talking about reverse engineering the *actual lens PSF* from an image produced by a grid of spatially incongruent red, green, and blue pixels (likely covered by additional lenses (microlenses)), then further interpolated by software to produce the kind of RGB color pixels we see on a screen and analyze with tools like Imatest (or DXO's software).




Why not read their description first? They do not demosaic, and they test each channel separately. Unfortunately, they have recently decided to hide the data. They do the slanted edge test, which averages over many pixels and makes it possible to estimate well what the effect of the pixels is, based just on their number (but not the AA filter strength). It is the "purest" test I have seen but, again, the data is hidden now.




> Your article is an interesting start, but you are assuming a Gaussian PSF. An actual PSF is most definitely not Gaussian, nor is it constant across the area of the lens (i.e. it changes as you leave the center and approach the corners...do a search for "spot diagram" to see actual lens PSF's produced mathematically from detailed and accurate lens specifications...even for the best of lenses, outside of the most centeral on-axis results, a PSF can be wildly complicated).



I did say that it is not a Gaussian, but for many purposes it is close enough. This is not _my_ formula; it first appeared in a paper on optics and has been used many times since then. If somebody can point out a reference, I will put it there right away. 

I also explained why its variation across the frame is not a problem. You just apply the formula in different regions with different values. 

I also mentioned there that my point is not _that_ formula but the general principle: multiple blur factors act as a single convolution with the convolution of their kernels. This is a much more universal principle. 
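For Gaussian stand-ins (the sigma values below are just illustrative), the principle reduces to blur widths adding in quadrature, which is easy to verify numerically:

```python
# Convolving two Gaussian kernels gives a Gaussian whose variance is the
# sum of the two variances: blur widths add in quadrature.
import numpy as np

def gaussian(sigma, half=50):
    x = np.arange(-half, half + 1, dtype=float)
    g = np.exp(-x**2 / (2 * sigma**2))
    return g / g.sum()

def variance(k):
    x = np.arange(len(k)) - (len(k) - 1) / 2
    return np.sum(k * x**2)

lens = gaussian(2.0)      # hypothetical lens blur, sigma = 2 px
sensor = gaussian(1.5)    # hypothetical sensor/AA blur, sigma = 1.5 px
combined = np.convolve(lens, sensor)

print(round(variance(combined), 2))   # 2.0**2 + 1.5**2 = 6.25
```

The same additivity-of-variances argument is what lets you subtract a (known or estimated) sensor blur from a measured system blur to get at the lens contribution.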



> Not to mention the fact that you have to guess the kernel in the first place, so whatever your result, it is immediately affected by _what you think_ the lens is capable of in the first place.



There is not much to guess. The AA filter can very well be approximated with a Gaussian, and the effect of the pixels can simply be computed. The only weakness is that you do not really know whether the AA filter strength is proportional to the pixel density. But then you have data for many bodies. 

But even if you had the absolute lens data, what are you going to do with it? You never answered that question. Let me help you: to see how it performs on a future 5D4 or whatever, you need to make some assumptions about the AA filter, then you need to apply the formula that you do not like. There is no way around that factor. 



> Personally, I wouldn't trust any site that provided "lens resolution" results reverse engineered from an image produced by any sensor. I would actually rather take the "camera system" tests than have someone telling me what their best guess is for lens performance.



By that logic, you should never get an MRI or a CT scan, because instead of just "seeing" what is inside, they "reverse engineer", i.e., compute, it. 



> Hmm, DXO's own description on the lens tests page begs to differ:
> 
> 
> 
> > DxOMark's comprehensive camera *lens test result* database allows you to browse and select *lenses* for comparison based on its *characteristics, brand, type, focal range, aperture and price*.


This is nitpicking. Try to get the resolution numbers: you cannot get them without choosing a body, and the results are always displayed with the body clearly visible. Their articles are poorly written, but it is not rocket science to realize what you are looking at.


----------



## msm (Jul 24, 2013)

ankorwatt said:


> well as I have declare before in many discussion here at CR that a real MTF test from Hasselblad MTF lab shows that there are no significant difference between Nikon super tele and Canon, even if Canon are using their fluorite elements compared to Nikons super ED
> http://www.dxomark.com/index.php/Publications/DxOMark-Reviews/Nikon-AF-S-Nikkor-300mm-and-400mm-f-2.8G-ED-VR-lens-reviews-legendary-performers-in-the-range
> 
> and 200/2.0 FROM NIKON, I PERSONALLY RANK THIS LENS AS BETTER THAN 200/1,8 AND 2.0 FROM CANON and so does also http://www.lenstip.com/index.html?test=obiektywu&test_ob=325
> http://www.dxomark.com/index.php/Publications/DxOMark-Reviews/Nikon-AF-S-Nikkor-200mm-f-2.0G-ED-VR-II-lens-review/Nikkor-AF-S-Nikkor-200mm-f-2G-ED-VR-II-versus-competition



I'll take this page, showing real ISO crops, over some MTF lab test any day when I'm buying a lens to take pictures with: 

http://www.the-digital-picture.com/Reviews/ISO-12233-Sample-Crops.aspx?Lens=458&Camera=453&Sample=0&FLI=0&API=0&LensComp=648&CameraComp=0&FLIComp=0&APIComp=0

I'll worry about the "real MTF" test next time I buy a lens to take to the lab.


----------



## Mika (Jul 24, 2013)

I personally think this is the weirdest lens test article I have ever read. It is not a lens test, but a system test.

When it comes to Hasselblad measuring the MTF of the superteles, not all information is included there. Typically, reduced-secondary-spectrum doublet achromats have a better MTF than apochromatic designs with the same element count, but the color error then becomes visible in areas of high contrast difference. I'm also interested in Hasselblad's methodology for doing the test; could you send me the link, please?

It is not always clear to me whether Nikon is reviewed with software corrections for the color aberrations enabled in these tests. For that reason, I'd be a bit suspicious about using current Nikon objectives with film.

Even in The-Digital-Picture crops, it is not clear to me how the crops were obtained from the RAWs. But to me it seems that the Canon lens actually performs better than Nikon's at this distance for all tested field points. Whether this difference is significant in real-life photography is another story. 

Looking at the manufacturers' MTFs for the 200/2.0s, it seems Canon has optimized the 10 lp/mm performance while slightly sacrificing 30 lp/mm performance compared to Nikon, and on average the impression (to my eye) is that the Canon 200mm performs better, and considerably better close to the edge. This is nothing new; it is known that images taken with a lens that has better macro-contrast tend to look better than ones taken with a lens with less contrast but higher micro-contrast. Only in the center crop can we see a bit of the micro-contrast at play, but it isn't clear to me what causes the aliasing (or moiré?) in the Canon crop: the lens or the sensor.

The MTF tests that we do actually do not include the camera body, but a microscope objective and a known image sensor. It is arguable whether this gives full information, since there is further processing in software, but it allows checking which lenses are better corrected to begin with.


----------



## msm (Jul 24, 2013)

ankorwatt said:


> msm said:
> 
> 
> > ankorwatt said:
> ...



Never said it was; I just tried to make the point that MTF does not tell the entire story, and I prefer equipment that produces the best images to my eyes and couldn't care less about numbers from a lab. And the Nikon 200 f/2 looks softer in the corner compared to the Canon in those actual images.


----------



## Pi (Jul 25, 2013)

msm said:


> Never said it was, just tried to make the point that MTF does not tell the entire story and I prefer equipment that produce best images to my eyes and couldn't care less about numbers from a lab. And the Nikon 200f2 looks softer in the corner compared to Canon on those actual images.



Not that I disagree with you about visual evidence vs. lab results...but if your goal is to get the whole PSF (the image of an ideal point), then the MTF is essentially giving you its Fourier transform, and from there you get the PSF in a direct and stable way. This does not work in such a straightforward way when the PSF is too concentrated near a single pixel. Still, the whole MTF is, in some sense, the "complete data" that DXO is hiding from us.
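To illustrate, with a Gaussian stand-in for a 1-D line spread function (sigma = 2 px is just an assumed value), the numerically computed MTF matches the analytic Fourier transform:

```python
# The MTF is the modulus of the Fourier transform of the PSF
# (a 1-D Gaussian line spread function used as a stand-in here).
import numpy as np

sigma = 2.0                      # assumed blur width in pixels
n = 512
x = np.arange(n) - n // 2
psf = np.exp(-x**2 / (2 * sigma**2))
psf /= psf.sum()                 # normalize so MTF(0) = 1

mtf = np.abs(np.fft.rfft(np.fft.ifftshift(psf)))
freqs = np.fft.rfftfreq(n)       # cycles per pixel

# Analytic MTF of a Gaussian: exp(-2 * pi^2 * sigma^2 * f^2)
analytic = np.exp(-2 * np.pi**2 * sigma**2 * freqs**2)
print(np.allclose(mtf, analytic, atol=1e-6))
```

Going the other way (full MTF curve to PSF) is the same transform in reverse, which is why the full curves are the "complete data".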


----------



## msm (Jul 25, 2013)

Pi said:


> msm said:
> 
> 
> > Never said it was, just tried to make the point that MTF does not tell the entire story and I prefer equipment that produce best images to my eyes and couldn't care less about numbers from a lab. And the Nikon 200f2 looks softer in the corner compared to Canon on those actual images.
> ...



It's ages since I had Fourier analysis, but as far as I recall the theory is based on point sampling. An image sensor uses area sampling. Has it been shown that Fourier analysis still applies?


----------



## Pi (Jul 25, 2013)

msm said:


> Pi said:
> 
> 
> > msm said:
> ...



I think you are talking about something different: the sampling theorem (still connected to Fourier analysis). The sampling theorem still applies to "sampling" by integration over the whole pixel (you made a very good observation), but it needs to be modified. Imagine the whole image first convolved with a pixel aperture (a function equal to 1 in a square the size of a pixel, zero otherwise). This convolution is band-limited (with the same limit) if the original image is. Then you point sample that.
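A small numerical check of that (the pitch and frequency values are just illustrative): area sampling a sinusoid is identical to point sampling it after a box convolution, which only scales the amplitude by sinc(f * pitch):

```python
# Area sampling over a pixel = box convolution + point sampling.
# In frequency terms the box multiplies each component by sinc(f * pitch).
import numpy as np

pitch = 1.0                 # pixel width (arbitrary units)
f = 0.3                     # signal frequency, below Nyquist = 0.5
centers = np.arange(64) * pitch

# Area sample: average sin(2*pi*f*t) analytically over each pixel
a, b = centers - pitch / 2, centers + pitch / 2
area = (np.cos(2 * np.pi * f * a) - np.cos(2 * np.pi * f * b)) / (2 * np.pi * f * pitch)

# Point sampling the box-filtered signal: amplitude scaled by sinc(f*pitch)
predicted = np.sinc(f * pitch) * np.sin(2 * np.pi * f * centers)
print(np.allclose(area, predicted))
```

Since the attenuation is a known, invertible factor below Nyquist, the reconstruction part of the theorem survives intact.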


----------



## Mika (Jul 25, 2013)

As far as I know, digital image sensors are a bit more complicated than the classical sampling theorem would predict. First of all, it is important to understand that the captured image is a three-dimensional signal (x, y, and intensity) and how the eye sees it. 

Using the classical sampling theorem, the maximum resolvable frequency could be found by taking the inverse of (2 * pixel pitch), which leads to the Nyquist cut-off frequency. However, this is not quite the case: in measurements the image sensor tends to see further, as explained in [1] and published in [2]

As a short version: if one is able to align the pixel array exactly with the direction of the bar patterns, the classical Nyquist frequency holds. However, it is very difficult to do this, and thus what is actually seen is the result of sub-pixel sampling, which is then averaged by the eye and interpreted as a distinguishable bar. If one took only a single line of the image, I'm not sure the result would be classified as distinguishable.

Add on top of that the question of whether we want to represent the actual shape of the subject at the maximum resolvable frequency even if it lands between the pixels, and it can be seen that there can be a need for three to five times oversampling. Unfortunately, I don't have a good link to show this; I'll try to look for it and post it if I can find it. However, this also tends to be a way of selling more pixels.

EDIT: Ah, found it, the PDF was by Andor [3]. What I want to say with all this is that it is actually not that well defined what is meant by "resolving something" with image sensors.
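For reference, the classical cut-off f = 1 / (2 * pixel pitch) works out as follows for a few example pitches (the pitch values in micrometres are purely illustrative, not tied to any particular body):

```python
# Classical Nyquist cut-off for an image sensor: f = 1 / (2 * pixel pitch).
def nyquist_lp_per_mm(pitch_um):
    """Nyquist frequency in line pairs per millimetre for a pitch in um."""
    return 1000.0 / (2.0 * pitch_um)

for pitch in (4.9, 6.25, 8.0):          # hypothetical pitches in um
    print(pitch, "um ->", round(nyquist_lp_per_mm(pitch), 1), "lp/mm")
```

The sub-pixel effects above are about how close to (or seemingly beyond) this figure a real measurement can get, not about the figure itself.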


----------



## Pi (Jul 25, 2013)

Mika said:


> As far as I know, digital image sensors are a bit more complicated case than the classical sampling theorem would predict. First of all, it is important to understand the full meaning that the captured image is a two three dimensional signal (x, y and intensity) and how the eye sees it.
> 
> Using classical sampling theorem, a maximum resolvable frequency could be found by taking the inverse of (2*pixel pitch), which would lead to Nyqvist cut-off frequency. However, this is not the case, as in the measurements the image sensor tends to see further, as explained in [1] and published in [2]
> 
> ...



Those links have nothing to do with the sampling theorem. The latter does not care whether you image bars, etc.; it tells you how to sample an a priori band-limited signal (which the bars are NOT), and how to reconstruct it. The modification I mentioned is simple and must have been done by somebody already. In short, if your image is already band-limited (this is what the AA filter does, together with the lens), and you have a good estimate of what that limit is, you know how many pixels you need. 

Do not confuse a convenient resolution test (bars) with the sampling theorem.


----------



## Mika (Jul 26, 2013)

Pi said:


> Mika said:
> 
> 
> > As far as I know, digital image sensors are a bit more complicated case than the classical sampling theorem would predict. First of all, it is important to understand the full meaning that the captured image is a two three dimensional signal (x, y and intensity) and how the eye sees it.
> ...



There is a misunderstanding somewhere here; it sounds to me like we are talking about different things or using different terms. I'm well aware of the different nature of the problem described in [3]. However, what I meant to say is related to your earlier PSF considerations: when characterizing the PSF, the energy in a typical photographic objective's spot typically falls within 1-3 camera-body pixels, with the central core of the energy (something like 80%) in a single pixel.

So in that case, you would be quite subject to errors in estimating the PSF due to the effect shown in [3]. And you really don't know the PSF beforehand. Only near the image edges (or with fast lenses) does the PSF become large enough to be sampled well by the camera sensor. If you use a different bench to estimate the PSF with magnification, you then lose the effect of the AA filter as well.

Also, the photographic objective MTF isn't typically evaluated from a PSF (I haven't seen this used in many places), but from an edge or line spread function, which allows sub-pixel sampling and is more robust against positioning with respect to the sampling grid. Astronomical telescopes may be a different matter; I don't have experience in designing them.

The point of [1] was to show that, for example, depending on the angle at which the camera is mounted with respect to the bar chart target, your micro-contrast figures may change slightly.

None of this actually matters to actual photography, though. I don't know whether we should continue by private message; I suppose this is going to get technical, and most people probably aren't interested in seeing it.


----------



## jrista (Jul 26, 2013)

Mika said:


> Pi said:
> 
> 
> > Mika said:
> ...



I love this kind of stuff, and have been reading your discussion so far. Instead of private messages, maybe just start a new thread, and link to the conversation here? So far, my understanding is more in line with yours, Mika...but I'd like to see what Pi has to say on the subject, as perhaps there is something new to learn.


----------



## Pi (Jul 26, 2013)

Mika said:


> There is a misunderstanding somewhere here, for me it sounds like we are talking about different things or use different terms. I'm well aware of the different nature of the problem described in [3]. However, what I meant to say with that is related to your earlier PSF considerations, when characterizing the PSF, the energy in the typical photographic objective spot is typically within the region of 1-3 camera body pixels, with a central core of the energy (something like 80 %) in a single pixel.
> 
> So in that case, you would be quite subject to errors in estimating the PSF due to the effect shown in [3]. And you really don't know the PSF beforehand. Only at the proximity of image edges (or using fast lenses) the PSF may become large enough to be sampled well by the camera sensor. If you are using a different bench for estimating the PSF with magnification, you'll then lose the effect of the AA filter as well.
> 
> Also, the photographic objective MTF isn't typically evaluated from a PSF (haven't seen this being used in many places), but from an edge or line spread function which then allows sub-pixel sampling and is more robust against positioning with respect to sampling grid.



Exactly. The slanted edge test averages over a relatively long edge, and I do not know of a single test that tries to look at a single point, or to align the bars exactly, etc. But that allows you to restore the actual PSF (convolved with something, but let us keep it simple) by a simple calculation. 
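A toy version of the whole pipeline, on a synthetic edge (the Gaussian blur, tilt, and oversampling factor are all assumed values, standing in for real lens + sensor behaviour): project the pixels onto the edge normal to get an oversampled ESF, differentiate to the LSF, and take the Fourier modulus for the MTF.

```python
# Synthetic slanted-edge sketch: oversampled ESF -> LSF -> MTF.
import numpy as np
from math import erf

sigma, slope = 2.0, 0.05           # assumed blur (px) and edge tilt (px/row)
ny, nx = 200, 64

def esf_model(d):
    # Analytic edge blurred by a Gaussian of width sigma
    return 0.5 * (1.0 + erf(d / (sigma * 2.0**0.5)))

xs = np.arange(nx)
edge_pos = nx / 2 + slope * np.arange(ny)          # edge drifts across rows
img = np.array([[esf_model(x - e) for x in xs] for e in edge_pos])

# Project every pixel onto the edge normal -> finely oversampled ESF
dist = (xs[None, :] - edge_pos[:, None]).ravel()
vals = img.ravel()
dense = np.abs(dist) < 20                          # stay in the dense region
d, v = dist[dense], vals[dense]

# Bin at 4x oversampling, differentiate to the LSF, transform to the MTF
step = 0.25
bins = np.arange(-20, 20 + step, step)
idx = np.digitize(d, bins)
esf = np.array([v[idx == i].mean() for i in range(1, len(bins))])
lsf = np.gradient(esf, step)
lsf /= lsf.sum() * step
mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]
freqs = np.fft.rfftfreq(len(lsf), d=step)          # cycles per pixel

# Compare with the analytic Gaussian MTF at the lower frequencies
analytic = np.exp(-2 * np.pi**2 * sigma**2 * freqs**2)
low = freqs < 0.25
print(np.allclose(mtf[low], analytic[low], atol=0.02))
```

The tilt is what makes the sub-pixel sampling possible: each row crosses the edge at a slightly different phase, so pooling the rows yields an ESF sampled far more finely than the pixel pitch.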




> None of this actually matters to the actual photography, though. I don't know whether we should continue with private messages; I suppose this is going to get technical, and lots of people probably aren't interested in seeing it.



It matters enough for so many labs and companies to measure the MTF, and for academics to write papers and books about it.


----------



## Mika (Jul 27, 2013)

Sorry about the delay in replying; the weather has been (almost too) good this week.

When it comes to slanted edge testing, this is where I disagree (partially). If we consider a slanted edge test with a body+lens setup, there are several issues that I'd consider deal breakers for recovering the real point spread function as I know it. 

First, the pixel pitch typically does not support sufficient sampling. Second, the slanted edge is considerably larger, and thus the average of the line spread functions is taken over a comparatively large image block where the PSF has probably changed by some amount - this is typical for wide angle designs with several aspherical surfaces. And if the slanted edge isn't long enough, there will be uncertainty in the slant angle, which then affects the sub-pixel sampling. Third, given that the slant angle is small, this test methodology cannot differentiate between the imaging quality of the tangential and sagittal axes and can miss changes along the averaging direction completely. 

For an extreme example, it would report the MTF of a cylinder lens system as equal to that of a spherical lens system if it was aligned along the imaging axis. This mistake, of course, is hard to imagine happening in real life, but extending the thought a bit, it is easier to see that decentered elements along one axis could be missed. For this reason, the lens would need to be turned 90 degrees to determine both directions.

The bar chart quality assurance benches that I have seen are used as an OK/NOK step in quality control. The actual MTF measurement benches magnify a known spot with a high quality microscope objective, so the measurement of the MTF is much more local, and for that reason I accept it as a representative PSF. The only people I know of who have sampled the PSF directly are astronomers.

What I'm saying is not that the slanted edge method in a lens+body setup isn't useful for determining MTF (with certain error bounds) - it is. It is also very useful for relative comparisons if all systems are measured on the same bench. But it does not provide scientifically accurate MTF values, and additionally, online reviews are usually about the resolving power of a body+lens combination, while the macro-contrast level is not that often reported.

So I suppose it all boils down on what is accepted as a PSF.


----------



## Pi (Jul 27, 2013)

Mika said:


> Sorry about the delay in replying; the weather has been (almost too) good this week.
> 
> When it comes to slanted edge testing, this is where I disagree (partially). If we consider a slanted edge test with a body+lens setup, there are several issues that I'd consider deal breakers for recovering the real point spread function as I know it.
> 
> First, the pixel pitch typically does not support sufficient sampling.



It does - but not of the PSF directly; rather, of the PSF convolved with "something" derived from the pixel size and the AA filter. That "something" is a known quantity, depending on the angle as well. From there, you can get the PSF. Again, the computation is NOT the same as deconvolution, but still, it is not going to be too accurate if the PSF is too concentrated relative to one pixel; the instability, however, is far from the (exponential) one for deconvolution. The lens PSF convolved with the effect of the AA filter is not "too concentrated" (the reason the AA filter exists in the first place) and can be reconstructed well. Factoring out the effect of the AA filter itself is trickier, but if you keep the same sensor, you want to keep that effect in place. 
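Pi's stability argument can be checked numerically. The sketch below (with illustrative numbers, not his actual computation) builds a 1-D line spread function as an assumed Gaussian lens LSF convolved with a known one-pixel aperture, then divides the known aperture response back out of the measured spectrum wherever it is not close to zero:

```python
import numpy as np

# The measured LSF is the lens LSF convolved with the known pixel
# aperture. Because the aperture's transfer function is known, it can
# be divided out away from its zeros - far more stable than blind
# deconvolution. All numbers here are illustrative assumptions.

step = 0.25                              # 4x oversampling, as in ISO 12233
x = np.arange(-16, 16, step)             # 128 samples
lens_lsf = np.exp(-x**2 / (2 * 0.6**2))  # assumed lens LSF, sigma = 0.6 px
lens_lsf /= lens_lsf.sum()

pixel = ((x >= -0.5) & (x < 0.5)).astype(float)  # 1-pixel-wide aperture
pixel /= pixel.sum()
measured_lsf = np.convolve(lens_lsf, pixel, mode="same")  # what the sensor sees

mtf_measured = np.abs(np.fft.rfft(measured_lsf))
mtf_pixel = np.abs(np.fft.rfft(pixel))
mtf_lens_true = np.abs(np.fft.rfft(lens_lsf))

stable = mtf_pixel > 0.1                 # stay away from the sinc zeros
mtf_lens_est = mtf_measured[stable] / mtf_pixel[stable]
err = np.max(np.abs(mtf_lens_est - mtf_lens_true[stable]))
print(f"max recovery error on the stable frequencies: {err:.1e}")
```

On this synthetic example the recovery is exact up to rounding, which is the point: the division is well conditioned as long as the aperture (or AA filter) response stays away from zero.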



> Second, the slanted edge is considerably larger, and thus the average of the line spread functions is taken over a comparatively large image block where the PSF has probably changed by some amount (if this isn't done, there will be uncertainty in the slant angle and the sub-pixel sampling is affected).



Look at the actual targets. They consist of squares which are not "too large" but "large enough". In any case, you are measuring some average of the PSF along the edge, hoping that it does not change wildly.



> Third, given that the slant angle is small, this test methodology cannot differentiate between the imaging quality of the tangential and sagittal axes and can miss changes along the averaging direction completely.
> 
> For an extreme example, it would report the MTF of a cylinder lens system as equal to that of a spherical lens system if it was aligned along the imaging axis. This mistake, of course, is hard to imagine happening in real life, but extending the thought a bit, it is easier to see that decentered elements along one axis could be missed. For this reason, the lens would need to be turned 90 degrees to determine both directions.



They use black slanted squares in the target. In the good old times, DXO would report horizontal and vertical resolution (and they usually were different enough). 



> The bar chart quality assurance benches that I have seen are used as an OK/NOK step in quality control. The actual MTF measurement benches magnify a known spot with a high quality microscope objective, so the measurement of the MTF is much more local, and for that reason I accept it as a representative PSF. The only people I know of who have sampled the PSF directly are astronomers.



Do not underestimate the power of a computed PSF vs. a measured one. In most modern medical imaging techniques, for example, the image is computed from the data with serious mathematical methods, rather than just displaying pictures, as with traditional X-ray tomography. BTW, my work is related to that.


----------



## Mika (Jul 29, 2013)

I checked through some of the late-night photos I have taken with the 28/1.8 and 40D to see how large the star points actually are on the sensor. With 8-15 second exposures, I'm seeing that the star spots fall within 4x3 or 5x4 pixels (the extension along one dimension is dominant here, due to the exposure time and the Earth's rotation). 

This is measured from a straight-out-of-camera JPEG, at f/3.5 and with a lens that had a slight amount of frost on the front element. Even then the spots are quite limited. I do think that had the conditions been better (no frost, a better lens at equal aperture, and better ISO performance than the 40D has) and had I taken RAWs, most of the star images would fall within a 3x3 region (as I said earlier); the spread is mainly because of the AA filter, since otherwise stars should fall within a single pixel, assuming any kind of reasonable performance of the objective and f-numbers below 5.6. With a PSF of size 3x3 pixels, it is hard for me to see how this could be used to compute the MTF, even with sub-pixel sampling, without averaging over a larger area.
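The single-pixel claim is consistent with a back-of-envelope diffraction check. Assuming green light at 550 nm and a 40D pixel pitch of roughly 5.7 µm (an approximation, not a measured value), the diffraction-limited Airy disk diameter of 2.44·λ·N stays below one pixel until around f/5.6:

```python
# Airy disk diameter (first minimum) vs. pixel pitch, with assumed
# values: green light at 0.55 um and a ~5.7 um Canon 40D pixel pitch.
wavelength_um = 0.55
pitch_um = 5.7

for f_number in (2.8, 3.5, 5.6, 8.0):
    airy_um = 2.44 * wavelength_um * f_number
    print(f"f/{f_number}: Airy diameter {airy_um:.1f} um "
          f"= {airy_um / pitch_um:.2f} pixels")
```

At f/3.5 the disk is about 0.8 pixels across, so for a well-corrected lens the residual star spread does indeed point to the AA filter rather than the optics.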

For example, 100 pixels with a 5 µm pitch represents about 0.5 mm, which is significant. If we are talking about a smaller averaging distance, for example 30 pixels, the uncertainty of the slant angle itself would be about 2 degrees. I haven't seen many error estimations of the slanted edge method, but I'm afraid I'll have to do one myself in an upcoming publication.
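The two numbers above can be reproduced with simple trigonometry, under the assumption (mine, for illustration) that the edge endpoints are each located to about one-pixel accuracy:

```python
import math

# Averaging length and slant-angle uncertainty behind the estimates
# above. The ~1 px endpoint uncertainty is an illustrative assumption.
pitch_um = 5.0
print(f"100 px edge: {100 * pitch_um / 1000:.2f} mm averaging length")

for n_pixels in (30, 100):
    # ~1 px of positional uncertainty across an n-pixel-long edge
    angle_deg = math.degrees(math.atan(1.0 / n_pixels))
    print(f"{n_pixels} px edge: slant-angle uncertainty ~{angle_deg:.1f} deg")
```

For a 30-pixel edge this gives about 1.9 degrees, matching Mika's estimate; at 100 pixels it drops below 0.6 degrees, which is the trade-off against the 0.5 mm averaging span.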

It is great to hear about your background in tomography; it helps me understand how you think about these issues. But I have to remind you that we are talking about optical systems in the visual wavelength range, where things are quite a bit different from radio waves (MRI), the THz region, or ultrasound. From my hazy memory, MRI actually measures time differences on different detectors, but it has been 11 years since I last needed to think about anything related to NMR. These wavelengths are used in medicine mainly because of the requirement of non-invasiveness, and that's why you need to deal a lot with image processing techniques. 

I think a better equivalent would be the image processing techniques used in astronomical telescopes to get state-of-the-art results in the visible wavelength range. Adaptive optics correction of the PSF allows ground-based telescopes to just match Hubble over a smaller field of view on good nights.


----------



## Pi (Aug 4, 2013)

Mika said:


> With a PSF of size 3x3 pixels, it is hard for me to see how this could be used to compute the MTF, even with sub-pixel sampling, without averaging over a larger area.



And still they do it. Here is a good read:

http://www.imatest.com/docs/sharpness/
http://www.imatest.com/docs/sharpness/#calc

_*Quote*: Briefly, the ISO-12233 slanted edge method calculates MTF by finding the average edge (4X oversampled using a clever binning algorithm), differentiating it (this is the Line Spread Function (LSF)), then taking the absolute value of the Fourier transform of the LSF. The edge is slanted so the average is derived from a distribution of sampling phases (relationships between the edge and pixel locations). _

Of course, this measures the combined lens+AA filter MTF (it is a bit more complicated than that, actually). Different lenses show different enough results to make the test meaningful. They do average over some area; that is the whole idea of the slanted edge test vs. measuring what happens near a single pixel. 
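The procedure in the Imatest quote can be sketched end to end on a synthetic edge. Everything below is illustrative (a logistic edge with an assumed 0.8 px blur, a 5 degree slant, quarter-pixel binning), not Imatest's actual implementation:

```python
import numpy as np

# ISO 12233-style slanted-edge sketch: project pixels onto the edge
# normal, bin at 4x oversampling to get the edge spread function (ESF),
# differentiate to get the LSF, then take |FFT| for the MTF.

H, W = 60, 60
theta = np.deg2rad(5.0)                  # slant relative to the pixel grid
blur = 0.8                               # assumed edge blur scale, pixels

y, x = np.mgrid[0:H, 0:W].astype(float)
d = (x - W / 2 - np.tan(theta) * y) * np.cos(theta)  # distance to the edge
img = 1.0 / (1.0 + np.exp(-d / blur))                # synthetic slanted edge

# Bin pixels near the edge into quarter-pixel slots along the normal;
# the slant spreads the sampling phases over the slots.
step = 0.25
mask = np.abs(d) < 10
bins = np.floor((d[mask] + 10) / step).astype(int)
n_bins = int(20 / step)
sums = np.bincount(bins, weights=img[mask], minlength=n_bins)
counts = np.bincount(bins, minlength=n_bins)
esf = sums / np.maximum(counts, 1)       # oversampled edge spread function

lsf = np.diff(esf)                       # line spread function
lsf = lsf * np.hamming(lsf.size)         # taper to reduce spectral leakage
mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]
freqs = np.fft.rfftfreq(lsf.size, d=step)            # cycles per pixel

mtf50 = freqs[np.argmax(mtf < 0.5)]      # first frequency below 50% contrast
print(f"MTF50 of the synthetic edge: about {mtf50:.2f} cycles/pixel")
```

Note that the whole 60x60 patch contributes to one averaged ESF, which is exactly the averaging over an area that Mika objects to and Pi defends.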



> It is great to hear about your background in tomography; it helps me understand how you think about these issues. But I have to remind you that we are talking about optical systems in the visual wavelength range, where things are quite a bit different from radio waves (MRI), the THz region, or ultrasound.



My background is in math, which (fortunately for me) is useful regardless of whether you have microwaves, visual wavelengths, etc.


----------



## Mika (Aug 8, 2013)

Sorry again about the delay, I was on a vacation trip.

I have not said that the MTF computed using the slanted edge method isn't useful. However, I have said that the MTF calculated with this method isn't scientifically accurate if you want absolute accuracy. The problem with the averaging is that it tends to lose information about the spot itself: while the averages along two orthogonal directions are computed with sufficient sampling, pretty much nothing is said about what happens between the orthogonal directions.

For this reason, I don't believe it is possible to reconstruct an accurate PSF with the slanted edge method, and thus the measured MTF must be slightly invalid as well. You can think of this from the dimensional reduction point of view: it is generally not possible to recreate a 2D function from two 1D projections. Higher order aberrations give rise to all sorts of interesting spot shapes and orientations with element decentering.
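Mika's dimensional-reduction point has a tiny concrete counterexample: two different 2-D "spots" can share identical projections onto both the horizontal and the vertical axes, so two orthogonal scans alone cannot pin down the spot shape:

```python
import numpy as np

# Two distinct spots with identical horizontal and vertical projections.
spot_a = np.array([[1.0, 0.0],
                   [0.0, 1.0]])   # energy on the main diagonal
spot_b = np.array([[0.0, 1.0],
                   [1.0, 0.0]])   # energy on the anti-diagonal

print(spot_a.sum(axis=0), spot_b.sum(axis=0))  # identical column projections
print(spot_a.sum(axis=1), spot_b.sum(axis=1))  # identical row projections
print(np.array_equal(spot_a, spot_b))          # prints False: the spots differ
```

A diagonal smear from a decentered element is exactly this kind of ambiguity: invisible to purely horizontal and vertical edge measurements.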

But as I said, the slanted edge method allows comparable MTF measurements and is very good at that; it just does not allow absolute measurements where you have to guarantee the results.

It is relatively easy to think that there are no differences in the behavior of rays when shifting from one wavelength range to another. I hear this argument quite often, and this may sound like blasphemy to some, but I disagree. For example, there is a considerable difference in ray propagation physics between a visual wavelength range camera (typically not diffraction limited) and a THz system, and you have to take it into account when designing them.


----------



## jrista (Aug 8, 2013)

Pi said:


> Mika said:
> 
> 
> > Sorry about the delay in replying, the weather has been (almost too) good in this week.
> ...



If you are reverse engineering a PSF which is the convolution of lens + AA filter, then that is the exact issue I was trying to point out before. The AA filter is designed to limit the resolution of the image that reaches the sensor plane by filtering out higher frequencies while leaving lower frequencies intact. If you are reverse engineering the image post-AA, then it has an intrinsic upper limit on resolution. The lens could very well resolve more (and in the case of a good lens like the EF 500/4 L II, most likely does) than what the lens+AA combination resolves.

If you knew the exact nature of the AA filter, you could probably exclude its effect from the PSF and arrive at a result much closer to what the lens itself is actually capable of. If you leave the convolution with the AA filter in, then you haven't really reverse engineered the lens MTF; you've just reverse engineered the lens+AA filter MTF. That might be useful for comparison, but it really doesn't tell you all that much about the lens itself.
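The "exclude its effect" step is one line of algebra in the frequency domain. The sketch below assumes a simple two-point beam-split model for a birefringent AA filter, whose MTF is |cos(pi·f·d)|; the split distance d and the Gaussian-ish lens MTF are invented for illustration:

```python
import numpy as np

# Divide a known (modeled) AA-filter MTF out of the measured system MTF.
f = np.linspace(0.0, 0.45, 10)           # spatial frequency, cycles/pixel
d = 0.7                                  # assumed beam-split distance, pixels
mtf_aa = np.abs(np.cos(np.pi * f * d))   # modeled AA-filter MTF
mtf_lens = np.exp(-(f / 0.35) ** 2)      # assumed lens-only MTF
mtf_measured = mtf_lens * mtf_aa         # what a slanted-edge test reports

# Recover the lens-only MTF where the AA response is not near zero.
stable = mtf_aa > 0.2
recovered = mtf_measured[stable] / mtf_aa[stable]
print(np.allclose(recovered, mtf_lens[stable]))   # prints True by construction
```

In practice the AA filter's exact transfer function is rarely published, which is why jrista's "if you knew" is the hard part.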


----------



## Pi (Aug 8, 2013)

Mika said:


> Sorry again about the delay, I was on a vacation trip.
> 
> I have not said that the MTF computed using the slanted edge method isn't useful. However, I have said that the MTF calculated with this method isn't scientifically accurate if you want absolute accuracy. The problem with the averaging is that it tends to lose information about the spot itself: while the averages along two orthogonal directions are computed with sufficient sampling, pretty much nothing is said about what happens between the orthogonal directions.
> 
> For this reason, I don't believe it is possible to reconstruct an accurate PSF with the slanted edge method, and thus the measured MTF must be slightly invalid as well. You can think of this from the dimensional reduction point of view: it is generally not possible to recreate a 2D function from two 1D projections. Higher order aberrations give rise to all sorts of interesting spot shapes and orientations with element decentering.



We are getting into more philosophical questions here. In science, we use modeling. We make some a priori assumptions, ignore this and that, and then build a model which we analyze. That model is never perfect, and it can't be. We must be aware of its limitations. But saying "you can never have a perfect model, so why bother with science at all" is not the right conclusion. 

Long ago, Riemann suggested modeling anisotropic phenomena with a quadratic form. The level curves (in 2D) of that form are ellipses, so two measurements are enough. In some sense, this is equivalent to taking a truncated Taylor expansion of a more complicated function. 

Going back to photography: when the PSF is well concentrated, approximation by a quadratic form is OK. When it is not, you can see it yourself, with no need for sophisticated methods (like my 35L at f/1.4 in the corners, wide open). Those are some of the limitations in this case. 
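Pi's quadratic-form idea can be sketched concretely. Model the PSF as an elliptical Gaussian exp(-q(x,y)/2), with q a positive quadratic form whose level curves are the ellipses Riemann's model describes; with the axes assumed aligned to the sensor, two orthogonal LSF widths fix the form. The widths below are invented for illustration:

```python
import numpy as np

# Elliptical-Gaussian PSF model fixed by two orthogonal measurements.
sigma_h, sigma_v = 1.1, 0.7   # widths from horizontal/vertical edges (assumed)

# Quadratic form q(x, y) = x^2/sigma_h^2 + y^2/sigma_v^2.
yy, xx = np.mgrid[-5:6, -5:6].astype(float)
psf = np.exp(-0.5 * (xx**2 / sigma_h**2 + yy**2 / sigma_v**2))
psf /= psf.sum()

# The model's marginals reproduce the two measured Gaussian LSFs...
lsf_h = psf.sum(axis=0)
# ...but an axis-aligned form cannot represent a diagonal smear from
# decentering, which is exactly the limitation Mika raises.
print(f"model aspect ratio: {sigma_h / sigma_v:.2f}")
```

Allowing a cross term in q (a rotated ellipse) would need a third measurement direction, which is what rotating the target buys you.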

So when I say you can get the PSF from the MTF, I always assume some reasonable model is in place. Also, I do not necessarily mean that you must use the DXO test. If you are curious enough, you can rotate your target and get more directions. 



> It is relatively easy to think that there are no differences in the behavior of rays when shifting from one wavelength range to another. I hear this argument quite often, and this may sound like blasphemy to some, but I disagree.



In the good old times, DXO published MTF charts for each of the (RAW) RGB channels. They were different enough, indeed - actually, sometimes too different. They reported some kind of average, weighted heavily towards green if I remember well, because people want simple answers. If you dig deeper, the spectral decomposition of the light in the test would play a role too, etc. In principle, the camera projects an infinitely dimensional color space onto a 3D one, so full spectral information is lost anyway.


----------

