# DxO Mark



## logaandm (Nov 16, 2013)

I have long used DxO software, and for almost as long I have distrusted DxO Mark ratings. Aside from the camera ratings, which often do not match what my eyes see, or at least do not scale linearly with human perception of image quality, the lens rating system is equally flawed.

Below is a snapshot of a lens comparison I did on DxO Mark: the Zeiss 100mm macro on three camera bodies. With this SAME lens, the 5D Mark III has 20 Mpix of resolution, the Nikon D3x has 15 Mpix, and the D800 has 17 Mpix. Yet the Zeiss 100mm f/2.0 Makro is rated 32 on the 5D Mark III and 31 on the Nikon D3x, even though the D3x delivers 5 Mpix LESS resolution than the Canon with otherwise identical scores on a measurably worse sensor, and the D800 is rated 36 even though it has 3 Mpix less resolution than the Canon.

WTF? What is in your secret sauce? Maybe guidelines from Nikon marketing?

DxO marks have zero credibility.


----------



## Pi (Nov 16, 2013)

DXO measure real data with their lens tests. Then they encrypt it and present the encrypted version to the public without releasing the key.


----------



## neuroanatomist (Nov 17, 2013)

logaandm said:


> WTF? What is in your secret sauce? Maybe guidelines from Nikon marketing?
> 
> DxO marks have zero credibility.



The Measurements are useful and accurate (with a couple of notable exceptions). But what's confusing you is the admittedly logical assumption that the Scores are based primarily on the Measurements. In fact, if you dig around on the DxOMark site, you'll discover that the overall Lens Score is based on performance at 150 lux, the light level of a dimly-lit warehouse. Thus, lenses score higher on bodies which DxO has determined have better low-light performance. Likewise, that's the reason the 50/1.8 II has a better Score than the 600/4L IS II...an f/4 lens, even a $13,000 one, isn't the best choice for shooting in a dimly lit warehouse.

So, DxOMark's Lens Scores are useful if you do all of your shooting in a dimly lit warehouse. Those of us who don't tend to just ignore the Scores and look at the Measurements. A different sort of bias is present in their Sensor Scores, which give disproportionate weight to ISO 100.

Thus, DxOMark Scores are mostly consistent, once you understand their biases. That's why I refer to them as Biased Scores, and it's no coincidence that can be abbreviated as BS.
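As a toy illustration of how a score dominated by dim-light performance can rank a cheap fast prime above a far sharper f/4 super-telephoto: the weighting formula and every number below are invented for the sake of argument, since DxO's actual formula is not public.

```python
# Toy model of a lens score dominated by performance at a fixed dim light level.
# The formula and all numbers are invented; DxO's real scoring is a black box.

def toy_score(sharpness_pmpix: float, max_aperture: float,
              sensor_lowlight_iso: float) -> float:
    """Score a lens+body combo under dim light.

    A faster lens (smaller f-number) gathers more light, so it can use a
    lower ISO at 150 lux; a body with better high-ISO performance also
    helps. Sharpness contributes, but the low-light factor dominates.
    """
    # Light gathered scales with 1/f^2 (each stop halves the light).
    light_factor = 1.0 / (max_aperture ** 2)
    # Blend: mostly low-light ability, partly sharpness (weights made up).
    return (light_factor * sensor_lowlight_iso) ** 0.5 * sharpness_pmpix ** 0.3

# A cheap f/1.8 prime vs. an expensive, much sharper f/4 super-telephoto,
# both on the same body (all figures illustrative):
fifty = toy_score(sharpness_pmpix=8, max_aperture=1.8, sensor_lowlight_iso=2300)
six_hundred = toy_score(sharpness_pmpix=18, max_aperture=4.0, sensor_lowlight_iso=2300)
print(fifty > six_hundred)  # True: the slower lens loses despite being sharper
```

Under any weighting of this shape, the two-and-a-third stops of extra light swamp the sharpness difference, which is exactly the 50/1.8-beats-600/4 effect described above.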


----------



## AcutancePhotography (Nov 18, 2013)

logaandm said:


> DxO marks have zero credibility.



Would your opinion change if they rated Canon lenses higher?


----------



## duydaniel (Nov 18, 2013)

For some reason, I think Ken Rockwell and Dxo are the same people


----------



## Mt Spokane Photography (Nov 18, 2013)

AcutancePhotography said:


> logaandm said:
> 
> 
> > DxO marks have zero credibility.
> ...


It wouldn't matter to me. They keep their scoring methods secret and have even been forced to change scores after being called out over their Strong Bias. A lens can score equal or better in every area of measurement and still get a lower overall score, so some other undisclosed factor is in play. It's a good reason to be skeptical no matter what brand you own or are looking at buying. That undisclosed factor seems to trump all of the measurements.


----------



## StudentOfLight (Nov 18, 2013)

neuroanatomist said:


> Thus, DxOMark Scores are mostly consistent, once you understand their biases. That's why I refer to them as Biased Scores, and it's no coincidence that can be abbreviated as BS.



Where's the like button! ;D


----------



## Pi (Nov 18, 2013)

Mt Spokane Photography said:


> It wouldn't matter to me. They keep their scoring methods secret and have even been forced to change scores after being called out over their Strong Bias.



This only matters if you think that one can actually assign a single score to a lens.


----------



## mackguyver (Nov 19, 2013)

The other thing I've picked up on is what they call "consistency throughout the field" or something like that, which I think translates to average sharpness; I think they also measure average sharpness across all apertures.

One other thing: they have, on many occasions, made mistakes, mixing up results, particularly in their reviews.


----------



## LetTheRightLensIn (Nov 19, 2013)

> DxO marks have zero credibility.



Their sensor plot data has a LOT of credibility.

Their original lens tests just about none.

The current lens tests I am not sure, but certainly not any of the overall numbers.


----------



## seekthedragon (Nov 19, 2013)

Don't rely on the scores. They reflect the taste of the DxO crew, which may be completely different from yours. You cannot describe a lens or a sensor with one number that fits everyone anyway.
However, the measurements are pretty reliable. Check those out.

PS: I really would like to hear more about those notable examples.


----------



## Pi (Nov 19, 2013)

seekthedragon said:


> However, the measurements are pretty reliable. Check that out.



They are, most of them at least, but we no longer have access to them. They now publish some encrypted BS resolution scores. This is what they used to publish before (and much more):

Then they decided that we are not intelligent enough to understand those charts and replaced a few charts with one number, computed by a secret formula. I am talking about their resolution numbers in "mp".


----------



## dtaylor (Nov 20, 2013)

duydaniel said:


> For some reason, I think Ken Rockwell and Dxo are the same people



No. Ken Rockwell makes sense more often than DxO.


----------



## candc (Nov 20, 2013)

DxO scores are useful for comparing similar lenses on the same body (the Canon, Tamron, and Sigma 70-200 2.8s, for instance), but it gets screwy when you compare different bodies with the same lens, like in the screenshot above. A higher score but lower P-Mpix on a D800 compared to the 5DIII? The D800 should have higher sharpness with the same lens, right?


----------



## Woody (Nov 20, 2013)

LetTheRightLensIn said:


> Their sensor plot data has a LOT of credibility.



Just a word of caution for sensors: only the measurement data, NOT the assigned scores.


----------



## neuroanatomist (Nov 20, 2013)

seekthedragon said:


> PS: I really would like to hear more about those notable examples.



Here's one obvious one that I've run across:

When DxOMark first published their measurements and review of the EF 70-200mm f/2.8L IS II, they stated that the MkI version of the 70-200/2.8L IS was sharper. They stated, "_The Canon EF 70-200mm f/2.8L IS II USM offers slightly less resolution with 51 lp/mm compared to the excellent 61 lp/mm of the Canon EF 70-200mm f/2.8L IS USM_," and went on to explain that it was due to 'less homogenous behavior across the field'. They concluded that, "_...the overall scores come out slightly in favor of the previous version of the Canon EF 70-200mm f/2.8L IS USM, especially for Travel and Sport photography, which are the main use cases of these telephoto zoom lenses._"

That conclusion differed from pretty much every other reviewer/tester who compared the two versions of the lens and found that the MkII was hands-down the better lens (the f/2.8L IS MkI was considered a worse performer optically than both the 70-200/4L IS and the 70-200/2.8L non-IS). Comments can be added to their review pages, and several people commented that personal experience with both lenses, and reviews on DPR, PZ, et al., had all found the MkII to be the better lens. A DxO Labs employee replied, "_Thanks for bringing this potential mistake to our attention. But, after checking with all our experts in the lab, there isn’t really a mistake...overall the Mark 1 has a slightly higher and more homogeneous resolution. So, it scores better on a full frame camera, like the Canon 5D Mark II used in the review._"

The above quotes are copied from the review page for the 70-200 II, which has not been edited. But, if you compare these two lenses on the 5DII today (screenshot below), you can see that under their new P-Mpix measure for sharpness, the MkII version of the lens performs better than the MkI. They also updated their 'use case scores' so that the MkII is now rated higher for Travel and Sport photography. So…either they re-tested the lens (a different copy, presumably), or whatever black-box factors applied to convert real units (lp/mm) to units that DxO made up (P-Mpix) were applied in a nonlinear manner that somehow favored the MkII version of the lens. I suspect the former is the case, but they said nothing about it, did not add any sort of notation to the original review (which would have been appropriate), and did nothing else to acknowledge their mistake.
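For scale, here is a naive conversion from lp/mm to a megapixel-equivalent, using nothing but Nyquist sampling (two pixels per line pair) over a 36×24 mm full frame; whatever extra weighting turns a figure like this into P-Mpix is the black box in question.

```python
def naive_megapixels(lp_per_mm: float, width_mm: float = 36.0,
                     height_mm: float = 24.0) -> float:
    """Megapixels needed to sample a given lp/mm over a full-frame sensor.

    One line pair requires two pixels (Nyquist), so the pixel counts along
    each axis are 2 * lp/mm * dimension in mm. This is a plain geometric
    conversion, not DxO's P-Mpix formula.
    """
    px_w = 2 * lp_per_mm * width_mm
    px_h = 2 * lp_per_mm * height_mm
    return px_w * px_h / 1e6

# The two figures quoted in the original review:
print(round(naive_megapixels(51), 1))  # MkII at 51 lp/mm -> 9.0 MP equivalent
print(round(naive_megapixels(61), 1))  # MkI at 61 lp/mm -> 12.9 MP equivalent
```

On this naive conversion the quoted 51 vs. 61 lp/mm gap is large, which makes the later reversal under P-Mpix all the harder to explain without a re-test.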

DxO has used the tagline 'Image Science'. Speaking as a card-carrying scientist, when we discover a mistake in previously published data (it happens), we inform the journal and they publish a correction. DxO's failure to do so in this case is what I'd call 'Bad Science'.


----------



## candc (Nov 20, 2013)

That's a good example. While still convoluted, their new methodology seems more in line.


----------



## Woody (Nov 20, 2013)

candc said:


> That's a good example, while still convoluted, their new methodology seems more in line.



'... more in line...' for the particular example. There are many other cases where their assigned scores are still NOT consistent with real world usage.


----------



## Rienzphotoz (Nov 20, 2013)

neuroanatomist said:


> Thus, DxOMark Scores are mostly consistent, once you understand their biases. That's why I refer to them as Biased Scores, and it's no coincidence that can be abbreviated as BS.


 ;D


----------



## AmbientLight (Nov 20, 2013)

Excuse me, but what I don't understand is why there is so much discussion regarding DxO Mark?

It should have become obvious, based on forum members actually collecting DxO Mark results (many thanks to Neuro), that this is not an independent research institute but a company with its own interests, which lead it to publish not what may be qualified as results from proper scientific procedure, but rather the results the company wants to publish. There is a marked difference here.

It should also be clear that marketing statements ("image science"?) are sometimes contrary to actual fact. So please excuse DxO for publishing what its managers think it should publish, because as a company they have every right to do so. On the other hand, there is then no reason to complain about this or that score being somewhat different from other tests or actual real-life experience.


----------



## wockawocka (Nov 20, 2013)

When it comes to cameras I've actually used (shot images with, studied the raw files of, and so on), I've found the DxOMark scores completely out of whack.

Completely. I wouldn't trust anything on there to relate to real-world, organic use.


----------



## Woody (Nov 20, 2013)

AmbientLight said:


> Excuse me, but what I don't understand is why there is so much discussion regarding DxO Mark?...
> 
> It should also be clear that marketing statements ("image science"?) are sometimes contrary to actual fact. So please excuse DxO for publishing what managers think they should publish, because as a company they have any right to do so. On the other hand there is no reason to complain about this or that score being somewhat different from other tests or actual real life experience.



There is a lot of discussion because many people take them seriously, including somewhat respectable sites like DPReview. Luckily the latter carry out their own tests too.


----------



## seekthedragon (Nov 20, 2013)

neuroanatomist said:


> seekthedragon said:
> 
> 
> > PS: I really would like to hear more about those notable examples.
> ...



This is quite serious. Thanks for sharing.


----------



## Rienzphotoz (Nov 20, 2013)

neuroanatomist said:


> seekthedragon said:
> 
> 
> > PS: I really would like to hear more about those notable examples.
> ...


NO! ... that cannot be ... DxO can NEVER be wrong, they are the god of camera gear tests ;D No I refuse to believe this false information ;D


----------



## mackguyver (Nov 20, 2013)

Neuro's example was the main one I was talking about and it's pretty sad they didn't publish a retraction. There was another review where they just changed the text, but I can't remember which one it was. 

I wouldn't say their data is worthless, however. The scores, yes; the analysis, most of the time; but their measurements are pretty reliable. I've found the measurements on sharpness, CA, distortion, and vignetting to match my lenses, with the exception of my 180mm f/3.5L, which is extremely sharp in my experience versus their measurements. My guess is that they measured it at a telephoto distance, where it has average performance, instead of at macro distance, where it is superb.

If you use the tools yourself, and compare lens metrics, not scores, it's a great and FREE resource.


----------

