# Dustin Abbott Nikon D850 vs Canon 5D Mark IV sensor comparison



## sebasan (Dec 7, 2017)

https://www.youtube.com/watch?v=U_WPjtjuMGk


----------



## aceflibble (Dec 7, 2017)

The contrast differences he brings up a couple of times in the first half are neither lens nor sensor-specific, but processor-specific. I do wish people would learn that it's the sensor and processor together which make the image, not just the sensor. It's especially aggravating when it's someone who generally knows what they're talking about and is positioned as an informer and educator. (See also his mispronunciation of 'ISO' as if it's an acronym.)
To some people it may seem like a minor distinction, but given how frequently these companies re-use processors with different sensors (or vice-versa) in various cameras, it really is very helpful to bear in mind that it's the sensor _and processor_ within each camera at work, not just the sensor. Knowing which elements can be attributed to just the sensor (initial capture colour accuracy), the processor (noise, color profiling, and interpretation of contrast), or both (everything else) is very helpful when then looking at future cameras which may re-use the same part(s).

That said, it's interesting to see someone come to such a mixed conclusion (by the second half of his video), compared to the usual all-or-nothing comparisons you get online. The Canon seemingly being better when you nail exposure, but the Nikon being better for deeper editing and post-processing, is in line with my own experience with the two brands over the last couple of years (as well as Sony, who of course make Nikon's sensors). Typically online you see people only vouching for one or the other.

Also interesting to see the Tamron lens at 120mm on the Nikon shots and 114mm on the Canon in the first couple of examples. While that wouldn't account for contrast, noise, colour accuracy, or anything else like that, and has no bearing on the later examples looking at recoverable range, it would account for the apparent bump in sharpness that the Canon has in most of the early comparisons and it could also account for the slightly more accurate metering. (That's a longer shot; Tamron's light transmission across their zoom range is usually very accurate.)


----------



## DaviSto (Dec 7, 2017)

aceflibble said:


> That said, it's interesting to see someone come to such a mixed conclusion (by the second half of his video), compared to the usual all-or-nothing comparisons you get online. The Canon seemingly being better when you nail exposure, but the Nikon being better for deeper editing and post-processing, is in line with my own experience with the two brands over the last couple of years (as well as Sony, who of course make Nikon's sensors). Typically online you see people only vouching for one or the other.


If 'nailing exposure' means being no more than two or three stops out, I'm going to agree with you. A fairer conclusion might be that, when it comes to pushing images, the Nikon sensor has a small advantage over the Canon if either your technical limitations or, more positively, your style of shooting require you to push your out-of-camera images by over three stops.


----------



## Geek (Dec 7, 2017)

Hey aceflibble, I don't think I understand the impact that the processor has on the image beyond how quickly the image can be stored and manipulated once acquired from the A/D converter that processes the analog data from the sensor. Could you elaborate?


----------



## Larsskv (Dec 7, 2017)

I found the review very interesting and balanced as well. Dustin does a great job!

Honestly, my expectation was that the Nikon would have more of an advantage at low ISO, but the difference seemed insignificant to my eyes. 

I believe the difference in sharpness must be caused by a weaker lens on the Nikon, but it would be interesting to see a sharpness comparison of the two cameras and the Tamron 15-30 lenses. 

One small thing could be improved though. Many of the pictures that were compared had slightly different exposures (shutter speed), which hurts the comparison of ISOs, and also the ISO invariance tests. My experience is that more exposure benefits the end result with regards to (less) noise, when using the same ISO (as long as there is no clipping). Therefore, I would expect the pictures with the longer shutter speeds to perform better with regard to shadow lifting. In other words, the comparisons weren’t 100% apples to apples. That said, this doesn’t change the overall impression that both cameras perform very well in real-world use.
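The exposure point above (more light at the same ISO means less visible noise, as long as nothing clips) is a direct consequence of photon shot noise, and can be sketched with a toy Poisson simulation. The photon counts below are invented for illustration, and the model deliberately ignores read noise.

```python
import numpy as np

rng = np.random.default_rng(42)

def snr_for_exposure(mean_photons, trials=100_000):
    """Signal-to-noise ratio of a pixel that collects `mean_photons`
    on average, assuming pure photon shot noise (Poisson statistics)."""
    samples = rng.poisson(mean_photons, trials)
    return samples.mean() / samples.std()

# Same scene, same ISO, but one frame gets twice the shutter time,
# so each pixel collects twice as many photons on average.
snr_short = snr_for_exposure(1_000)   # e.g. 1/100 s
snr_long = snr_for_exposure(2_000)    # e.g. 1/50 s

# Shot-noise-limited SNR scales with sqrt(photons), so doubling the
# exposure improves SNR by a factor of about sqrt(2) ~ 1.41.
print(snr_short, snr_long, snr_long / snr_short)
```

So the frame with the longer shutter speed starts from a cleaner baseline before any shadow lifting is applied, which is why unequal shutter speeds muddy an ISO comparison.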


----------



## mistaspeedy (Dec 7, 2017)

I also don't think the processor makes any difference to the RAW images at all. The processor and jpg processing algorithms do make a difference to the quality of JPG images, but those are not what is being compared in the video. (RAW images are being tested in the video.)


----------



## Mr Majestyk (Dec 7, 2017)

You can do deep pushing with the 5D4. It's easy to rescue very high-contrast images and still have nice clean shadows. Yes, the Nikon can go an extra stop or stop and a half, but how often do you need to go over 4EV without leaving an unnatural result? Nikon is great if you completely screw up exposure, but it only has a 0.8EV photographic dynamic range advantage at base ISO; from ISO 160 on they are almost identical, with the Canon slightly better above ISO 3200. The sensor differences are no longer the main issue for Canon (although it would be great to see BSI and stacked sensors and further improvements); it's more about the feature gimping. 5D4 is a nice camera but could have easily been a lot better. A 5DsII could be a D850 killer if they wanted, but that would make it a 1DXII killer and they just can't have that. Nikon, by contrast, doesn't hesitate to put pro features from the D5 into lesser cameras. The D500 is so close to the D5 in AF performance it's not funny. You can own a D500 + D850 for less than a D5.


----------



## Mikehit (Dec 7, 2017)

Mr Majestyk said:


> ...it's more about the feature gimping. 5D4 is a nice camera but could have easily been a lot better.



What 'feature gimping' are you referring to?


----------



## Mikehit (Dec 7, 2017)

aceflibble said:


> The contrast differences he brings up a couple of times in the first half are neither lens nor sensor-specific, but processor-specific. I do wish people would learn that it's the sensor and processor together which make the image, not just the sensor. It's especially aggravating when it's someone who generally knows what they're talking about and is positioned as an informer and educator. (See also his mispronunciation of 'ISO' as if it's an acronym.)



You are splitting hairs - it is whether the D850 _as a camera_ is better than the 5DIV _as a camera_. People use 'sensor' as a shorthand because it is the easiest thing to talk about even though the sensor provides the raw material. Yes, a manufacturer could screw up a great sensor with a poor processor but what is really important is whether you can get good photos from it.


----------



## Click (Dec 7, 2017)

Larsskv said:


> I found the review very interesting and balanced as well. Dustin does a great job!



+1 

Well done, Dustin.


----------



## Isaacheus (Dec 8, 2017)

Mikehit said:


> Mr Majestyk said:
> 
> 
> > ...it's more about the feature gimping. 5D4 is a nice camera but could have easily been a lot better.
> ...



Not the original poster, but I'd assume things like the crop on 4K, the limited (read: storage-intensive) 4K codec, the fixed screen rather than a tilting/articulating one, and the minimal fps increase over the 5D3. At least, those are the ones that put me off it.

Other extras like USB charging, a UHS-II card slot, an IBIS system, etc. would have made it an easy purchase. Not saying it's a bad camera by any means, just held back.


----------



## 9VIII (Dec 8, 2017)

Geek said:


> Hey aceflibble, I don't think I understand the impact that the processor has on the image beyond how quickly the image can be stored and manipulated once acquired from the A/D converter that processes the analog data from the sensor. Could you elaborate?



For starters, every camera does baked-in noise reduction. For the last five years, companies have been pointing to the processor whenever they announce new improvements in high ISO.
“Maybe” base ISO isn’t given any processing, but that's unlikely.
I’d bet there are some color adjustments going on as well.


----------



## Mikehit (Dec 8, 2017)

9VIII said:


> Geek said:
> 
> 
> > Hey aceflibble, I don't think I understand the impact that the processor has on the image beyond how quickly the image can be stored and manipulated once acquired from the A/D converter that processes the analog data from the sensor. Could you elaborate?
> ...



I agree.
When the signal is converted from analogue to digital, I would be amazed if they (or any manufacturer) avoided the temptation to fiddle with the S/N and get a 'cleaner' image. You only need to look at Sony's 'star-eater' update to see what can be done in the conversion. Raw is not raw - it is 1s and 0s with conversion applied.


----------



## snoke (Dec 8, 2017)

Larsskv said:


> One small thing could be improved though. Many of the pictures that were compared had slightly different exposures (shutter speed), which hurts the comparison of ISOs, and also the ISO invariance tests. My experience is that more exposure benefits the end result with regards to (less) noise, when using the same ISO (as long as there is no clipping).



All ISOs are not equal. The Canon 5DIV's ISO 100 is a fake ISO number. DxO measured it: https://www.dxomark.com/Cameras/Compare/Side-by-side/Nikon-D850-versus-Canon-EOS-5D-Mark-IV___1177_1106


----------



## neuroanatomist (Dec 8, 2017)

snoke said:


> Larsskv said:
> 
> 
> > One small thing could be improved though. Many of the pictures that were compared had slightly different exposures (shutter speed), which hurts the comparison of ISOs, and also the ISO invariance tests. My experience is that more exposure benefits the end result with regards to (less) noise, when using the same ISO (as long as there is no clipping).
> ...



As you can also notice from your link, the D850 ISO 100 is also 'fake'. What is your point?


----------



## aceflibble (Dec 8, 2017)

mistaspeedy said:


> I also don't think the processor makes any difference to the RAW images at all. The processor and jpg processing algorithms do make a difference to the quality of JPG images, but those are not what is being compared in the video. (RAW images are being tested in the video.)


'Raw' is processed too.

Basically, all the sensor does is collect the light; it's the processor which interprets it.

Yes this applies to 'raw'. No, it's not just for .jpgs. There is a huge, _huge_ difference between the processor within a camera and a 'processor' as in software converting to a different format. They may be frequently called the same thing but they are entirely different.

'Raw' files are not actually truly 'raw' data. To create the 'raw' image file the processor needs to interpret the light collected by the sensor.

The way I've found people easily understand this in the past is to liken it to food production, so bear with me a second.
Think about basic food ingredients such as flour or sugar, which you might use to make a cake. When you buy them they seem to be in a pretty basic form—'raw', you could say—but you know that before you were able to buy that flour and sugar it first had to go through many stages of production. It had to grow, be harvested, cleaned up, processed and packaged up. Only after the ingredients have already gone through all that can you buy them and then turn them into a cake.

You can think of the camera's processor—remember, that's the physical processor driving everything, not the software—as being that intermediate stage between food first being grown and you getting to cook with it. When you open your 'raw' image and start to 'process' it, that's the equivalent of you starting to bake that cake; the base ingredients you're working with were already harvested (sensor) and cleaned and packaged up (processor).

Another way to think of it is like a solar panel. The panel—that's the camera sensor—can collect the light, but it requires much more—the processor—to actually turn that sunlight into energy, which then can be used to power whatever you want. (That last bit is your post-processing.)


So, what is the camera's processor responsible for? It takes the light levels the sensor reports and puts them all in order. Noise is mostly down to the processor as it tries to interpret variations in the under-stimulated sensor. The processor decides what data can be safely discarded (yes, even with lossless 'raw', some data is lost) when multiple neighbouring pixels report exactly the same information. (This is the basis of what we commonly refer to as dynamic range.) When you see colour banding or specific noise patterns, that's usually down to the processor.

If you look at Fuji cameras, you can see how important the processor is compared to the sensor. Their 'A' cameras use a Bayer sensor while their other 'X' cameras use a unique 'X-Trans' array. Despite the sensors' pixels being in a different order, though, all their cameras end up with the same look to the raw files because their processors are the same. A lot of people think DxO don't measure Fuji cameras because of the X-Trans array, but DxO can't measure the X-A Bayer cameras either... because it's really the Fuji processor that is getting in the way.

This is also why Nikon and Sony cameras have varied in image despite them using the same sensors. They use the same sensor, but not the same processor. As a result, the basic interpretation of colour, contrast, brightness, and noise are different.

Canon users of a decade or so ago, especially portrait photographers, will remember how big a deal it was when DIGIC II came along, and then again when DIGIC II was replaced, and how colours and contrast in Canon cameras changed so much in that period even though the sensors themselves mostly hadn't changed (especially the APS-C sensors). There are still some people who swear by DIGIC II cameras (and the first generation of Fuji processors, for that matter) as producing the best files for portraiture.


This is all done to the 'raw'. You are _never_ getting truly untouched 'raw' data. If you did you wouldn't be able to open it. Every 'raw' file, from every manufacturer, has to have been processed by the camera's processor—again, this is the physical chip we're talking about, not the software .jpg conversion—before you can open it.

And before anybody asks "well why don't they give me the option to have the file unprocessed, and why doesn't someone develop a way to open that _truly_ raw data?" the answer is quite simple: imagine buying a lens which never allowed you to focus it, ever. Yeah, now you see why you don't want _truly_ raw data.


Final word on the subject: it would really help ease confusion if people got used to being more specific with their terminology. When you're talking about editing a raw file be sure to call it "post-processing", not just "processing". Don't call Photoshop a "processor", for example, but a "raw processor" or "post-processor" are fine.
There can be up to five different stages of 'processing' from the time the shutter is pressed to the time the file is finished, and you typically use at least two _physical_ processors in that process, so you can see how the process of using the correct names for each processor used in the process will make the process of discussing processes and processors less confusing.

Yeah, try and get your heads around that one.




Mikehit said:


> You are splitting hairs - it is whether the D850 _as a camera_ is better than the 5DIV _as a camera_. People use 'sensor' as a shorthand because it is the easiest thing to talk about even though the sensor provides the raw material. Yes, a manufacturer could screw up a great sensor with a poor processor but what is really important is whether you can get good photos from it.


When you want to get an idea of what these parts can do, so you can start to get an idea of what future cameras may be like, it's very important to make the distinction.

E.G.
The 7D3 may well use the same processor—in fact it will likely have two of them—but a different sensor. Knowing which part is responsible for which aspects of the 5D4's image quality can help us estimate the nature of the 7D3's image quality, which helps inform early adopters. Conversely, if you ascribe everything simply to the sensor, you've learnt nothing about the next camera.


----------



## Geek (Dec 8, 2017)

aceflibble said:


> Basically, all the sensor does is collect the light; it's the processor which interprets it.
> 
> Yes this applies to 'raw'. No, it's not just for .jpgs. There is a huge, _huge_ difference between the processor within a camera and a 'processor' as in software converting to a different format. They may be frequently called the same thing but they are entirely different.
> 
> 'Raw' files are not actually truly 'raw' data. To create the 'raw' image file the processor needs to interpret the light collected by the sensor.



I think you are missing part of the equation here. The Analog-to-Digital (A/D) converters are among the more critical parts in the path from the image sensor to a usable raw file or JPEG image.

The A/D converters and associated amplifiers and other analog signal conditioning components are responsible for converting the analog data collected by the sensor array into the binary data that the processor(s) can handle.

Until recently this has been a big differentiator between Canon and Sony/Nikon camera systems. Sony's sensors and A/D converters have been manufactured on the same piece of silicon, giving them an advantage in signal noise and therefore cleaner final images. Canon has typically used A/D converters on separate pieces of silicon, which requires longer paths and creates more potential for noise in the analog path before the A/D converters. (Read some of the other threads that cover this in far more detail than I'll bore you with.)

The processors (Canon Digic - Digic 7, Sony BIONZ X, Nikon whatever) take the digital data from the A/D converters and process it. Some if not all of the processors have extra Digital Signal Processor (DSP) hardware in addition to the general-purpose cores. The DSPs are used to accelerate processing of the digital data.

The reality is that all processors only process the digital data that is given to them from the A/D converters. The difference between the original Digic and the latest Digic 7, 8 or whatever is the performance of the processor and the power usage. With the appropriate firmware, an original Digic can do the same thing that the latest Digic 7 can do, only much slower and using more power. The Canon Digic line uses an ARM core (a general purpose processor) and some specialized DSP hardware to accelerate processing the digital data.

The processor version makes no difference to the final image; only the sensor, the A/D conversion, and the algorithms used to process the digital data do.

And yes, the raw file is the digital data from the output of the A/D converters along with some other information required to turn the data into a usable image. I'm sure, as mentioned above, there is some processing applied to the raw data to reduce noise, adjust for ISO levels, etc. But that is relatively minimal compared to the processing required for final images.

So in summary, the sensor and A/D conversion are very critical to the images. All of the work done by the camera to ensure proper exposure and focus is critical to the images. Even the image-manipulation algorithms are critical to the images. However, as long as the processor is fast enough and does not use too much power, it really does not matter which one is used or even how many are used.
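As a minimal sketch of the A/D step this post centres on, here is a toy 14-bit quantiser. The full-well capacity and gain are invented numbers, not measurements of any actual Canon or Sony part.

```python
# Toy model of analog-to-digital conversion: the ADC maps an analog
# pixel signal (here in electrons, scaled by a gain) onto a 14-bit
# integer code. Values are illustrative only.
FULL_WELL_E = 60_000          # electrons at sensor saturation (invented)
MAX_CODE = 2 ** 14 - 1        # 14-bit ADC -> codes 0..16383

def adc(electrons, gain=MAX_CODE / FULL_WELL_E):
    """Quantise an analog signal to a 14-bit digital number."""
    code = round(electrons * gain)
    return max(0, min(MAX_CODE, code))  # clip to the ADC's range

print(adc(0))          # black
print(adc(30_000))     # mid-scale
print(adc(70_000))     # clipped at 16383: blown highlights are gone for good
```

Everything downstream of this step, whichever processor runs it, only ever sees these integer codes; information lost in quantisation or clipping cannot be recovered later.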


----------



## stevelee (Dec 9, 2017)

I don’t guess any usual camera gives out the raw data from the little sensors under the green, red, and blue filters.


----------



## neuroanatomist (Dec 9, 2017)

stevelee said:


> I don’t guess any usual camera gives out the raw data from the little sensors under the green, red, and blue filters.



Those data are in the RAW file; the demosaicing is performed by the RAW converter (in-camera or on your computer). Typical converters do not allow you to access those data; you need more esoteric software (e.g. RawDigger, Rawnalyze).


----------



## 9VIII (Dec 10, 2017)

Geek said:


> ...
> And yes, the raw file is the digital data from the output of the A/D converters along with some other information required to turn the data into a usable image. I'm sure, as mentioned above, there is some processing applied to the raw data to reduce noise, adjust for ISO levels, etc. But that is relatively minimal compared to the processing required for final images.
> ...



This paragraph is self-contradictory.
All RAW files are baked; it is impossible to access the data exactly as it was generated by the ADC on any modern camera. The design of the processor will affect the final image. Hook up a Digic 4 to a 5DS and you’ll get a very different image, not just get it more slowly.
Maybe at one point things weren’t processed that way, but they are now.


----------



## aceflibble (Dec 10, 2017)

Geek said:


> [words cut for space]


You're overstating how much the ADC is doing and also how distinct it is from the rest of the processor. The ADC matters, yes, and in past generations some of the biggest leaps in image quality and possibilities were down to advancements in ADC efficiency. But for several hardware generations now we've been at a point where the ADC is A) capable of far more than is being asked of it, and B) so refined and 'direct' in its work that it really can't have any particular image qualities attributed to it. Additionally, it has been a long time since it was normal for the A-D conversion to be handled by a physically separate unit from the main processor (I say 'main'; of course many SLRs have been made with multiple CPUs, and which one is doing the most work varies from camera to camera). In many cases the whole imaging sequence—light capture by the sensor, conversion to digital, and processing and saving—is now handled within what is, for patent purposes, technically a single part, rendering the distinction between the ADC and the CPU not only irrelevant but actually inaccurate.

Analogue-digital conversion matters, but your take on it is a little out of date and even in the cases where it's not (e.g. Fuji cameras) it's still not really correctly crediting each part.


----------



## Don Haines (Dec 10, 2017)

Old tech.... the 7D2....

By going to an external A/D system like the traditional Canon system, you have 8 (or 16) A/D converters, and 20 million pixels to read at a burst rate of 10FPS.... That means that in a dual DIGIC setup, the A/D has to be able to read a new value every 80 nanoseconds.... to read video at 60FPS, you are now looking at a new reading from each A/D every 13 nanoseconds... Darn Fast! This is why there is no 4K on the 7D2.

Then switch to new tech.....

Compare this to an on-chip A/D.... you have one for each row.... for the same size sensor you are now looking at 5,472 readings times 10 frames per second per converter..... or about 18 microseconds per reading..... a heck of a lot slower, and that means greater accuracy and much lower noise.

The change to the type of A/D system means far more than which processor is running the algorithm to place the output data into a file......
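The readout arithmetic above can be checked in a few lines. The 20 MP, 16-converter, and 5,472-readings-per-row figures are taken from the post itself, not from any Canon datasheet.

```python
# Back-of-the-envelope check of per-ADC readout times.
pixels = 20_000_000
fps_stills, fps_video = 10, 60

# Off-chip: 16 ADCs share the whole sensor's readout.
per_adc_rate = pixels * fps_stills / 16          # readings per second
print(1 / per_adc_rate * 1e9)                    # ~80 ns per reading

per_adc_rate_video = pixels * fps_video / 16
print(1 / per_adc_rate_video * 1e9)              # ~13 ns per reading

# On-chip, massively parallel: each converter handles 5,472 readings
# per frame, so at 10 fps each reading can take far longer.
readings_per_second = 5_472 * fps_stills
print(1 / readings_per_second * 1e6)             # ~18 microseconds per reading
```

The last figure comes to roughly 18 microseconds per reading, around 200x slower per conversion than the off-chip case, which is the headroom that buys the extra accuracy.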


----------



## RGF (Dec 11, 2017)

Thought this was very good. I am surprised how close the 5D M4 comes to the D850.


----------



## midluk (Dec 11, 2017)

Even in cases where the ADC is embedded in the same chip as the (digital) processor, I would not count it as part of the processor for this discussion. For me the ADC is part of the sensor (even if not on the same chip). The ADC of course has an influence on the noise performance of the image, but not on color (except for cases where it introduces crosstalk between pixels of different colors).

I have played around with the raw files (a self-made C++ program using libraw) from the 70D, and there are at least two areas where it appears to not be completely raw. The raw file contains metadata (including the preview JPEG) and a 14-bit number for each pixel. Each pixel is sensitive to only one color. On the edges of the sensor there are some special covered pixels, some black, some always saturated.

The black level in the file is always 4096, independent of exposure time and ISO. I would have expected that to go up due to dark current. It seems like the camera is subtracting (or adding) an offset to all values to shift black to 4096. The camera can compute the offset from dedicated pixels at the edge of the sensor which are covered by some material and therefore always receive no light. It is, however, not impossible that the offset is applied entirely in the analog domain before the ADC, so the processor might not actually alter the data from the ADC here.

Known bad pixels (from a list of bad pixels kept in the camera) are masked (likely replaced by the average of the surrounding pixels). My camera has developed two or three hot pixels over the years, which no longer appear in the raw file after letting the camera check for them during manual sensor cleaning (close the body cap, set the camera to manual sensor cleaning, wait ~30s, turn the camera off, wait some more). The fact that they are not removed automatically before being made known to the camera leads to the conclusion that no additional noise removal (like Sony's star eater) is performed on the raw data.

The processor can do nothing for color (except for adding metadata and the preview JPEG) in raw files. If changes in color appear with changes of processor, this might be caused by changed color filter arrays which produce better results but need more processing power to generate the in-camera JPEG (which only the new processor can do reliably), or just by changes to the default processing profiles in the camera, which are then also included as defaults in raw converters.
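The two in-camera adjustments described above can be sketched on synthetic data. The 4096 offset is the value reported in the post; the neighbour-average replacement is an assumption, since the actual in-camera masking method isn't documented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 14-bit "raw" frame: signal plus a fixed black-level offset
# (4096 is the value observed on the 70D, per the post above).
BLACK_LEVEL = 4096
raw = rng.integers(0, 2000, size=(6, 6)) + BLACK_LEVEL

# A known-bad (hot) pixel the camera has on its internal list.
bad_pixels = [(2, 3)]
raw[2, 3] = 16383  # stuck at full scale

# Step 1: mask known bad pixels, e.g. by replacing each with the
# average of its neighbours (one plausible scheme).
cleaned = raw.astype(float)
for y, x in bad_pixels:
    neigh = [cleaned[y - 1, x], cleaned[y + 1, x],
             cleaned[y, x - 1], cleaned[y, x + 1]]
    cleaned[y, x] = sum(neigh) / len(neigh)

# Step 2: subtract the black level before further processing.
signal = cleaned - BLACK_LEVEL

print(signal.min() >= 0)        # no negative signal after the shift
print(signal[2, 3])             # hot pixel now blends in with neighbours
```

Both steps are exactly the kind of light, camera-specific tweaking being debated here: the data is touched, but no broad noise reduction or colour work is applied.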


----------



## sanj (Dec 11, 2017)

RGF said:


> Thought this was very good. I am surprised how close the 5D M4 comes to the D850.



Mean.


----------



## Geek (Dec 11, 2017)

9VIII said:


> Geek said:
> 
> 
> > ...
> ...



I disagree completely!! Hook up a Digic 4 to a 5DS, use all of the same algorithms and the A/D converter from the 5DS, and you will get the same results as the 5DS with dual Digic 6s. Much slower, but the same results. In fact, the original Digic would give the same results - far too slow to be used in a product, but still the same results.

If you call raw files baked by having perhaps a little noise reduction (even that is somewhat speculative) applied, plus any offsets for ISO, dark current, etc., then I agree with you that the raw files are baked. However, the data provided in a raw file is representative of the digital values read from the individual photo sites on the sensor, with minor tweaks specific to the camera hardware. The raw data does not have any significant processing applied.


----------



## aceflibble (Dec 12, 2017)

If any of y'all don't think the processor can or does make a difference to colour, you clearly were not around Canon during the big DIGIC II crisis. Go look up the history of that processor and its cameras. Note that they all used different sensors and different ADCs (this was prior to the ADC being physically encapsulated within the CPU housing), but they all ended up with the same (raw) colour (accounting for sensor-size variations, of course). That DIGIC II-specific colour is still frequently requested (especially for video), and the DIGIC II bodies have held their value better (relative to age) than the DIGIC III bodies as a result.


In any case, at least we're now (mostly) in agreement that it's not just the sensor responsible for producing everything, which was the original point being made.


----------



## bwud (Dec 14, 2017)

aceflibble said:


> It's especially aggravating when it's someone who generally knows what they're talking about and is positioned as an informer and educator. (See also his mispronunciation of 'ISO' as if it's an acronym.)



It is an acronym, hence it being pronounced “eye-sew.” Dustin used it as an initialism.


----------



## Don Haines (Dec 14, 2017)

aceflibble said:


> If any of y'all don't think the processor can or does make a difference to colour, you clearly were not around Canon during the big DIGIC II crisis. Go look up the history of that processor and its cameras. Note that they all used different sensors and different ADCs (this was prior to the ADC being physically encapsulated within the CPU housing), but they all ended up with the same (raw) colour (accounting for sensor-size variations, of course). That DIGIC II-specific colour is still frequently requested (especially for video), and the DIGIC II bodies have held their value better (relative to age) than the DIGIC III bodies as a result.
> 
> 
> In any case, at least we're now (mostly) in agreement that it's not just the sensor responsible for producing everything, which was the original point being made.



Having been active in DSP since the late 1970s, I disagree.

You build up a RAW file by recording the various settings on the camera and the lens, and then reading the sensor data from the output of the A/D converter (typically an 8X8 pixel block), compressing it, and proceeding to the next block until the entire sensor is read. A Jpg file may also be created and stored separately, and a jpg thumbnail may be appended to the RAW file.

The processor used to do this has no effect on the image, it only affects speed and power consumption.


----------



## Geek (Dec 14, 2017)

aceflibble said:


> If any of y'all don't think the processor can or does make a difference to colour, you clearly were not around Canon during the big DIGIC II crisis. Go look up the history of that processor and its cameras. Note that they all used different sensors and different ADCs (this was prior to the ADC being physically encapsulated within the CPU housing), but they all ended up with the same (raw) colour (accounting for sensor-size variations, of course). That DIGIC II-specific colour is still frequently requested (especially for video), and the DIGIC II bodies have held their value better (relative to age) than the DIGIC III bodies as a result.
> 
> 
> In any case, at least we're now (mostly) in agreement that it's not just the sensor responsible for producing everything, which was the original point being made.



Again, to be precise, the processor does not have any impact on the final image. The lens, focusing mechanism, exposure mechanism, sensor, all of the analog circuitry, A/D converters, algorithms (firmware/software) used by the processor to process images, etc. have an impact on the image, but the processor itself does not.

As Don stated, the processor just impacts the speed and power used.


----------



## aceflibble (Dec 16, 2017)

"Everything that makes up and is used by the processor matters... but the processor doesn't matter."

... You two realise how you're contradicting yourselves there, right? Or did you skip the earlier part of the thread? We've already gone over how everything other than the sensor itself is encapsulated by the processor now (and this has been the way for over a decade now).


----------



## 3kramd5 (Dec 17, 2017)

aceflibble said:


> "Everything that makes up and is used by the processor matters... but the processor doesn't matter."
> 
> ... You two realise how you're contradicting yourselves there, right? Or did you skip the earlier part of the thread? We've already gone over how everything other than the sensor itself is encapsulated by the processor now (and this has been the way for over a decade now).



The problem is that some of what “we” have gone over is bad information buried in good - for example, that the basis of what we call dynamic range is the processor throwing away data from adjacent pixels of equal magnitude, when in actuality it is a function of full saturation and the lowest meaningful signal (i.e. where SNR is greater than 1).

What is lost in all this is that a processor merely does what it’s told to. It’s a series of logic gates configured programmatically by people. If you took the same logic from a Digic 5 chip and ported it to a Digic 2 (assuming the image fits), they would act the same on the same supplied compatible data.
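The definition above translates directly into the usual engineering formula: dynamic range in stops is log2 of the full-well capacity over the noise floor. The electron counts below are invented for illustration, not measured values for any camera.

```python
import math

def dynamic_range_stops(full_well_e, read_noise_e):
    """Engineering dynamic range in stops: the ratio of the largest
    recordable signal (full-well capacity, in electrons) to the noise
    floor (read noise), expressed in powers of two."""
    return math.log2(full_well_e / read_noise_e)

# Illustrative numbers only:
print(dynamic_range_stops(60_000, 3))   # ~14.3 stops
print(dynamic_range_stops(60_000, 6))   # ~13.3 stops
```

Note that no processor appears in the formula: doubling the noise floor costs exactly one stop no matter which chip later does the arithmetic.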


----------



## Geek (Dec 18, 2017)

aceflibble said:


> "Everything that makes up and is used by the processor matters... but the processor doesn't matter."
> 
> ... You two realise how you're contradicting yourselves there, right? Or did you skip the earlier part of the thread? We've already gone over how everything other than the sensor itself is encapsulated by the processor now (and this has been the way for over a decade now).



I think I've been trolled!! Either that or these people are lumping way too much stuff into their definition of a processor. Apparently including some "magic"!

Uncle, I'm tapping out!


----------

