# New Sensor Tech in EOS 7D Mark II [CR2]



## Canon Rumors Guy (Jun 19, 2014)

```
<div name="googleone_share_1" style="position:relative;z-index:5;float: right; /*margin: 70px 0 0 0;*/ top:70px; right:120px; width:0;"><g:plusone size="tall" count="1" href="http://www.canonrumors.com/?p=16756"></g:plusone></div><div style="float: right; margin:0 0 70px 70px;"><a href="https://twitter.com/share" class="twitter-share-button" data-count="vertical" data-url="http://www.canonrumors.com/?p=16756">Tweet</a></div>
We’re told to definitely expect new sensor technology to be introduced in the Canon EOS 7D Mark II. This tech will be used in all forthcoming Canon DSLRs. What is it? We’re not 100% sure yet, though we’re told it’s definitely not a foveon type technology that <a href="http://www.canonrumors.com/2013/05/patent-canon-foveon-sensor/" target="_blank">we’ve previously seen in patents</a>.</p>
<p>This may be one of Canon’s best kept secrets as it’s apparently going to be more than an “evolutionary” technology.</p>
<p>We’re filtering through a lot of emails about this new sensor tech and we’re not sure what’s true and what isn’t.</p>
<p><em>More to come… We’re not putting a [CR3] on this until we know what the new tech is.</em></p>
<p><strong><span style="color: rgb(255, 0, 0);">c</span>r</strong></p>
```


----------



## Don Haines (Jun 19, 2014)

and this explains the wait and delays far better than conspiracy theories.....


----------



## angaras (Jun 19, 2014)

Sooner the better. We have had a long wait


----------



## Lee Jay (Jun 19, 2014)

Maybe it's compatibility with DPP 4.

Assuming it's a Bayer sensor with multiple pixels under each microlens (like the 70D), there's not a lot they can do to improve sensor performance that's outside the realm of read noise. There are several ways to attack that one, and some of them involve doing clever things with the multiple pixels per microlens, such as reading out each one at a different ISO and then combining them, sort of like what Magic Lantern has done to increase DR.


----------



## Lee Jay (Jun 19, 2014)

dilbert said:


> > This may be one of Canon’s best kept secrets as it’s apparently going to be more than an “evolutionary” technology.
> 
> 
> 
> Does anyone have an example of what Canon call revolutionary?



IS, USM, video in a full-frame dSLR, and dual-pixel technology come to mind.


----------



## neuroanatomist (Jun 19, 2014)

dilbert said:


> Does anyone have an example of what Canon call revolutionary?


Yes. "_Canon has unveiled its most advanced PowerShot compact camera ever – the *revolutionary* 14.3 Megapixel PowerShot G1 X..._" (link)


----------



## naylor83 (Jun 19, 2014)

Wow, this sounds exciting. Perhaps our five-year wait will actually turn out to have been worth it.

Right now I'll just about take anything though ... as long as it isn't 18 megapixels


----------



## Don Haines (Jun 19, 2014)

dilbert said:


> > This may be one of Canon’s best kept secrets as it’s apparently going to be more than an “evolutionary” technology.
> 
> 
> 
> Does anyone have an example of what Canon call revolutionary?


DPAF
white camera bodies


----------



## cid (Jun 19, 2014)

neuroanatomist said:


> dilbert said:
> 
> 
> > Does anyone have an example of what Canon call revolutionary?
> ...



it depends ... revolutionary for who, 14MPx selfies could be revolutionary for half of FB users ;D


----------



## PhotoCat (Jun 19, 2014)

I would be happy if this new sensor addresses the shadow noise problem at low ISO settings like 100 & 160.


----------



## Chaitanya (Jun 19, 2014)




----------



## Woody (Jun 19, 2014)

Lee Jay said:


> dilbert said:
> 
> 
> > > This may be one of Canon’s best kept secrets as it’s apparently going to be more than an “evolutionary” technology.
> ...



Their early 45-point AF sensor and ring USM lenses offered revolutionary and unrivalled AF performance for nearly a decade before Nikon finally came on board with their 51-point AF sensor in the D3.

Other examples of Canon's glorious past can be found here:
http://www.the-digital-picture.com/reviews/20-years-of-canon-eos.aspx


----------



## Woody (Jun 19, 2014)

PhotoCat said:


> I would be happy if this new sensor addresses the shadow noise problem at low ISO settings like 100 & 160.



Crossing my fingers... Not because I need to recover shadow details by 4 stops... Mostly for bragging rights...


----------



## Delish (Jun 19, 2014)

For some reason I doubt it will be a new 22nm production process and peltier cooled sensor ;D


----------



## Lee Jay (Jun 19, 2014)

Putting a few things together into PURE speculation...

It was mentioned before that the viewfinder would be pretty big - as big as the 5DIII. I pointed out that would make it pretty dim.

What if the "revolutionary sensor technology" is quad-pixel, for a cross-type AF sensor under every pixel, and that it works so well with a new processor that there's no need for a separate PDAF module in the bottom of the mirror box? That would enable the main mirror to be "fully silvered" instead of "partially silvered" which would mean a brighter viewfinder even at the same size.

I kind of doubt it, but it's a bit fun to speculate.


----------



## Delish (Jun 19, 2014)

Lee Jay said:


> Putting a few things together into PURE speculation...
> 
> It was mentioned before that the viewfinder would be pretty big - as big as the 5DIII. I pointed out that would make it pretty dim.
> 
> ...



Would that not require the mirror to be up to AF? gogo EVF


----------



## tomscott (Jun 19, 2014)

Really exciting at this point anything is a bonus! I will be glad to see Canon innovating again!


----------



## ecka (Jun 19, 2014)

Does "NEW" means it was never used before? ... or more like the usual "new" from the last couple of years?


----------



## wockawocka (Jun 19, 2014)

Part of me wouldn't be surprised if this is the high MP body.


----------



## Krob78 (Jun 19, 2014)

dilbert said:


> neuroanatomist said:
> 
> 
> > dilbert said:
> ...


Not so quickly!! It actually states it as being more than "EVOLUTIONARY" not "Revolutionary"... Therefore it's quite possibly more than an evolutionary advancement but not revolutionary. Or for that matter it could be more than evolutionary and quite revolutionary as well...


----------



## Krob78 (Jun 19, 2014)

tomscott said:


> Really exciting at this point anything is a bonus! I will be glad to see Canon innovating again!


+1 I'm having a good feeling that Canon won't let us down with the 7D MkII! ;D


----------



## Lee Jay (Jun 19, 2014)

Delish said:


> Lee Jay said:
> 
> 
> > Putting a few things together into PURE speculation...
> ...



I guess it would, so it would still need to be partially silvered to retain AF and not have nothing but a stupid EVF.


----------



## justawriter (Jun 19, 2014)

Could this mean the introduction of the long rumored multi-imp iconograph? ;D
http://wiki.lspace.org/mediawiki/index.php/Iconograph


----------



## bdunbar79 (Jun 19, 2014)

wockawocka said:


> Part of me wouldn't be surprised if this is the high MP body.



Unfortunately I wouldn't either.


----------



## NancyP (Jun 19, 2014)

This sounds as if they have completed a new fabrication process, which would be good news. Hey, after at least two years of speculation, why not?


----------



## dolina (Jun 19, 2014)

Announcement for the 7D replacement should fall on August or September.


----------



## x-vision (Jun 19, 2014)

Lee Jay said:


> Assuming it's a Bayer sensor with multiple pixels under each microlens (like the 70D), there's not a lot they can do to improve sensor performance that's outside the realm of read noise. There are several ways to attack that one, and some of them involve doing clever things with the multiple pixels per microlens, such as reading out each one at a different ISO and then combining them, sort of like what Magic Lantern has done to increase DR.



Either that - or, it might not be a Bayer sensor in the first place.

By the look of things, the so called dual-pixel tech is actually quad-pixel already.
See my previous post on the topic here.

With a quad-pixel design, rather than having a single color filter per pixel, it's theoretically possible to have individual color filters for each of the four sub-pixels.
These color filters don't need to be monochromatic R/G/B filters anymore.
Instead, these could be a combination of di/poly-chromatic filters, from which the _full _color of a pixel can be derived.
That's better than a Bayer sensor, where two of the pixel colors need to be interpolated from neighboring pixels. 

So, you never know. The 7DII could have the first non-Bayer sensor in a DSLR.
If they use a combination of dichromatic filters for each sub-pixel, they could achieve maybe 1 stop of ISO improvement vs a Bayer sensor. 
I think Canon will inevitably implement this sooner or later, given that they have gone the quad-pixel route already. 
The question is, will the 7DII be the first camera to have it - or will we have to wait more for that.


----------



## Marsu42 (Jun 19, 2014)

Canon Rumors said:


> This may be one of Canon’s best kept secrets as it’s apparently going to be more than an “evolutionary” technology.



Reminds me of the newly developed sensor for the 6d (inc. the newly-developed 11-point af system )... which is a certainly nice, but it's all the same general sensor generation for a long time.

My bet: They won't release a "revolutionary" iq technology in a crop camera, but would target the high-end ff market first. Much more likely it's in the direction of on-sensor af, evf/ovf hybrid and video-stills combination for ultra-high fps.


----------



## Lee Jay (Jun 19, 2014)

x-vision said:


> Lee Jay said:
> 
> 
> > Assuming it's a Bayer sensor with multiple pixels under each microlens (like the 70D), there's not a lot they can do to improve sensor performance that's outside the realm of read noise. There are several ways to attack that one, and some of them involve doing clever things with the multiple pixels per microlens, such as reading out each one at a different ISO and then combining them, sort of like what Magic Lantern has done to increase DR.
> ...



Wouldn't you be worried about that approach messing up the phase detection?


----------



## x-vision (Jun 19, 2014)

Lee Jay said:


> Wouldn't you be worried about that approach messing up the phase detection?


Yes ;D.


----------



## Don Haines (Jun 19, 2014)

I have wondered for a long time why the bayer filter has survived. Instead of breaking up a pixel into 4 squares, G-G-R-B, why can't it be made into 3 rectangles of R-G-B where the colours no go from 25 percent of the area to 33 percent of the area and you gain about a third of a stop?

Now insert DPAF and imagine each pixel made up of 6 subpixels, 3DPAF pairs, and alternate the orientation of adjacent pixels so you can do DPAF in both the vertical and the horizontal plane.

that would certainly be a logical growth from what is in the 70D....


----------



## Lee Jay (Jun 19, 2014)

Don Haines said:


> I have wondered for a long time why the bayer filter has survived. Instead of breaking up a pixel into 4 squares, G-G-R-B, why can't it be made into 3 rectangles of R-G-B where the colours no go from 25 percent of the area to 33 percent of the area and you gain about a third of a stop?
> 
> Now imagine each pixel made up of 6 subpixels, 3DPAF pairs, and alternate the orientation of adjacent pixels so you can do DPAF in both the vertical and the horizontal plane.
> 
> that would certainly be a logical growth from what is in the 70D....



I think if you have quad, each pixel can do horizontal, vertical and diagonal in each direction phase measurements. So, no need for six.


----------



## neuroanatomist (Jun 19, 2014)

x-vision said:


> Either that - or, it might not be a Bayer sensor in the first place.
> 
> By the look of things, the so called dual-pixel tech is actually quad-pixel already.
> See my previous post on the topic here.



You mean your previous post that was bogus and immediately discredited, because your conclusion was based on erroneous interpretation? :


----------



## Don Haines (Jun 19, 2014)

Lee Jay said:


> Don Haines said:
> 
> 
> > I have wondered for a long time why the bayer filter has survived. Instead of breaking up a pixel into 4 squares, G-G-R-B, why can't it be made into 3 rectangles of R-G-B where the colours no go from 25 percent of the area to 33 percent of the area and you gain about a third of a stop?
> ...



The idea of six subpixels is to have pairs for DPAF, and the three pairs gets rid of the current bayer sensor with half the real estate devoted to green. This improves the sensitivity of red and blue by a third of a stop.

I did a quick drawing of the idea....


----------



## x-vision (Jun 19, 2014)

Don Haines said:


> I did a quick drawing of the idea....


The thing about this arrangement is that it uses single-color (monochromatic) filters.
So, you are still 'throwing away' 2/3rds of the incident light.
The trick would be to use more transmissive filters (say R+G, R+B, G+B) and thus throw away less than 2/3rds of light.
Your arrangement does improve resolution, though.


----------



## x-vision (Jun 19, 2014)

neuroanatomist said:


> You mean your previous post that was bogus and immediately discredited, because your conclusion was based on erroneous interpretation? :


It's all a matter of interpretation, I guess 8). 
The arguments against my previous post were extremely weak.


----------



## gsealy (Jun 19, 2014)

It seems to me that we have some general trends and concepts going on:
1.) Dual pixel technology (and the like) is going to be spread throughout the product line, it is a significant differentiator compared to competitors.
2.) Canon is adding value to each product through firmware updates, thus giving them a longer life cycle.
3.) Video is becoming more pervasive, I would expect the 7DII to have at least a clean HDMI out for external recording.
4.) 4K is in the future and while the 7DII might not have it now, perhaps the sensor will support it for a future upgrade done at a service center. 
5.) Low noise at low light situations is a must. 

I am not an expert on these things - but Canon has to be more than competitive. This is a major release in their product line, so they have to make a statement.


----------



## rrcphoto (Jun 19, 2014)

Lee Jay said:


> Putting a few things together into PURE speculation...
> 
> It was mentioned before that the viewfinder would be pretty big - as big as the 5DIII. I pointed out that would make it pretty dim.



Not really . the difference would be negligible over the current 7D viewfinder. I didn't see people stating that the 50D viewfinder was far brighter than than 7D - which would be about the same ratio.


----------



## Don Haines (Jun 19, 2014)

x-vision said:


> Don Haines said:
> 
> 
> > I did a quick drawing of the idea....
> ...


interesting.... so you think that it could be done with the more transmissive filters? the combination (if it works) could get to only throwing away 1/3 of the light, as opposed to the current bayer which throws away 3/4 of the light.....


----------



## pierlux (Jun 19, 2014)

I think CR guy knows more than he's reporting. C'mon Craig, tell us something more!


----------



## neech7 (Jun 19, 2014)

Woody said:


> PhotoCat said:
> 
> 
> > I would be happy if this new sensor addresses the shadow noise problem at low ISO settings like 100 & 160.
> ...



And to finally shut those annoying Nikon and Sony fanboys up.


----------



## carlosmeldano (Jun 19, 2014)

When 70D was announced after a long delay after circulating speculations on both a 20MP or a 24MP sensor or an existing 18MP sensor, I thought that this technology is not completely done yet. I think it was developed further and tested for a complete solution for 7D2 and the delay is because of this.

I also vote for an advanced Dual/Quad Pixel CMOS AF for 7D2 and the rumors from last year (before 70D) points to this.

But, I also think that 70D is capable of doing more. I think the tech applied in the 70D is a bit dumbed down by firmware to keep room for a bigger bang, the 7D2.


----------



## tiger82 (Jun 19, 2014)

dilbert said:


> Does anyone have an example of what Canon call revolutionary?



White bodies?


----------



## Don Haines (Jun 19, 2014)

tiger82 said:


> dilbert said:
> 
> 
> > Does anyone have an example of what Canon call revolutionary?
> ...


I heard it will come out in midnight camouflage....


----------



## LetTheRightLensIn (Jun 19, 2014)

Does it improve IQ or is it more of an ultra-enhanced dual-pixel AF type thing? I hope the former.

Will they finally catch up (and maybe even go beyond) for low ISO DR?

Using a new sensor tech can certainly explain why the 7D2 and high MP camera are long to arrive (a wise and well worth it decision IMO, so long as this really does get the improved DR and other things).


----------



## LetTheRightLensIn (Jun 19, 2014)

Lee Jay said:


> Maybe it's compatibility with DPP 4.
> 
> Assuming it's a Bayer sensor with multiple pixels under each microlens (like the 70D), there's not a lot they can do to improve sensor performance that's outside the realm of read noise. There are several ways to attack that one, and some of them involve doing clever things with the multiple pixels per microlens, such as reading out each one at a different ISO and then combining them, sort of like what Magic Lantern has done to increase DR.



That has problems. Much better would be if this makes use of the patent where they read out every photosite at high and low ISO at the same time. A good way to boost DR up a lot at lower ISOs.


----------



## LetTheRightLensIn (Jun 19, 2014)

Lee Jay said:


> Putting a few things together into PURE speculation...
> 
> It was mentioned before that the viewfinder would be pretty big - as big as the 5DIII. I pointed out that would make it pretty dim.
> 
> ...



perhaps, i'd personally rather they fix up low ISO DR first though, if it's this quad pixel af supreme stuff we can maybe forget ever getting better DR for ages and ages I fear (although they might go different ways with high MP FF and 7D2 so we'd need to see the next FF to know for sure)


----------



## LetTheRightLensIn (Jun 19, 2014)

Don Haines said:


> I have wondered for a long time why the bayer filter has survived. Instead of breaking up a pixel into 4 squares, G-G-R-B, why can't it be made into 3 rectangles of R-G-B where the colours no go from 25 percent of the area to 33 percent of the area and you gain about a third of a stop?
> 
> Now insert DPAF and imagine each pixel made up of 6 subpixels, 3DPAF pairs, and alternate the orientation of adjacent pixels so you can do DPAF in both the vertical and the horizontal plane.
> 
> that would certainly be a logical growth from what is in the 70D....



green is more important overall


----------



## LetTheRightLensIn (Jun 19, 2014)

x-vision said:


> Lee Jay said:
> 
> 
> > Assuming it's a Bayer sensor with multiple pixels under each microlens (like the 70D), there's not a lot they can do to improve sensor performance that's outside the realm of read noise. There are several ways to attack that one, and some of them involve doing clever things with the multiple pixels per microlens, such as reading out each one at a different ISO and then combining them, sort of like what Magic Lantern has done to increase DR.
> ...



does it make sense to even call them sub-pixels at that point
nah


----------



## LetTheRightLensIn (Jun 19, 2014)

Don Haines said:


> Lee Jay said:
> 
> 
> > Don Haines said:
> ...



wouldn't that make color aliasing and all sorts of other interpolations awfully tricky


----------



## LetTheRightLensIn (Jun 19, 2014)

neech7 said:


> Woody said:
> 
> 
> > PhotoCat said:
> ...



Or how about mostly so we as Canon users can be free to tackle more types of shots very well?
You know it's not about this fanboy nonsense, it's about getting better performance, for US, for Canon users.


----------



## Tugela (Jun 19, 2014)

If it is new technology it would be patented, and based on the delays to the 7D2 release, that patent application would have already been published by now (publication happens a year after filing). 

So, just go through their relatively recent sensor patent applications and it will be one of those.


----------



## pknight (Jun 19, 2014)

Tugela said:


> If it is new technology it would be patented, and based on the delays to the 7D2 release, that patent application would have already been published by now (publication happens a year after filing).
> 
> So, just go through their relatively recent sensor patent applications and it will be one of those.



Or, some combination of features included in those patents, which might make it more difficult for anyone to predict.


----------



## memoriaphoto (Jun 19, 2014)

Tugela said:


> If it is new technology it would be patented, and based on the delays to the 7D2 release, that patent application would have already been published by now (publication happens a year after filing).
> 
> So, just go through their relatively recent sensor patent applications and it will be one of those.



Could have been stamped as a trade secret instead? Not sure if that's normal in this business (and for a sensor) but it would keep it hush-hush at least...


----------



## sengineer (Jun 19, 2014)

Here is a new sensor tech that would revolutionize cameras. http://spectrum.ieee.org/tech-talk/semiconductors/devices/sony-creates-curved-cmos-sensors-that-mimic-the-eye/?utm_source=techalert&utm_medium=email&utm_campaign=061914
It is a curved sensor like the eye.


----------



## rs (Jun 19, 2014)

x-vision said:


> So, you never know. The 7DII could have the first non-Bayer sensor in a DSLR.



Sigma beat them to it back in 2002 with the SD9 and it's foveon sensor. And the many models which have superseded it. But even they were beaten to market by the Fuji S1 Pro of 2000 with its Super CCD arrangement.


----------



## Tugela (Jun 19, 2014)

pknight said:


> Tugela said:
> 
> 
> > If it is new technology it would be patented, and based on the delays to the 7D2 release, that patent application would have already been published by now (publication happens a year after filing).
> ...



They would have patented those combinations as well, to avoid the possibility that someone else might do it before they released their product.

Whatever the technology is, it will be largely described in their existing patent literature, it will not be something out of the blue.


----------



## jrista (Jun 19, 2014)

Marsu42 said:


> Canon Rumors said:
> 
> 
> > This may be one of Canon’s best kept secrets as it’s apparently going to be more than an “evolutionary” technology.
> ...



None of those other technologies were ever rumored to be more than evolutionary, though. And "new" is what every brand tacks onto all their "evolved" technologies...that's par for the course. Given how old the original 7D is, Canon has to know they can't spit out some mediocre evolutionary improvements, especially with everyone in Canon's camp scrambling for more DR, after such a long wait. 

Canon has a lot of good technology, and a lot of patents for really good technology that I haven't seen implemented in any of their sensors (not even their video sensors, which is what many of the patents are for.) I hope that the 7D II will be the camera that they finally actually EMPLOY some of their cool sensor technology with.


----------



## Sabaki (Jun 19, 2014)

Question 1: Regardless of brand...which camera is the current King of Crop?

Question 2: What does Canon need to deliver with the 7Dii in order to put it in a league above every other crop body out there?


----------



## neuroanatomist (Jun 19, 2014)

x-vision said:


> neuroanatomist said:
> 
> 
> > You mean your previous post that was bogus and immediately discredited, because your conclusion was based on erroneous interpretation? :
> ...



Lol. The argument was that you weren't looking at the active imaging area of the sensor. Unless you paid Chipworks for the full analysis, your conclusions are bogus.


----------



## l_d_allan (Jun 19, 2014)

tomscott said:


> Really exciting at this point anything is a bonus! I will be glad to see Canon innovating again!



I'm looking forward to 7d2 technology finding its way to ## and ### series cameras.


----------



## Lee Jay (Jun 19, 2014)

l_d_allan said:


> tomscott said:
> 
> 
> > Really exciting at this point anything is a bonus! I will be glad to see Canon innovating again!
> ...



I'm personally looking forward to a full-frame (5D) version of the 7D replacement, assuming it's as-rumored.


----------



## docsmith (Jun 19, 2014)

Lee Jay said:


> l_d_allan said:
> 
> 
> > tomscott said:
> ...



+1.  7DII...Fall 2014, 5DIV/1DXII Fall of 2015/Spring 2016. That would be 4 years...about right, especially if the technology is truly "revolutionary" I can't see them waiting more than a year (+/-) to get it into their flagship bodies.


----------



## jrista (Jun 19, 2014)

x-vision said:


> Lee Jay said:
> 
> 
> > Assuming it's a Bayer sensor with multiple pixels under each microlens (like the 70D), there's not a lot they can do to improve sensor performance that's outside the realm of read noise. There are several ways to attack that one, and some of them involve doing clever things with the multiple pixels per microlens, such as reading out each one at a different ISO and then combining them, sort of like what Magic Lantern has done to increase DR.
> ...



I debunked your theory on this before. You are looking at the BACK side of the sensor, near the PERIPHERY, where readout connections and the like go. What you are looking at in that ULTRA TINY chipworks image is NOT the sensor. It is a stamp on the back side edge of the sensor...that's all! Canon does not have QPAF technology. You are wildly misinterpreting something you do not understand, and purpetrating a falsehood. 

Canon has multiple patents for DPAF...they have ZERO patents for QPAF. As it stands, no one actually has a patent for any kind of quad pixel focal-plane AF system.

Further, for everyone else who continues to perpetrate the myth that somehow the two halves of the pixels, which are under not only one microlens, but also under one color filter block, could somehow magically be used to expand dynamic range "for free" are fooling themselves, and anyone who listens to them. Magic lantern either uses two FULL sensor reads (vs. half sensor reads), or they do line interpolation for half the resolution, to achieve their dynamic range. There is no free increase to dynamic range, and DPAF isn't going to somehow allow more dynamic range for free. The problem with the idea of using one half of the AF photodiodes for an ISO 100 read, and the other half for an ISO 800 read, is that is HALF the light! That is not the same as what ML does, which involves the FULL quantity of light, or else half the light AND half the resolution.

There is no magical dynamic range enhancement with Canon's DPAF. The name is even misleading, as it isn't dual "pixels"...it's dual photodiodes per pixel. That should tell you something about the true nature of DPAF right there.


----------



## Mt Spokane Photography (Jun 19, 2014)

dilbert said:


> > This may be one of Canon’s best kept secrets as it’s apparently going to be more than an “evolutionary” technology.
> 
> 
> 
> Does anyone have an example of what Canon call revolutionary?


 
Perhaps you do not remember when Canon introduced a FF CMOS sensor, but Nikon held on to their APS-C sensor! That was pretty revolutionary at the time.


----------



## 9VIII (Jun 19, 2014)

LetTheRightLensIn said:


> x-vision said:
> 
> 
> > Lee Jay said:
> ...



Right.
What would be really cool to see is some sort of hardware level binning process that maintains the integrity of the RAW file.

Half the reason I'm so anxious for super high resolution cameras is that I haven't been terribly impressed with the image quality off my 5D2. That nasty AA filter (which I'm pretty sure is especially bad on the 5D2) effectively cuts resolution in half. When I first saw my pictures on a decent 4MP monitor I was amazed at how little detail loss there was vs. looking at the image zoomed to 100%. My bet is that a good 4K (8MP) monitor is going to display your images with just as much detail as a high quality print... Because the detail actually isn't there in the first place.

One option is just quadrupling resolution and getting rid of the AA filter (which I'm actually fine with), but if they could bin the full per-pixel RGB signal on the sensor it should effectively deal with moire, and we get to keep our current file size, and it should produce an actual 20MP image instead of the blurred out fake we currently end up with.

The last thing I really want to see is the integration of clear microlenses. Even the heavily faded green pixels that we have right now still block a lot of light. Given how advanced interpolation is I doubt that eliminating the colour value for one of the pixels would have a significant impact on image quality.


----------



## jrista (Jun 20, 2014)

Mt Spokane Photography said:


> dilbert said:
> 
> 
> > > This may be one of Canon’s best kept secrets as it’s apparently going to be more than an “evolutionary” technology.
> ...



Diffractive optics. Everyone else thought it was literally impossible to make a lens with diffractive optics. Canon persisted, and they have the most compact 400mm FF DSLR lens in the world. Canon has had PLENTY of revolutionary technological advances and improvements to their technology.

Apparently, a mere two years since the release of the D800 is enough to forget the technological LEADERSHIP that Canon has demonstrated for decades. It's only been TWO YEARS, Dilbert...a camera generation is usually closer to FOUR years...so it's no surprise that Canon hasn't leapfrogged the competition yet.


----------



## dufflover (Jun 20, 2014)

Considering what Canon has called "big leaps" in the past few years I am very skeptical LOL. Sure some of it was good like DPAF, but when it comes to sensor tech considering how old their general line up is (in the crop space) I would not be surprised one bit if their (r)evolutionary (whatever word you want) new sensor only ends up matching the competition ... welcome step nonetheless ofcourse!


----------



## brad-man (Jun 20, 2014)

I think the new sensor will be a marked improvement simply because Canon probably feels like their reputation depends on it, and I think it does. I don't want to debate market share, profit margins or anything else of the sort. Canon needs to release a bad-ass sensor as a matter of pride and reputation, and I think they will. The 7Dll (or whatever they call it) needs to be a significant upgrade from the 70D (a fine camera in its own right). And not just in build quality, auto focus and frame rate, but Image Quality. As an aside, I have been waiting for the 5Dlll to come down to a price level that I'm comfortable with for quite some time. I shoot with a 6D and a 7D and would absolutely love the "all in one" beauty of the mark lll, but I'm just an enthusiast and follow the old adage of putting my money into the glass. Anyway, the 5Dlll is frequently available in the $2600 price range and this is approaching _my_ price range, but I will absolutely wait to see if Canon's new generation of sensors are enough of an upgrade to pay what_ I_ consider to be a ridiculous price for. It's just a hobby, right? Well, thanks for listening to my subjective rant.


----------



## x-vision (Jun 20, 2014)

jrista said:


> You are wildly misinterpreting something you do not understand, and purpetrating a falsehood.



Relax, we are all just speculating here. Even if I'm wrong, so what ??
So, cool down.


----------



## jrista (Jun 20, 2014)

9VIII said:


> LetTheRightLensIn said:
> 
> 
> > x-vision said:
> ...



Sorry, but that (bolded) is such a ludicrous, laughable comment, I'm just flabbergasted. An AA filter DOES NOT cut resolution "in half". That is blowing things SO FAR out of proportion it may be one of the most ludicrous things I've read on these forums. OLPFs, optical low pass filters, are designed to affect high frequencies only, and only around the nyquist limit at that. You lose a TINY amount of resolution...but it doesn't matter, because the "resolution" your losing just contains nonsense anyway. OLPFs blur very high frequency data that nearly or exactly matches the spatial frequency of the sensor's pixels just enough such that they the information doesn't alias. That's it. Aliased information is a REAL loss of information. Technically speaking, OLPFs PRESERVE information...they save information that can be saved, and discard information that cannot be correctly interpreted by the sensor anyway. On top of that, a very light application of unsharp masking can effectively reverse the blurring, and improve the resolution of that high frequency data, without actually bringing back all the nonsense. 

Quadrupling resolution and removing the AA filter is only an option if your lenses cannot resolve that much detail. With the resolving power of Canon's current lens lineup at faster apertures, I'm not so sure that cutting pixels into quarters is actually enough to avoid any kind of aliasing. At narrower apertures, like f/8, diffraction already blurs information enough that it can't alias, but that's a really narrow aperture for a lot of work, not everyone uses it. There are very few applications where removal of an AA filter will not cause aliasing of some kind, and pretty much anything artificial is going to have repeating patterns that, depending on distance to camera, can create interference patterns (moire). 

This whole "Remove the AA filter" craze is just that...a craze. It's a "thing" Nikon started doing to be different, to get some "wows", and maybe bring in some more customers. Ironically, given that removal of an AA filter is really NOT a good thing...it's worked. Nikon's marketing tactics have sucked in a whole lot of gullibles who don't really know what an AA filter does or how it works, or how to work WITH it, and now we have a whole army of "photographers" who want AA filters removed from all cameras. Personally, I REALLY, TRULY, HONESTLY DO NOT want Canon to remove the AA filter. It is NECESSARY, it PRESERVES preservable data and eliminates useless data, and I LIKE THAT. 

And anything that is lost? It's MINIMAL. In the grand scheme of how much resolution you have...you maybe lose a percent or two of really high frequency information...but you really don't have that information anyway because it is similar in frequency to noise...so again, moot.


----------



## x-vision (Jun 20, 2014)

rs said:


> x-vision said:
> 
> 
> > So, you never know. The 7DII could have the first non-Bayer sensor in a DSLR.
> ...



Heh. You are right, of course.


----------



## Lee Jay (Jun 20, 2014)

jrista said:


> Further, for everyone else who continues to perpetrate the myth that somehow the two halves of the pixels, which are under not only one microlens, but also under one color filter block, could somehow magically be used to expand dynamic range "for free" are fooling themselves, and anyone who listens to them. Magic lantern either uses two FULL sensor reads (vs. half sensor reads), or they do line interpolation for half the resolution, to achieve their dynamic range. There is no free increase to dynamic range, and DPAF isn't going to somehow allow more dynamic range for free. The problem with the idea of using one half of the AF photodiodes for an ISO 100 read, and the other half for an ISO 800 read, is that is HALF the light! That is not the same as what ML does, which involves the FULL quantity of light, or else half the light AND half the resolution.



Huh? Please explain how reading both halves at the same gain gets you all the light but reading them at different gains gets you only half the light? How different do the gains have to be to cut the light in half? Is 1% enough?

What you said makes no sense to me.


----------



## Don Haines (Jun 20, 2014)

Lee Jay said:


> jrista said:
> 
> 
> > Further, for everyone else who continues to perpetrate the myth that somehow the two halves of the pixels, which are under not only one microlens, but also under one color filter block, could somehow magically be used to expand dynamic range "for free" are fooling themselves, and anyone who listens to them. Magic lantern either uses two FULL sensor reads (vs. half sensor reads), or they do line interpolation for half the resolution, to achieve their dynamic range. There is no free increase to dynamic range, and DPAF isn't going to somehow allow more dynamic range for free. The problem with the idea of using one half of the AF photodiodes for an ISO 100 read, and the other half for an ISO 800 read, is that is HALF the light! That is not the same as what ML does, which involves the FULL quantity of light, or else half the light AND half the resolution.
> ...


Jrista's right.

Think of it as taking two pictures at the same time. One picture is taken with one side of the pair at high gain, and the other picture is taken with the other side of the pair at low gain. Then the two pictures are combined for greater dynamic range. For the area where the ranges overlap, you are using all the light, but where it does not overlap you only have half of the light...... if you shoot with both halves at the same gain, there is 100% overlap and you use all of the light.


----------



## m8547 (Jun 20, 2014)

dilbert said:


> Does anyone have an example of what Canon call revolutionary?



T5i


----------



## jrista (Jun 20, 2014)

Lee Jay said:


> jrista said:
> 
> 
> > Further, for everyone else who continues to perpetrate the myth that somehow the two halves of the pixels, which are under not only one microlens, but also under one color filter block, could somehow magically be used to expand dynamic range "for free" are fooling themselves, and anyone who listens to them. Magic lantern either uses two FULL sensor reads (vs. half sensor reads), or they do line interpolation for half the resolution, to achieve their dynamic range. There is no free increase to dynamic range, and DPAF isn't going to somehow allow more dynamic range for free. The problem with the idea of using one half of the AF photodiodes for an ISO 100 read, and the other half for an ISO 800 read, is that is HALF the light! That is not the same as what ML does, which involves the FULL quantity of light, or else half the light AND half the resolution.
> ...



The photodiodes are SPLIT. Each half gets half the light coming through the lens. It doesn't matter what ISO you read them at...if you read "half"...it's half the light. So your reading half the light at ISO 100, and half the light at ISO 800...well, you really aren't gaining anything. The only way to increase dynamic range by any meaningful amount is to either gather MORE light IN TOTAL...or reduce read noise by a significant degree (i.e. drop it from ~35e- to 3e-). Assuming it ever even becomes possible to read the photodiodes for image purposes like that, you might gain an extremely marginal improvement...but overall, there really isn't any point. It isn't the same as what ML is doing. They are either reading alternate lines of the sensor at two different ISOs, then combining them at HALF THE RESOLUTION, or they are doing two full reads of the sensor. Either way, for the given output size, they double the quantity of light. Reading two HALVES of a SPLIT photodiode gets you...ONE full quantity of light.


----------



## tayassu (Jun 20, 2014)

Sabaki said:


> Question 1: Regardless of brand...which camera is the current King of Crop?
> 
> Question 2: What does Canon need to deliver with the 7Dii in order to put it in a league above every other crop body out there?



1: I think it's pretty much a tie. The Nikon D7100 and Pentax K-3 have the better IQ, but the 70D is the better allround package. It depends on the lens selection, so Canon wins here.

2: Sort of a mini 1DX with perfect AF, layout, WiFi etc. plus a new sensor with better IQ that rivals the Nikon D7200 (no, this is not a typo  ) the problem is that this will cost...


----------



## 9VIII (Jun 20, 2014)

jrista said:


> 9VIII said:
> 
> 
> > Right.
> ...



Given that the filter makes it physically impossible to have a repeating pattern of stripes the same frequency as the pixel grid, so that you cannot have a perfect transition of black pixels to white, I'd say that is cutting resolution in half. That is, compared to some magical thing that accurately reads the full RGB spectrum on each pixel.

You are right about the necessity of the AA filter though.
I was thinking that if the interpolation algorithm only sampled each pixel within a specific cluster of four pixels and not every pixel around it that it would solve the moire problem. Really that would just give you different colour banding instead.
Now, if we added a second layer of microlenses on top of the first to direct light only at individual groups of pixels, that would guarantee the full RGB read on each cluster, and allow hard transitions...

On second thought I guess that sounds a little excessive just to gain the ability to have large pixels with a hard transition instead of twice as many pixels with a row of grey pixels that's half as big. You can bin the smaller pixels with a normal AA filter just the same, we just need a way of doing that without destroying the flexibility of RAW (otherwise I assume people would have been using compressed formats a long time ago).


----------



## Marsu42 (Jun 20, 2014)

jrista said:


> It isn't the same as what ML is doing. They are either reading alternate lines of the sensor at two different ISOs, then combining them at HALF THE RESOLUTION, or they are doing two full reads of the sensor.



ML dual_iso is interlacing the frame with another iso at each other line.

What's really amazing is that the postprocessing program ("cr2hdr") does such a great job at reconstruction because you'd think you loose a lot of resolution. But since in the real world few people are shooting test charts or scenes that contain such fine detail to cover only one horizontal scanline, ML knows what "probably would have been there" if it wouldn't have been clipped.

It's only in completely black or white areas that are only constructed from half of the scanlines you see the loss of resolution, but after shooting ~3000 dual_iso shots I can say you have to really look closely and the benefits outweigh the drawbacks by far.

Looking back, a huge amount of my vanilla daytime shots on 60d w/o dual_iso or brackting have clipped sky somewhere, now with (higher dr) + dual iso you see it's actually still often blue even if you think it's white. It would be great if Canon would do this in-camera and output 16bit raw files like ML does.


----------



## Stuart (Jun 20, 2014)

Were they not limited to a wafer resolution, have they now hot a finer resolution fab. Can they offer more MP's with this.
Or better still - curve the sensors?


----------



## jrista (Jun 20, 2014)

9VIII said:


> jrista said:
> 
> 
> > 9VIII said:
> ...



I think your conflating the CFA with the AA filter. The CFA, color filter array, is what produces the RGBG pixel pattern. That is ENTIRELY different than the AA filter, which does optical blurring only at high spatial frequencies near the spatial frequency of the sensor pixels. 

The CFA doesn't cut resolution in half either. It has a minor impact on luminance resolution, it's mostly color resolution that is affected by the CFA. But since we pick up detail primarily due to luminance, a bayer sensor doesn't lose anywhere remotely close to as much resolution as Sigma would have you believe with their Foveon marketing, for example. The luminance resolution, the detail resolution, of a bayer still trounces anything else. It's your color fidelity and color resolution that suffers. Were not as sensitive to color spatial resolution as we are to luminance though, expecially when the luminance is combined. (It's actually a pretty standard practice in astrophotography to generate an artificial luminance channel, blur the RGB channel a bit (which practically eliminates noise and actually improves color fidelity a bit by reducing color noise), process the luminance channel for detail, then combine the L with the blurred RGB. The end result is a highly detailed image that has great color fidelity.)

As for the double layer of microlenses...sure, you could read a full RGBG 2x2 pixel "quad" and have "full color resolution". Problem is, that LITERALLY halves your luminance spatial resolution...so you actually don't gain squat from a resolution standpoint by doing that. Doing that, you would lose significantly more resolution than either the CFA or the AA filter cost you...both of which are trivial in comparison do doing what your asking for. BTW, what your describing is called super-pixel debayering. That, too, is a common option in astrophotography image stacking...instead of basic or AHD debayering, you usually have the option to either super-pixel debayer, or "drizzle" (which, if you have enough subs...such as a couple hundred...is a means of achieving superresolution, and can increase your output image resolution by two to three fold.) You don't even need another microlens layer to do super-pixel debayering...you could use a tool like Iris or maybe even DarkTable/RawThearapy, to do it on any image you want. 

Finally, even if you do super-pixel debayering, your not going to ever have "hard edges". Statistically speaking, the chances if a white/black line pattern you wish to photograph perfectly lining up with your pixels, regardless of how large or small they are, is so excessively remote that it is statistically impossible. Not in any real-world situation. You might be able to build some kind of contraption and AI software to eventually achieve it, but that is well beyond the realm of practicality. If you remove the AA filter, use super-pixel debayering, you might have larger pixels with full color fidelity...but your going to have a massive amount of aliasing. Those white and black lines would have some nasty stair-stepped edges, they would just look atrocious.


----------



## Radjan (Jun 20, 2014)

What do you all think are the chances, that the 7D II will surpass the 5D mark iii/1DX on certain things? Such as IQ, ISO, DR or other things?

I know it's a crop sensor, so it is obviously limited, compared to FF.

And I must say - it's brilliant reading all you folks replies, you all have some serious knowledge about the technology us mere mortals are toying with ;D


----------



## Lee Jay (Jun 20, 2014)

jrista said:


> Lee Jay said:
> 
> 
> > jrista said:
> ...



You're making no sense.

In all these cases, you're getting the same amount of light because you're starting with the same exposure. The point of reading the halves at different ISOs is to reduce read noise without causing clipping. One side ensures you get the highlights on bright pixels tthe other gives you dramatically reduced read noise on dark pixels. When you combine them, you've effectively increased real bit depth/DR.

Of course, if the halves could both be read out at dramatically lower noise that would be even better, but this is one way to achieve that without having an improved read noise behavior. Canon sensors have dramatically reduced read noise at high ISO, the only issue being saturation.


----------



## neuroanatomist (Jun 20, 2014)

Radjan said:


> What do you all think are the chances, that the 7D II will surpass the 5D mark iii/1DX on certain things? Such as IQ, ISO, DR or other things?



The 7DII will surpass the 5DIII/1D X in viewfinder magnification, and the 5DIII in frame rate...that's pretty much it.


----------



## neuroanatomist (Jun 20, 2014)

Lee Jay said:


> What you said makes no sense *to me*.





Lee Jay said:


> You're making no sense.



Your first statement is acceptable, your second statement is wrong. The explanation is true and correct, in spite of your apparent inability to comprehend it.


----------



## bdunbar79 (Jun 20, 2014)

Lee Jay said:


> jrista said:
> 
> 
> > Lee Jay said:
> ...



Nope, nope, and nope. I usually don't get involved in these discussions because I don't need to due to jrista and neuro, but this, nope.


----------



## tayassu (Jun 20, 2014)

neuroanatomist said:


> Radjan said:
> 
> 
> > What do you all think are the chances, that the 7D II will surpass the 5D mark iii/1DX on certain things? Such as IQ, ISO, DR or other things?
> ...



There also might be several small items with the 7DII like wifi, GPS or a built-in-flash. You can argue about whether you need them, but the 7DII will surpass the other two models on that.


----------



## neuroanatomist (Jun 20, 2014)

tayassu said:


> neuroanatomist said:
> 
> 
> > Radjan said:
> ...



It will be cheaper, too. That's probably the biggest benefit!


----------



## Lee Jay (Jun 20, 2014)

Don Haines said:


> Lee Jay said:
> 
> 
> > jrista said:
> ...



The only way you're going to lose any light is when the high-ISO sampled half is saturated. But that's only when you have too much light in that pixel half, and the other side will still have a lot of light. In other words, this isn't the case where it's a problem.


----------



## Marauder (Jun 20, 2014)

neuroanatomist said:


> tayassu said:
> 
> 
> > neuroanatomist said:
> ...


I expect it will also have a touch-screen.


----------



## GMCPhotographics (Jun 20, 2014)

Marauder said:


> neuroanatomist said:
> 
> 
> > tayassu said:
> ...



Wowzers....killer new feature..... :-\


----------



## neuroanatomist (Jun 20, 2014)

GMCPhotographics said:


> Marauder said:
> 
> 
> > neuroanatomist said:
> ...



Hey, don't knock it. After all, changing settings with your nose is better than butt-dialing.


----------



## wickidwombat (Jun 20, 2014)

neuroanatomist said:


> GMCPhotographics said:
> 
> 
> > Marauder said:
> ...


Hey with a nose the size of mine this is a very real concern :-[


----------



## Marauder (Jun 20, 2014)

wickidwombat said:


> neuroanatomist said:
> 
> 
> > GMCPhotographics said:
> ...


Forehead works best! 8)


----------



## NancyP (Jun 20, 2014)

FWIW, if the photodiode filter arrangement is not the classic Bayer type, but uses some other arrangement (Fuji being the main example), the major software companies, in addition to the DPP in-house software team, will need a bit of time to implement the new filter arrangement into their RAW converters. On the off chance that someone is hoping for a Foveon-ish sensor, the developers will need more than "a bit" of time because the algorithms are a lot different. There's only one non-Sigma RAW developer out there that can use x3f files, Iridient Developer.


----------



## Famateur (Jun 20, 2014)

This has been another informative thread -- I'm learning a lot. Thanks guys and gals, especially jrista and others on the in-depth tech talk. Very cool stuff.

My guess is that if there's a big improvement in image quality from the sensor, it's probably produced on a new fab. That could also explain the delay in getting the 7DII out the door and perhaps the absence of patents that would point to something totally new.

On the patents things, there is another way they could bring something "totally new" (at least for Canon). They could license or buy a patent from another company. Any thoughts on this? Does Canon take too much pride in their own development to do this? Do they not need to because of the technology they're already working on? Is it still just market dominance that allows them to not need to make a huge leap? (Something tells me that while market dominance might mean they don't need to _release _a new leap in tech, it doesn't mean they aren't furiously working on developing new tech all the time.)

Anyway, I'm still guessing new fab, but what do you think? Any chance some other company's patent technology shows up in a new killer sensor for the 7DII?


----------



## x-vision (Jun 20, 2014)

Radjan said:


> What do you all think are the chances, that the 7D II will surpass the 5D mark iii/1DX on certain things? Such as IQ, ISO, DR or other things?



IMO, the 7DII will at best match the ISO/noise performance of the 5DIII.
And if Canon has finally decided to implement on-chip analog-to-digital conversion (ADC), 
DR at low ISO could be better than on the FF cameras. 
That's about it, though, in terms of IQ. 

Also, just like with the 7D, we might see some features on the 7DII that will later make it 
into the higher end cameras. 

Overall, it's hard to imagine that the 7DII will offer much more from what you can get
already with the 5DIII today (except for higher frame rate and more pixels per duck).


----------



## East Wind Photography (Jun 20, 2014)

Famateur said:


> This has been another informative thread -- I'm learning a lot. Thanks guys and gals, especially jrista and others on the in-depth tech talk. Very cool stuff.
> 
> My guess is that if there's a big improvement in image quality from the sensor, it's probably produced on a new fab. That could also explain the delay in getting the 7DII out the door and perhaps the absence of patents that would point to something totally new.
> 
> ...



Not sure how informative all of the opinions actually are. I would be more interested in the opinion of the guy testing it at the world cup. Short of that, all I want to know is when I can place my pre-order.


----------



## Famateur (Jun 20, 2014)

East Wind Photography said:


> Not sure how informative all of the opinions actually are. I would be more interested in the opinion of the guy testing it at the world cup. Short of that, all I want to know is when I can place my pre-order.



Too true regarding predicting what the 7DII will actually have in it.

I just like learning all the info about AA Filters, CFAs, ADCs, read noise, et cetera. Even with conflicting arguments back and forth, the preponderance of info gives me a clearer picture of how works a Bayer-type sensor, even if some of the uber-technical details are up for debate.


----------



## LetTheRightLensIn (Jun 20, 2014)

docsmith said:


> Lee Jay said:
> 
> 
> > l_d_allan said:
> ...



No way they hold 5D4 back until 2016!


----------



## LetTheRightLensIn (Jun 20, 2014)

9VIII said:


> jrista said:
> 
> 
> > 9VIII said:
> ...



it's not cutting it in half


----------



## 9VIII (Jun 20, 2014)

LetTheRightLensIn said:


> 9VIII said:
> 
> 
> > jrista said:
> ...



Yes and no?

In the theoretical perfect transition, with no AA filter you get a "black-white" transition, and then you have a "black-grey-grey-white" transition with the AA filter.
Ideally with the AA filter the line would land directly in the middle of the pixels, which would actually produce exactly the same result as having no AA filter, a "white-grey-black" transition. But that is equally improbable as the perfect pixel transition, you're practically going to end up with a transition going from white, to two pixels of varying shades of grey, to black. Whereas without your worst case scenario is one line of grey pixels.

So the best case scenario with an AA filter is somewhere between a 50%-100% increase in blur. Where without you go somewhere from a 0% to 50% increase in blur. On a theoretical perfect line.
Yes, it's nitpicking, but that's what makes the forums so much fun.


----------



## bdunbar79 (Jun 20, 2014)

x-vision said:


> Radjan said:
> 
> 
> > What do you all think are the chances, that the 7D II will surpass the 5D mark iii/1DX on certain things? Such as IQ, ISO, DR or other things?
> ...



Not if it's not FF. No way.


----------



## LetTheRightLensIn (Jun 20, 2014)

NancyP said:


> FWIW, if the photodiode filter arrangement is not the classic Bayer type, but uses some other arrangement (Fuji being the main example), the major software companies, in addition to the DPP in-house software team, will need a bit of time to implement the new filter arrangement into their RAW converters. On the off chance that someone is hoping for a Foveon-ish sensor, the developers will need more than "a bit" of time because the algorithms are a lot different. There's only one non-Sigma RAW developer out there that can use x3f files, Iridient Developer.



the algorithms are much different, but also much, much simpler


----------



## 9VIII (Jun 20, 2014)

jrista said:


> As for the double layer of microlenses...sure, you could read a full RGBG 2x2 pixel "quad" and have "full color resolution". Problem is, that LITERALLY halves your luminance spatial resolution...



Thus you start with an 80MP sensor to get a nice 20MP image.



jrista said:


> BTW, what your describing is called super-pixel debayering. That, too, is a common option in astrophotography image stacking...instead of basic or AHD debayering, you usually have the option to either super-pixel debayer, or "drizzle" (which, if you have enough subs...such as a couple hundred...is a means of achieving superresolution, and can increase your output image resolution by two to three fold.) You don't even need another microlens layer to do super-pixel debayering...you could use a tool like Iris or maybe even DarkTable/RawThearapy, to do it on any image you want.
> 
> Finally, even if you do super-pixel debayering, your not going to ever have "hard edges". Statistically speaking, the chances if a white/black line pattern you wish to photograph perfectly lining up with your pixels, regardless of how large or small they are, is so excessively remote that it is statistically impossible. Not in any real-world situation. You might be able to build some kind of contraption and AI software to eventually achieve it, but that is well beyond the realm of practicality. If you remove the AA filter, use super-pixel debayering, you might have larger pixels with full color fidelity...but your going to have a massive amount of aliasing. Those white and black lines would have some nasty stair-stepped edges, they would just look atrocious.



Wow, it looks like superpixel debayering (http://pixinsight.com/doc/tools/Debayer/Debayer.html) is exactly what I'm after. Make a 128MP sensor and use superpixel debayering and you'll have a nice compact, super accurate 32MP image.
Again, really, I'm fine with just shooting on a 128MP sensor and dealing with 100MB+ RAW files, the trick is to get a similar result in a format that's going to be acceptable to the majority of photographers who refuse to deal with large file sizes.

As long as your final image is around 32MP I don't think people are going to notice the stair stepping, unless you're standing right next to something like a 40" high quality print.


----------



## Don Haines (Jun 20, 2014)

neuroanatomist said:


> GMCPhotographics said:
> 
> 
> > Marauder said:
> ...


GREAT! Yet another image I will never get out of my head..... Neuro dialing a phone with his butt.......


----------



## ScottyP (Jun 20, 2014)

I'll toss it out there....... 


APS-H?


----------



## Dylan777 (Jun 21, 2014)

x-vision said:


> Radjan said:
> 
> 
> > What do you all think are the chances, that the 7D II will surpass the 5D mark iii/1DX on certain things? Such as IQ, ISO, DR or other things?
> ...



Wishful thinking and I hope you right. The longest lens I have is 400mm f2.8 IS II, I don't mind adding 7D II(x1.6) if that the case


----------



## jrista (Jun 21, 2014)

9VIII said:


> jrista said:
> 
> 
> > As for the double layer of microlenses...sure, you could read a full RGBG 2x2 pixel "quad" and have "full color resolution". Problem is, that LITERALLY halves your luminance spatial resolution...
> ...



No, that is fundamentally incorrect. You start with a 20mp sensor, which has 40mp PHOTODIODES. The two are not the same. Pixels have photodiodes, but photodiodes are not pixels. Pixels are far more complex than photodiodes. DPAF simply splits the single photodiode for each pixel, and adds activate wiring for both. That's it. It is not the same as increasing the megapixel count of the sensor.

And, once again...I have to point out. There is no such thing as QPAF. The notion that Canon has QPAF is the result of someone seeing something they did not understand. Canon does not have QPAF. Their additional post-DPAF patents do not indicate they have QPAF technology yet...however there have been improvements to DPAF. 



9VIII said:


> jrista said:
> 
> 
> > BTW, what your describing is called super-pixel debayering. That, too, is a common option in astrophotography image stacking...instead of basic or AHD debayering, you usually have the option to either super-pixel debayer, or "drizzle" (which, if you have enough subs...such as a couple hundred...is a means of achieving superresolution, and can increase your output image resolution by two to three fold.) You don't even need another microlens layer to do super-pixel debayering...you could use a tool like Iris or maybe even DarkTable/RawThearapy, to do it on any image you want.
> ...



Well, someday we may have 128mp sensors...but that is REALLY a LONG way off. DPAF technology, or any derivation thereof, isn't going to make that happen any sooner.


----------



## jrista (Jun 21, 2014)

9VIII said:


> Yes and no?
> 
> In the theoretical perfect transition, with no AA filter you get a "black-white" transition, and then you have a "black-grey-grey-white" transition with the AA filter.
> Ideally with the AA filter the line would land directly in the middle of the pixels, which would actually produce exactly the same result as having no AA filter, a "white-grey-black" transition. But that is equally improbable as the perfect pixel transition, you're practically going to end up with a transition going from white, to two pixels of varying shades of grey, to black. Whereas without your worst case scenario is one line of grey pixels.
> ...



Your talking about the scientifically ideal situation. Those only exist in text books. They don't exist in reality, not with the countless other factors that go into resolving an image accounted for. That would be like saying that you could create the ideal frictionless surface often referred to in physics text books. You can greatly reduce the coefficient of friction, but you cannot eliminate it. You cannot actually achieve the perfectly ideal.

So there is the theory of resolving line pairs, and then there is the 100% perfectly ideal exemplar. Yes, in the ideal exemplar case, theoretically you could line up black and white lines perfectly on top of rows of pixels, and they would end up perfectly sharp. Perfect is unattainable.

When you account for other factors, such as the statistical improbability that you would EVER be able to line up alternating white and black lines perfectly on the sensor, the difference isn't blur...it's aliasing. You either end up with aliased results, which means you have "nonsense" information...or VERY SLIGHTLY blurred results for high frequency oscillations. It really isn't even blurring, it's frequency stretching or spreading, which effectively stretches high frequencies and makes them a slightly lower frequency, which actually represents the real information much more accurately than the nonsense. That is not a reduction in resolution, it's the elimination of useless data. Anti-aliasing is not designed to destroy information...it is actually designed to PRESERVE information, by throwing away what you cannot resolve accurately anyway.

The removal of an AA filter does not mean your producing more accurate images. Your producing less accurate images that have higher acutance. That's it. Thing about acutance is...it's easy to replicate, to the small degree necessary at high frequencies...with software. A simple unsharp mask will improve the acutance of an image taken with a camera that has an AA filter. The only difference between the two images at that point is that the anti-aliased image is accurate AND sharp, where as the aliased image is sharp but not accurate. There is really no benefit to removal of the AA filter unless your imaging highly random information. There are very few things like that. Landscapes come to mind as one of the primary, and very few, situations where removal of an AA filter could _*potentially *_be useful. I wouldn't even say macro photography would be better without an AA filter...when you magnify small subjects so significantly, there tends to be a LOT of high frequency data, and you would be surprised how often there are repeating patterns at the microscopic scale. Even without repeating patterns, certain natural features, such as the cells of an insect eye, end up looking more jagged and harsh than they do when you use a camera with an AA filter.

Sharpness isn't the supreme indicator of IQ. Too much sharpness is often the hallmark of significant overprocessing...a slight amount of softening of very high spatial frequencies is usually the hallmark of a skilled processor. Some of THE BEST landscape photography I admire the most has a very soft aesthetic, with a specific amount of slightly lower contrast in the high frequencies. These kinds of landscapes are the ones that really stand out from the throngs of landscape photos as being exceptional.


----------



## x-vision (Jun 21, 2014)

jrista said:


> No, that is fundamentally incorrect. You start with a 20mp sensor, which has 40mp PHOTODIODES.



Jrista, you are just assuming that Canon's _dual-pixel_ tech is in fact a dual-photodiode tech.
My assumption is that it's already a quad-photodiode tech - and it's equally valid, as neither 
one us has info on the actual implementation.

In general, before making any claims for photodiodes and pixels, consider the following: 
A 'classic' pixel design has a photodiode plus three transistors (you can read about it on Wikipedia): 

a reset transistor for resetting the photodiode voltage
a source-follower transistor for signal amplification
a row select transistor

So, one definition of a pixel is a photodiode with three transistors.

The thing is, to improve fill factor and for other design considerations, modern sensors are using transistor sharing. 
That is, a single set of the 'classic' transistors is shared between multiple phododiodes.

Transistor sharing is widely used in small sensors. 
In the case of these sensors, though, each photodiode has its own microlens.
Thus, the photodiode _*is*_ the pixel in these designs. 

In short, depending on the implementation, a photodiode and a pixel could mean the same thing.

Canon's 'dual-pixel' tech is assumed to be based on a shared-transistor design. 
That is, it is a multi-photodiode design. 
But since in a shared-transistor design photodiodes are effectively equivalent to pixels (as explained), 
Canon's tech could be called multi-pixel design as well. 

So, you can stop correcting people who use dual/quad-pixel terminology, as these could in fact be used interchangeably. 
The line between between a pixel and a photodiode is blurred in shared-pixel designs.
And the fact that the two photodiodes are read independently for auto-focus further 
indicates that these could very well be independent pixels - if they didn't share the 
same microlens and color filter.

Also, your claim that there are exactly TWO PHOTODIODES (and that's it!) is not based on fact.
We don't know for sure if Canon's design is a dual-pixel design (your assumption) or a quad-pixel design 
(my assumption). 

Canon's marking is selling it as a 'dual-pixel' tech likely because it's easier this way to communicate 
the concept to the general public.
But we don't know for a fact what the actual implementation is.

So, your TWO PHOTODIODES claim is based on marketing materials, really. 
If I were you, I wouldn't put too much weight into these 8).

My assumption for a quad-pixel design is based on simple geometry.
If there are just two photodiodes per pixel, these photodiodes need to be rectangular.
This would be uncommon - if not even a first in the industry.
But with a quad-pixel design, the photodiodes are square just like in any other sensor.

Considering the potential future advantages of a quad-pixel design (e.g. for a non-Bayer sensor), 
I'd speculate that Canon would have invested in a quad-pixel design from the start - rather than
designing rectangular photodiodes that later would need to be made square anyway.

Just a speculation, of course - but based on some informed assumptions.


----------



## Orangutan (Jun 21, 2014)

x-vision said:


> Jrista, you are just assuming that Canon's _dual-pixel_ tech is in fact a dual-photodiode tech.



See: http://www.usa.canon.com/cusa/consumer/products/cameras/standard_display/daf_technology



> Each pixel on the EOS 70D camera's sensor consists of two independent photodiodes that function both as imaging points and as individual phase-difference AF sensors.


It may be an _assumption_, but it's based on published material. What's the support for your assumption?


----------



## x-vision (Jun 21, 2014)

Orangutan said:


> ... but it's based on published material.



Right. But that's still marketing materials.


----------



## Orangutan (Jun 21, 2014)

x-vision said:


> Orangutan said:
> 
> 
> > ... but it's based on published material.
> ...



They literally say two "photodiodes," not "photosites," or merely "pixels," but "photodiodes." That's very specific and technical (the typical consumer has little to no idea what a photodiode is). Why would they say two when it's really four? From the marketing perspective, four is better than two.

Magic 8-Ball says "All signs point to 2 photodiodes." You still _might_ be right, but the bulk of evidence is against you.


----------



## x-vision (Jun 21, 2014)

Orangutan said:


> Magic 8-Ball says "All signs point to 2 photodiodes."


Fair enough. 



> You still _might_ be right, but the bulk of evidence is against you.



Heh. All the evidence is coming from one source - Canon. 
If it corroborated by at least one other source, I maybe wouldn't have argued.
But for now we either accept what Canon is saying ... or not 8).


----------



## jrista (Jun 21, 2014)

x-vision said:


> jrista said:
> 
> 
> > No, that is fundamentally incorrect. You start with a 20mp sensor, which has 40mp PHOTODIODES.
> ...



Sorry, but I am NOT assuming. I've actually read Canon's own patents. Those patents describe a system where the photodiodes for each pixel have been divided in half. This stuff isn't a mystery. Patentes are ESSENTIAL for the protection of intellectual property. Canon has been filing patents for DPAF for quite some time, a couple of years at least now, with the most recent ones being near the end of last year.

My assertions are based on concrete fact as described by the DPAF engineers at Canon themselves. Your assumptions are just that, assumptions based on an extremely TINY image posted on the ChipWorks page of the BACKSIDE of some sensor, an image which you have gravely misinterpreted, and an image we all can only assume is even of a Canon sensor, let alone one with DPAF technology (although it certainly is not of a Canon sensor with QPAF technology...since such a sensor doesn't exist yet.)




x-vision said:


> In general, before making any claims for photodiodes and pixels, consider the following:
> A 'classic' pixel design has a photodiode plus three transistors (you can read about it on Wikipedia):
> 
> a reset transistor for resetting the photodiode voltage
> ...



Sure. An extremely basic kind of "pixel" that you might find in an entry level course on image sensor design. Modern sensors often have a lot more logic than that per pixel. That logic usually involves some level of noise reduction, potentially charge bucketing for global shutter sensors, anti-blooming gates and shift registers in CCDs, extra logic to allow the selection of which photodiode to read in shared-pixel designs (which most smaller-pixel sensor designs are these days), etc. 



x-vision said:


> The thing is, to improve fill factor and for other design considerations, modern sensors are using transistor sharing.
> That is, a single set of the 'classic' transistors is shared between multiple phododiodes.
> 
> Transistor sharing is widely used in small sensors.
> ...



The photodiode is the light-sensitive part *of* a pixel. A standard bayer pixel is comprised of a photodiode, at least one microlens layer (sometimes two), and a color filter, as well as the row/column activate wiring, amplifier, and readout transistors. 

In a DPAF pixel, the photodiode has been split in half, with insulating material between the two halves. Each halve has independent readout. The photodiode, despite being split, still exists below the color filter and microlenses. Therefor, there is still ONE pixel...with two photodiodes. Canon did not increase the pixel count...they increased the photodiode count.



x-vision said:


> In short, depending on the implementation, a photodiode and a pixel could mean the same thing.



Your interpretation is wrong. ;P Sorry. Go read the darn patents, and stop making assumptions. 



x-vision said:


> Canon's 'dual-pixel' tech is assumed to be based on a shared-transistor design.
> That is, it is a multi-photodiode design.
> But since in a shared-transistor design photodiodes are effectively equivalent to pixels (as explained),
> Canon's tech could be called multi-pixel design as well.



But photodiodes and pixels are not effectively equivalent. A pixel is more complex than a photodiode. A photodiode is simply a PART of a pixel. Your conflating the two for the sake of your argument, but that does not mean your conflation is valid. 



x-vision said:


> So, you can stop correcting people who use dual/quad-pixel terminology, as these could in fact be used interchangeably.
> The line between between a pixel and a photodiode is blurred in shared-pixel designs.
> And the fact that the two photodiodes are read independently for auto-focus further
> indicates that these could very well be independent pixels - if they didn't share the
> same microlens and color filter.



You misunderstand shared-pixel designs. Shared pixels do not share the photodiode. Each pixel still has it's own independent photodiode. What's shared in a shared-pixel design is the readout logic...transistors. Usually, the sharing is diagonal, although some prototypical designs share directly neighboring pixels. Green pixels usually share their *readout logic* diagonally. Those two green pixels, however, each still have their OWN photodiode. The purpose of a shared pixel design is not to share the light-sensitive charge collector...that would be useless, since it would share each pixel's charge in one bucket, meaning you couldn't actually read them out independently. 

The purpose of a shared pixel design is to save die space FOR the photodiode by reusing transistors and wiring for more than one pixel. The use of shared transistors to activate, amplify, and read the pixel has nothing to do with blurring the line between pixel and photodiode. The pixel is a vertical stack of layers of silicon materials. The photodiode is (usually) at the bottom of a physical well...it's the bit of silicon that is actually sensitive to light and converts some ratio of incident photons to free electrons (charge). Above that is a layer of translucent silicon material, usually silicon dioxide. Above that is often a microlens, and above that is a color filter array. There is sometimes buffer materials in between these layers, on top of which you finally have the primary microlens. THAT is a "pixel". The photodiode is just one part of the whole pixel. If you split the photodiode underneath all those other layers...you still have just one pixel. You have a pixel that is now capable of detecting phase, but it's still just one pixel, not two pixels. Regardless of what kind of readout logic it has...a pixel is a pixel, independent and atomic, and a photodiode is just a part of a pixel. 



x-vision said:


> Also, your claim that there are exactly TWO PHOTODIODES (and that's it!) is not based on fact.
> We don't know for sure if Canon's design is a dual-pixel design (your assumption) or a quad-pixel design
> (my assumption).



Your assuming I am assuming. Your assumption is, once again, wrong. You are also assuming that "we" don't know anything "for sure" about Canon's sensor designs. Sorry, but again, your assumption there is WRONG. Canon has filed patents for all of their DPAF designs. Those patents are the basis for their technology...the technology that actually exists in the 70D, for example. I am not assuming. My assertions are based on actual fact as clearly and definitively defined by Canon engineers themselves. 

You can go look up these patents for yourself. They aren't hard to find. Many of them have been posted right here on CR in the past. This stuff isn't some mysterious, mystical, magical sensor technology that Canon is keeping obfuscated. Obfuscation and secrecy is the worst form of protection for technology. By filing and receiving patents, Canon LEGALLY protects their work from theft by other manufacturers...they have no reason to hide or obfuscate anything.



x-vision said:


> Canon's marking is selling it as a 'dual-pixel' tech likely because it's easier this way to communicate
> the concept to the general public.
> But we don't know for a fact what the actual implementation is.



We DO know what the actual implementation is. Not only that, we know EXACTLY what it is. See my prior comment.



x-vision said:


> So, your TWO PHOTODIODES claim is based on marketing materials, really.
> If I were you, I wouldn't put too much weight into these 8).



Again, wild assumption, and a wrong one. You assume WAY too much. You might want to verify your facts first, before putting yourself out like that. I have never based anything I've said about Canon sensor technology on marketing materials. I read patents, of which there are many thousands filed by Canon every year, and many thousands more filed by all the other entities involved in sensor research and design. I know EXACTLY what I am talking about, and it's based on actual sensor designs that have either been manufactured for commercial use, or have been prototyped and thouroughly demonstrated at one of the numerous ICS conferences around the world every year. 

The only person who puts weight into something they shouldn't is you...putting a lot of weight into the validity of your assumptions.



x-vision said:


> My assumption for a quad-pixel design is based on simple geometry.
> If there are just two photodiodes per pixel, these photodiodes need to be rectangular.
> This would be uncommon - if not even a first in the industry.
> But with a quad-pixel design, the photodiodes are square just like in any other sensor.
> ...



The photodiodes ARE rectangular! That's EXACTLY what they are! That's exactly how they are described in Canon's patents on the technology! : It's not a first in the industry...for decades, there have been sensors with non-square photodiodes, even non-square pixels. There have been hexagonal pixels (Fuji first released sensors with hexagonally shaped pixels with extra small "white" pixels filling in the diagonal spaces between them many years ago), triangular pixels (Sony has a prototype 50mp sensor with triangular pixels), even pixels with non-uniform pixel sizes and layouts (some sensor designs, usually from Fuji, have had large regtangular white pixels, along with a non-standard layout of smaller rectangular red, green, and blue pixels). I currently use a CCD camera for guiding my astrophotography that uses rectangular pixels, due to the use of an anti-bloom gate. Again...your making some wild assumptions that have absolutely no basis in fact. Your assumptions are FAR from informed, as well. I don't know where you think your "informing" yourself, but you really need to go right to the source...patents. You seem to think that all this technology is kept secret and obfuscated and hidden away within the bowls of "Canon the over-protective corporation". That is, once again, an assumption. Canon has decades of sensor technology filed legally as patents in countries around the world. Those patents are fully available, in complete detail, with abstracts, technical diagrams, and full-blown conceptual and functional dissection and breakdown, for review by anyone who wishes to spend the time looking them up. If patents weren't freely available, then they would be useless. Competitors have to be able to investigate what technology their competitors have already invented and patented, so they don't try inventing the same exact thing to patent themselves...that would be a patent violation. Potential licensees of patented technology need to know how the technology is implemented, so they may implement it themselves in their own products, with the added requirement of a royalty fee.

This technology is WELL KNOWN, because it has to be. "We" know EXACTLY how DPAF is designed...and it is not quad-pixel. It's, quite literally, dual-photodiode. There are now multiple patents that PROVE that FACT.


----------



## the blackfox (Jun 21, 2014)

having fallen foul of nikons removal of the AA filter as in the d7100 and after having two of those cameras literally go tits up on me after 3000 actuations each,and hence speeding up my return to canon gear ,i hope thats not what canon are doing .i also hope that whatever this breakthrough is its going to be backward compatible with current lenses and lens technology .as in most cases when a manufacturer brings out something thats a breakthrough it usually ends up the only thing it breaks is the bank .

lots of speculation on here i can't wait to see what actually comes forth .


----------



## x-vision (Jun 21, 2014)

jrista said:


> I know EXACTLY what I am talking about ...



Hmm. Doesn't look like it. Let's see. 

First you say this:



> The photodiode is the light-sensitive part *of* a pixel. A standard bayer pixel is comprised of a photodiode, at least one microlens layer (sometimes two), and a color filter, as well as the row/column activate wiring, amplifier, and readout transistors.



And then you say: 



> In a DPAF pixel, the photodiode has been split in half, with insulating material between the two halves. Each halve has independent readout.



So, a pixel has a photodiode and readout transistors (in addition to the other stuff).
Simialrly, a DPAF pixel is a (half) photodiode ... with an independent readout.

Do you even realize that by splitting the photodiode in two, and by providing independent readouts for each half, 
you have essentially created two pixels ? 

I don't think you do! 
That's why you can't grasp that a split photodiode is in fact a separate pixel - etched on the silicon wafer. 

The microlenses and color filters are secondary - put on top of the already etched wafer. 
You can put a microlense and a color filter on top of multiple pixels.
That's what Canon is doing - and what you call a pixel with a 'split photodiode'. 
What you don't grasp, obviously, is that a split photodiode with independent readouts ... is two separate pixels.



> The photodiode, despite being split, still exists below the color filter and microlenses. Therefor, there is still ONE pixel...with two photodiodes.



There you go. That's the part that is escaping you.
It's ONE pixel in the image, not on the silicon wafer. 

There is a reason I mentioned the 'classic' 3T pixel and the shared transistor designs.
And that is to illustrate the point that a 'pixel' can be implemented in different ways.
The important distinction is that the wafer is etched in a way so that you can read photodiode charges independently.

That's what a pixel is on the wafer level. 
You can certainly combine the output of multiple pixels into one.
Or put a single microlens on top of multiple pixels. 
But as long as you have a photodiode and independent readout circuitry, regardless of configuration,
you have a pixel - and that part is definitely escaping you.

And the reason is that you are not technical enough to grasp the underlying principle here.
So, no, you don't know what you are talking about.



> You misunderstand shared-pixel designs. Shared pixels do not share the photodiode. Each pixel still has it's own independent photodiode.



You mean just like a split photodiode with two independent readouts???




> The photodiodes ARE rectangular! That's EXACTLY what they are! That's exactly how they are described in Canon's patents on the technology! :



O-o-kay. Care to share a link to least one of these patents. That should settle it, right?
So, let's settle it by you providing a link to at least one of these patents. 
You have the link handy, don't you?

Look, I suggest that you drop the 'I'm the authority' attitude - because you are not an authority.
In fact, it's very clear that you don't even come from a technical background.
So, drop the attitude and let's have a friendly discussion.
That's the reason why we are all here, no?. What's with all that bullying ??


----------



## jrista (Jun 21, 2014)

x-vision said:


> jrista said:
> 
> 
> > I know EXACTLY what I am talking about ...
> ...



They aren't two pixels. It's two photodiodes in a SINGLE pixel. Your trying to reduce a pixel to just the photodiode. That's incorrect. Just because they have independent readout does not make them separate pixels. Both halves are still one color. There is no useful purpose to reading each half out independently for an image read. If you read a red DPAF pixel out as independent halves, you have two red rectangular results...but those results have no meaning independently. They are just two red values with half the light and twice the read noise than what you would get if you binned the two halves electronically at readout. 



x-vision said:


> I don't think you do!
> That's why you can't grasp that a split photodiode is in fact a separate pixel - etched on the silicon wafer.
> 
> The microlenses and color filters are secondary - put on top of the already etched wafer.
> ...



If we reduce everything to the most simple sensor, monochrome, with nothing but light-sensitive silicon patches and their companion readout transitors...then you would be correct. A photodiode would then be equivalent to a pixel. 

Were not talking about a simple monochrome sensor. We are talking about a bayer sensor. A sensor that has red, green, and blue PIXELS that are interpolated during demosaicing to produce full-color RGB output pixels. Each of these pixels in a bayer sensor is, at the VERY LEAST, comprised of a color filter layered on top of a photodiode. If you split the photodiode...you now have two photodiodes that read from below the same color filter. From an interpolation standpoint...you still have to combine those two halves...either electronically or digitally, to perform demosaicing.

If you completely ignore the fact that the RAW sensor readout in a bayer sensor needs to be demosaiced, then sure...you theoretically have the potential to produce two outputs per color filter. But what does that mean? How is that useful? Spatially, nothing has really changed. Whether you have two half reds, four half greens, and two half blues...SPATIALLY, they are still IDENTICAL to one red, two green, and one blue. SPATIALLY, you've gained nothing. It doesn't matter if you can read them out independently. 

Your trying to imply that somehow, this increases your resolution. It does not. Just because two (or more) photodiodes underneath a single color filter can be read out independently does not change the fact that they are all the same color, and they all define the same spatial frequency as a single photodiode under that same filter. The only way those independent photodiodes could actually become useful is if you actually built microlenses to focus a cone of light onto each one independently. THEN you might actually increase luminance resolution, and you might actually have an increase in spatial resolution.

But Canon's patents do not describe a pixel structure wherein multiple microlenses are used to focus a cone of light onto each part of the split photodiode. On the contrary, the patent's CAN'T describe such a pixel structure...as then you wouldn't actually be able to perform AF functions with it. The point of DPAF is to add PHASE DETECTION capabilities to a sensor...not increase it's resolution.



x-vision said:


> > The photodiode, despite being split, still exists below the color filter and microlenses. Therefor, there is still ONE pixel...with two photodiodes.
> 
> 
> 
> There you go. That's the part that is escaping you.



Read the above.



x-vision said:


> > You misunderstand shared-pixel designs. Shared pixels do not share the photodiode. Each pixel still has it's own independent photodiode.
> 
> 
> 
> You mean just like a split photodiode with two independent readouts???



That is the exact OPPOSITE of a shared pixel. Shared pixels SHARE readouts. Canon's DPAF use INDEPENDENT readouts. They HAVE to use independent readouts, because they are in the same row. You cannot share pixel transistors in the same row, since columns of pixels are read out row-by-row. DPAF is essentially the opposite of a shared pixel sensor design...instead of reducing logic space and sharing logic among larger photodiodes, DPAF increases logic space and isolates logic among smaller photodiodes. 



x-vision said:


> > The photodiodes ARE rectangular! That's EXACTLY what they are! That's exactly how they are described in Canon's patents on the technology! :
> 
> 
> 
> ...



I have the PDFs for those patents saved on my hard drive. I'm not going to do the legwork for you AGAIN to find the source for those downloads. I shared several links to DPAF patents the LAST time we had this debate. You clearly ignored them. If you want to educate yourself, educate yourself. You can start here (they have the patent number...go dig through the bowels of the internet on your own time to find it):

http://thenewcamera.com/canon-patent-more-sensetive-dual-pixel-cmos-af/

As for the rest, your making a LOT of assumptions, and piling assumption on top of assumption, then making bold claims about how you've discovered Canon has QPAF technology, all based on nothing but assumption and a misunderstanding of a tiny little image from a single page of ChipWorks site. If Canon had QPAF technology, they would NOT be keeping it a secret. That would be insane for them, what with the perception in the community at large being that Canon is behind on sensor tech. QPAF would be huge news for Canon. I've spent years on this forum debunking hair-brained theories like that because all it does is mislead people, give people false hopes, and otherwise confuse the issue about what any given technological advancement REALLY offers, what it REALLY allows, and how it is REALISTICALLY likely to evolve into the future. Your radically confusing the issue about DPAF. It's a very simple technology, designed for a SINGLE purpose, and it serves THAT purpose extremely well. Your trying to inflate it into something it isn't even remotely intended to be. Sorry, but I've never liked it when people make wild uninformed assumptions then boldly claim they know what their talking about. It's just something that irks me.


----------



## x-vision (Jun 21, 2014)

jrista said:


> That is the exact OPPOSITE of a shared pixel. Shared pixels SHARE readouts. Canon's DPAF use INDEPENDENT readouts.



Heh. You share readout circuitry between photodiodes ... to read their output independently. 
What a paradox. And yet, that's exactly what the industry has been doing for a decade now (or more?). 
Fascinating stuff. LOL.


----------



## jrista (Jun 21, 2014)

x-vision said:


> jrista said:
> 
> 
> > That is the exact OPPOSITE of a shared pixel. Shared pixels SHARE readouts. Canon's DPAF use INDEPENDENT readouts.
> ...



Your still misunderstanding. Pixels are activated row-by-row, and all columns are read out SIMULTANEOUSLY. Every column of an activated row of pixels has the charge stored in the photodiode read, amplified, and shipped down the column line AT ONCE. Because the photodiode is split per-pixel, that they occupy the same row. Therefor, you cannot share the readout transistors, because both are read out simultaneously. Therefor, DPAF does NOT use a shared pixel design. There is, literally, two independent sets of transistors to read out each half of the pixel when the row is activated...and twice as many columns. During an image read, additional binning transistors combine the charge of each photodiode half, that total charge is amplified, and only one half of the columns are used to move the charge down to the CDS units and column outputs.

Shared pixel designs usually share DIAGONALLY (I already said this, but apparently the reason did not sink in.) By sharing diagonally, you avoid the concurrent row problem. The first row is activated, the first set of pixels that are sharing readout logic are read. The next row is activated, and this second set of pixels uses the same set of transistors to read out as their DIAGONAL counterparts in the row above. I've also read about patents that share pixels vertically, which achieves the same result, but ends up resulting in mixed color output for every set of transistors...green/blue, red/green, etc. 

It isn't possible to share anything in the same row, though...because once a row is activated, everything in it has to be read out...and by nature, DPAF photodiodes for any given color filter share the same row.


----------



## x-vision (Jun 21, 2014)

jrista said:


> As for the rest, your making a LOT of assumptions, and piling assumption on top of assumption, then making bold claims about how you've discovered Canon has QPAF technology ...



Whenever I said anything on the topic, I've always started with 'by the look of things...', 'theoretically...', etc..
And I've been usually finishing my posts with 'it's all speculation', 'it's fun to speculate', or something like that. 
I wouldn't say that I've been making bold claims. Mostly speculation and fun 8).


----------



## x-vision (Jun 21, 2014)

jrista said:


> Therefor, you cannot share the readout transistors, because both are read out simultaneously. Therefor, DPAF does NOT use a shared pixel design. There is, literally, two independent sets of transistors to read out each half of the pixel when the row is activated...and twice as many columns.



In other words, the two halves are read as ... independent pixels. 
I feel that we are getting somewhere. 
Oh, wait! That's what I've been saying all along.

The shared-transistor part is a different story. 
I've never said explicitly which transistors are shared - and that's the key here.


----------



## Lightmaster (Jun 21, 2014)

i hope it´s not just dual pixel v2.... :


ps: some of you guys should really try to get a life and spend less time on forums..


----------



## neuroanatomist (Jun 21, 2014)

x-vision said:


> Also, your claim that there are exactly TWO PHOTODIODES (and that's it!) is not based on fact.
> We don't know for sure if Canon's design is a dual-pixel design (your assumption) or a quad-pixel design
> (my assumption).
> 
> ...



Your claim of a quad-pixel design is based on a flawed assumption you're making after totally misinterpreting a single piece of data (a picture of a part of the sensor that's not the active imaging area). The claim of a dual-pixel design is not based on assumptions, it's based on actual published patents issued to Canon for the technology. That's established fact vs. ill-informed speculation. 

If you want your claim to be in any way believable, you need actual evidence. Read the Canon patents on dual-pixel AF - show us where a quad pixel design is mentioned. Show us a verifiable image of the actual photodiodes of the 70D sensor (not the dead area at the very edge of the sensor in the Chipworks teaser). Until you can provide some evidence, your ASSumptions will just continue to make you look like a...foolish person.


----------



## wickidwombat (Jun 21, 2014)

Lightmaster said:


> i hope it´s not just dual pixel v2.... :
> 
> 
> ps: some of you guys should really try to get a life and spend less time on forums..



best post ever!


----------



## Orangutan (Jun 21, 2014)

x-vision said:


> I feel that we are getting somewhere.
> Oh, wait! That's what I've been saying all along.
> 
> The shared-transistor part is a different story.
> I've never said explicitly which transistors are shared - and that's the key here.



Doubling-down on a weak hand is not wise. I suggest you fold, and cut your losses.



> Oh, wait! That's what I've been saying all along.



Whatever you've been saying, you haven't said it very well. If you want to be taken seriously by some very well-educated folks on this forum, you might want to:


do your research before you make assumptions
be very clear about your definitions
cite sources
write clearly and concisely
not write antagonistically to people who may know more than you do


----------



## Orangutan (Jun 21, 2014)

I believe this is the original version: http://xkcd.com/386/





Lightmaster said:


> i hope it´s not just dual pixel v2.... :
> 
> 
> ps: some of you guys should really try to get a life and spend less time on forums..


----------



## scyrene (Jun 21, 2014)

Orangutan said:


> x-vision said:
> 
> 
> > I feel that we are getting somewhere.
> ...



Very well put 

I might add to that 'not retreat into "it's all fun speculation" when shown to be wrong'. Seriously, jrista and Neuro must have incredible patience.


----------



## tayassu (Jun 21, 2014)

Interesting... This cartoon also showed up in the last sensor technology discussion.  How did the wise man say once? "De-ja-vus aren't the errors in the matrix, they are the matrix." ;D


----------



## x-vision (Jun 21, 2014)

neuroanatomist said:


> Your claim of a quad-pixel design is based on a flawed assumption you're making after totally misinterpreting a single piece of data (a picture of a part of the sensor that's not the active imaging area).



My claim is not based solely on that. 

I now admit, though, that the partial picture of the sensor die from Chipworks is not sufficient/reliable evidence to support any claims. 
Therefore, I'm officially retracting it as evidence. I was wrong to quote it and I won't do it again. 



> The claim of a dual-pixel design is not based on assumptions, it's based on actual published patents issued to Canon for the technology. That's established fact vs. ill-informed speculation.



Of all those published patents, so far we've only seen a link to a schematic diagram of the 'dual-pixel' tech.
That's all. Everything else has been based on Canon's officially published information.

I'm not claiming that Canon's information is incorrect (hell, no). All I'm saying is that it's not guaranteed to be 100% revealing. 

I asked for a link to the all-important patent that shows rectangular photodiodes.
As I said in a previous post, this patent would settle the discussion - but Jrista hasn't provided it.

Btw, this is the CanonRumors site. 
By definition, most of the info published on the _*front page*_ of this site is ... ill-informed speculation.
If you have such high standards for fact vs fiction, how come you are a regular on this forum ??
If you were holding CanonRumors accountable for the info that they are publishing, they should have closed shop by now. 
???



> If you want your claim to be in any way believable, you need actual evidence. Read the Canon patents on dual-pixel AF - show us where a quad pixel design is mentioned. Show us a verifiable image of the actual photodiodes of the 70D sensor (not the dead area at the very edge of the sensor in the Chipworks teaser).



Fair enough. 

I've never presented my speculations as facts. 
But if one day I decide to do that, I'll make sure that I have very solid evidence.


----------



## jrista (Jun 21, 2014)

x-vision said:


> jrista said:
> 
> 
> > Therefor, you cannot share the readout transistors, because both are read out simultaneously. Therefor, DPAF does NOT use a shared pixel design. There is, literally, two independent sets of transistors to read out each half of the pixel when the row is activated...and twice as many columns.
> ...



You seem to think your getting somewhere...convincing me, or anyone, that you know what your talking about. I don't really have the heart to continue the conversation, because the clueless one here is not me...and this is just becoming disheartening. Go educate yourself. Please. Once you have actually seen the actual electrical diagram for at least one of Canon's DPAF patents, AND preferably read the accompanying description of how it works, maybe then we can have a coherent discussion.



x-vision said:


> > If you want your claim to be in any way believable, you need actual evidence. Read the Canon patents on dual-pixel AF - show us where a quad pixel design is mentioned. Show us a verifiable image of the actual photodiodes of the 70D sensor (not the dead area at the very edge of the sensor in the Chipworks teaser).
> 
> 
> 
> ...



The problem is that you keep "speculating" about the same thing, even though it's been proven wrong on multiple occasions. I DID link you several detailed pages that had the full patent information the last time we had this debate. It took a while to find a working link as well, but apparently you never read the link. I'm not going to go digging through the internet, spend all that time, to find something that your likely never going to read (most of the time, these patents are translated from Japanese, as they can be notoriously hard to find in the US Patent office's database...and therefor common search terms don't necessarily bring up what your looking for.) I'm not repeating that effort for someone who seems more interested in literally and intentionally ignoring the FACTS he's been presented with, and "speculating" about a falsehood that he himself initiated even though it's demonstrably false, invalid, incorrect, not real, not happening, and otherwise moot.

It's fine to speculate...when you ACTUALLY *DON'T* KNOW anything about whatever it is your speculating/rumormongering about. When it comes to DPAF...there is nothing to speculate about. Canon has files patents. That's the end of the story. WE KNOW. Your notion that "it's not guaranteed to be 100% revealing" is 100% absolutely wrong...patents MUST, BY DESIGN, INTENT AND FUNCTION, be 100% revealing. Otherwise they wouldn't be granted in the first place...specificity in a patent is key. Your ignorance in that only demonstrates you don't know much about patents, nor the technologies they can potentially describe. I'm no electrical engineer with a Ph.D, but that doesn't mean people who don't have degrees in electrical engineering are incapable of understanding an electrical circuit diagram or the terminology that describes them. I do my fair share of dabbling in electronics. (Right now I'm building a pelitier cooled DSLR cold box for my 5D III, which is involving a growing amount of electrical gadgetry and know-how to get it working the way I want.) Go educate yourself, read the actual patents, or in lie, read anything you can find about DPAF that isn't on Canon's site (so you can stop worrying that it's "just Canon marketing material"...although I'd offer that Canon is very up front about their technology, they have no _reason _to be misleading about DPAF)...and stop digging yourself into your hole. Your half-way to China right now.

Anyway, I think the myth of QPAF in the 70D or any other current Canon camera has been successfully debunked. The myth that DPAF or QPAF could be used to improve resolution in and of themselves, simply because they have independent readout, probably still needs to be revisited, however that's a debate for another day.


----------



## cnardo (Jun 21, 2014)

Looks like the folks over at Canonwatch.com have a different take on this whole 7D2, new sensor info going around as well as both announcement dates and shipping dates!  ???


----------



## x-vision (Jun 21, 2014)

jrista said:


> I don't really have the heart to continue the conversation, because the clueless one here is not me...


Here we go. Resorting to personal attacks and bullying again.

Admit it, you don't have a technical background. 
That's why you run in circles, contradicting yourself, when you try to explain a technical concept. 

You are the clueless one, not understanding the basic underlying principles.
I pointed out the flaws in your arguments about pixels vs photodiodes - but you are clueless to understand it.

Let's leave it at that.


----------



## x-vision (Jun 21, 2014)

jrista said:


> I'm no electrical engineer with a Ph.D ...



Yup, that was pretty obvious ... and I was right about that (so, I was definitely right at least about one thing 8) ).

Kudos for having the courage to admit it, though. Seriously.

Of course now all the people on your side of this argument should have second thoughts on
why they should be listening to you. 

And btw, I happen to be an electrical engineering by training. Not a Ph.D., though.


----------



## jrista (Jun 21, 2014)

x-vision said:


> jrista said:
> 
> 
> > I'm no electrical engineer with a Ph.D ...
> ...



LOL. I have not admitted anything that diminishes my position, here. It doesn't matter that I don't have a Ph.D. You missed the point entirely. You don't need a Ph.D. to understand this stuff, if your otherwise well educated enough to understand the technology involved.

Even if you are an electrical engineer, you are not educated as to how DPAF works. Your lacking a basic knowledge set that has allowed you to wildly speculate, without any sound basis for your speculation. Here is what Canon's ACTUAL patent, from the US Patent office, actually states:



> *United States
> Patent Application Publication
> Yoshimura et al.*
> 
> ...



Note the words used here. This is DIRECTLY from the patent:

"An image capturing apparatus *performs focus detection* based on a *pair* of image signals from an image sensor including _pixels_ _*each having a pair of photoelectric conversion units (photodiodes)*_ capable of outputting the *pair* of image signals obtained by _independently_ receiving a *pair* of light beams...

Performs focus detection. The intent of DPAF is to perform focus detection. There is no mention anywhere in this abstract that describes an increase in imaging resolution. 

Pair. Pair is used repeatedly in this abstract, indicating that there are only TWO photodiodes. 

The abstract quite explicitly describes a "pixel" as having a "pair" of _photoelectric conversion units_. In other words, a pair of photodiodes. 

Everything I've said before was not based on assumption, or my own personal ignorance because I'm not a Ph.D. wielding electrical engineer. Everything I said was based directly on Canon's own patent, invented by Yoshimura and Fukuda. Invented, actually, quite well before the 70D ever actually used the technology in an actual commercial product...the earliest date of foreign application (application for patent in Japan, I assume) was Dec. 13, 2011!!! The patent in the US office was filed Dec. 3, 2012. This technology is not particularly new, and it has been well described for years. We aren't lacking knowledge about it.

Here is some more of the patent. The summary of the invention (just the first part...this section of the patent is a couple pages long):



> SUMMARY OF THE INVENTION
> 
> [0007] The present invention has been made in consideration of the above situation, and improves defocus direction detection performance in focus detection of a pupil division phase difference method even When the pupil division performance is insufficient, and the influence of vignetting is large.
> 
> [0008] According to a first aspect of the present invention, there is provided an image capturing apparatus comprising: an image sensor including a plurality of two-dimensionally arranged pixels including pixels each having a pair of photo electric conversion units arranged to output a pair of image signals obtained by independently receiving a pair of light beams that have passed through different exit pupil regions of an imaging optical system; a first filter formed from a summation filter; a second filter formed from the summation filter and a differential filter; an acquisition unit arranged to acquire an f-number of the imaging optical system; a filtering unit arranged to perform filtering of the pair of image signals by selecting the first filter When the f-number is less than a predetermined threshold and selecting the second filter When the f-number is not less than the threshold; and a focus detection unit arranged to perform focus detection by a phase difference method based on the pair of image signals that have undergone the filtering by the filtering unit.



I don't assume when I have the option of actually knowing. I like to actually KNOW...know everything I can about as many things as I can. I have based my assertions in this debate on actual facts, derived from a patent that clearly and explicitly specifies exactly what DPAF is: What it's purpose is, how it works, why it works that way. 

You now have the patent number. You have what you need to go look the patent up yourself and read it.


----------



## neuroanatomist (Jun 21, 2014)

x-vision said:


> neuroanatomist said:
> 
> 
> > Your claim of a quad-pixel design is based on a flawed assumption you're making after totally misinterpreting a single piece of data (a picture of a part of the sensor that's not the active imaging area).
> ...



Well, that's magnanimous of you considering that you've been told at least half a dozen times that the image was irrelevant and proof of nothing. 

If your claim isn't based solely on that image, then upon what VERIFIABLE FACTS (you know...things like patents) is it based? One would think after the number of posts in which you insisted your opinion was plausible that you could have produced at least _some_ additional documentation.



x-vision said:


> Btw, this is the CanonRumors site.
> By definition, most of the info published on the _*front page*_ of this site is ... ill-informed speculation.
> If you have such high standards for fact vs fiction, how come you are a regular on this forum ??
> If you were holding CanonRumors accountable for the info that they are publishing, they should have closed shop by now.



Personally, I have no difficulty in distinguishing rumor from fact from logical opinion from incorrect opinion based on bogus assumptions. I think most of know into which of those buckets your 'the 70D uses quad-pixels' idea falls.


----------



## Sabaki (Jun 21, 2014)

Anybody think it's possible Canon could drop a dedicated processor into the 7D2 to handle AF subject recognition, like the 1DX does?

Better noise handling is my biggest want but a superb AF system is a close second.


----------



## jrista (Jun 21, 2014)

Sabaki said:


> Anybody think it's possible Canon could drop a dedicated processor into the 7D2 to handle AF subject recognition, like the 1DX does?
> 
> Better noise handling is my biggest want but a superb AF system is a close second.



It depends. If the 7D II hits with a new DIGIC processor, they may not need to resort to a dedicated AF processor. Each generation of DIGIC chips gets considerably more powerful than their predecessors. I'd imagine a DIGIC 7 would be quite considerably more capable than the DIGIC 6 used in some of the more compact cameras. DIGIC 6 does a LOT of image processing (It's more like Sony's Bionz X chip than the DIGIC 5), with high quality noise reduction, high quality video processing, etc. If Canon created a DIGIC 7 with some 7x or so more processing power than the DIGIC 6, they could easily handle high frame rates as well as high end AF capabilities all in the one chip (or two chips, as it's probably likely to be.)


----------



## x-vision (Jun 21, 2014)

jrista said:


> Here is what Canon's ACTUAL patent, from the US Patent office, actually states:



Oh my goodness! You haven't read the patent past the summary section. 

Because if you had, you would have come across section [0038].
Here's it is on Page-13 of the patent's PDF file (which you can get from here):


> *[0038]* FIG. 2A is a view for explaining the pixel arrangement
> of the image sensor... FIG. 2B is an enlarged view of
> the pixel 210G which includes a plurality of photoelectric
> conversion units (to be referred to as “*sub-pixels* 201a and
> ...



So, you quoted this patent in support of your TWO PHOTODIODES claims - but the patent says otherwise. 

Dude, you have to work not only on your technical skills but on your comprehension skills in general. 
When arguing about something, it's plain retarded to argue with yourself. 

Canon themselves are talking about sub-pixels. 
Not only that. Here's the last part of Section [0038]:



> The pixel group 210 having the above-described
> structure is repetitively arranged. Note that in the arrange-
> ment shown in FIGS. 2A and 2B, all pixels include the sub
> pixels 201a and 201b. _*Instead, the pixels including the sub
> pixels may be discretely arranged in the image sensor 107.*_



This is (maybe deliberately?) vague but one way to interpret it is that Canon is saying that the 
arrangement illustrated in Fig. 2A doesn't need to be strictly followed. 
That is, they are opening themselves to a different implementations of the same concept - without
actually specifying the implementation.

Anyway, I don't know what you are arguing about.

This patent is about a methid of focus detection on a sensor. 
That is, they are patenting the *method* itself; nothing is mentioned about how exactly 
they are going to etch the pixels and sub-pixels on the wafer.

They say that each pixel has two sub-pixels - but at the same time they are also leaving the door 
open for alternative implementations as well. 

That would be very clear to a technical person (or someone with a legal background as well - because of 
how the last part of Section [0038] is worded). 

But since you are not a technical person these things are escaping you.
(You could have at least read the patent in its entirety, btw - but you haven't done even that.) 

I'm done.


----------



## x-vision (Jun 21, 2014)

jrista said:


> "... an image sensor including _pixels_ _*each having a pair of photoelectric conversion units (photodiodes)*_ capable of outputting the *pair* of image signals obtained by _independently_ receiving a *pair* of light beams...



Btw, the word 'photodiode' is not even mentioned in the patent. So, you basically forged this quote. 

The patent talks about sub-pixels, not photodiodes. 
Your entire stance is based on forged/misinterpreted information.


----------



## mkabi (Jun 21, 2014)

jrista said:


> Sabaki said:
> 
> 
> > Anybody think it's possible Canon could drop a dedicated processor into the 7D2 to handle AF subject recognition, like the 1DX does?
> ...



I'm hoping for a dual-Digic 6+ or dual-Digic 7...


----------



## x-vision (Jun 21, 2014)

neuroanatomist said:


> If your claim isn't based solely on that image, then upon what VERIFIABLE FACTS (you know...things like patents) is it based?



See my previous reply to Jrista. 

He's been misleading you all - based on his (mis)interpetations of Canon's patents.
The Canon patent he quoted talks about sub-pixels, no photodiodes, as Jrista claims.
So, verify your facts when siding with someone. Otherwise, you end up looking a fool yourself.


----------



## neuroanatomist (Jun 21, 2014)

x-vision said:


> neuroanatomist said:
> 
> 
> > If your claim isn't based solely on that image, then upon what VERIFIABLE FACTS (you know...things like patents) is it based?
> ...



It clearly states *TWO* sub-pixels for each pixel, not four as you've been claiming all along. Please, just stop...you're embarrassing yourself.


----------



## Don Haines (Jun 21, 2014)

Who is more foolish, the fool, or the fool that keeps arguing with him?

Gentlemen, it is more than obvious that you are not ever going to agree. Give it a rest. Go take some pictures.... have a beer, watch a soccer game.. relax.. chill out.... hug your wife... play with a kitten...

There are far more important things to do than continue a battle where both sides loose.


----------



## x-vision (Jun 21, 2014)

neuroanatomist said:


> It clearly states *TWO* sub-pixels for each pixel, not four as you've been claiming all along.



Good. At least we've have established that these are sub-pixels, not photodiodes. 

But I guess that also makes Jrista's _hundreds_ of misleading claims about photodiodes all false.
It seems that he's the one who's been continuously embarrassing himself - together with the small 
gang lining up in support of an imposter. 

Me? I just made a speculation that instead of two sub-pixels, Canon is using four. 
That's not the least embarrassing.

LOL. I'm having so much fun.


----------



## danski0224 (Jun 21, 2014)

I'm hoping for a Foveon type sensor


----------



## Orangutan (Jun 21, 2014)

x-vision said:


> LOL. I'm having so much fun.



But you're learning nothing, not about photography nor about communicating with people.


----------



## neuroanatomist (Jun 21, 2014)

Orangutan said:


> x-vision said:
> 
> 
> > LOL. I'm having so much fun.
> ...



Gaining new knowledge and effective communication are not priorities for trolls.


----------



## neuroanatomist (Jun 21, 2014)

x-vision said:


> Me? I just made a speculation that instead of two sub-pixels, Canon is using four.
> That's not the least embarrassing.



Persistent defending a clearly flawed argument with a complete lack of supporting evidence against all documented evidence to the contrary is what you should be embarrassed about. The fact that you seem to find it amusing says much about your character...and none of it good.


----------



## x-vision (Jun 21, 2014)

neuroanatomist said:


> Persistent defending a clearly flawed argument with a complete lack of supporting evidence against all documented evidence to the contrary is what you should be embarrassed about. The fact that you seem to find it amusing says much about your character...and none of it good.



I think I've said at least a hundred times that I've been only speculating. 
And yet, you keep repeating that I'm defending a flawed argument. I'm not. 

Here it is one last time: I was/am just speculating ... on a speculation forum. 
Got it?

On the other hand, I did show that another forum member, jrista, has been making baseless claims. 
That's not the issue, though. He's been bullying people, including myself, with these claims. 

But somehow that's OK with you. In fact, you are obviously sympathetic and supportive of this bully.
Fair enough. You have to live with yourself, not me.


----------



## jrista (Jun 21, 2014)

x-vision said:


> jrista said:
> 
> 
> > "... an image sensor including _pixels_ _*each having a pair of photoelectric conversion units (photodiodes)*_ capable of outputting the *pair* of image signals obtained by _independently_ receiving a *pair* of light beams...
> ...



Photoelectric converter == photoconductor == photodiode.

It's all the same thing. I haven't _forged_ (LOL ???) or misinterpreted anything. As for "sub-pixel"...again, the *same thing*. It's all a photodiode. It's a concept were talking about here...an anode and a cathode plugged into a bit of silicon that has the ability to convert incident photons to free electrical charge. That's what a sub-pixel in Canon's patent is! I've always used the term photodiode (because that's what it is, if you look at the symbol on an electrical diagram, and there are two other patents in Japanese that have actual electrical diagrams, it's a *light-sensitive diode*). Some of the other patents actually clearly show two "photoelectric converters" per pixel in multiple diagrams. If you had read my earlier comments about SPATIAL RESOLUTION, you would understand WHY it's the same thing, and why it doesn't matter if there are two, four, or N number of them contained within a single "pixel" (a multi-layered structure containing one or more photodiodes, a CFA, and a microlens). 



x-vision said:


> neuroanatomist said:
> 
> 
> > It clearly states *TWO* sub-pixels for each pixel, not four as you've been claiming all along.
> ...



Hmm. Hundreds, eh? Care to, um, enumerate all several hundred for us? I'd...really like to see that.

You've somehow equated the term "sub-pixel" with "pixel". Why use a different term, sub-pixel, if it's the same thing? Sub-pixel == photodiode == photoconductor == photoelectric converter. _Conceptually_, in the context of CIS, these things are identical. They are all represented by the same symbol in an electrical circuit diagram (a photodiode:







). _Conceptually_, in the context of CIS, a pixel and a photodiode are not the same thing. This is a pixel:






As it so happens, this is a pixel with two photodiodes (the N-type silicon dropped into the P-type substrate.) I really don't care what terms are used, ultimately in the end, the term used to describe the thing isn't what matters, it's the concept the term encapsulates that matters...sup-pixel, photodiode, photoelectric converter...pick your poison. A (full, discrete, atomic) pixel and a photodiode are different conceptual things. You can mince words all you want, but now your obfuscating and dancing around the original point: _You_ have claimed, in multiple threads for a good while now, that not only does Canon have QPAF, but that somehow QPAF/DPAF somehow leads, probably with ML (although I don't remember if you actually said that exactly) to better resolution. THOSE are the points at debate. Try all you want to play me for a fool, it isn't going to phase me, I don't care. Be as happy as you want that you discovered the term "sub-pixel" in the patent. To me, it's the same freakin thing, the same exact concept...a photodiode. I don't equate sub-pixel with pixel, as one is a complex multi-layered structure, one is a bit of doped silicon with an anode and a cathode tacked onto the ends that is a part of a pixel. 

From a spatial resolution standpoint, DPAF doesn't bring anything to the table. You would need to redesign THE PIXEL, that complex multilayered thing built into the silicon substrate, to actually make DPAF, QPAF, or any number of bits to a diced up piece of silicon, become something more than just being separate photodiodes/photoconductors/photoelectric-converters/sub-pixels under one microlens/color filter into something that actually can actually meaningfully separate spatial frequencies, resolve them independently, and which ultimately represents more detail in a two-dimensional spatial frequency (in other words, an image.) 



Anyway...I'd like to see the list of _hundreds_ of things I've made misleading claims about. That's hundred*s*, plural. Get to work, bub! 

As for me, as Don has said, this conversation has just gone way off the tracks and has become pointless. I'm not embarrassed by anything I've said here, I'm confident in my knowledge and assessments, but the conversation itself is becoming embarrassing. It's clear your not interested in any of the facts, this has devolved into an "I'm right your wrong" spitfest. I'm not interested in that. I proved my points, I debunked the myth I wanted to debunk, and not even for your sake...for everyone else's sake (although I don't think they care any longer): There is no QPAF. DPAF does not enhance resolution, and likely never will (not without PIXEL redesign). You've retreated, and contracted your argument into the most basic, minimally attackable position possible: _I was just speculating and having fun!_ : : Fine by me. I'm clearly not alone in my assessments, others back me up, so I'm happy to exit the conversation here. I don't like to keep debating once the locals get fed up with the conversation.  (Sorry, Don!)


----------



## neuroanatomist (Jun 22, 2014)

x-vision said:


> Here it is one last time: I was/am just speculating ... on a speculation forum.
> Got it?



Speculation concerns the UNknown. Since Canon has patented dual pixel AF and explicitly stated that is the technology used in the 70D, that is KNOWN FACT. Stating that it could be 'quad pixel' isn't speculation, it's just plain wrong. 

But yes, I get it. You made claims and provided 'evidence', and when both were proven false, you continued to defend them as evidence mounted against you until...well...you were only speculating on a rumor forum for fun. That's a cop-out, pure and simple...and a rather undignified and pathetic one at that.


----------



## x-vision (Jun 22, 2014)

jrista said:


> You've somehow equated the term "sub-pixel" with "pixel".



Yes ... and this is correct. 



> Why use a different term, sub-pixel, if it's the same thing?


Canon is using the term sub-pixel in their patent, not me.

You _think _ that a sub-pixel and a 'photodiode' are the same thing - but they are not. This is where you are incorrect. 
A sub-pixel _*is*_ a pixel - and includes a photodiode plus readout circuitry.

So, this is not just a matter of using (incorrect) terminology. 
It's about understanding of how things work and what is meant when someone says a photodiode and a (sub)pixel.



> _Conceptually_, in the context of CIS, these things are identical.



Canon's diagram doesn't show a pixel with two photodiodes. 
It's just an illustration the of principle of reading two halves of a pixel independently. 

Here's how Canon describes Fig.2C that from your post: 



> [0017] FIGS. 2A to 2C are *views for explaining the arrangement* of an image sensor according to the embodiment;



These are _views for explaining the arrangement_. Just a conceptual diagram, basically. 
Nothing is mentioned about photodiodes. You are reading too much into it if you are thinking that 
it's an actual diagram of the sensor. 



> _You_ have claimed, in multiple threads for a good while now, that not only does Canon have QPAF, but that somehow QPAF/DPAF somehow leads, probably with ML (although I don't remember if you actually said that exactly) to better resolution. THOSE are the points at debate.



I have SPECULATED (repeat, *SPECULATED*) that the dual-pixel tech is in fact a quad-pixel tech. 

But for some reason you started a crusade against this (harmless) speculation - resulting in this tiresome debate 
about pixels and photodiodes - and about how you know what you are talking about - and how I don't.

I don't even understand why you are arguing.
The DPAF patent in no way disproves my speculation. 
In fact, the patent even hints that having dual sub-pixels is not the only possible arrangement. 
A quad-(sub)pixel arrangement is certainly not ruled out by this patent.



> Be as happy as you want that you discovered the term "sub-pixel" in the patent. To me, it's the same freakin thing, the same exact concept...a photodiode. _*I don't equate sub-pixel with pixel*_, as one is a complex multi-layered structure, one is a bit of doped silicon with an anode and a cathode tacked onto the ends that is a part of a pixel.



That's something you have to work on.

You will eventually realize that to make a sub-pixel ... it needs to be a full-blown pixel. 
Or, as Canon puts it their own patent: _the pixels including the sub pixels may be discretely arranged in the image sensor_. 



> As for me, as Don has said, this conversation has just gone way off the tracks and has become pointless.



Absolutely!


----------



## Brian VA (Jun 22, 2014)

While no one knows ( at least publicly) what the new 7D MK2 sensor is, I speculate we will all be surprised and impressed. Sounds like Canon will become a shaker and a mover as they have in the past with this new technology. I am very optimistic for something groundbreaking. The pieces I am putting together is that the CR1 pasted below is the same new sensor we might see in the 7D MK2. There are now rumors of a 5D MK4 and a 1-DX MK2 coming in early 2015. Is this to get the new sensor technology in these flagship full frame cameras? Based on the CR2 rumor that the new sensor will be used in all forthcoming cameras, I think it is. I'm excited and looking forward to the 7D MK 2!


A mention on Northlight about someone from Canon apparently visited a few studios in the New York City area recently with a “test” camera in an EOS-1D X body. Images had to be processed and viewed on a particular laptop and none of the images could be copied or kept.

The image files were very similar in size to the EOS 5D Mark III’s 22mp files, but exhibited “much” better colour accuracy and detail. This camera is supposedly for later this year or early next year.

Could this be a replacement to the EOS-1D X, or an introduction of a new camera such as the rumor favourite EOS 3D?


----------



## x-vision (Jun 22, 2014)

jrista said:


> You've somehow equated the term "sub-pixel" with "pixel". Why use a different term, sub-pixel, if it's the same thing?



So, the crux of this argument, really, is whether a sub-pixels is a photodiode or a pixel.

Let me just state the Canon's DPAF patent doesn't even mention the word photodiode. 
Instead, it is using the wording 'sub-pixel'.

You are only _*assuming*_ that by a 'sub-pixel' Canon actually means a photodiode. 
This is just an assumption, however, as no statement/fact from the patent supports it. 
Let's be very clear about this. 

I, on the other hand, am *assuming* that a sub-pixel is in fact a full-blown pixel.
This is another assumption, however, as the patent doesn't define what a sub-pixel really is.

In other words, this tiresome, pointless debate is a debate about two assumptions. 

I am perfectly fine that I'm making an assumption. But yours is an assumption too, mind you. 
Neither your assumption, nor mine, is any more valid than the other, though, as what you've 
been saying is no more based on facts vs what I've saying.


----------



## jrista (Jun 22, 2014)

x-vision said:


> jrista said:
> 
> 
> > You've somehow equated the term "sub-pixel" with "pixel". Why use a different term, sub-pixel, if it's the same thing?
> ...



It does state *photoelectric converter*, though, in the abstract (a simplification into fewer words of the extremely wordy breakdown) which most definitely IS a photodiode. The abstract also clearly states that there are two "photoelectric converters" per "pixel". You've handily ignored the abstract, but it is still an entirely valid description, and is still a part of the patent. The description of a sub-pixel, in combination with how they are portrayed in the diagrams, also indicates they are photodiodes. Sure, there is readout logic, as there is binning and readout logic for the whole pixel. Is the readout logic part of the pixel, or the photodiodes? We could debate that round and round as well, I'm sure. Again...all just words used to describe concepts. We can mince words all day and all night, which is why I'm going to go back to working in my yard. Ta! Ta!


----------



## x-vision (Jun 22, 2014)

jrista said:


> It does state *photoelectric converter*, though, in the abstract (a simplification into fewer words of the extremely wordy breakdown) which most definitely IS a photodiode.


The patent doesn't state/claim that a 'photoelectric conversion unit' is a photodiode.
You claim that - and I disagree, since this claim is not based on what the patents says. 



> The abstract also clearly states that there are two "photoelectric converters" per "pixel". You've handily ignored the abstract, but it is still an entirely valid description, and is still a part of the patent.


The abstract doesn't mention anything about photodiodes. You are making stuff up. 



> The description of a sub-pixel, in combination with how they are portrayed in the diagrams, also indicates they are photodiodes.


Nothing indicates that these are photodiodes. They are not marked as such with the photodiode symbol, for example. 
So, it's actually you who is assuming that these are photodiodes - the patent certainly doesn't indicate/state that. 



> Sure, there is readout logic, as there is binning and readout logic for the whole pixel. Is the readout logic part of the pixel, or the photodiodes?



A photodiode doesn't have a read-out logic; a pixel does. 
So, by this token, sub-pixels are in fact pixels, as pixels do have read-out circuits - unlike photodiodes.



> We could debate that round and round as well, I'm sure. Again...all just words used to describe concepts.



Sure. It's your assumptions vs mine. No facts from you so far, mind you.


----------



## jrista (Jun 22, 2014)

Ah...so, we have a sensor now, without any light sensing components? But we have sub-pixels, and that's VEEERY IMPORTANT, ppls! Interesting...


----------



## x-vision (Jun 22, 2014)

jrista said:


> Ah...so, we have a sensor now, without any light sensing components?


Aah. You come across as the king of false assumptions, you know. 
Who said that sub-pixels don't have light sensing components?
You made this up again. 

By definition, a pixel includes a photodiode. 
And as I already said ... many times ... sub-pixels are full-blown pixels - on the wafer level, that is. 
The only difference is that sub-pixels share the same microlens and color filter with other sub-pixels - to form full pixels.


----------



## Flailingarms (Jun 22, 2014)

jrista and neuroanatomist: You are wasting your time replying to ... well, you know ... seems to me ... well, you know who ... is just flapping his mouth ... er, keyboard ... to hear the rattling in his skull. He may be getting some sort of ego-trip satisfaction therefrom, but I opine he is the only one. There is an old saying: "You can't have a battle of wits with someone who comes to the conflict unarmed."


----------



## x-vision (Jun 22, 2014)

Flailingarms said:


> ... to hear the rattling in his skull.



You haven't read a single thing that I wrote, you bottom feeder. 
But crawled out from under your rock to share some personal insults. 
What a friendly little fella.


----------



## Orangutan (Jun 22, 2014)

Flailingarms said:


> jrista and neuroanatomist: You are wasting your time replying to ... well, you know ... seems to me ... well, you know who ... is just flapping his mouth ... er, keyboard ... to hear the rattling in his skull. He may be getting some sort of ego-trip satisfaction therefrom, but I opine he is the only one. There is an old saying: "You can't have a battle of wits with someone who comes to the conflict unarmed."



That's the sad part: he's not entirely stupid, he might even have some useful brainpower. The problem is that to be *more* knowledgeable than he is today, he needs to be quick to admit when he's wrong. The kid (I assume it's a younger person) might be able to make something of himself if he'd dial-down the testosterone and realize that an empty/false "victory" is less important than learning. Well, either he'll grow up or he won't.


----------



## x-vision (Jun 22, 2014)

Orangutan said:


> That's the sad part ...



I somehow get the impression that this jrista guy is some kind of royalty around here.
Why is every one kissing his @ss when he's so misguided on multiple accounts ??
Am I missing something? That's really rubbing me the wrong way. 

And yes, I did admit that I was wrong for the thing that I was wrong about.
Let's see if this guy jrista will do the same for the things that he's wrong about.


----------



## Orangutan (Jun 22, 2014)

x-vision said:


> Orangutan said:
> 
> 
> > That's the sad part ...
> ...



jrista is not "royalty." He's a very knowledgeable guy who has, in the past, readily admitted when he was wrong. He has a history of being generous with his time and experience, and arguing fairly. This has earned him a measure of respect, but I doubt anyone would cite him as an infallible authority, least of all himself.

My take-away is this: given jrista's history of arguing fairly, if you threw your best at him and he disagreed, then the burden is on you to be more persuasive. That's not to say that you're wrong, but that you've been unpersuasive. So why don't you try this: start all over with your thought-experiment. Be more precise about your definitions, and even try to dodge irrelevant definitions when they get in the way of the discussion. Lay out your speculation in language that's not contested, citing supporting evidence where you can find it. Then see what response you get.

By the way, there are several working EE's on this forum. My guess is that if jrista were way off the mark one of them would have corrected him (and it could still happen).


----------



## x-vision (Jun 22, 2014)

Thanks for the thoughtful and respectful answer.


----------



## jrista (Jun 22, 2014)

And back to our regularly scheduled programming.

The 7D II's new technology. What will it be? Here are my thoughts, given past Canon announcements, hints about what the 7D II will be, interviews with Canon uppers, etc. 

1. *Megapixels:* 20-24mp
2. *Focal-plane AF:* Probably DPAF, with the enhancements in the patents Canon was granted at the end of 2013. *
3. *New fab process:* Probably 180nm, maybe on 300mm wafers. **
4. *A new dedicated AF system:* I don't know if it will get the 61pt AF, probably too large for the APS-C frame. A smaller version...41pts would be my hope. Same precision & accuracy of 61pt system on 5D III, with same general firmware features.
5. *Increased Q.E.:* Canon has been stuck below 50% Q.E. for a long time now. Their competitors have pushed up to 56% and beyond, a couple have sensors with 60% Q.E. at room temperature. Higher Q.E. should serve high ISO very well.
6. *Faster frame rate:* I suspect 10fps. I don't think it will be faster than that, 12fps is the reserved territory of the 1D line. 
7. *Dual cards:* CF (CFast2) + SD. I hate that, personally, but I really don't see the 7D line getting dual CF cards. (I'll HAPPILY be proven wrong here, though!)
8. *No integrated battery grip.* Just doesn't make sense, the 7D was kind of a smaller, lighter, more agile alternative to the 1D, a grip totally kills that.
9. *New 1DX/5DIII menu system.* Personally, I would very much welcome this! LOVE the menu system of the 5D III.
10. *GPI and WiFi:* I think both should find their way into the 7D II, what with the 6D having them. Honestly not certain, though...guess it's a tossup.
11. *Video features:* Video has always been core to the 7D II rumors. 60fps 1080p; 120fps 720p (?); HDMI RAW output; External mic jack; 4:2:2; I think DIGIC 7 would probably arrive with enhancements on the DIGIC 6 image and video processing features. Maybe on par with Sony's Bionz X chip.

* Namely, split photodiodes, but with different sizes...one half is a high sensitivity half, the other half is a lower sensitivity half. The patents are in Japanese, and the translations are horrible, so I am not sure exactly WHY this is good, but Canon's R&D guys seem to think it will not only improve AF performance and speed, but "reduce the negative impact to IQ"....which seems to indicate that the use of dual photodiodes has some kind of impact on IQ, a negative impact. 

** We know Canon has been using a 180nm process for their smaller form factor sensors for a while. Not long ago, a rumor came through, I think here on CR, indicating Canon was building a new fab and would be moving to 300mm wafers. That should greatly help Canon's ability to fabricate large sensors with complex pixels for a lot cheaper. A smaller process would increase the usable area for photodiodes, as transistors and wiring would be a lot smaller than they are today on Canon's 500nm process. That would be a big benefit for smaller-pixel sensors. If they moved to a 90nm process, all the better. I don't suspect we'll see any kind of BSI in the 7D II...but, who knows.


----------



## Orangutan (Jun 22, 2014)

jrista said:


> And back to our regularly scheduled programming.
> 
> The 7D II's new technology. What will it be? Here are my thoughts, given past Canon announcements, hints about what the 7D II will be, interviews with Canon uppers, etc.
> 
> ...



If they sell that camera at a reasonable price I can see myself buying one (4-6 months after release, when the price drops)


----------



## Marauder (Jun 22, 2014)

jrista said:


> And back to our regularly scheduled programming.
> 
> The 7D II's new technology. What will it be? Here are my thoughts, given past Canon announcements, hints about what the 7D II will be, interviews with Canon uppers, etc.
> 
> ...



41 AF points would be excellent, especially if they were all cross-type! Still, Nikon has crammed 51 into its cropped frame, so it might be possible for Canon to put 61 into a crop frame. Based on what I've read (and the little on the few minutes I actually handled a 1DX at a camera show), the 61 point systems on the 1DX/5D3 are quite concentrated in the centre. Is it possible for Canon to adapt a 61 point system for the 7D2 that fills more of the frame, given that it's a crop? Would that be a possible adaptation of the existing Full Frame AF to a Crop Frame analogue--same density over a "larger" area of the frame given the tighter field of view?

As to the frame rate, the initial specs all said 10 fps. When they suddenly began to suggest it might be 12, I was rather surprised, but I'd be pleased if it were true--but 10 will be awesome too (it definitely needs to outperform the 8 fps of the current model!). I suppose it depends on what they think Nikon will do, and also what the next 1D series will do. I also think it will probably be 10, but 12 would give it more time on top of the competition and they might just want to overshoot in order to make this a standard for crop-frames for the next several years. 

Either way, I'm excited!


----------



## flyingSquirrel (Jun 22, 2014)

After reading a number of (what else, rumors, albeit convincing ones) on a few other websites,* I am highly inclined to believe that there will NOT be a 7D mark II.* I am very convinced that Canon will unveil this new body, which we are referring to as the 7D mark II, as a completely "unrelated" camera line/series/ what have you. It could potentially have some things in common with the 7D, and could potentially be considered somewhat of a follow up to it, but I really think this is meant to be the next level of body, something new, different, and not really meant to be a 7D mark II. This is completely my own feeling, based on what I've read here and on other sites. All I can do, personally, is pray that it is geared toward the type of shooting that I do, and my needs. If not, I will be beyond disappointed.

EDIT: After more 100% pure speculative thinking, I have decided that I would be willing to bet money (if I had any money) that this camera will have 30+ MP, and that, in Q1 of 2015, a camera will be released from Canon which will have 40-50 MP. Call me crazy.


----------



## jrista (Jun 22, 2014)

Marauder said:


> jrista said:
> 
> 
> > And back to our regularly scheduled programming.
> ...



The Canon 61pt system was actually ground-breaking for covering the widest spread of frame ever. Based on what I see in my 5D III viewfinder, I think the AF point spread is larger than the entire 7D frame!  It's actually rather incredible. 

That said, the dedicated phase-detect system is comprised of a little unit embedded in the base of the mirror box. Part of the unit is a small lens that splits the light and redirects it to each line sensor. I don't see why that little lens couldn't be redesigned for a smaller frame, but then I don't know if the size of the AF sensor itself would be too large (require too much bending of light that you could no longer accurately detect phase at the extremes?)

I think a 10fps frame rate is more than reasonable. I was pretty happy with 8fps on the 7D, 10fps would just be a bonus. I think 12fps would require some fairly significant engineering...that was another one of the ground-breaking things with the 1D X. They had to completely redesign the mirror apparatus to handle that kind of frame rate. It really sounds like a machine gun, too, and I'm actually not sure I like that. The 5D III is so quiet, it's wonderful for wildlife (it has more of a soft chi-ching sound, vs. the 7D's cha-thuck, and the 1D's Cha!-Cha!-Cha!-Cha!-Cha!-CHA!)


----------



## jrista (Jun 22, 2014)

flyingSquirrel said:


> After reading a number of (what else, rumors, albeit convincing ones) on a few other websites,* I am highly inclined to believe that there will NOT be a 7D mark II.* I am very convinced that Canon will unveil this new body, which we are referring to as the 7D mark II, as a completely "unrelated" camera line/series/ what have you. It could potentially have some things in common with the 7D, and could potentially be considered somewhat of a follow up to it, but I really think this is meant to be the next level of body, something new, different, and not really meant to be a 7D mark II. This is completely my own feeling, based on what I've read here and on other sites. All I can do, personally, is pray that it is geared toward the type of shooting that I do, and my needs. If not, I will be beyond disappointed.
> 
> EDIT: After more 100% pure speculative thinking, I have decided that I would be willing to bet money (if I had any money) that this camera will have 30+ MP, and that, in Q1 of 2015, a camera will be released from Canon which will have 40-50 MP. Call me crazy.



If the thing that is released is heavily video-based, then I think it'll probably be something else. If it is still primarily a stills DSLR geared for semi-pro action shooters, I think it will still get the 7D moniker.


----------



## mb66energy (Jun 22, 2014)

What about a 7D C(inema) as a parallel development with
4k Video at reasonable frame rates
a hybrid viewfinder

and a photo related 7D mark II which will be anounced later
with a totally new sensor tech.
Preferably a sensor which separates the whole incoming light
e.g. by interference filters to feed three photodiodes for R-, G- and
B-channel ... just dreaming about nearly lossless color separation
and a 1:1 mapping of imaging and image pixels!


----------



## Palettemediaproduktion (Jun 22, 2014)

Why don't we just sit still and wait until Canon actually shows us the next camera instead of guessing like this? Answer: Because it is so fun to guess and speculate. We enjoy ourselves here. My suggestion is to have a gentle smile on the lips writing your next post. I have a hard time with some of the heavy metal "I know better than all of you" styles appearing here from time to time. 

Ones upon a time the earth was considered to be flat. Then some one at the risk of his life made it change to a sphere. Today you find lots of clips on YouTube suggesting it is flat after all. Or rather: an inverted sphere. And of course we would all like to kill them off and will not listen to their claimed proofs. This is the downside of being a human. We are always feeling we got it better than everyone else. That is how the thinking brain we have is designed to function. "We will never fly, there will never be color television". You know, most of us are not interested in becoming a new Jules Verne. We just want everyone with a different idea that opposes our believes to shut up! And then have them say "My fault. YOU are right." 

We can help each other to gain from this if we use our knowledge about the strong part of our intellect and the embarrassing backsides. A flexible mind will move faster towards the unseen and not yet achieved. And lets be honest. We all would like to be there at the heart of the Canon development office giving them our best advices how to make next camera PERFECT. So let us unite and go to work We can fix this! 

Now 

Thank you guys for fighting about the "quad pixel theory" for me. I was tempted to rush back into it but I felt some good persons with a sharper intellect could do it better. Now I can step back in and continue where I left. There is still smoke in the air. Somehow I fear it is not going to disappear. Thank you X-vision for doing the dirty work for me  


canonrumors.com quote:

"We’re told to definitely expect new sensor technology to be introduced in the Canon EOS 7D Mark II. This tech will be used in all forthcoming Canon DSLRs. What is it? We’re not 100% sure yet, though we’re told it’s definitely not a foveon type technology that we’ve previously seen in patents.

This may be one of Canon’s best kept secrets as it’s apparently going to be more than an “evolutionary” technology."

Link: http://www.canonrumors.com/2014/06/new-sensor-tech-in-eos-7d-mark-ii-cr2/


This sounds like speculation, but still I think most people here believe it could be true, right? So now I will SPECULATE and do some guessing together with you. If Canon actually is moving on to new sensor tech. What is it? And are they making a U-turn or are they just pushing the tech forward in more than one step at the time?

If the quote from canonrumors is to be taken seriously the new sensor tech will be used on both pro and at least semipro DSLR in the future. FF and crop sensors will share this technology.

Some might say it will most likely just be180nm process and that´s it. But that will not be named "Canon’s best kept secrets" by anyone. So I think it is reasonable to assume they are trying to develop something that can rival the current SONY sensor tech.

I have already posted the link to the paper describing the CMOS Image Sensors With Multi-Bucket Pixels for Computational Photography in IEEE 

JOURNAL OF SOLID-STATE CIRCUITS, VOL. 47, NO. 4, APRIL 2012 written by Gordon Wan, Xiangli Li, Marc Levoy, Mark Horowitz and Gennadiy Agranov - Vice President of Imaging Technology at Aptina Imaging. They describe the quad bucket sensor aimed at Computational Photography. 

https://graphics.stanford.edu/papers/gordon-multibucket-jssc12.pdf 


Thanks to the last posts from X-vision we can compare this with the Canon patented dual pixel concept. 


In the simplified abstract Canon says: 

"Each pixel on the EOS 70D camera's sensor consists of two independent photodiodes that function both as imaging points and as individual phase-difference AF sensors. When the shutter button is pressed, parallax images on each photodiode of the pixel are detected, the amount of lens drive is calculated to correct the amount of shift in the AF points, and AF is achieved nearly instantaneously. During image capture, the same two photodiodes record the image and output as a single pixel. By placing approximately 40.3 million photodiodes on the camera's sensor, two per pixel, this caliber of AF is possible on approximately 80% of the image plane, vertically and horizontally. When the image or video clip is being captured, the CMOS sensor behaves as it always has with EOS SLR cameras, unimpeded by the dual photodiodes and recording each individual pixel with virtually no loss of detail or sharpness." 

In the detail patent we start to see that we might all be fooled by the "smoke screen" of the mixed terminology. It may be the reason why we are having these unnecessary confused discussions. We are presented several different words describing the subpixel function. 

A) Independent photodiode (Canon) 
B) Subpixel (Canon) 
C) Bucket (Aptina) 



I have mentioned before on the forum that the people at Aptina are discussing the Computational use of a quad bucket pixel design. Or call it a quad pixel under one micro lens. Canon calls its tech "dual pixel" but I think people more and more start to realize that is just "a name". It is a dual subpixel design. We have to start describing the design more clearly from now on! I might go for dual (or quad in the future) pixel. But someone might want to call it "dual bucket pixel" or "dual subpixel" instead. In any case lets agree that each subpixel has its own individual signal line that now or in the future can be used separately for computational DSP. That is my claim any way. Other might just want to stick to the idea that it can only be used for autofocus. And if/when the A/D conversion is done directly on the chip or even on every single pixel the next logical step is to allow for a design with four A/D channels per pixel if we talk about a quad diode design. (quad buckets or subpixels under one micro lens). 

This is what the Aptina folks have been looking into for several reasons.


The "buckets" discussed here have separate outputs and can be read as four individual signals in order to use them for computational processing. You can set four different ISO - one for each "bucket" output. You can read them separately with a very short, but still, time delay. You can compare the four signals and let a DSP use them for computing how to reduce the noise signal while preserving the statistically most likely source signal value. Isn´t this what astro photographers try to achieve by combining four pixel values down to one (down sampling the image size OR just smudging the image by showing the median value of four pixels with smooth transition to next block of four...) on the image in post processing? 

Sandwiching several images is another popular way to reduce the high frequency noise generated during exposure. But it takes several images at higher ISO and that makes it useless for shooting moving objects. And we still have no clue what is signal and what is noise. We must use a lot of images to exclude the noise from the signal. During the time you are taking multiple images the signal itself might have been differing A LOT. So 
we are averaging the signal and the noise at the same time. Think about long exposure of the sea. It is useless if we want to describe objects that move over time. And here we are of course talking about how to reduce noise from a sensor that deliver video as well as photos. 

How do we reduce noise at hi ISO in 4K video at frame rates between 50 and 200p (to come early next year or in five years perhaps). One solution will be to average multiple versions of the exact same moment. That is what quad bucket computation is able to help us do. And not only averaging. We can apply statistic analyze to perfect the noise cancelation. How that is done should probably be handed over to the guys at TOPAZ or Adobe to tell about. Or the other guy, you all know who I am referring to. (He might have mentioned that when he reaches 4000 posts he will actually go for a 5 minute walk in the park.) 

The optimal noise reduction needs multiple samples to compare. Is it a really smart thing to make that inside the microlens. "ONE" signal (light) goes in through the micro lens. It is interpreted by four individual receivers and the DSP uses intelligent software to try to separate noise from the sensor from the signal. The result is written as one pixel (R, G or B) in the RAW file. Or, why not keep it individual to be able to post process it further with more computer power and future, not yet invented, algorithms. Hmm.. maybe not this year. 

The advantage of computing multiple signals collected at one single moment is the key thing here. The downside is, yes I agree, that the individual buckets are smaller and will collect less photons. And this will reduce the benefits of the "quad pixel" design in terms of noise performance. But maybe less than we might expect. Aptina has invested time and supposedly money exploring the concept. They seem to think they are getting somewhere. 

So could this be what Canon also look into. The "dual pixel" patents keep coming and that points at the propability that Canon move on in this direction. 

Canon´s "latest dual pixel" patent point at the use of two photo diodes with different size. Making it more efficient and potentially resulting in a sensor with much higher dynamic range. There is a possible way to move on from there. To take it one more step further. You might guess what I am thinking about, right? You clever people! 

Patent description here: 
http://www.canonwatch.com/canon-patent-next-dual-pixel-cmos-auto-focus-70d-end-story/ 


Now I think I hear roaring animals in the background. OK. Let the lions in. And the big elephants too. 
Hi guys. No objections i suppose. Everyone agree right. Right?


----------



## dgatwood (Jun 22, 2014)

x-vision said:


> You are only _*assuming*_ that by a 'sub-pixel' Canon actually means a photodiode.
> This is just an assumption, however, as no statement/fact from the patent supports it.
> Let's be very clear about this.
> 
> ...



Short answer: Both halves of the pixel are photodiodes or equivalent. That's the only way you can get a signal from them. Both are underneath a single lens, so they get the same light, at least for in-focus content. For OOF content, the two halves get OOF bleed from opposite sides of the lens, resulting in a difference in amplitude that tells them which parts of the image are out of focus).

Because they both are under a single microlens and are stored in the final RAW output as a single value, I would consider them to be a single pixel. Well, I personally would call each pair a subpixel, since it only represents a single color within what will eventually become a pixel on final output, but....


----------



## Marauder (Jun 22, 2014)

jrista said:


> Marauder said:
> 
> 
> > jrista said:
> ...



I rather like the sound of my 7D shutter.  Much more pleasant than my T3i. But yes, 10 fps would be fine--the 8 fps of the current model is already quite impressive and 10 will match the 1D4 and 1D3, which is impressive in and of itself. 

As to names, well I'm not sure. I hope they call it 7D Mark II, but I'm more interested in what it does than what it's called. I don't mind it having good video features, but I hope it's a stills powerhouse as I rarely use it. If it's primarily a video device, with only minor upgrades to the stills capabilities, it will not interest me. But I suspect the rumours that it is going to be aimed at wildlife and fast action shooting with the addition of cool video features is more likely -- and then it will be awesome!


----------



## tayassu (Jun 22, 2014)

Marauder said:


> jrista said:
> 
> 
> > Marauder said:
> ...



+1 
Although, if they call it something else, the 7D will have a price increase, because it is the only 7D ever, which makes that an advantage for every 7D user ;D


----------



## cnardo (Jun 22, 2014)

My post from yesterday might have OBE'd, and/lost in yesterday's lively technical discussion, so here it is again. Have we discounted Canon Watch's statement ??


Re: New Sensor Tech in EOS 7D Mark II [CR2]
« Reply #135 on: June 21, 2014, 01:02:38 PM »
Looks like the folks over at Canonwatch.com have a different take on this whole 7D2, new sensor info going around as well as both announcement dates and shipping dates! 
Logged


----------



## cnardo (Jun 22, 2014)

Here is their full posting.


Think it is time to sum up some of the rumors I am getting.

I’ve been told (thanks) that whatever Canon is going to announce the third week of August 2014 (18th is the buzz) will be “something really big“ that will reaffirm Canon’s role as a leading imaging company.

This is the same source that contacted me time ago, saying that Canon will not replace the EOS 7D, which is part of a plan targeting a general revamping of Canon’s higher end DSLR lineup. The source repeated that what is coming is ”the biggest change in Canon’s history“.

The same source said that the camera Canon is going to announce will feature a new sensor technology. There is a lot of buzz on the web talking about a possible Foveon sensor (it surfaced previously). I got notified that a new sensor technology will come indeed, without a Foveon-like sensor being explicitly mentioned. The rumors about a new sensor technology seem to be the most reliable so far.

And then there is the viewfinder. Back in December 2013 I had some rumors about a hybrid viewfinder. Well, this is surfacing again too. It also seems that Canon is working on a camera that is oriented to new levels of videography.

Finally, Canon’s big shot that’s going to be announced at the end of August, will be ready to ship in October (so I have been told). It will be showcased at Photokina 2014 with big fanfare, and start to ship the weeks after.

Some considerations. Canon has the research facilities and the engineering know-how to produce something totally new and innovative. A new sensor technology by Canon is at least possible given the fact that Canon’s scientific and technological resources go far beyond their conservative reputation and equally conservative approach to sensor technology of the last years. However, at this stage it is difficult to distinguish the wishful thinking from the reliable rumors.

Take everything with a reasonable pinch of salt. There will be more in the next weeks.

Stay tuned…


----------



## flyingSquirrel (Jun 22, 2014)

cnardo said:


> Here is their full posting.
> 
> 
> Think it is time to sum up some of the rumors I am getting.
> ...



I did notice your original post from earlier (between all the hullabaloo and vicious arguments regarding technology), went over and read that, and then formulated my reply from earlier:

_"After reading a number of (what else, rumors, albeit convincing ones) on a few other websites, I am highly inclined to believe that there will NOT be a 7D mark II. I am very convinced that Canon will unveil this new body, which we are referring to as the 7D mark II, as a completely "unrelated" camera line/series/ what have you. It could potentially have some things in common with the 7D, and could potentially be considered somewhat of a follow up to it, but I really think this is meant to be the next level of body, something new, different, and not really meant to be a 7D mark II. This is completely my own feeling, based on what I've read here and on other sites. All I can do, personally, is pray that it is geared toward the type of shooting that I do, and my needs. If not, I will be beyond disappointed.

After more 100% pure speculative thinking, I have decided that I would be willing to bet money (if I had any money) that this camera will have 30+ MP, and that, in Q1 of 2015, a camera will be released from Canon which will have 40-50 MP. Call me crazy."_


----------



## 9VIII (Jun 22, 2014)

jrista said:


> 9VIII said:
> 
> 
> > jrista said:
> ...



Sorry, maybe I should have communicated that better. I wasn't referring to dual pixel technology, just normal sensors at high resolution.
(If superpixel debayering is really as simple as it sounds, dual pixel technology is completely unnecessary in this context.)



jrista said:


> Well, someday we may have 128mp sensors...but that is REALLY a LONG way off. DPAF technology, or any derivation thereof, isn't going to make that happen any sooner.



http://www.gizmag.com/canon-120-megapixel-cmos-sensor/16128/

I'm still of the opinion that Canon is only limiting resolution because of either the lack of user infrastructure (flash memory needs to drop in price), or the lack of a practical processor to pair with the sensor (problems with size, battery life, heat, etc...).

My bet is they will ramp up resolution as surrounding technology allows.


----------



## jrista (Jun 22, 2014)

9VIII said:


> jrista said:
> 
> 
> > 9VIII said:
> ...



Superpixel sounds like what you want. I actually wish that mainstream RAW editors like Lightroom would offer that as an option, honestly. Some people care more about color fidelity and tonal range than resolution, and having LOTS of pixels with superpixel debayering would be a huge bonus for those individuals.



9VIII said:


> jrista said:
> 
> 
> > Well, someday we may have 128mp sensors...but that is REALLY a LONG way off. DPAF technology, or any derivation thereof, isn't going to make that happen any sooner.
> ...



I am guessing it is more than that. Let's say Canon's next move would be to 3.5µm pixels. With a 500nm process, the actual photodiode, assuming a non-shared pixel architecture, would then actually be barely 2.5µm in size at most (once you throw wiring and readout logic transistors around it.) With a shared pixel architecture you might be able to make it a little larger. On the other hand, if you drop from a 500nm process to a 180nm process, the photodiode area could be close to 3.14µm. (This assumes that wiring and transistors only require a single transistor's width border around the photodiode...it's usually not quite that simple, at least based on micrograph images of actual sensors and patent diagrams.) With a 90nm process, the photodiode could be up to 3.3µm. 

I think the 500nm process is really limiting for Canon now. They COULD do it, there is nothing that prevents them from creating a 3.5µm pixel sensor with 2.5µm photodiodes...but I don't think it would be competitive. The smaller photodiode area wouldn't gather as much light as competitors sensors that are fabricated with 180nm or 90nm processes, and they would just be a lot noisier. 

I am really, truly hoping Canon has moved to a significantly more modern fabrication process with the 7D II sensor. I think that alone would improve things considerably for Canon's IQ.


----------



## 9VIII (Jun 23, 2014)

jrista said:


> Superpixel sounds like what you want. I actually wish that mainstream RAW editors like Lightroom would offer that as an option, honestly. Some people care more about color fidelity and tonal range than resolution, and having LOTS of pixels with superpixel debayering would be a huge bonus for those individuals.



Using it in post sounds nice, but what we're after is a way to save space on the memory card. Could a camera use superpixel debayering as a part of the image capture process and still save the file in RAW format?



jrista said:


> I am guessing it is more than that. Let's say Canon's next move would be to 3.5µm pixels. With a 500nm process, the actual photodiode, assuming a non-shared pixel architecture, would then actually be barely 2.5µm in size at most (once you throw wiring and readout logic transistors around it.) With a shared pixel architecture you might be able to make it a little larger. On the other hand, if you drop from a 500nm process to a 180nm process, the photodiode area could be close to 3.14µm. (This assumes that wiring and transistors only require a single transistor's width border around the photodiode...it's usually not quite that simple, at least based on micrograph images of actual sensors and patent diagrams.) With a 90nm process, the photodiode could be up to 3.3µm.
> 
> I think the 500nm process is really limiting for Canon now. They COULD do it, there is nothing that prevents them from creating a 3.5µm pixel sensor with 2.5µm photodiodes...but I don't think it would be competitive. The smaller photodiode area wouldn't gather as much light as competitors sensors that are fabricated with 180nm or 90nm processes, and they would just be a lot noisier.
> 
> I am really, truly hoping Canon has moved to a significantly more modern fabrication process with the 7D II sensor. I think that alone would improve things considerably for Canon's IQ.



I'm guessing the only reason you mention 90nm and not 30nm is that in this application the cost/benefit ratio favours slightly larger circuits rather than smaller? (you'd only gain minimal surface area but potentially make production much more difficult)


----------



## jrista (Jun 23, 2014)

9VIII said:


> jrista said:
> 
> 
> > Superpixel sounds like what you want. I actually wish that mainstream RAW editors like Lightroom would offer that as an option, honestly. Some people care more about color fidelity and tonal range than resolution, and having LOTS of pixels with superpixel debayering would be a huge bonus for those individuals.
> ...



Nope. Once you debayer, or do any kind of processing to the data, your no longer RAW. Canon does offer the sRAW and mRAW settings. Those are what, at best, you could call semi-RAW. They are closer to a JPEG in terms of actual storage format (YCbCr encoding, or luminance+Chrominance Blue+Chrominance Red), but everything is stored in 14-bit precision. It's also encoded such that you have full luminace data, basically a luminance value for every single OUTPUT pixel, but the Cb and Cr data is sparse, it's encoded from multiple pixels (I forget if it is a 1x2 short row, or a full 2x2 quad), and that encoded value is stored as a single pair of 14-bit Cb/Cr values for every 2 or 4 luminace pixels (I think exactly how many color pixels are encoded per luminance pixel depends on whether your sRAW or mRAW). Now the luminace is encoded per output pixel. If your mRAW, I think that's basically 1/2 the area of the full sensor, and for sRAW is basically 1/4 the area of the full sensor. So your luminance information is encoded from however many source pixels are necessary to produce the right output pixels. I think 2x2 for sRAW, something along the lines of 1.5x1.5 for mRAW. (There is a spec on the formats somewhere, it's been a long time since I've read it...my description above is not 100% accurate, but that's the general gist...basically, a 4:2:1 or 4:2:2 encoding of the image data.) 

You definitely save space with these formats, but I have experimented with them on multiple occasions, and your editing latitude is nowhere remotely close to a full RAW. You can shift exposure around a moderate amount, but you have limits to how far down you can pull highlights, how far up you can push shadows, how far you can adjust white balance, etc. 



9VIII said:


> jrista said:
> 
> 
> > I am guessing it is more than that. Let's say Canon's next move would be to 3.5µm pixels. With a 500nm process, the actual photodiode, assuming a non-shared pixel architecture, would then actually be barely 2.5µm in size at most (once you throw wiring and readout logic transistors around it.) With a shared pixel architecture you might be able to make it a little larger. On the other hand, if you drop from a 500nm process to a 180nm process, the photodiode area could be close to 3.14µm. (This assumes that wiring and transistors only require a single transistor's width border around the photodiode...it's usually not quite that simple, at least based on micrograph images of actual sensors and patent diagrams.) With a 90nm process, the photodiode could be up to 3.3µm.
> ...



Well, I mention 180nm and 90nm because I am pretty sure Canon has the fab capability to manufacture transistors that small. In the smallest sensors, transistor sizes are a lot smaller than that...I think they are down to 32nm for the latest stuff, with pixels around 1µm (1000nm) in size. I think that some of Canon's steppers and scanners can handle smaller transistors, 65nm using subwavelength etching, but I don't know if that stuff has been/can be used for sensor fabrication. I know for a fact that Canon already uses a 180nm Cu fab process for their smaller sensors, so I know for sure they are capable of that. Their highest resolution fabs are around 90nm natively, but again, most of what I've read about them indicates IC fabrication...I've never heard of them being used to manufacture sensors (but there honestly isn't that much info about Canon's fabs...nor who owns them...)


----------



## 9VIII (Jun 23, 2014)

jrista said:


> 9VIII said:
> 
> 
> > jrista said:
> ...




Darn.
After reading a bit about the various file formats (TIFF is high fidelity, but both huge and still damaging even in 16bit) It sounds like the best that could be done would be just to "prep" the raw file for superpixel debayering by saving it with the two green pixels averaged already. It would be useless for anything else, but you use 25% less space. Not nothing, but not great.
People just need to get used to handling large files.


----------



## 9VIII (Jun 23, 2014)

ScottyP said:


> I'll toss it out there.......
> 
> 
> APS-H?



Drool...

We can only hope.

Honestly if Canon wants to win the hearts of entry level consumers and make everyone else look bad they would scrap EF-S and move the entire Rebel line to APS-H. Low light is still king and the only guaranteed way to get more of it is with bigger sensors.


----------



## jrista (Jun 23, 2014)

9VIII said:


> jrista said:
> 
> 
> > 9VIII said:
> ...



The fundamental problem arises when you encode the color information. It doesn't seem to matter if its a chrominance pair, or RGB triplet, or anything else. Once you encode the color information...take it out of it's separate storage values, and bind those discrete red, green, and blue values together into a conjoined value set (i.e. RGB sub-pixel values for a full TIFF pixel, for example), you lose editing latitude.


----------



## 9VIII (Jun 23, 2014)

jrista said:


> The fundamental problem arises when you encode the color information. It doesn't seem to matter if its a chrominance pair, or RGB triplet, or anything else. Once you encode the color information...take it out of it's separate storage values, and bind those discrete red, green, and blue values together into a conjoined value set (i.e. RGB sub-pixel values for a full TIFF pixel, for example), you lose editing latitude.




That sounds like it's just a problem with the way the software is handling the data. A TIFF is still assigning full RGB values to each pixel, the debayering is done, and I'm assuming the original values are lost. Whereas if you store the information in a pre-debayered state, even with one pixel averaged out (which was next on the to-do list anyway) it shouldn't be any different from reading the original RAW... with the slight exception that adjusting the value after averaging would be different than adjusting the values of two pixels and then averaging them (I assume that when you adjust things in post it's playing with the RAW numbers before debayering).
But that still sounds like a fairly inconsequential concession to make compared to storing data after debayering.


----------



## jrista (Jun 23, 2014)

9VIII said:


> jrista said:
> 
> 
> > The fundamental problem arises when you encode the color information. It doesn't seem to matter if its a chrominance pair, or RGB triplet, or anything else. Once you encode the color information...take it out of it's separate storage values, and bind those discrete red, green, and blue values together into a conjoined value set (i.e. RGB sub-pixel values for a full TIFF pixel, for example), you lose editing latitude.
> ...



There are some things in a RAW editor that must be done before debayering (i.e. whitebalance), and some that are usually done after debayering. It's just that some things are more effectively performed with an original digital signal, and others with a full RGB color image. Exposure and white balance are the two main things that benefit most from being processed in the original RAW, where the signal information is pure and untainted with any error introduced by conversion to RGB. 

You also have to realize that RGB binds the three color components together....they cannot be shifted around much in an independent way, not like you can with RAW, without introducing artifacts. At least, not with real-time algorithms. There are other tools, like PixInsight (astrophotography editor) that have significantly more powerful, mathematically intense, and often iterative processes that put most of the tools in something like Lightroom to shame. One example is TGVDenoise...which is capable of pretty much obliterating noise without affecting larger scale structures or stars at all. Problem is, at an ideal iteration count (usually around 500) on a full RGB color image, running TGVDenoise can take several minutes to complete. And that is just one small step in processing a whole image. 

So sure, with the right tools, you can probably do anything with a 16-bit TIFF. It's just that with lower precision but significantly faster algorithms like are often found in standard tools like Lightroom, you either end up with artifacts, or run into limitations with the data or the algorithm that won't let you push the data around as much.


----------



## GMCPhotographics (Jun 23, 2014)

I seem to be one of the few who actually like 22mp as a high standard. I have no particualr use for 36/44/100mp.
I don't particaulrly need the iso noise or the file size. I don't particaulrly need to print large, A1+ is as big as i can go domestically / commercially and I've never sold a print bigger than that.


----------



## wickidwombat (Jun 23, 2014)

9VIII said:


> ScottyP said:
> 
> 
> > I'll toss it out there.......
> ...



LOL
I'm so glad someone else said it
now where's my popcorn...


----------



## stolpe (Jun 23, 2014)

Let's hope they com up with something new this time, I have already bought an Fuji X-T1 to complement my 5D3. I hope that the 7D2 motivates me to keep my lenses and keep on using my Canon gear instead of taking the complete Fuji train instead. Can we see a new mirrorless cameras as well Canon?

/ Stolpe


----------



## Larry (Jun 24, 2014)

jrista said:


> 9VIII said:
> 
> 
> > jrista said:
> ...



It's magic!

When a bunch of tech stuff flies by way above my head, if I try to watch for awhile, (with only my eyes, because my brain cant keep up) ...it turns into a sleeping pill! 

But I go to sleep thinking that it's good to know that SOMEBODY knows these things.


----------



## Quackator (Jun 25, 2014)

Interesting article about new canon patents:
http://www.photographybay.com/2014/06/23/canon-is-developing-a-new-organic-compound-for-sensor-and-optics-technology/


----------



## jrista (Jun 25, 2014)

Quackator said:


> Interesting article about new canon patents:
> http://www.photographybay.com/2014/06/23/canon-is-developing-a-new-organic-compound-for-sensor-and-optics-technology/



The organic compound patent is only patenting the molecule, and electrochromic molecule (molecules or polymers that change color in the presence of an electric current), but the implications are very interesting. It's an adjustable organic filter...which has very high transmittance in one mode, then has lower transmittance in another, the "color" mode. In color mode, it looks like the color would be more blueish or cyan colored, as some light is transmitted, but most IR is blocked. If they could refine the capability...it might lead to adjustable color filters. This was something I hypothesized a couple years ago....sensors with a high refresh rate during exposure, with dynamic color filters. You would get 100% fill factor for all colors, without the need to layer the sensor. That means you get higher sensitivity. 

(Note, this is not what the patent describes...however it is something it could imply Assuming this compound leads to a more controllable transmittance, it might be possible for Canon to create a full-color sensor with a single layer of organic film that changes for red, green, and blue channels during the length of the exposure. You could theoretically even switch the filter to full transparency mode for a "luminance" channel. This would be WAY better than a Foveon-style sensor, as you have problems with noise in the red and a bit in the green channels do to their depth within the silicon. With a dynamic color filter, especially one with high transmittance, you could gather far more light. You could even shorten the sub-exposures in each color channel, expose for longer in luminance to get more detail (for any given duration of exposure...say you choose 1/500th second exposure...the RGB sub-exposures might be 1/3000th of a second long, while the luminance sub-exposure might be 1/1000th long. Amplify, then slightly blur, the color channels...that reduces noise, then integrate with the luminance, which adds back the detail.)

No idea if such a sensor would ever materialized, but it's the ultimate implication of electrochromic compounds.


----------



## Quackator (Jun 25, 2014)

It could as well be a global shutter based on this technology.
No more X-sync barrier, no more rolling shutter problems.


----------



## jrista (Jun 25, 2014)

Quackator said:



> It could as well be a global shutter based on this technology.
> No more X-sync barrier, no more rolling shutter problems.



I don't think so. There are two modes...fully transparent, and "colored". The material does not get completely opaque, it takes on a lumpy curve that peaks near the blue end and ultimately trails off by IR. The material would need to be 100% completely opaque to operate as a global shutter. On the other hand, you wouldn't necessarily want it 100% opaque when using it as a color filter...you would want it to have some kind of response curve that peaks in a given range of wavelengths, and bottoms out in the other wavelengths. Even a standard color filter is not 100% completely opaque to other colors, at the very least there is usually a few percent of red and green getting through a blue filter, a few percent of blue getting through a red filter, etc. As filter material, it sounds pretty amazing. 

As a shutter, I don't know that it's capable of becoming opaque enough...even if Canon ultimately made it 0.1% or even 0.01% opaque...that is still allowing light through. Even if you weren't taking pictures...at that low transmittance level, your actually exposing the sensor...which would ultimately lead to higher levels of noise.


----------



## Diko (Jun 27, 2014)

*7D Mark II: Could it be with a FOVEON?*

Based on *that* I really begin to wonder if under new sensor the talk is actually about *FOVEON*....


----------



## Don Haines (Jun 27, 2014)

jrista said:


> And back to our regularly scheduled programming.
> 
> The 7D II's new technology. What will it be? Here are my thoughts, given past Canon announcements, hints about what the 7D II will be, interviews with Canon uppers, etc.
> 
> ...


I agree with everything... but I would like to make three comments.

It's not very cost effective to just upgrade fabrication by one step.... they are going to have to live with the new facility for a long time. I would expect that they would jump over 180 to 90nm... or even smaller.

And video, I think 4K video on this camera is a flip of the coin.. I wouldn't bet one way or the other, but if they do use C-fast cards the odds of 4K video goes up

And finally, I think it will be a dual processor.... but I am wondering if the time has come for one processor to be optimized for stills and the other processor optimized for video.


----------



## jrista (Jun 27, 2014)

Don Haines said:


> jrista said:
> 
> 
> > And back to our regularly scheduled programming.
> ...



I think that for larger sensors, 180nm is still used by other major manufacturers, like Sony. It's only with the much smaller sensors that have pixels smaller than 2µm that you start seeing transistors smaller. Even a move to 180nm for large (APS-C, FF) sensors would be HUGE. I mean, that's compared to 500nm. It's about 1/3rd the size. If pixel count only goes up to 24mp in the 7D II, a move to 180nm would mean that Q.E. actually increases on a per-pixel basis relative to say 20mp at 500nm. That's how significant it would be. 

I'd have to check, but I would expect that Canon is probably already on a smaller process, 90nm or even 65nm, for the 1/2", 1/3", and smaller sensors. 




Don Haines said:


> And video, I think 4K video on this camera is a flip of the coin.. I wouldn't bet one way or the other, but if they do use C-fast cards the odds of 4K video goes up



Agreed. I really hope they do move to C-Fast, and I also hope they move to USB 3.0. If we see both of those, then 4k should be a given. 



Don Haines said:


> And finally, I think it will be a dual processor.... but I am wondering if the time has come for one processor to be optimized for stills and the other processor optimized for video.



That's an interesting thought. I guess it depends on the frame rate. If they only bump up to 10fps, I think a single DIGIC7 could handle it (easily...with room to spare for a whole ton of other stuff). That would be ESPECIALLY if they move the ADC onto the sensor die and make it column parallel. They do have a patent for CP-ADC with a Dual-Scale Ramp ADC (the dual-scale just allows the ADC to operate at different rates based on some trigger factor...say sensor heat...more heat, higher noise, slower readout, less noise...switch from high speed readout to low speed readout when possible under higher heat, and you could counteract the increase in dark current noise...I have no idea what the trigger factor would be to switch from the higher speed to the lower speed or vice versa in an actual product, though.) Parallelizing ADC and putting that logic on the die also reduces load in the DIGIC itself...it would then solely be responsible for digital pixel processing, in which case a DIGIC 7 that has no ADC units could theoretically have even more processing power than a DIGIC 7 that did include the ADC units. So instead of being 7x faster than a DIGIC 6, it might end up being 12x or 14x faster. 

If you had one of those dedicated to stills/af/metering, and one dedicated to video, you could really do a hell of a lot with the video. Canon should be able to surpass what the Bionz X in the A7s does easily, achieving ultra low noise ISO 400k, maybe even 800k.


----------



## Don Haines (Jun 27, 2014)

jrista said:


> Don Haines said:
> 
> 
> > And finally, I think it will be a dual processor.... but I am wondering if the time has come for one processor to be optimized for stills and the other processor optimized for video.
> ...


Every so often there needs to be a "reset"

It happened in lenses when AF started to come out. Canon took the brave step of completely redesigning the interface to handle the requirements of digital communication between lens and body and we jumped from FD to EOS... this might be the time for a similar jump on the inside of the camera.

15 years ago, the innards of digital cameras were a collection of ic's with specific functions and the communication between them was both complex and at the same time, fairly limited. At the moment, Canon is doing a major redesign to fit DPAF onto it's sensors and presumably, onto a smaller fabrication technique. This gives us four big possibilities on the sensors. First, with less wasted surface area, the amount of captured light goes up and with that, so does the performance. The second one is with smaller fabrication comes smaller transistors and that means shorter electrical paths and that gives us higher speed. Third is that with smaller transistors and lower voltages, we get less heat, and that means lower noise and longer battery life. Fourth is that with smaller transistors the a/d can be integrated into the sensor and that gives us less noise and a cleaner design.

With all the analog confined to the sensor, one can now communicate digitally between the sensor, processor(s), display devices, and storage medium(s). Gigabyte per second transfer rates are easy to achieve.

15 years ago, there was no video on DSLRs.... now it is a standard feature.... yet we have the same general purpose DIGIC doing both..... it might be time for a split into a dedicated stills processor and a dedicated video processor.

This could be the time for a "reset" of the internal architecture of the Canon DSLRs.....


----------



## jrista (Jun 27, 2014)

Don Haines said:


> jrista said:
> 
> 
> > Don Haines said:
> ...



Regarding the shorter electrical paths bit...I think you may be conflating CPUs with sensors. In a CPU, you can shrink the whole package...and yes, that results in shorter distances for electrons to travel. In a sensor, it isn't really the same. There is a very minimal amount of logic that occurs at the pixel...basically amplify. The distance that amplified charge travels is the same for any given sensor size thought...the top row of pixels is going to have to travel the full 24mm height of the light-sensitive area, plus the additional millimeter or so of masked and calibration pixels around the border, plus the extra millimeter to the CDS unit, plus another short distance to some voltage output (or in the case of CP-ADC, to the ADC units). Shrinking the transistor size doesn't really change charge travel distance in CIS devices, since the whole die generally remains the same. Even if you shortened the distances in the logic around the pixel itself, that is such a trivial distance compared to the total readout distances, I don't think it would be meaningful.

As for lower voltages, for cameras used for normal photography, again I don't think the difference in noise from using a lower voltage on the sensor itself is going to matter much. In Canon's current setup, the sensor itself is actually very low noise. Based on Roger Clark's work, some of Canon's more recent sensors have as little as 1.5e- worth of electronic noise introduced by the sensor itself (or maybe it was 1.2e-). When your FWC is around 30ke- for APS-C and around 67ke- to 92ke- for FF, that is effectively meaningless. The primary source of read noise comes from the downstream (and off the sensor die) electronics, the off-sensor bus, secondary downstream amp and the high frequency ADC units. Those suckers are adding up to 35e- or so worth of noise. 

I do believe moving the ADC onto the sensor die is the biggest move that could reduce noise. Eliminating the bus, secondary amp, and massively increasing the parallelism of the on-die ADC units would allow them to operate at a significantly lower frequency. The reduction in operating frequency would have the most significant noise reduction effect. If Canon follows Sony's design, moving the clock and other high frequency components off to a remote corner of the die, away from the ADCs, would practically eliminate the noise they cause.

I also agree that once the analog signal is converted to a digital signal, then you have the freedom to move it around at high speed. You can use error-corrected transfer, and sure, ship the information around at gigahertz error-corrected speeds. 

I think that DIGIC is still handling both still and video because up to date, the DIGIC houses the ADC units. Once the ADC units are on the sensor, then sure, I think it would be a lot easier to have some kind of switching unit on the digital bus between the sensor and the processors, and you could dedicate one to stills processing and another to video processing. However, if you have a sufficiently powerful DSP with enough horsepower, it would probably be more energy efficient to hose both of those dedicated processors in a single unit. Especially of the ENTIRE unit can be dedicated to image processing, since the ADC units wouldn't be taking up die space or power.


----------



## Diko (Sep 8, 2014)

pierlux said:


> I think CR guy knows more than he's reporting. C'mon Craig, tell us something more!



His name is Craig !?! ;D ;D ;D
Anyways.....

;D ;D ;D



x-vision said:


> jrista said:
> 
> 
> > "... an image sensor including _pixels_ _*each having a pair of photoelectric conversion units (photodiodes)*_ capable of outputting the *pair* of image signals obtained by _independently_ receiving a *pair* of light beams...
> ...



Pspehaps the whole mistery behind "subpixel" can be easily explained: *subpixel = photodiode + microlense *

As for 7dm2 iso range of 100-12800 I now remember DPAF. More circuits around the photodiodes for improved DPAF would explain low isk digits 

300nm wafer!? Jrista you are quite optmistic really.

As for 180 process - most probably YES. 1Dx after all is using it already.
QE is my dream.... in time of development of 70 andeven some 80% QE CMOS it is about time to make some advancments... But I am sceptical about it.

Tablet writing sucks!


----------



## ULFULFSEN (Sep 9, 2014)

> These specs are largely identical with the spec list posted by CR, with a few differences. The source told me that:
> •The EOS 7D Mark II sports a 20.2 MP sensor but it is not the same sensor as featured on the EOS 70D
> •The AF has 65 points but only 32 are cross-type. Center point is dual cross. f/8 on center point
> •The EOS 7D Mark II has a “pro build quality” (whatever this means in detail)
> ...


----------



## ULFULFSEN (Sep 9, 2014)

Diko said:


> As for 180 process - most probably YES. 1Dx after all is using it already.



Source?


----------



## jrista (Sep 9, 2014)

Diko said:


> x-vision said:
> 
> 
> > jrista said:
> ...



Umm...where in the world do you get that the 1D X is already using a 180nm process? Of ALL of Canon's sensors, the 1D X needs the smaller process the least. The 70D is the most likely to have moved to a 180nm process, however there is no evidence that such a move actually occurred, only speculation. As far as I know, based on analyses done by Chipworks, the 1D X, 5D III, 6D and all of Canon's older sensors still use a 500nm process.

Regarding the 300mm wafers...every other manufacturer except Canon already uses them. As far as I understand, Canon's 180nm fab uses them already. If Canon is winding down production on their 200mm fab where their larger sensors are being manufactured, to ramp up production of new APS-C and FF sensors on their newer fab...then 300mm wafers with a 180nm process are pretty much a given. There was even a rumor not long ago about Canon's plans to improve their production capacity. I think Don Haines has it nailed: Canon's P&S business has plummeted, meaning they aren't using the full capacity of their newer and more advanced fabs...meaning the capacity that is coming off of P&S cams is available for use by APS-C and FF manufacture. It seems most logical that Canon would aim to cut costs by reducing the number of operating fabs, and in particular by eliminating the far less efficient 200mm wafers. 

As for Canon's DPAF pixel structure...it's two photodiodes embedded within the substrate underneath a SINGLE microlens, a SINGLE color filter, with a single combined set of readout/binning logic that happens to be capable of reading the charge in each photodiode independently and either binning those two charges or not. When a pixel is activated, both of it's photodiodes are read out simultaneouosly...whether they are binned depends on whether the read is an image read or an AF read.


----------



## ULFULFSEN (Sep 9, 2014)

1D X:



> On the process side, the 1D X is remarkable in that Canon continues to stay with the 0.5 µm process generation it has used for every APS-C and FF device analyzed



http://www.chipworks.com/en/technical-competitive-analysis/resources/blog/full-frame-dslr-cameras-canon-stays-the-course/


----------



## neuroanatomist (Sep 9, 2014)

ULFULFSEN said:


> > These specs are largely identical with the spec list posted by CR, with a few differences. The source told me that:
> > •The EOS 7D Mark II sports a 20.2 MP sensor but it is not the same sensor as featured on the EOS 70D
> > •The AF has 65 points but only 32 are cross-type. Center point is dual cross. f/8 on center point
> > •The EOS 7D Mark II has a “pro build quality” (whatever this means in detail)
> > ...



We should know soon. 

I'm not sure I buy the 32 cross-type out of 65 total AF point spec. Given the possible logical layouts of a 65-pt sensor, it's far more likely that a subset of cross-type points would be an odd number, not an even number.


----------



## Lee Jay (Sep 9, 2014)

neuroanatomist said:


> ULFULFSEN said:
> 
> 
> > > These specs are largely identical with the spec list posted by CR, with a few differences. The source told me that:
> ...



Even if you count the center "dual cross" separately?


----------



## Marsu42 (Sep 9, 2014)

ULFULFSEN said:


> The EOS 7D Mark II sports a 20.2 MP sensor but it is not the same sensor as featured on the EOS 70D



Yeah, right, I'm sure it's a _completely different sensor_ that just happens to have the same resolution :->


----------



## neuroanatomist (Sep 9, 2014)

Lee Jay said:


> neuroanatomist said:
> 
> 
> > ULFULFSEN said:
> ...



That's not how AF point count specs are listed, at least by Canon.


----------



## ULFULFSEN (Sep 9, 2014)

Marsu42 said:


> ULFULFSEN said:
> 
> 
> > The EOS 7D Mark II sports a 20.2 MP sensor but it is not the same sensor as featured on the EOS 70D
> ...



i say it´s the middleway.

it´s not the same sensor it´s a 70D sensor but with improved DPAF.


----------



## Lee Jay (Sep 9, 2014)

neuroanatomist said:


> Lee Jay said:
> 
> 
> > neuroanatomist said:
> ...



Yeah, but this is a third-hand rumor, probably translated multiple times from the original ancient Greek!


----------



## Marsu42 (Sep 9, 2014)

ULFULFSEN said:


> it´s not the same sensor it´s a 70D sensor but with improved DPAF.



That's my guess, too - Canon has kept improving it since the introduction in the Rebels and will most likely continue to do so for the 5d4 release.


----------



## NancyP (Sep 9, 2014)

well, they ought to be putting something seriously new in, considering the length of time it has taken to generate the 7D2. Or, did they just run out of old 7D stock recently...


----------



## x-vision (Sep 9, 2014)

> • The EOS 7D Mark II sports a 20.2 MP sensor but it is not the same sensor as featured on the EOS 70D
> http://www.canonwatch.com/another-mention-20mp-sensor-eos-7d-mark-ii/



Well, I really hope so. Otherwise, the 7D2 is DOA for me personally. 

As I said in another thread, Canon will be missing a golden opportunity if the 7DII has the same sensor as the 70D.

With better IQ than the rest of the crop (hehe), the 7DII will be in a class of its own.
But with the same IQ, it would be just a niche camera for a specific group of users. 

My 2c.


----------



## Lurker (Sep 9, 2014)

> Yeah, but this is a third-hand rumor, probably translated multiple times from the _original ancient Greek_!



Even back then they were waiting for a 7D II?


----------



## neuroanatomist (Sep 9, 2014)

Lee Jay said:


> neuroanatomist said:
> 
> 
> > Lee Jay said:
> ...



Yes, but of all things _numbers_ actually translate quite well.

I'm going to go with CR on this one, 65-pt AF, all cross-type. I think it makes sense for Canon to go that route, really pushing the AF and fps specs of the 7DII/X. Consider the 7D vs. the then-current 5DII. The 7D had a (much) better AF and (much) faster frame rate. I think there were quite a few people who owned both the 5DII and the 7D, with the latter being much better suited to action. The 5DIII was a big step up in AF and fps, making it better than the 7D for action from a practical standpoint, so if Canon wants a repeat of the dual FF/APS-C ownership paradigm, they need to set the 7D replacement apart. 65 cross-type points and 10 fps would help do that.


----------



## Don Haines (Sep 9, 2014)

Lurker said:


> > Yeah, but this is a third-hand rumor, probably translated multiple times from the _original ancient Greek_!
> 
> 
> 
> Even back then they were waiting for a 7D II?



I hear that they found rumoured specs for a 7D2 in hieroglyphics in a pyramid in Egypt.........


----------



## jthomson (Sep 9, 2014)

> Yeah, but this is a third-hand rumor, probably translated multiple times from the _original ancient Geek_!




There fixed the typo.


----------



## Lurker (Sep 9, 2014)

Wasn't that the reason the Egyptian empire fell? They were busy looking to the east and waiting instead of watching to the north.


----------



## jrista (Sep 9, 2014)

Don Haines said:


> Lurker said:
> 
> 
> > > Yeah, but this is a third-hand rumor, probably translated multiple times from the _original ancient Greek_!
> ...



I heard they were found in a tablet fragment of early Sumarian Cuneiform...


----------



## Don Haines (Sep 9, 2014)

jrista said:


> Don Haines said:
> 
> 
> > Lurker said:
> ...


 It is so hard to take all these conflicting rumours seriously....


----------



## Lee Jay (Sep 9, 2014)

You know, the A7S sensor numbers on sensorgen (http://sensorgen.info/SonyA7S.html) are quite a revelation. Huge QE (67%), huge well capacity and almost no read noise at high ISO. The read noise at low ISO isn't as good, but still better than anything Canon makes.

As far as sensor technology goes, there's your aim point, Canon.


----------



## jrista (Sep 9, 2014)

Don Haines said:


> jrista said:
> 
> 
> > Don Haines said:
> ...



Indeed. I also heard that the Dead Sea Scrolls contained rumors about the mythical "BigMP" camera...and it's role to play in the end times...


----------



## PureClassA (Sep 10, 2014)

jrista said:


> Don Haines said:
> 
> 
> > jrista said:
> ...



Perhaps Rosetta Stone makes a Rumors to English version that would clear all this up


----------



## cnardo (Sep 10, 2014)

What's the latest thinking on when the Official word" will be released??? 

Photokina starts Tuesday, 9/16 at 8 or 9am so that's 11pm -midnight 9/15 (Monday) West Coast USA time. Will we know anything officially any earlier than that ????

If not, I am just going to go to bed and wait


----------



## jrista (Sep 10, 2014)

cnardo said:


> What's the latest thinking on when the Official word" will be released???
> 
> Photokina starts Tuesday, 9/16 at 8 or 9am so that's 11pm -midnight 9/15 (Monday) West Coast USA time. Will we know anything officially any earlier than that ????
> 
> If not, I am just going to go to bed and wait



From what I gather, Canon is having a press event the night before...so, we won't know anything until the 15th at the earliest.


----------



## Old Sarge (Sep 10, 2014)

jrista said:


> Don Haines said:
> 
> 
> > Lurker said:
> ...



Don't trust that one. I wrote it.


----------



## jrista (Sep 10, 2014)

Old Sarge said:


> jrista said:
> 
> 
> > Don Haines said:
> ...



Damn Sarge, you is OOOOOOLD!


----------



## Woody (Sep 10, 2014)

I seriously hope this bit of rumor about new Canon sensor tech is true.

I am already very disappointed by the so-called Year of the Lens rumor.


----------



## mkabi (Sep 10, 2014)

Woody said:


> I seriously hope this bit of rumor about new Canon sensor tech is true.
> 
> I am already very disappointed by the so-called Year of the Lens rumor.



Whatchyu talking about woody?

It _has_ been the 'Year of the Lens.'
Sigma, Tamron & even Zeiss has produced in total more lenses than we would ever need.
Even Canon put out a few lenses, compared to what...last year?


----------



## Lurker (Sep 10, 2014)

Old Sarge said:


> jrista said:
> 
> 
> > Don Haines said:
> ...



Ok, I knew I shouldn't have fallen for these rumors. 
Nothing about a Canon release has ever been set in stone.


----------



## neuroanatomist (Sep 10, 2014)

Lurker said:


> Nothing about a Canon ... has ever been set in stone.



I beg to differ...


----------



## Woody (Sep 10, 2014)

mkabi said:


> It _has_ been the 'Year of the Lens.'
> Sigma, Tamron & even Zeiss has produced in total more lenses than we would ever need.
> Even Canon put out a few lenses, compared to what...last year?



When Canon Rumors calls 2014 the Year of the Lens, they are refering specifically to Canon lenses. Tamron, Sigma and Zeiss are not in the picture. See this:
http://www.canonrumors.com/forum/index.php?topic=18167.0

"We’ve had a few confirmations that 2014 will be the “year of the lens” for Canon. While Nikon and Sony go into new markets such as full frame mirrorless and retro designs, Canon will apparently stay the course and concentrate on DSLRs and lenses for the EOS lineup.

What should we expect?

Firstly, a new worlds widest full frame zoom lens (nor sure what this could be), a wide angle zoom with IS (17-50 f/4L IS?), a new fast wide angle successor with “new technology” (35 f/1.4?). We can also expect two new tilt-shift lenses, a telephoto zoom successor (100-400?) as well as “budget high quality lenses”.

I’m starting to believe the hype, as the info is coming from known and new sources, and they all seem to be saying the same thing.

More to come."

What we have in 2014 from Canon so far: EF-M 55-200 IS STM, EF-S 10-18 IS STM and EF 16-35 f/4 IS USM... paltry 3 lenses.

In 2013, Canon released 4 lenses: EF-S 55-250 IS STM, EF-M 11-22 STM, EF 200-400 f/4 IS USM, EF-S 18-55 IS STM.

So... no widest lens ever for FF, no fast wide angle successor, no tilt-shift lenses, no telephoto zoom for FF...

For all we know, Canon may very well chicken out after they review the sales data for DSLRs and decide there won't be a 7D successor... LOL


----------



## ULFULFSEN (Sep 10, 2014)

Lee Jay said:


> You know, the A7S sensor numbers on sensorgen (http://sensorgen.info/SonyA7S.html) are quite a revelation. Huge QE (67%), huge well capacity and almost no read noise at high ISO. The read noise at low ISO isn't as good, but still better than anything Canon makes.
> 
> As far as sensor technology goes, there's your aim point, Canon.



maybe today only on geeks forums people complain about canon sensors, as some say.
i don´t know if that´s true.

i have people asking me, if the canon sensors are good because they have "read on the internet..."
these are new user looking for their first DSLR. 

but im pretty sure that at some point it will backfire when canon does not adress this image of "5 year old sensors". 
i know that´s not entirely true, but the 18MP sensor was a bit of a running joke lately.


----------



## Lee Jay (Sep 10, 2014)

I'm interested in high-ISO DR because I fight with that all the time. The A7S sensor beats the 1DX (the best Canon has) by 1 stop at 12,800, 1 1/2 stops at 25,600 and 2 stops at 51,200.


----------



## Diko (Sep 11, 2014)

Marsu42 said:


> Yeah, right, I'm sure it's a _completely different sensor_ that just happens to have the same resolution :->


 Well how about that? And McDonalds and Korea both use red as primary colour ergo McDonalds is a communistic company? ;-)



ULFULFSEN said:


> Diko said:
> 
> 
> > As for 180 process - most probably YES. 1Dx after all is using it already.
> ...


 FALSE ALARM.

Sorry my bad. In 3 am my memory seems to be out of order. I've just checked: 

It's that damn SONIKON D800 and NOT 1Dx :-(((( 

They were in the same table and my imagination shifted the digits.... :-(

I guess everyone have already *read it*...


----------



## LetTheRightLensIn (Sep 11, 2014)

ULFULFSEN said:


> Lee Jay said:
> 
> 
> > You know, the A7S sensor numbers on sensorgen (http://sensorgen.info/SonyA7S.html) are quite a revelation. Huge QE (67%), huge well capacity and almost no read noise at high ISO. The read noise at low ISO isn't as good, but still better than anything Canon makes.
> ...



Canon sensor:





and 100% crops:
http://farm6.staticflickr.com/5561/15017838720_50f7a57b29_o.jpg
https://farm4.staticflickr.com/3874/15017948228_369f7a9829_o.jpg

Now I did manage to tripod a couple different exposures for this one, but with the wind, lots of little stuff, especially the highlighted maple in the center moved around too much, so now it's a long process of just painting in the parts of the stable trunks that I can. As a single shot it could be processed a bit better with a ton of careful selective NR and so on, but it still won't be all that great and that is a lot of time wasted and not on anything remotely fun. Still doable more or less with multiple shots combined in this case but with a lot of wasted time and tedious effort (which I have not yet gotten around to). With Exmor this shot as is would already be just good enough to get by with one shot. And that meant with Exmor it would've been no need for tripod, just snap and done and process. More fun and less time wasted. And in some cases too much is shifting around in too tricky of ways and even painting in for hours wouldn't really work.

So yeah I do hope Canon catches up with the 5D4 at least. I have doubts about the 7D2, although if they did with that it would be, obviously, more than encouraging for the 5D4. Otherwise, I'm sorry but no purchase from me, not this time. I've given them many years now and if they mess up the 5D4 DR then forget it. And that would be a shame since they have a great UI, great lenses, and with ML only, better video. Most of us harping a lot are harping because: 1. we actually like Canon better and would way prefer to be able to stay with Canon 2. the fanboys egg us on and mock everyone and spout nonsense and it either makes people leave and stop posting or become prickly and pushy in return although it might be better to resist


----------



## Marsu42 (Sep 11, 2014)

*Re: 7D Mark II: Could it be with a FOVEON?*



Diko said:


> Marsu42 said:
> 
> 
> > Yeah, right, I'm sure it's a _completely different sensor_ that just happens to have the same resolution :->
> ...



My thinking is that it would really be a sensor tech, Canon would take the opportunity to go to 24mp. This isn't much more in terms of resulting resolution, but it would be important for marketing vs. Nikon/Sony - and Canon seldom neglects that.



LetTheRightLensIn said:


> Now I did manage to tripod a couple different exposures for this one, but with the wind, lots of little stuff, especially the highlighted maple in the center moved around too much, so now it's a long process of just painting in the parts of the stable trunks that I can. As a single shot it could be processed a bit better with a ton of careful selective NR and so on, but it still won't be all that great and that is a lot of time wasted and not on anything remotely fun.



High dynamic range scenes with movement (i.e. you cannot bracket or only with composite shots) is exactly what Magic Lantern's dual_iso module is for - it boosts your dr to nearly 15 stops. Why not use it?


----------

