# A New Cinema EOS DSLR Body in 2013? [CR1]



## Canon Rumors Guy (Dec 27, 2012)

**Not just the EOS-1D C**

There have been a few mentions of a new Cinema EOS DSLR body being announced in 2013. It would take on a smaller form factor than the EOS-1D C and would not shoot 4K video.

It wasn't mentioned which DSLR it would be built around, but I think an APS-C Cinema EOS based around the upcoming EOS 7D replacement would be a good place to start.

The EOS-1D C has yet to reach the retail chain at the time of writing, but it's said to be arriving "very soon".

From a new source, so take this with a grain of salt.

*[Preorder the Canon EOS-1D C at B&H Photo - $11,999](http://www.bhphotovideo.com/c/product/855962-REG/Canon_EOS_1D_C_EOS_1D_C_4K_Cinema.html/bi/2466/kbid/3296)*


----------



## Daniel Flather (Dec 27, 2012)

RAW?


----------



## Lee Jay (Dec 27, 2012)

To be frank, any "cinema" body that doesn't shoot 4K or above should be DOA as of now. The new 4K delivery specs specifically stipulate no up-resing, so if you shoot cinema below 4K, you can never deliver at 4K, which would be horribly stupid on a big-budget enterprise (i.e. one that would use "cinema" cameras and not just DSLRs).

When a $300 GoPro can shoot 4K, there's no longer an excuse for a cinema camera not to.


----------



## StORM48 (Dec 27, 2012)

Cinema this, cinema that... is there any possibility of making a photo camera, for a change?


----------



## AG (Dec 27, 2012)

StORM48 said:


> Cinema this, cinema that... is there any possibility of making a photo camera, for a change?



Yes, you're right. If it weren't for the...

1Dx
5D3
6D
7D
60D
60DE
650D
1100D
EOS M
The G series
and various Point and Shoots

it would ALL be about those 4 damn cinema cameras.

A little perspective goes a long way.


----------



## bp (Dec 27, 2012)

Lee Jay said:


> To be frank, any "cinema" body that doesn't shoot 4K or above should be DOA as of now. The new 4K delivery specs specifically stipulate no up-resing, so if you shoot cinema below 4K, you can never deliver at 4K, which would be horribly stupid on a big-budget enterprise (i.e. one that would use "cinema" cameras and not just DSLRs).
> 
> When a $300 GoPro can shoot 4K, there's no longer an excuse for a cinema camera not to.



The GoPro is a jittery 15fps at 4K (or something like that). No 4K also has me scratching my head a bit on this one. Why bother? 2.5K RAW video and a price tag to match or beat the BMCC, and it could find a fanbase. More low-bitrate 1080p, and yeah, what's the point...


----------



## expatinasia (Dec 28, 2012)

Considering the 1DC is a staggering US$12,000, I can't wait to see what the pricing on a lesser model will be.

Funny, I have used the word staggering in my last two posts here at CR, and both were to do with prices!


----------



## mb66energy (Dec 28, 2012)

My first thought: why yet another SLR-like camera for video? Canon is making its camera family more and more complex.

But: why not? Each artist (or gear buyer...) has their own ideas about their "product" - photographs or movies. Canon does NOT mainstream its product portfolio but offers a vast range of different tools.

If I decide to go more into video (after having time to try it, and a new PC), it might be a good idea to have a video-capable camera which produces GREAT HD video (with per-pixel sharpness) and has good controls for aperture, shutter speed, ISO, etc.

In my experience with my 600D, the full HD resolution gives results which are technically far above movies made with Super 35 cameras. Viewing the smaller HD movies on an XGA beamer (1024 pixels horizontally) gives crisp quality at 2.5 m projection width and 5 meters viewing distance. 4K beamers are far off, and even GOOD HD beamers are in the 5k€ region. So full HD in GOOD QUALITY will be sufficient for nearly all applications where the movie tells us a story.

4K IMO is a new idea to make things incompatible, to sell new gear, and to make things more complicated - with some exceptions. 7 years ago I searched for a large-chip (Super 35mm) video camera with exchangeable optics - I would have had to spend roughly 100,000€. I decided to wait a little and bought a house instead. The 500€ T3i (600D) does everything I needed ... what a sense of freedom in terms of "you can buy valuable tools on a standard income".


----------



## Denisas Pupka (Dec 28, 2012)

Maybe it was a good idea not to hurry with the 6D and to wait for January 8th. Maybe there will be some good surprises.


----------



## Lawliet (Dec 28, 2012)

Lee Jay said:


> (i.e. one that would use "cinema" cameras and not just DSLRs).



Now imagine a cine cam that costs about as much as a normal DSLR. Would that indie/TV/all kinds of limited-budget crew rather use it, or the more photo-oriented body?
OTOH: the savings of picking a smaller cam over a 1Dc buy me how many globe-hours? I'd prefer streamlined logistics and handling.


----------



## Axilrod (Dec 28, 2012)

Lee Jay said:


> When a $300 GoPro can shoot 4K, there's no longer an excuse for a cinema camera not to.



Have you actually seen the 4K on the GoPro? It's garbage and looks nowhere close to even a T2i; they only added it to fluff the specs. And everything is still delivered in 1080p; 4K is still unnecessary at the moment unless you're working on big-budget productions. It sounds like the new Cinema EOS camera could be geared towards prosumers, in which case true 1080p RAW would be fine for me (depending on the price). DSLRs don't even produce true 1080p; they really only resolve somewhere around 900 lines, so true 1080p would be a noticeable improvement.


----------



## Axilrod (Dec 28, 2012)

mb66energy said:


> In my experience with my 600D, the full HD resolution gives results which are technically far above movies made with Super 35 cameras. Viewing the smaller HD movies on an XGA beamer (1024 pixels horizontally) gives crisp quality at 2.5 m projection width and 5 meters viewing distance. 4K beamers are far off, and even GOOD HD beamers are in the 5k€ region. So full HD in GOOD QUALITY will be sufficient for nearly all applications where the movie tells us a story.
> 
> 4K IMO is a new idea to make things incompatible, to sell new gear, and to make things more complicated - with some exceptions.



You think 4K is just to make things incompatible and sell more gear? What about megapixels on cameras, then? Is that all a big scam too? Did you think the same when it went from SD to 720p? Or 720p to 1080p? 4K has a lot of advantages and is the future; it's just following the natural progression of things getting better, like any other product.

And while the 600D may be fine for you, there are plenty of people who want something more. Canon DSLRs aren't even doing real 1080p; there is plenty of room for improvement.


----------



## HurtinMinorKey (Dec 28, 2012)

Lee Jay said:


> To be frank, any "cinema" body that doesn't shoot 4K or above should be DOA as of now. The new 4K delivery specs specifically stipulate no up-resing, so if you shoot cinema below 4K, you can never deliver at 4K, which would be horribly stupid on a big-budget enterprise (i.e. one that would use "cinema" cameras and not just DSLRs).
> 
> When a $300 GoPro can shoot 4K, there's no longer an excuse for a cinema camera not to.



Absolutely disagree. Raw is way more important than 4K. If you want the best IQ, Raw is the way to go. It'll be years before networks demand 4K, so the only thing 4K gets you is the ability to crop in post.

I hope it's a 7D-C, with 12-bit raw, priced to compete directly with the BMC Camera. But it had better have at least 13 stops of DR or it's a non-starter. And sadly I bet it won't, because they have to nerf it pretty badly so it stays "differentiated" from the 1D-C.


----------



## mb66energy (Dec 28, 2012)

Axilrod said:


> mb66energy said:
> 
> 
> > In my experience with my 600D, the full HD resolution gives results which are technically far above movies made with Super 35 cameras. Viewing the smaller HD movies on an XGA beamer (1024 pixels horizontally) gives crisp quality at 2.5 m projection width and 5 meters viewing distance. 4K beamers are far off, and even GOOD HD beamers are in the 5k€ region. So full HD in GOOD QUALITY will be sufficient for nearly all applications where the movie tells us a story.
> ...



1080p is IMO sufficient for 99% of all applications, and if GOOD-quality 1080p beamers are roughly 1000 EUR/$, we will wait another 10 years.

The way from SD via 720p to 1080p made sense. Image width: 4 m, viewing distance: 5 m, pixel size: 2 mm x 2 mm - that works because the eye's resolution is in that region.
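The viewing-geometry arithmetic above can be sanity-checked in a few lines (the figures are the poster's example numbers, not measurements):

```python
import math

# Poster's example: a 4 m wide 1080p image viewed from 5 m.
image_width_m = 4.0
viewing_distance_m = 5.0
pixels_horizontal = 1920  # full HD

# Physical size of one pixel on screen, in millimeters.
pixel_size_mm = image_width_m / pixels_horizontal * 1000

# Angle one pixel subtends at the viewer's eye, in arcminutes.
pixel_angle_arcmin = math.degrees(
    math.atan(pixel_size_mm / 1000 / viewing_distance_m)) * 60

print(f"pixel size: {pixel_size_mm:.1f} mm")         # 2.1 mm
print(f"pixel subtends: {pixel_angle_arcmin:.2f}'")  # 1.43 arcmin
```

Normal visual acuity is usually quoted near 1 arcminute, so at this distance 1080p pixels sit right at the edge of what the eye can resolve, which is the point being made.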

You are right that 1080p footage misses 1080 real lines. IMO it would be a better route to get REAL 1080p before increasing the sheer number of pixels without transporting more information. Nothing against 4K, but it should not be the next step before 1080p is mature.

Resolution for photography is a different thing, because a landscape photo is browsed by our eyes for a longer time, so resolution is important - in movies the camera operator does that for you.

But the 18 MP of the 600D are roughly comparable with the 10 MP of the 40D: limitations of the lenses and especially sensor noise partly cancel the advantage of 80% more pixels, except in situations where you have perfect light, a perfect subject, and perfect aperture values.


----------



## bp (Dec 28, 2012)

HurtinMinorKey said:


> I hope it's a 7D-C, with 12-bit raw, priced to compete directly with the BMC Camera.



Yes. 12-bit raw would have me drooling. Although I'd personally prefer full frame or APS-H, and perhaps 2.5K instead of 1080p.


----------



## Policar (Dec 28, 2012)

mb66energy said:


> 1080p is IMO sufficient for 99% of all applications, and if GOOD-quality 1080p beamers are roughly 1000 EUR/$, we will wait another 10 years.



I agree. 1080p looks surprisingly good, even on a big screen. A whole generation of movies (the vast majority of DIs from the past decade) were done at 2k, or at least with VFX done at 2k (2048x1080 at 1.85:1), so if 4k media requires 4k resolution then we are in trouble. The difference between a 2k and 4k scan is pretty trivial and has more to do with avoiding aliasing (oversampling) than producing significant additional sharpness.

First-generation 35mm prints have significantly more resolution than 1080p video, but the sharpness (area under the MTF curve) is not that different. Toy Story was originally rendered just above 720p. Most theatrical prints have around the same resolution as 720p video after they've played for a little while and on imperfectly calibrated projectors. I don't think 4k content distributors will ignore any movie posted in 2k, so we will get 4k releases of movies posted in 2k the same way we have Blu-rays of 28 Days Later. Remember, Avatar was shot at 1080p.

But 4k is important for marketing because 3D HDTVs failed to catch on. It is also important from the perspective of camera marketing. Red has banked their whole business on it.

And, fwiw, the C300 and C100 shoot at 8MP resolution (4k), then downsample to 1080p. And lenses show their full MTF. So the perceptual sharpness is much higher than a 1080p crop from a still image.

From the perspective of post, both RAW and 4k are a pain in the ass. The Alexa (which has an image sharper than 35mm when shot at 2.5k RAW, and very close at 1080p) is so popular because of this. It's what's used on most new TV productions, and its image is generally more pleasing than the Red's. It's easier to work with, too. Why do you need better than that? Are your videos going theatrical? Are your clients screening at 4k? We don't even have affordable 4k monitors to post on. There will be a significant market for 4k video some day soon, but until then "true" 1080p is extremely sharp, and a $6000 C100 is cheap enough that replacing it in five years (without replacing any lenses) won't break the bank.

RAW does have some value. Even Canon's "C" line has poor dynamic range. But the Alexa has the same latitude in arriraw as it does in prores, and its prores444 can handle crazy grading despite only being 10-bit. The video out of Canon's dSLRs is relatively poor compared with true high-end video. Even very good 1080p is a huge step up. Don't think "1080p" and associate that with the mediocre (but impressive for the money) video from the 5D Mark III.

Personally, I'd rather not have RAW and 4k. Too much work in post for insignificant advantages if you shoot half competently in the first place. I do wish Canon could manage better dynamic range and 10-bit HDMI out. I'm considering buying a C100, but a 7DC or somesuch could prove even more compelling just by virtue of likely being half the price and maybe having a more robust codec.

Those who need 4k and RAW can easily buy a Scarlet at a very good price.


----------



## Lee Jay (Dec 29, 2012)

Policar said:


> I don't think 4k content distributors will ignore any movie posted in 2k so we will get 4k releases of movies posted in 2k the same way we have blu rays of 28 Days Later.



http://www.reduser.net/forum/printthread.php?t=88034&pp=10

"There is a new standard from Japan (not exactly sure why they get to call the shots) for consumer 4K . It dictates that you can't up-rez to 4K.

It means that features and TV shows shot on 1080P or 2K are destined to be left out of a second bite of the apple for a 4K delivery opportunity."


----------



## HurtinMinorKey (Dec 29, 2012)

Policar said:


> From the perspective of post, both RAW and 4k are a pain in the ass. The Alexa (which has an image sharper than 35mm when shot at 2.5k RAW, and very close at 1080p) is so popular because of this. It's what's used on most new TV productions, and its image is generally more pleasing than the Red's. It's easier to work with, too. Why do you need better than that? Are your videos going theatrical? Are your clients screening at 4k? We don't even have affordable 4k monitors to post on. There will be a significant market for 4k video some day soon, but until then "true" 1080p is extremely sharp, and a $6000 C100 is cheap enough that replacing it in five years (without replacing any lenses) won't break the bank.



I think your perception of the advantages of Raw is all wrong. In a way, it's easier for big-budget productions to get away with not shooting raw, because they can light everything very well on set. For the indie guy, Raw affords the artist a lot of latitude to nail the proper exposure and WB in post. Just like film, there is more detail to recover in the shadows and highlights using RAW.

In short, Raw offers the same advantages for cinema as it does for photography, and these advantages are far from trivial.

Now, I agree that it can be a pain in the ass in post, but if you get the right workstation it's easily managed. And within a couple of years the firepower needed to handle the Raw workflow (at least in 1080) will become standard issue.


----------



## Policar (Dec 29, 2012)

HurtinMinorKey said:


> Policar said:
> 
> 
> > From the perspective of post, both RAW and 4k are a pain in the ass. The Alexa (which has an image sharper than 35mm when shot at 2.5k RAW, and very close at 1080p) is so popular because of this. It's what's used on most new TV productions, and its image is generally more pleasing than the Red's. It's easier to work with, too. Why do you need better than that? Are your videos going theatrical? Are your clients screening at 4k? We don't even have affordable 4k monitors to post on. There will be a significant market for 4k video some day soon, but until then "true" 1080p is extremely sharp, and a $6000 C100 is cheap enough that replacing it in five years (without replacing any lenses) won't break the bank.
> ...



What leads you to believe any of that? The Alexa has far better latitude than the Epic and is MUCH easier to light for; DPs I've worked with who've worked with both always say as much, and I've been on set with the two as A/B cams and the difference in highlight rendering is very dramatic in the Alexa's favor, despite it recording Prores 444 (at 10 bit) and not 14 bit RAW. This is because the Alexa has two gain paths and it merges a tremendous amount of information into one high quality, very gradeable flat image. I've posted extensively on both Alexa and Red (for TV, in 1080p) and the Alexa footage is much more flexible with better latitude. The only situation in which RAW offered me an advantage was when white balance was totally off, but that's a sign of an incompetent DP.

What experience has led you to believe what you've written? Because it contradicts what everyone else who has used those two cameras has found. The footage coming out of the Alexa looks nothing like your dSLR jpegs. It's flat and in a true log colorspace and has more latitude by far than any other video or still system. 

Furthermore, 1080p raw would be a terrible format to work in; the "real" resolution would be two-thirds of that at best. And 4k raw downscaled to 1080p takes all the horsepower of debayering and then the horsepower of downscaling, for a deliverable that will ultimately not be much sharper than "true" 1080p (like out of the C100 and C300) in the first place. Those are "4k" bayer cameras; they just do an immediate conversion to "true" 1080p; unfortunately they have fake log curves and poor dynamic range. Fwiw, I've rendered hours of Red footage (both in 4k and quad HD) on high-end Mac Pro workstations maxed out with RAM, and it takes days and days without Red's $5,000 proprietary card. With CUDA you could surely get real-time 1080p raw, but that's got the equivalent sharpness of 720p at best. You're talking out your ass about this stuff, frankly. Some day something similar will be here, but it's not as close as we think, and in that time you can recoup the low cost of a camera system purchase (spend more money on the lenses than the camera).

Rent an Alexa. Rent an Epic. Shoot difficult footage side by side. Post side by side. Then get back to me on how much inherently better RAW is than Prores.


----------



## HurtinMinorKey (Dec 29, 2012)

^I think you're conflating sensor quality with codec quality. To make a long story short, if everything you said about raw were true, then photographers wouldn't bother with raw either.

And as far as the advantages of downscaling go, look at the footage coming off the BMCC that's been properly graded. It's easily as good as the C300 (8MP vs. 2.5MP). Downscaling helps reduce some artifacts, but it doesn't necessarily make it sharper.


----------



## Policar (Dec 30, 2012)

HurtinMinorKey said:


> ^I think you're conflating sensor quality with codec quality. To make a long story short, if everything you said about raw were true, then photographers wouldn't bother with raw either.
> 
> And as far as the advantages of downscaling go, look at the footage coming off the BMCC that's been properly graded. It's easily as good as the C300 (8MP vs. 2.5MP). Downscaling helps reduce some artifacts, but it doesn't necessarily make it sharper.



Yes, it does. The Nyquist sampling theorem states that if a system samples at a given frequency, then it can record half that frequency accurately without aliasing. So a 4k sensor array can resolve a perfect 2k image without aliasing. But most people don't care about a little aliasing, which is why, as you point out, the BMCC can resolve 1080p nearly as well as the C300. A Bayer chip can resolve 70% or more of its stated resolution if you don't mind aliasing. The BMCC is reported to exhibit pretty bad aliasing. (Even the Alexa, a 2.5k array, has a bit of aliasing, but it's really minor.)
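That sampling claim is easy to check numerically. A one-dimensional sketch, treating one scanline of a hypothetical 4000-photosite sensor as a sampler (and ignoring Bayer color filtering entirely):

```python
import numpy as np

fs = 4000  # samples across the frame: one scanline of a "4k" sensor
x = np.arange(fs) / fs

def dominant_cycles(freq):
    """Sample a sinusoid of `freq` cycles/frame and report where the
    energy lands after sampling (peak bin of the FFT magnitude)."""
    signal = np.sin(2 * np.pi * freq * x)
    spectrum = np.abs(np.fft.rfft(signal))
    return int(np.argmax(spectrum))

# Below Nyquist (fs/2 = 2000 cycles): detail is recorded faithfully.
print(dominant_cycles(1500))  # 1500

# Above Nyquist: the detail doesn't vanish, it aliases to a false
# lower frequency (4000 - 2600 = 1400), which is what OLPFs prevent.
print(dominant_cycles(2600))  # 1400
```

This is the same reason a 4k array can deliver a clean 2k image while a camera that tries to squeeze near-sensor-resolution detail out (like the BMCC reportedly does) shows aliasing artifacts.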

When you work in Photoshop, you're not working in RAW. Your end product, even if it's a TIFF that's going to print, isn't RAW. The advantage of a RAW workflow is flexibility at the cost of speed, not so much image quality. The data will be processed and turned into RGB (or whatever your colorspace) pixels eventually. But in theory, RAW will always be the most flexible format. I think we can agree on that. Flexible does not mean best.

If I sacrifice a tiny bit of flexibility with the Alexa by shooting prores instead of arriraw, I get insignificantly less resolution, a codec that actually edits in real time, and the same latitude as I had before. A competent shooter handing footage to a competent post house will produce the exact same image on that camera whether using prores or RAW (totally impossible to differentiate), assuming the exhibition medium is TV. For theatrical 4k projection, RAW will offer a very small amount more resolution (but not a lot; Arri has just added 2k prores video so theatrical shooters don't have to bother with what a pain in the ass arriraw is). Only the most inept shooter, totally bungling white balance and exposure, will notice any significant difference. Or the most brilliant shooter: Deakins claimed he noticed a slight, nearly imperceptible resolution edge when screening Skyfall tests shot on arriraw and upscaled for IMAX, though no difference in DR as compared with prores. Fwiw, it all looked much better than Red's 5k, used on some additional photography.

The bigger difference is that prores files will be vastly faster to process. High-end video doesn't look like a compressed JPEG (which is designed for delivery, not as an intermediate format); it's very flat and much less compressed. Some day, when computers are way faster, there may be an advantage to RAW video that outweighs the disadvantage in speed that comes from working with it, but how much Alexa footage is shot in arriraw vs prores? Very, very little. Game of Thrones looks fine to me. In Time looked fine, and that was recorded in uncompressed HD video, not arriraw. Red forces you to use RAW, but most producers wish it didn't.

Furthermore, it's not JUST sensor quality. The Alexa has a fine sensor, but it's the video processing (colors that emulate 5219 stock extremely accurately, high- and low-gain path merging, how the prores codec is implemented) that matters. And a poor RAW developer can be problematic (just look at how many permutations redcine has been through to arrive at its current implementation, which produces rather boring colors); not everything is as good as Adobe's plug-in, which is slow. Arri's in-camera software is better than any results you can get out of Red's RAW developer. You need a Pablo or Da Vinci system to get a decent look out of that camera, and even then it's not ideal.

Have you used any of these cameras or are you just reading specs online? Granted the one camera you mention that I haven't used (the BMCC) does seem to have the best IQ for the money by far. So if that's your only concern, go ahead and buy one. Canon isn't catering to the testbed-in-a-box Frankenstein's camera market (not necessarily a bad thing, just a pain in the ass for both shooters and in post). Black Magic and Red have that covered.


----------



## HurtinMinorKey (Dec 30, 2012)

I really appreciate your opinion on this; clearly you have much more hands-on experience than I do. My feelings about raw are based solely on what I've seen, my experience as a photographer, and my understanding of the theory.

A nice flat profile is good, but at the end of the day raw retains a certain amount of shadow and highlight detail that will be unrecoverable with a compressed codec. Of course you don't deliver in Raw, but the fact that it gives you more degrees of freedom in post is what makes the final IQ better, no?
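A toy numerical sketch of that point, under heavy simplifying assumptions (pure linear quantization, no log curve, no codec compression - so it illustrates bit depth only, not any real camera's pipeline):

```python
import numpy as np

# Ideal linear scene luminance, normalized to [0, 1].
scene = np.linspace(0.0, 1.0, 100_000)

def shadow_steps(bits, gain=16):
    """Quantize the scene at `bits` of linear precision, then apply a
    heavy shadow push in post (gain=16 is a 4-stop lift) and count how
    many distinct tonal levels survive in the lifted shadow region."""
    levels = 2 ** bits - 1
    captured = np.round(scene * levels) / levels  # quantize at capture
    pushed = np.clip(captured * gain, 0.0, 1.0)   # grade: lift shadows
    shadows = pushed[scene < 1 / gain]            # the region we lifted
    return len(np.unique(shadows))

print(shadow_steps(8))   # few distinct levels -> visible banding
print(shadow_steps(14))  # far more levels -> smooth gradation
```

More capture bits means the lifted shadows still contain many distinct steps instead of a handful of posterized bands, which is the "degrees of freedom in post" argument in miniature. Real codecs complicate this (log curves allocate bits non-linearly, and compression discards detail on top), so treat it as an illustration, not a camera comparison.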


----------



## Policar (Dec 30, 2012)

I pushed and pulled at Alexa footage, but I couldn't get macroblocking or posterization or over-saturation -- or anything. I'm sure there's a difference compared with RAW, especially with regard to resolution -- after all Avengers and Skyfall were shot on arriraw but for 4k exhibition -- but it's trivial for tv, and prores is very fast and easy to work with and the baked in colors look beautiful. Alexa footage looks just like carefully developed 5219 film, down to the halation filter and "glow," except the grain structure is a bit different and it's much cleaner. Red has less latitude, despite being RAW, unless you enable one of the HDR modes.

It seems Canon's got a stop more highlights in c log on the c300 than in jpegs from their dSLRs, so that's a step in the right direction, but it's not nearly enough, not even close to the Alexa from what I've seen. I'm on board with what RAW promises (more highlight detail and a better image), but I'm just too lazy to make my computer do all the work when the camera can do it and get 99% as good a result. Out-of-camera JPEGs are a delivery format. Log on the Alexa and F3 is an intermediate format, meant for flexibility. Unfortunately only the Alexa (and maybe the F3 with s log and an external recorder) offers this combination of ease of use and IQ. Otherwise you have to pick your poison, and Canon's poison is reduced latitude, but still better than its dSLRs, it seems. I personally dislike how Red footage looks, but it's technically excellent and can look good with enough work in post. Prometheus looked amazing - the c300 won't do that! But it's for a different market, and the Scarlet and BMCC are already price-competitive.

Also remember that even redcode is compressed and a very detailed scene can induce artifacting with it.

Fwiw, I shoot stills in RAW, but I'm a bad photographer for the most part! It's just an indulgence since I like playing in photoshop and there's a lot more highlight detail there. For short form content I can see the Scarlet, for instance, being a lot of fun. For a feature, I'd dread the workflow.


----------



## Videoshooter (Jan 4, 2013)

If it's not going to have 4K then it better have some other benefits!

RAW is not a necessity for me, but it would be very nice to have on certain occasions. ProRes is no good for me and anybody else working on a PC, but some other (proprietary?) high-bitrate intraframe codec would be nice.

2.5K would be good, as it would downscale to produce very sharp 1080p. True 1080p would be very nice, and to achieve that you'd most likely need a sensor designed from the ground up with video as a primary consideration.

Canon has been very slow to add fast frame rates to any of their cameras (not a single camera with 720p120, compared to Sony & Panasonic, who have several), but if it lacks 4K, maybe Canon will adorn it with higher frame rates instead? 1080p60 is a must, 1080p96 would be delightful, and 720p120 (with proper 720p, not the soft mush most Canon DSLRs produce) would put it on par with the Scarlet in terms of higher frame rates.

Of course, the 30-minute limit would need to be removed (very annoying for interviews), and it should include some video-specific features such as zebras, waveform, peaking, and zoom while recording for focus confirmation, as well as audio levels, manual audio control, headphones, etc., as on some of the newer models.

If Canon were able to deliver all this in a 7D/5D-style body, I'd gladly pay $6000+ for it.


----------



## ddashti (Jan 5, 2013)

I wonder if this type of cinema DSLR is really needed...


----------



## Videoshooter (Jan 7, 2013)

Well, it would be a whole lot more useful than the C100.


----------



## HurtinMinorKey (Jan 7, 2013)

ddashti said:


> I wonder if this type of cinema DSLR is really needed...



Canon has nothing that competes with the BMC Camera. Your options for raw video with Canon are the C500 or nothing.


----------

