# Do you have a 4K display?



## StudentOfLight (Dec 18, 2013)

I just want to get an idea of how widespread 4K displays are at the moment. 

Do you think 4K video formats will become more widespread with the upcoming generation of DSLR bodies, or do you think manufacturers may only include it in top-of-the-line models (EOS 1D C) or more dedicated video cameras (e.g. the EOS C line)?

Lastly, do you think the Canon EOS C line will formally expand with a product like, for example, an EOS 7Dc?


----------



## Mt Spokane Photography (Dec 18, 2013)

If it sells cameras, it will come. There is a lot of discussion about the visual benefits, but that's not relevant; profit is what drives new technology.


----------



## 9VIII (Dec 18, 2013)

I actually think that cameras have in part driven 4K adoption. We've been seeing new 4K recording devices coming out of the woodwork this year. Even the Galaxy Note 3 can record 4K.
If I had a graphics card with DisplayPort I would have already ordered the new Dell UP2414Q, but since I'm due for a system overhaul next winter, I'll wait until then.


----------



## dolina (Dec 18, 2013)

If I were to get married today I would insist on my ceremony being recorded in 4K resolution.

I am currently using a 3K resolution (2560x1440) display to type this post, and I look forward to picking up a 4K display when computer displays drop below $1000. Ideally it would be 31.5 inches or wider. The Sharp PN-K321 today sells for $3,299.

As for 4K UHDTVs, I see myself picking one up when downloadable content is available in 4K, or when there is a data storage format that supersedes the 2K Blu-ray Disc.

Another possible condition of my getting a 4K UHDTV is when Sony & Microsoft release their "slim model" of the PS4 & Xbox One in, say, 4-6 years.

I am one of the few guys who aren't really interested in getting a video console this soon. During the last console wars I waited until the first price cut to get one, the reason being that in the first year of a console's life the games tend to be half-baked.


----------



## expatinasia (Dec 18, 2013)

I remember someone asked me a few years ago why I was recording all my videos in 1080, and I replied that technology only goes in one direction. At that time broadband speeds were still quite slow and 720 was more popular on the net, yet just a short time later we are talking about 4K.

I agree with Dolina that if it were an important event like a wedding I would, if possible, like it recorded in 4K to future-proof it as much as possible, but I think mainstream 4K will still take a while to catch up.

A lot has to do with the TV companies, some countries are faster than others to deliver full HD TV. I know that the Full HD TV channels I have look great on my big TV but the rest of the channels look bad. 

I could even see myself skipping 4K and going for whatever is after it.


----------



## rs (Dec 18, 2013)

dolina said:


> If I were to get married today I would insist on my ceremony being recorded in 4K resolution.
> 
> I am currently using a 3K resolution (2560x1440) display to type this post, and I look forward to picking up a 4K display when computer displays drop below $1000. Ideally it would be 31.5 inches or wider. The Sharp PN-K321 today sells for $3,299.
> 
> ...


http://www.macrumors.com/2013/12/02/24-inch-4k-display-from-dell-priced-at-1399-28-inch-4k-model-coming-at-under-1000/


----------



## StudentOfLight (Dec 18, 2013)

In terms of the practicality of working with such high resolution... 4K RAW is quite data-intensive in terms of storage and read/write speed. The H.265 codec will hopefully be released soon, and it promises a useful improvement over H.264 that will help reduce storage requirements for compressed UHDV, but compressed video is not as edit-friendly. Are there any new storage-media developments that will make working with uncompressed UHDV more tolerable? At the moment it seems very much like a pain in both the neck and the pocket. :-\
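To put rough numbers on that, here is a back-of-the-envelope sketch, assuming (purely for illustration) 12-bit sensor data at UHD resolution and 24 fps; real RAW formats add packing, headers, and audio overhead:

```python
def raw_data_rate(width, height, bit_depth, fps):
    """Bytes per second of uncompressed single-sensor RAW video
    (one sample per photosite, before demosaicing; no overhead)."""
    bytes_per_frame = width * height * bit_depth / 8
    return bytes_per_frame * fps

rate = raw_data_rate(3840, 2160, 12, 24)
print(f"{rate / 1e6:.0f} MB/s, {rate * 60 / 1e9:.1f} GB per minute")
# → 299 MB/s, 17.9 GB per minute
```

Even this conservative estimate outruns the sustained write speed of most current storage, which is why compressed intermediates stay attractive despite being less edit-friendly.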


----------



## ajfotofilmagem (Dec 18, 2013)

I prefer a full HD monitor with wide dynamic range and good color reproduction. I do not want to spend $15,000 for a 4K monitor. That seems like a real need only for Hollywood filmmakers, or pixel peepers.


----------



## dolina (Dec 18, 2013)

I limit myself to 4K resolution, as 8K resolution is not really commercially available.

Similar to expatinasia, over the past 10 years I have told friends who were getting married to ask the video people to store their 1080p video on an HDD instead of down-converting it to DVD.

If it were economical I'd go with 35mm film instead and scan it into 8K resolution later.

The typical consumer upgrade cycle for TVs is 7-8 years. I have a 2006 32-inch 720p HDTV, so that makes me a prime candidate to replace it. But it still works. 

Cable TV in the Philippines has offered 720p since 2009, so I do not see a purpose in upgrading to 4K resolution at the moment. Do not worry, I also have a much newer 40-inch and a 46-inch 1080p HDTV, so we aren't totally backwards here. 


expatinasia said:


> I remember someone asked me a few years ago why I was recording all my videos in 1080, and I replied that technology only goes in one direction. At that time broadband speeds were still quite slow and 720 was more popular on the net, yet just a short time later we are talking about 4K.
> 
> I agree with Dolina that if it were an important event like a wedding I would, if possible, like it recorded in 4K to future-proof it as much as possible, but I think mainstream 4K will still take a while to catch up.
> 
> ...



2014's sub-$1000 4K displays are most probably using lower-end panels, and I did mention that I want a display larger than 31.5 inches, correct? At the current ppi of the 27-inch iMac, a 4K display would need to be 46 inches wide.

I'm after quality. If I weren't, then Amazon is selling a 50-inch Seiki 4K UHDTV for less than $770.



rs said:


> http://www.macrumors.com/2013/12/02/24-inch-4k-display-from-dell-priced-at-1399-28-inch-4k-model-coming-at-under-1000/


----------



## dolina (Dec 18, 2013)

Hopefully by the time the "slim" Xbox One and PS4 come out, quality 4K UHDTVs will sell for under $2000. Maybe by then these "slim" updates will come with an optical disc drive that accepts 4K content.

Other than resolution, the other motivations for me to upgrade would be weight and power consumption. Power consumption is pretty much self-explanatory, as the price per watt only ever goes up, but weight? It has been my dream to mount a display on the ceiling above my head. If the display is almost as light as an acoustic board, then it is possible to do.

My dentist wanted to do that with his HDTV in his office so his patients can watch TV while he mucks around in their mouth but the contractor forbade it.


----------



## JonAustin (Dec 18, 2013)

I don't do video, so I'm not concerned about capturing or displaying it in any particular resolution. And even though we have a couple of Blu-ray players in the house (for their web apps and DLNA capabilities), we don't own or rent any Blu-ray discs; the up-scaling of DVD content is more than good enough for us.

I may look into 4K displays for my digital darkroom work, if they've become more mainstream and affordable by the time I need to replace my current displays, if ever (they're all less than 2 years old). But in the meantime, my Dell Ultrasharp 1920x1200 displays are very much up to the task for the foreseeable future.


----------



## RLPhoto (Dec 18, 2013)

No and don't plan to upgrade anytime soon.


----------



## jeffa4444 (Dec 18, 2013)

I work for a very large US motion picture rental company which is global so lets talk 4K. 

Canon, Sony, Red (they also have 5K) and Blackmagic make 4K cameras, and they will be joined by Phantom in the new year. The most popular camera in Hollywood is the Arri Alexa; it's a 3.5K camera that outputs 2K, and most movies in theaters are 2K, NOT 4K. The 4K cameras don't actually output 4K, and most don't employ lossless compression, but that's another story. The real issue is that to see 4K on a 4K TV you need to sit considerably closer than with 2K (1080p/i) at the same screen size, and the majority of broadcast content is NOT 4K but 2K, even if it was shot on a 4K camera. 8K cameras are in the pipeline; theoretically you would need to be even closer to the screen to get the benefit. So is this a case of technology over common sense? Yes and no. Down-sampling from, say, an 8K or 4K file would give cleaner 2K images after compression, allowing for concatenation through the broadcast pipeline, but a pure 4K file will still require a closer viewing distance for a given screen size to get the benefit. The same applies to a movie theater.
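The viewing-distance point can be made concrete with a rough sketch, assuming a 16:9 panel and the common 1-arcminute figure for 20/20 acuity (real perception of jaggies and contrast is more complicated than this single number):

```python
import math

ARCMIN = math.radians(1 / 60)  # one arcminute in radians

def max_distance_m(diag_in, h_pixels, aspect=(16, 9)):
    """Farthest distance (metres) at which one pixel still subtends
    one arcminute, the nominal limit of 20/20 acuity."""
    aw, ah = aspect
    width_in = diag_in * aw / math.hypot(aw, ah)
    pitch_m = width_in * 0.0254 / h_pixels  # pixel pitch in metres
    return pitch_m / math.tan(ARCMIN)

for label, px in (("1080p", 1920), ("4K UHD", 3840)):
    d = max_distance_m(55, px)
    print(f'55" {label}: pixels blend beyond ~{d:.2f} m ({d / 0.3048:.1f} ft)')
# → 55" 1080p: pixels blend beyond ~2.18 m (7.2 ft)
# → 55" 4K UHD: pixels blend beyond ~1.09 m (3.6 ft)
```

By this crude measure, a 55" 4K set only shows its extra pixels from roughly half the distance at which a 1080p set of the same size already looks smooth.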


----------



## 9VIII (Dec 18, 2013)

20/20 vision is by definition average. I'm a little better than that, but when I say that the difference between 1080p and 4K is blatantly obvious, I have confidence that it will be just as obvious for the average person reading this.

In 5-10 years when we start talking about 8K (hopefully sooner than later), it will be the same discussion all over again, and I will tell you that the difference between 4K and 8K, at the same distance you use your 1080p TV right now, will be blatantly obvious (as long as the content you're looking at is actually good quality).

Beyond 8K things will get a little fuzzy, but by my own measurements I could potentially use a 30" desktop monitor (3 foot viewing distance) with as high as 16,000x8,000 resolution before individual pixels start blending in with the inherent signal noise in my eyes. 
I think 8K would be a good resolution for industry to stick at, with maybe the odd 64 or 128 Megapixel screen made for special people like me.


----------



## 9VIII (Dec 18, 2013)

Another clarification should be made about the type of content being viewed. There are ideal situations for taking advantage of extra detail and not-so-ideal situations. You will see extra detail best in high-contrast still images. Conversely, low-contrast video with lots of motion is sometimes so bad the whole thing almost looks washed out. That, it seems to me, is where you get people saying they can't see the benefit of higher resolutions. If you're looking at an image already devoid of detail, then of course it's not going to look any better.
It may be that much of the content the average person looks at doesn't contain a whole lot of extra visual information, but that certainly doesn't mean that people have to get bigger TVs or sit closer to their screens in order to take advantage of extra detail when it is present.
I have also encountered people who can't see the difference due to sheer ignorance. One time I tried to point out all the jaggies on screen to a friend of mine. His response was that he didn't know what they were, so they didn't bother him.


----------



## RLPhoto (Dec 18, 2013)

I have a 4K video camera and It's pretty sharp for what it is.


----------



## ajfotofilmagem (Dec 18, 2013)

Just as a 36-megapixel picture with coarse compression does not look better than an 8-megapixel one with fine compression, video resolution is less important than the codec used for the video. Currently H.264 suffers very significant quality losses during the editing process. Yes, there are other video codecs that preserve more image quality, but let's be honest: who would be willing to record 4K video generating files of 5 gigabytes per minute?
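For reference, the quoted figure (the poster's own estimate) converts to a bitrate like this; Blu-ray video, by comparison, tops out around 40 Mbit/s:

```python
def gb_per_min_to_mbps(gb_per_min):
    """Convert a storage rate in GB per minute to megabits per second."""
    return gb_per_min * 8_000 / 60  # 1 GB = 8,000 megabits

print(f"{gb_per_min_to_mbps(5):.0f} Mbit/s")
# → 667 Mbit/s
```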


----------



## dolina (Dec 18, 2013)

9VIII said:


> I have also encountered people who can't see the difference due to sheer ignorance. One time I tried to point out all the jaggies on screen to a friend of mine. His response was that he didn't know what they are so it didn't bother him.


Reminds me of people who are so used to mediocre food that they can't appreciate good food when fed it.

Cannot relate.


----------



## danski0224 (Dec 18, 2013)

9VIII said:


> 20/20 vision is by definition average. I'm a little better than that, but when I say that the difference between 1080p and 4K is blatantly obvious, I have confidence that it will be just as obvious for the average person reading this.




I also find the difference to be blatantly obvious.

I checked out the 4k TV display at an electronics store, and I was floored.

Unfortunately, at least for upgrades, my current set still works (and I hope it keeps working for a while).

The next one will have to be bigger... 63" isn't big enough. So glad I didn't buy a smaller set.


----------



## Mt Spokane Photography (Dec 19, 2013)

danski0224 said:


> 9VIII said:
> 
> 
> > 20/20 vision is by definition average. I'm a little better than that, but when I say that the difference between 1080p and 4K is blatantly obvious, I have confidence that it will be just as obvious for the average person reading this.
> ...


 
Viewers can't tell the difference at normal viewing distances; you have to be close, like 5 ft or less. That's why video stores arrange them so that you will be close to the screen. At 10 ft, it makes no difference.

http://www.displaymate.com/news.html#7


----------



## Fleetie (Dec 19, 2013)

Mt Spokane Photography said:


> danski0224 said:
> 
> 
> > 9VIII said:
> ...




But, but 9VIII is a special person who can resolve 0.05mm from 3ft away. So it must make a difference to him.


----------



## 9VIII (Dec 19, 2013)

Mt Spokane Photography said:


> danski0224 said:
> 
> 
> > 9VIII said:
> ...



I'm betting the numbers those people use came from tests that do not represent what is possible with a computer monitor.
However they came up with those results, they're wrong.

It's easy to test the limits for yourself. Open a paint program and draw a straight line at a slight angle (make sure the program does not apply smoothing; look really closely, or use a magnifying glass, to ensure that transitions from one column of pixels to another happen without half-shaded pixels). See the jaggies. Now back away from your screen until the line blurs smooth. Note that a line at a 45-degree angle will look smooth sooner than one just a few degrees off vertical.
In order for a screen to "look" perfectly smooth, I should not be able to see any stepping on a line at any angle.
On my 100 PPI laptop screen I have to stand 9 feet away before jagged edges start to blur. At that distance I should be using a 45" 4,000x2,000 screen. For a 60" TV I would want 5,400x2,700 resolution.
That's a _minimum_ number; higher would be better to give margin for error.
But wait, that's not actually the limit of what I can see. If I place one white dot (ensuring it's a single RGB cluster) on a black background, in a dark room, I can still see it from 18 feet away. At 20 feet it blurs in with the image noise in my eyes. For a display to perfectly reproduce the image that I see when I look at something, it's going to have to match that level of detail. That would be 200 PPI at 9 feet, or a 60" 10,800x5,400 screen.
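As a sketch of the arithmetic behind those figures (assuming a 16:9 panel; the 100 PPI screen and 9-foot distance are the poster's own measurements, and holding that pixel density at that distance fixes the PPI any larger screen would need):

```python
import math

def pixels_for_screen(diag_in, ppi, aspect=(16, 9)):
    """Horizontal and vertical pixel counts for a screen of the given
    diagonal at a target pixels-per-inch density."""
    aw, ah = aspect
    diag_units = math.hypot(aw, ah)
    width_in = diag_in * aw / diag_units
    height_in = diag_in * ah / diag_units
    return round(width_in * ppi), round(height_in * ppi)

print(pixels_for_screen(45, 100))
# → (3922, 2206), in the ballpark of the quoted 4,000x2,000
```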

Some might say that those numbers are unreasonable. As noted in the Displaymate article (and by me earlier), detail is as much about contrast as resolution. How often you would be able to take advantage of that level of detail depends largely on what type of content you're consuming. Games in particular are very good at producing high contrast imagery, and the amount of detail in digitally produced content is inherently tied to your display. If we have cameras that produce images of similar resolution as well I don't see a reason not to use matching monitors. For pictures and games I'll take all I can get. Movies, as noted, tend to look terrible to begin with. That application probably wouldn't be as demanding.
For all the practical reasons you normally hear people whine about, 8K sounds like a good sticking point until we figure out better ways to shoot images into your brain.


----------



## Hannes (Dec 19, 2013)

For my TV I don't mind if it stays 1080p for a bit longer, but for my computer screen I wouldn't say no to a higher ppi count. 24" at 4K still isn't great at a little over 180 ppi, but it is far closer to print output. It is silly, really, that we don't have higher-res screens yet; most modern phones have better resolution at 5" than my 24" monitor does.


----------



## danski0224 (Dec 19, 2013)

Mt Spokane Photography said:


> Viewers can't tell the difference at normal viewing distance, you have to be close, like 5 ft or less. That's why video stores arrange them so that you will be close to the screen. At 10 ft, it makes no difference.
> 
> http://www.displaymate.com/news.html#7



I'm not so sure I agree. One set had a demo video of some sort that showed 1080 on one side and 4K on the other, and the difference was clearly noticeable from about 8 feet out.

I suppose that each of these sets had demo content that was optimized. As I understand it, native 4k content is pretty scarce.

I wasn't there to do an evaluation, I just looked while I was there- not in the market for a new TV at this time. 

I didn't view "normal" HD content upscaled for the resolution, either. That could be pretty cool if it smooths out pixels on larger screens.


----------



## LetTheRightLensIn (Dec 19, 2013)

Not yet, but I can hardly wait!!!!

I hope the new Dell UP2414Q proves to be good, since it seems like NEC may still be some years away. I still can't get a solid answer as to whether the Dell has the 14-bit 3D LUT that a few recent Dells have had, or whether it has any of the old overdrive and other issues some Dells have had.


----------



## LetTheRightLensIn (Dec 19, 2013)

ajfotofilmagem said:


> I prefer a full hd monitor with wide dynamic range and color reproduction. I do not want to spend $ 15,000 for a 4K monitor. That seems a real need for Hollywood filmmakers, or pixel peepers.



$15k? No wide gamut?

There is already a $1,299 99% AdobeRGB 4K monitor out (from Dell).


----------



## LetTheRightLensIn (Dec 19, 2013)

jeffa4444 said:


> I work for a very large US motion picture rental company which is global so lets talk 4K.
> 
> Canon, Sony, Red (they also have 5K) and Blackmagic make 4K cameras, and they will be joined by Phantom in the new year. The most popular camera in Hollywood is the Arri Alexa; it's a 3.5K camera that outputs 2K, and most movies in theaters are 2K, NOT 4K. The 4K cameras don't actually output 4K, and most don't employ lossless compression, but that's another story. The real issue is that to see 4K on a 4K TV you need to sit considerably closer than with 2K (1080p/i) at the same screen size, and the majority of broadcast content is NOT 4K but 2K, even if it was shot on a 4K camera. 8K cameras are in the pipeline; theoretically you would need to be even closer to the screen to get the benefit. So is this a case of technology over common sense? Yes and no. Down-sampling from, say, an 8K or 4K file would give cleaner 2K images after compression, allowing for concatenation through the broadcast pipeline, but a pure 4K file will still require a closer viewing distance for a given screen size to get the benefit. The same applies to a movie theater.



You forget things such as:

consumer 4K video cams are coming out, so people can produce and watch their own 4K videos

4K is way better for viewing still photography than 2K or less

4K (or simply higher pixel density, like on tablets) is way better for text; so much better for reading electronic books, newspapers and magazines, and much nicer text to look at on the web, for programming, anything

unless you sit too far away, 1080p looks pretty blocky on an HDTV set, nothing like looking out a window, and when you sit that far away it still doesn't feel like looking out a window since the detail scale doesn't match up to the FOV

oh, and how about this: how come people scream bloody murder when a game doesn't offer AA? Whether they view it on a 20" screen or a 60" 2K screen, they all scream about the nasty jaggies. Well, if you see nasty jaggies you sure as heck do not have too much res for the screen size!


----------



## LetTheRightLensIn (Dec 19, 2013)

Mt Spokane Photography said:


> danski0224 said:
> 
> 
> > 9VIII said:
> ...



10'+ back is getting to be pretty far, and no, you don't have to be 5' or less.

It's ridiculous, all the talk about how you need 55" for even 1080p to matter; utter nonsense. The same people go on about how their 24" print looks so much better at 300 ppi. My 24" 1920x1200 monitor looks grainy as hell after using my retina iPad for a little bit, or looking at any printed magazine or book for a little bit.

And if you really want the full impact from video, the screen should be FOV-filling, not a little box taking up a fraction of your vision from 20' away.


----------



## danski0224 (Dec 20, 2013)

LetTheRightLensIn said:


> 10'+ back is getting to be pretty far, and no, you don't have to be 5' or less.
> 
> It's ridiculous, all the talk about how you need 55" for even 1080p to matter; utter nonsense. The same people go on about how their 24" print looks so much better at 300 ppi. My 24" 1920x1200 monitor looks grainy as hell after using my retina iPad for a little bit, or looking at any printed magazine or book for a little bit.
> 
> And if you really want the full impact from video, the screen should be FOV-filling, not a little box taking up a fraction of your vision from 20' away.



I'm 9' away from a 63" set. I wouldn't mind being a bit closer.

But, too close and the individual pixels start appearing. I don't think I could comfortably watch it much closer than 8'. 

If 4k could smoothly upscale HD content and give me a clean ~60" display at ~8', that would be nice. 

If I was to replace my current set with a 4k display and keep the existing arrangement, the new one would have to be bigger. Probably won't be happening anytime soon.

Having experienced ~40" panels at ~10', I often wonder: what's the point? And it becomes worse when watching letterbox content.


----------



## jrista (Dec 20, 2013)

Mt Spokane Photography said:


> If it sells cameras, it will come. There is a lot of discussion about the visual benefits, but that's not relevant, profit is what drives new technology.



But profits are only attained when the consumer sees a benefit. Given that the primary talk about 4K screens among photographers is the visual benefits (finer detail, less ability to "pixel peep", higher microcontrast, 10-bit support/wider gamut, etc.), those are all the _reasons_ photographers _would_ buy a 4K screen. Without the visual benefit, there are no profits.


----------



## kaihp (Dec 20, 2013)

jrista said:


> But profits are only attained when the consumer ~~sees~~ thinks there is a benefit.



There, I fixed it for you.

I will seriously consider a 4K monitor when it comes down in price... not necessarily for photos, but for the "screen real estate" for general usage. I was recently comparing a UHD TV vs. an FHD TV for work (as a replacement for a projector), and the ability of the UHD to render more detail, even just for text, was quite convincing.
The RMB 19,000 price tag for the 65" version kept me at bay, though.


----------



## IMG_0001 (Jan 6, 2014)

Fleetie said:


> Mt Spokane Photography said:
> 
> 
> > danski0224 said:
> ...



I like consistent units better than 4k.


----------



## dolina (Jan 7, 2014)

A tad off topic, but...

My 5-year-old 46-inch Samsung LCD TV's panel needs to be replaced. This happened after 24 hours attached to an IPTV box.

Parts and labour will cost me $850 and a week's wait.

Went TV shopping yesterday, and my takeaway is that the most basic 46-inch LED TV can be had for $850. Add $100 and I get a 50-inch LED TV. Add $500 and I get a 60-inch LED TV.

Power consumption of LED is a fraction of what I am getting with LCD.

Smart TVs are nice if you don't have a smartphone, tablet or computer. I wish Apple would make one; I'd be more inclined to buy a solution from them.

Of course this isn't a 4K display. I was initially planning to wait 3-5 years before picking up one. In time for a slim Xbox One & slim PS4.

Now I'm back to my 8-year-old 32-inch Samsung LCD TV and 4-year-old 40-inch Samsung LCD TV. Which is really sad, considering we switched to HD cable this year.

=======================

Now for the 4K TV part.

I auditioned the following

LG 65LA9700 (65-inch LED)
Sony Bravia KD-65X9004 (65-inch LED)

Both look awesome with a JPEG at 2048px on the longest side, even zoomed in at 200%.

Playing 1080p & 720p MP4s with a low bitrate looks like SD content.


----------



## Don Haines (Jan 7, 2014)

Apparently, someone at CES has announced a $999 4K TV...


----------



## dolina (Jan 7, 2014)

Don: From a name brand like LG, Samsung, Sony, Toshiba, Panasonic or Philips?

I've tried TCL, Devant, Haier & "My View", and the image quality leaves a lot to be desired. The GUI is, uhhhhh. 

======

Some fun facts I learned about the TV upgrade cycle for a typical household:

- New display is bought every 6.9 years on average
- Replacing an aging CRT TV
- Replacing a first generation LCD TV
- 32-inch is the most popular screen size for developing countries
- 44-inch is the most popular screen size for developed countries
- $940 tends to be the budget for new TVs in developed countries
- Declining price is a motivation to buy
- Newer technology
- More sizes available

Source:

http://www.displaysearch.com/cps/rde/xchg/displaysearch/hs.xsl/120529_global_tv_replacement_cycle_falls_below_7_years_as_households_continue_to_replace.asp
http://gigaom.com/2012/01/05/tv-replacement-cycle/

==============

I like the LED TVs with WiFi and Ethernet, as I can stream my vids directly to the TV. A USB port is also useful when networking is not practical.

I'm not that hot on Smart TVs unless they sport a more uniform GUI, like that of iOS or Android for phones/tablets.


----------



## 9VIII (Jan 7, 2014)

dolina said:


> A tad out of topic but...
> 
> My 5yo 46-inch Samsung LCD TV's panel needs to be replaced. This happened after 24 hours attached to a IPTV box.
> 
> ...



Remember that there are no "true" LED TVs out right now. The difference in power consumption is between using a fluorescent backlight and an LED backlight. Both displays use an LCD panel to produce colours.
This will become immensely confusing once they actually start producing displays that use LEDs to produce the image.


----------



## dolina (Jan 7, 2014)

9VIII said:


> Remember that there are no "true" LED TVs out right now. The difference in power consumption is between using a fluorescent backlight and an LED backlight. Both displays use an LCD panel to produce colours.
> This will become immensely confusing once they actually start producing displays that use LEDs to produce the image.



Yes, if you want to be pedantic about it, but no one sells flat screens as "CCFL TVs", do they?

For the general public who go to the manufacturer websites, TVs will be presented divided into LED, LCD, Plasma, OLED, etc. So using a generally accepted term is correct. It is about communicating effectively.

And AFAIK they do sell "true" LED displays already, but for industrial/commercial outdoor use. Very visible during the day and blinding at night.

======

Looking online, a no-name 16-inch LED TV sells for $80 inclusive of 12% VAT. It can play media plugged into a USB port. I expect it to work for 6 months, after which you will need to buy a new one.


----------



## 9VIII (Jan 7, 2014)

dolina said:


> 9VIII said:
> 
> 
> > Remember that there are no "true" LED TVs out right now. The difference in power consumption is between using a fluorescent backlight and an LED backlight. Both displays use an LCD panel to produce colours.
> ...



It's good to see that you're aware of the difference, a significant majority of the people I talk to are completely unaware. In my opinion it basically amounts to false advertising.


----------



## dolina (Jan 7, 2014)

9VIII said:


> It's good to see that you're aware of the difference, a significant majority of the people I talk to are completely unaware. In my opinion it basically amounts to false advertising.


Bravo, I'm informed of the lies of the industry. Dude, get over it! It's just a marketing term to highlight a new feature, one that I am particularly thankful for.

Lower power consumption is _always_ welcome.


----------



## David_in_Seattle (Jan 7, 2014)

I'm debating pulling the trigger on a couple of new Dell 24" 4K displays, but I've been hesitant since I've heard of issues with the display running at 60 Hz. Other displays are currently out of the question, since the company I work for gets a sweet discount on these monitors.

The added resolution would definitely help with the type of photo and video editing I do on the job.


----------



## Ruined (Jan 7, 2014)

My opinion:

4K is good for proofing, i.e. computer monitors. Mainly because your eyes are right up near the screen and you have lots of great 4k content (your pics).

For movies, pointless: due to distance from the screen, diminishing returns with motion compression, and the fact that most content does not resolve beyond 1080p in detail even if encoded at 4K. You need a minimum of a 10 ft screen to see significant improvement over 1080p at normal viewing distances, per Joe Kane, an unbiased industry video expert.


----------



## Ruined (Jan 7, 2014)

David_in_Seattle said:


> I'm debating pulling the trigger on a couple of new Dell 24" 4K displays, but I've been hesitant since I've heard of issues with the display running at 60 Hz. Other displays are currently out of the question, since the company I work for gets a sweet discount on these monitors.
> 
> The added resolution would definitely help with the type of photo and video editing I do on the job.



Just as a warning: my friend bought a top-of-the-line Dell 32" Ultrasharp 4K monitor. It had many stuck and dead pixels. He exchanged it for two replacements, both with lots of stuck and/or dead pixels. He eventually gave up and asked for a refund, though Dell in the end hooked him up for his troubles.


----------



## 9VIII (Jan 7, 2014)

dolina said:


> 9VIII said:
> 
> 
> > It's good to see that you're aware of the difference, a significant majority of the people I talk to are completely unaware. In my opinion it basically amounts to false advertising.
> ...



I'm just as happy as you are that the industry has switched to LED backlighting, it just should have been named differently.


----------



## jrista (Jan 8, 2014)

9VIII said:


> dolina said:
> 
> 
> > 9VIII said:
> ...



I suspect the industry will skip right by "true" LED displays and head straight for OLED displays. I don't think there is any way to market a "true" LED display (where there are discrete RGB LEDs for each and every pixel) such that the general public would understand the difference relative to an LED-backlit display (either edge-lit or a matrix with local dimming).

LG already has a 77" OLED TV (although it's curved, a feature I personally am not a fan of... I think it's just a gimmick). Samsung is supposedly readying an 80" OLED display which features adjustable curvature (again, a feature I think is a gimmick). 

At 80", standard 1920x1080 pixels are MONSTROUS, and there is no question such large screens could benefit from a factor-of-four shrink in pixel area. I think 4K will do wonders for these large OLED screens... I just hope they end up flat at some point, as I'd prefer not to have some hulking curved monstrosity popping out of my wall, when the intent is to have a 1"-deep, beautifully flat panel sitting nearly flush and otherwise inconspicuous.
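For a sense of scale, a quick sketch of the pixel pitch at that size (assuming a 16:9 panel):

```python
import math

def pixel_pitch_mm(diag_in, h_pixels, aspect=(16, 9)):
    """Physical width of one pixel, in millimetres."""
    aw, ah = aspect
    width_in = diag_in * aw / math.hypot(aw, ah)
    return width_in * 25.4 / h_pixels

for h_px, name in ((1920, "1080p"), (3840, "4K UHD")):
    print(f'80" {name}: {pixel_pitch_mm(80, h_px):.2f} mm per pixel')
# → 80" 1080p: 0.92 mm per pixel
# → 80" 4K UHD: 0.46 mm per pixel
```

Nearly millimetre-wide pixels at 1080p are easily visible at living-room distances, which is why these very large panels stand to gain the most from 4K.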


----------



## dolina (Jan 8, 2014)

I'd love to sell my 27-inch Dell U2711 now and get a 32-inch Sharp 4K display but my 27-inch iMac would probably not be able to drive it properly and not to mention not match in terms of resolution and screen size.

I hope that in 3 years' time all the issues with 4K displays will be resolved and I can get the largest 4K iMac available.


----------



## mkabi (Jan 8, 2014)

I just got my 60" 1080p TV, 2 years ago.
So I'm not going to upgrade anytime soon.
But that doesn't mean that the next guy shouldn't.

You don't have to believe me or the next person here that tells you that "to see the difference you need to sit very close."

Here read it on CNET: http://reviews.cnet.com/8301-33199_7-57610862-221/four-4k-tv-facts-you-must-know/

Yeah sure, you can see the difference on a Retina display, but really... you sit very close to it. Measure it: I'm sitting less than 3 feet from my computer screen. When I pull out my iPhone or a friend's iPad, it isn't more than 2-3 feet away.

You want to see the difference from 1080p at normal seating distances? I guess wait for the 8K TVs?
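For what it's worth, the viewing-distance argument is just trigonometry: assuming roughly 1 arcminute of visual acuity for 20/20 vision, you can estimate the farthest distance at which individual pixels are still resolvable. A rough sketch (the acuity figure and the helper function are my own illustrative assumptions, not something from the CNET article):

```python
import math

def max_resolvable_distance_ft(diagonal_in, horiz_px, aspect=(16, 9),
                               acuity_arcmin=1.0):
    """Farthest viewing distance (in feet) at which a viewer with the
    given acuity can still resolve individual pixels."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width in inches
    pixel_in = width_in / horiz_px                  # width of one pixel
    theta = math.radians(acuity_arcmin / 60.0)      # acuity as an angle
    # Small-angle approximation: a pixel subtends theta at the limit distance.
    return (pixel_in / theta) / 12.0

# On a 60" screen, 1080p pixels stop being resolvable past roughly 8 ft;
# 4K pixels are only resolvable from about half that distance.
print(round(max_resolvable_distance_ft(60, 1920), 1))  # 7.8
print(round(max_resolvable_distance_ft(60, 3840), 1))  # 3.9
```

By that estimate, someone sitting 10-12 feet from a 60" 1080p set is already past the acuity limit, which is consistent with the advice in the CNET piece.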


----------



## 9VIII (Jan 9, 2014)

From the CNET article:


> Larger TVs or closer seating distances make that difference more visible, *as do computer graphics, animation, and games*...



Like I was saying, if someone is looking at something inherently blurry, its resolution isn't going to matter.
I recently read that in the EU they're looking at including 100Hz (double the standard 50Hz PAL frequency) in the 4K broadcasting spec. The original NHK UHD spec also included a 120Hz refresh rate to help improve the image. I have to wonder if the archaic 24Hz Hollywood standard frame rate isn't partly responsible for much of the negativity surrounding 4K.


----------



## jrista (Jan 9, 2014)

9VIII said:


> From the CNET article:
> 
> 
> > Larger TVs or closer seating distances make that difference more visible, *as do computer graphics, animation, and games*...
> ...



The archaic 24Hz Hollywood standard is changing as well. The recent Hobbit movies were shot at 48 frames per second. James Cameron is apparently shooting Avatar 2 and 3 (and however many more there may be after that) at 60fps. A 60Hz refresh rate also fits well with 240Hz 3D BluRay playback. Several cable providers are already clearing bandwidth to free up room for delivering native 4k content (and I believe there may already be some 4k content distribution, with a 2k downgrade on those channels when 4k isn't available). It isn't just TVs that are moving forward into a new era of quality and resolution...the technology used to create and deliver the content we view on them is moving forward as well.

Nay-sayers are simply uneducated as to the big picture. It isn't just 4k TVs playing back older HD content (720p). It is 4k TVs playing back native 4k content, from TV and BluRay, as well as internet-enabled content delivery networks like NetFlix (which has already adopted Super HD for a lot of its content, and is also preparing its system for delivering 4k content).

Even assuming one "only" watches 1080p content on a 4k TV, that 2k content is rendered across four times as many pixels as its native resolution requires, so it can still look better than on a native 2k device.


----------



## mkabi (Jan 9, 2014)

jrista said:


> The recent Hobbit movies were shot at 48 frames per second.



Have you seen the Hobbit at 48 fps?
It looks bad! You can see the make-up and you can discern the fake props.
I don't think 48 fps should be used in movies that rely on heavy costumes, make-up, props and CGI.
It could possibly work in something like Silver Linings Playbook, but anything else...

The beauty of photographs at 4K, 5K and higher is that you can remove blemishes, soften the image, etc. in Photoshop. You can't do the same with video, unless you want to go through every frame (48 per second) and correct each one individually, which would take forever for a 2-hour movie.


----------



## Lichtgestalt (Jan 9, 2014)

mkabi said:


> jrista said:
> 
> 
> > The recent Hobbit movies were shot at 48 frames per second.
> ...



and 48 fps has what to do with resolution?



> It looks bad! You can see the make-up and you are able to discern the fake props.



Because of 48 fps? ... I doubt that.


----------



## jimjamesjimmy (Jan 9, 2014)

mkabi said:


> jrista said:
> 
> 
> > The recent Hobbit movies were shot at 48 frames per second.
> ...




You can easily do that with film, and you don't have to go through each frame individually; and if you're making a film at 48 fps in such high res, I'm sure you've got the resources to check every frame.


----------



## jdramirez (Jan 9, 2014)

There is a huge difference between 480i and 1080p, but I find that mostly I'll watch DVD-quality upscaled video, which is tolerable. I'll watch satellite-broadcast HD signals, which are fine but not really that impressive. I have a 3D TV; I don't hate the glasses, and I don't mind the crud where they try to send debris out of the screen towards you, but the content isn't there.

I like basketball in 3d. Baseball and football aren't all that impressive. 

So my issue is that I don't know that there will be a clamor for content. If DirecTV comes out with ten channels, will that make Comcast and other cable providers respond in kind? So I say no thank you until they flood my screen with higher-quality programming.

And honestly... with football we are still getting blah HD feeds. I think it is CBS who downgrades the image quality, and it is noticeable. Give me 1080p first for a few years before I even consider upgrading.


----------



## Mr_Canuck (Jan 9, 2014)

Waiting for Apple to sort out their drivers and integration, though, so that I can use one on a MacBook Pro. Not there yet. Undoubtedly it will all get sorted out.


----------



## 9VIII (Jan 9, 2014)

jrista said:


> James Cameron is apparently shooting Avatar 2 and 3 (and however many more there may be after that) at 60fps.



Oh thank goodness. The whole "48fps" thing always sounded like a halfway measure. I'm sure he'll get the job done right.




mkabi said:


> jrista said:
> 
> 
> > The recent Hobbit movies were shot at 48 frames per second.
> ...



People said the same thing about "HD" when it was introduced. They adapted well enough, and will do so again.

It's interesting that this fits my point perfectly: higher framerates make the image more detailed.
It's just like taking a picture with a faster shutter speed. If you want to capture detail in motion, you need a shorter exposure. They could make 24fps movies with a really fast shutter speed, but then the movie would look like a slide-show.
Applying that to 4K: the amount of blur you're allowed before it crosses multiple pixels on the display becomes that much smaller, so framerate does make a difference as you increase resolution.
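To put a rough number on that point: blur in pixels is just speed across the frame × exposure time × horizontal resolution. A quick sketch (the object speed and shutter times here are my own illustrative assumptions):

```python
def blur_in_pixels(widths_per_sec, shutter_sec, horiz_px):
    """How many pixels an object smears across during one exposure."""
    return widths_per_sec * shutter_sec * horiz_px

# An object crossing the frame in 2 s (0.5 screen-widths/s),
# shot with a 180-degree shutter (exposure = half the frame interval):
blur_1080p24 = blur_in_pixels(0.5, 1 / 48, 1920)   # 24fps at 1080p
blur_4k60 = blur_in_pixels(0.5, 1 / 120, 3840)     # 60fps at 4K
print(blur_1080p24, blur_4k60)  # 20.0 16.0
```

Even at 60fps, 4K smears motion across roughly as many pixels as 1080p does at 24fps, which is the sense in which frame rate has to rise along with resolution.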

The complaint about 48fps that I read most often is that it reminds people of a TV show. Live TV has almost always been 60fps; if you put side-by-side recordings of a football game at 24fps and 60fps, I doubt anyone would prefer the 24fps version. Why should it be any different in movies?


----------



## cayenne (Jan 9, 2014)

mkabi said:


> I just got my 60" 1080p TV, 2 years ago.
> So I'm not going to upgrade anytime soon.
> But that doesn't mean that the next guy shouldn't.
> 
> ...



How far away from your TV do you sit?

My TV is a couple years old, but I have a Samsung 59" Plasma, and I generally sit about 7-8 ft away from it...short living room.

C


----------



## cayenne (Jan 9, 2014)

9VIII said:


> <snip>
> The complaint about 48fps that I read most often is it reminds people of a TV show. Live action has almost always been 60fps, if you put a side by side recording of a football game at 24fps and 60fps I doubt anyone would prefer the 24fps version. Why should it be any different in movies?


Well, we've been used to watching movies at 24fps for decades now. To most people's eyes, that blur is part of what makes a movie look cinematic.

Strangely enough, our brains have been trained so that what might be dismissed as old tech is actually what makes footage look higher quality, or "movie-like," to many people.


----------



## florian (Jan 9, 2014)

I got myself the Panasonic 65-inch 4K TV because it has a 4K player included and a DisplayPort 4K 60p input. It's a perfect fit for my 1D C, and I wouldn't want to be without 4K.


----------



## LetTheRightLensIn (Jan 9, 2014)

dolina said:


> My dentist wanted to do that with his HDTV in his office so his patients can watch TV while he mucks around in their mouth but the contractor forbade it.



It won't be in HD, but maybe he can get TV headset goggles. Some dentists use those to let patients watch movies/TV during longer things like cavities and crowns and such.


----------



## LetTheRightLensIn (Jan 9, 2014)

ajfotofilmagem said:


> Just as a 36-megapixel picture with coarse compression doesn't look better than an 8-megapixel one with fine compression, video resolution is less important than the codec used to compress the video. Currently H.264 suffers very significant quality losses during the editing process. Yes, there are other video codecs that preserve more image quality, but let's be honest: who would be willing to record 4K video, generating files of 5 gigabytes per minute?



But more MP and a little more compression generally looks better than less MP and less compression.

(Of course, in some cases compression is already so over the top... you have 1920x1080 channels delivering about 720x640 worth of detail, with lots of macroblocking all over. At those super-low bandwidths, more MP is a waste.)

And often, a given total bandwidth spent on compressed video looks better than that same bandwidth spent on uncompressed video.
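For a sense of scale behind that trade-off, raw bitrate is just width × height × bits-per-pixel × frame rate. A back-of-the-envelope sketch (the 50 Mbps delivery figure is an illustrative assumption, not a quoted spec):

```python
def raw_mbps(width, height, fps, bits_per_px=24):
    """Uncompressed video bitrate in megabits/s (8-bit 4:4:4 assumed)."""
    return width * height * fps * bits_per_px / 1e6

raw = raw_mbps(3840, 2160, 30)   # uncompressed 4K at 30fps
delivered = 50                   # a plausible compressed 4K delivery stream
print(round(raw))                # 5972 (Mbps)
print(round(raw / delivered))    # 119 (roughly 119:1 compression)
```

The 5 GB/minute figure in the quote works out to roughly 670 Mbps, i.e. a lightly compressed intermediate codec; a heavily compressed delivery stream is another order of magnitude smaller again.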


----------



## LetTheRightLensIn (Jan 9, 2014)

Samsung HDTVs are realllllly poorly built these days. Actually, all of the brands are, though Samsung certainly isn't the best, and they have impossibly bad warranty service (criminally so in many cases).

And yeah when the panel goes, it usually doesn't make sense to fix it.



dolina said:


> A tad out of topic but...
> 
> My 5yo 46-inch Samsung LCD TV's panel needs to be replaced. This happened after 24 hours attached to a IPTV box.
> 
> ...


----------



## LetTheRightLensIn (Jan 9, 2014)

9VIII said:


> It's good to see that you're aware of the difference, a significant majority of the people I talk to are completely unaware. In my opinion it basically amounts to false advertising.



Yeah, the whole LED thing was such a crock the way they advertised it. You'd ask, "So what size is your new LCD?" or "Did you go LCD or plasma?" and they'd respond, "Oh, I didn't get an LCD this time," or "Neither, I got an LED-technology set instead of an old LCD panel!"


----------



## LetTheRightLensIn (Jan 9, 2014)

Ruined said:


> My opinion:
> 
> 4K is good for proofing, i.e. computer monitors. Mainly because your eyes are right up near the screen and you have lots of great 4k content (your pics).



yeah, awesome for that



> For movies, pointless, due to distance from screen, diminishing returns with motion compression, and fact that most content does not resolve beyond 1080p in detail even if encoded at 4k. You need a minimum of a 10ft screen to see significant improvement from 1080p at normal viewing distances per Joe Kane, who is an unbiased industry video expert.



That's absurd.

Sure, some people won't care, or will insist on sitting WAY far back where it won't matter. But to say flat out that you need a 10' screen for UHD video to make any difference is completely wrong. Even on a 24" set it would make a difference, never mind a typical 55", if you sit at a distance that makes you feel even remotely part of the movie, rather than watching some tiny little speck in your FOV across a large room.


----------



## LetTheRightLensIn (Jan 9, 2014)

9VIII said:


> dolina said:
> 
> 
> > 9VIII said:
> ...



Although for most of the first couple of years, edge-lit LED-backlit screens had pretty nasty light bleed and were visually way worse than CCFL. From what I can see, it's only been a little over a year since they seem to have licked that problem.


----------



## LetTheRightLensIn (Jan 9, 2014)

jrista said:


> 9VIII said:
> 
> 
> > dolina said:
> ...



Yeah, for sure, consumer "true" LED sets won't ever appear. It's going to be organic LED.


----------



## LetTheRightLensIn (Jan 9, 2014)

9VIII said:


> From the CNET article:
> 
> 
> > Larger TVs or closer seating distances make that difference more visible, *as do computer graphics, animation, and games*...
> ...



I don't think so; frame rate and spatial resolution are two entirely different things (and lots of TV material wasn't shot at 24Hz). Nobody complained about 65mm projections at 24fps looking inherently blurry!

Well, actually, there is some relation, in that higher frame rates generally mean each frame is taken at a faster shutter speed, so on a freeze frame it would show a lot more detail for anything in motion. For normal playback I don't know, since yes, there is less blur in each frame, but each frame also sweeps past the eye faster; I thought it was supposed to largely balance out.


----------



## LetTheRightLensIn (Jan 9, 2014)

jrista said:


> 9VIII said:
> 
> 
> > From the CNET article:
> ...



Frame rates above 24fps don't necessarily work out so well for movies, though, as they make things seem too real, which in a sense can actually make movies seem more fake: you notice anything that wasn't done perfectly, and it makes the actors seem less larger-than-life and a bit more like Joe or Jill down at the ShopRite, or like something you shot with your camcorder in the back yard. It depends.

I could see it working better for something like Avatar: strong 3D, where you want to be smoothly immersed in that world, and it's largely all CGI.


----------



## LetTheRightLensIn (Jan 9, 2014)

Lichtgestalt said:


> > It looks bad! You can see the make-up and you are able to discern the fake props.
> 
> 
> 
> because of 48 fps?.... i doubt that.



In the sense that the brain now treats the movie as if it were viewing real life. 24fps puts the brain into a different state, where it knows things are not quite right and that it isn't just looking around. So when effects and make-up and props are not quite perfect, at 24fps they mentally blend a bit more into being a movie, but at a high frame rate they stick out a bit more. I never saw the Hobbit at all, so I didn't make that particular direct comparison myself.


----------



## LetTheRightLensIn (Jan 9, 2014)

jdramirez said:


> There is a huge distance between 480I and 1080p, but I find that mostly I'll watch dvd quality upscaled video which is tolerable. I'll watch satellite broadcast hd signals which are fine, but not really that impressive. I have a 3d tv and I don't hate the glasses and I don't mind the crud where they try to send debris out of the screen towards you, but the content isn't there.
> 
> I like basketball in 3d. Baseball and football aren't all that impressive.
> 
> ...



Nah, it is FOX that downgrades the football! To begin with, they are at only half the MP, since it's a 720p channel, but now they are sending out such a poor signal that it truly looks like that old-school widescreen DVD-resolution stuff that used to be broadcast at the start of HD.

CBS and NBC football show WAY more detail.


----------



## mhlas7 (Jan 9, 2014)

I think that 4k will be making its way into the home as 4k TVs become more abundant and cheaper and content becomes more available. Once 4k has made its way into our TVs, consumers will demand devices that can record in 4k. As far as 4k recording devices go, right now they are mostly professional products, such as the cinema cameras used to make almost all major movies. The 1Dc is probably the lowest-end 4k camera available now. Sony just announced a 4k Handycam at CES, but even if it makes it to market, I can't see consumers purchasing it until 4k TVs are more common. I could see Canon putting 4k video into the next 5D and maybe the 7D.

As far as photography goes, 4k displays are perfect for editing as most cameras made in the last 5 years have enough resolution to fill up a 4k display (many have resolution much higher than 4k). For photography I can see many people wanting 4k displays on their editing computer to take advantage of the resolution these cameras already have.

Personally, I really want to get a 4k monitor. Dell just announced a 28" 4k monitor for $699, which is the cheapest 4k monitor yet, and I think this might be the one to buy right now.


----------



## mkabi (Jan 9, 2014)

cayenne said:


> mkabi said:
> 
> 
> > I just got my 60" 1080p TV, 2 years ago.
> ...



About 10-12 feet.


----------



## jrista (Jan 10, 2014)

LetTheRightLensIn said:


> jrista said:
> 
> 
> > 9VIII said:
> ...



That isn't really because of the higher frame rate itself. It's because of the age of the technology...no one yet knows how to maximize the potential of 48fps or 60fps. The thing most people "notice" when they view something filmed and played back at a frame rate higher than 24Hz is the lack of motion blur.

As Hollywood gets more familiar with higher frame rates, and as the software they use to post-process improves, these issues will fade and eventually cease to be an issue. I highly suspect that motion blur, when and where appropriate, will become the domain of post processing software, which will use temporal blending across frames and other techniques to restore (and even enhance) motion blur where necessary.

As Hollywood post-processing labs get more familiar with how scenes may look too real, or where in each scene the "fake" shows through, they will develop ways to hide those things. That's all they have really done up till now. Even 24/30fps video long ago clearly showed all the "fakery" of a Hollywood scene...it took years for CGI and post-processing/green-screen technicians to learn how to blend and process things such that you couldn't tell the difference. They already have that knowledge now; all they need to do is apply it in a way that is more suitable to higher frame rates.

In a few years, the early issues that sometimes make 48/60fps cinematography look a little "too real" will be a thing of the past, their benefits will be fully realized...and they will pair very nicely with 4k TVs.


----------



## RGomezPhotos (Jan 10, 2014)

I tried a Mac Pro with the SHARP 4k display attached to it. I tell you, it was pretty amazing. If you think HD is impressive, 4k will blow you away. Everything looked amazing on that display. I just looked at the display and thought "I can edit photos all day on this. No problem".

The SHARP is $3500. But SHARP, ASUS and Lenovo are all coming out with 28" 4k displays under $800 THIS year. I'm kind of wondering about quality, since a Dell Ultrasharp 27" is about $900. Still, this is a good sign.


----------

