I was covering home theater and video and only got to spend two days on the show floor, but Sony's CrystalLED prototype was just amazing: very bright, 180-degree viewing angles with no color or contrast shifts, near-infinite contrast ratios, and perfect motion with no blurring or other motion artifacts. I can only hope Sony decides to release it at an affordable cost, because it's stunning to see in person.
The OLED sets might have been almost as good, but the off-angles were not as good, and the demo content was not good for getting an idea of the quality compared to Sony. Of course they might ship this year and we have no idea when/if the Sony will be released. The 8K panel from Sharp was also just a proof-of-concept design, but amazingly detailed to the point that you can stick your head next to it and see no pixels. The contrast and angles were not nearly as good as the CrystalLED, though.
Nothing in Blu-ray really amazed me, as the only different feature I really saw was Sony offering 4K upconversion on their new player for their 4K projector, but I'd need a 4K projector to be able to evaluate that anyway. Overall it was the new panel technologies that really stood out to me.
A ZDNet article said something different regarding the CrystalLED:
"Reports from the show floor came away impressed, if not awed. Engadget said the sample set on view failed to show off the speedy refresh rates, and our sister site CNET found that OLED TVs provided a bit more “wow.” CNET also posted a short video examining Sony’s Crystal LED Display in more detail that you can watch here."
Which is fine. The OLEDs might be better, but the way the demo was set up on the floor I just really couldn't get a good idea of it, and the color shift on the LG model was a bit annoying since the Sony LED set had absolutely zero shift. I believe Samsung had a demo unit set up in a private room that some journalists managed to see, though I did not, so that might have had better material or a better environment and led to a better response than I had. The other AV writers I talked to during and after the show came away a bit split on the two, though we all want one of them in our living rooms.
Unfortunately no video that anyone took will do justice to the motion on the CrystalLED, since you'll be watching it on a conventional display. I imagine it might never come out, but we can all hope Sony finds a way to produce it, since the results were amazing.
What's the difference between OLED and Crystal LED? Is Crystal LED just Sony's marketing BS for OLED? They both seem extremely similar.
The Samsung TV at the show had a "Super OLED" display though. Super OLED sets don't use a color filter resulting in pictures with deeper contrasts and finer detail. So it should have been better.
Realistically CLED will likely never see the light of day. Sony stated that it was a tech demo and that they have no current plans to produce them. Considering each pixel is composed of 3 LEDs (RGB) on a chip, the display would be cost-prohibitive to build and sell in any mass market. Sony can't "choose" to release it at an affordable cost unless they find a way to make cheaper LEDs and find cheaper ways to connect them all.
Even if you could buy a single LED for $0.01 (one cent USD - which you can't), you would need over 6 million of them. I'll do the math for you: roughly $60,000 for just one display. And that's only for 1080p; 4K will be mainstream before this tech will. LEDs have been in mass use for decades in all manner of electronics, and prices aren't even close to making LEDs cheap enough for this tech to work.
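For what it's worth, here's that back-of-the-envelope math spelled out; the $0.01 per LED is purely the hypothetical figure from above, not a real price:

# Rough cost estimate for a direct-LED (CrystalLED-style) panel,
# assuming the purely hypothetical $0.01 per LED mentioned above.
def led_panel_cost(width_px, height_px, price_per_led=0.01, leds_per_pixel=3):
    leds = width_px * height_px * leds_per_pixel
    return leds, leds * price_per_led

for name, (w, h) in {"1080p": (1920, 1080), "4K (DCI)": (4096, 2160)}.items():
    leds, cost = led_panel_cost(w, h)
    print(f"{name}: {leds:,} LEDs -> ${cost:,.0f} in LEDs alone")
# 1080p:    6,220,800 LEDs -> $62,208 in LEDs alone
# 4K (DCI): 26,542,080 LEDs -> $265,421 in LEDs alone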
This is where OLED comes in as a realistic alternative, although as I understand it they still need to work on its image-retention performance.
I've seen a lot of discussion of 4K displays following this year's CES, and invariably brief mention is made of the limited source material available. So: what 4K sources ARE available today? What are the demos running off of? What kind of processing power would it take to play, say, a 4K video stream encoded the same way as a Blu-ray (I'm assuming 40 Mbit max for 2K video would roughly translate to 160 Mbit for 4K)?
Basically, beyond getting the displays into production, what needs to happen before 4k becomes a wider reality? Have we seen some significant improvement in compression technology in the last 5 years that would make 4k satellite broadcasts possible without sacrificing a huge number of channels?
4K sounds great, and on the one hand it is just the next logical increment after 2K HD. However, it seems we are still just barely managing the bitrates required by 2K HD in terms of storage, transmission, and playback; how close are we realistically to making the 4x jump in all of these to make 4K useful?
I think 4K will largely be for home theater buffs initially, with Blu-ray players that upconvert to 4K. Then we'll get something post-Blu-ray that will have new DRM and support higher bitrates. Of course, an average bitrate of 50Mbps could still fit a two-hour movie on a 50GB Blu-ray, so maybe it will use the same disc medium but with new standards? Don't know, but we'll see.
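A quick sanity check on that bitrate math (decimal gigabytes, video stream only, ignoring audio and overhead):

# Does a two-hour movie at an average 50 Mbps fit on a 50 GB Blu-ray?
bitrate_mbps = 50
runtime_s = 2 * 60 * 60
size_gb = bitrate_mbps * 1e6 * runtime_s / 8 / 1e9  # bits -> bytes -> GB
print(f"{size_gb:.0f} GB for the video stream")  # 45 GB, leaving ~5 GB for audio/extras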
From doing a bit of Googling, it looks like 100GB is the likely requirement for 4K movies, which means four layers rather than two. Apparently most Blu-ray disc players can only read two layers, so they would have to be upgraded. I suspect the bit rates would blow them up even if they did support the BDXL format...
Why wait for home theater/movie buffs to catch up when PC gaming could take full advantage of this tech today?
We just need 4K/2K to be supported over a single connector, or for both IHVs to bring their professional single-resolution-across-multiple-displays solutions to desktop parts, like the Quadro version described here:
http://www.pcper.com/reviews/Graphics-Cards/NVIDIA... "professional level multi-display technology called "NVIDIA Scalable Visualization Solutions" that will allow multiple monitors to function as a single display to the OS and "just work" with any application."
Because marketing a TV for thousands of dollars now is not going to appeal to the small market of gamers who care :D
But I'm with you: if the price was right, I would be willing to be a first adopter if I could get a hands-on preview of it, meaning see how it does hooked up to a computer with a few games.
Computer monitors have been at a standstill for quite a long time. It basically went CRT to LCD and that's pretty much it. In fact I would venture to say it's gone BACKWARDS for monitors... it used to be that anything over 24 inches was 1920x1200. Now you see the market flooded with 22-23 inch 1920x1080 monitors. Or worse, 27 inch 1920x1200.
4K definitely works on PCs -- I don't even care if it requires two connectors. However, I expect the cost to be prohibitively high for a while. I mean, movie theaters and digital film have already been using 4K for a while, but it's just not consumer-grade stuff. But yeah, AMD showed you could definitely play games on a 4K display. All you need is the display and a GPU with the necessary ports, but I think the display part is only available special order for >$15K.
I asked in the other thread about 4K/2K, but did AMD actually demo any actual PC games (I just saw some in-house castle demo)?
That was my point though: with PC gaming we don't need to wait for any content, since most any game that reads resolution caps from Windows will be able to render natively at 4K/2K and output it over one connection, or with some help from the driver if it takes two.
But that makes sense about the price/demand aspect, since no one is going to make $15K displays affordable just for the PC gaming community. I guess our best bet of seeing these displays commoditized in the near future would be the professional graphics space, which is what largely drove the 2560x1600 format and 30" IPS market as well.
I only saw their rolling demo, but that's not too surprising. I also poked around at the system from the Windows desktop and everything was as you would expect. I thought I saw a shortcut for a game, but I don't have any pictures of the Windows desktop so I can't confirm or deny. Basically, the game would have to support the dual DP outputs running a single display I think, but if a game supports Eyefinity that shouldn't be a problem.
One obvious use of 4K displays would be to allow for passive 3D without sacrificing resolution. So while only half the pixels would go to one eye, you'd still have 1080p resolution (with say line doubling on top of it). Assuming anybody cares about 3D of course.
Another possibility is that displays could do upscaling. So just as we saw EDTVs at the end of the SD life-cycle there could be 4K displays upscaling 1080p content.
Then of course there's games. An updated Xbox or PS4 could conceivably drive a higher resolution display. It's not clear this will happen, of course, but the potential increases the longer these consoles go without a refresh.
Then of course there's movies on disc. Studios want you to buy Blu-ray movies and not stream stuff over the internet or watch it via your MSO's VOD offering. So a future 4K Blu-ray standard could push higher resolution as one way of trying to stave off the eventual move to all-digital delivery. Sony for example claims to have more than 60 theatrical releases shot in 4K, and there have been a number of high-profile pushes for 4K (James Cameron for example is shooting Avatar 2 in 4K). Sony has promised to work with the Blu-ray Disc Association to define a new 4K standard and has promised to release the next Spiderman movie in 4K.
How are they going to do that? Well... they can already do 1080p 3D, so all they need to do is something less than double that. And the next codec being developed has a goal of another halving of needed bandwidth. So... Or there's always more layers...
Looks like the Joint Video Team is targeting 2013 for the next video codec, currently tagged High Efficiency Video Coding or HEVC. It'll get deployed whether 4K is a reality or not of course, since it'll also allow lowering the bit rate for the same quality, whether for mobile video applications or simply 1080p content streaming over the internet....
And it looks like the existing PS3 will be able to display 4K stills. So these TVs will work great for photo-realistic paintings or simply displaying your high-resolution camera images.
To me, 4K displays are going to be about what Eyefinity used to be about. Everyone tries to get small bezels and good monitors; 4K offers you that, with no bezels at all. So maybe if they are fast enough, gamers will want them in place of three monitors.
I'm actually wondering why on earth we'd need 4K displays at all? I have a full HD 42" plasma (and love it), but I barely see the difference between 1080 and 720 content. Even when downloading for free, I don't bother going for the 1080 version. Same for the bitrate: why do you need the 40 Mbit rate that Blu-ray offers when a 4 Mbit file (720p 40-minute episodes are generally around 1.2 GB) looks fine?
What I wish the industry would move towards a bit faster is a higher framerate! Sitting 3 meters away I don't really see more pixels, but I do see choppiness when the camera is panning around (even though I have a plasma; I'd probably go nuts if I had an LCD with a static backlight). It seems insane to me that with all the improvements to image quality over the last decades we're still stuck at 24 to 30 frames per second...
That's your problem. 42" is too small to appreciate the detail. I know, I've got a few 1080p displays (17" notebook, 42" LCD, 60" plasma) and none of them compare to my 1080p projector (120"). 4K would be great to have though to more accurately capture the detail inherent to 35mm and 70mm film. 8K would be great too, but that's a ways away yet.
We're "stuck" at 24fps because that's how film is shot and has been shot for about 100 years.
Well, I'm exaggerating my point slightly. I don't actually mean that I see no point at all in upping the resolution, and obviously on way bigger screens the advantage will be more obvious; I'm just saying that I think increasing the framerate might be a bigger win for a lot of people. As for being stuck at 24 fps because that's just how it's always been done: well, I guess you still go around with a horse and carriage, or take the steam train/boat for larger distances? Just because something was done a certain way for a long time doesn't mean you can't improve it. But I'm glad to see what name99 and B3an are saying below :)
You are right about frame rate, but there is a small amount of good news on that front. A few Hollywood directors who actually understand tech and are in a position to push the issue (notably James Cameron) are trying to ramp up frame rates.
Obviously with digital cinemas this is a lot easier to push, but I expect that even if Avatar2 is shot in 48 or 60 fps, there will be a long long period of crossover. I mean, my god, we're still stuck with interlace on plenty of broadcast TV.
The problem with high-DPI displays for laptops and desktops is that none of the main operating systems are designed to handle resolution-independent graphics. Even OS X does it in a tricky way, and it works because they control everything (as usual). Windows or Linux should go the true resolution-independence way (not the tricky OS X way). Then, and only then, maybe, and just maybe, manufacturers would consider enhancing the DPI of their screens and consumers would buy into them. As long as users get tiny text on any high-DPI display, such displays can't start showing up on ordinary computers. That just doesn't happen on tablets, which is why you get high-DPI displays there.
BTW, true resolution independence calls for hardware acceleration, but that shouldn't be an issue on laptops, much less on desktops.
They break because the browsers aren't coded to be DPI aware right now. I think a lot of this will get fixed with Windows 8 and Metro apps; we'll find out later this year. Anyway, I'm using a 30" display with 120 dpi setting in Windows 7 and browsing the web hasn't been one of my complaints (though I wish text and Flash would scale rather than being done at a fixed pixel size). I suppose if you define "break" as "are really tiny and hard to read on a high DPI display" then I can agree with you.
does the Yoga 13" have some sort of Thunderbolt port?
I wish it does, external GPU is something I look forward to with future Ultrabooks to make my desktop obsolete, since my work doesn't use that much CPU anyway.
Of course every power user wants it. But the average user doesn't care, and won't care. They're used to $300-$500 laptops, and that's what they will continue to expect. Yes, it's become a race to the bottom, but why would that change? I think it's much more likely that it'll remain the same, because that's how the market has developed. I think the real reason tablets are pushing better displays is that they can't afford not to. They're supposed to have better viewing angles, and supposed to be something you can hold at any angle and distance. This is not the case with laptops. Laptops, IMO, as long as they continue to have the same form factor, will continue to have the same attributes, and the same race to the bottom.
I think the main issue is when Ultrabooks (and any laptops above $1000) have a poor display. They are nowhere near $500, yet some of them have the display resolution, and sometimes the display quality, of a $500 laptop.
I will also add this: no one wants to add large amounts to the price of a laptop because they know they are going to throw it out soon. Let's leave Apple users out of this because they are often less practical.
People will buy very nice monitors because they can move them between a couple of different desktops or add a second one. But most laptops simply do not have any reasonable option to do something similar.
There are two things the laptop companies could have done. They could design modular displays: have a connection similar to the Asus Transformer for the display, so you buy the display of your choice and are able to upgrade or replace it if damaged. Then people might go for better displays knowing they have that option.
The other thing is that if laptop manufacturers had ever come up with some standards for design, it would have helped too, because people would have had an upgrade path. This is the same reason for the cheapness of laptops. Why build a strong case when it's going to be replaced every two years?
Part of the reason tablets are bucking the trend is that the internal components of a tablet are much cheaper, so you have something that overall is cheaper and smaller than even a netbook to produce. Second, if you look at a tablet, what makes it unique? Nothing other than the screen is the answer. They are so basic in design they are all almost the same. That is why they are always trying to push on the only three things that most people are ever going to notice: thinness, display, and price.
"Second if you look at a tablet what makes it unique? Nothing other than the screen is the answer."
That may be true for Android tablets. iPads come crammed with sensors, and the iPad2 has notably more sophisticated sensing (eg being able to track its 3D orientation) than iPad1.
The real problem is that Android land is utterly devoid of imagination -- without Apple to copy they would be lost. Let me describe just some obvious additions that could be made to a tablet or phone, and you tell me which Android (land of variety and choice) vendor has implemented them:
(a) temperature measurement (using a bolometer) for both local temperature and "remote" temperature (e.g. point the device at my forehead and tell me how hot I am);
(b) incorporate a small laser. Now you can use the device as a laser pointer. Or as a plumbline. Or you can fire and detect a laser pulse and use it for ranging;
(c) sensible speakers. Apple has so far stuck to mono speakers because of the device rotation issue. Some Android vendors have stereo speakers, which work OK if you're watching a movie and suck if the device is in portrait. INTELLIGENT would be to have four speakers, one in each corner, so that you can rotate the sound to follow the orientation of the device.
Thanks Jarred! It's nice to know anything, even if it is a "not yet"... lol
I will keep waiting for that card; if it gets only one PCIe power connector it will be my next card. If not, I will just wait until this level of performance fits this power envelope.
I believe the base model Z starts at under $2000 and includes the dock, but still, it's an expensive (and beautiful I might add!) display for sure. I had one company suggest that such displays add $700 to the price of a laptop right now, and they might be right. Or they might be trying to make excuses for using crappy displays.
Incidentally, did you know you can buy a 1080p 95% NTSC matte 15.6" panel online for under $150? I'm not sure how a 13.1" display would cost four times as much to make; it's just a matter of getting enough supply and demand.
4K makes sense for movies if they're broadcast on a giant screen at a theater. Not at home on a 50'' screen 12 feet away from the viewer.
I'm a lot more excited to see OLED displays. We need to refocus on color gamut, contrast, refresh rates, not more pixels nobody can see. On a tablet two feet from your face higher resolutions matter.
As for YouTube, 4k is a joke because the bit rates aren't high enough to take advantage of the resolution.
Sorry, but I don't agree. If you think the TVs we've got now are as big as they're going to ever get, you're wrong. I sit some 8-10 feet from my 65" TV and it actually seems quite small. The number of degrees of arc isn't actually that great. If you want something that seems more like the experience of watching in a theater TVs need to get MUCH bigger.
Already last year we saw that 80" Sharp LCD TV at $4999, way below anything we'd seen at that size previously. And with LCD TV manufacturers seeing a glut in production and prices crashing below $1000 you can't really blame them for looking forward to even larger TVs.
The TV size sweetspot right now is 46 inches. Even for high end TVs where price isn't a concern. A lot of people apparently find massive TVs kind of tacky (have a look in a home design magazine). That may not apply to you but it means 80'' TVs will remain a niche market, regardless of how cheap they get. Which means 4k will remain a niche technology, which means it will remain expensive.
Blu-ray needs 40Mbit/s to drive 1080p. If Blu-ray is the last physical media before we're all streaming, and 4K is 4x the resolution of 1080p, then we need 150-200Mbps internet connections before this is even feasible.
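For reference, the linear scaling behind those numbers, plus what a bitrate-halving next-gen codec (the HEVC effort mentioned above) would change; this is pure extrapolation, not a measured requirement:

# Naive linear scaling of Blu-ray's ~40 Mbps 1080p bitrate to 4K (4x the pixels).
bluray_1080p_mbps = 40
scale = (3840 * 2160) / (1920 * 1080)   # = 4.0
naive_4k = bluray_1080p_mbps * scale    # 160 Mbps, before any connection overhead
with_hevc = naive_4k / 2                # assume the next codec roughly halves the bitrate
print(naive_4k, with_hevc)              # 160.0 80.0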
The future is a large picture window being replaced by a TV that can switch from window to giant display with a push of a button. The other alternative is advancements in flexible displays that will allow very large TVs to roll up into the ceiling, but that kind of already exists for projectors and nobody uses them, so it's not likely to be very big in the future either.
Given that the majority of people can't tell 720 from 1080 on their TVs as it is - it's simply not possible for the human eye to resolve the detail at their sitting distance - I think 2K and above will not catch on except for professional installations or the richest early adopters.
No matter how many numbers people throw out on what resolution makes a difference, if you looked at the 8K set that Sharp had on display (never coming out, of course) you'd see a huge difference from a 2K or 4K set. However, I'd still take the OLED or CrystalLED sets for their better viewing angles, contrast ratios, motion, and black levels over the extra resolution. That said, you can see the difference in resolution, but bandwidth concerns mean we won't get to see it for a long time.
I don't know why I'd want the print on my monitor to look better. I can read it perfectly fine as it is. In fact I don't know why all those printer manufacturers don't stop making printers at 600dpi or even 1200dpi. That's stupid. Who needs anything that readable? Everything should top out at 150dpi or so. Anybody who suggests otherwise is being unreasonable.
Printers use the higher resolution to get better gradations and color blending. Apparently you've never seen printouts from the early 360dpi printers of the early '90s. 720dpi, 600dpi, 1200dpi, 1440dpi, and 4800dpi printers have gotten noticeably better with each upgrade in resolution from those early 300-360dpi versions.
Perhaps some people can see finer resolution than others.
I would love to know the aspect ratio of the 4K displays. Also, were there any curved displays? I remember the Ostendo CRVD that was awesome back at CES 2009. They have not updated the display, have they? I would love to see a high-res curved display for gaming rather than a bunch of panels with their bezels.
I wrote Ostendo Technologies about the CRVD last September asking them about a higher resolution version. Something like 3840x1200 (two 1920x1200 monitors), because 900px high is practically useless on a desktop computer.
They said they were out of stock on the current model and had no plans to build more. However my request for a higher res version had been noted, and I could get their newsletter to know about future announcements.
Maybe if enough people email them about a high res version they will try again with something more useful.
As I mentioned in the text, the 4K display I saw ran at 4096x2160, so it's a 1.896 AR (compared to 1.778 for 16:9 and 1.6 for 16:10). I've also seen some info saying we'll see 4096x2304 4K displays (16:9), and I neglected to get a picture of the resolution but I swear there was at least one 4K display that I saw that had a >2.0 AR.
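Just to spell out the aspect-ratio arithmetic from the comment above (the exact 4K variants that actually ship may differ):

# Aspect ratios of the resolutions discussed above.
for w, h in [(4096, 2160), (4096, 2304), (1920, 1080), (1920, 1200)]:
    print(f"{w}x{h}: {w / h:.3f}")
# 4096x2160: 1.896 (wider than 16:9)
# 4096x2304: 1.778 (exactly 16:9)
# 1920x1080: 1.778
# 1920x1200: 1.600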
I hate TN panels, so yay for non-TN, but I view the resolution game in phones and tablets as mostly marketing BS, and it creates additional problems. Prices go up (and this is why the industry likes the idea), gaming gets slower, and battery life is lower (or, for tablets, you shove in a bigger battery and then the price goes up some more). Is it really worth it? 4K TVs, sure, but I would rather see prices for 30" monitors come down a lot. Thin laptops with the CPU perf of systems costing half as much, GPU perf lower than terrible, poor battery life... no thanks. Funny how Intel tries to do what they already did years ago with the ULV line, except this time they added some more shine and doubled the price. Touch screens on laptops: folks should realize how much that will add to the retail price before getting too excited.
I have an 800x480 4.3" display, and it looks more than fine to me. I also used an iPhone 4 and 4S with their 960x640 3.5" display. I'd take the 800x480 at 4.3" any day. It doesn't look any better to me at this tiny size, and I'd rather have the larger screen.
I'm using the Samsung Galaxy Note now (is it released stateside? No idea...) which has a 5.3" screen with a 1280x800 display, and I never want to go back. This resolution means I can view just about any website without zooming in landscape, and it's big and sharp enough for me to read ebooks in PDF form (such as programming books). So for me, yes, the increase in resolution is definitely worth it :)
" while I’m not sure if all 4K displays will use the same resolution, this particular panel was running at 4096x2160, so it’s even wider than the current 16:9 aspect ratio panels (and closer to cinema resolutions)"
I can't help feeling they would have done better to just make 4K a straight 2:1 aspect ratio, and kept the resolution at 4096x2048!
Technology is always about patience. If we're just patient enough to wait for a technology to mature and become more common, we can jump in for much more reasonable $$$.
Hard lesson learned when I bought a launch-day PS3 back in 2006. It was a friggin' $650; I couldn't really afford to buy one, but I "had to". It was very, very expensive to me. And just 13 months later I got the YLOD for the first time in my life! My super expensive launch-day PS3 was unrepairable and no longer under warranty, and I was just frustrated.
I paid Sony $650 just to be a lousy beta tester! Now you can buy a $300 PS3 with no YLOD that is more energy efficient, with a better cooling system and chips.
I said no more! No more wasting my hard-earned money on expensive beta-phase technology. I will wait for 4K displays to mature enough, with adequate and affordable content, pricing, and availability.
Even now, how much 1080p content is actually available? Is it as widespread as standard DVD? I know many people who own an LED HDTV but don't care enough to notice that they still watch sub-720p content.
Most poignant point I've read all day: "What really irks me is that all of this comes in a 10.1” IPS package, exactly what I’ve been asking for in laptops for the past several years." AMEN brother. I bought a Sager because it actually offers a good 95% color gamut screen. As a designer, I really have a small window of choice because these manufacturers don't offer a decent monitor yet every tablet seems to push out great displays.
The 4K TV thing is great, but in terms of gaming there will be a huge technology gap once the 4K monitors are priced reasonably. I just don't think our GPUs will be able to push those kinds of pixels in the next 3 years except for maybe the $500 beast cards or SLI/Crossfire.
The Lenovo Yoga is ALMOST perfect. If it just had a wacom digitizer in it, it would be worth a lot of money to me and other graphic designers. Hell, it could force even Mac-heads to consider converting to Microsoft.
I don't know; at the rate GPUs progress, next-gen single GPUs tend to perform similarly to last-gen's top-end SLI configs. It's not quite 100% -- probably closer to a 50% increase over two years -- but it's very possible we'll see a single GPU push 4K/2K without much trouble while maintaining high framerates (~60FPS).
4K/2K is roughly 4x the resolution of 1080p, very close to 2x2 supersampling over the same screen area. We already have mid to high-end configurations pushing 3x1080p with surround configurations from both vendors with relative ease, and even 3D with those same resolutions which is an effective 6x1080p.
4K/2K wouldn't be too much additional burden, I think even a high-end single card would be able to push that resolution without too much trouble, and today's high-end card is tomorrow's mid-range card.
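The raw pixel counts behind that argument (DCI 4K assumed; a 3840x2160 panel would be slightly less):

# How many pixels is 4K relative to setups GPUs already drive today?
res = {
    "1080p": 1920 * 1080,
    "3x1080p surround": 3 * 1920 * 1080,
    "2560x1600 (30-inch)": 2560 * 1600,
    "4K (DCI)": 4096 * 2160,
}
base = res["1080p"]
for name, px in res.items():
    print(f"{name}: {px / 1e6:.1f} MP ({px / base:.1f}x 1080p)")
# 4K (DCI) works out to ~8.8 MP, about 4.3x 1080p and ~1.4x a triple-1080p surround setup.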
I'd just look at our triple-head desktop benchmarks if you want to know roughly how well current GPU setups can handle 4K displays for gaming. Basically, something like HD 7970 CF or GTX 580 SLI will certainly handle it, though you'll want the 2GB/3GB models. Now it's just waiting for the price to come down--on the GPUs as well as the displays.
4K support varies a bit by studio. Sony, for example, has their Colorworks processing facility on their lot, and all of their work is done at 4K at a minimum, with some work done at 8K resolution. As 4K displays start to come into play you will likely see a transition to 8K workflows at studios, which lets them sample CGI and other effects down to 4K for release. If you search Sony's website you can find a list of 4K theater releases and theaters that support them. The majority of digital cinema is still presented in HD format, though Blu-ray releases likely use a slightly different master, since the colorspace and gamma targets for cinema releases are much different than the Rec. 709 target for Blu-ray.
100 GB for 4K is really pushing the limits of the medium and not likely to be what would be used. Most Blu-ray titles clock in around the 30-40 GB range for the film, so given 4x the resolution you are looking at 120-160 GB for a film of similar quality. Another current issue with 4K in the home is that I believe HDMI 1.4a is limited to 24p at 4K resolution, so while films can work fine, no TV or video content will be able to be upscaled outside of the display or projector. I imagine before 4K really catches on we will have another update to HDMI, and a new media format, as downloads are obviously not fast enough for 4K streaming yet.
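A rough illustration of why HDMI 1.4 runs out of headroom above 24/30p at 4K; these numbers ignore blanking intervals and assume 24-bit color and roughly 8 Gbps of usable HDMI 1.4 video bandwidth, so treat them as ballpark only:

# Uncompressed video bandwidth estimate (active pixels only, 24 bits per pixel).
def gbps(w, h, fps, bpp=24):
    return w * h * fps * bpp / 1e9

for fps in (24, 30, 60):
    print(f"4096x2160 @ {fps}p: {gbps(4096, 2160, fps):.1f} Gbps")
# 24p: ~5.1 Gbps, 30p: ~6.4 Gbps -- plausible within HDMI 1.4's ~8 Gbps
# 60p: ~12.7 Gbps -- well beyond it, hence the 24/30p ceiling at 4K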
"Most Blu-ray titles clock in around the 30-40 GB range for the film, so given 4x the resolution you are looking at 120-160 GB for a film of similar quality. "
Not even close. The higher you push the resolution, the higher the correlation between adjacent pixels, meaning that the material compresses better for the same level of visual quality.
Heck, even if you compressed a 4K movie into the SAME 30-40GB you'd get better results. The compressor will throw away information intelligently, whereas you can view an existing BR movie as consisting of an initial dumb transfer stage (downsampling the 4K to 1080p) followed by compression. (This assumes, of course, that the compressor is actually intelligent and, for example, scales its search domain upward to cover the equivalent area in the 4K movie as in the 1080p movie. This is an obvious point, yet I've yet to see a compressor that does it properly by default, so you will have to tweak the settings to get the improvement. But I am correct in this -- with a smart compressor, 4K at 40GB will look better than 1080p at 40GB.)
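To put the "same 30-40 GB" scenario in numbers, here is the average bit budget per pixel each case works out to for a two-hour, 24 fps film; it doesn't settle the efficiency argument above, but it shows the scale involved:

# Average bits per pixel for a 2-hour, 24 fps film at a 40 GB video budget.
def bits_per_pixel(size_gb, w, h, fps=24, runtime_s=2 * 3600):
    total_bits = size_gb * 1e9 * 8
    total_pixels = w * h * fps * runtime_s
    return total_bits / total_pixels

print(f"1080p @ 40 GB: {bits_per_pixel(40, 1920, 1080):.2f} bpp")  # ~0.89 bpp
print(f"4K    @ 40 GB: {bits_per_pixel(40, 4096, 2160):.2f} bpp")  # ~0.21 bpp
# Whether ~0.2 bpp at 4K beats ~0.9 bpp at 1080p is exactly the compressor-efficiency
# question raised above; higher resolutions do tend to compress better per pixel.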
I was unable to find a good current-gen laptop with a 1920x1080 or 1920x1200 display for less than $1000. I eventually settled on a 3-4 year old refurbed IBM T61p with a beautiful WUXGA 1920x1200 15.4" display.
I looked up the part number for the panel, a replacement is less than $100 from eBay: http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&... I don't know what laptop manufacturers are doing these days that prevent them from putting a decent panel in their laptops, but they need to quit it. 1366x768 @ 15"? Are you joking? My 3 year old Dell netbook had the same resolution in a 10" form factor for $350.
I have not bought a tablet yet because I want at least a 1080p display on them and yes, this is for surfing.
As for laptops, my last laptop was an MSI GX640, and I specifically hunted that down because it has a good video card in it AND, most importantly, a 1600x1080 display.
I had bought some Acer with a 768p display but it went back. What crap.
For those of you who think these low-quality displays are good enough...
I'm not going to leave you alone. I want you to get mad! I want you to get up now. I want all of you to get up out of your chairs. I want you to get up right now, and go to the window, open it, and stick your head out and yell "I'M AS MAD AS HELL, AND I'M NOT GOING TO TAKE THIS ANYMORE!"
You might as well just give up. As far as I can tell most tech blog commenters (a) are completely unaware that something like Moore's law exists (b) think everything right now is absolutely perfect and should never be changed.
Thunderbolt. Higher res iPad screens. 4K TVs. Windows File System Improvements. Cell phone system improvements. WiFi improvements. Doesn't matter what it is, there's a chorus telling you that it will always cost too much, that it probably won't work well, and no-one will care about the improvement.
My god --- who would have thought that the hangout spot for 21st century Luddites would be the comments section of sites like Ars Technica and AnandTech?
The problem is that we're not even getting 1080p on a lot of laptops. I want 1080p pretty much as an option on anything 13" and up. Right now it's really only there on 15.6" and 17.3" displays (and of course, Sony has one 13.1" panel on the VAIO Z).
I've been lamenting the watering-down of laptop screens for years. I'm bloody sick of 1080p screens. 1080p would be fine... if I had an 11" netbook. I LIKE gaming at 16:10 -- and I like doing it on a 15.4" screen. And for what it's worth, 17" laptops should be offering WQXGA at a MINIMUM. ~300dpi should be the standard -- regardless of display size.
Of course, part of the problem with stagnation is probably that the Xbox 360 has diluted consumer expectations for gaming -- which has resulted in games where practically the entire library of new GPUs will run at 1080p at acceptable framerates. That kind of versatility didn't even exist at 1024x768 resolutions 10 years ago; the Ti 4600 I had back then was the ONLY card which even had a prayer of running at 1600x1200. So rather than software improving (and keeping hardware choking on rendering), we've plateaued and hardware has caught up big-time. That's also why 10 years ago mobile gaming left a LOT to be desired; today (at 1080p) you can't tell the difference in many cases.
Here's to hoping that my 15" form-factor laptop experience will soon offer 4MP gaming at a reliable 60fps. It's been long overdue. I'm still baffled by the reverse progress in laptop displays. You don't see people rejecting '90s cars in order to drive '80s cars; it doesn't make sense why we'd do the same with laptops.
cheinonen - Tuesday, January 17, 2012 - link
I was covering home theater and video, and only got to spend two days on the show floor, but Sony's CrystalLED prototype was just amazing. Very bright, 180 degree viewing angles with no color or contrast shifts, near infinite contrast ratios, and perfect motion with no blurring or other motion artifacts. I can only hope that Sony decides to release it at an affordable cost, as it's just amazing to see.The OLED sets might have been almost as good, but the off-angles were not as good, and the demo content was not good for getting an idea of the quality compared to Sony. Of course they might ship this year and we have no idea when/if the Sony will be released. The 8K panel from Sharp was also just a proof-of-concept design, but amazingly detailed to the point that you can stick your head next to it and see no pixels. The contrast and angles were not nearly as good as the CrystalLED, though.
Nothing in Blu-ray really amazed me, as the only different feature I really saw was Sony offering 4K upconversion on their new player for their 4K projector, but I'd need a 4K projector to be able to evaluate that anyway. Overall it was the new panel technologies that really stood out to me.
AnnihilatorX - Wednesday, January 18, 2012 - link
ZDnet article said something different regarding the CrystalLED:"Reports from the show floor came away impressed, if not awed. Engadget said the sample set on view failed to show off the speedy refresh rates, and our sister site CNET found that OLED TVs provided a bit more “wow.” CNET also posted a short video examining Sony’s Crystal LED Display in more detail that you can watch here. "
cheinonen - Wednesday, January 18, 2012 - link
Which is fine. The OLEDs might be better, but the way the demo was setup on the floor I just really couldn't get a good idea for it, and the color shift on the LG model was a bit annoying since the Sony LED set had absolutely zero shift. I believe that Samsung had a demo unit setup in a private room that some journalists managed to see, though I did not, so that might have had better material or a better environment and led to a better response than I had. The other AV writers that I talked to during and after the show came away a bit split on the two, though we all want one of them in our living rooms.Unfortunately no video that anyone took will do justice of the motion on the CrystalLED, since you'll be watching it on a conventional display. I imagine it might never come out, but we can all hope Sony finds a way to produce it since the results were amazing.
B3an - Wednesday, January 18, 2012 - link
Whats the difference between OLED and Crystal LED? Is Crystal LED just Sony's marketing BS for OLED? They both seem extremely similar.The Samsung TV at the show had a "Super OLED" display though. Super OLED sets don't use a color filter resulting in pictures with deeper contrasts and finer detail. So it should have been better.
therealnickdanger - Wednesday, January 18, 2012 - link
Realistically CLED will likely never see the light of day. Sony stated that it was a tech demo and that they have no current plans to produce them. Considering each pixel is composed of 3 LEDs (RGB) on a chip, the display would be cost-prohibitive to build and sell in any mass market. Sony can't "choose" to release it at an affordable cost unless they find a way to make cheaper LEDs and find cheaper ways to connect them all.Even if you could buy a single LED for $0.01 (one cent USD - which you can't), you would need 6 million of them. I'll math for you: $60,000 for just one display. And that's only for 1080p, 4K will be mainstream before this tech will. LEDs have been in mass use for decades in all manner of electronics and the prices aren't even close to make LEDs cheap enough for this tech to work.
This is where OLED comes in as a realistic alternative. Although as I understand they still need to work on its retention performance.
demonbug - Tuesday, January 17, 2012 - link
I've seen a lot of discussion of 4k displays following this year's CES, and invariably brief mention is made of the limited source material available. So; what 4k sources ARE available today? What are the demos running off of? What kind of processing power would it take to play, say, a 4k video stream encoded the same way as a blu-ray (I'm assuming 40 Mbit max for 2k video would roughly translate to 160 Mbit for 4k)?Basically, beyond getting the displays into production, what needs to happen before 4k becomes a wider reality? Have we seen some significant improvement in compression technology in the last 5 years that would make 4k satellite broadcasts possible without sacrificing a huge number of channels?
4k sounds great, and and on the one hand it is just the next logical increment after 2k HD. However, it seems that we are still just barely managing the bitrates required by 2k HD in terms of storage, transmission, and playback; how close are we realistically to making the 4x jump in all of these to make 4k useful?
JarredWalton - Tuesday, January 17, 2012 - link
I think 4K will largely be for home theater buffs initially, with Blu-ray players that upconvert to 4K. Then we'll get something post-Blu-ray that will have new DRM and support higher bitrates. Of course, average bitrate of 50Mbps could still fit a two hour movie on a 50GB Blu-ray, so maybe it will use the same disc medium but with new standards? Don't know, but we'll see.hechacker1 - Tuesday, January 17, 2012 - link
Doesn't Blu-ray scale with layers? AFAIK, they've demonstrated versions with 10 or more layers. So we'd just need updated drives to read them.Fanfoot - Tuesday, January 17, 2012 - link
From doing a bit of Googling, it looks like 100GB is the likely requirement for 4K movies, which means 4 layers rather than 2. Apparently most Blu-Ray disk players can only read 2 layers, so would have to be upgraded. I suspect the bit rates would blow them up even if they did support the BDXL format...chizow - Wednesday, January 18, 2012 - link
@Jarred,Why wait for home theater/movie buffs to catch up when PC gaming could take full advantage of this tech today?
We just need 4K/2K to be supported over a single connector or for both IHVs to implement their professional single resolution over multiple display solutions on desktop parts, like the Quadro version described here:
http://www.pcper.com/reviews/Graphics-Cards/NVIDIA...
"professional level multi-display technology called "NVIDIA Scalable Visualization Solutions" that will allow multiple monitors to function as a single display to the OS and "just work" with any application."
imaheadcase - Wednesday, January 18, 2012 - link
Because marketing a TV for thousands of dollars now is not going to appeal to the small market of gamers who care :DBut i'm with you, if the price was right, I would be willing to be a first adopter if i could get a hands on preview of it. Meaning see how it does hooked up to a computer with a few games.
Computer monitors have been at a standstill for quite a long time. It basically went CRT to LCD and thats pretty much it. In fact i would venture to say its went BACKWARDS for monitors..it used to be anything over 24 inches was 1900x1200. Now you see the market flooded with 22-23 inch 1900x1080 monitors. Or worse 27 inch 1900x1200.
JarredWalton - Wednesday, January 18, 2012 - link
4K definitely works on PCs -- I don't even care if it requires two connectors. However, I expect the cost to be prohibitively high for a while. I mean, the movie theaters and digital film has already been using 4K for a while, but it's just not consumer grade stuff. But yeah, AMD showed you could definitely play games on a 4K display. All you need is the display and a GPU with the necessary ports, but I think the display part is only available special order for >$15K.chizow - Thursday, January 19, 2012 - link
I asked in the other thread about 4K/2K, but did AMD actually demo any actual PC games (just saw some in-house castle demo)?That was my point though, with PC gaming we don't need to wait for any content since most any game that reads resolution caps from Windows will be able to render natively at 4K/2K and output 4K/2K natively over 1 output, or with some help from the driver if 2 outputs.
But that makes sense about the price/demand aspect, since no one is going to make $15K displays affordable just for the PC gaming community. I guess our best bet of seeing these displays commoditized in the near future would be the professional graphics space, which is what largely drove the 2560x1600 format and 30" IPS market as well.
JarredWalton - Saturday, January 21, 2012 - link
I only saw their rolling demo, but that's not too surprising. I also poked around at the system from the Windows desktop and everything was as you would expect. I thought I saw a shortcut for a game, but I don't have any pictures of the Windows desktop so I can't confirm or deny. Basically, the game would have to support the dual DP outputs running a single display I think, but if a game supports Eyefinity that shouldn't be a problem.Fanfoot - Tuesday, January 17, 2012 - link
Good questions of course.One obvious use of 4K displays would be to allow for passive 3D without sacrificing resolution. So while only half the pixels would go to one eye, you'd still have 1080p resolution (with say line doubling on top of it). Assuming anybody cares about 3D of course.
Another possibility is that displays could do upscaling. So just as we saw EDTVs at the end of the SD life-cycle there could be 4K displays upscaling 1080p content.
Then of course there's games. An updated XBox or PS4 could conceivably drive a higher resolution display. Not clear this will happen of course, but the potential for this increases as the number of years before these consoles get a refresh.
Then of course there's movies on disk. Studios want you to buy Blu-Ray movies and not stream stuff over the internet or watch it via your MSO's VOD offering. So a future 4K Blu-Ray standard could push higher resolution as one way of trying to stave off the eventual move to all digital delivery. Sony for example claims to have more than 60 theatrical releases shot in 4K, and there have been a number of high profile pushes for 4K (James Cameron for example is shooting Avatar 2 in 4K). Sony has promised to work with the Blu-Ray disk assocation to define a new 4K standard and has promised to release the next Spiderman movie in 4K.
How are they going to do that? Well... they can already do 1080p 3D, so all they need to do is something less than double that. And the next codec being developed has a goal of another halving of needed bandwidth. So... Or there's always more layers...
Bit rate for cable or satellite delivery? Well...
Fanfoot - Tuesday, January 17, 2012 - link
Looks like the Joint Video Team is targeting 2013 for the next video codec, currently tagged High Efficiency Video Coding or HEVC. It'll get deployed whether 4K is a reality or not of course, since it'll also allow lowering the bit rate for the same quality, whether for mobile video applications or simply 1080p content streaming over the internet....Fanfoot - Tuesday, January 17, 2012 - link
And it looks like the existing PS3 will be able to display 4K stills. So these TVs will work great for photo-realistic paintings or simply displaying your high-resolution camera images.PubFiction - Wednesday, January 18, 2012 - link
Too me 4k displays are going to be about what used to be eyefinity. Everyone tryies to get small bezels and good monitors. 4k offers you that , it offeres you no bezel. So maybe if they are fast enough gamers will want them in place of 3 monitors.Assimilator87 - Wednesday, January 18, 2012 - link
120Hz.
.
.
4k
.
.
.
OLED
>_<
Finraziel - Wednesday, January 18, 2012 - link
I'm actually wondering why on earth we'd need 4K displays at all? I have a full HD 42" plasma (and love it), but I barely see the difference between 1080 and 720 content. Even when downloading for free, I don't bother going for the 1080 version. Same for the bitrate, why do you need the 40 mbit rate that bluray offers when a 4 mbit file (720p 40 minute episodes are generally around 1.2 GB) looks fine?What I wish the industry would move towards a bit faster, is a higher framerate! Sitting 3 meters away I don't really see more pixels, but I do see chopping when the camera is panning around (even though I have a plasma, I'd probably go nuts if I'd have an LCD with a static backlight). It seems insane to me that with all the improvements to image quality over the last decades we're still stuck at 24 to 30 frames per second...
therealnickdanger - Wednesday, January 18, 2012 - link
That's your problem. 42" is too small to appreciate the detail. I know, I've got a few 1080p displays (17" notebook, 42" LCD, 60" plasma) and none of them compare to my 1080p projector (120"). 4K would be great to have though to more accurately capture the detail inherent to 35mm and 70mm film. 8K would be great too, but that's a ways away yet.We're "stuck" at 24fps because that's how film is shot and has been shot for about 100 years.
Finraziel - Wednesday, January 18, 2012 - link
Well I'm exagerating my point slightly, I don't actually mean that I see no point at all in upping the resolution and obviously on way bigger screens the advantage will be more obvious, I'm just saying that I think that increasing the framerate might be a bigger win for a lot of people. As for being stuck on 24 fps because that's just how it's always been done, well, I guess you still go around with a horse and cariage or take the steamtrain/boat for larger distances? Just because something was done in a certain way for a long time doesn't mean you can't improve it. But I'm glad to see what name99 and B3an are saying below :)name99 - Wednesday, January 18, 2012 - link
You are right about frame rate but there is s small amount of good news on that front. A few Hollywood directors who actually understand tech and are in a position to push the issue (notably James Cameron) are trying to ramp up frame rates.http://www.hollywoodreporter.com/news/james-camero...
Obviously with digital cinemas this is a lot easier to push, but I expect that even if
Avatar2 is shot in 48 or 60 fps, there will be a long long period of crossover. I mean, my god, we're still stuck with interlace on plenty of broadcast TV.
B3an - Wednesday, January 18, 2012 - link
The Hobbit movie is shot in 4k and 48 FPS.sicofante - Tuesday, January 17, 2012 - link
The problem with high-DPI displays for laptops and desktops is none of the main operating systems are designed to handle resolution-independent graphics. Even OSX does it in a tricky way, and it works because they control everything (as usual). Windows or Linux should go the true resolution-independence way (not the tricky OSX way). Then, and only then, maybe, and just maybe, manufacturers would consider enhancing the DPI of their screens and consumer would buy into them. While a user gets tiny text on any display, high-DPI displays can't start showing on ordinary computers. That just doesn't happen on tablets. That's why you get high-DPI displays there.BTW, true resolution independence calls for hardware acceleration, but that shouldn't be an issue on laptops, much less on desktops.
sicofante - Tuesday, January 17, 2012 - link
I meant "NO hires displays for computers while on Windows, OSX or LInux" for the title. Don't understsand why there's no edit button here.LesMoss - Tuesday, January 17, 2012 - link
Not to mention that many web pages break at higher resolutions.JarredWalton - Tuesday, January 17, 2012 - link
They break because the browsers aren't coded to be DPI aware right now. I think a lot of this will get fixed with Windows 8 and Metro apps; we'll find out later this year. Anyway, I'm using a 30" display with 120 dpi setting in Windows 7 and browsing the web hasn't been one of my complaints (though I wish text and Flash would scale rather than being done at a fixed pixel size). I suppose if you define "break" as "are really tiny and hard to read on a high DPI display" then I can agree with you.name99 - Wednesday, January 18, 2012 - link
Bullshit. They break on your crappy browser.Do web pages display fine on iPhone Safari? OK then.
I don't understand why people feel a compulsive need to say something doesn't work when proof that it works has been shipping for almost two years.
Malih - Tuesday, January 17, 2012 - link
does the Yoga 13" have some sort of Thunderbolt port?I wish it does, external GPU is something I look forward to with future Ultrabooks to make my desktop obsolete, since my work doesn't use that much CPU anyway.
JarredWalton - Tuesday, January 17, 2012 - link
I think it does, but I can't find an image to confirm and I might be confusing it with several other laptops.EnzoFX - Tuesday, January 17, 2012 - link
Of course every power user wants it. But the average user doesn't care, and won't care. They're used to $300-$500 laptops, and that's what they will continue to expect. Yes it's become a race to the bottom, but why would that change? I think it's much more likely that it'll remain the same because that's how the market has developed into. I think the real reasons tablets are pushing better displays is because they can't afford not to. They're supposed to have better viewing angles, and supposed to be something you can hold at any angle and distance. This is not the case with laptops. Laptops, IMO, as long as they continue to have the same form factor, will continue to have the same attributes, and the same race to the bottom.Malih - Tuesday, January 17, 2012 - link
I think the main issue is when Ultrabooks (and any laptops above $1000) have poor display, They are nowhere near $500, yet some of them have display resolution and sometimes display quality of a $500 laptop.PubFiction - Wednesday, January 18, 2012 - link
I will also add this. No one wants to add large amounts to the price of a laptop because they know they are going to throw it out soon. Lets leave apple users out of this because they are often less practical.People will buy very nice monitors because they can move it through a couple different desktops or add a second one. But most laptops simply do not have any reasonable option to do something similar.
There are 2 things the laptop companies could have done. They could design modular displays. Have a connection similar to the asus transformer for the display. You buy the display of your choice and are able to upgrade or replace it if damaged. Then people might go for better displays knowing they might have that option.
The other thing is maybe if laptop manufacturers had ever come up with some standards for design it would have helped too, if people had an upgrade path. This is the same reason for the cheapness of laptops. Why build a strong case when it should be replaced every 2 years?
PubFiction - Wednesday, January 18, 2012 - link
Somene of the reasons tablets are bucking the trend is because the internal components of a tablet are much cheaper. So you have something that overal is cheaper and smaller than even a netbook to produce. Second if you look at a tablet what makes it unique? Nothing other than the screen is the answer. They are so basica in design they are all almost the same. That is why they are always trying to push on the only 3 things that most people are ever going to notice. Thin, display, and price.EnzoFX - Wednesday, January 18, 2012 - link
Gotta agree, well said.name99 - Wednesday, January 18, 2012 - link
"Second if you look at a tablet what makes it unique? Nothing other than the screen is the answer."That may be true for Android tablets. iPads come crammed with sensors, and the iPad2 has notably more sophisticated sensing (eg being able to track its 3D orientation) than iPad1.
The real problem is that Android land is utterly devoid of imagination ---- without Apple to copy they would be lost. Let me describe just some obvious addition that could be made to a tablet or phone --- and you tell me which Android (land of variety and choice) vendor has implemented them:
(a) temperature measurements (using a bolometer) for both local temperature and "remote" temperature (eg point the device at my forehead and tell me how hot I am
(b) incorporate a small laser. Now you can use the device as a laser pointer. Or as a plumbline. Or you can fire and detect a laser pulse, and use it for ranging.
(c) sensible speakers. Apple has so far stuck to mono speakers because of the device rotation issue. Some Android vendors have stereo speakers --- which work OK if you're watching a movie and suck if the device is in portrait. INTELLIGENT would be to have four speakers, one in each corner, so that you can rotate the sound to follow the orientation of the device.
Malih - Wednesday, January 18, 2012 - link
That'll be great if people can just bring back their old laptop to upgrade the internal components and still use the old case.At least there's now hope for upgrading a laptop with an external GPU, thanks to Thunderbolt/Lightpeak.
marc1000 - Tuesday, January 17, 2012 - link
any words from AMD about radeon 7870 ????JarredWalton - Tuesday, January 17, 2012 - link
No, but I'm guessing Ryan knows and is under NDA, or that it won't be for at least two months.marc1000 - Tuesday, January 17, 2012 - link
thanks Jarred! it's nice to know anything, even if it is a "not yet"... loli will keep waiting for that card, if it gets only 1 PCIe power connector it will be my next card. if not, I will just wait until this level of performance fits this power envelope.
ty,
maglito - Tuesday, January 17, 2012 - link
I was genuinely excited about ultrabooks too. Crap (sub 1080p resolution) is a deal breaker.I guess I'll start looking more seriously at sticking it out longer on my core2 ULV 11.6" and look towards the 2XXXx15XX resolution tablets.
What a disappointment.
Roland00Address - Tuesday, January 17, 2012 - link
The problem is they only offer this resolution on their signature Sony Zs, so it is about 2.8 to 3k in price.
JarredWalton - Wednesday, January 18, 2012 - link
I believe the base model Z starts at under $2000 and includes the dock, but still, it's an expensive (and beautiful I might add!) display for sure. I had one company suggest that such displays add $700 to the price of a laptop right now, and they might be right. Or they might be trying to make excuses for using crappy displays.
Incidentally, did you know you can buy a 1080p 95% NTSC matte 15.6" panel online for under $150? I'm not sure how a 13.1" display would cost four times as much to make; it's just a matter of getting enough supply and demand.
Roland00Address - Wednesday, January 18, 2012 - link
The base Sony Z is 1600x900. You can find the base Sony Z for 1.8 to 2k.
To get the 1920x1080 Sony Z you need to upgrade to their "signature" models, which cost 2.8 to 3k.
JarredWalton - Wednesday, January 18, 2012 - link
Oh, you're right... forgot about that. Sony pricing is as "good" as Apple! :-)
mckirkus - Tuesday, January 17, 2012 - link
4K makes sense for movies if they're shown on a giant screen at a theater, not at home on a 50'' screen 12 feet away from the viewer.
I'm a lot more excited to see OLED displays. We need to refocus on color gamut, contrast, and refresh rates, not more pixels nobody can see. On a tablet two feet from your face, higher resolutions matter.
As for YouTube, 4k is a joke because the bit rates aren't high enough to take advantage of the resolution.
Fanfoot - Tuesday, January 17, 2012 - link
Sorry, but I don't agree. If you think the TVs we've got now are as big as they're ever going to get, you're wrong. I sit some 8-10 feet from my 65" TV and it actually seems quite small. The number of degrees of arc isn't actually that great. If you want something that seems more like the experience of watching in a theater, TVs need to get MUCH bigger.
Already last year we saw that 80" Sharp LCD TV at $4999, way below anything we'd seen at that size previously. And with LCD TV manufacturers seeing a glut in production and prices crashing below $1000, you can't really blame them for looking forward to even larger TVs.
A "wall size" unit is still a long ways off.
mckirkus - Tuesday, January 17, 2012 - link
The TV size sweet spot right now is 46 inches, even for high-end TVs where price isn't a concern. A lot of people apparently find massive TVs kind of tacky (have a look in a home design magazine). That may not apply to you, but it means 80'' TVs will remain a niche market, regardless of how cheap they get. Which means 4K will remain a niche technology, which means it will remain expensive.
Blu-ray needs 40Mbits a second to drive 1080p. If Blu-ray is the last physical media before we're all streaming, and 4K is 4x the resolution of 1080p, then we need 150-200Mbps internet connections before this is even feasible.
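To show where that last figure comes from, here is a quick back-of-envelope sketch. It takes the ~40 Mbps 1080p Blu-ray number above as given and naively scales bitrate with pixel count, which real encoders won't do exactly:

    # Naive 4K bandwidth estimate: scale a 1080p bitrate by the pixel-count ratio.
    bluray_1080p_mbps = 40              # the figure above for 1080p Blu-ray video
    pixels_1080p = 1920 * 1080          # ~2.07 million pixels
    pixels_4k = 3840 * 2160             # ~8.29 million pixels
    est_4k_mbps = bluray_1080p_mbps * pixels_4k / pixels_1080p
    print(f"Naive 4K estimate: ~{est_4k_mbps:.0f} Mbps")   # ~160 Mbps, before any overhead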
EyelessBlond - Wednesday, January 18, 2012 - link
You know where gigantic displays will eventually find a home in the, er, home? Windows. And no, I don't mean the operating system:
http://www.theverge.com/2012/1/13/2705599/samsung-...
The future is a large picture window being replaced by a TV that can switch from window to giant display with a push of a button. The other alternative is advancements in flexible displays that will allow very large TVs to roll up into the ceiling, but that kind of already exists for projectors and nobody uses them, so it's not likely to be very big in the future either.
speculatrix - Wednesday, January 18, 2012 - link
Given that the majority of people can't tell 720 from 1080 on their TVs as it is - it's simply not possible for the human eye to resolve the detail at their sitting distance - I think 2K and above will not catch on except for professional installations or the richest early adopters.
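To put rough numbers on that claim, here's a minimal sketch. The ~1 arcminute acuity rule of thumb, the 50" 16:9 panel size, and the 10-foot viewing distance are all illustrative assumptions, not measurements from anyone's living room:

    # Back-of-envelope: angular size of one pixel on a 50" 16:9 TV at 10 feet,
    # versus the ~1 arcminute of detail a 20/20 eye can typically resolve.
    import math

    def pixel_arcminutes(diagonal_in, rows, distance_in):
        height_in = diagonal_in * 9 / math.hypot(16, 9)   # panel height for 16:9
        pixel_in = height_in / rows                       # height of a single pixel
        return math.degrees(math.atan2(pixel_in, distance_in)) * 60

    for rows in (720, 1080, 2160):
        arcmin = pixel_arcminutes(50, rows, 10 * 12)      # 50" panel, 120 inches away
        verdict = "resolvable" if arcmin >= 1.0 else "below typical acuity"
        print(f"{rows}p: {arcmin:.2f} arcmin per pixel -> {verdict}")
    # 720p lands right at the ~1 arcmin threshold here; 1080p and above fall below it.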
cheinonen - Wednesday, January 18, 2012 - link
No matter how many numbers people throw out on what resolution makes a difference, if you looked at the 8K set that Sharp had on display (never coming out, of course) you'd see a huge difference from a 2K or 4K set. However, I'd still take the OLED or CrystalLED sets, for their better viewing angles, contrast ratios, motion, and black levels, over the extra resolution. So yes, you can see the difference in resolution, but bandwidth concerns mean we won't get to see it for a long time.
Fanfoot - Tuesday, January 17, 2012 - link
I don't know why I'd want the print on my monitor to look better. I can read it perfectly fine as it is. In fact I don't know why all those printer manufacturers don't stop making printers at 600dpi or even 1200dpi. That's stupid. Who needs anything that readable? Everything should top out at 150dpi or so. Anybody who suggests otherwise is being unreasonable.
pixelstuff - Tuesday, January 17, 2012 - link
Printers use the higher resolution to get better gradations and color blending. Apparently you've never seen printouts from the early 360dpi printers from the early '90s. 720dpi, 600dpi, 1200dpi, 1440dpi, and 4800dpi printers have gotten noticeably better with each upgrade in resolution from those early versions.
Perhaps some people can see finer resolution than others.
EnerJi - Tuesday, January 17, 2012 - link
I think you missed the sarcasm...
B3an - Wednesday, January 18, 2012 - link
Pretty obvious it was a sarcastic comment.
adonn78 - Tuesday, January 17, 2012 - link
I would love to know the aspect ratio of the 4K displays. Also, were there any curved displays? I remember the Ostendo CRVD that was awesome back at CES 2009. They have not updated the display, have they? Would love to see a high-res curved display for gaming rather than a bunch of panels with their bezels.
pixelstuff - Tuesday, January 17, 2012 - link
I wrote Ostendo Technologies about the CRVD last September asking them about a higher resolution version, something like 3840x1200 (two 1920x1200 monitors), because 900px high is practically useless on a desktop computer.
They said they were out of stock on the current model and had no plans to build more. However, my request for a higher-res version had been noted, and I could sign up for their newsletter to hear about future announcements.
Maybe if enough people email them about a high res version they will try again with something more useful.
JarredWalton - Wednesday, January 18, 2012 - link
As I mentioned in the text, the 4K display I saw ran at 4096x2160, so it's a 1.896 AR (compared to 1.778 for 16:9 and 1.6 for 16:10). I've also seen some info saying we'll see 4096x2304 4K displays (16:9), and I neglected to get a picture of the resolution but I swear there was at least one 4K display that I saw that had a >2.0 AR.
jjj - Wednesday, January 18, 2012 - link
I hate TN panels, so yay for non-TN, but I view the resolution game in phones and tablets as mostly marketing BS, and it creates additional problems. Prices go up (and this is why the industry likes the idea), gaming gets slower, and battery life is lower (or, for tablets, you shove in a bigger battery and then the price goes up some more). Is it really worth it? 4K TVs, sure, but I would rather see prices for 30" monitors come down a lot.
Thin laptops with the CPU perf of systems costing half as much, GPU perf lower than terrible, poor battery life... no thanks. Funny how Intel tries to do what they already did years ago with the ULV line, except this time they added some more shine and doubled the price. Touch screens on laptops: folks should realize how much that will add to the retail price before getting too excited.
piroroadkill - Wednesday, January 18, 2012 - link
No, it's basically not worth it on phones.
I have an 800x480 4.3" display, and it looks more than fine to me. I also used an iPhone 4 and 4S with their 960x640 3.5" display. I'd take the 800x480 at 4.3" any day. It doesn't look any better to me at this tiny size, and I'd rather have the larger screen.
Finraziel - Wednesday, January 18, 2012 - link
I'm using the Samsung Galaxy Note now (is it released stateside? No idea...), which has a 5.3" 1280x800 display, and I never want to go back. This resolution means I can view just about any website in landscape without zooming in, and it's big and sharp enough for me to read ebooks in PDF form (such as programming books). So for me, yes, the increase in resolution is definitely worth it :)
R3MF - Wednesday, January 18, 2012 - link
" while I’m not sure if all 4K displays will use the same resolution, this particular panel was running at 4096x2160, so it’s even wider than the current 16:9 aspect ratio panels (and closer to cinema resolutions)"I can't help feeling they would have done better to just make 4K a straight 2:1 aspect ratio, and kept the resolution at 4096x2048!
MamiyaOtaru - Wednesday, January 18, 2012 - link
Gah, I have been dreaming of a QXGA LCD for my desktop for ages! I'm stuck at 1600x1200 and hating this new widescreen stuff.
iSayuSay - Wednesday, January 18, 2012 - link
Technology is always about patience. If you're just patient enough to wait for it to mature and become more common, you can jump in for much more reasonable $$$.
Hard lesson learned when I bought a launch-day PS3 back in 2006. It was a friggin' $650; I couldn't really afford to buy one, but I "had to". It was very, very expensive for me.
And just within 13 months of that I got the YLOD for the first time in my life! My super expensive launch-day PS3 was unrepairable, no longer under warranty, and I was just frustrated.
I paid Sony $650 just to be a lousy beta tester! Now you can buy a $300 PS3 with no YLOD, more energy efficient, with better cooling and chips.
I said no more!! No more wasting my hard-earned money on expensive beta-phase technology.
So I would wait for 4K displays to mature enough, with adequate and affordable content, pricing, and availability.
Even now, how much 1080p content is actually available? Is it as widespread as standard DVD? I know many people who own an LED HDTV but don't care enough to notice that they still watch sub-720p content.
bhima - Wednesday, January 18, 2012 - link
Most poignant point I've read all day: "What really irks me is that all of this comes in a 10.1” IPS package, exactly what I’ve been asking for in laptops for the past several years." AMEN brother. I bought a Sager because it actually offers a good 95% color gamut screen. As a designer, I really have a small window of choice because these manufacturers don't offer a decent monitor yet every tablet seems to push out great displays.
The 4K TV thing is great, but in terms of gaming there will be a huge technology gap once the 4K monitors are priced reasonably. I just don't think our GPUs will be able to push those kinds of pixels in the next 3 years except for maybe the $500 beast cards or SLI/Crossfire.
bhima - Wednesday, January 18, 2012 - link
Needs an EDIT button:
The Lenovo Yoga is ALMOST perfect. If it just had a Wacom digitizer in it, it would be worth a lot of money to me and other graphic designers. Hell, it could force even Mac-heads to consider converting to Microsoft.
chizow - Wednesday, January 18, 2012 - link
Idk, at the rate GPUs progress, next-gen single GPUs tend to perform similarly to last-gen's top-end SLI configs. It's not quite 100%, probably closer to a 50% increase over 2 years, but it's very possible we'll see a single GPU push 4K/2K without much trouble while maintaining high framerates (~60FPS).
4K/2K is roughly 4x the resolution of 1080p, very close to 2x2 supersampling over the same screen area. We already have mid to high-end configurations pushing 3x1080p with surround configurations from both vendors with relative ease, and even 3D at those same resolutions, which is an effective 6x1080p.
4K/2K wouldn't be too much additional burden, I think even a high-end single card would be able to push that resolution without too much trouble, and today's high-end card is tomorrow's mid-range card.
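To make the pixel math above concrete, a quick sketch; it assumes "4K" means 3840x2160 (4096x2160 changes the numbers only slightly):

    # Pixel counts behind the comparison above.
    pixels_1080p = 1920 * 1080                # ~2.1 MP
    pixels_4k = 3840 * 2160                   # ~8.3 MP
    pixels_surround = 3 * pixels_1080p        # 3x1080p Eyefinity/Surround
    pixels_surround_3d = 2 * pixels_surround  # stereo 3D renders each eye separately

    print(pixels_4k / pixels_1080p)        # 4.0   -> 4K is 4x a single 1080p screen
    print(pixels_4k / pixels_surround)     # ~1.33 -> only ~33% more than triple-1080p
    print(pixels_4k / pixels_surround_3d)  # ~0.67 -> less than 3D triple-1080p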
JarredWalton - Wednesday, January 18, 2012 - link
I'd just look at our triple-head desktop benchmarks if you want to know roughly how well current GPU setups can handle 4K displays for gaming. Basically, something like HD 7970 CF or GTX 580 SLI will certainly handle it, though you'll want the 2GB/3GB models. Now it's just waiting for the price to come down--on the GPUs as well as the displays.
cheinonen - Wednesday, January 18, 2012 - link
4K support for studios varies a bit based on which studio it is. Sony, for example, has their Colorworks processing facility on their lot, and all of their work is done at 4K at a minimum, with some work done at 8K resolution. As 4K displays start to come into play you will likely see a transition to 8K workflows at studios, which lets them downsample CGI and other effects to 4K for release. If you search at Sony's website you can find a list of 4K theater releases and theaters that support them. The majority of digital cinema is still presented in HD format, though Blu-ray releases likely use a slightly different master, since the colorspace and gamma targets for cinema releases are much different from the Rec 709 target for Blu-ray.
100 GB for 4K is really pushing the limits of the medium, and not likely to be what would be used. Most Blu-ray titles clock in around the 30-40 GB range for the film, so given 4x the resolution you are looking at 120-160 GB for a film of similar quality. Another current issue with 4K in the home is that HDMI 1.4a is, I believe, limited to 24p at 4K resolution, so while films can work fine, no TV or video content will be able to be upscaled outside of the display or projector. I imagine before 4K really catches on we will have another update to HDMI, and a new media format, as internet connections are obviously not fast enough for 4K streaming or downloads yet.
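A rough sketch of the bandwidth side of that HDMI limitation; it assumes 24-bit color, ignores blanking intervals, and takes roughly 8.16 Gbps as the usable video rate of an HDMI 1.4-class link (10.2 Gbps raw minus 8b/10b coding overhead), so treat it as a back-of-envelope estimate rather than the spec's exact mode list:

    # Rough uncompressed-video bandwidth versus an HDMI 1.4-class link.
    HDMI_14_USABLE_GBPS = 8.16

    def video_gbps(width, height, bits_per_pixel, fps):
        return width * height * bits_per_pixel * fps / 1e9

    for fps in (24, 30, 60):
        need = video_gbps(3840, 2160, 24, fps)
        fits = "fits" if need <= HDMI_14_USABLE_GBPS else "does not fit"
        print(f"3840x2160 @ {fps}Hz: ~{need:.1f} Gbps -> {fits}")
    # 24/30 fps squeeze through; 60 fps does not, hence the film-only limitation above.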
name99 - Wednesday, January 18, 2012 - link
"Most Blu-ray titles clock in around the 30-40 GB range for the film, so given 4x the resolution you are looking at 120-160 GB for a film of similar quality. "Not even close.
The higher you push the resolution, the higher the correlation between adjacent pixels, meaning that the material compresses better for the same level of visual quality.
Heck, even if you compressed a 4K movie into the SAME 30-40GB you'd get better results. The compressor will throw away information intelligently, whereas you can view an existing BR movie as consisting of an initial dumb transfer stage (downsampling the 4K to 1080p) followed by compression.
(This assumes, of course, that the compressor is actually intelligent and, for example, scales the search domain upward to cover the equivalent area in the 4K movie as in the 1080p movie. This is an obvious point, yet I've yet to see a compressor that does it properly by default, so you will have to tweak the settings to get the improvement. But I am correct in this --- with a smart compressor, 4K at 40GB will look better than 1080p at 40GB.)
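One way to see the size of the effect being argued about: the per-pixel bit budget at a fixed file size, sketched below. The two-hour runtime, 24 fps, and 40 GB figures are illustrative assumptions, not numbers from any specific disc:

    # Per-pixel bit budget for a two-hour film squeezed into a 40 GB container.
    def bits_per_pixel(file_gb, hours, width, height, fps):
        total_bits = file_gb * 8e9
        total_pixels = width * height * fps * hours * 3600
        return total_bits / total_pixels

    for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
        print(f"{name}: {bits_per_pixel(40, 2, w, h, 24):.2f} bits per pixel")
    # 1080p: ~0.89 bpp, 4K: ~0.22 bpp. The 4K encode gets a quarter of the bits per
    # pixel, but those pixels are far more correlated, so a good encoder spends the
    # budget more efficiently than a blind 4x scaling of the file size assumes.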
legoman666 - Wednesday, January 18, 2012 - link
I was unable to find a good current-gen laptop with a 1920x1080 or 1920x1200 display for less than $1000. I eventually settled on a 3-4 year old refurbed IBM T61p with a beautiful WUXGA 1920x1200 15.4" display.
I looked up the part number for the panel; a replacement is less than $100 from eBay: http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&... I don't know what laptop manufacturers are doing these days that prevents them from putting a decent panel in their laptops, but they need to quit it. 1366x768 @ 15"? Are you joking? My 3-year-old Dell netbook had the same resolution in a 10" form factor for $350.
I'm glad I'm not the only one who's disappointed.
Mithan - Wednesday, January 18, 2012 - link
I have not bought a tablet yet because I want at least a 1080p display on it, and yes, this is for surfing.
As for laptops, my last laptop was an MSI GX640, and I specifically hunted that down because it has a good video card in it AND, most importantly, a 1600x1080 display.
I had bought some Acer with a 768p display but it went back. What crap.
lbeyak - Wednesday, January 18, 2012 - link
For those of you who think that these low-quality displays are good enough...
I'm not going to leave you alone. I want you to get mad! I want you to get up now. I want all of you to get up out of your chairs. I want you to get up right now, and go to the window, open it, and stick your head out and yell "I'M AS MAD AS HELL, AND I'M NOT GOING TO TAKE THIS ANYMORE!"
inperfectdarkness - Thursday, January 19, 2012 - link
it's my money, and i need it now!!!
name99 - Wednesday, January 18, 2012 - link
You might as well just give up.
As far as I can tell most tech blog commenters
(a) are completely unaware that something like Moore's law exists
(b) think everything right now is absolutely perfect and should never be changed.
Thunderbolt. Higher res iPad screens. 4K TVs. Windows File System Improvements. Cell phone system improvements. WiFi improvements.
Doesn't matter what it is, there's a chorus telling you that it will always cost too much, that it probably won't work well, and no-one will care about the improvement.
My god --- who would have thought that the hangout spot for 21st century Luddites would be the comments section of sites like Ars Technica and AnandTech?
JarredWalton - Wednesday, January 18, 2012 - link
The problem is that we're not even getting 1080p on a lot of laptops. I want 1080p pretty much as an option on anything 13" and up. Right now it's really only there on 15.6" and 17.3" displays (and of course, Sony has one 13.1" panel on the VAIO Z).
inperfectdarkness - Thursday, January 19, 2012 - link
i've been lamenting the watering-down of laptop screens for years. i'm bloody sick of 1080p screens. 1080p would be fine... if i had an 11" netbook. i LIKE gaming at 16:10--and i like doing it on a 15.4" screen. and for what it's worth, 17" laptops should be offering wqxga at a MINIMUM. ~300dpi should be the standard--regardless of display size.
of course, part of the problem with the stagnation is probably that the xbox 360 has diluted consumer expectations for gaming--which has resulted in games in which practically the entire library of new GPUs will run at 1080p at acceptable framerates. that kind of versatility didn't even exist at 1024x768 resolutions 10 years ago. the ti4600 i had back then was the ONLY card which even had a prayer of running at 1600x1200. so rather than software improving (and keeping hardware choking on rendering), we've plateaued & hardware has caught up big-time. that's also why 10 years ago mobile gaming left a LOT to be desired. today, (at 1080p) you can't tell the difference in many cases.
here's to hoping that my 15" form-factor laptop experience will soon offer 4MP gaming at a reliable 60fps. it's been long overdue. i'm still baffled by the reverse-progress in laptop displays. you don't see people rejecting 90's cars in order to drive 80's cars. doesn't make sense why we'd do the same with laptops.