HTPC enthusiasts are often concerned about the quality of the pictures output by the system. While this is a very subjective metric, we have been taking as objective an approach as possible, using the HQV 2.0 benchmark in our HTPC reviews to identify GPUs' video post processing capabilities. The HQV benchmarking procedure has been heavily promoted by AMD, and Intel also seems to be putting its weight behind it.

The control panel for the Ivy Bridge GPU has a number of interesting video post processing controls which earlier drivers lacked. The most interesting of these is the ability to perform noise reduction on a per-channel basis, i.e., only on luma, or on both luma and chroma. More options are always good for consumers, and the interface makes it simple to leave the decision making to the drivers or the application. An explicit skin tone correction option is also available.

HQV scores need to be taken with a grain of salt. In particular, one must check the tests where the GPU lost points. If those tests don't reflect the reader's usage scenario, the handicap can probably be ignored. It is therefore essential to compare the scores for each test rather than just the totals.

The HQV 2.0 test suite consists of 39 different streams divided into 4 classes. For the Ivy Bridge HTPC, we used CyberLink PowerDVD 12 with TrueTheater disabled and hardware acceleration enabled to play back the HQV streams. The playback device was assigned a score for each stream depending on how well it was rendered. Each test was repeated multiple times to ensure that the correct score was assigned. The scoring details are available in the testing guide from HQV.

Blu-rays are usually mastered very carefully. Any video post processing (other than deinterlacing) that is needed is handled before the master is burned to disc. In this context, we don't think it is a great idea to run the HQV benchmark videos off the disc. Instead, we play the streams after copying them over to the hard disk. How do the scores compare to what the Sandy Bridge and Llano obtained at launch?

In the table below, we indicate the maximum possible score for each test and the score each GPU achieved. The HD3000 results are from the Core i5-2520M with the Intel drivers. The AMD 6550D was tested with Catalyst 11.6 (driver version 8.862 RC1), and the HD4000 with driver version

HQV 2.0 Benchmark

| Test Class | Chapter | Test | Max. Score | Intel HD3000 | AMD 6550D (Local file) | Intel HD4000 |
|---|---|---|---|---|---|---|
| Video Conversion | Video Resolution | Dial | 5 | 5 | 4 | 5 |
| | | Dial with Static Pattern | 5 | 5 | 5 | 5 |
| | | Gray Bars | 5 | 5 | 5 | 5 |
| | | Violin | 5 | 5 | 5 | 5 |
| | Film Resolution | Stadium 2:2 | 5 | 5 | 5 | 5 |
| | | Stadium 3:2 | 5 | 5 | 5 | 5 |
| | Overlay On Film | Horizontal Text Scroll | 5 | 3 | 5 | 3 |
| | | Vertical Text Scroll | 5 | 5 | 5 | 5 |
| | Cadence Response Time | Transition to 3:2 Lock | 5 | 5 | 5 | 5 |
| | | Transition to 2:2 Lock | 5 | 5 | 5 | 5 |
| | Multi-Cadence | 2:2:2:4 24 FPS DVCam Video | 5 | 5 | 5 | 5 |
| | | 2:3:3:2 24 FPS DVCam Video | 5 | 5 | 5 | 5 |
| | | 3:2:3:2:2 24 FPS Vari-Speed | 5 | 5 | 5 | 5 |
| | | 5:5 12 FPS Animation | 5 | 5 | 5 | 5 |
| | | 6:4 12 FPS Animation | 5 | 5 | 5 | 5 |
| | | 8:7 8 FPS Animation | 5 | 5 | 5 | 5 |
| | Color Upsampling Errors | Interlace Chroma Problem (ICP) | 5 | 2 | 2 | 5 |
| | | Chroma Upsampling Error (CUE) | 5 | 2 | 2 | 5 |
| Noise and Artifact Reduction | Random Noise | SailBoat | 5 | 5 | 5 | 5 |
| | | Flower | 5 | 5 | 5 | 5 |
| | | Sunrise | 5 | 5 | 5 | 5 |
| | | Harbour Night | 5 | 5 | 5 | 5 |
| | Compression Artifacts | Scrolling Text | 5 | 3 | 3 | 5 |
| | | Roller Coaster | 5 | 3 | 3 | 5 |
| | | Ferris Wheel | 5 | 3 | 3 | 5 |
| | | Bridge Traffic | 5 | 3 | 3 | 5 |
| | Upscaled Compression Artifacts | Text Pattern | 5 | 3 | 3 | 3 |
| | | Roller Coaster | 5 | 3 | 3 | 3 |
| | | Ferris Wheel | 5 | 3 | 3 | 3 |
| | | Bridge Traffic | 5 | 3 | 3 | 3 |
| Image Scaling and Enhancements | Scaling and Filtering | Luminance Frequency Bands | 5 | 5 | 5 | 5 |
| | | Chrominance Frequency Bands | 5 | 5 | 5 | 5 |
| | | Vanishing Text | 5 | 5 | 5 | 5 |
| | Resolution Enhancement | Brook, Mountain, Flower, Hair, Wood | 15 | 15 | 15 | 15 |
| Video Conversion | Contrast Enhancement | Theme Park | 5 | 5 | 5 | 5 |
| | | Driftwood | 5 | 5 | 5 | 5 |
| | | Beach at Dusk | 5 | 2 | 5 | 5 |
| | | White and Black Cats | 5 | 5 | 5 | 5 |
| | Skin Tone Correction | Skin Tones | 10 | 0 | 7 | 7 |
| **Total Score** | | | **210** | **173** | **184** | **197** |
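Since we recommend looking at the per-test scores rather than just the totals, a short script can rebuild each GPU's total from its individual deductions (a sketch: the short test labels in the list are ours, and all tests not listed were perfect scores for all three GPUs):

```python
# Sanity-check the HQV 2.0 totals: start from the 210-point maximum and
# subtract the points each GPU dropped on its imperfect tests. Only tests
# where at least one GPU lost points are listed; every other test was a
# perfect score for all three GPUs.

MAX_TOTAL = 210

# (test, max, Intel HD3000, AMD 6550D, Intel HD4000) -- from the table above
imperfect = [
    ("Video Resolution: Dial",          5, 5, 4, 5),
    ("Horizontal Text Scroll",          5, 3, 5, 3),
    ("Interlace Chroma Problem (ICP)",  5, 2, 2, 5),
    ("Chroma Upsampling Error (CUE)",   5, 2, 2, 5),
    ("Compression: Scrolling Text",     5, 3, 3, 5),
    ("Compression: Roller Coaster",     5, 3, 3, 5),
    ("Compression: Ferris Wheel",       5, 3, 3, 5),
    ("Compression: Bridge Traffic",     5, 3, 3, 5),
    ("Upscaled: Text Pattern",          5, 3, 3, 3),
    ("Upscaled: Roller Coaster",        5, 3, 3, 3),
    ("Upscaled: Ferris Wheel",          5, 3, 3, 3),
    ("Upscaled: Bridge Traffic",        5, 3, 3, 3),
    ("Contrast: Beach at Dusk",         5, 2, 5, 5),
    ("Skin Tones",                     10, 0, 7, 7),
]

totals = {}
for idx, gpu in enumerate(("Intel HD3000", "AMD 6550D", "Intel HD4000")):
    lost = sum(row[1] - row[2 + idx] for row in imperfect)
    totals[gpu] = MAX_TOTAL - lost

print(totals)  # {'Intel HD3000': 173, 'AMD 6550D': 184, 'Intel HD4000': 197}
```

The per-test breakdown makes the story clearer than the totals: the HD4000's remaining deductions are confined to upscaled compression artifacts, overlay text scrolling, and skin tone correction.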

A look at the above table reveals that Intel has caught up with the competition in terms of HQV scores. In fact, it has comfortably surpassed what the Llano achieved at launch. Many of the driver problems plaguing AMD's GPUs had still not been fixed when we looked at the AMD 7750 a couple of months back, so it is likely that the Llano's scores have not budged much from those above. The HD4000's score of 197 also ties what we obtained for the 6570 in our discrete HTPC GPU shootout.

Comments

  • MGSsancho - Monday, April 23, 2012 - link

    While I agree with most everything, there is something I would like to nitpick: when making a digital copy of old film, in whatever format you use, more often than not a lot of touching up needs to be done. The Wizard of Oz and all the 007 films are examples. (I am ignoring the remastering of Star Wars and Lucas deciding to add in 'features' versus giving us a cleaned-up remaster sans bonuses.) Still, when you're spending millions on a remaster, I expect them at least not to muddy the entire thing up.

    However, I feel we need higher bitrates first. I will not apologize for this: yes, encoders are great, but a 4 Mbps 1080p stream still is not as good as a 20-60 Mbps VBR Blu-ray film. My feeling is that a craptastic 4K or even 2K bitrate will ruin the experience for the uninformed. Also notice I am ignoring an entirely different debate over whether the current infrastructure can handle true HD streaming to every household, at least in the US.
  • nathanddrews - Monday, April 23, 2012 - link

    Higher bit rates will be inherent with 4K or 2K over 1080p, but bit rates aren't the be-all and end-all. 4K will likely use HEVC (H.265), which offers double the compression with better quality than H.264.

    Fixing scratches, tears, or other issues with film elements should never be a reason for mass application of filtering.
  • SlyNine - Tuesday, April 24, 2012 - link

    H.264 doesn't even offer 2x the compression over MPEG-2. I doubt H.265 offers 2x over H.264.

    "This means that the HEVC codec can achieve the same quality as H.264 with a bitrate saving of around 39-44%."

  • Casper42 - Monday, April 23, 2012 - link

    I LOL'd at "Walmart Black Friday" Nathan :)

    And for the OP, 32", really?
    It's completely understandable that you don't see the difference on a screen that size.
    Step up to a 60" screen and then go compare 720p to 1080p. (Who uses 1080i anymore? Oh, that's right, crappy 32" LCDs. Don't get me wrong, I own two, but they go in the bedroom and my office, not my family room.)

    I think 60" +/- 5" is pretty much the norm nowadays for the average middle-class family's main movie-watching TV.
  • anirudhs - Monday, April 23, 2012 - link

    Cable TV maxes out at 1080i (I have Time Warner). My TV can do 1080p.
  • nathanddrews - Monday, April 23, 2012 - link

    1080i @ 60 fields per second, when deinterlaced, is the same as 1080p @ 30 frames per second. The picture quality is almost entirely dependent upon your display's ability to deinterlace. However, cable TV is generally of a lower bit rate than OTA or satellite.
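The equivalence claimed here is easy to verify with raw pixel-rate arithmetic (ignoring deinterlacer quality, which is the real variable):

```python
# 1080i60 delivers 60 half-height (1920x540) fields per second, while
# 1080p30 delivers 30 full (1920x1080) frames per second. The raw pixel
# throughput is identical -- which is why a good deinterlacer can
# reconstruct 1080p30 from 1080i60 content.

pixels_1080i60 = 1920 * 540 * 60   # fields per second x pixels per field
pixels_1080p30 = 1920 * 1080 * 30  # frames per second x pixels per frame

print(pixels_1080i60, pixels_1080p30)  # both 62,208,000 pixels/s
```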
  • SlyNine - Tuesday, April 24, 2012 - link

    Yeah, but because of shimmering effects, progressive images almost always look better.

    If the video has a 2:2 or 3:2 cadence, many TVs can now rebuild the fields into a progressive image.
  • Exodite - Tuesday, April 24, 2012 - link

    In the US, possibly, but I dare say 55-60" TVs are far from the norm everywhere.
  • peterfares - Thursday, September 27, 2012 - link

    2560x 27" and 30" monitors are NOT very pixel dense. The 27" is slightly more dense (~12.5%) than a standard display, but the 30" is only about 4% more dense than a standard display.

    A 1920x1080 13.3" display is 71.88% more dense than a standard display.
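These density figures can be reproduced with basic PPI arithmetic. The panel resolutions (2560x1440 at 27", 2560x1600 at 30") and the ~96 PPI "standard display" baseline are our assumptions, and they land close to, though not exactly on, the percentages quoted above:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a panel given its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

BASELINE_PPI = 96.0  # assumed "standard display" density

for w, h, d in [(2560, 1440, 27.0), (2560, 1600, 30.0), (1920, 1080, 13.3)]:
    density = ppi(w, h, d)
    print(f'{d:.1f}" {w}x{h}: {density:.1f} PPI, '
          f"{density / BASELINE_PPI - 1:+.1%} vs. baseline")
```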
  • dcaxax - Tuesday, April 24, 2012 - link

    On a 32" you will certainly not see a difference between 720p and 1080p; it is barely visible on a 40". Once you go to 52"+, however, the difference becomes visible.

    On a 61" screen as you suggest the difference will be quite visible.

    Having said that, I am still very happy with the quality of properly mastered DVDs, which are only 576p, on my 47" TV.

    It's not that I can't tell the difference; it's just that it doesn't matter to me that much, which is why I also don't bother with madVR and all that, and just stick to Windows Media Center for my HTPC.

    Everyone's priorities are different.
