:idea: Why would the image or audio quality vary? :idea:
http://www.reghardware.co.uk/2008/11/21 … /comments/

:canthearyou: These players transfer video and audio data over HDMI, which is digital. So how can the picture or sound quality be different between one player and another? :canthearyou:

Teh Cnology 
Anonymous Coward • Friday 21st November 2008 10:19 GMT

Now call me a liar, but please base that accusation on solid ground...

- Blu-Ray is an "industry standard" way to store digitally encoded video.

- All players have to conform to this standard to carry the Blu-Ray logo.

- There cannot be anything different at the first stage, when decoding video - every player has to decode exactly the same way, otherwise the output is garbage - or differs greatly from what the editor of the disc intended.

- Digital input on a digital television is standardized too, so there cannot be anything different between what the players output to the television - it has to follow the standards. (OK, I know there are different resolutions and interlaced/non-interlaced, but let's assume every player is set to output exactly the same resolution, and that the test-setup television actually handles the signal digitally from input to output, without some analog stage or upscale/filter software active inside the telly.)

El Reg finds some difference between players and recommends stand-alone players because they output a better picture. May I ask the editor how you came to the conclusion that there might be something different in output quality between players :) ? Measured by your eyes after 3 beers, or actually using some scientific instrument?

As you know, in the digital world there are no analog conversions; the decoded pixels must run "as-is" from the decoded video to the LCD cell on the telly.

Conclusion -> The only thing that can make ANY difference would be some sort of scaling or filtering effect built into the player's decoding software. But these are hardly standard, and the output will differ from what the editor of the Blu-Ray Disc intended, even if it might please the eye.

The only other reason would be that the player is not following the standards and outputs crap - in which case the auditor who granted the right to use the Blu-Ray logo failed.

A way to actually measure output image quality would be to construct a decoder device that conforms 100% to the standard, record the digital pixel output from every player, and compare the recorded outputs bit by bit; if there is any difference between the signals, it is non-standard and should be burned with fire.
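For illustration only, here is a minimal Python sketch of that bit-by-bit check; it assumes the digital output of each player has already been captured into raw frame dumps of identical format and resolution, and the file names are hypothetical placeholders:

# Compare two captured output dumps bit-for-bit.
# Assumes both captures are raw files in the same format; file names are hypothetical.

def captures_match(path_a: str, path_b: str, chunk_size: int = 1 << 20) -> bool:
    """Return True only if the two captures are byte-identical."""
    with open(path_a, "rb") as a, open(path_b, "rb") as b:
        while True:
            chunk_a = a.read(chunk_size)
            chunk_b = b.read(chunk_size)
            if chunk_a != chunk_b:
                return False          # players produced different pixels
            if not chunk_a:           # both files exhausted at the same point
                return True

if __name__ == "__main__":
    same = captures_match("reference_decoder.yuv", "player_under_test.yuv")
    print("bit-identical" if same else "outputs differ -> non-standard or post-processed")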

My opinion is that the Playstation software could be updated at any time to include the same crappy filters that "enhance" the picture as any other player. So I hereby declare the Playstation the winner :)

...let the flames burn this witch. Apologies for length, girth, etc.

:flag: Picture Quality  :flag:
John • Friday 21st November 2008 11:19 GMT

Blimey, you think all decoders work exactly the same way? The quality of decoders varies enormously.

Some chips will cut certain corners, other chips will apply additional post-processing to positively enhance the picture. HDMI ensures the output from the player is carried to the TV without degradation; it doesn't make all the players produce the same output in the first place. Software decoders (such as on the PC and the PS3) don't have the luxury of dedicated hardware and thus very often have to cut corners.
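As a rough sketch (not any particular player's actual algorithm), a few lines of Python with numpy show how a post-processing stage can legitimately change the bits a player sends over HDMI even though the decode itself was conformant:

import numpy as np

def sharpen_frame(frame: np.ndarray) -> np.ndarray:
    """Toy 'picture enhancement': unsharp-mask a decoded 8-bit luma plane.

    Real players do something far more sophisticated, but the point is the
    same - the pixels leaving the player are no longer the pixels the
    decoder produced.
    """
    blurred = frame.astype(np.float32)
    # crude 3x3-ish box blur via shifted averages
    blurred = (np.roll(blurred, 1, 0) + np.roll(blurred, -1, 0) +
               np.roll(blurred, 1, 1) + np.roll(blurred, -1, 1) +
               blurred) / 5.0
    sharpened = frame + 0.5 * (frame - blurred)   # boost edges slightly
    return np.clip(sharpened, 0, 255).astype(np.uint8)

decoded = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)  # stand-in frame
output = sharpen_frame(decoded)
print("pixels changed by post-processing:", int(np.count_nonzero(output != decoded)))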

:flag: Image quality *does* vary  :flag:
Peter Kay • Friday 21st November 2008 11:54 GMT

Image quality does vary, and it's very easy to tell the difference if you know where to look. Decoding DVD and Blu-ray video codecs is not as simple as 'just decompressing it' and some of the rarer decoding scenarios are harder to fix. It took *years* for certain bugs that weren't present in high end DVD players to be fixed in the 30 quid player from Tesco.

Red Book CD audio, for instance, is dramatically simpler than any video codec out there. That doesn't stop CD players worth hundreds (or thousands) of pounds being sold when a CD player can be bought for less than 20 quid. Why? Again, it's not just ones and zeroes - there are issues with timing and suchlike.

:idea: price & quality of Blu-Ray  :idea:
Eric Van Haesendonck • Friday 21st November 2008 12:53 GMT

- To Anonymous Coward: even within a single codec specification such as H.264, the quality of the image can vary greatly depending upon the decoder parameters. For example, if you decode an H.264 video with SMPlayer on your computer, you can disable the loop filter step so it plays on older hardware but with a significant quality loss, enable the loop filter to get "normal" quality (at the cost of more processing power), or add extra filters such as denoise3d to get even better quality (at the cost of even more CPU power). To a certain extent this is also true of DVD / MPEG-2.
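One way to quantify that difference rather than eyeball it, as a minimal Python/numpy sketch: it assumes the same clip has already been decoded twice into raw 8-bit luma frames of the same size (once with and once without the loop filter), and the file names, resolution, and decode step are all hypothetical:

import numpy as np

def psnr(reference: np.ndarray, test: np.ndarray) -> float:
    """Peak signal-to-noise ratio between two 8-bit frames (higher = closer)."""
    diff = reference.astype(np.float64) - test.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0:
        return float("inf")          # bit-identical frames
    return 10.0 * np.log10(255.0 ** 2 / mse)

def load_frame(path: str, width: int = 1920, height: int = 1080) -> np.ndarray:
    """Read one raw 8-bit luma frame from a dump file (placeholder format)."""
    with open(path, "rb") as f:
        data = np.frombuffer(f.read(width * height), dtype=np.uint8)
    return data.reshape(height, width)

full_quality = load_frame("decode_with_loop_filter.yuv")
fast_decode  = load_frame("decode_loop_filter_skipped.yuv")
print(f"PSNR: {psnr(full_quality, fast_decode):.2f} dB")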