Anyone know anything about HD / 4K tvs?

debo_2006
8 years ago

Posted in computer forum also.

Bought our first 4K UHD TV at BB. It's a Sony 55X850C, trading up from a rear-projection Sony, which was amazing. I don't understand why all of the non-HD channels look lackluster. No sharpness to the picture: pixelated, fuzzy, and wording is broken up. From scene to scene I can see the TV try to adjust the pixels to some degree, but they are still quite noticeable...and annoying. All the HD channels look great, though. I have tried calibrating the TV myself, along with using settings suggested on the Internet, to no avail; the non-HD channels still look washed out and faded.

I know there isn't much programming out there yet for 4K viewing, but is this as good as it gets with these TVs? My mom has a non-smart Samsung and her picture is always so vibrant, even on non-HD channels.

What am I missing? I'm ready to return the set, not knowing if I'll find another that might be better. I have posted a similar question on Q&A sites but got no replies. BTW, we have Fios, if that matters. The Sony site doesn't help much; I've already been there. Thanks.

Comments (4)

  • 59 Dodge
    8 years ago

    Check out the AVS Forum for answers.

    Gary

  • steve_o
    8 years ago

    I can't speak to 4K TVs (which, btw, are often referred to as "UHD", not just "HD"). But most HD TVs available for the past 8-10 years or so have not done a good job displaying non-HD (or SD [Standard Definition]) content. The aspect ratio (width-to-height) is different, so the TV typically ends up upscaling and remapping the picture to fill the screen as much as it can. This is very similar to changing the resolution on a computer screen (for essentially the same reasons).

    It's not clear to me whether you're discussing TV stations or all non-HD content. If you're trying to play videocassettes or MP4s through the TV, they won't look good because these sources don't carry enough picture information to look high-definition.

    You might try changing some settings on the TV. Try changing the aspect ratio, perhaps accepting black borders on the screen rather than trying to fill it with the lower-definition signal (see the rough pixel math below). If you're talking about one particular input (say, the VCR), you may be able to change the settings for that input alone and leave the rest optimal.
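
    To make the magnification concrete, here is a rough sketch of the pixel math in Python. The numbers are assumptions (a 4:3 SD frame treated as 640x480 square pixels, shown on a 3840x2160 UHD panel); real TVs use fancier scalers, but the magnification is the same.

        # How far a standard-def frame has to be stretched on a 4K panel,
        # assuming the TV preserves the 4:3 shape (pillarbox) rather than
        # stretching the picture to fill the 16:9 screen.
        SD_W, SD_H = 640, 480        # 4:3 standard-definition frame
        UHD_W, UHD_H = 3840, 2160    # 16:9 4K UHD panel

        scale = UHD_H / SD_H                    # 4.5x magnification
        scaled_w = round(SD_W * scale)          # 2880 panel pixels wide
        bars = (UHD_W - scaled_w) // 2          # pillarbox bar each side

        print(f"Each SD pixel covers ~{scale * scale:.0f} panel pixels")
        print(f"Black bars: {bars} pixels on each side of the picture")

    Run as written, that works out to roughly 20 panel pixels for every one SD pixel, with 480-pixel bars on each side, which is why the scaler has so little real detail to work with.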

  • PRO
    A/V Consulting
    8 years ago

    The simplest explanation is this... Imagine that you are now looking at the standard-def channels through a microscope. For the first time, you can really see all the "junk" that you never noticed before. Also, because your eyes are going immediately from HD to SD, you now have a new baseline of comparison. That makes the difference really stand out.

  • PRO
    Audio Plus
    8 years ago

    Manufacturers don't invest much in standard-definition processing, just as with basic 1080p or 720p sets a few years ago. It doesn't make sense to design a tuner section around antiquated standard def. That's the same reason there is usually only one standard-def input (often shared with the only component video input, the red, green, and blue jacks).

