• cynar@piefed.social
    26 days ago

    Uncompressed 1080p is already approaching the eye’s resolution limit when viewed in a living-room environment. 4K is close to the limit for monitor use.

    The reason that 4K seems better is often down to bandwidth and colour depth.

    There’s zero benefit to an 8K TV. An 8K monitor might be useful, but is still well into the diminishing returns curve.

    There’s still some ground to be made up with colours and frame rates, but resolution is effectively maxed out already.
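The eye-resolution claim above can be sanity-checked with a rough back-of-the-envelope calculation, assuming the common 20/20 acuity rule of thumb of about 1 arcminute per resolvable element (the screen size and viewing distance below are illustrative assumptions, not from the comment):

```python
import math

def max_resolvable_pixels(diagonal_in, distance_m, aspect=(16, 9), acuity_arcmin=1.0):
    """Horizontal pixel count beyond which a ~20/20 eye can't resolve more detail."""
    w, h = aspect
    # Physical screen width in metres from the diagonal and aspect ratio
    width_m = diagonal_in * 0.0254 * w / math.hypot(w, h)
    # Total horizontal viewing angle, converted to arcminutes
    angle_arcmin = math.degrees(2 * math.atan(width_m / 2 / distance_m)) * 60
    return angle_arcmin / acuity_arcmin

# 65" TV viewed from 2.5 m (an assumed, typical living-room distance)
print(round(max_resolvable_pixels(65, 2.5)))
```

Under these assumptions the result lands right around 1920 horizontal pixels, which is consistent with 1080p being near the limit at living-room distances.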

    • dan@upvote.au
      1 month ago

      You can have a smart TV but never set up any of the smart features. I have two LG OLED TVs but rarely touch anything on the TV itself. I’ve got Nvidia Shields for streaming and turning it on or off also turns the TV on or off. Same with my Xbox.

      I just need to figure out if I can use CEC with my SFF gaming PC (so that turning it on also turns the TV on, and turning it off turns the TV off), then I won’t have to touch the TV’s remote again.

      An Ethernet port or Wi-Fi is good for controlling the TV using something like Home Assistant. I have my TVs on a separate, isolated VLAN with no internet access, and an automation that runs when the TV turns on to also turn on some LED lights behind it.
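An automation like that is only a few lines of Home Assistant YAML. A minimal sketch (the entity IDs here are hypothetical placeholders, not from the comment):

```yaml
# automations.yaml -- turn on the LED strip whenever the TV turns on
- alias: "TV on -> backlight on"
  trigger:
    - platform: state
      entity_id: media_player.living_room_tv  # hypothetical entity ID
      to: "on"
  action:
    - service: light.turn_on
      target:
        entity_id: light.tv_backlight  # hypothetical entity ID
```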

  • Wispy2891@lemmy.world
    1 month ago

    I’d buy an 8K TV, provided that it has no smarts, no WiFi, and no TV tuner, and its price isn’t more than 5% over a 4K TV’s.

      • Wispy2891@lemmy.world
        1 month ago

        Somehow when it’s called a “monitor” it quadruples the price.

        I can’t really accept that a basic 4K 27" monitor, without even speakers, costs the same as a 4K 65" TV with HDR, deeper blacks, WiFi, and even dozens of bundled spyware apps for added convenience.

        • jet@hackertalks.com
          6 days ago

          The cost difference is right there in your description: pixel density is prohibitively expensive.
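The density gap is easy to quantify: both panels push the same 3840×2160 pixels, so pixels per inch scales inversely with the diagonal (sizes taken from the comment above):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

monitor = ppi(3840, 2160, 27)  # 4K 27" monitor
tv = ppi(3840, 2160, 65)       # 4K 65" TV
print(f"{monitor:.0f} PPI vs {tv:.0f} PPI")  # ~163 vs ~68
```

The monitor packs roughly 2.4× the linear density, or nearly 6× the pixels per unit area, of the TV panel.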

  • n1ckn4m3@lemmy.world
    1 month ago

    As someone who stupidly spent the last 20 or so years chasing the bleeding edge of TVs and A/V equipment, GOOD.

    High-end A/V is an absolute shitshow. No matter how much you spend on a TV, receiver, or projector, it will always have some stupid gotcha, terrible software, ad-laden interface, HDMI handshaking issue, HDR color problem, HFR sync problem, or CEC fight. Every new standard (HDR10 vs HDR10+, Dolby Vision vs Dolby Vision 2) inherently comes with its own set of problems and its own round of “time to get a new HDMI cable that looks exactly like the old one but works differently, if it works as advertised at all”.

    I miss the 90s when the answer was “buy big chonky square CRT, plug in with component cables, be happy”.

    Now you can buy a $15,000 4k VRR/HFR HDR TV, an $8,000 4k VRR/HFR/HDR receiver, and still somehow have them fight with each other all the fucking time and never work.

    8K was a solution in search of a problem. Even when I was 20 and still had good eyesight, I’m certain that even sitting 6 inches from a 90-inch TV, the difference between 4K and 8K would be barely noticeable.

    • TeddE@lemmy.world
      1 month ago

      Computer monitor with multiple simultaneous 4k displays?

      Grasping at straws here