A new study published in Nature by University of Cambridge researchers just dropped a pixelated bomb on the entire Ultra-HD market, but as anyone with myopia can tell you, if you take your glasses off, even SD still looks pretty good :)

  • GenderNeutralBro@lemmy.sdf.org

    The actual paper presents the findings differently. To quote:

    Our results clearly indicate that the resolution limit of the eye is higher than broadly assumed in the industry

    They go on to use the iPhone 15 (461 ppi) as an example, saying that at 35 cm (1.15 feet) it has an effective “pixels per degree” of 65, compared to “individual values as high as 120 ppd” in their human perception measurements. You’d need the equivalent of an iPhone 15 at 850 ppi to hit that, which would be a tiny bit over 2160p/UHD.

    Honestly, that seems reasonable to me. It matches my intuition and experience that for smartphones, 8K would be overkill, and 4K is a marginal but noticeable upgrade from 1440p.

    If you’re sitting the average 2.5 meters away from a 44-inch set, a simple Quad HD (QHD) display already packs more detail than your eye can possibly distinguish

    Three paragraphs in and they’ve moved the goalposts from HD (1080p) to 1440p. :/ Anyway, I agree that 2.5 meters is generally too far from a 44" 4K TV. At that distance you should think about stepping up a size or two. Especially if you’re a gamer. You don’t want to deal with tiny UI text.
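
    If you want to sanity-check these viewing-distance claims, the standard small-angle conversion from pixel density and distance to pixels per degree fits in a few lines. Here is a minimal Python sketch; the 16:9 aspect ratio and the panel/distance figures are assumptions pulled from the quotes above, not values from the paper itself:

    ```python
    import math

    def ppd(h_pixels: int, diagonal_in: float, distance_m: float,
            aspect: float = 16 / 9) -> float:
        """Pixels per degree of visual angle for a flat panel."""
        width_in = diagonal_in * aspect / math.hypot(aspect, 1)
        ppi = h_pixels / width_in             # horizontal pixel density
        distance_in = distance_m / 0.0254     # metres -> inches
        # One degree of visual angle spans 2 * d * tan(0.5 deg) of panel.
        return ppi * 2 * distance_in * math.tan(math.radians(0.5))

    print(round(ppd(2560, 44, 2.5)))  # QHD, 44" set, 2.5 m -> ~115 ppd
    print(round(ppd(3840, 44, 2.5)))  # 4K,  44" set, 2.5 m -> ~172 ppd
    ```

    By that arithmetic, QHD on a 44" set at 2.5 m is around 115 ppd, just under the “as high as 120 ppd” individual maximum quoted above, so “more detail than your eye can possibly distinguish” is arguable for the sharpest-eyed viewers.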

    It’s also worth noting that for film, contrast is typically not that high, so the difference between resolutions will be less noticeable, provided you’re comparing videos with similar bitrates. If we’re talking about Netflix or YouTube or whatever, they compress the hell out of their streams, so you will definitely notice the difference, if only by virtue of the different bitrates. You’d be much harder-pressed to spot the difference between a 1080p Blu-ray and a 4K Blu-ray, because 1080p Blu-rays already use a sufficiently high bitrate.
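
    To put rough numbers on the bitrate point, here is a toy bits-per-pixel comparison in Python. The bitrates are ballpark assumptions on my part (not from the paper or this thread), and bits-per-pixel ignores codec differences, so treat it purely as an illustration:

    ```python
    def bits_per_pixel(bitrate_mbps: float, width: int, height: int,
                       fps: float = 24) -> float:
        """Average encoded bits available per pixel per frame."""
        return bitrate_mbps * 1e6 / (width * height * fps)

    # Ballpark video bitrates -- assumed for illustration only:
    print(f"{bits_per_pixel(30, 1920, 1080):.2f}")  # 1080p Blu-ray ~0.60
    print(f"{bits_per_pixel(60, 3840, 2160):.2f}")  # 4K Blu-ray    ~0.30
    print(f"{bits_per_pixel(15, 3840, 2160):.2f}")  # 4K stream     ~0.08
    ```

    Even with the hand-waving, the gap is clear: a compressed 4K stream can have far fewer bits per pixel to work with than a 1080p disc.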

  • sunbeam60@lemmy.one

    I think people, and this paper, miss a few elements.

    4K-encoded content often has a significantly higher bitrate (well, duh, there’s more content), often higher than the simple increase in pixel density would suggest. So content with heavy movement (flocks of birds, water, crowds, etc.) still looks better than 1080p, not because of the increase in pixel density but because of the decrease in compression artefacts.

    Second, high dynamic range, yo! On a still picture on my TV it’s hard to see the difference between 1080p and 4K, but it isn’t hard to see the difference between SDR and HDR.
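
    For a sense of how much extra range an HDR signal carries, here is the PQ transfer function from SMPTE ST 2084 (used by HDR10, among others) as a small Python sketch. The constants are the published ones; the example code values are my own illustration:

    ```python
    def pq_eotf(signal: float) -> float:
        """SMPTE ST 2084 (PQ) EOTF: signal in [0, 1] -> luminance in nits."""
        m1, m2 = 2610 / 16384, 2523 / 4096 * 128
        c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
        p = signal ** (1 / m2)
        return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

    print(round(pq_eotf(0.508)))  # ~100 nits: roughly SDR reference white
    print(round(pq_eotf(1.0)))    # 10000 nits: PQ's absolute ceiling
    ```

    A PQ signal at roughly half its code range already reaches SDR’s ~100-nit reference white; everything above that is headroom SDR simply doesn’t have.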

    So I still vastly prefer 4K content, but not because of the resolution.

  • Banzai51@midwest.social

    And I’m still here with a 1080p TV. Because my previous ISP had data caps, I’m not convinced my current ISP won’t move to them, and there isn’t a lot of good 1080p content out there besides DVD/Blu-ray and my rips, let alone 5K or 8K content.

    • sanzky@beehaw.org

      I think resolution is just a tiny part of why newer TVs are so much better than a 10-year-old TV. There’s better contrast, HDR, more brightness, wider viewing angles, etc.

      1080p content will look better on a good 4K TV, and not just because of the resolution.

  • network_switch@lemmy.ml

    1080p to 4K was a big improvement in my opinion. I still have a mix of 1080p and 4K equipment. 4K to 8K feels really minor to me, because 4K TVs now come with varying degrees of HDR: better brightness range per zone/pixel and a wider color gamut. I’ll go 8K someday, because someday the only TVs worth buying at the price will be 8K, but in picture-quality terms the difference from a 4K set with good HDR brightness, contrast, and color gamut is minor. You just need high-quality sources. When I encode something, I use fairly high-bitrate AV1. Another 5-7 years and I expect to be encoding everything new in AV2.
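
    For what it’s worth, here is a minimal sketch of what such an encode might look like, assuming an ffmpeg build with the SVT-AV1 encoder (libsvtav1); the file names are placeholders and the commenter’s actual settings aren’t given:

    ```python
    import subprocess

    # Lower CRF = higher quality and bitrate; preset trades speed for
    # compression efficiency (lower = slower). Audio is passed through.
    subprocess.run([
        "ffmpeg", "-i", "input.mkv",
        "-c:v", "libsvtav1",  # AV1 video via SVT-AV1
        "-crf", "20",         # fairly high quality target
        "-preset", "6",       # middle-of-the-road speed
        "-c:a", "copy",       # keep original audio untouched
        "output.mkv",
    ], check=True)
    ```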

  • thingsiplay@beehaw.org

    It depends on the source material, the viewing distance, and the quality of the panel. 4K is just more resolution than HD. This doesn’t really need a study.

    • CanadaPlus@lemmy.sdf.org

      It sounds like the study actually did include viewing distance, and gave different requirements depending on it.

      • thingsiplay@beehaw.org

        My point is, I could do a study too and claim that 4K/8K TVs are much better than HD to your eyes. It’s just the setup and the source that make the difference.

        • CanadaPlus@lemmy.sdf.org

          Now that I’ve actually looked at the study: what they did was build an apparatus with a continuously adjustable viewer-to-display distance, then ask people to distinguish scaled, fairly similar clips until they no longer could.
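
          As a toy illustration of that kind of threshold search, here is a generic 1-up/1-down staircase over viewing distance in Python; this is not the authors’ actual procedure, just the textbook shape of one:

          ```python
          import random

          def staircase_threshold(can_distinguish, start_m=1.0, step_m=0.25,
                                  reversals_needed=8):
              """Move the display farther after each correct discrimination,
              closer after each miss; the average distance at direction
              reversals estimates the observer's threshold."""
              dist, direction, reversals = start_m, +1, []
              while len(reversals) < reversals_needed:
                  new_direction = +1 if can_distinguish(dist) else -1
                  if new_direction != direction:
                      reversals.append(dist)
                      direction = new_direction
                  dist = max(dist + new_direction * step_m, 0.1)
              return sum(reversals) / len(reversals)

          # Hypothetical observer who stops resolving detail beyond ~2.5 m:
          observer = lambda d: d < 2.5 + random.gauss(0, 0.1)
          print(round(staircase_threshold(observer), 2))  # ~2.4
          ```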

          Actual maximum pixels-per-visual-degree values varied quite a bit based on the colours involved and the like. And like @GenderNeutralBro@lemmy.sdf.org said, they framed the results the opposite way to the article: human vision can distinguish more than previously thought.