• WolfLink@sh.itjust.works · 1 year ago

    Even if you assume the images you care about have this metadata, all it takes to fake authenticity is a compromised camera, or even just carefully photographing your AI-generated image with a real one.

    And the vast majority of images you see online are heavily recompressed, so you’re not getting the 6 MB+ digitally signed raw files anyway.

      • WolfLink@sh.itjust.works · 1 year ago

        It’s not that simple. It’s not just a “this is or isn’t AI” boolean in the metadata. You hash the image, then sign the hash with a digital signing key. If the image is tampered with, the signature no longer matches, so anyone can tell it has been altered, and you can’t produce a new valid signature without the signing key.
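        To make that concrete, here’s a minimal hash-then-sign sketch in Python using hashlib and the third-party cryptography package with an Ed25519 key. The photo.jpg path and the in-memory key are illustrative only, not how any real camera implements this.

        ```python
        # Sketch of hash-then-sign (assumes the third-party "cryptography" package).
        # "photo.jpg" and the in-memory key are illustrative stand-ins; a real camera
        # would keep the private key in a secure element.
        import hashlib
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

        signing_key = Ed25519PrivateKey.generate()    # stays secret (inside the camera)
        verify_key = signing_key.public_key()         # published for verification

        image_bytes = open("photo.jpg", "rb").read()  # hypothetical image file
        digest = hashlib.sha256(image_bytes).digest() # hash the image
        signature = signing_key.sign(digest)          # shipped alongside the image

        # Anyone can verify: recompute the hash and check it against the signature.
        try:
            verify_key.verify(signature, hashlib.sha256(image_bytes).digest())
            print("signature valid: bytes match what the key holder signed")
        except InvalidSignature:
            print("signature invalid: image altered or signed with a different key")
        ```

        Changing even one byte of the image changes the hash, so the existing signature fails to verify, and forging a new one would require the private key.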

        • cmnybo@discuss.tchncs.de · 1 year ago

          Cameras don’t cryptographically sign the images they take. Even if that were added, there are billions of cameras already in use that don’t support signing. Also, any editing, resizing, or re-encoding would invalidate the signature, and almost no one posts pictures to the web without some editing. Embedding 10+ MB images in a web page isn’t practical either.
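          As an illustration of the re-encoding point, here’s a small sketch (assuming Pillow is installed; the synthetic red image is just a stand-in for a camera original). Decoding and re-saving a JPEG changes its bytes, so a hash signed over the original no longer matches what actually gets served.

          ```python
          # Sketch (assumes Pillow is installed): re-encoding a JPEG changes its bytes,
          # so a signature computed over the original file's hash stops verifying.
          import hashlib
          from io import BytesIO
          from PIL import Image

          # Stand-in for a camera original: a synthetic image saved as a JPEG in memory.
          original = BytesIO()
          Image.new("RGB", (64, 64), "red").save(original, format="JPEG", quality=95)

          # What a website typically does before serving it: decode and re-encode smaller.
          recompressed = BytesIO()
          Image.open(BytesIO(original.getvalue())).save(recompressed, format="JPEG", quality=60)

          print(hashlib.sha256(original.getvalue()).hexdigest())
          print(hashlib.sha256(recompressed.getvalue()).hexdigest())
          # The two hashes differ, so a signature over the first hash says nothing
          # about the re-encoded copy that viewers actually download.
          ```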