Photographers say that Meta is tagging their real photos as “Made with AI.”
Meta introduced the automated tag to help people tell whether what they're looking at is real or computer-generated. But the detection itself is unreliable at distinguishing the two, which is why users including a former White House photographer and a professional cricket team have had genuine photos flagged as AI-generated.
Former White House photographer Pete Souza told TechCrunch that this could be the result of basic photo editing that photographers do before uploading their work. Simply opening an image in Photoshop, cropping it, and exporting it as a new .JPG file could set off Instagram's detection feature. This makes sense, since Instagram says its AI inspects image metadata to judge whether a photo is genuine.
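To see why an ordinary re-export could trip a metadata-based detector, here is a minimal, hypothetical sketch. It is not Meta's actual system; it simply scans a JPEG's raw bytes for two kinds of real-world provenance markers: AI-provenance signals defined by public standards (a C2PA content-credentials manifest, or the IPTC digital-source-type value `trainedAlgorithmicMedia`), and editor fingerprints like the `Adobe` APP14 segment that Photoshop writes into every export. A naive classifier that treats editor metadata as suspicious would mislabel exactly the kind of cropped-and-exported photo Souza describes.

```python
# Illustrative sketch only -- NOT Meta's detector. Marker strings are
# drawn from public standards (C2PA, IPTC) and common editor output.

AI_MARKERS = [
    b"c2pa",                      # C2PA content-credentials manifest
    b"trainedAlgorithmicMedia",   # IPTC digital-source-type for AI output
]
EDITOR_MARKERS = [
    b"Adobe",                     # APP14 segment Photoshop writes on export
    b"Photoshop",
]

def classify(jpeg_bytes: bytes) -> str:
    """Return a rough label based on embedded metadata markers."""
    if any(m in jpeg_bytes for m in AI_MARKERS):
        return "flag: made with AI"
    if any(m in jpeg_bytes for m in EDITOR_MARKERS):
        return "edited (could be misread as AI)"
    return "no provenance metadata"

# A plain re-export from an editor carries editor markers but no AI
# manifest -- the scenario Souza describes.
sample = b"\xff\xd8\xff\xee" + b"Adobe" + b"\x00" * 10 + b"\xff\xd9"
print(classify(sample))  # -> edited (could be misread as AI)
```

The point of the sketch is the ambiguity: editor metadata alone cannot distinguish a cropped photo from an AI composite, so any detector leaning on it will produce false positives.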
As election season approaches, it will become increasingly important that large social platforms like Meta are able to properly moderate AI-generated content. But as things stand, if the “Made with AI” tag can't be trusted, does it really accomplish anything?