Meta will mark political and social ads altered by AI starting next year

Meta will require advertisers to disclose whether the ads they submit to its platforms have been digitally altered, including with AI tools, if they're political or social in nature. Digitally altered ads will be labeled as such on Meta's platforms, much like the "Paid for" disclaimers some advertisements carry. The company will start enforcing the rule in the new year, just as the campaign period for what's expected to be a brutal and divisive 2024 US presidential election heats up.

In a blog post, Meta explained that advertisers must disclose in the advertising flow if they submit a social issue, electoral or political ad with photorealistic images or video, or realistic-sounding audio, that was altered to make a real person say or do something they didn't actually say or do. They're also required to tell Meta if they're submitting an ad depicting a realistic-looking person who doesn't exist, a realistic-looking event that didn't happen, or altered footage of a real event. If they submit a fake image, video or audio recording of an event that allegedly took place (say, something created with an AI image generator), they have to notify Meta as well. Advertisers don't need to disclose minor edits such as resizing, cropping, color correction or sharpening.

Meta, which already expects some advertisers to run afoul of the new rule, warned that it will reject ads if it determines the advertiser failed to disclose, or deliberately concealed, that a submission was digitally altered. Further, it said that repeated violations of the rule "may result in penalties." The company has yet to elaborate on the disclosure process advertisers will go through and the safeguards in place to prevent them from gaming the system, but it promised to share more details in the future.

Politicians and supporters on both sides of the aisle have already raised concerns that AI could propel election misinformation to new heights this campaign season. An altered video of President Joe Biden, edited to make it appear as if he were inappropriately touching his granddaughter, is already circulating on Facebook. Meta's Oversight Board opened a case after users appealed to have the video removed, and it's expected to release a decision in the near future.

This article originally appeared on Engadget.