
To uphold democracy, political advertisements must disclose their use of AI technology.



US Representative Yvette Clarke Proposes Legislation to Regulate AI in Political Ads

US Representative Yvette Clarke has introduced a bill that would require the creators of political ads to disclose any use of artificial intelligence (AI) in producing them. Clarke’s timely proposal to regulate AI in political advertising is essential to safeguarding democracy. AI technology has recently advanced to the point where it can be used to create false or intentionally misleading ads through language models, chatbots, deepfakes, and image generation. By requiring disclosure, the proposed bill would give voters greater transparency and more information, making it easier for them to assess AI-generated advertisements.

Why the Proposed Bill is Timely

Fake news stories and doctored photographs are nothing new; fashion magazines worldwide have used such material on their covers for decades to boost sales and popularity. During the 2016 and 2020 US elections, however, allegations of fake news became widespread, and campaigns with consequential outcomes deepened distrust of the media in the US. According to the Pew Research Center, Republican voters in particular have seen a sharp drop in trust in news organizations. This mistrust of news coincides with the arrival of sophisticated AI tools capable of producing content that appears highly realistic. It has become harder to distinguish real from fake content, as AI-generated photos and videos can show people with altered, replaced, or added faces, voices, or movements. Without regulation, this problem is likely to worsen, making Clarke’s bill a timely response to AI-generated content in political advertising.

How Disclosure Can Help

One way to address this problem is to mandate disclosure of AI use in political advertising. The Republican National Committee has already taken the initiative by adding a disclaimer to one of its ads stating that it was “constructed solely using AI imagery.” Clarke’s proposed bill addresses this issue and has been widely supported by policymakers calling for disclosure of AI use in political ads. Implementing the bill need not be expensive or difficult for advertisers, given the availability of existing technologies for labeling and watermarking content.
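As a rough illustration of how lightweight such labeling can be, the sketch below embeds a plain-text AI-disclosure notice into an ad image’s metadata using the Pillow library. The file names, the "ai-disclosure" key, and the disclosure wording are hypothetical; a real campaign would more likely adopt an established provenance standard such as C2PA rather than this minimal approach.

    # Minimal sketch: attach an AI-use disclosure to a PNG ad image as a text
    # metadata chunk. Assumes Pillow is installed (pip install Pillow); the
    # file names and the "ai-disclosure" key are illustrative, not a standard.
    from PIL import Image
    from PIL.PngImagePlugin import PngInfo

    DISCLOSURE = "This advertisement contains content generated with artificial intelligence."

    def label_ad(src_path: str, dst_path: str) -> None:
        """Copy the ad image, embedding the disclosure as PNG text metadata."""
        meta = PngInfo()
        meta.add_text("ai-disclosure", DISCLOSURE)
        with Image.open(src_path) as img:
            img.save(dst_path, pnginfo=meta)

    def read_label(path: str):
        """Return the embedded disclosure text, or None if absent."""
        with Image.open(path) as img:
            return img.text.get("ai-disclosure")  # .text holds PNG text chunks

    if __name__ == "__main__":
        label_ad("campaign_ad.png", "campaign_ad_labeled.png")
        print(read_label("campaign_ad_labeled.png"))

Because the label travels with the file itself, platforms and fact-checkers could read it automatically, which is one reason disclosure requirements need not impose heavy costs on advertisers.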

However, a critical issue that must be addressed is how AI is defined in this context. The current bill text refers to “generative AI,” a term that still has no widely accepted definition. As AI technology advances, it is essential to regulate AI-based political advertising quickly. Even without perfect definitions, passing this bill would be a step in the right direction toward establishing accountability and responsibility for AI-developed ads.

Conclusion

Clarke’s proposed legislation to regulate the use of AI in political advertising is timely and necessary. Its implementation would promote transparency and accountability in political ads and give voters information about how ads were created, increasing their ability to recognize false content. Moreover, disclosure of AI use in political ads would hold politicians accountable for their campaigns, safeguarding democracy. The Federal Election Commission already mandates some disclaimers on political ads, but they are inadequate to address the issues raised by AI in political advertising. It is therefore necessary to move quickly to prevent the abuse of the latest technologies to the detriment of democracy.


