Lawmakers in Washington are grappling with the rapid spread of deepfake AI pornography, a problem that now affects everyone from celebrities to high school students. New legislation aims to hold social media companies accountable for the dissemination of deepfake pornographic images on their platforms.
Senator Ted Cruz of Texas is spearheading a new bill, known as the Take It Down Act, which would criminalize publishing or threatening to publish deepfake pornographic content. The bill would also require social media platforms to establish procedures for removing such images within 48 hours of a valid request from a victim. Platforms would additionally have to make reasonable efforts to remove any other copies of the images, including those shared within private groups.
Under the proposed legislation, enforcement would fall to the Federal Trade Commission, which oversees consumer protection laws. The bill is set to be formally introduced by a bipartisan group of senators, with victims of deepfake porn, including high school students, present in the Capitol to support it. Non-consensual AI-generated images have affected a wide range of people, from celebrities like Taylor Swift and politicians like Rep. Alexandria Ocasio-Cortez to high school students whose faces have been morphed into nude or pornographic photos using apps and AI tools.
While Congress broadly agrees that deepfake AI pornography must be addressed, there is no consensus on the best approach. The Senate currently has two competing bills. Senator Dick Durbin of Illinois introduced a bipartisan bill earlier this year that would let victims of non-consensual deepfakes sue those who created, possessed, or distributed the images. Senator Cruz's bill, by contrast, treats deepfake AI porn as highly offensive online content and puts the onus on social media companies to moderate and remove it.
Debate and Opposition
When Senator Durbin attempted to advance his bill to a floor vote, Senator Cynthia Lummis blocked it, arguing that its broad scope could hinder American technological innovation. Durbin countered that tech platforms would not face liability under his proposal. Notably, Senator Lummis is an original co-sponsor of Senator Cruz's bill, along with Senators Shelley Moore Capito, Amy Klobuchar, Richard Blumenthal, and Jacky Rosen.
The introduction of the Take It Down Act marks a significant step toward curbing the proliferation of deepfake AI pornography and holding social media platforms accountable for their role in spreading it. As the issue evolves, lawmakers face the challenge of crafting legislation that balances protection for victims with the preservation of technological innovation. The ongoing debate in Congress underscores the complexity of the problem and the need for comprehensive solutions to safeguard individuals from the harms of non-consensual AI-generated images.