New ‘Take It Down Act’ Targets Deepfakes

President Donald Trump signed the Take It Down Act into federal law, targeting the spread of explicit sexual images shared without consent. The legislation covers both real and AI-generated content, including deepfakes — videos that falsely place people in explicit scenarios.

The signing took place at the White House Rose Garden, where First Lady Melania Trump joined the president. She emphasized the need to shield children, families, and individuals from growing digital harms. Lawmakers from both parties stood behind the measure, marking strong bipartisan support.

“Children should never have to face this kind of abuse. This law gives them and their families the tools they need,” said Melania Trump during the event.

The new law focuses on non-consensual intimate content, including manipulated imagery. Deepfake videos, often designed to humiliate or harass, have become easier to produce and harder to trace. In response, the law compels online platforms to act quickly when a victim files a report.

Key provisions include:

  • Criminalizes the distribution of explicit content without consent
  • Includes deepfake porn and other AI-altered sexual imagery
  • Sets penalties of up to 3 years in prison, with higher terms for content involving minors
  • Requires takedown of flagged content within 48 hours
  • Demands that platforms prevent the reposting of removed material
  • Names the Federal Trade Commission (FTC) as the enforcing body
  • Gives tech companies 12 months to implement compliance systems

Lawmakers drafted the bill in response to several high-profile incidents. In Texas, for example, students used deepfake technology to fabricate images of a peer to harm her reputation. Cases like this prompted urgency in Congress.

“No one should have to beg a tech company to remove a fake or private image,” said Senator Amy Klobuchar, who helped lead the bill. “We finally have a law that makes that right clear.”

Concerns remain

However, some experts remain concerned. Digital rights organizations argue that the law could lead platforms to scan private or encrypted conversations, creating potential risks for free expression and privacy.

Groups like the Electronic Frontier Foundation and Internet Society are urging lawmakers to clarify boundaries and include stronger safeguards. Without those, they warn, the law could be used to censor or remove legal speech.

Despite those concerns, the law has taken effect. The FTC is now responsible for overseeing platform compliance. Victims of non-consensual or AI-generated intimate imagery can file takedown requests, and websites must respond within 48 hours.
