ARTFEED — Contemporary Art Intelligence

Weaponized Deepfakes Threaten Society with Political Manipulation and Harmful Content

ai-technology · 2026-04-22

Deepfake technology has moved from theoretical warning to widespread real-world threat, with AI-generated videos and images now easily produced using cheap or free generative models. Weaponized deepfakes range from sexually explicit material to political propaganda: a 2023 study found that 98% of deepfakes were pornographic and that 99% depicted women. Elon Musk's xAI chatbot Grok offers a case in point; according to one report, its image-editing function produced millions of sexualized images, 81% of them depicting women. Political manipulation has also become routine. The Trump administration regularly shares AI-generated content, including an altered image of Minneapolis civil rights lawyer Nekima Levy Armstrong that darkened her skin and changed her expression, and in January Texas attorney general Ken Paxton shared a fabricated video showing his opponent, Senator John Cornyn, dancing with Representative Jasmine Crockett. Proposed remedies such as technical safeguards, changes in user behavior, and legislation face significant limitations: open-source models bypass restrictions, and enforcement remains inconsistent. The problem may intensify during the upcoming U.S. midterm elections, as weakened federal agencies and diminished fact-checking organizations create new vulnerabilities. These developments erode critical thinking and institutional trust while disproportionately harming women and marginalized groups.

Key facts

  • Deepfake technology has become widely accessible through cheap or free generative models
  • A 2023 study found 98% of deepfakes were pornographic and 99% depicted women
  • The image-editing function of Elon Musk's xAI chatbot Grok produced millions of sexualized images, 81% of them depicting women
  • The Trump administration regularly produces and shares AI-generated images and videos
  • Texas attorney general Ken Paxton shared a fabricated video of Senator John Cornyn dancing with Representative Jasmine Crockett
  • An altered image of Minneapolis civil rights lawyer Nekima Levy Armstrong was shared by the White House
  • Proposed solutions include technical safeguards, user behavior changes, and legislation
  • The problem may worsen during upcoming U.S. midterm elections due to weakened oversight

Entities

People

  • Elon Musk
  • Ken Paxton
  • John Cornyn
  • Jasmine Crockett
  • Nekima Levy Armstrong

Institutions

  • xAI
  • Trump administration
  • White House
  • Meta
  • The Guardian
  • The Saturday Paper
  • i24 News
  • Security Hero
  • AI Forensics
  • Technology Review
  • The New York Times

Locations

  • United States
  • Texas
  • Minneapolis
  • India
  • Europe
  • Australia
  • Israel

Sources