Japanese authorities have opened what appears to be the country’s first criminal case involving AI-generated pornography of public figures. On October 17, 2025, the Tokyo Metropolitan Police Department announced the arrest of Hiroya Yokoi, a 31-year-old man from Akita, who allegedly used generative-AI software to create and sell explicit images depicting more than 260 women, among them J-pop idols, actresses and television personalities.
According to police reports, Yokoi admitted that he turned to this work to earn extra money and pay off a student loan. Between October 2024 and September 2025 he is said to have earned approximately 1.2 million yen (roughly US$8,000) by selling around 20,000 AI-generated images of some 262 women. Investigators say he learned to use publicly available generative-AI tools from online articles and videos, then fed images of celebrities into the software to produce fake sexual images, which he distributed via his social media account. Premium subscribers could even request custom images of specific celebrities and poses. The specific offences under scrutiny reportedly span January 7 to June 2 of this year. Yokoi was arrested on suspicion of distributing obscene digital images, in violation of Japan’s law governing the distribution of such content; the charge carries a penalty of up to two years in prison or a fine of up to 2.5 million yen.
What makes this case significant is the intersection of two emerging problems: generative-AI tools that make deepfakes ever easier to create, and the vulnerability of public figures, particularly women in entertainment, to non-consensual sexualisation. Many observers say this arrest may mark a turning point in how Japanese law enforcement confronts AI-enabled exploitation. Until now, most non-consensual or manipulated sexual imagery existed in a legal grey zone; there had been few high-profile arrests in which the accused knowingly produced fake sexual images of identified individuals using AI and profited from them. The technology used in this case is a textbook “deepfake” in the broader sense: AI that can convincingly map one person’s likeness onto another or generate entirely synthetic but lifelike images. Historically, deepfakes have been used in political misinformation, celebrity face-swaps and non-consensual pornography. In Japan’s entertainment industry, where idols and actresses occupy highly visible roles and maintain tightly managed public identities, the potential damage from AI-generated images is especially acute: beyond the direct violation of personal dignity and privacy, such content can undermine celebrity branding, fan trust and personal safety.
Critically, this case raises urgent questions about consent, legal liability and the adequacy of existing legislation. The tools Yokoi used were freely accessible and required only modest technical proficiency; he learned his method from online tutorials. That he could monetize the images by charging subscriptions for bespoke content underscores how easily such tools can be abused for profit, with few barriers to entry. Observers note that Japanese law currently penalises the distribution of obscene material but may lack specific provisions for AI-generated non-consensual images or deepfakes of real persons. This gap reflects a broader global challenge: generative AI is racing ahead of regulation. The arrest may spur legislative and regulatory responses in Japan, prompting discussion of tighter controls on the creation and distribution of AI-generated sexual content, stricter identity verification on digital platforms, and stronger protections for public figures and private citizens alike. It also puts a spotlight on platform responsibility: social-media services and subscription-based sites that host or facilitate the distribution of manipulated sexual content may face growing scrutiny over how they moderate and detect AI-driven abuse.
For the celebrities involved, whose names have not been publicly disclosed, the likely consequences include reputational harm, emotional distress and the risk of further exposure. Deepfake images can circulate widely and persist online even after removal efforts. The premium tier that allowed custom requests suggests an active marketplace for such content, one that may have solicited images beyond the original 262 women. The victims’ ability to seek redress, have content taken down and hold perpetrators accountable will become a focal point in future cases.
The arrest of Hiroya Yokoi signals that Japan is beginning to treat AI-generated non-consensual pornography as a genuine criminal matter rather than a fringe issue. It illustrates how the democratization of generative-AI tools has lowered the barrier to creating exploitative content, and how existing laws may need urgent updating to keep pace. For policymakers, platforms, celebrities and ordinary internet users alike, this case is a stark warning: the age of deepfake sex-content is here, and society must grapple with how to respond.