The blackmail email lands like a grenade in your inbox. You’re a sitting Malaysian MP, a YB (Yang Berhormat, “The Honorable”), and there it is: screenshots that appear to show your face in a porn clip you swear does not exist, plus a demand for US$100,000 — about RM420,000 — to stop the video from being released to the public, your party, and your family. Pay in crypto using the QR code, the email says, or watch your maruah (dignity, honor) burn. At least ten Malaysian politicians — including cabinet-level figures — received near-identical threats in mid-September 2025, according to the government and police. In other words: welcome to Malaysia’s first full-blown AI extortion scandal.
The campaign was fast, coordinated, and clearly designed to weaponise malu (shame). Communications Minister Fahmi Fadzil confirmed that he and several other lawmakers were sent emails claiming to have explicit AI-generated videos using their faces and threatening to leak them unless the money was paid. The messages reportedly used near-identical wording, attached doctored screenshots, and even came from what investigators believe is the same Gmail address. Police say one MP, Subang’s Wong Chen, received a demand for “100,000 USDT” — a cryptocurrency called Tether — with a three-day deadline. This isn’t some random troll in the comments section. This is systematic ugut (threat/extortion).
The victims are mostly men from the governing coalition, especially from Prime Minister Anwar Ibrahim’s Parti Keadilan Rakyat (PKR), which makes the whole thing smell political as well as criminal. If you can’t topple someone with a vote of no confidence, maybe you try fitnah (slander/defamation) — and in Malaysia’s moral climate, nothing stains a lelaki YB (male MP) faster than an accusation of sexual misconduct. What’s changed is that you don’t even need a real scandal anymore. You can fabricate one in an afternoon, with off-the-shelf AI.
Here’s how the trick works. Generative AI models can now produce convincing “deepfake porn,” taking a target’s face — scraped from years of press photos, selfies, TikToks, parliamentary video feeds — and grafting it onto someone else’s body in a sex act. Twenty-four months ago you needed serious technical skills to do that well. Now there are apps that do automated face-swaps, lighting correction, and lip-synching without the user even understanding what a neural network is. The scammer doesn’t need a full believable video to start the extortion; a few still frames or a 5-second loop is enough to trigger panic. The psychology is simple: even if you know it’s fake, will your voters? Will your wife? Will your ustaz?
Malaysia is not alone in this nightmare. Deepfake porn has already been used to attack women politicians globally — to humiliate them, silence them, and drive them out of public life. What’s new in Malaysia is that this wave is indiscriminate, more like ransomware: send the same threat to multiple targets, demand crypto, and hope someone quietly pays to protect their image. It’s AI-driven sexualised character assassination, industrialised.
The Malaysian police have opened investigations. But here’s the ugly legal question: is Malaysia actually ready for this? On paper, there are existing tools. You can investigate under extortion or blackmail provisions of the Penal Code, or under cybercrime and communications laws for distributing obscene content. Authorities have suggested that offenders could face heavy fines and jail. But the law still hasn’t fully caught up to the technology. Right now, Malaysia has no dedicated offence for creating or sharing AI-generated sexual imagery of a real person without consent. The clip may be “fake,” but the reputational damage is painfully real, and the law is still arguing with itself about where that falls — is it defamation? Obscenity? Harassment? Something else? Meanwhile the blackmailers are already transnational, anonymous, and paid in crypto.
Culturally, the scam is diabolically tailored to Malaysia. Politics here runs not just on policy but on perceptions of moral authority, especially for Malay-Muslim men in public office who are expected to project both religious credibility and family respectability. A sex clip — real or not — is instant poison. We’ve seen careers end over leaked hotel-room videos, often circulated right before party elections or crucial parliamentary votes. That history is exactly what these AI extortionists are exploiting. They don’t have to convince the whole country the video is real. They just have to trigger doubt, or enough disgust, to make an MP shut up, drop a reform, or step back from criticising the wrong person.
There is also the social-media amplification machine. In Malaysia’s political WhatsApp groups, once an image is forwarded with the whisper “jangan viralkan (don’t spread this)” you can be sure it’s already viral. The attackers know that even if the police immediately announce “it’s a fake,” the screenshots will live forever in Telegram gossip channels and anti-government Facebook pages. By the time digital forensics teams finish proving it’s synthetic, the casual voter in Kedah or Johor has already absorbed the headline: “YB caught in sex act.” The correction rarely travels as far as the lie.
For now, ministers are urging all MPs — government and opposition — not to panic, not to pay, and to lodge reports. That’s basic damage control. But the truth is, this scandal is not just a scandal. It’s a forecast. Deepfake sexual blackmail is about to become a permanent weapon in Malaysian politics unless the country moves fast on three fronts: technical capacity (training police and courts to verify synthetic media quickly), legal clarity (explicitly outlawing non-consensual AI sexual material, regardless of “realness”), and cultural maturity — a collective agreement to stop rewarding cheap fitnah as political sport. Because in the end, this is not really about sex. It’s about control. And whoever controls maruah controls power.

My dear Malaysia, land of moral outrage and eternal political drama — you’ve done it again! This time, our austere, pious male politicians are the ones clutching their kain pelikat in horror, screaming “fitnah!” because someone dared to send them AI-generated sex videos of themselves. How deliciously poetic. For once, the men who built their reputations on righteousness and “family values” are the ones sweating under the digital duvet.
Let’s be honest, darlings — the mere thought of being portrayed in a scandalous act has sent our YBs (Yang Berhormats, “The Honorables”) into full panic mode. They are so pure, so above suspicion, that the idea of being seen naked (even artificially) is apparently more terrifying than climate change or corruption scandals. No, no, not our holy gentlemen! They have never entertained an indiscreet WhatsApp chat, never smiled at a pretty intern, never asked their assistant to “come discuss policy in private.” Heaven forbid!
And suddenly, the police — usually slower than a Grab driver on a rainy Friday — are sprinting into action. Multiple reports, immediate investigations, press conferences! Bravo! This is the kind of lightning-speed policing Malaysian women can only dream of when they report actual cases of revenge porn, domestic violence, or sexual harassment. When a woman’s intimate photo is leaked, the response is: “We’ll see, maybe it’s her fault.” When a male politician gets a fake AI porn clip? “This is a matter of national security!” You can almost hear the sirens of patriarchy blaring down Bukit Aman.
But let’s not underestimate the artistry of the blackmailers. They have tapped into the oldest currency in Malaysia: malu (shame). Nothing topples a powerful man faster than the suggestion he’s been caught with his trousers down. And in a country where the public morality police are often louder than the actual police, a single doctored screenshot is enough to make a whole Parliament quake.
Still, Spicy Auntie has to laugh — not because extortion is funny, but because hypocrisy always is. These same men have passed or supported laws regulating women’s bodies, speech, and clothes. Now they’re discovering how fragile reputation becomes when technology strips away their armor of piety. Maybe it’s time they learned empathy the hard way: by watching their own faces — fake or not — become victims of a system that worships shame more than truth.
So to my dear YBs: welcome to the world women live in every day. Unwanted images, unsolicited exposure, and endless judgment. You don’t like it, do you? Good. Now you finally get it.