ClothOff is no longer just one app; by late 2025 the name has become a generic term for AI undressing tools, much like “deepfake nudes” or “AI revenge porn.”
The original clothoff.io domain went offline in June 2025 after Cloudflare, AWS, and every major payment gateway dropped it at once, and the Telegram bot @clothoff_bot was terminated the same week. Yet within hours of those takedowns, more than 80 mirror bots appeared with identical interfaces and the same underlying model (some even kept the original logo and coin prices).
These clones now operate on privacy-friendly hosts in the Seychelles, Moldova, and Russia, paid for in cryptocurrency. The best-known successors (NudeAI, Uncloth, MagicRemove, RemoveHer) openly advertise on X and TikTok through throwaway accounts that are banned and recreated daily.
Law-enforcement estimates suggest that in 2025 alone, over 120 million non-consensual nude images were generated worldwide using ClothOff-derived tools. In response, the U.S. passed the “Preventing Deepfake Porn Act” in October 2025, making it a federal felony to distribute such images of identifiable people without their consent. The EU’s AI Act will classify any undress service as “high-risk prohibited” as of January 2026.
Despite the crackdown, teenagers still trade invite links in Discord servers and Snapchat groups, and new one-click web versions appear faster than they can be taken down. For most people under 25, “send me a ClothOff” has become casual slang, even when they’re using a completely different bot.
In short, the original ClothOff is dead, but the damage it set in motion is now permanent and borderless.