- Rising Deepfake Exploitation: “Nudify” apps are being used in Australia to create non-consensual sexually explicit content, targeting women and minors, despite laws criminalizing such practices.
- Underground Funding & Distribution: These apps thrive through cryptocurrencies, in-game currencies, and obscure websites, bypassing traditional app stores and payment systems.
- Combatting the Crisis: Experts call for stricter regulations, financial disruption of these platforms, and increased public awareness to curb deepfake abuse and sextortion.
In Australia, a disturbing trend has emerged: “nudify” apps and online services are being used to create non-consensual sexually explicit content, particularly targeting women and minors. These apps, which digitally alter images by placing a person’s face onto a body performing sexual acts, are fueled by an underground economy in which users pay with money, cryptocurrency, and even in-game currencies to generate exploitative deepfake imagery. Despite Australia criminalizing the sharing of non-consensual sexual deepfake content last year, the problem persists, with several high-profile cases surfacing in 2025.
The Australian arm of the International Centre for Missing and Exploited Children (ICMEC) has been investigating the financial mechanisms behind these apps. The research highlights how users can easily upload a photo to these apps, which then generate convincing deepfakes by either swapping faces onto existing bodies or using entirely computer-generated models. While some of these apps are still available in app stores, others operate outside traditional platforms, using payment methods like cryptocurrencies and in-game currencies to fund their operations.
ICMEC’s findings also shed light on the victims of these exploitative practices, revealing that young girls are disproportionately targeted. The organization notes that many users of these apps seek to exploit or extort others, and a significant number of minors report threats involving digitally altered images. This form of “sextortion” is an alarming trend: two in five Australian minors who have experienced sextortion say they were coerced using a deepfake image.
The challenge of addressing these apps is compounded by their ability to find alternative financing and distribution channels. While payment processors like Mastercard and Visa have successfully cut ties with some apps, and certain apps have been removed from major app stores, others continue to operate via obscure websites or through adult networks, making them harder to detect and shut down. Even with these efforts, deepfake creators are increasingly turning to decentralized financial systems like cryptocurrency, further complicating enforcement.
ICMEC suggests three major actions to tackle the problem: disrupting the financial systems behind these apps, preventing their advertisement, and educating the public about the dangers. CEO Colm Gannon emphasizes the need for a change in social norms, stressing that creating these images is a violation of consent, just as much as physical sexual assault. While Australian laws are evolving to criminalize the distribution of such content, Gannon advocates for more comprehensive measures, pointing to UK proposals that would make the creation of deepfake content itself a criminal offense as a potential model for broader regulation.
As the threat of deepfake abuse continues to grow, the Australian government, tech companies, and advocacy groups face an ongoing battle against this form of exploitation. With readily accessible technology making harmful content easy to create, the challenge now lies in balancing regulation with legitimate technological innovation.