Australia’s Deepfake Dilemma and the Danish Solution

Before this model can be adopted, policymakers must assess its feasibility. The legal hurdles are significant: copyright law protects original expression, and the human face does not fit neatly into that box. The most prudent path would likely be to create a new right that borrows principles from copyright but is explicitly designed to protect biometric identity. This would require robust exceptions for news reporting, satire and artistic expression, modelled on the fair-dealing provisions already in the Copyright Act.

Enforcement is equally challenging, but the true value of a biometric copyright framework is its capacity to impose clear legal liability on platforms that have so far evaded meaningful responsibility. Property-based rights would give individuals or the eSafety Commissioner a mechanism to issue takedown notices for deepfakes with the same legal force as a major film studio issuing a notice for a pirated movie. A platform’s failure to act would expose it to significant statutory damages, incentivizing proactive removal rather than passive hosting.

The federal government’s power over copyright provides a basis for a national law. Biometric rights would complement, not replace, existing laws. The Privacy Act governs how organizations handle data, while biometric rights would govern how our identities are represented and used. These changes would strengthen the mandate of the eSafety Commissioner, creating an unambiguous category of illicit content and a powerful new tool to protect Australians online.

The primary appeal of this model is its potential to reinforce social cohesion. It offers a tool to protect vulnerable communities disproportionately targeted by digital harassment, helps restore trust in public institutions by making it easier to delegitimize fabrications and fosters a healthier public sphere where individuals can engage without fear of having their identities exploited.

Given the clear shortcomings of our current model, the time is right to consider a new strategy. There is a clear willingness within the Australian government to address this threat: just this month, Arts Minister Tony Burke discussed the threat of deepfakes, saying, ‘I don’t fear technology. I just know we need to be able to respond to technology.’

The government is well positioned to lead this effort by establishing a parliamentary working group to assess the feasibility of a biometric copyright framework. The group should bring together experts from the Attorney-General’s Department, the eSafety Commissioner and civil society to create a balanced and effective Australian model.

Liberal democracies now have to fill the regulatory vacuum that technology platforms have created. Australia’s greatest strength is its social fabric, and by learning from Denmark, we can reinvest in the shared, verifiable reality that binds our nation together.

Andrew Horton is ASPI’s chief operating officer. Elizabeth Lawler is a subeditor for The Strategist. This article is published courtesy of the Australian Strategic Policy Institute (ASPI).
