Miller Thomson LLP


Regulating deepfakes through copyright: Would Denmark’s proposed approach be right for Canada?

Artificial intelligence ("AI") generated synthetic media, or "deepfakes," pose increasingly significant legal and ethical challenges within the cybersecurity landscape. Deepfakes involve the unauthorized use of an individual's likeness to generate highly realistic yet fabricated images, audio or video. Inevitably, they raise issues around personality rights, which is the right to control the use of one's own name, image, voice and other unique identifying features. As deepfake technology becomes more accessible, understanding the legal protections against the unauthorized use of an individual's likeness or other personality rights in Canada and globally, is essential.

Earlier this summer, the Danish Minister of Culture, Jakob Engel-Schmidt, announced proposed amendments to Denmark's copyright legislation aimed at combating the proliferation of deepfakes (the "Proposed Bill"). Backed by cross-party consensus, the Proposed Bill underscores the growing urgency to regulate emerging technologies under both existing and new legal frameworks. Notably, the European Union's AI Act (also referred to as Regulation (EU) 2024/1689), which came into force in August 2024, already includes a requirement to label deepfake content as being artificially generated or manipulated.

However, while the EU's approach addresses the concerns of consumers of deepfake content, Denmark's Proposed Bill goes further by addressing rights and protections for the subjects featured in the content. Most current laws enacted to protect personality rights rely on privacy- or trademark-based protections, which are limited in scope and duration, rather than on copyright principles. Denmark's novel approach of reframing digital identity itself as a matter of copyright offers broader protections. In doing so, Denmark is charting new legal territory that could redefine how societies safeguard personal identity in the age of AI. The question is whether Canada should follow suit.

Proposed amendments to Danish copyright law

The Proposed Bill outlines amendments to Denmark's copyright legislation that would create new pathways for individuals who are the subjects of deepfakes to take legal action. More particularly, it would prohibit the sharing of realistic digital reproductions of someone's voice, image or other personal characteristics without consent and would provide the subjects of deepfakes with the right to demand removal of the content from digital platforms.

If the platforms fail to comply with such requests, they would be liable for fines. The Proposed Bill establishes a protection period lasting for the lifetime of the individual whose likeness was used in the deepfake content plus 50 years following that individual's death. In keeping with typical copyright infringement exceptions, the use of deepfakes for caricature, parody or satirical purposes would be unaffected and therefore permitted under Danish copyright law.

Canada's current approach to copyright and deepfake regulation

Currently, Canadian law does not extend copyright protection to an individual's personality rights. This is due, at least in part, to the fact that the Copyright Act only recognizes copyright in original works that are, among other things:

  1. created by an author, and
  2. fixed in a material form.

In practice, this means that while one's voice or face is not, on its own, protected under copyright, as soon as it is incorporated into a fixed work, such as a recording of a voice or a photograph of a face, copyright subsists in that work.

Importantly, in most cases, the first owner of copyright in a work is the author of the work. As a result, when a photographer takes a picture of a subject, it is the photographer - and not the subject whose image is being captured - who owns the copyright in the resulting picture.

With this in mind, if the subject of a deepfake wanted to take legal action based on the depiction of their persona without their permission, they would have to rely on other statutory frameworks and common law actions, such as:

  • Statutory Privacy Claims: In some provinces, privacy statutes prohibit the unauthorized use of a person's name, likeness, or voice for the purpose of advertising or promoting goods or services or for any other purposes of gain, as it is considered a violation of privacy. Deepfake content that replicates a person's likeness without consent may fall within the scope of these statutory protections. However, in most cases, the relevant legislation is clear that this type of claim is extinguished on the death of the person whose privacy was violated.
  • Trademark Protections: Under Canada's Trademarks Act, it is prohibited to adopt "in business, as a trademark or otherwise" a mark that consists of or closely resembles the portrait or signature of a living individual or someone who has died within the last 30 years. While not traditionally used in the context of deepfakes, this provision could be relevant, depending on the content and use of the deepfake.
  • Tort of Passing Off: A passing off claim may arise where a deepfake falsely implies that an individual has endorsed or is affiliated with a product, service, or brand, thereby capitalizing on their reputation and goodwill without their consent.
  • Tort of Misappropriation of Personality: A misappropriation of personality claim protects an individual's right to control the commercial use of their personality rights. The claimant must prove that they are identifiable, that their likeness was used without consent, and that it was exploited for commercial gain. Courts have held that rights pursuant to this tort may survive death, though the duration of survivability has not been settled. Deepfakes that misappropriate an individual's likeness for profit may be subject to this tort, if commercial benefit arises from manipulated but recognizable images.

The future of deepfake regulation in Canada

If Canada were to follow Denmark's lead in expanding copyright law to cover an individual's personal attributes such as physicality, facial features and vocal likeness, it would be a significant shift in how individuals protect their identity in the digital age.

By framing misuse of likeness as copyright infringement, individuals could gain expanded rights and enforcement tools. For example, the time period for exclusive control over one's own personality could be defined in accordance with the Copyright Act's term of one's lifetime plus 70 years, instead of the shorter time limits provided under privacy or trademark law. Additionally, victims of deepfakes could have access to statutory damages, a powerful remedy under Canada's Copyright Act. Of course, this would also mean that the usual copyright exceptions, like fair dealing or user-generated content, would still apply to deepfake cases.

Ultimately, however, bringing personality rights within the Copyright Act and treating their misuse as a form of copyright infringement would represent a significant departure from Canada's current model for understanding and recognizing copyright, as it would raise questions around current definitions of authorship, ownership and protectable works. Rather than making additions to the current copyright framework, as Denmark has proposed to do, it might be more effective to address deepfakes in comprehensive AI legislation that would also address other matters unique to AI and its related ethical and legal issues.

Conclusion

Denmark's proposed legislation represents a forward-thinking approach to regulating deepfake technology and raises the question of whether and how Canada might do the same. As these technologies become more sophisticated and widespread, it may become imperative for Canada to amend its existing legal frameworks or develop new ones to ensure that both public figures and private citizens have effective tools to protect and enforce their personality rights. If you would like further information or to discuss these issues, please reach out to a member of Miller Thomson's Technology, Intellectual Property, and Privacy Group.
