Bruce Willis makes deepfake technology relevant. What do gamers need to know about this technology?

By Ben Hamilton and James Deady 

Deepfake technology is becoming increasingly sophisticated and is attracting interest from the gaming and entertainment industry. But the use of deepfake technology is largely unregulated. What are the legal risks associated with its use?

There has been recent viral speculation in the media that renowned actor Bruce Willis sold rights to his ‘likeness and image’ to US AI (artificial intelligence) firm Deepcake.

Willis collaborated with Deepcake in 2021 to create a 'digital twin' of himself for use in advertisements for a Russian telecoms company. The digital twin was created using deepfake techniques trained on some 34,000 images of Willis drawn from photos and footage from the films Die Hard and The Fifth Element. You can see from this Facebook video that the resemblance is uncanny.

Deepfake technology is becoming increasingly sophisticated and appealing to both the gaming and entertainment industries. It allows real actors' expressions and features to be sampled to generate real-time facial movements driven by chat-generated text. This means the cost of producing a virtual character for a game, say your favourite sports star, could be cut in half.

So, what is deepfake technology and what are the legal risks associated with its use?

What is a deepfake?

A deepfake is a video, audio clip or photo that appears genuine but is in fact the result of manipulation by artificial intelligence. Deepfakes are an example of 'synthetic media': images, sound and video that appear to have been created through traditional means but have actually been constructed by complex software, generally advanced forms of machine learning and AI.
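At a conceptual level, the classic face-swap deepfake pipeline trains a single shared encoder together with one decoder per identity; swapping decoders at inference time re-renders one person's face in another's likeness. The toy sketch below illustrates that idea with a tiny linear autoencoder in plain NumPy. It is purely illustrative: random vectors stand in for face images, and all dimensions and names are our own assumptions, not any real deepfake system.

```python
# Conceptual sketch only: a shared encoder with one decoder per identity.
# Each decoder learns to reconstruct its own identity's faces; the "swap"
# encodes a face of identity A and decodes it with identity B's decoder.
# Random vectors stand in for face images; everything here is illustrative.
import numpy as np

rng = np.random.default_rng(0)
DIM, LATENT, LR, STEPS = 64, 8, 0.01, 200

faces_a = rng.normal(size=(100, DIM))   # stand-in faces, identity A
faces_b = rng.normal(size=(100, DIM))   # stand-in faces, identity B

W_enc = rng.normal(scale=0.1, size=(DIM, LATENT))     # shared encoder
W_dec_a = rng.normal(scale=0.1, size=(LATENT, DIM))   # decoder for A
W_dec_b = rng.normal(scale=0.1, size=(LATENT, DIM))   # decoder for B

def mse(pred, target):
    return float(np.mean((pred - target) ** 2))

init_loss = mse(faces_a @ W_enc @ W_dec_a, faces_a)

for _ in range(STEPS):
    for faces, W_dec in ((faces_a, W_dec_a), (faces_b, W_dec_b)):
        z = faces @ W_enc            # encode into the shared latent space
        recon = z @ W_dec            # decode with the identity's own decoder
        err = recon - faces
        # Plain gradient descent on the mean squared reconstruction error.
        grad_dec = (z.T @ err) / len(faces)
        grad_enc = (faces.T @ (err @ W_dec.T)) / len(faces)
        W_dec -= LR * grad_dec
        W_enc -= LR * grad_enc

final_loss = mse(faces_a @ W_enc @ W_dec_a, faces_a)
print(init_loss, final_loss)         # reconstruction error should fall

# The "swap": identity A's faces rendered through identity B's decoder.
swapped = (faces_a @ W_enc) @ W_dec_b
print(swapped.shape)                 # (100, 64)
```

Real systems use deep convolutional networks trained on thousands of images per identity, but the structure — shared representation, identity-specific rendering — is the same.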

You may have come across the @deeptomcruise TikTok account, which has posted dozens of deepfake videos impersonating Tom Cruise and attracted some 3.6 million followers. The videos look practically identical to the real deal.

But this technology isn't limited to TikTok. In 2021, visual artist Chris Umé released a video in which he applied deepfake techniques to FIFA players' faces. Gaming companies are now exploring deepfake opportunities within the gaming industry; after all, who wouldn't want to play a game with a life-like avatar of their favourite character from Star Wars? An even more immersive experience would be not simply controlling the character, but having the avatar track your own face and mouth movements – something deepfake technology is making a reality.

The legal risks of deepfake technology

Unregulated territory

Deepfakes are rapidly developing in a largely unregulated area of technology. There are considerable concerns about their misuse, such as fraudulent impersonation for financial gain. For example, Patrick Hillman, chief communications officer of blockchain ecosystem Binance, claimed that scammers created a deepfake of him and used it to impersonate him in Zoom meetings with his contacts.

Due to these concerns, companies such as Google have banned the training of AI systems that can be used to generate deepfakes on the Google Colaboratory platform. Google also released a database of 3,000 deepfakes to assist researchers in building the tools needed to detect harmful deepfake videos.

Copyright issues

The issues with respect to copyright are likely to be complex.

In the Bruce Willis example, a developer considering using images from Willis' movies to create the deepfake may need to seek permission not only from the actor but from the production company as well. Generally, copyright in a completed film will be owned by the person who arranged for the film to be made (the producer or production company). Unless an agreement is in place providing for a contrary position, it is likely that permission (ie a licence) will need to be obtained from the production company. In creating any deepfake, due diligence will be required to ensure that the developer is not breaching any copyright held by a person involved in the creation of the film and/or imagery that will be used.

Compliance with Australian Consumer Law

A developer will also need to keep in mind its obligations under the Australian Consumer Law.

Section 18 of the Australian Consumer Law provides that a person must not, in trade or commerce, engage in conduct that is misleading or deceptive or is likely to mislead or deceive. A developer will need to ensure all necessary permissions have been obtained from the person whose image will be used for the deepfake. If the permissions are not obtained and the technology is developed, the developer could face significant liability for misrepresenting that the person has endorsed the use of their images or otherwise has an affiliation with the technology.

As deepfakes continue to transform, game developers exploring this technology will need to pre-emptively think through their existing contractual arrangements and navigate any new laws that may be introduced. It will be important to ensure that the necessary intellectual property and licensing agreements are in place to appropriately document the ownership rights in a person's digital likeness.

We consider that these agreements could be quite complex and will require considerable expertise to ensure that the game developers’ rights are appropriately protected.

If you need cyber or intellectual property advice, please do not hesitate to reach out to our team.


Ben Hamilton

Partner & Technology and Digital Economy Co-Lead

Ben specialises in technology law, intellectual property and commercial contracts, trade marks and commercialisation.

James Deady

James is a commercial lawyer specialising in technology procurement, privacy, data security and intellectual property matters.
