Funds investing in a business that uses or commercialises AI? Read this first

Insights | 20 Aug 2025
By Steve Johns, James Deady, Eliza Unger and Stephen Lin

The use and commercialisation of AI is becoming an increasingly attractive prospect for investment funds. However, many businesses still lack the skills and resources needed to implement and manage responsible AI development and use practices. Together with the rapid evolution of AI systems and use cases, this creates a complex set of risks. It is essential that funds looking to invest in such businesses understand and assess these risks.

In this article, we consider key issues investment fund managers should consider when investing in a business that uses or commercialises AI.

Understand your AI exposure

When investing in a business that uses or commercialises AI, it is critical to understand what AI systems the business uses and how it uses them.

For that reason, managers should obtain a comprehensive inventory of all AI systems, models, tools and solutions developed, deployed, licensed or used by the business. This should include systems such as machine learning, deep learning, natural language processing, generative AI and agentic AI. 

For each system, fund managers should understand its purpose, whether it is used internally or offered to customers, its origin (in-house or third-party), and the jurisdictions in which it operates. 

Details of any AI systems under development, including the stage of development, should also be obtained from the business.

Understand data, data security and privacy

Data is fundamental to the development and use of AI. Given this, it is important to ensure that any data used to train, validate and test AI has been properly obtained. Failure to use lawfully obtained data may result in the business being subject to claims, such as for copyright infringement.

The business should be required to identify, in general terms, the data used for training, validation and testing of AI, including its origin (ie proprietary, open source, licensed or user-generated) and the steps taken to ensure lawful use. The business should also confirm that it holds all necessary intellectual property rights to use such data.

It is also important to ensure that the business complies with all privacy laws relating to the use and collection of data used by, or to train, AI systems. Failure to comply may expose the business to significant penalties and reputational damage.

Where data used includes personal information, the business should confirm it complies with all relevant privacy laws and best practices, including any steps taken to de-identify data where appropriate.

Fund managers should also request copies of the business’ policies for data retention, deletion and access. Any third-party data sharing or processing arrangements should also be obtained and reviewed.

Lastly, it is important to confirm that the business complies with modern slavery and other relevant laws concerning the preparation and tagging of any data used to train, validate or test AI. Again, failure to comply may expose the business to significant penalties and reputational damage.

Regulatory and ethical compliance

It is important for funds to confirm that the business complies with all applicable laws relating to the use and commercialisation of AI and that it has procedures in place to reduce the risk of regulatory non-compliance. Failure to do so can result in significant penalties.

In certain jurisdictions, regulatory approvals, notifications or registrations may be required for AI deployment and other regulatory requirements regarding the use of AI may apply (such as mandatory disclosure requirements). Fund managers should determine whether the business operates in such jurisdictions and, if so, request evidence of compliance.

Even where no specific regulatory requirements apply, managers should require the business to confirm its compliance with any relevant ethical guidelines, codes of conduct or industry standards applicable to the business’ use of AI.

Managers should also review any training provided to staff on responsible AI use, as well as the existence of internal policies or committees overseeing AI governance, ethics and compliance. Where such policies exist, their terms should be reviewed to ensure the business’ practices align with industry standards.

Details of any pending or past regulatory enquiries or investigations should be requested and reviewed.

The ways businesses use and commercialise AI are evolving rapidly, and the regulatory landscape is constantly changing. It is important for fund managers to identify the key areas of legal risk associated with the use and commercialisation of AI systems and to undertake appropriate due diligence.

If you would like more information, or to discuss how we can help you conduct legal due diligence, please contact our team. 
