Buying a business that uses or commercialises AI? Issues to consider
Businesses are increasingly using AI in their products, services and day-to-day operations. But many don’t yet have the necessary skills or resources to implement responsible AI practices, such as effective AI governance frameworks. This lack of expertise, combined with the rapid evolution of AI systems, can create unexpected risks for buyers. That’s why it’s important for buyers to conduct thorough legal due diligence on any business that uses or commercialises AI.
This article outlines the key areas buyers should consider when conducting legal due diligence on a business that uses or commercialises AI – in addition to the areas buyers typically cover in any legal due diligence.
Inventory and description of AI systems
It is essential to have a comprehensive inventory of all AI systems, models, tools and solutions developed, deployed, licensed or used by the business. This should include systems such as machine learning, deep learning, natural language processing, generative AI and agentic AI.
For each system, buyers should understand its purpose, whether it is used internally or offered to customers, its origin (in-house or third-party) and the jurisdictions in which it operates.
Buyers should also ask for the details of any AI systems under development, including the stage of development.
Undertaking a high-level review of the business's AI systems is critical to understanding how the business uses AI and identifying any risks in its approach (eg. development or use of AI in high-risk use cases without sufficient guardrails, or extensive or unmonitored use of public, non-secure AI tools).
Corporate structure
Buyers should review the corporate structure of the business to determine if it effectively isolates AI-related activities where appropriate. This is especially relevant where the activities involve heightened legal, regulatory or reputational risks (such as in healthcare and financial services).
If the business commercialises AI tools and solutions developed in-house, buyers should identify which entities hold the intellectual property and which enter into commercial arrangements, and whether these are separate from the entities that conduct the business's other day-to-day operations.
The review should also consider whether the corporate structure is appropriate for the nature and scale of the business’ AI operations and whether post-acquisition restructuring is required.
Undertaking a review of the corporate structure is important to ensure that any risks posed by AI are appropriately quarantined.
AI agreements
Buyers should review any third-party agreements governing the rights of the business to use and commercialise (as applicable) the relevant AI tools and solutions used in the business (including any licensing, service, software-as-a-service, consulting or other similar agreements). Standard terms and conditions for ‘off the shelf’ type AI tools and solutions used by the business should also be reviewed.
Understanding the legal rights and obligations of the business is critical to understanding the risks that may arise from its use and commercialisation of AI.
Data and privacy
Buyers should ask the business to identify, in general terms, the data used for training, validation and testing of AI – including where it comes from (eg. proprietary, open source, licensed or user-generated) and the steps taken to ensure it is used lawfully. The business should also confirm it holds all necessary intellectual property rights to use this data.
Where any data used includes personal information, the business should confirm it complies with all relevant privacy laws and best practices, including steps taken to de-identify data where appropriate.
Buyers should also request and review copies of the business's policies for data retention, deletion and access, along with any third-party data-sharing or processing arrangements.
It is also important to confirm the business complies with modern slavery and other laws relevant to the preparation and tagging of any data used to train, validate or test AI.
It is essential that any data used to train, validate and test AI has been properly and lawfully obtained. If it has not, the business may face legal claims, such as claims for copyright infringement. Non-compliance with privacy and other relevant laws may also expose the business to significant penalties and reputational damage.
Security, reliability and incident management
Where possible, buyers should review the business’s policies and procedures relating to the security, robustness and reliability of AI systems. This includes both general security-related documents – such as security policies, assessments and certifications – as well as AI-specific documents, like model documentation and testing reports.
Buyers should ask for details of all known incidents of misuse, malfunction or unintended consequences, as well as internal investigations or remedial actions. It is also important to request evidence of effective incident and risk management and ongoing monitoring of AI system performance.
Understanding the robustness of the AI systems used or commercialised by the business and the monitoring and control measures in place is important to understanding the business’s level of risk exposure.
Regulatory and ethical compliance
In certain jurisdictions, regulatory approvals, notifications, registrations or other requirements may apply to AI deployment (such as mandatory disclosure requirements). It is important that buyers determine whether the business operates in such jurisdictions and, if so, request evidence of compliance.
Even where no specific regulatory requirements apply, buyers should request confirmation of the business's compliance with any relevant ethical guidelines, codes of conduct or industry standards applicable to its use of AI.
Buyers should also review any training provided to staff on responsible AI use and the existence of internal policies or committees overseeing AI governance, ethics and compliance. Where internal policies exist, the terms should be reviewed to ensure that the business’s practices align with industry standards.
Details of any pending regulatory enquiries or investigations should be requested and reviewed. Buyers should also examine the business's processes for responding to data subject requests and regulatory enquiries.
Importantly, buyers should confirm that the business complies with all applicable laws relating to the use and commercialisation of AI, and that it has procedures in place to reduce the risk of regulatory non-compliance.
Insurance
Lastly, buyers should confirm the target business holds insurance policies that cover AI-related risks.
The ways in which businesses use and commercialise AI are evolving rapidly – and the regulatory landscape is changing just as quickly, with new regulations under consideration or being introduced in many jurisdictions.
Given the pace of change, it is more important than ever that buyers remain up to date with recent developments.
If you would like more information or to discuss how we can help you conduct legal due diligence and assess AI-related risks, please contact our team.