AI and data governance
Getting ahead of AI risks
"Accuracy and risk."
- Respondents' biggest concern about using AI in legal work
Why it matters
As AI adoption surges, legal teams are key gatekeepers, helping to identify and manage emerging risks affecting data privacy, intellectual property rights, operational resilience, and broader legal and ethical concerns such as fairness and transparency. Their proactive oversight is vital to safeguard assets, ensure compliance, and keep the business ahead of regulatory change.
Travers Smith view
Key blind spots we see:
- Do you know what AI tools your business is using - or what data they’re accessing? Under pressure to deliver efficiencies quickly, some businesses operate with limited visibility over the proliferation of AI tools and platforms used across business units. When AI adoption happens without central legal or risk oversight, you lose the opportunity to properly assess contractual, confidentiality, data protection, and intellectual property risks. This significantly increases the likelihood that sensitive information is leaked via third-party AI tools, or that AI is used inappropriately in high-risk contexts without adequate human oversight.
- Does your business own what it thinks it owns? The legal position on ownership of IP in AI-generated outputs remains unsettled. While the IP in a human creation would normally belong to the author or their employer, the copyright status of purely AI-generated works is uncertain because of a perceived lack of originality - and the AI system itself is not currently recognised as an inventor or author for the purposes of IP protection. If your business model depends on owning and commercialising IP, these ambiguities heighten the need for robust controls over how AI is deployed, so that the business is not left unable to enforce or protect its rights. Ownership is only part of the picture: the data used to train AI systems may also contain third-party IP or confidential information, and without the correct consents and licences your organisation could face legal action from those rights holders. The extent to which training an AI system on third-party IP constitutes infringement is currently being tested in courts globally, but these cases have yet to deliver definitive answers, so organisations must actively manage these risks - through due diligence and contracts - until greater legal clarity emerges.
- Has your business assessed its cyber resilience beyond its own perimeter? Unchecked AI adoption can introduce new entry points and amplify existing security risks across the business and its supply chain. While internal cyber risk might be actively managed, supply chain vulnerabilities are often overlooked - and these are increasingly the source of sophisticated cyber-attacks, data breaches, and operational disruption. This may be a blind spot for some businesses, but it is firmly on the radar of legislators and regulators across the UK and EU: recent legislation, such as the NIS2 Directive and DORA in the EU and the UK's new Cyber Security and Resilience (Network and Information Systems) Bill, is bolstering cyber resilience requirements - and regulators' enforcement powers - in relation to supply chains.
Action points
AI Policy: Businesses should map and monitor all AI systems and tools in use, maintain an up-to-date inventory, and require sign-off for new AI deployments (an illustrative sketch of what such an inventory might capture follows these action points). Legal should be part of the assessment process. It is also important to have an AI Policy that stipulates which systems are authorised for use, and to train staff on the safe and appropriate use of AI and its risks. For more on the implications of AI for legal teams, see below.
IP in AI output: While the copyright position at law remains uncertain, use your contracts to clearly delineate IP ownership for AI-generated outputs and to secure rights in training data. The outcome of the UK government's copyright consultation could bring greater clarity on ownership and infringement, so monitor developments closely.
Cyber resilience: Encourage the business to check the security and cyber resilience of suppliers with access to your systems - not just as a pre-contractual exercise or when something goes wrong, but as part of a regular supply chain audit. Build the supply chain into your cyber resilience planning and rehearse your response to supply chain attacks.
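As a purely illustrative aid for the inventory point above, the sketch below shows, in Python, the kind of fields an AI tool register might capture and how unapproved or overdue tools could be flagged. It is a minimal sketch under assumed field names (vendor, data_categories, approved_by_legal, review_due) and assumed helpers (AIToolRecord, tools_needing_review), not a prescribed standard or a definitive implementation.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AIToolRecord:
    """One entry in an AI tool inventory (illustrative fields only)."""
    name: str                        # e.g. a contract review assistant
    vendor: str                      # third-party supplier, if any
    business_unit: str               # where the tool is used
    data_categories: list[str]       # e.g. ["client confidential", "personal data"]
    use_case: str                    # what the tool is used for
    approved_by_legal: bool = False  # has legal signed off on this deployment?
    review_due: date | None = None   # date of the next periodic re-assessment

def tools_needing_review(inventory: list[AIToolRecord], today: date) -> list[AIToolRecord]:
    """Return tools that lack legal sign-off or whose periodic review is overdue."""
    return [
        tool for tool in inventory
        if not tool.approved_by_legal
        or (tool.review_due is not None and tool.review_due <= today)
    ]
```

In practice many businesses will keep this register in a governance or procurement tool rather than in code; the substance is the same - record who is using which tool, with which data, and whether legal has reviewed and approved it.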





