AI and its crunch points
AI has fast become a new entrant on the list of concerns for legal professionals since ChatGPT's launch in November 2022 set off something akin to an AI arms race. Whether AI will make lawyers redundant remains a talking point, but it sits behind more risk-based and near-term worries: what AI-use policies businesses need in place to protect themselves, and how to keep up as AI evolves. Fintech lawyers are both excited and nervous to see how AI might affect their working environment (the legal sector as much as the Fintech sector), from product development to improving processes.
The regulatory framework that applies to the development and use of AI, including generative AI, is complex and multilayered. A range of existing laws already apply, including privacy, cyber, intellectual property, antitrust, consumer protection and employment laws, as well as financial sector-specific rules. More recently, countries have begun adopting laws, proposals and bills that specifically focus on regulating AI. At the time of publication, the EU has just reached a historic political agreement on its landmark Artificial Intelligence Act. While more technical wrangling is likely still to follow to refine the agreement, it will move the dial on how the world looks at AI regulation, and we can expect other legislators and regulators globally to follow suit.
Many of these laws and regulatory principles include requirements regarding governance, oversight and documentation. For example, the EU GDPR includes a principle of accountability alongside a number of specific requirements for assessments and record-keeping, and the draft texts of the EU's AI Act include requirements relating to risk-management systems, oversight, audit and record-keeping. For financial services, it is also important to consider applicable rules and guidelines from the FCA and international equivalents on oversight and governance in areas such as retail banking products, outsourcing arrangements, model risk management, operational risk management and the responsibilities of senior management.
We asked Nicole Kidney, a Tech/Digital Associate at Clifford Chance, for her views on how fintech companies can best start to address AI governance (see her comments at the end of this section).
Q&A on the challenges of AI for start-ups and SMEs
with Natasha Ballantine, Deputy General Counsel of tech company Foundry
When it comes to AI, what are the key crunch points you're having to look out for in deploying AI both internally and externally for product-related matters?
When we talk about AI, it's important to differentiate between Generative AI, for example ChatGPT, and other types of machine learning outside of deep learning.
As a software company we have been looking at machine learning for a few years and currently offer plug-ins to our main products that include machine learning. One of these plug-ins is called CopyCat. It copies specific visual effects from a small amount of data you provide and learns to replicate the transformation across a much larger pool, saving the person using the software a lot of time and building efficiency. From a legal perspective, we have been thinking about what is being input to a model and the issues around data protection and intellectual property rights, as well as looking to analyse the output of the model.
Generative AI, by contrast, is a relatively new area, and many SMEs are trying to work out what their risk appetite is and what their policies and procedures should be. There are similar considerations around data protection, IP rights and keeping company information confidential. In relation to the output, laws globally differ on who owns it, and some tools contain licence-back provisions in favour of the tool's owners, so it becomes quite complex.
Foundry is owned by the US-listed corporate Roper Technologies Inc. Natasha also works with Fintech iPipeline, also owned by the Roper group.
"AI presents great opportunities for Fintech companies, but the fast moving regulatory landscape can seem daunting. There are steps that companies can take now to start to develop and deploy AI responsibly, while enabling adaptability for upcoming regimes."
- Nicole Kidney, Associate, Clifford Chance