The advent of Artificial Intelligence and large language models is changing the way businesses work in immeasurable ways. The ability to automate the process of collection, analysis and summarisation of data is certain to streamline practices and reduce time and costs. It is undoubtedly one of the greatest breakthroughs in modern times and the ripple effects of its creation will be felt by us all moving forward.
However, it is important to remember that these tools are just that: tools! They require oversight and confirmation and should not be used to replace professional judgement.
We are beginning to see contracts that have been generated using large language models such as OpenAI’s ChatGPT, Microsoft’s Copilot or Google’s Gemini (“AI Generated Contracts”) and whilst these contracts may look and feel as though they have been drafted by professionals, they do come with significant shortcomings. To properly consider the potential issues with AI Generated Contracts, I have used a large language model to create my own using the following prompt:
‘I need a set of terms and conditions for the provision of services generated. Under the agreement my company will be charging on a monthly basis for the services, I would like my liability to be as limited as possible, I cannot guarantee any time frames but I will do everything I can to provide the services as well as possible. I need this to be in a legal format that I can reuse for multiple customers. It also must include all the standard terms that are needed. The contract needs to be made under English and Welsh Law with the English and Welsh courts having exclusive jurisdiction.’
This is the level of instruction that we would generally consider standard from new clients, so it is broadly consistent with what a client might provide when considering putting together a new contract. Once this prompt was entered into the large language model, the AI Generated Contract immediately began rolling across the screen, which leads to the first shortcoming.
Instantaneous Drafting
The process of instructing a legal professional to draft contracts takes time for any business. Anyone who has been through this process will likely know that once you make a request, you will be asked a series of questions to clarify the agreement that you need. When creating an AI Generated Contract, there is no such stage; the draft begins instantaneously and will punch out clauses to your heart’s content. Whilst incredibly convenient, this removes any clarification, detailed consideration or examination of your needs in the contract.
Ultimately, it should be understood that if you are creating an AI Generated Contract, the prompt that you type in will be taken as everything that you need, with all facts being considered and correct. In my example prompt, I stated that I would be providing ‘Services’; I did not specify whether I would be milking cows, providing software maintenance or designing the next generation of spaceships, and the system didn’t care. It drafted the contract to fit the provision of generic services.
Whilst this could be construed as a positive (no barrage of questions to slow up the process of getting an agreement over the line), it is important to remember why you need the contract in the first place: to form a legally binding relationship and cover off as many necessary contingencies as possible. Part of the reason to see a professional is to consider points that you may not have known existed, but that a professional with education and experience in the area may put before you. This opening foray of questions allows us as legal professionals to provide you with comfort that we have considered what you may need and presented it to you, rather than accepting instructions at face value.
Lack of Detail Within the Clauses
One of the key indicators that we currently have of AI Generated Contracts is the limited amount of detail in the clauses. My AI Generated Contract is no exception: each of the points that I mentioned within the prompt was present and read as though in line with the prompt. However, when scratching the surface, the lack of detail begins to harm the strength of these core terms.
As an example, my AI Generated Contract states that my customer will ‘pay the Company the monthly charges as specified in the proposal or invoice’. The proposal and invoice are not defined; I mentioned neither in my prompt, and there is no consideration of what will happen if these two documents (should they exist) conflict with one another. Admittedly, this may seem like a pedantic point, but clarity is always paramount in any commercial contract. Under this particular AI Generated Contract, any stray email from a team member containing a price could be considered a ‘proposal’ and could be binding on me and my business.
There is, of course, the argument that (using the example above) in a court, my business may well be proved right and will be able to charge the rate in my invoice, rather than a stray email. However, it seems neither sensible nor economical to take each of my customers to court over the difference between a proposal price and an invoice price. The far cheaper and more convenient solution would be to have an agreement stating which would take precedence.
The example above is only one of the issues presented by the lack of detail in my AI Generated Contract. Throughout the document, a lack of definitions and detail creates uncertainty when posed with some very basic scenarios. This lack of clarity is something we as Commercial Solicitors seek to avoid. Our detail-orientated approach forces us to look at these agreements, consider the ‘what if’ scenarios and add detail accordingly.
Uncertainty of Risk
My AI Generated Contract provides me with a bare-bones contract for the provision of my services. As already established above, it has asked no additional questions to consider what shape the agreement should take, and what is there is fairly minimal, but there is the shape of a contract. I (as a businessperson) need a contract to cover off my risk, and I now have something that looks and reads as though it covers the usual points that I would expect to see covered in a contract. At this point it is fair to think, if it looks like a duck and sounds like a duck…
The problem with this approach is that there is no way to confirm your level of risk. In my prompt I stated that I wanted my liability to be as limited as possible, and the clause that was generated is a minimal one that seeks to cap my liability at the ‘total charges paid by the Customer in the 3 months immediately preceding the event giving rise to the liability’. This drafting alone raises its own questions through the lack of detail covered in the section above (multiple events giving rise to liability, unpaid invoices, high-earning months), but there is also no way to confirm that this liability cap will be suitable and enforceable by asking follow-up questions of a professional.
Large language models are incredibly adept at confidently declaring ‘this will be effective’ about anything they generate, but the honest answer would be far closer to ‘it depends’. Legal professionals have the opportunity to discuss in detail the types of risk you are seeking to mitigate and to use their professional judgement to draft terms accordingly. With this in hand, you can enter into contracts confidently, knowing where your strengths and liabilities lie. This is far preferable to dealing with the shock that the robot may have been fibbing when it declared that your liability clause was bulletproof.
Can I Make it Work Better?
There is always the possibility of improving AI Generated Contracts by including more detailed prompts, following up with questions and adding additional details after the fact. However, no matter the detail that is added, the limitations of these systems will (at least in the short term) still be an issue.
The first limitation is the training data. Ultimately, these programmes are built on huge training data sets (obtaining exact details is difficult, as the developers of these systems rarely share them), and this training data consists, at least in part, of existing agreements that have been published on the internet.
The agreements in the training data will vary from the ineffective to the spectacular and everything in between. If you imagine a stadium full of people that have each searched online for ‘Service Contract’ and each printed one, some of these people are legal professionals and others are not. Creating an AI Generated Contract would be similar to asking this stadium full of people to write you a contract using only a single paragraph as instruction. In actual fact, it would likely be somewhere closer to asking 200,000 stadiums full of people, but the point still stands. You will get an approximate answer that contains the opinions of the good and the bad, but not one that can be relied on.
The final and most important limitation of these models is that they are not designed to provide legal advice; they are designed to generate text. They are incredible, sophisticated pieces of technology that can provide time- and cost-saving solutions, and those who use them appropriately will reap the benefits. But they remain tools to assist with text generation and are no replacement for professional expertise.
At Ellisons, our Commercial Contract Team is increasingly using AI, but solely as a tool for assisting us to create and review contracts for our clients. If you are in need of a contract for your business, please do not hesitate to get in touch by emailing george.wright@ellisons.com.
