AI Addendum FAQs

Our guide to the Juro AI Addendum for existing customers

Written by Michael Haynes

General questions

Why am I being asked to sign an AI Addendum?

If someone in your organization has requested access to our new AI Assistant feature, we’ll ask your organization’s authorized signatory to sign an AI Addendum. The AI Addendum is only for existing Juro customers. If you’re a new customer, please speak to your salesperson.

Our AI Assistant is completely new. It has features we didn’t anticipate when you signed your master services agreement (MSA).

The AI Addendum sets out what you can expect from our AI Assistant and what you can’t.

The AI Addendum is essential: we don’t allow access to AI Assistant until it has been signed.

What is the AI Addendum?

The AI Addendum is an amendment to your contract with Juro. It makes the changes to your terms that are necessary for us to provide AI Assistant to you.

Why won’t you negotiate the AI Addendum?

We’ve spent a long time thinking about the terms required to offer AI Assistant at this experimental stage in the development of AI, and the AI Addendum reflects that thinking.

We’ll take feedback on board, but at this stage we can’t offer alternative terms.

What will happen to my existing Juro subscription if I don’t want to sign the AI Addendum?

Nothing. You just won’t be able to access AI Assistant.

AI Addendum: clause-by-clause

Front end

The front end of the AI Addendum is purely mechanical. It explains how the AI Addendum amends your existing contract with Juro.

1. Definitions and interpretation

This explains how defined terms are used in the AI Addendum. Generally, we use the same defined terms as are used in your existing Juro contract.

2. Consideration

Consideration (i.e. an exchange of value) is required to make the variation binding under English law and under some other legal systems.

This clause describes the exchange of value, but right now we don’t charge for access to AI Assistant, so no additional payments are due as a result of signing the AI Addendum.

3. Variation

This clause describes how and when the amendment to your contract with Juro takes effect.

The AI Addendum takes effect when it’s signed by both you and Juro, and takes priority over your existing MSA terms.

4. Governing law and jurisdiction

Your AI Addendum uses the same governing law and dispute resolution process as your existing MSA.

Schedule - AI terms

1. Definitions

This clause introduces some new defined terms that we use.

The key new terms are:

AI Feature = any AI-enabled feature in Juro

Content = any Input and any Output (both defined below)

Input = any content that you put into our AI Features, including contract content, your AI playbook and your requests or prompts

Output = any content that our AI Features produce for you

2. Purpose

This clause explains that the AI Addendum regulates your use of AI Features only, not other features within Juro.

3. Conditions of use

This clause sets out additional things you must and must not do when you’re using AI Features in Juro. We need to ask you to comply with these conditions because the AI Features are different from features that already exist in Juro.

Here are the conditions:

General

Clause 3.1: Use the AI Features in accordance with your contract with Juro.
Reason: Our contract puts in place rules we need to manage risk in how AI Features are used, both for Juro and for Juro’s technology providers.

Things you must do

Clause 3.2.1: Comply with the agreement and any applicable laws.
Reason: You must follow our rules for using the AI Features. If you don’t, we can end our contract with you or turn off the AI Features.

Clause 3.2.2: Comply with any rate limits or technical requirements Juro sets.
Reason: This helps Juro keep the AI Features available, stable and secure. It also helps Juro manage costs while we provide AI Assistant at no additional cost.

Clause 3.2.3: Manually review output to make sure it’s appropriate before you use it.
Reason: AI Assistant doesn’t understand your specific situation and isn’t capable of providing legal advice. The output might be wrong, so you still need to review all output and shouldn’t rely on it without first checking it.

Things you must not do

Clause 3.3.1: Don’t use our AI Features to compete with Juro.
Reason: This protects Juro’s investment in AI.

Clause 3.3.2: Don’t use bots, and don’t scrape or mine Juro’s AI Features.
Reason: This protects Juro’s investment in AI, keeps the AI Features stable, manages cost for Juro, and helps Juro comply with requirements from its technology providers.

Clause 3.3.3: Don’t claim something AI-generated is human-created.
Reason: Emerging regulation requires AI to be used transparently, not secretly. We also think this is the right thing to do.

Clause 3.3.4: Don’t put children’s personal data into AI Features.
Reason: The Juro platform isn’t designed to handle children’s personal data.

4. Termination and suspension of AI Features

This clause describes when Juro can permanently or temporarily stop you from accessing AI Features.

You aren’t entitled to a refund or fee reduction if we terminate or suspend any AI Features. This is because we currently don’t charge any additional fees for these features, and we’re clear with you from the outset that this is the case.

Permanent termination of AI Features

Clause 4.1.1: You no longer have a Growth or Enterprise plan.
Reason: You must have a Growth or Enterprise plan to access AI Assistant.

Clause 4.1.2: You materially breach our agreement.
Reason: We must be able to trust our customers to play by the rules. If you don’t, this could expose Juro or its technology providers to liability.

Clause 4.1.3: Juro starts to charge customers for access to AI Features.
Reason: For now, Juro provides access to AI Assistant at no additional cost. If demand takes off, this won’t be sustainable, so we need to reserve the ability to charge for access to AI Features in future.

Clause 4.1.4: Changes in Juro’s relationships with third-party technology providers.
Reason: Juro depends on technology providers to provide the AI Features. If those providers no longer wish to provide the required technology on acceptable terms, then we’d have to withdraw the features.

Clause 4.1.5: Juro needs to comply with the law or a government request.
Reason: If a feature (or part of it) becomes illegal or Juro is asked to withdraw it by a government authority, Juro must be able to withdraw the feature. We think this is unlikely, but AI regulation is developing fast, so it’s not easy to predict what will happen.

Temporary suspension of AI Features

Clause 4.2.1: You breach our agreement.
Reason: If you breach our agreement in a minor way that can be corrected, we might need to suspend access while the issue is resolved. This helps to mitigate any risks posed to Juro and its technology providers.

Clause 4.2.2: Your use poses a security risk.
Reason: We need to be able to take steps to protect Juro and other users of Juro if there’s a security issue.

Clause 4.2.3: Juro suspects that you are using the AI Features for fraud or in a way that exposes someone to liability.
Reason: We need to be able to take steps to protect Juro, other users of Juro, and potential victims if this happens.

If Juro suspends access, we must restore your access once the circumstances giving rise to the suspension have abated.

5. Content

This clause deals with who owns Content and how Juro and its technology providers can access, review and use it.

Who owns what?

Input = Not owned by Juro. It’s owned by you (or anyone who has licensed that Input to you).

Output = If IP rights are created in the Output and Juro owns them on creation, then Juro assigns those IP rights to you.

Note: Some countries do not recognize IP protection for content without a human author, so it’s possible that your Output will not qualify for IP protection at all.

What does Juro do with Content?

Juro uses the Content only to provide and maintain the services.

Juro does not use the Content to develop or improve its services.

Juro uses some contractors to provide and maintain the services, so it can sublicense Content to those contractors (but only if it’s necessary to do so to provide and maintain the services).

This means:

  • Juro doesn’t use your contracts, prompts or other inputs to train any model.

  • Juro doesn’t allow any of its technology providers to do that either.

Who can review Content?

Juro can review Content if it’s necessary to do so to provide or maintain the services. This would only really be necessary, for example, for Juro to enforce its policies or if you asked Juro to investigate a bug.

Juro’s third-party technology providers review Content for debugging purposes and to prevent abusive or harmful uses.

At launch, Juro is using OpenAI’s GPT service on Microsoft Azure. OpenAI and Microsoft review Content if potential abuse is flagged. They delete Content after 30 days, unless a policy violation is identified (in which case the Content is stored for longer as evidence of the violation).

6. Disclaimers

This clause manages Juro’s risks arising from providing the AI Features. We can only provide the AI Features because the risk is managed in this way.

The disclaimers aren’t intended to allow Juro to provide a poor service. Rather, they reflect the reality of how generative AI technology works and how models are developed (both things that are outside Juro’s control).

6.1 Customer acknowledgements

Clause 6.1.1: You acknowledge that AI Features can produce identical or similar output for other users.
Reason: Generative AI works on a probabilistic basis: it produces the most likely output based on the given inputs, its model and its training data. As a result, other users are likely to get similar or identical results if they use similar or identical inputs.

You shouldn’t expect AI to produce unique output. You should also not expect to be able to take action against Juro or other users simply because another user obtained a similar or identical output and put it to use.

Clause 6.1.2: You acknowledge that AI Features might produce inaccurate output.
Reason: Generative AI can hallucinate, which means an output might be wrong. For example, AI might state a fact that is incorrect or cite a law that doesn’t exist.

AI simply produces the statistically most likely output based on the given inputs, the model, and its training data. It is not a search engine or reference tool.

You must use output with caution. You should check the accuracy and appropriateness of the output before you rely on it. Juro isn’t responsible for ensuring an accurate output.

Clause 6.1.3: You acknowledge that other users’ outputs are not Content owned by you.
Reason: Generative AI can produce identical or similar outputs for other users. You should not expect to own outputs created by other users, even if they are similar or identical to your own output.

6.2 Exclusion of warranties

Juro provides the AI Features “as is” and excludes any warranties.

AI Features are highly experimental. Juro is dependent on third-party technology providers to produce, train and make available models to Juro to use in its platform. Those technology providers won’t provide the technology on a basis that would enable Juro to flow through protections to you as a customer.

For this reason, you must be willing to accept the risks inherent with AI Features in order to use them. Organizations that do accept these risks do so in exchange for the productivity upsides that come with using AI Features.

6.3 No third-party IP claims indemnity

Juro doesn’t provide a third-party IP claims indemnity for AI Features.

Juro doesn’t produce or train the models it uses. The technology providers who do this won’t give Juro assurances about IP non-infringement in the way you’d normally expect from traditional SaaS or data tools. This is because large language models are trained on extremely large datasets compiled from trillions of data points, not all of which are verified to have been licensed for that purpose.

For this reason, Juro isn’t able to provide those assurances to its own customers using AI Features or output from them. You must be willing to accept the risks inherent with AI Features in order to use them. Organizations that do accept these risks do so in exchange for the productivity upsides that come with using AI Features.

6.4 No uptime service level for AI Features

We don’t provide an uptime service level for AI Features.

We’re still testing the stability and availability of AI Assistant at scale, so it’s not appropriate at this stage to provide a service level. For that same reason, we’re also not charging customers anything additional for access to AI Assistant.
