Skip to main content
Bot Personality

Tips for creating a bot that does just what you want.

O
Written by Om Kamath
Updated over a week ago

Name

It is always beneficial to start by giving your bot a name. Naming your bot adds a human touch, especially when greeting users or addressing questions related to the bot.

Prompts:

Your name is [Name of your Bot].
OR
You are ‘[Name of your Bot]’.

Description

The description of the bot makes it aware of the context that will be provided through the knowledge base. Being context-aware provides the bot with a framework for answering questions while keeping a specific domain in mind.

Prompts:

Your primary task is to [specify the domain].
OR
Your main objective is to assist me in [specify the domain].

Note: The Bot Name and Description set in the General Section are only for the user’s convenience in differentiating between multiple bots. The bot itself is unaware of these settings. Therefore, it is necessary to explicitly define the bot’s name and description within the Personality Prompt to establish its identity and characteristics.

Boundaries

One potential drawback of using LLMs trained on large datasets is the tendency to generate hallucinated responses. It’s important to note that the data used to generate responses is not utilized for fine-tuning or retraining the LLM on-demand by Cody. Instead, it serves as a contextual reference for querying the LLM, resulting in faster responses and preserving data privacy.

To ensure that the bot does not refer to data points from the original LLM dataset, which may overlap with similar domains or concepts, we have to delimit the context strictly to our knowledge base.

Prompts:

The knowledge base is your only source of information.
OR
You are reluctant to make any claims unless stated in the knowledge base.

There may be some instances where the bot doesn’t require a knowledge base or uses the knowledge base as a source of reference. In such cases, the prompt will change considerably.

Prompt:

Your primary source of reference is the knowledge base.

Response Features (System Prompt)

The features of the response generated by the bot can also be controlled by the personality of the bot to some extent. It can consist of defining the tone, length, language and type of response you expect from your bot.

Prompts:

1. Tone: You should respond in a [polite/friendly/professional] manner.

2. Length: The responses should be in [pointers/paragraphs].

3. Language: Reply to the user [in the same language/specify different language].

4. Type: Provide the user with [creative/professional/precise] answers.

You are free to experiment with various combinations and features. The examples provided are just for your learning purposes, and the possibilities are endless.

Media

One of the most interesting features of Cody — the ability to embed media in the responses. When embedding media such as images, GIFs or videos, it is always recommended to import the media to a separate document or import the entire raw document using the built-in Cody text editor wherein you can add media. You can either copy/paste the media or embed them into the document using URLs.

An image illustrating the media buttons.

After successfully importing the media, you need to specify the same in our bot personality prompt. The prompt can be broken into two parts: Initialisation and Illustration.

Prompts:

Initialisation:
Incorporate relevant [images/videos/both] from the knowledge base when suitable.

Illustration:
Add images using the <img> tag and videos using the <iframe>
For example:
<img src=”[Image URL]”>
<iframe src=”[Video URL]”></iframe>

Fallbacks

There will be times when the bot is unable to find relevant content for the question asked by the user. It is always safer to define fallbacks for such scenarios to avoid providing misleading or incorrect information to the user (only applicable in use-cases where a knowledge base exists).

Prompts:

1. Refrain from mentioning ‘unstructured knowledge base’ or file names during the conversation.

OR

Don't justify your answers.

2. In instances where a definitive answer is unavailable, [Define fallback].

OR

If you cannot find relevant information in the knowledge base or if the user asks non-related questions that are not part of the knowledge base, [Define fallback].

Steps (Optional)

If you want your bot to follow a specific conversational timeline or flow, you can easily define it using steps. This approach is particularly useful when using your bot for training or troubleshooting purposes. Each step represents a particular phase or stage of the conversation, allowing you to control the progression and ensure that the bot provides the desired information or assistance in a systematic manner.

Prompt:

Follow these steps while conversing with the user:

1. [Step 1]

2. [Step 2]

3. [Step 3]

Note: While defining steps, it is recommended to enable ‘Reverse Vector Search‘ for improved replies and allocate an adequate number of tokens to the chat history. This allows the model to consider the conversation history, including the user’s input and the bot’s previous response, when generating a reply.

Data Capture (Optional)

This prompt, in harmony with the conversational flow (steps), is particularly beneficial when the use-case of your bot revolves around support or recruitment scenarios. Currently, there is no long-term memory or database connectivity in Cody that can capture the data and store it for analytical consumption. In the future, with newer updates to the OpenAI API like function calling, we will definitely be bringing in newer features to be able to capture and store the data for a longer term.

For now, you can access the chats of your bot users (through widgets) by navigating to the ‘Guests‘ chats in the chat section. You can then manually analyze the captured data for further insights.

Prompt:

Collect the following data from the users:

– [Field 1]

– [Field 2]

– [Field 3]

– [Field 4]

Ask one question at a time. Once you have collected all the required information, close the conversation by saying thank you and displaying the data collected.

Response Formatting*

A nifty little feature of Cody is its support for formatting bot responses using markdown or HTML tags. By providing your bot with an HTML or markdown format template in the bot personality, it will attempt to format the responses accordingly, whenever necessary.

Prompt:

Response Format:

<h1>[Field Name]</h1>

<p>[Field Name]</p>

<p>[Field Name]</p>

*Formatting works best on GPT-4

Prompt Example

Cody as a Lead Generation Bot

Anatomy of a prompt (labelled).

Demo chat displaying the prompt in use.

Cody as a Marketing Bot

Cody as a Training Bot

To read more personality prompts, please check out our use cases, which contain detailed prompts along with their parametric settings.

Did this answer your question?