
About Tokens

Written by Daniela Gomez
Updated over 3 weeks ago

What is a Token?

A token is a unit of text, such as a word, part of a word, or a punctuation mark, that AI language models use to process and generate language. Think of tokens as the building blocks that help AI systems understand and produce text. You can use a tokenizer to determine how many tokens are present in a given piece of text.
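You can see this in practice with an off-the-shelf tokenizer. Here is a minimal sketch in Python, assuming the tiktoken library and its cl100k_base encoding as an illustrative tokenizer (Perplexity does not publish which tokenizer its models use, so the counts below are only an approximation):

    import tiktoken

    # Load a general-purpose BPE encoding; token boundaries depend on the
    # tokenizer, so treat these counts as an estimate, not Perplexity's exact count.
    encoding = tiktoken.get_encoding("cl100k_base")

    text = "Tokens are the building blocks of language models."
    tokens = encoding.encode(text)

    print(len(tokens))                  # number of tokens in the text
    print(encoding.decode(tokens[:3]))  # first few tokens decoded back to text

Notice that the token count is usually smaller than the character count, since common words are often a single token while rare words split into several.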

How Many Tokens Can Perplexity Process at Once?

By default, Perplexity can process up to 8,000 tokens (roughly 20,000 characters) per query. If you submit longer text, for example by pasting a large passage, Perplexity automatically converts the input into a file so it can handle the extra content.
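If you want a rough sense of whether a passage fits within that default limit before pasting it, here is a minimal sketch, again using tiktoken's cl100k_base encoding as a stand-in tokenizer (the 8,000-token figure comes from this article; Perplexity counts tokens server-side and may arrive at a different number):

    import tiktoken

    DEFAULT_TOKEN_LIMIT = 8000  # default per-query limit described in this article

    def exceeds_default_limit(text: str) -> bool:
        """Estimate whether the text would exceed the default per-query token limit."""
        encoding = tiktoken.get_encoding("cl100k_base")  # illustrative stand-in tokenizer
        return len(encoding.encode(text)) > DEFAULT_TOKEN_LIMIT

    # A very long pasted passage exceeds the limit, which is when
    # Perplexity converts the input into a file instead.
    long_passage = "word " * 10000
    print(exceeds_default_limit(long_passage))  # True: roughly 10,000 tokens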

Note: For very large inputs, uploading a file is recommended to take advantage of the higher token limits available to Pro subscribers.
