If you’ve been using AI tools for a while, you’ve probably heard the word “tokens” at some point. Tokens are how AI models break text into smaller pieces, similar to how humans break text into words.
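To get a feel for what that splitting looks like, here is a toy sketch of greedy longest-match tokenization in Python. The tiny vocabulary and the `tokenize` function are made up for illustration; real tokenizers (like OpenAI’s) use byte-pair encoding with vocabularies of roughly 100,000 tokens.

```python
# A toy vocabulary -- NOT a real model vocabulary.
VOCAB = {"token", "tok", "iz", "ation", "a", "t", "i", "o", "n"}

def tokenize(text, vocab):
    """Greedy longest-match splitting: the basic idea behind subword tokenizers."""
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest possible piece first, shrinking until we find a match.
        for j in range(len(text), i, -1):
            piece = text[i:j]
            if piece in vocab:
                tokens.append(piece)
                i = j
                break
        else:
            # Unknown character: fall back to emitting it on its own.
            tokens.append(text[i])
            i += 1
    return tokens

print(tokenize("tokenization", VOCAB))  # → ['token', 'iz', 'ation']
```

Notice that a 12-character word becomes just 3 tokens: common chunks get their own token, which is why token counts are usually lower than character counts but higher than word counts.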
Tokens matter because they are the unit most text-based AI tools use for accounting: context windows, input and output limits, and most of what you’ll be charged for are all measured in tokens.
If you pay a flat monthly fee or use a free version, you probably don’t care much. Until you do: when you exceed a pre-defined token limit, things usually stop working, or at least stop working well.
Anyway, OpenAI has a tool that you can use to see how some of their models turn blocks of text into tokens.
The tool is called the Tokenizer, and you can find it at https://platform.openai.com/tokenizer
If you’ve never heard of a token and don’t care, then you probably didn’t read this far. If you did, to make your time worth it, here is a cat meme. Thank you for reading.

