NOTHING BUT LOVE AND PEACE ACROSS ALL OF EQUESTRIA!!!

An Article Titled 'Tokens' That Actually Talks about Tokens

WTF Is a Token??

TL;DR: Tokens are how an LLM 'sees' text. They are the basic building blocks that can be a word, part of a word, a number, or punctuation.

How text is encoded into tokens

Take, for example, "I am an unhappy cat." This piece of text might be tokenized into:
"I", " am", " an", " un", "happy", " cat", "."
-# note: yes, those leading spaces really are part of the tokens

wtf just happened?

  1. Common words are single tokens: Words like I, am, an, and cat are common enough that they get their own individual token.
  2. Complex/longer words are broken down: The less common word unhappy is efficiently split into two more basic parts: the prefix un and the root word happy. This helps the model understand word construction.
  3. Spaces and punctuation get included: Notice the leading spaces in " am" and " cat". A word with a space in front of it is often a different token from the same word without one (" cat" vs. "cat"). Similarly, common pairings like ." are often merged into a single, efficient token.
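Want to poke at this yourself? Here's a minimal sketch using OpenAI's open-source tiktoken library (`pip install tiktoken`). The exact split depends on which vocabulary you load, so your tokens may differ slightly from the example above:

```python
import tiktoken

# Load a real BPE vocabulary (cl100k_base is the GPT-4-era one).
enc = tiktoken.get_encoding("cl100k_base")

text = "I am an unhappy cat."
ids = enc.encode(text)

# Decode each token id back to text so the boundaries are visible.
tokens = [enc.decode([i]) for i in ids]

print(ids)     # a short list of integers; exact values depend on the vocabulary
print(tokens)  # note the leading spaces on most of the words
```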

The Building Block Dilemma: Why Your AI's Memory Depends on Smart Tokenization

Why This Matters: Poor tokenization can waste most of your context window. Character-level encoding, for example, needs roughly 4-5x as many tokens as a modern subword scheme for the same English text, which dramatically limits how much your AI can process at once. This directly impacts everything from document analysis to complex reasoning tasks.
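To put a rough number on that, here's a quick back-of-the-envelope comparison (again with tiktoken; the exact ratio depends on your text and vocabulary, but for typical English prose it lands around 4x):

```python
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

text = "The quick brown fox jumps over the lazy dog. " * 100

subword_count = len(enc.encode(text))  # tokens under a real BPE vocabulary
char_count = len(text)                 # what a character-level scheme would need

print(f"character-level tokens: {char_count}")
print(f"subword tokens:         {subword_count}")
print(f"ratio: {char_count / subword_count:.1f}x")
```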

Think of tokenization as choosing the right building materials for construction—the wrong choice can make your project impossibly expensive or structurally unsound.

Approach 1: The Prefab House Method (Whole Words Only)

Imagine building exclusively with pre-manufactured room modules. Perfect for standard constructions, but completely useless when you need custom work.

This mirrors whole-word tokenization, which breaks down when encountering:

  • Informal language, typos, and unusual spellings ("omg", "typo123", "O'Connor")
  • Emerging terminology ("blockchain", "COVID-19")
  • Non-English content

Supporting every possible word variant would require an enormous vocabulary—like needing a warehouse full of every conceivable room type, making the system bloated and slow.
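To make the failure concrete, here's a toy whole-word tokenizer with a deliberately tiny (completely made-up) vocabulary. Anything it hasn't seen before collapses to <unk>, and the meaning is simply gone:

```python
# Toy whole-word tokenizer: every word must already be in the vocabulary.
VOCAB = {"i", "am", "an", "unhappy", "cat", "the", "house", "build"}

def whole_word_tokenize(text: str) -> list[str]:
    tokens = []
    for word in text.lower().split():
        word = word.strip(".,!?")  # crude punctuation handling
        tokens.append(word if word in VOCAB else "<unk>")
    return tokens

print(whole_word_tokenize("I am an unhappy cat."))
# ['i', 'am', 'an', 'unhappy', 'cat']

print(whole_word_tokenize("omg O'Connor loves blockchain"))
# ['<unk>', '<unk>', '<unk>', '<unk>'] -- everything novel is erased
```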

Approach 2: The Individual Brick Method (Character-Level)

Now imagine building everything from individual bricks. Maximum flexibility, but two critical flaws:

  • Lost architectural vision: When examining individual bricks, you can't tell if you're building a "hospital" or a "house"—the bigger picture disappears.
  • Massive inefficiency: A simple structure requiring one prefab unit now demands thousands of individual pieces. Your workspace (context window) fills up before you can complete anything meaningful.
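Character-level tokenization is trivially easy to implement, which is part of its appeal. The sketch below shows both the upside (nothing is ever out-of-vocabulary) and the blow-up in token count:

```python
def char_tokenize(text: str) -> list[str]:
    # Every character is its own token: nothing is ever unknown.
    return list(text)

text = "I am an unhappy cat."
tokens = char_tokenize(text)

print(len(tokens))  # 20 tokens for a sentence a subword tokenizer handles in 7
print(tokens[:8])   # ['I', ' ', 'a', 'm', ' ', 'a', 'n', ' ']
```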

The Optimal Solution: Mixed Materials (Subword Tokenization)

Modern systems use a construction approach with three material types:

  • Standard modules: For frequently used complete elements (house, build, the)
  • Adaptable components: For common structural elements like foundations and connectors (-ed, un-, -tion)
  • Individual bricks: Reserved for unique specifications and custom elements

This hybrid approach maximizes both workspace efficiency (fewer total pieces needed) and construction flexibility (can handle any design requirement).
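Here's a toy version of the hybrid idea: a greedy longest-match subword tokenizer over a tiny hand-picked vocabulary. Real systems like BPE and WordPiece learn their vocabularies from data instead of hard-coding them, so treat this purely as an illustration of the matching step:

```python
# Tiny hand-picked vocabulary mixing whole words, subword pieces, and
# single characters. The characters guarantee we can always make progress.
VOCAB = {"I", " am", " an", " cat", ".", " un", "happy"}
VOCAB |= set("abcdefghijklmnopqrstuvwxyz ")

def subword_tokenize(text: str) -> list[str]:
    tokens = []
    i = 0
    while i < len(text):
        # Greedily take the longest vocabulary entry that matches here.
        for length in range(len(text) - i, 0, -1):
            piece = text[i:i + length]
            if piece in VOCAB:
                tokens.append(piece)
                i += length
                break
        else:
            tokens.append("<unk>")  # a truly unknown character (emoji, etc.)
            i += 1
    return tokens

print(subword_tokenize("I am an unhappy cat."))
# ['I', ' am', ' an', ' un', 'happy', ' cat', '.'] -- the split from the top of the article
```

The greedy longest-match loop here is essentially how WordPiece tokenizes at inference time; BPE instead applies learned merge rules, but both land on the same three-tier effect: whole tokens for common words, pieces for rarer ones, and characters as the fallback.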
