(Image credit: Anthropic)
What you need to know
- Claude is an AI chatbot built by Anthropic, a startup founded in part by former OpenAI staff members.
- Anthropic has announced that Claude can now process 100,000 tokens in its context window.
- This equates to around 75,000 words, or roughly an entire novel. By contrast, OpenAI has yet to fully roll out ChatGPT's 32,000-token context window.
You may not have heard of Anthropic yet, but this AI startup is already making waves. Founded by former members of OpenAI, the company best known for the models behind ChatGPT and Bing Chat, Anthropic has just given its alternative chatbot, Claude, a serious upgrade.
It all hinges on the context window, and Claude can now do something we haven't seen before. Anthropic has increased its capacity to a staggering 100,000 tokens. But what does that mean in plain English?
100,000 tokens equate to around 75,000 words. Put another way, Claude can now process an entire novel, something Anthropic demonstrated during testing.
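The token-to-word conversion above is simple back-of-the-envelope arithmetic. A minimal sketch, assuming the ~0.75 words-per-token ratio implied by Anthropic's figures (real tokenizers vary with the text):

```python
# Rough token/word conversion from the article's own figures:
# 100,000 tokens ~ 75,000 words, i.e. about 0.75 words per token.
# This is only an estimate; the exact ratio depends on the tokenizer.

WORDS_PER_TOKEN = 0.75  # approximation implied by Anthropic's numbers

def tokens_to_words(tokens: int) -> int:
    """Estimate how many English words fit in a given token budget."""
    return round(tokens * WORDS_PER_TOKEN)

def words_to_tokens(words: int) -> int:
    """Estimate how many tokens a given word count will consume."""
    return round(words / WORDS_PER_TOKEN)

print(tokens_to_words(100_000))  # -> 75000
print(words_to_tokens(75_000))   # -> 100000
```

The same arithmetic explains why a full-length novel now fits: most novels run well under 100,000 words.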
"For example, we loaded the entire text of The Great Gatsby into Claude-Instant (72K tokens) and modified one line to say Mr. Carraway was 'a software engineer that works on machine learning tooling at Anthropic.' When we asked the model to spot what was different, it responded with the correct answer in 22 seconds."
This has huge implications for how Claude can be used. By contrast, OpenAI still hasn't fully rolled out GPT-4's 32,000-token context window in ChatGPT, and Microsoft's Bing Chat, though based on GPT-4, technically keeps its limit a secret.
The downside is that not everyone can immediately use this to process novels or extremely long, dry documents, such as the CMA's decision on the Activision/Xbox deal. The 100,000-token context window is only available through the API, so developers will get the first crack at it.
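For developers getting that first crack, a minimal sketch of how one might check that a long document fits the window and send it to Claude. The budget check uses the same rough 0.75 words-per-token ratio as above; the SDK call and model name are assumptions based on Anthropic's Python client and should be verified against current documentation:

```python
import os

WORDS_PER_TOKEN = 0.75    # rough English ratio from the article's figures
CONTEXT_TOKENS = 100_000  # Claude's new context window

def fits_in_context(text: str, reserve_tokens: int = 1_000) -> bool:
    """Crude pre-flight check: estimate the prompt's token cost from its
    word count and leave reserve_tokens of room for the model's reply."""
    est_tokens = len(text.split()) / WORDS_PER_TOKEN
    return est_tokens + reserve_tokens <= CONTEXT_TOKENS

novel = "word " * 70_000            # ~70K words, roughly 93K tokens
assert fits_in_context(novel)       # fits with room to spare

# The actual API call requires a key; names here are illustrative.
if os.environ.get("ANTHROPIC_API_KEY"):
    import anthropic
    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the env
    response = client.messages.create(
        model="claude-instant-1.2",  # illustrative; check current model ids
        max_tokens=1_000,
        messages=[{"role": "user",
                   "content": f"Summarize this novel:\n\n{novel}"}],
    )
    print(response.content[0].text)
```

A production version would use a real tokenizer rather than a word-count estimate, since token counts vary with punctuation, formatting, and language.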
Nevertheless, the sheer scale of this is impressive, and already gives us a look at how these generative AI tools could be used to crunch serious amounts of data in the future.