DEV Community

Discussion on: Claude 2.1 AI model with 200K Context is Live

Ranjan Dailata

Sorry, there is much more hidden behind the max-token limit. There is ongoing research on sliding-window-based token generation, but at the moment it's impossible to build an LLM with an infinite context window: the model would go wild, lose context, and fail to generate the next word according to statistical next-word prediction.
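To make the sliding-window idea concrete, here is a minimal sketch (names and sizes are illustrative, not any vendor's implementation) of the causal sliding-window attention mask used by such approaches: each token attends only to itself and the previous few tokens, so attention cost grows linearly with sequence length instead of quadratically.

```python
def sliding_window_mask(seq_len, window):
    """Boolean causal mask: token i may attend to tokens
    in the range [max(0, i - window + 1) .. i] only."""
    mask = [[False] * seq_len for _ in range(seq_len)]
    for i in range(seq_len):
        for j in range(max(0, i - window + 1), i + 1):
            mask[i][j] = True
    return mask

# Example: 6 tokens, window of 3 — token 5 sees tokens 3, 4, 5 only.
m = sliding_window_mask(6, 3)
```

The trade-off the comment points at is visible here: anything outside the window is simply invisible to the model at that step, which is why naively extending this to an "infinite" context does not preserve long-range coherence.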

More research is required in this space, and it can be driven by dedicated LLM vendors such as OpenAI, Anthropic, Cohere, etc.