There is more to Max Tokens than meets the eye. Research is ongoing into sliding-window-based token generation, but at the moment it is impossible to build an LLM with an infinite context window: the model would drift, lose track of the context, and no longer be able to perform reliable statistical next-word prediction.

More research is needed in this space, and it will likely come from dedicated LLM vendors such as OpenAI, Anthropic, Cohere, etc.
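To make the sliding-window idea concrete, here is a minimal sketch (not from the original comment) of how a bounded context window can be enforced before generation: only the most recent messages that fit within a fixed token budget are kept, and older context simply falls out of the window. The whitespace "tokenizer" is a stand-in assumption for a real tokenizer such as tiktoken.

```python
def truncate_to_window(history: list[str], max_tokens: int) -> list[str]:
    """Keep the newest messages whose combined token count fits the window."""
    kept: list[str] = []
    used = 0
    for message in reversed(history):      # walk from newest to oldest
        n_tokens = len(message.split())    # crude whitespace token count (assumption)
        if used + n_tokens > max_tokens:
            break                          # older context falls out of the window
        kept.append(message)
        used += n_tokens
    return list(reversed(kept))            # restore chronological order


if __name__ == "__main__":
    chat = [
        "User: summarize the report",
        "Assistant: here is a summary ...",
        "User: now list the action items",
    ]
    print(truncate_to_window(chat, max_tokens=12))
```

This is only an illustration of the windowing mechanism; production systems typically combine it with summarization or retrieval so that older context is compressed rather than dropped outright.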