DEV Community

Dr. Carlos Ruiz Viquez


Generative vs. Retrieval-based LLMs: A Tale of Two Approaches

When it comes to Large Language Models (LLMs), two dominant approaches have emerged: generative and retrieval-based. Generative models, like OpenAI's GPT series or Meta's LLaMA, focus on creating new, original content. They use deep neural networks to produce text that need not appear anywhere in their training data, relying on the statistical patterns and relationships learned from it.
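The "statistical patterns" idea can be sketched at toy scale. This is a hypothetical bigram model, not how a transformer works internally, but it shows the same principle: learn which tokens tend to follow which, then sample new sequences from those learned patterns.

```python
import random
from collections import defaultdict

# Toy "generative model": learn bigram statistics from a tiny corpus,
# then sample new word sequences from them. Real LLMs do something far
# richer with neural networks, but the principle is the same.
corpus = "the model learns patterns the model generates text the text is new".split()

# Record which word follows which (the learned "statistical patterns").
transitions = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev].append(nxt)

def generate(start: str, length: int = 6, seed: int = 0) -> list[str]:
    """Sample a word sequence by repeatedly picking a likely next word."""
    random.seed(seed)
    out = [start]
    for _ in range(length - 1):
        choices = transitions.get(out[-1])
        if not choices:
            break  # dead end: no observed continuation
        out.append(random.choice(choices))
    return out

print(" ".join(generate("the")))
```

Note that the generated sequence can be one the corpus never contained verbatim, which is the essence of the generative approach.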

In contrast, retrieval-based approaches, exemplified by systems like Google's REALM or Meta's RAG (Retrieval-Augmented Generation), take a more pragmatic route. Rather than relying only on what is encoded in the model's weights, they draw on pre-existing knowledge: when faced with a query or prompt, they search an external corpus to retrieve relevant passages and ground the answer in them.
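A minimal sketch of the retrieval step, under simplifying assumptions: production systems use learned dense embeddings and approximate nearest-neighbor indexes, but plain bag-of-words cosine similarity over a tiny hypothetical knowledge base illustrates the idea of scoring every document against the query and returning the best match.

```python
import math
from collections import Counter

# Tiny stand-in for a knowledge base (hypothetical example documents).
knowledge_base = [
    "Paris is the capital of France.",
    "The Transformer architecture uses self-attention.",
    "Water boils at 100 degrees Celsius at sea level.",
]

def vectorize(text: str) -> Counter:
    """Bag-of-words vector: word -> count."""
    return Counter(text.lower().strip(".").split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str) -> str:
    """Return the knowledge-base entry most similar to the query."""
    qv = vectorize(query)
    return max(knowledge_base, key=lambda doc: cosine(qv, vectorize(doc)))

print(retrieve("what is the capital of France"))
# → Paris is the capital of France.
```

A retrieval-augmented system would then feed the retrieved passage, together with the query, into a generator to phrase the final answer.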

Generative Models: The Creative Prodigies

Generative models are ideal for applications where originality and creativity are essential: for instance, chatbots that can engage in open-ended conversations, or text generators that draft original prose.


This post was originally shared as an AI/ML insight. Follow me for more expert content on artificial intelligence and machine learning.
