
[Embedded video: 46:25]
Daniel Davis for TrustGraph


Are LLMs Still Lost in the Middle?

A few days ago, I talked about some of the inconsistency I've seen when varying LLM temperature for knowledge extraction tasks.

I decided to revisit this topic and talk through the behavior I'm seeing. Gemini-1.5-Flash-002 did not disappoint, producing yet more unexpected results, and I also saw some strong evidence that models with long context windows still ignore data in the middle of the context. Below is the Notebook I used during the video:

Notebook
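
If you just want the basic idea without opening the notebook, here is a minimal sketch of a "lost in the middle" probe: plant a needle fact at different depths in a long filler context and check whether the model can still recall it. This is not the notebook above; the google-generativeai calls, the model name, and the needle/filler/depth choices are my own illustrative assumptions.

```python
# A minimal "lost in the middle" probe. NOT the notebook from the post,
# just an illustrative sketch; needle, filler, and depths are assumptions.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-flash-002")

NEEDLE = "The secret launch code is 7-4-1-9."
FILLER = "The weather that day was entirely unremarkable. " * 400  # long padding context

def build_context(depth: float) -> str:
    """Insert the needle at a fractional depth of the filler (0.0 = start, 1.0 = end)."""
    split = int(len(FILLER) * depth)
    return FILLER[:split] + " " + NEEDLE + " " + FILLER[split:]

for depth in (0.0, 0.25, 0.5, 0.75, 1.0):
    prompt = (
        build_context(depth)
        + "\n\nWhat is the secret launch code? Answer with only the code."
    )
    response = model.generate_content(
        prompt,
        generation_config={"temperature": 0.0},  # keep sampling as deterministic as possible
    )
    recalled = "7-4-1-9" in response.text
    print(f"needle at depth {depth:.2f}: recalled = {recalled}")
```

Sweeping the depth at a fixed temperature, and then sweeping temperature at a fixed depth, is one way to separate positional effects from plain sampling noise.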

Top comments (1)

Sophy Yan

Great insights!
