Cathy Lai

I Lost 2 Hours to an AI Search Answer — Here’s What I Learned About Context Engineering

The Issue

Last week, I tried to use Perplexity AI to teach me about EAS Builds. I got thrown off and was confused for a long time! Here's the sequence of questions I asked:

Then I asked a few follow-up questions.

It was all well and good, within the context of EAS Build variations.

Problematic follow-up question

I assumed the question below would also be interpreted within the context of the conversation above, but it gave me a surprising answer:

The first reference it returned was a 2019 article about an unrelated structure: preview, production, and testing branches of the source code on GitHub, not EAS Build config.

What I thought I asked

  • I’m talking about eas.json
  • EAS build profiles (development, preview, production)
  • distribution: "store"
  • TestFlight vs App Store release

This is a binary signing & distribution question.
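
For reference, the eas.json I had in mind looked roughly like this (a simplified sketch with assumed values, not my exact file):

  {
    "build": {
      "development": {
        "developmentClient": true,
        "distribution": "internal"
      },
      "preview": {
        "distribution": "internal"
      },
      "production": {
        "distribution": "store"
      }
    },
    "submit": {
      "production": {}
    }
  }

The distribution: "store" setting on the production profile is what makes this a signing and distribution question: a store-distributed build is the one that gets submitted to App Store Connect (and TestFlight).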

What Perplexity answered (what it assumed)

Perplexity cited this article:

How to manage staging and production environments in a React Native app

At first glance, it sounds relevant:

  • staging vs production
  • React Native
  • real-world apps

But that article is about:

  • environment variables
  • API base URLs
  • runtime configuration
  • feature flags

👉 It has nothing to do with EAS Build, TestFlight, App Store Connect, or signing.
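
For contrast, the kind of setup that article covers looks more like this (an illustrative .env-style sketch, not taken from the article):

  # .env.staging: which backend the app talks to at runtime
  API_URL=https://staging-api.example.com
  ENABLE_BETA_FEATURES=true

  # .env.production
  API_URL=https://api.example.com
  ENABLE_BETA_FEATURES=false

That is runtime configuration inside the app; my question was about how the binary is signed and where it gets distributed, which is a different layer entirely.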

Why Perplexity didn’t “infer” my earlier context

This was the key realization:

Perplexity does not reliably carry context across separate queries.

Each question is treated as a fresh retrieval task:

  1. interpret keywords
  2. fetch high-ranking related pages
  3. synthesize an answer

If most indexed content around “staging vs production” is about environment variables, that’s what it will pull — even if you meant EAS build profiles.

The fix: anchoring the context (aka context engineering)

What actually works is pinning the domain so the AI can’t drift.
Things that help a lot:

  • naming concrete artifacts: eas.json
  • naming commands: eas build, eas submit
  • naming platforms: TestFlight, App Store Connect
  • explicitly excluding irrelevant meanings

Example of a much better prompt:

I am using Expo EAS. In eas.json, I have build profiles like development, preview, and production.

Explain the difference between a preview build and a production build in terms of signing and distribution (TestFlight vs App Store).

Ignore articles about staging/production environment variables.

Even better: paste a snippet of the eas.json.
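
Naming the exact commands helps too. For example (an illustrative sketch, assuming a preview profile with internal distribution and a production profile with store distribution, as in the eas.json sketch above):

  # preview profile: internally distributed build for quick checks
  eas build --profile preview --platform ios

  # production profile: store-signed build, then submit to App Store Connect
  eas build --profile production --platform ios
  eas submit --profile production --platform ios

With concrete file names and commands like these in the prompt, the retrieval step has far less room to drift towards environment-variable articles.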

Conclusion

There are still technical details to learn about AI tools, even though they use plain English and feel "easy" to use! I definitely need to learn more for them to be useful in solving my everyday issues.

Top comments (2)

Bhavin Sheth

This is such a relatable AI moment 😄 I’ve had similar issues where the tool answered the keywords I used, not the actual problem I meant. The idea of “anchoring context” with concrete terms like eas.json, commands, and platforms is a really practical takeaway. Great reminder that with AI, how we ask is often more important than what we ask.

Cathy Lai

Yes, I now realise that context is super important! Good thing Perplexity gave the original link as a reference, otherwise I would have gotten the wrong information.

It also seems that a lot of the time the information could be from outdated documentation, so a link to the docs for the current SDK version in the prompt would be super helpful.