Alex Spinov

GitHub Just Changed Copilot's Data Policy — Here's What You Need to Know

GitHub just announced a change to how Copilot uses your data. And developers are not happy.

What Changed

Starting April 24, 2026, GitHub will use interaction data from Copilot Free, Pro, and Pro+ users to train its models. This includes:

  • Code snippets you send to Copilot
  • Accepted or modified suggestions
  • Code context around your cursor
  • Comments, file names, repository structure
  • Your thumbs up/down feedback

The Controversial Part

The setting is opt-out, not opt-in: data sharing is enabled by default, and your data is used for training unless you explicitly go to Settings → Privacy and turn it off.

This is the part that has developers frustrated. As one HN commenter put it:

"If I'm paying, I want to opt-in, not opt-out."

Who's Affected

  • Affected: Copilot Free, Pro, and Pro+ users
  • Not affected: Copilot Business and Enterprise users
  • Previous opt-out: If you opted out before, your preference is retained

What's NOT Collected

GitHub says it doesn't use:

  • Content from private repos "at rest" (though they do process private code during active Copilot use)
  • Enterprise repository data

The EU Question

Multiple developers raised GDPR concerns. Is opt-out for training data legal in the EU? The answer isn't clear-cut, but it's a valid concern that GitHub will likely face scrutiny over.

What You Should Do

1. Check Your Settings Now

Go to GitHub → Settings → Copilot → Privacy. Verify your data sharing preferences.

2. Consider Alternatives

If this bothers you, several options exist:

  • Claude Code — Anthropic's coding assistant. Different data policy.
  • Cursor — AI-first editor with privacy mode.
  • Cody — Sourcegraph's AI coding assistant.
  • Local LLMs — Run models locally with Ollama + Continue.dev. Zero data leaves your machine.
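If you want to try the fully local route, the setup is lighter than it sounds. As a sketch (not the only way to do it): install Ollama, pull a code model with `ollama pull`, then point Continue.dev at it through its `config.json`. The model name `qwen2.5-coder` below is just an example — pick whatever fits your hardware — and the exact keys may vary by Continue version, so check its config reference:

```json
{
  "models": [
    {
      "title": "Local code model (via Ollama)",
      "provider": "ollama",
      "model": "qwen2.5-coder"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Local autocomplete",
    "provider": "ollama",
    "model": "qwen2.5-coder"
  }
}
```

With a setup like this, both chat and tab completions are served from localhost — there's no third-party data policy to read because no code leaves your machine.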

3. Review Your Other Tools Too

This isn't just a GitHub problem. Every AI coding tool has a data policy. Read them.

The Bigger Picture

We're entering an era where AI tools are trained on the output of their own users. This creates a feedback loop:

  1. You write code with Copilot
  2. Copilot learns from your code
  3. Copilot suggests your patterns to other users
  4. Those users' modified code trains the next version

Whether this is good or bad depends on your perspective. But you should at least know it's happening.


Did you opt out? Are you switching tools? Let me know in the comments.
