Is GitHub Copilot GDPR compliant? A plan-by-plan analysis

Published on April 3, 2025 · Updated July 8, 2025 · 7 min read · by Lurus Team

GitHub Copilot is the most widely used AI coding assistant in the world. With GitHub’s enterprise reach and Microsoft’s infrastructure, it also has one of the most mature compliance frameworks in the AI coding tool space. But does that make it GDPR compliant for your specific situation?

The answer is: it depends heavily on which plan you’re using — and even the best plan has a structural compliance challenge that no DPA can fully resolve.

The Key Variable: Which Plan You’re On

GitHub Copilot’s data practices differ dramatically between individual and business plans. This is the single most important factor for GDPR compliance:

| Plan | Trains on your code? | Zero retention (IDE)? | DPA available? |
|---|---|---|---|
| Free | Yes, by default (opt-out available) | No | No (GitHub Privacy Statement applies) |
| Pro / Pro+ | Yes, by default (opt-out available) | No | No |
| Business | Never | Yes | Yes (downloadable) |
| Enterprise | Never | Yes | Yes (downloadable) |

If you’re on a Free or Pro plan and have not explicitly opted out of training data use in your GitHub account settings (github.com/settings/copilot/features), GitHub may use your prompts and suggestions to improve AI models. For EU developers working on client code, this is a problem.
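For teams scripting a quick self-audit, the plan matrix above can be encoded as a small lookup. This is purely illustrative: the plan names and attributes simply mirror the table in this article, so verify them against GitHub's current documentation before relying on them.

```python
# Illustrative encoding of the plan comparison table above.
# Attributes mirror this article, not an official GitHub API.

COPILOT_PLANS = {
    "free":       {"trains_by_default": True,  "zero_retention_ide": False, "dpa": False},
    "pro":        {"trains_by_default": True,  "zero_retention_ide": False, "dpa": False},
    "business":   {"trains_by_default": False, "zero_retention_ide": True,  "dpa": True},
    "enterprise": {"trains_by_default": False, "zero_retention_ide": True,  "dpa": True},
}

def minimum_gdpr_setup(plan: str) -> bool:
    """A plan clears the article's 'minimum viable GDPR setup' bar if it
    never trains on code, offers zero IDE retention, and has a DPA."""
    p = COPILOT_PLANS[plan.lower()]
    return not p["trains_by_default"] and p["zero_retention_ide"] and p["dpa"]
```

So `minimum_gdpr_setup("business")` returns `True`, while `minimum_gdpr_setup("free")` returns `False` regardless of the opt-out, because the opt-out does not add a DPA or zero retention.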

Zero Data Retention: What It Actually Means for Copilot

For Business and Enterprise customers using Copilot in an IDE, GitHub provides zero data retention by default for prompts and suggestions:

“GitHub does not retain prompts or suggestions for IDE use on Copilot Business and Enterprise plans.”

This is the strongest retention guarantee among major AI coding tools for enterprise plans. In practice: when you trigger a code completion or chat in your IDE, the request is processed and the response is returned — nothing is written to storage.

The nuance: This zero-retention guarantee applies specifically to IDE-based completions and chat. For other Copilot surfaces (github.com chat, mobile, CLI), prompts and suggestions are retained for 28 days on Business/Enterprise plans — required for thread history functionality.

Usage telemetry (pseudonymous metadata about how you use Copilot) is retained for 2 years on all plans — this is not the same as retaining your actual code.
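The retention rules described above vary by both plan tier and Copilot surface, which makes them easy to misstate in an internal assessment. A minimal sketch, encoding only the figures given in this article (anything else returns None):

```python
# Retention periods from the article, in days. Illustrative only;
# check GitHub's trust documentation for current values.

RETENTION_DAYS = {
    # (plan tier, surface) -> retention in days
    ("business_enterprise", "ide"): 0,        # zero retention for IDE prompts/suggestions
    ("business_enterprise", "web_chat"): 28,  # github.com chat
    ("business_enterprise", "mobile"): 28,
    ("business_enterprise", "cli"): 28,
}

TELEMETRY_RETENTION_DAYS = 2 * 365  # pseudonymous usage metadata, all plans

def retention(plan_tier: str, surface: str):
    """Return the documented retention period, or None where the
    article gives no figure (e.g. Free/Pro code retention)."""
    return RETENTION_DAYS.get((plan_tier, surface))
```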

GitHub’s DPA: The Most Accessible in the Market

Here’s where GitHub Copilot genuinely stands out from competitors: GitHub publishes a freely downloadable Data Protection Agreement that explicitly covers Copilot.

Available at: github.com/customer-terms/github-data-protection-agreement

The DPA:

  • Covers GitHub Enterprise Cloud, Teams, and GitHub Copilot explicitly
  • References EU Standard Contractual Clauses (SCCs) for EEA/UK data transfers
  • Confirms Business/Enterprise: no training on data
  • Is a standard-form agreement — no negotiation required, available to all qualifying customers

For GDPR Article 28 compliance, this is what you need. GitHub’s transparency here is significantly better than most competitors in this space.

The FISA 702 Challenge: No DPA Can Solve This

GitHub, Inc. is a subsidiary of Microsoft Corporation — a US company. This creates an unresolvable structural compliance tension.

Under FISA Section 702, US intelligence agencies can compel US electronic communication service providers to disclose data on foreign targets. This authority:

  • Applies to GitHub/Microsoft regardless of where data is physically stored
  • Cannot be mitigated by SCCs — SCCs address routine commercial transfers, not government access orders
  • Applies even when data is retained for 0 days — in-transit data during request processing remains accessible
  • Is acknowledged implicitly in GitHub’s DPA through the SCC framework, but the underlying legal exposure remains

The Court of Justice of the EU addressed this directly in Schrems II (2020), which invalidated the EU–US Privacy Shield over precisely this kind of government access. For high-stakes data, the theoretical risk of a FISA 702 order — however unlikely in practice — is a real compliance consideration.

For most EU developers: the risk remains theoretical in practice. GitHub processes enormous volumes of enterprise data under FISA jurisdiction, and there is no public record of FISA orders targeting developer code specifically.

For regulated sectors (healthcare, finance, legal, public sector): Your organization’s data protection officer and legal team need to assess this risk explicitly, especially under the EU AI Act framework coming into force.

EU Data Residency: Where Does Copilot Actually Process Data?

GitHub operates on Microsoft Azure’s global infrastructure. There is no Copilot-specific EU data residency option that ensures your code is processed exclusively in EU data centers.

GitHub Enterprise Cloud customers may benefit from Microsoft’s broader Azure data residency commitments, but these are not publicly documented as a Copilot-specific feature. For most Business plan customers, data is processed on GitHub’s globally distributed infrastructure.

This is different from a provider that contractually guarantees EU-only processing (such as EU-based AI tool providers).

Practical Guidance by Team Type

Individual developers (Free/Pro):

  • Opt out of training data use at github.com/settings/copilot/features immediately if working on client or employer code
  • Understand that your code goes to GitHub’s US infrastructure regardless of opt-out status
  • The opt-out prevents training, not transfer

Small teams (Business plan):

  • Upgrade from Free/Pro to Business to get zero-retention and DPA coverage — this is the minimum viable GDPR setup
  • Download and execute the GitHub DPA for formal Article 28 compliance
  • Assess whether EU data residency is contractually required by your clients
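When executing that upgrade, org admins can verify the resulting Copilot configuration programmatically. The sketch below builds (but does not send) a request to GitHub's REST API; it assumes the `GET /orgs/{org}/copilot/billing` endpoint and a token with org admin scope, so confirm both against GitHub's current REST API documentation before use.

```python
# Sketch: prepare the REST call to inspect an org's Copilot settings.
# Endpoint path and required scopes are assumptions to verify against
# GitHub's REST API docs.
import urllib.request

API_ROOT = "https://api.github.com"

def copilot_billing_request(org: str, token: str) -> urllib.request.Request:
    """Return a prepared (unsent) request for the org's Copilot settings."""
    return urllib.request.Request(
        f"{API_ROOT}/orgs/{org}/copilot/billing",
        headers={
            "Accept": "application/vnd.github+json",
            "Authorization": f"Bearer {token}",
        },
    )

# To send: urllib.request.urlopen(copilot_billing_request("my-org", token))
```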

Enterprise teams in regulated sectors:

  • The DPA is available and comprehensive
  • FISA 702 exposure requires legal assessment — it cannot be contracted away
  • Consider whether “zero retention” for IDE use satisfies your sector’s requirements
  • Document your assessment in your GDPR Records of Processing Activities (RoPA)

Public sector / government:

  • Most public sector procurement requirements in the EU require EU-based processing
  • GitHub Copilot likely does not meet these requirements without Microsoft’s dedicated government cloud, which is a separate product

How Copilot Compares to EU-Native Alternatives

GitHub Copilot’s compliance framework is the most mature in the US-based AI coding tool market. The downloadable DPA, Business/Enterprise zero-retention, and explicit no-training commitment are genuine strengths.

But structural gaps remain:

  • No EU-exclusive processing — data routes through global (primarily US) infrastructure
  • FISA 702 exposure — inherent to being a US company
  • Individual plans (Free/Pro) train by default — many users are unaware

Lurus Code as a German company processes all data exclusively in EU data centers, is not subject to FISA 702, provides a DPA to all customers (not just enterprise), and never trains on code on any plan. For EU teams where EU-exclusive processing is a hard requirement, the structural difference matters regardless of plan tier.

Summary

| Question | Free/Pro | Business/Enterprise |
|---|---|---|
| Trains on your code? | Yes by default (opt-out available) | Never |
| Zero data retention (IDE)? | No | Yes |
| DPA available? | No | Yes — publicly downloadable PDF |
| EU data residency? | No | No (global infrastructure) |
| Subject to FISA 702? | Yes | Yes (GitHub is a US company) |
| Suitable for EU B2B professional use? | Not recommended | Yes, with assessment |

Bottom line: If you’re using GitHub Copilot for EU professional work, Business or Enterprise is the only plan with a workable GDPR setup. Free and Pro plans train on your data by default and have no DPA. Even on Business/Enterprise, the absence of EU-exclusive processing and FISA 702 exposure warrant explicit legal assessment for regulated sectors.


Sources: docs.github.com/en/copilot, github.com/features/copilot, github.com/customer-terms/github-data-protection-agreement — verified April 2025.