The conventional wisdom is that Google Workspace and Microsoft 365 dominate business email. I pulled the DNS records of every domain in Tranco top-1M to check.
Their combined share: 37.94%.
The other ~62% — over 400,000 domains — live somewhere else: hosted at registrars, in regional providers, on cPanel installs at small hosters, or on actual self-managed mail servers. The "duopoly" framing is a thing people say, not a thing the data shows.
This post is about what you actually find when you build a daily DNS snapshot of email infrastructure for the entire public web, what it can and can't tell you, and what's surprisingly broken about it.
TL;DR. Daily snapshot of OpenINTEL's Tranco-1M scan, 2026-01-01. 660,114 domains publish MX, 616,352 publish SPF, 431,133 publish DMARC. Receiving-side share: Google 21.7%, Microsoft 16.3%, Yandex 1.9%, Mimecast 1.5%, Zoho 1.0% — and 25.9% "Unknown" plus another 24% on generic shared-hosting MX patterns. Sending-side: Amazon SES (5.86%) edged past SendGrid (4.66%) by domain count. 20.3% of all DMARC records are `p=none` with no reporting endpoint — formal compliance, zero protection. Methodology is open and reproducible. Caveats below.
Why measure email infrastructure from DNS
Email is the oldest still-relevant protocol on the public internet, the primary B2B channel, and one of the main attack vectors for phishing. And yet there's no good open dataset of how the ecosystem actually looks: who hosts what, who sends what, and how authentication is configured.
What exists today:
- Valimail's Email Fraud Landscape — annual since 2017, DMARC-only, focused on industry cohorts (Fortune 500, US Gov, healthcare). Closed dataset, behind a lead form.
- Academic OpenINTEL papers — solid methodology, but each is a one-shot snapshot for a specific publication, not maintained.
- BuiltWith / Datanyze — commercial technographics, not reproducible, not for research.
The gap: an open daily snapshot, across all four layers (MX, SPF, DMARC, SaaS senders), at top-1M scale, with transparent methodology. That's what this is.
The data
Source
Base dataset is the daily forward-DNS snapshot from OpenINTEL (University of Twente / SURFnet / SIDN Labs), running since 2015. OpenINTEL queries the entire Tranco top-1M every day for MX, TXT, NS, A, AAAA, SOA, CAA, DNSSEC records and publishes Apache Parquet. Methodology paper: van Rijswijk-Deij et al., IEEE JSAC 2016.
Domain list is Tranco — the research-grade replacement for the deprecated Alexa list, hardened against manipulation (Le Pochat et al., NDSS 2019).
For 2026-01-01: 660,114 domains have valid MX, 616,352 publish SPF (v=spf1), 431,133 publish DMARC (v=DMARC1).
Classification
For each domain we look at four things:
```python
# 1. Mailbox provider — primary MX target (lowest preference),
#    matched against ~80 regex rules in mx_providers.py
RULES = [
    (r"\.mail\.protection\.outlook\.com$", "Microsoft 365"),
    (r"aspmx\.l\.google\.com$|aspmx.*\.googlemail\.com$", "Google Workspace"),
    (r"mx\d?\.yandex\.net$", "Yandex 360"),
    (r"\.mimecast\.com$", "Mimecast"),
    # ... specific rules first, generic fallbacks last:
    (r"^mail\.", "Generic / unmatched (mail.*)"),
    (r"^mx\d*\.", "Generic / unmatched (mx*.*)"),
]

# 2. ESPs — every include: in the SPF record, matched against esps.py
#    (Amazon SES, SendGrid, Mailgun, Mailchimp, Brevo, etc.)
# 3. SaaS senders — same mechanism, separate dictionary saas_senders.py
#    (Shopify, Atlassian, Pardot, KnowBe4, Trustpilot, etc.)
# 4. DMARC — _dmarc.<domain> TXT, parsed for p=, sp=, pct=, rua=
#    A domain is "enforced" if p ∈ {quarantine, reject} AND pct=100
#    (or pct absent — RFC 7489 §6.6.4 default is 100)
```
If no rule matches, the domain goes to "Unknown / Other" — never dropped — and its MX target is logged for the next dictionary iteration. Domains are never excluded. Total by category = total domains, always.
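That first-match scan with a guaranteed fallback can be sketched like this — the rule subset below is illustrative, not the full ~80-entry `mx_providers.py` dictionary:

```python
import re

# Illustrative subset of the rule dictionary; specific rules come
# before generic fallbacks, so outlook.com wins over a bare "mail." prefix.
RULES = [
    (r"\.mail\.protection\.outlook\.com$", "Microsoft 365"),
    (r"aspmx\.l\.google\.com$", "Google Workspace"),
    (r"^mail\.", "Generic / unmatched (mail.*)"),
    (r"^mx\d*\.", "Generic / unmatched (mx*.*)"),
]

def classify_mx(mx_target: str) -> str:
    """Return the label of the first matching rule, or the
    'Unknown / Other' bucket so no domain is ever dropped."""
    host = mx_target.rstrip(".").lower()
    for pattern, label in RULES:
        if re.search(pattern, host):
            return label
    return "Unknown / Other"
```

With this subset, `classify_mx("example-com.mail.protection.outlook.com")` yields "Microsoft 365", while `classify_mx("route1.mx.cloudflare.net")` falls through to "Unknown / Other" — exactly the behavior that produces the unmatched-target log.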
Receiving side: who hosts inbound mail
| # | Mailbox provider | Domains | Share |
|---|---|---|---|
| 1 | Unknown / Other | 171,044 | 25.91% |
| 2 | Google Workspace | 143,171 | 21.69% |
| 3 | Microsoft 365 | 107,277 | 16.25% |
| 4 | Generic / unmatched (mail.*) | 91,739 | 13.90% |
| 5 | Generic / unmatched (mx*.*) | 59,783 | 9.06% |
| 6 | Yandex 360 | 12,587 | 1.91% |
| 7 | Mimecast | 9,850 | 1.49% |
| 8 | Generic / unmatched (smtp.*) | 7,649 | 1.16% |
| 9 | Zoho Mail | 6,800 | 1.03% |
| 10 | Amazon WorkMail | 4,707 | 0.71% |
Two observations that contradict typical industry talking points:
1. The Google/Microsoft duopoly isn't dominant. Combined: 37.94%. The long tail of registrar email, regional hosters, small SaaS, and self-hosted setups is half the market (25.91% Unknown + 24.12% generic = 50.03%).
2. "Generic" doesn't mean self-hosted. A hostname like mail.example.com is the default for cPanel/DirectAdmin shared hosting (Hostinger, GoDaddy, OVH, hundreds of regional providers). The unmatched MX targets confirm registrars dominate the long tail:
| # | MX target | Domains |
|---|---|---|
| 1 | route1.mx.cloudflare.net | 7,728 |
| 2 | route2.mx.cloudflare.net | 7,727 |
| 3 | route3.mx.cloudflare.net | 7,726 |
| 4 | eforward5.registrar-servers.com | 6,930 |
| 5 | mx1.hostinger.com | 5,110 |
| 6 | smtp.secureserver.net (GoDaddy) | 5,078 |
| 7 | mx3-hosting.jellyfish.systems | 2,212 |
| 8 | mx1.privateemail.com (Namecheap) | 1,658 |
Cloudflare Email Routing alone shows up on ~23,000 domains across its three MX targets. It's a free email-forwarding service with no real inbox, but as primary MX it's massive — and worth its own writeup.
Yandex 360 at 1.91% looks weirdly low. It is — Tranco is biased toward US/EU and global SaaS. Russian domains with low international traffic are underrepresented. Any conclusion about the Russian email market from this dataset will be wrong; you'd need a separate .ru ccTLD slice.
Sending side: ESPs
Share of the 616,352 SPF-publishing domains. Sums to >100% — most domains authorize multiple ESPs (e.g., Mailchimp for newsletters + SendGrid for transactional):
| # | ESP | Domains | Share |
|---|---|---|---|
| 1 | Amazon SES | 36,148 | 5.86% |
| 2 | SendGrid (Twilio) | 28,695 | 4.66% |
| 3 | Mailgun | 25,066 | 4.07% |
| 4 | Zendesk | 24,053 | 3.90% |
| 5 | Mailchimp | 23,606 | 3.83% |
| 6 | Mandrill | 22,008 | 3.57% |
| 7 | Salesforce | 15,426 | 2.50% |
| 8 | Mailjet (Sinch) | 12,720 | 2.06% |
| 9 | Brevo (ex-Sendinblue) | 6,892 | 1.12% |
| 10 | Elastic Email | 4,399 | 0.71% |
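The per-ESP counts above come from scanning `include:` mechanisms in each SPF record. A minimal sketch — the suffix map is a tiny illustrative subset standing in for `esps.py`:

```python
# Illustrative subset: SPF include: target suffixes -> ESP names.
ESP_SUFFIXES = {
    "amazonses.com": "Amazon SES",
    "sendgrid.net": "SendGrid (Twilio)",
    "mailgun.org": "Mailgun",
    "servers.mcsv.net": "Mailchimp",
}

def esps_in_spf(spf_record: str) -> set[str]:
    """Return every recognized ESP authorized by an SPF record.
    A domain can (and often does) authorize several at once."""
    found = set()
    for token in spf_record.split():
        if token.startswith("include:"):
            target = token[len("include:"):].lower()
            for suffix, name in ESP_SUFFIXES.items():
                if target == suffix or target.endswith("." + suffix):
                    found.add(name)
    return found
```

A record like `v=spf1 include:amazonses.com include:sendgrid.net ~all` contributes one domain to both the SES and SendGrid counts — which is why the shares sum to more than 100%.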
Important caveats:
- Domain count ≠ email volume. Amazon SES leads by domains because it's the cheapest IaaS-tier sender — small projects pile onto it. Big mailing lists actually run on specialized platforms with worse domain counts.
- Mandrill is part of Mailchimp. Combined Intuit MailChimp share is 7.40%, formally first. But many Mandrill SPF includes are leftover from migrations — actual usage is lower.
- All numbers are a lower bound. SPF flattening — replacing `include:` chains with raw IP networks to fit the 10-lookup limit (RFC 7208 §4.6.4) — hides the original ESP. Sample checks suggest 5–15% of larger corporate domains are flattened. If they all unflattened tomorrow, the top ESPs would gain a few percentage points.
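A quick heuristic for flagging likely-flattened records in spot checks might look like this — the threshold of 10 raw networks is my assumption, not something from RFC 7208, and some domains legitimately publish raw networks:

```python
def looks_flattened(spf_record: str, ip_threshold: int = 10) -> bool:
    """Heuristic sketch: a record with many raw ip4:/ip6: networks
    and zero include: mechanisms was probably flattened from
    provider includes, hiding the original ESPs from this analysis."""
    tokens = spf_record.split()
    ips = sum(1 for t in tokens if t.startswith(("ip4:", "ip6:")))
    includes = sum(1 for t in tokens if t.startswith("include:"))
    return includes == 0 and ips >= ip_threshold
```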
The third layer: SaaS senders
Business apps that send mail on behalf of their customers:
| # | SaaS app | Domains | Share |
|---|---|---|---|
| 1 | Shopify | 5,446 | 0.88% |
| 2 | Pardot (Salesforce) | 5,191 | 0.84% |
| 3 | KnowBe4 | 3,309 | 0.54% |
| 4 | Trustpilot | 1,966 | 0.32% |
| 5 | Atlassian (Jira/Confluence) | 1,864 | 0.30% |
| 6 | Firebase (Google) | 1,614 | 0.26% |
| 7 | Lark / Feishu | 1,181 | 0.19% |
| 8 | BigCommerce | 1,157 | 0.19% |
| 9 | NetSuite (Oracle) | 1,139 | 0.18% |
| 10 | Qualtrics | 1,104 | 0.18% |
Two interesting datapoints here:
KnowBe4 at #3 (3,309 domains). KnowBe4 is security-awareness training — they send simulated phishing emails to your employees from your own domain (so the simulation is realistic). To do that, customers add KnowBe4 to their SPF. So 3,309 ≈ active KnowBe4 customer count in Tranco-1M, modulo SPF flattening. That's a public proxy metric for the security-awareness market — usually impossible to estimate without vendor reports.
Lark / Feishu at #7 (1,181 domains). ByteDance's Slack competitor showing measurable Western expansion. Useful trendline candidate.
DMARC: the compliance theater
431,133 domains publish DMARC = 69.95% of SPF-publishing domains. Looks great. The devil's in pct=.
Here are the 11 most common verbatim DMARC records:
| # | DMARC record | Domains |
|---|---|---|
| 1 | `v=DMARC1; p=none;` | 50,658 |
| 2 | `v=DMARC1; p=none` | 32,837 |
| 3 | `v=DMARC1; p=none; rua=mailto:rua@dmarc.brevo.com` | 7,172 |
| 4 | `v=DMARC1; p=quarantine;` | 4,365 |
| 5 | `v=DMARC1;p=none;` | 3,974 |
| 6 | `v=DMARC1; p=quarantine` | 3,636 |
| 7 | `v=DMARC1; p=reject; fo=1; rua=...@emaildefense.proofpoint.com; ruf=...` | 3,333 |
| 8 | `v=DMARC1; p=reject;` | 3,319 |
| 9 | `v=DMARC1; p=quarantine; adkim=s; aspf=s` | 2,978 |
| 10 | `v=DMARC1; p=reject` | 2,753 |
| 11 | `v=DMARC1; p=none; sp=none; rua=mailto:dmarc@mailinblue.com!10m; ...` | 2,353 |
Add up #1, #2, #5 — all p=none with no rua= — and you get 87,469 domains, 20.3% of all DMARC records. This is what I call DMARC compliance theater: the domain "has DMARC" formally, but there's no monitoring, no enforcement, no reporting endpoint. Usually this is auto-generated by a registrar setup wizard to satisfy the Google/Yahoo bulk sender requirements from Feb 2024.
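A minimal parser for sorting records into these buckets could look like this — a simplified sketch, with tag parsing looser than RFC 7489 requires and partial-`pct` enforcement lumped into "monitoring":

```python
def dmarc_bucket(record: str) -> str:
    """Classify a DMARC TXT record: 'theater' (p=none, no rua),
    'enforced' (quarantine/reject at pct=100), else 'monitoring'."""
    tags = {}
    for part in record.split(";"):
        if "=" in part:
            key, _, value = part.strip().partition("=")
            tags[key.strip().lower()] = value.strip()
    policy = tags.get("p", "").lower()
    pct = int(tags.get("pct", "100") or "100")  # RFC 7489 default: 100
    if policy in ("quarantine", "reject") and pct == 100:
        return "enforced"
    if policy == "none" and "rua" not in tags:
        return "theater"  # formal compliance, no monitoring, no enforcement
    return "monitoring"
```

Applied to the table above: rows #1, #2, and #5 land in "theater", row #3 in "monitoring", rows #7, #8, and #10 in "enforced".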
There's also a fun side effect of how vendors deploy DMARC at scale — their reporting endpoints are visible in DNS as fingerprints:
- Brevo — `rua=mailto:rua@dmarc.brevo.com` on 7,172 domains. Brevo auto-generates this template for customers.
- Proofpoint Email Defense — 3,333 enterprise customers route aggregate reports through `emaildefense.proofpoint.com`.
- Valimail — `dmarc_agg@vali.email` on ~3,700 domains. Note: Valimail publicly claims 65,000 customers — most of them are SMBs not in Tranco top-1M.
Practical implication for your own domain: counting DMARC adoption by record presence is the wrong metric. The right metric is enforcement rate — domains with p ∈ {quarantine, reject} AND pct=100. By that measure, the picture is much smaller, and the trend (+1.86% over the last 90 days) is slow.
What this measures, and what it doesn't
To be honest about it:
- DNS configuration ≠ email volume. All numbers are domain counts. We see declarations, not actual traffic.
- Tranco bias toward US/EU. RU/CN/JP/KR conclusions from this dataset will be wrong; you need ccTLD-specific slices.
- SPF flattening undercounts ESPs. Notable for large corporate domains (5–15%).
- CNAME chains on MX aren't unrolled. A domain with MX `mail.example.com` → CNAME → `example-com.mail.protection.outlook.com` lands in "Unknown" instead of Microsoft 365. This will be fixed in the next iteration.
- DKIM isn't measured at all. Reading a DKIM key requires knowing the selector (`selector1._domainkey.example.com`), which is arbitrary. OpenINTEL doesn't query it. There's no way to get DKIM coverage at this scale without active probing.
- Vanity-MX setups for security vendors (Mimecast/Proofpoint customers using their own brand on the MX) are undetectable from DNS alone.
- Snapshots only. Trends come from snapshot diffs. ESP/MX changes between snapshots are invisible.
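The CNAME caveat is mechanical to fix once the CNAME records are in hand. A sketch over an in-memory owner→target map — in production this would be fed by OpenINTEL's CNAME data or live resolution, not a dict:

```python
def unroll_cname(host: str, cnames: dict[str, str], max_depth: int = 8) -> str:
    """Follow a CNAME chain to its terminal hostname.
    `cnames` maps owner name -> CNAME target; max_depth and the
    seen-set guard against loops and runaway chains."""
    seen = set()
    for _ in range(max_depth):
        if host in seen:
            break  # loop detected — return the name we circled back to
        seen.add(host)
        if host not in cnames:
            return host  # terminal name: no further CNAME
        host = cnames[host]
    return host
```

Fed the example from the caveat list — `unroll_cname("mail.example.com", {"mail.example.com": "example-com.mail.protection.outlook.com"})` — the terminal name would then classify as Microsoft 365 instead of "Unknown".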
What you can do with this
If you run a domain:
- Audit your DMARC record. If it's `v=DMARC1; p=none;` with no `rua=`, you're in the compliance-theater bucket. Set `rua=mailto:dmarc@yourdomain.com`, let it run for 2–4 weeks, then move to `p=quarantine; pct=10` and ramp up. Tools: Postmark's DMARC Digests, dmarcian, EasyDMARC.
- Check whether your SPF is flattened. If your record has 30 raw IP networks instead of `include:` directives, you're losing visibility for analytics and bloating the TXT response every receiving resolver has to fetch. Modern platforms can auto-flatten safely; manual flattening is technical debt.
- Unroll CNAMEs on your MX. It doesn't affect deliverability, but it improves discoverability in security audits and industry analytics.
If you build email tooling:
- The combined Generic + Unknown ~50% is your TAM if you're building any kind of "mailbox-as-a-service" product targeting non-Google/non-Microsoft.
- The KnowBe4 / Pardot / Atlassian patterns are reusable: any SaaS that sends from customer domains leaves a fingerprint in SPF. You can build adoption metrics for any vendor with a known SPF include.
If you do research:
- The dataset is open and reproducible. Classifier dictionaries are public. OpenINTEL has reasonable data agreements for academic use. Pull requests for missed providers/ESPs are welcome — they'll be in the next daily run.
What's next
In rough priority order:
- CNAME unrolling for MX targets — should move some "Unknown" mass into Microsoft 365 / Google Workspace and tighten the duopoly estimate.
- BIMI keys (`default._bimi.<domain>`) — the brand-indicator follow-up to DMARC. Requires `p=quarantine|reject` with `pct=100` plus a VMC certificate; should be a small but interesting cohort.
- MTA-STS / DANE / TLS-RPT — the next layer of email security beyond authentication. OpenINTEL queries these; metrics not yet computed.
- Per-ccTLD slices — fix the Tranco bias for regional analyses (.ru, .cn, .de, .jp).
- Cross-tabs: domains with `p=reject` per ESP, DMARC adoption among Cloudflare Email Routing users, etc. — 2D slices unlock more interesting stories than 1D rankings.
- Historical reconstruction back to 2017 — OpenINTEL has the archive; building it into a time series.
Reproducibility & contributions
Each daily report ships with:
- The OpenINTEL snapshot date.
- SHA256 hashes of `mx_providers.py`, `esps.py`, `saas_senders.py`.
- The top-100 unmatched MX targets and SPF includes — published explicitly so anyone can suggest dictionary additions.
Raw OpenINTEL Parquet is deleted after analysis (per their data agreement); only aggregates persist.
Three classifications I already know need fixing for the next iteration:
- `eforward*.registrar-servers.com` → Namecheap email forwarding (currently generic)
- `route*.mx.cloudflare.net` → Cloudflare Email Routing (currently unknown)
- `mailstore1.secureserver.net` → GoDaddy (currently unknown)
Daily snapshots: https://check.live-direct-marketing.online/email-stats/
If you spot a misclassification or a missing ESP/provider, drop it in the comments or open an issue — it'll show up in tomorrow's run.
What I'd love to know in comments: are you tracking your DMARC enforcement rate? What does it look like for your stack? And if you've seen flattening hide an ESP in your own SPF, how did you debug it?
References
- OpenINTEL — https://openintel.nl/
- Tranco list — https://tranco-list.eu/
- van Rijswijk-Deij et al., "A High-Performance, Scalable Infrastructure for Large-Scale Active DNS Measurements", IEEE JSAC 2016
- Le Pochat et al., "Tranco: A Research-Oriented Top Sites Ranking Hardened Against Manipulation", NDSS 2019
- RFC 7489 — DMARC
- RFC 7208 — SPF
- Google/Yahoo bulk sender requirements (Feb 2024)
Top comments (2)
A quick follow-up I didn't include in the post to keep length down: I checked how route*.mx.cloudflare.net correlates with DMARC adoption — surprisingly, only ~31% of Cloudflare Email Routing users publish DMARC, vs ~70% baseline. Either CER attracts a less-security-conscious cohort, or the setup wizard doesn't push DMARC. Curious if anyone's seen this from another angle.
you might wanna take a look at this research: dmarcguard.io/research/email-authe...
it goes beyond Tranco's top-1M — it covers the entire 5.5M list, with broader and more comprehensive coverage.
shameless plug, I'm the author.