It's not DNS. There's no way it's DNS. It was DNS. — haiku on Chinese ink painting
0 days since it was DNS — it's always DNS sticker
McGonagall: Why is it always you three? DNS, BGP, DHCP — Harry Potter meme
Wait, it's all DNS? Always has been — astronaut meme with timeouts, bad certs, API failures
The #1 DevOps excuse for legitimately slacking off: waiting for the DNS cache to expire
Ancient Aliens guy — I'm not saying it was DNS, but it was DNS

Greatest Hits of the DNS Charts

  1. .com / .net (July 1997) ~4h

    An Ingres database failure at InterNIC corrupted the .com and .net zone data, which was then pushed to the root servers. For roughly four hours, essentially every .com and .net lookup came back broken — about 50 million users, most of the commercial internet at the time.

  2. dyn (October 2016) ~11h

    A Mirai botnet DDoS aimed at a major managed DNS provider dragged Twitter, Reddit, Spotify, Netflix, and a long tail of the US internet offline across three sustained attack waves.

  3. akamai edge dns (July 2021) ~1h

    A faulty configuration update tripped a latent bug in Akamai's Edge DNS. For about an hour American Airlines, Fox News, Steam, UPS, and a long tail of banks went dark — no attack, just a bad push to a shared dependency.

  4. meta / facebook (October 2021) ~6h

    A BGP config change withdrew the routes to Facebook's own authoritative DNS servers. Apps, internal tools, and even the office badge readers stopped working until engineers could physically reach the hardware.

  5. aws us-east-1 (October 2025) ~15h

    A DNS automation race inside us-east-1 cleared the regional DynamoDB endpoint record. DynamoDB powers a lot of AWS control planes, so the blast radius swept up Snapchat, Discord, Shopify, Reddit, Coinbase, Ring, Alexa, and most of amazon.com itself.
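Every entry on this chart ends at the same wire format: a resolver sends a question and either gets an answer back or it doesn't. As a minimal sketch of that question (Python stdlib only, no network I/O), here is roughly what a query for the DynamoDB endpoint record from incident #5 looks like per RFC 1035. The transaction ID is arbitrary, and the decoder handles plain label sequences only, not compression pointers.

```python
import struct

def encode_qname(hostname):
    """Encode a hostname as DNS labels: length-prefixed, zero-terminated."""
    parts = hostname.rstrip(".").split(".")
    return b"".join(bytes([len(p)]) + p.encode("ascii") for p in parts) + b"\x00"

def build_query(hostname, txid=0x1234):
    """Build a DNS query packet for an A record (RFC 1035, section 4.1)."""
    # Header: ID, flags (RD=1: recursion desired), QDCOUNT=1, AN/NS/ARCOUNT=0
    header = struct.pack("!HHHHHH", txid, 0x0100, 1, 0, 0, 0)
    # Question section: QNAME, then QTYPE=A (1), QCLASS=IN (1)
    question = encode_qname(hostname) + struct.pack("!HH", 1, 1)
    return header + question

def decode_qname(packet, offset=12):
    """Read the QNAME labels back out of a query packet (no compression)."""
    labels = []
    while packet[offset] != 0:
        length = packet[offset]
        labels.append(packet[offset + 1 : offset + 1 + length].decode("ascii"))
        offset += 1 + length
    return ".".join(labels)

query = build_query("dynamodb.us-east-1.amazonaws.com")
```

When the answer never comes — because the zone data is corrupt, the provider is under DDoS, the routes to the nameservers are withdrawn, or the record was simply deleted — everything built on that name fails the same way, which is why five very different root causes all read as "DNS is down".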