According to an AP article, some of the 13 root name servers were overloaded recently. Overloading the root servers has long been one of the Top 10 Techniques for DoSing the Internet. It wouldn't actually stop packets from being routed, but resolving names like www.praetorian.com to their IP addresses would no longer work, so your browser couldn't reach websites.
However, the root system has been hardened against such attacks, especially after an incident in 2002.
One defense has been to split the workload. In the past, the root name servers would resolve the last two labels (like "example.com") and leave the rest for your ISP's resolver to continue resolving. Now they resolve only the last label (the top-level domain, like ".com") and hand off to separate TLD servers that do the second-level resolution. For the attacker, this means there are a lot more servers they need to attack.
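To make the delegation chain concrete, here is a toy sketch of iterative resolution over an in-memory hierarchy. The zone contents, server names, and the final address are all made up for illustration; a real resolver follows NS referrals over the network rather than a dictionary lookup.

```python
# Toy model of iterative DNS resolution (illustration only, not a real
# resolver). Each "server" maps a name suffix to either a final answer
# or a referral to the next server down the hierarchy.

ROOT = {"com.": ("referral", "TLD")}            # root now only knows TLDs
TLD = {"example.com.": ("referral", "AUTH")}    # TLD servers know delegations
AUTH = {"www.example.com.": ("answer", "93.184.216.34")}

SERVERS = {"ROOT": ROOT, "TLD": TLD, "AUTH": AUTH}

def resolve(name, server="ROOT"):
    """Walk the delegation chain from the root until an answer appears."""
    zone = SERVERS[server]
    labels = name.split(".")
    # Try the longest suffix of `name` this server knows about.
    for i in range(len(labels)):
        suffix = ".".join(labels[i:])
        if suffix in zone:
            kind, value = zone[suffix]
            if kind == "answer":
                return value
            return resolve(name, value)  # follow the referral downward
    raise LookupError(f"{server} has no data for {name}")

print(resolve("www.example.com."))  # -> 93.184.216.34
```

Notice that the root's only job here is to hand out the referral for ".com"; knocking it over doesn't touch the TLD or authoritative servers, which is exactly why the split spreads out the attack surface.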
Another defense, used first by the 'F' root server, is 'anycasting'. Its IP address, 192.5.5.241, does not route to a single machine, but instead to one of roughly 40 machines spread throughout the Internet. If you were to flood it from a single machine, you could only take out the one nearest you. You would need at least 40 separate sources to flood all 40 machines.
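A toy sketch of why that works: with anycast, every node announces the same address, and routing delivers each client's packets to whichever node is closest. The node names, regions, and hop counts below are hypothetical.

```python
# Toy illustration of anycast failover. Several nodes all "announce" the
# same service address; each client is routed to the nearest live node.
# Topology and hop counts are made up for illustration.

ANYCAST_NODES = {
    "f-root-sfo": {"alive": True},
    "f-root-nyc": {"alive": True},
    "f-root-ams": {"alive": True},
}

# Hop distance from each client region to each node (hypothetical).
HOPS = {
    "california": {"f-root-sfo": 2,  "f-root-nyc": 8,  "f-root-ams": 15},
    "new_york":   {"f-root-sfo": 8,  "f-root-nyc": 2,  "f-root-ams": 10},
    "europe":     {"f-root-sfo": 15, "f-root-nyc": 10, "f-root-ams": 2},
}

def route(client_region):
    """Pick the nearest live node, the way BGP anycast effectively does."""
    live = [n for n, s in ANYCAST_NODES.items() if s["alive"]]
    if not live:
        raise RuntimeError("all anycast nodes are down")
    return min(live, key=lambda n: HOPS[client_region][n])

print(route("california"))                      # -> f-root-sfo
ANYCAST_NODES["f-root-sfo"]["alive"] = False    # attacker floods one node
print(route("california"))                      # -> f-root-nyc (failover)
```

A single-source flood only ever reaches (and can only take down) the one node nearest to it, while everyone else keeps getting served.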
Yet another defense is caching by ISPs' resolvers. Caching brings its own problems, such as returning stale data, but it means that even if the root system failed, resolvers could still return good-enough results for a while.
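A minimal sketch of such a caching resolver, assuming a hypothetical `upstream` callable that stands in for querying the root system: answers are served from the cache until their TTL expires, and if the upstream is unreachable, the stale answer is handed back rather than failing outright.

```python
# Minimal sketch of a caching resolver with TTLs and a stale fallback.
# All names here are hypothetical; real resolvers honor per-record TTLs.

import time

class CachingResolver:
    def __init__(self, upstream, ttl=3600):
        self.upstream = upstream      # callable: name -> address
        self.ttl = ttl                # seconds a cached answer stays fresh
        self.cache = {}               # name -> (address, expiry_time)

    def resolve(self, name, now=None, serve_stale=True):
        now = time.time() if now is None else now
        if name in self.cache:
            address, expiry = self.cache[name]
            if now < expiry:
                return address        # fresh hit: upstream never queried
        try:
            address = self.upstream(name)
        except Exception:
            if serve_stale and name in self.cache:
                return self.cache[name][0]  # stale, but "good enough"
            raise
        self.cache[name] = (address, now + self.ttl)
        return address
```

The stale-fallback branch is the "good enough" behavior: during the window when the root system is down, everything the resolver has seen recently keeps working, and only genuinely new names fail.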
As a result of all these defenses, it's unlikely that DoSing the root servers would be a viable attack. A better approach would be to find a DoS vulnerability in popular software such as BIND or Microsoft DNS, catalogue all the servers that run it, then DoS them all at once.
One thing that I've always found curious is that the root servers don't use custom software, but instead run off-the-shelf platforms like Solaris and BIND. We created the Proventia IPS using custom network drivers, custom TCP/IP stacks, and custom protocol parsers. Using similar techniques, we could build a system that serves ten times the requests such systems can currently handle.