People often say that the government created the Internet. This is not true.
The Internet is a trillion dollars of fiber optic cables laid in the ground and under our oceans. Fiber optic technology was developed by corporations, such as Corning Glass Works, not the government. The trillion dollars in capital used to pay for laying cable came from Wall Street, not the government.
The one thing you might be able to credit the government with is standards. The early days of computing were a hodge-podge of networking standards. Only computers from the same vendor could talk to each other -- indeed, often only the same model of computer. The situation was like the railroad network in the pre-Civil War South: each state's rail network had a different track gauge, different turn radii, different slopes. As cargo was shipped across the South, it had to be offloaded from one rail network and loaded onto another, several times. After the Civil War, the U.S. government decreed a common railroad standard for the entire country so that it could move troops quickly anywhere in the country to suppress insurrections.
In response, international standards organizations created the "OSI" or "Open Systems Interconnection" group. The purpose of OSI was to create a single standard for all networks -- a worldwide "internetwork" that all computers could connect to. By 1990, developed countries (the US, Europe, Japan) had laws called "GOSIP" or "Government OSI Profile" requiring that all computers purchased by the government support the OSI network standard. All large corporations, such as IBM and HP, supported this standard on their computers.
What’s important about the Internet is that the OSI standard failed. It’s not the standard of today’s Internet. The government backed the wrong horse, so to speak. Instead, today’s Internet is based on TCP/IP -- a networking standard the government tried to kill off.
Back around 1980, there were many networking standards. One early effort to interconnect computers was known as BITNET. Most big universities had IBM mainframes. BITNET allowed those mainframes to be interconnected, so that people could exchange data and email. Another early effort was UUCP, which exchanged email over dialup lines (and other network connections, including TCP/IP and BITNET connections). DEC (Digital Equipment Corp.) was a hub for a lot of this UUCP traffic. Much of this funding came from private sources, not the government. If there was a worldwide network in that day, it was X.25, a networking standard supported by the telephone companies and used by big corporations.
The government was also involved. It was the height of the Cold War and the era of the “Star Wars” missile defense system. The Department of Defense (DoD) was throwing money at anything that might have military application.
When government agencies funded a research project, it would be a collaboration among researchers at different universities. The DoD wanted them to be able to talk to each other. Since the most popular computer system among their researchers was BSD Unix, the DoD paid a consulting firm (BBN) to add two networking standards to BSD Unix: Xerox XNS (one of many commercial network standards) and TCP/IP (one of many research network standards).
TCP/IP quickly grew to become the most popular research network standard. Unlike commercial standards (like Xerox’s XNS), no single entity controlled TCP/IP. Universities were free to redefine the standards at will. And that’s what they did.
During the 1980s, this was the open question about TCP/IP: nobody really controlled the standards. Those who might have controlled them declared the nascent TCP/IP internetwork an "official anarchy". Those who preferred company-controlled standards (from Xerox or IBM), or government standards (like OSI), looked down upon TCP/IP, declaring it would never work. How could standards exist without somebody putting their official stamp of approval on them?
But, it did work. De facto standards developed by acclamation, not proclamation. It worked thusly: two (or more) independent groups developed a way for computers to interoperate on a task (such as exchanging email), then they would document what they did so that anybody else could interoperate with them. You were free to interoperate with them, or to create a different way of solving the problem. When it became obvious that most everyone was using the standard that worked best, then and only then was it declared something like an "official standard". In fact, much of TCP/IP was inspired by corporations. They paid to get something working, then documented it so that others could interoperate, and those documents became Internet standards.
This was in sharp contrast to OSI. The way that OSI worked is that everyone would get together and spend years going to meetings, fighting for what they wanted in the official standard.
Eventually something would be created that tried to satisfy everyone, and a standard would be published. At this point, people would try to implement it. I say "try" because it didn’t actually work. Such standards were so bloated with features that they could never be fully implemented, and were full of problems that you would only find while trying to implement the standard. As a consequence, different people trying to implement the OSI standards could never really get their stuff to interoperate with each other.
So both sides thought the other side wouldn't work. Those working on TCP/IP standards felt that the official standards process would never produce something that worked, and the official standards bodies believed that nothing would work without an official stamp of approval. Even while Netscape was going IPO, setting off the dot-com revolution, government (and the big corporations sucking from the government teat) believed that OSI was the long-term future, and that the TCP/IP Internet was just a temporary research project.
So who gets credit for creating the Internet? Government? The military? Big corporations? Universities?
The answer is "all the above". The Internet is the product of a free society, everyone working together, and sometimes working at odds with each other. It's a triumph of an "official anarchy".
Government threw money at many networks, including the TCP/IP Internet. TCP/IP was influenced by many things, among them the government. But what the government most gave TCP/IP was its benign neglect, as it spent its guidance, vision, leadership, and energy on developing the OSI network. This history is important. If you believe those who say that it's government's unique vision that created the Internet, then you would naturally believe that the government should continue its successful strategy of regulating and controlling the Internet. If you believe, as I do, that the Internet is the product of "official anarchy", then you would agree that government should continue keeping its hands off it.
You may also be interested in this piece: Who created the Internet: Evolution or Intelligent Design?
(You might be wondering if this is an attack on NetNeutrality -- of course it is -- but I wrote the original draft of this 10 years ago, long before NetNeutrality was discussed. It's funny -- back in the 1980s, our chief fear was that a corporate monopoly [then: AT&T, today: Google] would successfully lobby the government to tell us how to route packets.)
That's not correct. TCP/IP was mandated by NSF. In order to be on NSFNET, and get funding from it, you had to be running TCP/IP. So, reluctantly, everyone switched to it. It was considered a crappy protocol, but what NSFNET mandates, supercomputing centers do.
Aaa..., you can't write history in the passive voice. You can't say "it was considered a crappy protocol", you have to say who, precisely, thought it a crappy protocol.
The NSFnet chose TCP/IP because they thought it was the best of all alternatives that met their needs at the time. I was using the Internet before the NSFnet went live in 1986 -- it was obvious to me that it was the best of all alternatives.
The NSFnet was one of many government-funded projects to interconnect computing centers. BITnet was a far more popular and better-funded project.
The NSFnet was never the most important part of the Internet, although it was often described as such. I remember back in 1987 on an exchange program in Germany doing traceroutes back to my account at Oregon State University -- the traceroutes did not go over the NSFnet backbone.
When I got into networking, it was on an IBM mainframe. Exactly the kinds of "islands of connectivity" you refer to.
I was working on the NASA Science Internet when the NSF stopped trying to control access in 1993, so my view was much more "ground level" than the arguments between OSI and TCP/IP.
One of the T-shirts I have from those heady days of yore, from Cisco, has a dozen different network protocols listed on it. Banyan Vines, DECnet, Appletalk, and even TCP/IP.
You might enjoy what I wrote about it:
If there is anything IPv6 suffers from, it's not a lack of planning. I use it as a perfect example of Second System Syndrome.
Or, "The Planners Are Back"
After having read your article, Robert, as well as that of Curt, I am certainly aware that your original premise is fair. I was interested to see your jab at net neutrality in the end and I understand the philosophical reasons behind it - mission creep with government bureaucracies and all that. One thing you didn't speak to is the problem of "you can't get there from here" which crops up in the monetized commercial internet today. It seems like you're saying that governments *read US Congress, FCC and the like* should leave the networking to those who own the networks. In the main there is good reason for this system to remain in place but it basically ignores the competing interests of those who own both networks and content. It's almost irrelevant which networking protocols won out in the end and who backed what in the beginning - the internet is now THE way we communicate. As such the government clearly has a compelling interest in ensuring a modicum of fairness and interoperability. I guess my disagreement is more a matter of degree than principle because I'm realizing that I don't REALLY want the people walking around congress trying to make the internet "better".
Your argument rests on your underlying belief that "the Internet is too important to be left to the free market". We libertarians counter with the argument that the Internet is too important to be left to government. The argument has nothing to do with the Internet, but is about your more visceral beliefs regarding markets and governments.
That Evil Companies will screen their Internet users in order to push Content is a myth. For one thing, such companies really don't exist. They don't exist because it's a bad business model. Whenever they try such evil activities, customers desert them, and they go out of business. No government regulation is needed.
Sony found this out when it bought up music companies, believing them to be a synergy with its popular Walkman. To its dismay, it found that these business interests competed with each other, and the lack of a Sony MP3 player that could play pirated music is what allowed Apple to take Sony's place as the premier gadget company.
There is enough competition in the Internet space such that if "breaking net neutrality" is bad for customers, then customers can go elsewhere. I agree with you that if there is no competition, then there is a problem, but I'd rather government make sure there is enough competition rather than dictating to the companies how to run their business.
I've been led to believe that it (or its precursor) really started back in the 50's so SAC, and other military entities, could still communicate with themselves even after a massive nuke attack. In any case, I highly doubt it was the gubbmint. More like Ma Bell.