Chapter 2: How the Federal Government Created the Internet, and How the Internet is Threatened by The Government's Withdrawal

In a remarkable turn of societal imagination, many conservatives have begun picturing the computer age as the rejuvenation of small-scale entrepreneurial capitalism against the institutions of the nation state. Alvin Toffler has talked about "demassification"; George Gilder has cited the "quantum revolution"; Newt Gingrich has promoted decentralization of government to local regions. A steady stream of conservative analysis has argued that new technology has made government's role, especially the federal government's role, irrelevant and even dangerous to the healthy functioning of the economy. Even The Economist, a magazine with an early enthusiasm for the Internet and usually a somewhat more balanced eye, has described the success of the Internet as the "triumph of the free market over central planning. Democracy over dictatorship."[1] The new conservative view holds that the private sector is the font of technological and economic innovation, and that the federal government should get out of the way and leave economic development to the private sector, working occasionally with local governments to promote innovation and job creation.

Repressed in this bit of economic myth-making is the key role the federal government played in each step of the growth of the computer industry and in the birth and formation of the Internet. Further, this myth-making ignores the fact that, left to private industry, much of this computer technology would never have come to market and, in the case of the Internet, the result would have been less innovative and less of an economic engine for growth. In fact, it is unclear that the integrated communication and information exchange that is the hallmark of the federally created Internet would ever have emerged from the private visions and competition of industry.

As this dissertation explores the nature of regional economies and the incapacity of local government in the face of private industry, it is important to underscore the real power of, and the need for, national government involvement to assure not just equity but economic rationality. The initial creation of the Internet illustrates this contrast in a dramatic way.

The Internet is in many ways the product of central planning in its rawest form: planning over decades, large government subsidies directed from a national headquarters, and experts designing and overseeing the project's development. The government not only created whole new technologies to make the Internet a possibility; it created the standards for forms of economic exchange of information that had never been possible before.

The Internet has at times been compared to the interstate highway system. However, the analogy would hold only if employees of the federal government had first imagined the possibility of cars, subsidized the invention of the auto industry, funded the technology of concrete and tar, and built the whole initial system. Karl Polanyi has argued that the underpinnings of the rise of industrialization required extensive government involvement--from labor market regulation to the creation of international currencies--just to make the launch of industrialization possible. In a similar fashion, it is clear that a Promethean role was required for government to make information networking an economic reality.

The government achieved a technical success where private industry would never have sustained the decades of research and development required. Where private industry was working to create proprietary standards that would have stunted innovation and blocked shared networks, the government sustained and expanded open standards that encouraged the maximum sharing of resources. This allowed other public institutions like universities to develop a stream of free, quickly distributed software that enhanced the network and allowed continual and rapid innovation.

Beyond the technical success of the Internet, its development illustrated how federal planning and investment were key to the success of the private economic explosion of activity around the Internet. The government not only developed the hardware and software required but also trained most of the initial engineers and entrepreneurs who would launch businesses around the Internet. Almost every major company's electronic networking endeavors have their roots in former government employees, and many were direct spin-offs from the government itself, either as agencies or as federal contractors. It has been precisely this "big government" involvement in technology that has worked in symbiosis with entrepreneurial energy to create much of the technological and economic innovation of the last decades. By creating public domain systems and providing public domain technology, the federal government has helped avoid a world of purely proprietary technology dominated by monopolists that would have left little room for the entry of small startup companies or their associated innovation.

It is worth remembering that even as the Internet is now taken for granted, a very different vision of the "Information Superhighway" was being promoted by private industry early on. It was a vision of proprietary information networks like America Online and the Microsoft Network negotiating with cable companies and telephone companies to deliver incompatible information services to the consumer. The headlines in 1993 were not about the Internet and software companies like Netscape but rather about mergers and financial deals between those who controlled the cables to the home. All assumed that those who monopolized control of the physical hardware connecting homes and businesses would reap monopoly profits in selling information services.

Fortune magazine described the ultimately unsuccessful merger of TCI cable and Bell Atlantic telephone this way in 1993: "It was the bold stroke of two captains of industry bent on securing their share of whatever booty washes ashore when the interactive age finally arrives...When the dust settles, there will probably be eight to ten major operators on the highway, some earning their way mainly by collecting tolls for the use of their networks."[2] At the national policy level, twenty-eight industry players from the telecommunications and computer industries formed the Cross-Industry Working Team to advocate for their vision of the National Information Infrastructure (NII) being discussed by the Clinton administration. At its formation, the group announced that its top priority was assuring that a variety of billing options be built into the NII architecture to allow proper measurement of and payment for services.[3]

In many ways, this private vision harked back not to the original federal highway system but to the first transit system that criss-crossed the nation's land--the railroads. And in fact, that historical legacy gives some sense of what a privately designed (if not always privately funded) system would have looked like. In the 1840s and 1850s, the first large railways were built, usually with incompatible track widths, so that trains entering the same city could not switch directly onto another company's rail track. This was not accidental but a deliberate strategy by merchants sponsoring one railroad to avoid having another company (usually sponsored by merchants in a rival city) siphon off freight. It would take decades before the gauges of different train companies were all standardized and freight could be easily transferred from line to line for longer distances. Even as such standardization was achieved by the 1880s, giant railroad companies sought to create competing railway systems that could control enough territory to control the flow and pricing of significant portions of freight against competing systems, becoming the first major oligopolies in the US economy.[4]

This vision of toll roads laboriously tying together incompatible proprietary systems was the dominant vision of the Information Superhighway, with the Internet often dismissed as a toy for academics. The vision of monopoly carriers was so strong that rival companies called for a Justice Department anti-trust investigation when Microsoft launched its own on-line Microsoft Network (MSN) as part of its new Windows 95 operating system. The fear was that as gatekeeper of the operating system, Microsoft would establish an unfair advantage over other proprietary systems and gain exclusive control over distribution of content to many if not most consumers. However, the proprietary MSN system introduced in August 1995 was summarily dumped four months later in favor of making all content and access Internet-compatible. In a few short months, the corporate arch-foe of the Internet had admitted defeat at the hands of the Internet and instead hooked its own fortunes to the common public carrier created by the government.

What this chapter will detail is how and why those open standards triumphed over the corporate vision of competing standards. Steven Levy, a technology columnist and longtime chronicler of the free-spirited "hacker" ethic, argued, "the Internet can never be merely another profit center in their dreams of empire. Their power is based on monopoly, on controlling distribution. But the Net is built to smash monopolies."[5] While Levy may be too optimistic, it is clear that the technical foresight and the economic engagement of the Internet planners did what Microsoft's rivals could not: they created a system where smaller entrepreneurial companies had a chance against monopoly-oriented players.

The result of that government support has been an explosive boom in Internet-related industry. By reducing the cost of entry, the Internet has been crucial in making sure deep-pocketed corporations could not lock up whole areas in proprietary systems. Where phone systems, cellular systems, satellites and cable TV remain a competing mass of often incompatible systems, the Internet has forged an integrated system of data communication so compelling that every other technology is rushing to integrate itself into the Internet in order to take advantage of the free-flowing commerce exploding over its networks.

However, even as the broadest public began enjoying the fruits of the Internet nurtured for decades by the government, the new commercial companies spawned by the technology became a focus of resistance to government continuing its role in assuring standards and access. New software companies like Netscape--staffed by individuals like Netscape's Marc Andreessen, who had originally been funded by the government to create browser software--would fight to take control of Internet standards, while the many new "Internet Service Providers" (ISPs) would lobby hard for continued privatization of Internet backbones and regional access systems. In a few short years in the first half of the 1990s, both the governance structure and the integrated backbone system created by the federal government over decades were privatized.

The danger is that with public and private investment in long-term basic research falling, the burst of Internet commerce engineered by decades of public investment may be consuming these fruits of past investment while undermining support for future government investment and coordination for the next generation of innovation. Creaming profits from decades of public investment, most of these new companies have resisted any suggestion of obligation to the broader public. Ironically, most of these same companies that had complained about government's role would within a few years see a new danger of merger and monopolization of the new technology. By 1997, companies like Netscape and Sun Microsystems would be calling for antitrust investigation of Microsoft for its anticompetitive control of Internet standards, while ISPs and others would note the danger of new mergers placing control of over 50% of Internet traffic in the hands of one telecommunications company, WorldCom, as it swallowed up MCI. In a stunningly short time, the industry moved from one of the government nurturing a hothouse of entrepreneurial development and innovation to one of giant companies vying for monopoly control of standards and technology.

This chapter will outline this trajectory of federal planning and engagement, entrepreneurial explosion, government withdrawal, and subsequent merger and monopolization of standards and technology. Chapters three and four will explore how the regional dynamics of Silicon Valley, themselves largely supported by the federal government, would be the major counterweight to the proprietary monopolization appearing in the wake of the government's withdrawal from Internet governance.

The Origin of the Internet and Technical Triumph of the Government's Role

The Internet came out of a whole milieu of government funding for technology born largely during World War II and the Cold War. The Internet would succeed because government policy promoted these key elements:

* Long-range planning and investment for future technical and economic needs

* Promoting a professional network of experts--crossing from government to university to business research labs--that would guide the process in the broad public interest

* Focusing on open standards that were as inclusive of multiple technologies as possible

* Taking advantage of public space and volunteer energy, especially from universities, to create a stream of free, quickly shared innovations

* Building a critical mass of participants to make the network viable for a broad audience

Investment and Planning:

Markets for new technology exist on the margin: on the margin of the income individuals have to spend, on the margin of the costs companies have to develop that technology for the marketplace, and usually on the margins of any interest by private industry. Innovation in production that has immediate cost-saving or quality-increasing results is the more typical focus for corporate research and development. For this reason, most of the high technology we take for granted today was built not through private sector initiative but because government planners could fund technology for the ten, fifteen, or--in the case of the Internet--even twenty-five years needed to make an innovation cost-effective and standardized enough for the private market.

The modern computer industry was born out of the government-driven research of that era. While private industry in recent years has pushed forward the almost inexorable process of making computers cheaper and faster, computers still largely follow the design blueprint created in the war and post-war period under government funding and planning. In Britain, researcher Alan Turing used early computer prototypes to break the German Enigma code and would subsequently lay out much of the theoretical foundation for artificial intelligence research. In the US, the multi-talented John von Neumann would head government research, including on the Manhattan Project, that set computer design in the post-war period--so much so that most of today's computers are referred to in the scientific literature as "von Neumann machines." Claude Shannon, an early theoretician on the use of Boolean logic in computer circuit design, would lead work on defense-funded projects at Bell Labs that were critical to computer design until he became a professor at MIT, where he would further his work on information theory in computers. Another MIT colleague, Norbert Wiener, would outline the theory and challenges of what he first labeled cybernetics (until he rejected practical computer work due to his fears of its use in weapons of mass destruction).[6]

All of these individuals, along with many others, and their research were bound together by a haphazard set of government projects and programs put together in the frenzy of World War II and the early Cold War. It was in the 1950s, particularly under the psychological impact of Russia's Sputnik success, that the US government sought to regularize its research efforts for maximum technological and economic success. Given the biases in the US against government intervention, it seems inevitable that the engine for industrial policy would be defense-related. For the same reason, even federal intervention in highway construction and public education begun in the 1950s would officially be done in the name of defense.

However, President Eisenhower's personal experience made him distrustful of the bureaucratic interests in the Pentagon, which led him in the late 1950s to support the creation of new institutions largely independent of the specific military branches. One example was NASA, which ended up with much of the day-to-day applied research of the military at the time, while a new agency called the Advanced Research Projects Agency (ARPA) was created to help coordinate overall R&D spending by the military. The National Science Foundation, created outside the Defense Department in 1950, helped fund non-military research, although it would always have a strong relationship with the science-based military agencies.

A key appointment at ARPA came in 1962 when psychologist J.C.R. Licklider was hired to head a behavioral sciences office there, an office that would evolve under Licklider's two-year directorship into the Information Processing Techniques Office (IPTO), which would direct the original creation of the Internet. As a researcher at MIT, Licklider had become interested in how people and computers could interact to augment human activity and had been pulled into the evolving discipline of designing computer interfaces. Originally, he had been distressed by the amount of time he spent as a researcher just obtaining the knowledge from different places needed to even begin thinking. His original research was on how the brain processes sound and perceives pitch, but the preparation of graphs for that work overwhelmed everything else he did. He found the use of computers totally alien to his needs as a thinker and researcher and instead proposed a radical redesign of computing, away from data processing and toward the support of human thought.

His pre-ARPA career became embedded in the military-scientific network of the day. As a researcher at MIT, he worked at Lincoln Labs where much of the university's military research was done. Licklider was involved in the Whirlwind program, which was designing computers for the Defense Department's ground-based air defense program (known as SAGE, for Semi-Automatic Ground Environment). Whirlwind used the first computers where individuals could get information from a computer through visual displays rather than through punchcards; Licklider became obsessed with a vision of "interactive computing" as an alternative both to the typical automation use of computers and to the idea of artificial intelligence where computers would think like people. Instead, he envisioned a new kind of computer-based library to manage information more effectively to improve human thought, working off an earlier vision by Vannevar Bush, who had promoted the idea of using microfilm in a similar manner. In 1960, he published a paper, "Man-Computer Symbiosis," that envisioned using computers to assist people in processing data in ways never imagined before. To accomplish this, not only would computers have to communicate better with people, but they would have to allow people to communicate with each other more effectively. Around the same period, he began working at Bolt Beranek and Newman (BBN), a consulting firm largely staffed by MIT graduates and brilliant drop-outs which was often referred to as the "third university" in the town of Cambridge. BBN had recently obtained one of the first minicomputers, the DEC PDP-1, which gave Licklider even more time to learn different ways to interact directly with a computer.[7][8]

When ARPA director Jack Ruina wanted to computerize military command functions at all levels, Licklider's experience as both a psychologist and an expert on computer interfaces made him an obvious candidate for the job. So in October 1962, Licklider became head of the IPTO office at ARPA. His principal focus as director was on expanding the possibilities of his vision of interactive computing, from computer graphics to improved computer languages to time-sharing--the use of the same computer by multiple users simultaneously. Licklider was now in the position to support computer labs around the country, usually with thirty to forty times the budget to which they were accustomed.

ARPA would fund six of the first twelve time-sharing computer systems in the country, which in turn would help spark the whole minicomputer industry in the 1960s--crucial to the industry and the Boston-area regional economy at the time, and just as crucial to the development of the Internet over the following decades. A key creator of the minicomputer, the Digital Equipment Corporation (DEC), had been founded by a team of MIT researchers from Lincoln Labs and would maintain a close relationship with the university. MIT researchers would continue to work closely with DEC, and the ARPA-funded Project MAC would design most of DEC's breakthrough time-sharing minicomputer, the PDP-6, which would be a crucial machine for networking efforts across the country.

In 1964 Licklider would recommend MIT graduate student Ivan Sutherland, a pioneer in computer graphics, as his successor; Sutherland in turn would hire NASA employee Bob Taylor as his assistant (and, in 1966, his own successor). Taylor would continue Licklider's vision; in fact, at NASA and at ARPA, he would push forward major funding for the Stanford Research Institute to create an "Augmentation Research Center" run by Doug Engelbart, a man whose thinking had paralleled Licklider's in his conception of interactive computing. (Chapter 3 will discuss Engelbart's center and its role in the Silicon Valley technology explosion much more extensively.) As head of IPTO, Taylor was frustrated that even though he had the power of several different computer systems in his office, all had different computer terminals and could not communicate. Taylor decided that funding a computer networking project should be a high priority, especially since researchers around the country often needed to share expensive computer resources at locations far from their primary research location.

At the same time in the early sixties, researcher Paul Baran had begun planning how to build the technology necessary for the goal of networking computers. Baran worked at the RAND Corporation, a company set up to monitor and preserve the US's operations research capability, where he worried about the survivability of US communication networks in the case of nuclear war. Modeling his ideas partially on the redundancy of neural networks in the brain, Baran envisioned a move from analog to digital signals that could travel over such a networked system of digital transmission. Instead of a central switching node where a wire between two points would be reserved specifically for the sound signals of a single conversation, such a system would be a "distributed network" with each node connected to its nearest neighbors in a string of connections, much like the child's game of telephone. More dramatically, messages would be broken down into parts that would travel the network and be reassembled at the opposite end, in a system called packet switching. This would allow fuller use of all lines in the network instead of holding lines open from end-to-end for each message. Each node would keep track of the fastest route to each destination on the network (and be constantly updated with information from adjoining nodes) and help route information without need of central direction.
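The logic Baran envisioned can be made concrete in a few lines of code. The sketch below is purely illustrative--a toy five-node topology in Python, with invented names and no relation to actual RAND or ARPANET code--but it captures the two ideas just described: each node learns routes only from its neighbors, with no central switchboard, and messages travel as independently routed packets that are reassembled at the destination.

    # Toy illustration of a distributed, packet-switched network.
    # Each node is linked only to its nearest neighbors.
    LINKS = {
        "A": ["B", "C"],
        "B": ["A", "D"],
        "C": ["A", "D"],
        "D": ["B", "C", "E"],
        "E": ["D"],
    }

    def build_routing_tables(links):
        """Nodes repeatedly swap distance estimates with their neighbors
        (a distance-vector scheme) until nothing improves; no central
        authority ever computes a route."""
        # table[node][dest] = (hop_count, next_hop)
        table = {n: {n: (0, n)} for n in links}
        changed = True
        while changed:
            changed = False
            for node, neighbors in links.items():
                for nb in neighbors:
                    for dest, (dist, _) in table[nb].items():
                        if dest not in table[node] or dist + 1 < table[node][dest][0]:
                            table[node][dest] = (dist + 1, nb)
                            changed = True
        return table

    def send(message, src, dst, tables, chunk=4):
        """Break a message into packets, forward each one hop by hop using
        only local tables, then reassemble it in order at the destination."""
        packets = [(seq, message[i:i + chunk])
                   for seq, i in enumerate(range(0, len(message), chunk))]
        received = {}
        for seq, payload in packets:
            node = src
            while node != dst:
                node = tables[node][dst][1]   # each node knows only its next hop
            received[seq] = payload
        return "".join(received[s] for s in sorted(received))

    tables = build_routing_tables(LINKS)
    print(send("DISTRIBUTED NETWORK TEST", "A", "E", tables))

Because every node holds its own route to every destination, knocking out any single node or line in a richer topology would simply cause the tables to re-converge around the damage--the survivability property that motivated Baran's design.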

RAND was enthusiastic about Baran's ideas, but when AT&T was approached about their feasibility, AT&T executives dismissed the idea and even refused to share information on their long distance circuit maps; Baran had to purloin a copy to evaluate the ideas which he and RAND were convinced were right. Based on RAND's recommendation, the Air Force directly asked AT&T to build such a network, but AT&T still refused, saying it wouldn't work (although a faction of scientists at Bell Labs did support the idea). This may have been technical myopia by the business-oriented executives, but it was an economically self-interested myopia derived at least partly from a political box in which the federal government itself had put AT&T. Such a distributed network threatened (and today does threaten) the central economic assets of the telephone industry: central computers and central switches. On one level, AT&T's resistance highlights the fact that corporate research labs, the main alternative to long-term government funding of technology, rarely if ever invest in fundamental technology that will likely undermine the economic monopolies they currently enjoy. Compounding the problem for AT&T was the fact that the first winds of deregulatory attack on the company were blowing and, specifically, the company was increasingly barred from selling anything having to do with computers attached to the phone network. As will be detailed in Chapter 5, this emerging deregulation and its political division between the phone network and the increasing computerization of telecommunications would have perverse consequences for both technology and regional economies.

The Air Force contemplated building the network by itself but bogged down in internal organizational problems in getting the project off the ground. In the meantime, British physicist Donald Davies had begun promoting a similar idea of a computer network exchanging "packets" of information. He soon learned of Baran's similar ideas and was encouraged enough to win the backing of the British Post Office, which ran the telephone system in Britain, for the concept. Without the political issues of deregulation, the state-run phone system in Britain treated the project as a straightforward demonstration project. In 1968, the first distributed computer network was established on computers all located at the National Physical Laboratory where Davies worked.

Building on Davies' example, ARPA began its own networking project, hiring Larry Roberts from Lincoln Labs to oversee it. A certain hostility from many East Coast universities to sharing scarce computing resources led to the network starting out at four western sites: the University of California Los Angeles (UCLA), UC Santa Barbara, the University of Utah, and the Stanford Research Institute's Augmentation Research Center. The idea was to install a new computer at each site as part of the network: this would avoid direct incompatibilities in the network and allow each campus to focus on a separate interface between its regular campus computers and the local network computer.

In 1968, ARPA advertised a bid for building the subnet computers (which would be called "Interface Message Processors" or IMPs). IBM and other big computer companies declined even to make a bid, saying it was not possible at a reasonable price. As with AT&T, this was partially the myopia of those grounded in older technology, but it was also a self-interested economic fear of the new minicomputer technology, supported by the federal government, that was challenging the dominance of companies like IBM. IBM and others rightly feared that networking would make many government agencies and businesses rethink the need to actually own their own mainframe computer.

While the defense contractor Raytheon almost won the contract, in the end Licklider's old consulting firm Bolt Beranek and Newman (BBN) convinced ARPA that its relatively small operation (600 employees) could do the best job. With its ties to MIT, including a workforce made up largely of MIT graduate students, BBN had a good case for implementing technology largely developed at that university. With a large ($1 million) contract, it was able to take on a project it could never have done on its own without the guaranteed market the government was providing.

By October 1969, the network connection between UCLA and Stanford was established and within months, all four "nodes" plus BBN itself were online. By the time the network was demonstrated publicly for the first time at the International Conference on Computer Communications in October 1972, there were twenty-nine nodes in the network (dubbed at this point ARPANET) clustered largely in four areas: Boston, Washington DC, Los Angeles and San Francisco. What would evolve into the Internet had been born.

Professional Network of Experts:

Beyond long-term planning, a key to assuring that the Internet would expand in a dynamic way was the creation of a network of public-minded experts who helped guide the expansion of the network. Despite odes to the "anarchy" of the Internet, this was a closely supervised anarchy directed to the specifications of government yet marshalling the broad professional, volunteer and eventually commercial resources of the emerging computer elite. In many ways, the very skill of the government in marshalling those resources with a light hand is a source of the sometimes rhetorical amnesia over its role. The smoothness of the Internet's creation and the building of a broad consensus over its shape created so much legitimacy for its design that it was seen less as a creation of "the government"--i.e., "them"--than as a broad creation of society as a whole.

The development of the Internet is in this way a perfect illustration of the ideal of "embedded autonomy" described by Peter Evans where a Weberian bureaucracy at ARPA and other federal agencies operated under broad professional norms and where "individual maximization must take place via conformity to bureaucratic rules rather than via exploitation of individual opportunities presented by the invisible hand."[9] The effectiveness of the ARPA bureaucracy was maintained by marshalling support from a broad range of other government agencies, from universities and from the private sector which eventually saw an interest in a broad networking environment for online commerce.

Licklider had actually started this professional network at ARPA in the early 1960s when he reached beyond traditional experts at federal agencies and national labs to gather an association of experts interested in communication technology. He oriented ARPA to establish contacts with university researchers around the country, creating what he presciently called the Intergalactic Computer Network, which helped connect researchers interested in computer networking.

When ARPANET was created, UCLA was funded to establish a Network Measurement Center to oversee the evolution of the network. Forty grad students at UCLA, many of whom would become key leaders in both the public and corporate Internet world, helped run the center and coordinate with other researchers in developing the standards for running the ARPANET. The new technology itself helped add a whole nationwide group of researchers and graduate students to these deliberations to help mold the evolution of the Internet. This national body became the Network Working Group (NWG), which was expanded after the 1972 "debut" conference to become part of an International Network Working Group to promote international computer networking. Management of Internet "addresses," critical for the decentralized packet switching network, would be housed at SRI in an institution called the InterNIC.

ARPA would replace the NWG with a more formal Internet Configuration Control Board (ICCB) in 1979 to extend participation in the design of the Internet to a wider range of members of the research community. This was especially important as the ARPANET expanded to include a range of other government agencies and bodies and evolved into the diversity of the emerging Internet community. The ICCB was later replaced by the Internet Activities Board (IAB), which used a set of ten task forces to include a wide range of experts in the evolution of the Internet. As the Internet was privatized in the early 1990s, the private sector (led in many cases by former researchers for ARPA and its Internet-related funded projects) created the Internet Society in 1992, and the IAB reconstituted itself as the Internet Architecture Board and joined the Internet Society.[10]

At each step of its development, ARPA and associated government agencies expanded participation to an ever widening set of experts and technological leaders who, in turn, would encourage others in their academic, scientific, community or business realm to support the effective development of the Internet. As well, the continual movement of personnel back and forth among academic, government and (eventually) business positions created a cross-fertilization of ideas and a loyalty to the emerging network over any particular organizational loyalty. What is left open to question (as will be described) is what the increasing privatization of the Internet's governance means for maintaining this same healthy "embedded autonomy" in furthering a public-oriented Internet.

Creation of Standards:

Beyond funding the first computers and wires that connected the initial sites on the Net, the federal government's guidance ensured the creation of a shared set of standards for communication. It created not just its own initial ARPANET but set standards which could integrate all sorts of computer networks into what would become the Internet. This was crucial to the explosion of innovation and economic value of the Internet over time.

The reason for this is embodied in what has been called Metcalfe's Law (named after Bob Metcalfe, an ARPA-funded researcher and the founder of the networking company 3Com). This law argues that the value of a network does not increase linearly as computers are added to the network but instead grows roughly as the square of the number of users. This means that the simple act of integrating different networks which were previously incompatible is a recipe for explosive increases in innovation and economic value, since a network of networks is worth far more than the sum of its parts.
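The arithmetic behind this claim is worth making explicit. The short sketch below (in Python, with user counts invented purely for illustration) uses the usual back-of-the-envelope version of Metcalfe's Law, counting the possible pairwise connections among n users:

    # With n users, the number of possible user-to-user connections
    # is n*(n-1)/2, so a network's potential value grows roughly as n^2.
    def metcalfe_value(n):
        return n * (n - 1) // 2

    # Two incompatible proprietary networks of 1,000 users each...
    separate = metcalfe_value(1000) + metcalfe_value(1000)   # 999,000
    # ...versus the same 2,000 users on one interconnected network.
    merged = metcalfe_value(2000)                            # 1,999,000

    print(merged / separate)   # ~2.0: integration alone doubles the value

Merging the two networks roughly doubles the total value without adding a single user--precisely the integration logic that incompatible proprietary systems forfeited.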

This is the reason why proprietary information services like America Online and the Microsoft Network lost out to the Internet. While those services were incrementally increasing the value of their systems through enhanced "content," the Internet was exponentially exploding its value as each individual computer network was integrated into the broader network of networks. And the key was that only the government had the interest in creating a non-proprietary system where no monopoly rents could be collected. In a sense, the Internet is nothing more than those standards: any computer over any wire can conceivably be hooked up to the Internet as long as that computer knows how to communicate in the Internet's language of information exchange.

ARPA oversaw the first standard host-to-host network protocol in 1971, allowing a person at one computer to log onto other computers on the network as if they were local users. This early protocol would eventually be supplanted by the standard Transmission Control Protocol (TCP) and was complemented in 1972 by the File Transfer Protocol (FTP), which allowed individual files to be exchanged between different computers. In 1976, DARPA (with Defense recently added to its name) hired Vint Cerf, an original UCLA graduate student at the launch of the ARPANET and by then a Stanford professor, as a program manager to work with Bob Kahn, a former BBN manager on the ARPANET and now working at DARPA, to create a system for integrating the ARPANET into other computer networks. By 1977, they had demonstrated the Internet Protocol (IP), which could be used to integrate satellite, packet radio and the ARPANET. From this point on, new networks of computers could be easily added to the network. In 1981, DARPA funded researchers at UC-Berkeley to include the TCP/IP networking protocols in UCB's popular public version of the UNIX operating system, thereby spreading the Internet standards to computers throughout the world.
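The practical meaning of those standards is easy to demonstrate today: any machine that speaks TCP/IP can exchange data with any other through a general-purpose socket interface, regardless of vendor. The sketch below is a minimal modern illustration (Python on a single machine, with an invented loopback port--obviously not 1970s ARPANET code), showing the "language of information exchange" reduced to a few standard calls:

    import socket
    import threading

    HOST, PORT = "127.0.0.1", 9090   # illustrative loopback address and port
    ready = threading.Event()

    def echo_server():
        # Any TCP/IP host can offer a service this way, whatever its hardware.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind((HOST, PORT))
            srv.listen(1)
            ready.set()                       # signal that we are accepting
            conn, _ = srv.accept()
            with conn:
                data = conn.recv(1024)        # TCP delivers the bytes reliably;
                conn.sendall(data.upper())    # IP handled addressing and routing

    threading.Thread(target=echo_server, daemon=True).start()
    ready.wait()

    # ...and any other TCP/IP machine can talk to it, with no shared vendor,
    # operating system or proprietary network service in between.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"open standards, not proprietary wires")
        print(cli.recv(1024))

That neutrality of the socket interface is the whole point: the standards, not any company's wires or software, define membership in the network.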

An example of the cross-fertilization of staff and ideas outside the government was the case of Bob Metcalfe and Ethernet. Metcalfe had designed the interface that originally connected MIT to the ARPANET and was hired in the mid-1970s at Xerox Corporation's new Palo Alto Research Center (PARC), which was headed by Bob Taylor, the former IPTO head who had started the ARPANET project. (We will discuss PARC itself much more in chapter 3.) Metcalfe was doing DARPA-funded work while trying to figure out how to cheaply network PARC's experimental personal computers. Using models from ARPA's project on radio packet switching, Metcalfe created a system called Ethernet to exchange information between computers in what would come to be called Local Area Networks (LANs). Ethernet was crucial for the expansion of the Internet, since local computers could be networked together and then connected to other networks using the TCP/IP protocol and local router computers. Xerox would start selling Ethernet as a commercial product in 1980 (and Metcalfe would found 3Com to sell networking technology), while PARC head Bob Taylor donated millions of dollars of Ethernet equipment to universities to help expand the use of networking on campuses.

In all these ways, DARPA helped shepherd open Internet standards into the 1980s and 1990s, when they would be used to radically expand the network to a wide range of users. In doing so, it was clear that the professional norms promoted by DARPA and the community of researchers were critical in keeping individual profit-taking from undermining those open standards. As one example, in 1973 then-IPTO head Larry Roberts was hired by BBN to run a company subsidiary called TELENET that would run private packet switching networks. In coming to BBN, Roberts carefully deflected a bid by BBN to take over ARPANET privately. J.C.R. Licklider, who returned from MIT to ARPA to replace Roberts as head of IPTO, soon found himself in conflict with his old employer BBN, which was refusing to publish the original computer code for the IMP computer routers it had designed. Making matters worse, BBN was becoming more and more reluctant to fix software bugs faced by the system (no doubt preferring to concentrate on programming for its for-profit TELENET subsidiary). Licklider, in the name of the openness of the Net, threatened to hold up BBN's federal contract funds unless it released the code publicly. BBN did so, thereby enhancing the tradition of open code in the development of standards.

Ironically, as data networks spread in the 1980s, it was the government experts at DARPA and the universities who backed the flexible, tested TCP/IP protocol, while big private companies like IBM, MCI and Hewlett Packard adopted an untested, bureaucratically inspired standard created in international committees called OSI. Vint Cerf, who had been hired by MCI to build its message networking system, remembers, "So I had to build MCI Mail out of a dog's breakfast of protocols."[11] It was only with the technical dominance of the Internet that most private industry would convert to the public TCP/IP protocol.

Taking advantage of public space and volunteer energy, especially from universities, to create a stream of free, quickly shared innovations

A key part of the success of the Internet was the fact that the public space of the network harnessed the energy of universities, both paid staff and volunteers, to provide a continuous stream of free software to improve its functionality. The Net itself allowed any new innovation to nearly instantaneously ricochet across the nation, even the world, without the friction of the costs of either distribution or purchase. This "gift" economy allowed new innovations to be quickly tested and gain a critical mass of users for functions not even envisioned by the creators of the system.

The origin of this ethic of shared software, what was called the "hacker ethic" before the term became associated with electronic vandalism, was largely born at the ARPA-funded Project MAC at MIT. Designing new software and sharing the unexpected results became part of the way of life of students, an ethic that would contribute to both the creation of the Internet and the launch of the personal computer revolution. Games like Spacewar and Adventure were created and widely shared as hacker outposts appeared across the country in the 1960s and 1970s.

Probably the most pervasive example of this hacker ethic was the early use of the ARPANET for electronic mail. Not even planned as part of its design, email was created as a private "hack" by BBN engineer Ray Tomlinson in 1972 as a piggyback on the file transfer protocol. Under the tolerant supervision of ARPA, use of the Net for e-mail communication soon surpassed actual computing resource sharing. Stephen Lukasik, ARPA director from 1971 to 1975, saw the importance of e-mail for long-distance collaboration and soon began virtually directing ARPA via electronic mail and the 20-lb. Texas Instruments portable terminal that he brought with him everywhere. Partly because of Lukasik's own frustration in dealing with the stream of raw mail, IPTO director Larry Roberts himself wrote the code for the first mail manager software, called READ, which was soon supplanted by the popular MSG, which introduced the reply function.

In 1975, the first e-mail list, the MSGroup, was established to discuss standards and etiquette in e-mail exchange, along with promoting free speech in the sometimes uneasy relationship with a defense-funded network. Other e-mail lists, along with the broad USENET set of discussion bulletin boards, would follow.

The game of Adventure was ported onto the Net in 1976 and became a huge attraction as people began downloading it from all over the country. That game and many to follow would hook many people on computers and help establish the ethic of freely shared software far outside the original hacker enclaves.

What brought the Internet into its own, though, was the "Gopher" software developed at the University of Minnesota in the early 1990s. Building on the existence of individual Internet sites where files and programs could be retrieved after logging in to a particular computer over the network, Gopher was a piece of software that could be used to create personalized lists of files from computers all over the Net and allow computer users to retrieve any file chosen from the list. With this innovation, the Internet became one giant hard drive that could be organized and presented to a particular set of users in whatever way made the most logical or aesthetic sense. Gophers sprang up on computers run by governments, universities, community organizations and businesses beginning to stake a place on the Net. In a visual way, the Internet's vast resources could be presented and reached through Minnesota's "All the Gopher Sites in the World" gopher site. For most customers of commercial service providers like America Online, gophers were their initial contact with the world of the Internet, and it created a hunger for the content they knew existed outside the proprietary walls of those commercial providers.

The next step, and the step that brought the Internet into almost daily headlines, was the World Wide Web. The Web was initially designed at the European Particle Physics Lab (CERN) in Geneva, Switzerland to share information internally--what would be designated an Intranet today. However, people quickly saw it as a useful way of sharing information between computer systems much like the Gopher software, with the additional advantage of "hypertext" connections to internal parts of documents. In 1993, computer science students funded at the National Center for Supercomputing Applications located at the University of Illinois created Mosaic, the first Web browser that added the display of graphics to the traditional text display. With an almost unnerving speed, Web sites exploded across the Internet along with the browsers needed to view them.

As Netscape, Microsoft and others have created commercial versions of the Mosaic browser, they have generally been forced to give the software away to individual customers in hopes of selling related servers and other software to corporate customers. This commercial form of the Internet "gift economy" has raised the danger of private companies using such giveaways to distort standards for corporate profit rather than to build on others' work. It is a testament to the durability of Net standards that they have restrained even companies like Microsoft from fully undermining the open Internet.

Building a critical mass of participants to make the network viable for a broad audience

Despite the initial planning and funding of the technology, the Internet would not have become a mass phenomenon in the United States without the government guiding its expansion to the point where, with a critical mass of participants creating a diversity of free content, it became a viable alternative to the proprietary corporate systems. It might have remained a dynamic but limited tool of a small handful of researchers and elite government agencies--essentially what existed until just recently in countries like Japan and parts of Europe. Instead, the federal government helped democratize the system to a broad enough base that it developed an expansionary dynamic of its own.

Where ARPA had been the key government agency in funding the creation of the network and guiding its technical development, the National Science Foundation (NSF) played the key role in bringing the Internet to the masses--or at least the academic masses. The NSF had been born in 1950 to help promote basic science research with a focus on the academic world. As early as 1974, an NSF advisory committee promoted the idea of a computer network to "offer advanced communication, collaboration and the sharing of resources among geographically separated or isolated researchers."[12] Nothing came of this proposal, but by the late 1970s, computer science departments were mushrooming. The ARPANET was threatening to split the computer science community into communication haves and have-nots; only 15 of the 120 campuses with academic computer science departments had ARPANET connections by 1979. Having an ARPANET connection became a key factor in whether a school could lure top grad students and faculty. There was also the fear that high industry salaries combined with poor campus facilities were threatening to drain academia and leave no one to train the next generation of computer scientists. This was one of the first examples where information networking technology was threatening to centralize resources, both intellectual and economic, in fewer geographic places rather than decentralizing them.

The solution proposed was the Computer Science Research Network (CSNET), founded by six campuses in 1979. At first, the campuses were planning a private system with no connection to the ARPANET, but the NSF pushed for a proposal that gave universities the option of a full ARPANET connection, a more limited connection, or an e-mail-only connection based on the resources available to each campus. The NSF agreed to manage the network for the first two years (passing it on to a university consortium after that) while putting $5 million into the CSNET project to get it off the ground. By June 1983, more than seventy campuses were on-line, and by 1986, nearly all the computer science departments in the country along with a number of private computer research sites were on-line. NSF then turned to funding other campuses without computer science departments to become part of a broader network. Approximately 1,300 of the roughly 3,600 four-year institutions in the U.S. were assisted at a cost of about $30 million. Even as the NSF phased out much of its participation in the Internet, institutions of higher education, including technical and community colleges, could apply for $20,000 in connection assistance--only a small portion of the total cost, since a school had to provide access to all qualified faculty and students. But the grants ended up being good leverage for broader access to the network.[13] What is surprising in many ways is how little (relatively) was spent on this assistance, especially when compared to the tens of millions spent on proprietary networking in the private sector during the same period, and how much bang for the buck resulted. Of course, the very nature of open standards meant that every expansion of the network so expanded its value that institutions were eager to pay a portion of the costs themselves to attach to this ever more valuable resource.

At the same time, NASA, the Department of Energy and other agencies were funding several special purpose networks which, combined with the ARPANET and CSNET, officially became designated the Internet as the TCP/IP protocol tied them together.[14] The same forces that drove adoption of TCP/IP protocols in the US helped bring their adoption internationally, thus encouraging the international Internet community.

The most visionary step taken by NSF came in 1985 when it built a backbone data connection between the five recently established supercomputer centers. NSF agreed to build and pay for the backbone only on the condition that the regional centers build regional community networks to assure broad access to the backbone and the other regional networks. In 1987, the NSF awarded a 5-year, $14 million grant to Merit Inc.--a consortium of 8 state-supported colleges and universities--to manage this backbone and link all regional networks to the supercomputing sites and the overall backbone.[15] NSF not only funded the backbone but also helped fund the startup of the non-profit community networks, along with funding universities not on the CSNET to participate. With the free NSF backbone provided as a common good, these community networks expanded rapidly. Out of this initiative came the basic regional network connections to the Internet--NYSERNET in upstate New York, BARRNET in the Bay Area, CerfNet in San Diego, SURANet in the Southeast, NorthWestNet for the Pacific Northwest.[16] These regional access networks would be available not only to universities but to participating non-profits and businesses who wanted to communicate with the broad technical network on the Internet, although commercial traffic over the Net was barred by NSF rules.

The Merit consortium would subcontract work for the backbone network to MCI (whose data networking unit had been set up by Vint Cerf, one of the creators of the IP protocol) and IBM. As interest in the Internet grew at the opening of the 1990s, the NSF authorized Merit, MCI and IBM to create a new partnership, Advanced Network & Services, Inc. (ANS), to manage the backbone, and in 1991 it allowed ANS to create a for-profit subsidiary to provide businesses a backbone service unfettered by the NSF ban on commercial users. The idea was that a portion of the funds paid by commercial customers would pay for upgrading the backbone and the regional networks carrying commercial traffic. By 1991, there were 350,000 computers connected to the Internet, across 5,000 separate networks in 33 countries.[17] The number of computers on the Internet would continue to explode exponentially.

At the same time, the NSF allowed a number of other companies, led by UUNET and PSINet, to form the Commercial Internet Exchange (CIX) to compete in connecting business customers to the regional Internet services. Later in the chapter, we will deal with the dangers in this policy that began privatizing the Internet, but the immediate result was a continual expansion of those using the Internet for data networking.

Economic Triumph of the Government & the Pitfalls of Privatization

The first part of this chapter has outlined the technical and social reasons for the triumph of the government-backed Internet. However, the importance of the government was far more than just creating the Internet itself as a public good that the private sector could then exploit. The relationship between government and the private sector was much more integrated, with the government being the direct source of much of the economic activity and even of the companies that became the innovators in the new cybereconomy. Mitch Kapor, the founder of the software company Lotus, has argued, "Encouraged by its successor, the rapidly expanding government/academic National Science Foundation Network (NSFNet), the commercial internet...represents the natural development and expansion of a successful government enterprise."[18]

Conversely, as the government has privatized large parts of the Internet, the question is raised whether a key source of technical and economic innovation in the country is being undermined. As well, the dynamics of private commercial Internet companies show the very specific regional dynamics of economic development that, if not balanced by continual federal government involvement, will just reinforce regional winners in the economy and accentuate the class divide within regions. The next two chapters will illustrate that regional dynamic in the development of the Internet, specifically in regard to the Bay Area. The rest of this chapter will emphasize the overall role the federal government has played in making the Internet-related industry possible and the pitfalls of privatization.

The economic role of the federal government in building the Internet industry has been in three broad areas:

* Training people with the technology skills necessary to create and staff the new markets

* Helping to directly create most of the initial Internet companies, either by providing the technologies and staff, funding startups through government contracts, or as direct spin-offs of government institutions.

* Creating the Internet as the overall framework for innovation and expansion of the commercial sector

Training

Any theory of the economy that assumes that market demand drives innovation faces a major theoretical problem: if responding to demand requires new skills to create the product, what is the process by which those previously unneeded and unused skills in society come under the control of an individual company? Some incremental increase in skills can derive from experience within the firm, and established skills are in some cases passed on through workplace-based apprenticeship programs, but any skills requiring years of training, or previously unheard of skills, must inevitably be derived from sources outside the firm. Without that exogenous source of skilled workers, companies would never have been able to respond to the market opportunities represented by new technologies such as the Internet.

Many of the earliest programmers in the nation came out of the Defense Department's 1950s SAGE program, which was designed to defend against manned bomber attacks. While its military usefulness has been questioned, the program ended up being crucial to training the first generation of computer programmers before almost any academic computer science curricula even existed. Out of the SAGE program was created the Systems Development Corporation (SDC)--a spin-off from RAND to write the SAGE software--which seeded programmers across other companies. To give some sense of the imbalance in private versus public training, in 1954 all the computer manufacturers combined provided just 2,500 student-weeks of programmer training, while three years later SDC alone was providing 10,000 student-weeks of computer training. By 1959 there were 800 SAGE programmers, and by 1963, SDC had 4,300 employees while 6,000 former employees were in industry. Xerox Data Systems is just one early example of a computer company heavily seeded with government-trained SDC alumni.[19]

In the case of the Internet, the creation of firms has been an endless parade of government-trained researchers staffing key new Internet companies, from ARPA's Larry Roberts being hired to set up BBN's TELENET data networking, to Vint Cerf running MCI's data networking project, to Marc Andreessen being hired to build Netscape. As companies have looked to ramp up new industries, government agencies and government-funded university research labs have been a constant source of employees for cutting edge industries.

Internet Businesses That Started Life as Government Contractors or Literally as Part of the Government

Just as most of the initial employees in early commercial Internet companies were federally trained, a tremendous number of the initial Internet companies began life as government contractors or literally as private spin-offs from government entities. This pattern exists for a simple reason: until a technology is fully mature, private industry can rarely afford the years of development required. Technological maturity usually depends on several engineering iterations in which key insights become public, so no single company can recoup its investment on anything but the last iteration of research, the one that finally commercializes the technological insight. Given the social benefit and public ownership of most of these key development stages, no single company can usually recoup the costs of research development.[20]
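To make the logic of this argument concrete, consider a hedged arithmetic sketch (the variables are illustrative, not drawn from any actual firm's accounts). Suppose commercialization requires $k$ successive research iterations, each costing $C$, and that the insights of iterations $1, \dots, k-1$ become public. A firm funding the full sequence pays $kC$, while a latecomer entering at the final iteration pays only $C$. If the private commercial return is $R$, funding the full sequence is rational only when

$$R > kC,$$

whereas the latecomer profits whenever $R > C$. For any return in the range $C < R < kC$, no single firm will fund the early iterations even if the technology is socially worth developing, which is precisely the gap that government research contracts filled.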

And since large, near-monopolistic companies are the only entities other than the government that can come even close to recouping such research costs profitably, government investment and contracts have been the key route for smaller startups to have any chance against larger players. The government has supplied the technology and a guaranteed initial market for many such startups over the years, allowing them to become stable companies able to compete with their bigger rivals. There has been a strong symbiotic relationship between the federal government and entrepreneurial companies seeking to enter the marketplace.

When AT&T and companies like IBM refused to build the envisioned data network back in the 1960s, it was left to ARPA to find a company willing to take on the task. BBN, largely staffed by alumni of government-funded labs at MIT, saw a government contract as a chance to become a bigger player in the data world. It would later use the technological expertise developed on that government contract to launch the private TELENET data network system. Chapter 3 will discuss in more detail how a range of Silicon Valley firms like Netscape and Sun Microsystems developed out of technology and staff funded by agencies like DARPA, converting that last cycle of innovation into commercial success.

What is remarkable is that as the Internet was privatized, all the major companies who would take over running the physical architecture of the Internet would have deep lineages as offspring of government initiatives.

Most obviously, the core ANS backbone built by the Merit university consortium with MCI and IBM was initially funded by the government in 1990, and its for-profit subsidiary thrived in the commercial world as the official backbone of the Internet. In 1994, America Online (AOL) purchased the ANS commercial subsidiary along with its customer base for $35 million. AOL was attempting to move from being a proprietary online service to being a premier Internet service provider and had previously depended on other services for Internet access.[21] With ANS's national network for Internet access, AOL quickly surged past consumer access competitors like CompuServe and Prodigy to over 10 million customers by 1997.

One reason that ANS was sold to America Online in 1994 was that MCI was using its involvement in the ANS project to develop its own national backbone Internet service. Having hired Internet Protocol co-developer Vint Cerf to head its data networking division, MCI would take over much of the traditional university and government Internet service as the NSF ended funding for the NSFNET backbone. Most of the regional service providers set up by the NSF were using MCI by 1994, with almost 40% of Internet traffic being carried over MCI wires.[22] Even with the end of the NSFNET, MCI's relationship with the government would not end. In 1995, the National Science Foundation launched the very high-speed Backbone Network Service (vBNS), known also as Internet II, to create a new-generation IP network connecting the original supercomputing centers around which its original backbone had been built. With a five-year, $50 million contract from the NSF to work on the project, MCI would once again have a key advantage in developing cutting-edge expertise on the government's dime.[23]

Early in the 1990s, two direct government spin-offs had become the most direct competitors with the NSF backbone. As commercial traffic began to appear around the regional non-profit Internet providers created by the NSF, the federal government helped spawn two commercial backbone providers who could connect Internet traffic nationally: PSINet and UUNET. These companies would be the prime instigators of the Commercial Internet Exchange (CIX), created to build a commercial system of Internet traffic exchange to compete with the NSF backbone. Performance Systems International Net (PSINet) was formed by a subset of the officers and directors of the New York area's regional provider, NYSERNET. By 1995, it had Internet access points in 157 US cities and had expanded internationally to Canada, the United Kingdom, Israel and South Korea.[24] UUNET Technologies, for its part, started life as a spin-off from a Department of Defense-funded seismic research facility but became one of the key backbone providers of national Internet services. After a short alliance with Microsoft, UUNET was bought for $2 billion in 1996 by MFS Communications, which in turn was soon purchased by WorldCom as the core of its Internet service in competing in the global telecommunications battle.[25]

Another early and strong competitor in the Internet backbone market was Sprint. Much to its later regret, GTE sold off its 50% share in Sprint for $1.1 billion to its partner United Telecommunications (which took on the Sprint name) in 1991, just as the data networking market was beginning to explode. Sprint had a long history in data networking, unsurprising since GTE had purchased BBN's TELENET service in the 1970s, which became the core of Sprint's original data service. With ARPA-trained talent from TELENET, the SprintNet service made the company a leader in private data networking during the 1980s and positioned it, as Internet competition heated up, to become one of the largest carriers of Internet traffic by the mid-1990s.

Having largely retreated from the data networking scene with its sale of TELENET in the late 1970s, BBN in the early 1990s aggressively reentered the Internet marketplace with its purchase of many of the NSF-created regional service providers across the country. In 1993, BBN purchased NEARnet, the regional Internet provider for its New England home region, which made it the Internet service provider for universities like MIT, Harvard and Yale along with companies and institutions like Massachusetts General Hospital, Raytheon, Polaroid, and Lotus. With its purchase in 1994 of the Bay Area Regional Research Network (BARRNET), which served Stanford, UC-Berkeley, Apple Computer, Hewlett-Packard and NASA Ames Research Center, BBN quickly became the largest provider of local Internet access in the country.[26] With its 1995 acquisition of SURAnet, the regional provider for the Southeast, BBN was the Internet service provider for over 1,000 leading companies and educational, medical and research institutions nationwide. At the time, nearly 50% of all local business Internet traffic was going through BBN, although that number would soon fall.[27]

The Use of the Internet as a Base for Innovation and Expansion of a Dynamic Private Sector

Still, what is most remarkable about the government's role in the Internet is not just its specific investments in technology and training or its synergistic relationships with particular Internet companies, but its creation of the broad framework for commercial innovation. This went far beyond traditional conceptions of industrial policy, which leverage new industries or supply the raw materials of innovation (whether research or people); it fundamentally shaped an electronic marketplace based on innovation around common standards rather than on monopolistic divides between proprietary systems. The Internet is so ubiquitous now that it is hard to conceive of a world where we would need different software and different phone numbers to reach different kinds of services, but that was the world being constructed by private industry. And it would have been an extremely stunted electronic marketplace, purely because of the hassles of integrating different services for the consumer. The advantages of electronic access would have been greatly diminished, which would have held back demand not just for Internet-related goods but for other computer products that benefit from an expansion of an integrated networking world.

Karl Polanyi argued half a century ago that "The road to the free market was opened and kept open by the enormous increase in continuous, centrally organized and controlled interventionism."[28] The reality is that the Internet is no accident, but neither was it a technological inevitability. It was the product of a US federal government, working in association with other nations' experts, that guided its evolution, demanded that its standards be open and in the public domain, and extended its reach broadly enough to overwhelm the proprietary corporate competitors.

It is under such an open system that small companies can create Internet-related software products and know that they will be compatible with other products, given the pervasiveness of the standards. The fate of the companies that were building Microsoft's proprietary network, dropped by Microsoft and left with a useless set of products when Microsoft switched to Internet standards, shows the shadow life of companies that depend on the whims of corporate standards. The open standards of the Internet and the easy distribution of products assure that new companies can at least attempt to take on established players without having the technology itself used as a block against them.

This is critical for a whole range of information-based industries that Stanford economist Brian Arthur has argued are governed by a law of increasing returns on investment rather than the decreasing returns of earlier economic models. The argument (which Arthur submitted as part of a legal brief to the Justice Department against Microsoft's original proprietary system and its incorporation into the Windows 95 operating system) is that because of a range of built-in advantages for early innovators, companies that attain initial control of a market have a massive advantage over latecomers.[29] Because business customers for software demand compatibility with the other products they use, and because they must invest training time to use the initial product, those customers are often reluctant to change products; early entrants to a market therefore often have an overwhelming advantage in holding onto their dominance.[30] By assuring a degree of compatibility among all programs and cutting distribution costs, the Internet militates against the worst monopoly effects of this increasing-returns dynamic.
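Arthur's contrast can be put in a simple illustrative form (a textbook-style sketch of the standard argument, not Arthur's own notation). A software product typically carries a large fixed development cost $F$ and a near-zero marginal cost $c$ per copy, so its average cost

$$AC(q) = \frac{F}{q} + c$$

falls without limit as sales $q$ grow, unlike the rising unit costs assumed in decreasing-returns models. Meanwhile, compatibility and training effects mean a product's value to each user grows with its installed base, so the early leader simultaneously enjoys the lowest costs and the most valuable product, and latecomers face an ever-widening gap.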

The Perils of Privatization

However, as the federal government withdraws from active intervention to assure open standards, competition and investment in new technology, the worry is that larger corporate players could increasingly undermine the dynamism of innovation the Internet has generated.

Falloff in Basic Research

A prime worry is that private industry is eating the seed corn of research investments made over decades without society replenishing the technological commons with new basic research. Spending on long-term basic research is stagnating in both the public and private sectors. With the end of the Cold War, the Pentagon has cut back on research spending, and civilian agencies are not taking up the slack.

Traditional sources of basic research in the private sector are falling prey to increased competition. The 1984 divestiture of AT&T led to a smaller Bell Laboratories, and recent deregulation has led to further reductions in basic research at both Bell Labs and the Baby Bells' research centers. Bell Labs was recently divided between companies spun off from AT&T, and what remains emphasizes research that can be rapidly developed into immediate commercial products. Xerox's PARC laboratory has increasingly followed the same pattern.

As business has grown in the 1990s, corporate America has increased spending on new product research, but basic research has stagnated at $5.5 to $6.5 billion per year throughout the decade, dropping significantly as a percentage of overall revenue.[31] Throughout the computer industry, R&D as a percentage of sales has been driven down by fierce competition from low-price vendors, such as Dell and Gateway, which ride on research conducted by others. This trend is extending to the telecommunications industry. While Microsoft runs a well-funded research center, mainly emphasizing new ways for people to interact with computers, it still devotes nearly 99 percent of its $2 billion research budget to elaborations of existing software or to testing rather than to long-term basic research. Instead of doing basic research in the United States (with the attendant gains for training and spin-offs within the US), US corporations are increasingly monitoring research in other countries rather than attempting the research themselves. In this spirit, American companies have tripled their research and development spending overseas since 1980, the Commerce Department reports.[32]

What remains of US government funding for high technology has been consolidated in the High Performance Computing and Communications Initiative (HPCCI), mandated by legislation passed in 1991. This includes funding for agencies including ARPA, the Department of Energy, NASA, the NSF, the National Institute of Standards and Technology, the EPA, and the National Institutes of Health. The total for the program was budgeted at $1.1 billion in FY 1995, although the "bandwagon" effect of the Internet has encouraged some programs to be funded under the umbrella that have little to do with research on computing and networking. Those overseeing the federal research worry that, whatever the successes of the past, both budgetary cutbacks and a focus on high-end computing speed may be distracting from a more concentrated focus on networking.[33]

The end of the Cold War is threatening to undermine the back-of-the-hand industrial policy that has long been embedded within the Defense Department and associated agencies funded under the rubric of national security. Without that protective ideological gloss, an ongoing reaction threatens to fatally undermine continued federal support for networking standards and for extended access by all members of society.

Many observers date the research and policy retreat by the federal government to the ouster of DARPA director Craig Fields in 1990 under the Bush administration. Bush officials objected to DARPA's "industrial policy" of funding a variety of research projects in conjunction with industry, including high-definition television and other technologies in the intermediate stage between conception and commercial production. Fields's ouster was followed by those of other DARPA officials, such as DARPA's manufacturing head, who testified before a Congressional panel that he was told he focused too much on strengthening the industrial base at the expense of defense. This was all seen as a retreat from federal engagement in moving technology from the laboratory into effective development in the economy.[34]

Withdrawal from Internet Governance and Ensuing Problems

It may have been a symbolic coincidence that 1990 was also the year that DARPA shut down the remaining nodes of the old ARPANET and merged them into the NSF backbone (itself slated to be turned over to the private sector), but the general direction of government withdrawal from active governance of the Internet was clear. Along with beginning to privatize backbone Internet service (a process completed by 1995), the National Science Foundation began turning more and more functions over to private companies as profit-making ventures.

Since the creation of the original ARPANET, management and the allocation of IP addresses had been handled under contract by the Network Information Center at SRI. In 1993, these InterNIC functions were handed over to a for-profit company, Network Solutions, Inc. (NSI), located in Virginia. In 1995, NSI was allowed to begin charging $100 to register any customer on the Internet for a two-year period. With revenues projected at $72.7 million in 1998, there was increasing controversy over a private company profiting from a monopoly, while other companies were buying catchy names and reselling them, often for thousands of dollars, with no compensation to the broader network. Even as debates raged over alternatives such as competitive address registries, the more fundamental issue remained of how to deal with the endless policy issues that kept popping up, from trademark disputes to the security of a system in which multiple databases would be used to manage the network.

With the shutdown of the NSF backbone in 1995, the nineteen regional networks located around the country were largely squeezed out of existence, either bought out by companies like BBN or fading away under the onslaught of the larger corporate players. With the government no longer enforcing the automatic sharing of Internet traffic routed between different parts of the network, the large corporate players were forced to draw up peering agreements to carry each other's traffic, but this often left smaller players out of the game.[35] The very complication of pricing "free" information transit in an unregulated system, where everything depends on individual agreements to share traffic, increasingly overwhelmed the system. By the middle of 1996, private Internet backbones were getting clogged, not so much for lack of capacity as because individual companies saw little incentive to upgrade their own systems to transmit traffic for others. In the absence of either clear property rights over shared information traffic or comprehensive regulation, the Internet was developing classic symptoms of the "Tragedy of the Commons" as people increasingly referred to the "World Wide Wait."[36]
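The underinvestment described here follows the standard public-goods logic; as a hypothetical illustration (the quantities are mine, not drawn from the source), suppose a backbone upgrade costs a single carrier $C$ but its benefit $B$, in the form of faster transit, is spread roughly evenly across $n$ interconnected networks, so each carrier captures only about $B/n$ of the gain. A self-interested carrier upgrades only when

$$\frac{B}{n} > C,$$

so as $n$ grows, upgrades that are socially worthwhile ($B > C$) go unfunded and every network waits for the others to pay: the commons degrades even when no outright capacity shortage exists.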

With the non-profit regional service providers squeezed out, a local shared resource for comprehensive and equitable coordination, one that could bring non-profits, schools and other entities onto the Internet, has been eliminated (an issue we will return to in Chapter 7). Individual schools and government entities have, in most cases, been forced to turn to commercial service providers in an ad hoc way, without the planning that extended the Internet to meet the needs of successive public institutions in its earlier stages. There is fear that the federal abandonment of regulated expansion of capacity is leaving the way open for phone or cable companies to justify expanding capacity for elite customers at the expense of their local ratepayers. At the same time, private Internet backbones are being built to assure speedier information transit for selected customers, threatening the open equality built into the framework of the Internet.

The Threat of Monopoly on the Internet: WorldCom

Even as some sang odes to the anarchy and competition unleashed by the privatization of the Internet, the rise of corporate monopoly in the late 1990s began to threaten a new order in the networked world, one that would undermine the most basic values on which the Internet had been built. The liquidation of smaller companies by larger ones had begun with AOL swallowing ANS and GTE swallowing BBN, with a host of other mergers creating larger and larger corporate players. Up to a point, this was rationalized as a shaking out of competition and the creation of companies large enough to deliver the integrated services customers desired.

But with the bidding war between WorldCom and GTE for ownership of MCI, which WorldCom won, many saw a threat of monopolization that could undo the Internet. WorldCom had already grown into a dominant player in Internet traffic; beyond its purchase of UUNET, it initiated a complicated deal with America Online in 1997 through a $1.2 billion takeover of the CompuServe online service. Not only did WorldCom acquire CompuServe's Network Services data networking unit; in exchange for handing CompuServe's residential customer base and $175 million to America Online, WorldCom acquired ANS from America Online as well.[37]

When MCI Communications agreed to be acquired by WorldCom for $36.5 billion in late 1997, it was merely the most recent in a wave of mergers and consolidations that had occurred in the wake of the Telecommunications Act of 1996. However, by combining UUNET, ANS, CompuServe's network services and MCI's Internet unit along with a number of smaller Internet companies, WorldCom was now bidding to create a company that would control up to 60 percent of all Internet traffic in the United States.[38]

Opponents and even supporters of WorldCom declared that a monopoly was the likely result of the merger. Everyone from competing phone companies like Bell Atlantic to Ralph Nader to unions like the Communications Workers of America worried that this was not merely a case of a big dog threatening others by its size, but that the nature of Internet privatization would give the merged entity a stranglehold on key access points to the Net. As long as the National Science Foundation was exerting strong regulation of the Internet, the temporary dominance of companies like ANS on the backbone or BBN at the local level made little difference, since all parts of the network were required to share traffic without discrimination. For an entity like the Internet, dependent on cooperation as well as competition, the new lack of regulation became an invitation for a company to emerge that would use its size to extract monopoly prices as the cost of cooperation.

Part of the NSF privatization in 1994-95 had been the creation of key "network access points" or NAPs, switching platforms where Internet traffic is transferred between different networks. While not the only interconnection points, the NAPs are the fastest and most efficient spots for transferring data and for independent local ISPs to hook up to the Internet. An assortment of companies had been given contracts to operate specific NAPs, but with the merger WorldCom would control five, including the two dominant NAPs: San Jose, in the heart of the Internet explosion, and Washington, DC, which had become the main connection for traffic going to Europe. Bell Atlantic would file its opposition to the merger, arguing, "As owner of five of the NAPs, WorldCom has the ability to influence the terms by which traffic is shared not only between its network and other networks, but among other networks as well. An ISP cut off from the WorldCom NAPs is in dire straits..."[39]

This threat was not academic. Earlier in 1997, WorldCom's UUNET had announced it was terminating "peering" agreements, the commitment to automatically share electronic mail and Internet traffic, with all networks other than fellow backbone providers like Sprint and MCI. Such peering had been the Internet's underlying strength from its origin, since its very decentralization was based on cooperation in making sure each message got to its ultimate destination regardless of who owned which network. The most prominent challenge to UUNET's decision came from David Holub, president of Whole Earth Networks, which ran the WELL, a venerable electronic service and ISP. He argued vociferously and publicly that since the WELL was already paying UUNET to place its computers physically at the NAP, UUNET should not be able to use that ownership to extract more money for the right to transfer information as well. Holub's public fight so frightened the Whole Earth Networks board of directors that they fired him, marking the new fear in the ISP community of angering the large backbone providers and NAP owners who now dominated the Internet.[40]

Ironically, even as companies like WorldCom were ending peering agreements with smaller ISPs, they were collaborating with those same ISPs to demand that the Federal Communications Commission maintain the essentially free interconnection to the local phone infrastructure that had been granted to Internet carriers when the Internet was an academic and public entity. Even as WorldCom charges local ISPs for access to its infrastructure, it is able to exploit the local phone infrastructure over which most Internet traffic is initially carried while paying almost nothing for its upkeep. (We will return in Chapter 5 to the dynamics of deregulation and the regional politics of local infrastructure in shaping regions in the global economy.)

Adding to criticism of the merger were fears that WorldCom was not only seeking a stranglehold on the Internet but was uninterested in extending its benefits or broader telecommunications services to average consumers. WorldCom Vice Chairman John Sidgmore stated in late 1997 that WorldCom would eventually abandon MCI's plans for expanding residential access to telecommunications services: "Our strategy is not in the consumer business."

With WorldCom's focus on business customers and its grip on key network access points, the earlier vision of cooperative regional networks promoting broad access to the Internet is increasingly giving way to global multinationals using control of regional chokepoints to serve elite customers and their own bottom line, largely at the expense of the average citizens who paid for the Internet's creation in the first place.

The problem is that, in the wake of privatization and the NSF's withdrawal from active governance, competition between Internet companies was already leading to traffic congestion and to disputes raging beyond the control of the informal governance structures left in place. Many conservative commentators who once attacked government telecommunications monopolies have now praised WorldCom as the savior of the Internet through consolidation and rationalization. George Gilder, the right-wing guru of technology, rapturously praised the merger: "Mr. Ebbers [WorldCom's CEO] will be the salvation of the Internet...Like John D. Rockefeller and Michael Milken before him, Mr. Ebbers has shown the magic of entrepreneurial vision and guts...[Only Ebbers can] build a net that can bear the burdens of continual exponential growth."[41] In the absence of a strong government presence to assure basic rules of cooperation and equal access, Gilder is probably correct that monopoly is the only solution, though it is a solution that will produce a very different Internet from the one promoted by its original creators at ARPA.

The Threat of Monopoly on the Internet: Microsoft

While the rise of the Internet has threatened the desktop monopoly Microsoft built in the 1980s, the withdrawal of strong government-backed standards has also made the Internet an opportunity for Microsoft to expand its reach to a degree impossible before. With forecasters expecting between $80 and $160 billion in electronic commerce by the year 2000, the Internet has become the decisive realm of computer competition for the future.[42]

The threat was obvious: with its tradition of open computing standards connecting computers of all kinds, the Internet looked ready to make proprietary operating systems for individual machines an anachronism. As the Internet broke into national consciousness in 1994 and 1995, it appeared that millions of computers were connecting to one another with Microsoft having no say in the matter. The rise of Netscape and a host of other new Internet companies seemed to promise a new era of competition with a whole new cast of companies. The final nail in Microsoft's coffin seemed to be the failure of its proprietary Microsoft Network service in the face of open Internet standards.

The opportunity for Microsoft, however, is that if it can seize control of those Internet standards, it will reinforce its control not only of the desktop but of corporate computing, which is increasingly dependent on the Internet. Combining its traditional tactics of hardball agreements tied to its desktop monopoly, control of the programming tools used by most programmers, and an avalanche of technology and company acquisitions, amounting to more than one major technology per month in 1996 and 1997, Microsoft launched a bid for control of the Internet that was dizzying in its breadth and depth.

The most visible part of the battle for control of the Internet is the so-called "browser war," the fight largely between Microsoft and Netscape Communications over which piece of software computer owners use to surf the Internet. Netscape had dominated browser sales since 1994, but it rapidly lost market share to Microsoft, and as Microsoft's Explorer browser was integrated ever more deeply into the Windows operating system, most analysts expected Microsoft to take over this software market.

But browsers are more than a piece of software: they have a decisive impact on all standards for web design. Browsers are the means for any computer user to "read" information from a World Wide Web server at some distant place. If the dominant browser is designed not to "read" a certain kind of information, a kind of graphics or software effect, for example, then web page designers will be loath to use that kind of information or technology, while they will tend to support software standards compatible with the dominant browser. And if you are a software company like Microsoft selling web servers, web design tools and an array of Internet commerce software, you have an overriding interest in controlling those Web standards.

Unfortunately, the government's withdrawal from strong support for Internet standards had created the opportunity for Microsoft's bid for control. Belatedly, in late 1997, even as the FCC was exploring whether the WorldCom-MCI merger would undermine competition, the Justice Department launched its own investigation of Microsoft.

But as with monopolization of the physical architecture, it is unclear what a blanket call for competition means in an institution like the Internet, where cooperation demands consistency throughout the network. Without active government involvement assuring such consistency, many people will defend Microsoft's monopolistic practices as serving a purpose for customers: in a world of rapidly changing technology, Microsoft's grip on standards and market segments gives consumers some assurance of stability and of interconnection between products.

Why the Feds Withdrew from Standards on the Internet

So why the withdrawal in the first place? The retreat of federal involvement has been based on a combination of ideological opposition, private industry desires, and the disappearance of a stable government bureaucracy able to assume the role of regulator. This is leaving Internet development increasingly in the hands of self-interested companies, with little support for the "information have-nots," who have little chance for a voice in the future shape of the network since its direction will have been largely decided outside any political arena where they have representation.

The ideological assault on federal involvement in the further development of the Internet is strongly related to the end of the Cold War and the withdrawal of the "national security" fig leaf that had covered much industrial planning by the federal government since World War II. It was probably not a coincidence that technology policy promoter Craig Fields was fired from DARPA within months of the fall of the Berlin Wall. While the Clinton Administration made some gestures toward asserting a public interest in the development of what it called the National Information Infrastructure (NII), privatization proceeded apace. What limited funds the Clinton Administration allocated for encouraging community and local government development of the Internet were vociferously opposed by conservative Republicans; with the Republican takeover of Congress in 1994, those funds were initially zeroed out and in the end sharply limited, even as local need for them exploded with the expansion of the Net. The Telecommunications Act of 1996 further dismantled regulations that might have been used to require lifeline Internet service or other features of local utility structures, structures that will soon be a thing of the past as cable, telephone and Internet service providers cannibalize each other's markets in targeting up-market consumers.

Private industry had benefited significantly from government spending on the Internet in the period when it was not commercially viable and the government was the main market for Internet-related computer services. However, as a private market for Internet services appeared around the structure of the Internet, private industry came to see a vibrant public sector as a threat to its control of information markets. Companies that had started life as extensions of the government saw the opportunity for independence and extremely high profits as the government's role receded. As Peter Evans notes in his work, the success of government intervention in nurturing new economic sectors is often rewarded by the creation of a private-sector interest in blocking further government action.

Similarly, the success of the private sector helped fragment and undermine the ability of the "Weberian elite" to work successfully in the public interest. Partly this is due to the outside hostility of ideological opponents and of business, which wants to curtail the power of the public sector as the private sector succeeds commercially. With Defense involvement in high technology under assault and Republicans trying to abolish the Commerce Department, where most of the NII programs have been coordinated in the Clinton administration, there has been little chance for a long-term view by public servants watching their political backs.[43] Just as important, though, was the way private opportunities for profit turned many of the engineers and ARPA employees from public service into representatives of the private companies now pushing to limit the federal role. From Bob Metcalfe, who became rich by founding 3Com, to Vint Cerf, who has become a major spokesperson for MCI, the founders of the ARPANET who initially cultivated the ethic of freely sharing information and software are now fighting for profit share and private ownership of intellectual property.

Regional Production Districts as an Alternative to Monopoly?

If there is any alternative to monopoly in the wake of the privatization of the Internet, it is the vibrant cooperation of Silicon Valley's Internet firms. If opportunism and monopolization are the rule of global competition in the absence of strong government intervention, open standards and non-opportunistic competition seem to have more life embedded in the institutions and relationships of a specific region like Northern California. It is clear that while national government planning and intervention were the driving force in the creation of the Internet, its expansion as a commercial enterprise was supported and defended against proprietary attacks by the range of Silicon Valley firms committed to its open standards.

In turn, the economic and technological strength of Silicon Valley has deep roots in both government spending and technical engagement over the years. Chapter 3 will outline in detail the ways federal involvement built the region's general technological strength and, in particular, how regional firms and the Internet expanded in a synergistic relationship between government intervention and private sector entrepreneurial activity. This synergy had profound effects not only on the region but on the global shape of the economy. In that sense, government support for such regional economic networks can be seen as one more form of pervasive government intervention that deeply shapes the economy.

The question becomes whether that regional cooperative model, built over decades with government support, can survive the privatization of the Internet. Chapter 4 will explore the regional efforts to revitalize that cooperative model in the new era to support the Internet against proprietary alternatives. These efforts will be compared to the broader national experience in using such regional business-to-business consortia as an engine for economic and technological growth. A crucial issue is whether, even if businesses in a region benefit from such cooperative regional models, those models have any relationship to broader sectors of the regional economy in promoting jobs and economic equity. If this regional community benefits only an economic elite, it is a rhetorically shrunken conception of community that, in the absence of the original government intervention that expanded economic opportunity, may be little better for the majority of the population than the monopoly alternative.

------------------

Endnotes for Chapter 2

[1] "Survey of the Internet: The Accidental Superhighway." Economist. July 1st, 1995, p.4

[2] "What that TCI-Bell Atlantic merger means for you." Fortune v128, n12 (Nov 15, 1993):82-90.

[3] Messmer, Ellen. "Industry heavyweights team up to design information superhighway." Network World v10, n51 (Dec 20, 1993):20.

[4] See chapters 3,4,5 in Chandler, Alfred Jr. The Visible Hand: The Managerial Revolution in American Business. Cambridge, MA. Belknap Press of Harvard University Press. 1977.

[5] Levy, Steven. "How the Propeller Heads Stole the Electronic Future." New York Times Magazine, September 24, 1995, p. 59.

[6] Rheingold, Howard. Tools for Thought: The People and Ideas Behind the Next Computer Revolution. Simon & Schuster. New York. 1985.

[7] The history in this chapter derives from a wide range of sources detailed in the bibliography, but an invaluable source is: Hafner, Katie and Matthew Lyon. Where Wizards Stay Up Late: The Origins of the Internet. Simon & Schuster, New York. 1996, along with Rheingold, Howard. Tools for Thought: The People and Ideas Behind the Next Computer Revolution. Simon & Schuster. New York. 1985. Other sources used include: Levy, Steven. Hackers: Heroes of the Computer Revolution. Anchor Press. Garden City, New York. 1984; Cerf, Vinton G. "Computer Networking: Global Infrastructure for the 21st Century." Copyright 1995 by Vinton G. Cerf and the Computing Research Association. On the Internet at http://cra.org/research.impact; Hardy, Henry Edward. The History of the Net. Master's Thesis. School of Communications, Grand Valley State University. September 28, 1993; Zakon, Robert Hobbes'. "Hobbes' Internet Timeline v2.5." http://info.isoc.org/guest/zakon/Internet/History/HIT.html. 1993-6.

[8] Also see Cerf, Vinton, as told to Bernard Aboba. "How the Internet Came to Be." in "The Online User's Encyclopedia," by Bernard Aboba, Addison-Wesley, November 1993; Sterling, Bruce. "Short History of the Internet" The Magazine Of Fantasy And Science Fiction, February 1993.

[9] Evans, Peter. Embedded Autonomy: States and Industrial Transformation (Princeton: Princeton University Press, 1995), p. 49.

[10] There is a more extensive history of this evolution of professional governance of the Internet in: Kahn, Robert E. "The role of government in the evolution of the Internet." Communications of the ACM v37, n8 (Aug 1994):15-19.

[11] Hafner, Katie and Matthew Lyon. Where Wizards Stay Up Late: The Origins of the Internet. Simon & Schuster, New York. 1996, p. 248.

[12] Hafner and Lyon, Ibid.

[13] Press, Larry. In unpublished manuscript "Seeding Networks: the Federal Role", to be published in CACM. lpress@isi.edu.

[14] See Baker, Steven. "The evolving Internet backbone (history of the Internet computer network)." UNIX Review v11, n9 (Sept, 1993):15 for more information.

[15] Gottlieb, Joseph. "Education: The Right Formula." Network World v5, n22 (May 30, 1988):28-30.

[16] See Kahn, Robert E. "The role of government in the evolution of the Internet." Communications of the ACM v37, n8 (Aug 1994):15-19 for other information on this later development of the Net.

[17] Anthes, Gary H. "Commercial Users Move onto Internet." Computerworld v25, n47 (Nov 25, 1991):50.

[18] Schrader, William and Kapor, Mitchell. "The significance and impact of the commercial Internet." (Global InterNet: Global Internetworking Strategies and Applications special section) Telecommunications v26, n2 (Feb, 1992): S17 (2 pages).

[19] Press, Larry. In unpublished manuscript "Seeding Networks: the Federal Role", to be published in CACM. lpress@isi.edu

[20] See US Government. Evolving the High Performance Computing and Communications Initiative to Support the Nation's Information Infrastructure for a broad discussion of the economic benefits derived by private industry from the existence of government research.

[21] Messmer, Ellen. "America Online buys Internet service provider." Network World v11, n49 (Dec 5, 1994):4.

[22] O'Shea, Dan. "MCI focuses on the Internet." Telephony v227, n22 (Nov 28, 1994):6.

[23] Bernier, Paula. "The science of high-speed computing." Telephony v228, n18 (May 1, 1995):7,15.

[24] Raynovich, R. Scott. "LAN Times talks to PSINet's William Schrader about Internet connectivity and the competition." LAN Times v12, n22 (Oct 23, 1995):8 and Kahn, Robert E. "The role of government in the evolution of the Internet." Communications of the ACM v37, n8 (Aug 1994):15-19.

[25] Balderston, Jim. "GTE, Uunet merge Internet services." InfoWorld v18, n29 (Jul 15, 1996):12.

[26] "BBN To Acquire Barrnet From Stanford." Cambridge Telecom Report. Jun 27, 1994.

[27] "BBN Planet Launches Internet Services Nationwide." Cambridge Work-Group Computing Report. March 20, 1995.

[28] Polanyi, Karl. The Great Transformation: The Political and Economic Origins of Our Time. Boston. Beacon Press. 1944, p. 141.

[29] Arthur, W. Brian. "Increasing Returns and the Two Worlds of Business." Harvard Business Review. July-Aug 1996.

[30] "A World Gone Soft: A Survey of the Software Inudstry." Economist, May 25, 1996.

[31] Uchitelle, Louis. "Companies Spending More on Research and Development." The New York Times. November 7, 1997.

[32] See Louis Uchitelle, "Corporate Outlays for Basic Research Cut Back Significantly." New York Times. October 8, 1996 and US Government. Evolving the High Performance Computing and Communications Initiative to Support the Nation's Information Infrastructure.

[33] US Government. Evolving the High Performance Computing and Communications Initiative to Support the Nation's Information Infrastructure.

[34] See "DARPA official hits policy." Chilton's Electronic News v37, n1871 (July 29, 1991):4, Anthes, Gary H. "Feds ax high-tech point man: DARPA move seen as slap at industry funding. " Computerworld v24, n18 (Apri l 30, 1990):1 (2 pages), Robinson, Brian. "Bush, DOD attacked for firing Fields." Electronic Engineering Times, n588 (April 30, 1990):1 (2 pages), Baker, Stan. "Industry wonders: what's next?" Electronic Engineering Times, n588 (Apr il 30, 1990):84, Robertson, Jack."Transfer of DARPA head ignites furor." Electronic News v36, n1807 (April 30, 1990):1, and Robinson, Brian. "DARPA: we'll risk it." Electronic Engineering Times, n534 (April 17, 1989):19.

[35] Armstrong, Lisa. "A new tune for the Internet pipers?" Communications International v21, n12 (Dec 1994):8.

[36] Murphy, Jamie and Charlie Hofhacker. "Explosive Growth Clogs Internet Backbone." New York Times. July 29-30, 1996 and Murphy, Jamie and Brian Massey. "No Shortage of Bottlenecks on Information Superhighway." New York Times. July 29-30, 1996.

[37] Pappalardo, Denise. "WorldCom adds to its 'Net riches." Network World v14, n37 (Sep 15, 1997):6.

[38] Kirsner, Scott. "WorldCom/MCI: Good Giant or Bad?" Wired News. Oct 2, 1997. Schiesel, Seth. "MCI Accepts $36.5 Billion Worldcom Offer." New York Times. November 11, 1997.

[39] Bell Atlantic. Filing at the FCC in response to the WorldCom-MCI merger. Jan. 5, 1998.

[40] Leonard, Andrew. "Gobbling up the Net" Salon Magazine. June 12, 1997.

[41] Rosenberg, Scott. "WorldCom buys up the Internet. The Net becomes WorldCom's fiefdom." Salon Magazine. Oct. 9, 1997.

[42] "In search of the perfect market." The Economist. May 10, 1997.

[43] See Hellerstein, Judith. "The NTIA needs to rethink its role in the new telecommunications environment." Telecommunications (Americas Edition) v30, n8 (Aug 1996):22 for the trials of the NTIA agency in the Clinton administration.