Chapter 3: Federal Spending and the Regionalization of Technology Development

Chapter 3:
Federal Spending and the Regionalization of Technology Development

What created the Internet and why is Northern California at the center of its development? Nothing to do with the government, at least according to Wired, the hip techno-monthly that promotes the Net as the embodiment of a new paradigm in human development, unshackled by government, scarcity or even geography. Monthly covers feature a parade of cyber-moguls from hip anti-establishment outfits like Citicorp and TCI Cable, self-made men riding the wave of cyberspace on nothing other than pluck and a vision of a libertarian world of freedom.

There is an almost charming Gatsbyesque quality to this denial of the past in plotting the brave new unregulated future of cyberspace. Less charming to many is the fact that so many people not only buy the magazine but the ideology it reflects in so many high-tech firms. Richard Barbrook and Andy Cameron of Britain's Hypermedia Research Center have labeled this viewpoint "The California Ideology" in a widely circulated article that notes the convergence of Gingrich social cutbacks with an ideology that forgets the defense-funded past of the Silicon Valley region. This ideology "promiscuously combines the free-wheeling spirit of the hippies and the entrepreneurial zeal of the yuppies" while ignoring the increasing inequality fueled by information technology.[1]

Responding to the article, Louis Rossetto, Editor & Publisher at Wired, denounced the "laughable Marxist/Fabian kneejerk" ideology that overlooked the way Europe's "statist" support for technology contrasted with the way the US had developed its technology based not on government spending but on capital markets and free market development. "In point of fact," Rossetto argues, "it was the cutback in American defense spending following the Vietnam War and the subsequent firing of thousands of California engineers which resulted in the creation of Silicon Valley and the personal computer revolution."[2]

The levels of intellectual repression embedded in this sentence are astounding--from the myth that companies like Apple Computer emerged from the garages of their designers like Athena from Zeus's head to the assumption that the concentration of engineering talent in the Bay Area was a fact of nature much like the pleasant climate. And it represses the fact that the very Internet boom that is Wired's raison d'être, like the semiconductor technology that runs its hardware, is inescapably a product of government support in the United States.

At the most obvious level, government spending was the engine of economic and technological growth in the region, from the federal contracts that built the transcontinental railroad to the defense contracts that spawned the first wave of electronics companies during and after World War II to the spending on the Internet funneled through key Bay Area institutions like Stanford, UC-Berkeley and Xerox PARC. For decades in Silicon Valley, winning a defense contract was often the only way a startup company could enter technology markets against already established firms.

As critical, though, was the role of the government in fostering the long-term business and social networks that made the region's technological innovation possible. Political scientist Robert Putnam has noted that the social capital that fuels regional economic growth is not the result of short-term events but of persistent patterns that evolve over long periods and embed themselves within the fabric of a region. It was a synergy between the federal government and local actors, aided by periodic infusions of federal cash, that slowly evolved the interconnected, cooperative model of technological development that became Silicon Valley's hallmark.

Nowhere is this clearer than in the way the region benefited from, and in turn contributed to, the federal government's shaping of the open standards of the Internet, standards that in turn reinforced the region's dominance of the new Internet-related businesses. As was detailed in the last chapter, open standards allow dramatically expanded technological possibilities and economic growth, but the benefits of such standards-driven growth tend to accrue to regionally concentrated firms where collaboration can fully exploit the opportunities opened by open standards. Competing proprietary standards controlled by corporate titans can be directed from geographically scattered enclaves (such as Microsoft's Redmond or IBM's Armonk headquarters), but the Internet as an open system backed by the federal government had the seemingly paradoxical effect of reinforcing the importance of a handful of specific regions, especially cementing Silicon Valley in a lead role.

This chapter will outline how these three government-driven factors - spending, promotion of social networks, and standards - were crucial in creating the Silicon Valley that has become in turn so crucial to the birth and expansion of the Internet. It starts with a short history of the government's role in the emergence of Silicon Valley, then examines that history in the context of debates over regional economic development. That early technological development would in turn help shape a network of firms and collaborative behavior that would attract new waves of public investment and private commercial development fueled by information networking, culminating in the explosion of Internet-related businesses this decade. While internal regional factors were not irrelevant to these results, the overall history makes clear how much regional development depends on national and global decisions outside the control of local actors.

Early History of the Valley: Building the Social Capital of Technological Innovation

If there is any clear idiosyncratic indigenous reason for the rise of Silicon Valley it is indisputably the founding in 1885 of Stanford University, the font of both technology and engineers who would fuel much of the technological hyperactivity of the region. Yet Stanford's origin, though it is a private university, is hard to separate from the government. Its founder and benefactor Leland Stanford was not only a US Senator and Governor of California but also one of the "Big Four" who built the Central Pacific Railroad and its successor the Southern Pacific, itself the economic power behind the throne of state government for decades until the Progressive movement of Hiram Johnson broke its hold in the early part of this century. And those railroad fortunes were made by the massive federal subsidies paid for the Western leg of the transcontinental railroad. In fact, Leland Stanford himself drove in the golden spike at Promontory Summit, Utah that completed the transcontinental link. Out of the profits of that venture, Leland Stanford was able to endow Stanford with $20 million, one of the largest philanthropic gifts of his day, along with a 7,200 acre farm near the landmark redwood of El Palo Alto.[3]

In this way, the federal government of a century ago, through its first economic linkage of the continent, indirectly invested in the founding of an institution that would in turn help electronically link the nation and world a century later. And from the beginning, Stanford University had the practical orientation to science and technology that would be its hallmark. Early on, the Bay Area was a hotbed of radio experimentation and, with the earthquake of 1906, a number of firms in the budding industry were encouraged to move south down the peninsula towards Palo Alto and the science center of Stanford. In 1909, a former Stanford student, Cy Elwell, traveled to Copenhagen to ask for the rights to manufacture the Poulsen arc transmitter, a continuous-wave radio technology. On his return, Stanford President David Starr Jordan was impressed enough with the idea that he invested $500 of his own money, and other faculty and local businessmen followed in supporting the founding of the Federal Telegraph Company. Federal would go on to hire a succession of Stanford graduates for years until the company moved East in 1932 (to eventually be absorbed into ITT). 1909 was also the year that San Jose established the first commercial radio station in the nation.

In the meantime, the company would serve as a focus for the emerging radio and audio industry of the region. In 1911, the firm hired Lee de Forest, the man who had patented the triode vacuum tube along with 300 other patents in his lifetime, and who had come west after being fleeced by his own company's board of directors. He stayed at Federal for only a few years but helped build the reputation of the region's companies. The sinking of the Titanic led to the passage of the Radio Act of 1912 requiring radios on all commercial passenger ships and, combined with military demands in World War I, the region's radio industry expanded rapidly.[4] Not for the last time, the mandating of technological standards by the federal government would benefit the region.

A second generation of electronic pioneers was being born in this period. Frederick Terman moved to Stanford in 1910 with his father, Professor Lewis Terman (famous for developing the Stanford-Binet IQ test), and Frederick soon joined the ranks of ham radio hobbyists in the area. He received a BA from Stanford and a doctorate from MIT. He had meant to pursue a career on the East Coast, but on a trip home he came down with tuberculosis and, starting with a part-time teaching load, ended up as a professor at Stanford himself. In 1924, he was made head of a new "radio communications laboratory" which would become a center for technological innovation on the West Coast. In 1932, Terman wrote his classic text Radio Engineering and by 1937 he was a full professor in charge of the Department of Electrical Engineering.

Upset about graduates moving back east for jobs in the 1930s, Terman encouraged a number of his students to start up companies locally. Two of them, William Hewlett and David Packard, had graduated in 1934 and moved back east. Four years later, Terman recruited them back to the region with fellowships to complete electrical engineering degrees. As a master's thesis, Hewlett created an audio oscillator, a device to produce controlled electrical signals at whatever frequency was desired. Terman personally lent the two $538 and arranged a $1000 bank loan to get them off the ground with their own company. With Packard's wife baking the paint onto the first panels in her kitchen oven, Hewlett-Packard was born as a company.

World War II and its Aftermath Fuels the Explosion of Bay Area High-Tech Firms

But as with so many firms to follow, it was the massive military spending of World War II and the post-war period that launched the company into the stratosphere of growth. Hewlett was called to active service in 1942, ending up in the development division of the War Department's special staff, where he made key contacts for the postwar period. At home, Packard started filling government orders that by the end of the war had Hewlett-Packard earning $2 million per year with 200 employees.

Another regional company that exploded in the war was Litton Engineering Laboratories, which Charles Litton, a former employee of Federal Telegraph, had started when Federal moved East in 1932. Manufacturing glass vacuum tubes, Litton found himself running a million-dollar company by the end of the war as radar expanded the demand for his products. After the war, Litton incorporated and as a military contractor would grow to be a billion-dollar company by the 1960s.

The other key reputation made in the war belonged to the Varian brothers, Sigurd and Russell, who, working in the Stanford physics lab run by their collaborator Bill Hansen, developed a lightweight radar tube called the klystron to help Allied aircraft fly in the dark. (Stanford would receive $2 million in royalties for its support of the brothers.) After working at Sperry Gyroscope during the war, the brothers returned to the area and founded Varian Associates in 1948, which became a powerhouse in the electronic instruments business.[5]

Overall, the federal government pumped $35 billion into California between 1941 and 1945 and the state has continued to receive 30 to 40 percent of all Department of Defense R&D contracts since then. This spending built up not only electronics but created a broad-based manufacturing economy in the Bay Area. Of $5 billion in ships built during the war, $3 billion worth were built in the Bay Area alone.[6] The geopolitical need to take on Japan in the Pacific led the federal government to create, almost overnight, a radically expanded manufacturing base in California that would fuel growth for decades and create a ready local market for the electronic products of the emerging Silicon Valley.

During the war Frederick Terman had been recruited by Vannevar Bush to direct a massive research team in the East dedicated to jamming enemy radar. Recognizing a coming post-war wave of federal funding, Terman returned to Stanford after the war determined to make sure it and other California institutions got their full share of defense contracts. In writing guidelines for sponsored university research, he argued, "The war...brought to the West the beginnings of a great new era of industrialization. If Western industries and Western industrialists are to serve their own enlightened and long-range interests effectively, they must cooperate with western universities and, wherever possible, strengthen them by financial and other assistance."[7] Berkeley was also emerging in this period as a science powerhouse, riding partially on the reputation of Berkeley physicist Ernest Lawrence, who had won the Nobel Prize in 1939 for his development of the cyclotron, a way to accelerate streams of protons to break open the nuclei of targeted atoms.

But Terman did not stop at attracting research money; in collaboration with his contacts in the federal government, he developed a homegrown industrial policy to economically link the University to new high-tech endeavors. He began promoting the idea of using the extensive University land, which the original bequest barred from being sold, as an industrial park for high-technology firms. Varian Associates, where Terman served as a board member, became the first lessee in 1951 when it agreed to a $16,000 per year lease for ninety-nine years on four acres--what would become one of the sweetest land deals in the Valley as prices rose in the region. When Hewlett-Packard set up headquarters in the Park in 1954, it became the nucleus of the emerging Silicon Valley. By 1955, seven companies were in the Park, thirty-two by 1960, and seventy in 1970. By the 1980s, all 655 park acres were leased, with ninety tenant firms generating $6 million per year for the university. This was important not just because it linked Stanford to emerging high-tech firms (over 60% of Stanford's engineering faculty would consult with government or business over the years); the unrestricted income from the leases could also be used by Terman for his ambitious faculty recruitment goals to raise Stanford to the forefront of academia.[8] The cluster of firms around Stanford would become a business community whose links to one another would become increasingly important over the years as technological collaboration grew more and more central.

The other critical institution in the area was the Ames Research Center, established in Mountain View in 1940. It would later become part of NASA, and its presence in Northern California encouraged Lockheed to establish its Lockheed Missiles and Space Company subsidiary in the Valley in 1956 in order to be close to Ames. When Lockheed set up its new missile and space operations in Sunnyvale in 1957, Stanford agreed to supply faculty members to train workers in return for Lockheed helping build Stanford's aeronautical engineering department.[9] Lockheed would quickly become the region's largest employer. Added to this was state government support for the rising engineering talent of San Jose State University. The combination of Ames, San Jose State, Hewlett-Packard, Stanford, and now Lockheed became an irresistible force in attracting new research centers to the region. Throughout the 50s, the electronics giants began opening divisions in the South Bay: General Electric brought its nuclear division to San Jose, Westinghouse brought its heavy equipment division to Sunnyvale, Sylvania built a plant in Mountain View, and IBM opened a research lab in downtown San Jose to design its first computer disk memory system.[10]

Silicon Comes to the Valley: Semiconductors and Defense-Driven Entrepreneurs

It was the end of the 1950s that would create the semiconductor industry that gave the region its enduring public fame. It was at Bell Labs in late 1947 that a team of researchers led by William Shockley invented the first transistor to replace the large, bulky vacuum tube in electronic devices. The federal government immediately gave Bell Labs a contract to expedite development of the transistor. By 1950, when Bell Labs had figured out how to produce the transistor more commercially, AT&T was feeling the heat from an anti-trust suit filed in 1949 (which would ultimately lead to AT&T in 1956 being barred from selling transistors and computers in the open market). In response, AT&T in the early 50s widely licensed the technology and gave detailed seminars on how to manufacture the new transistors.[11]

In this way, the federal government not only funded the development of the technology but also assured that it would be widely available in the public domain. Combined with defense funding channeling money to regions far from Bell Labs headquarters, the new semiconductor technology would facilitate high-tech booms in a number of areas far from the traditional manufacturing centers of the East and Midwest. One of the first companies to take advantage of this new technology was Texas Instruments, which grabbed a top Bell scientist and in 1954 developed the first silicon version of the transistor for sale to the military. By concentrating on supplying the military, Texas Instruments quickly became the largest merchant supplier of semiconductor devices, although because of the open licensing from Bell Labs, over twenty-six companies were competing with it by 1956. With the military funding its own experiments in miniaturizing transistors, it was clear that a rich prize of military contracts awaited the firms that could deliver a cheap, small way to integrate transistors into a single circuit.

For the future of semiconductors in Silicon Valley, the key event was the return in 1955 of William Shockley to his boyhood home of Palo Alto, where his father had taught mining engineering at Stanford. With his return he would attract much of the initial talent that would set in motion a whole industrial explosion in semiconductor production in the region.

Shockley's effect on the region was truly a surprise, especially for the physicist himself, who expected to become a millionaire but instead saw the entire leadership he had recruited quit and form their own company. Shockley was described as a "genius, but a real prick" to work for, so eight of his top employees left to form Fairchild Semiconductor with $1.5 million in backing from Fairchild Camera and Instrument Corp. of New York. In 1959, Fairchild was given the contract to supply transistors for the Minuteman I missile program's guidance system. Robert Noyce took the helm of the new enterprise, and it was his invention of the integrated circuit that same year (Jack Kilby of TI had independently created a version, and the two companies would eventually cross-license the patents) that would make Fairchild's fortune.[12] Having paid premium prices for transistors throughout the 1950s, funding their evolution into a mass production item, the military now began buying integrated circuits in a volume that would make the industry boom. Considering that IBM refused to use integrated circuits in its System/360 series of computers introduced in 1964, the military support was critical for the new industry. In a Rand study on the effect of the military on the semiconductor industry, Anna Slomovic has written, "Two government procurement decisions were responsible for moving integrated circuits into large-scale production. In 1962 NASA announced that its prototype Apollo guidance computer would use integrated circuits. Shortly thereafter, the Air Force announced the use of integrated circuits in the Minuteman II guidance package."[13] With the military accounting for nearly 100% of integrated circuit purchases in 1962, its initial support was crucial in sustaining the iterations of development that would expand the market for commercial applications throughout the next decade.

In the 1960s, the federal government would indirectly push integrated circuit technology into the consumer market when it began requiring televisions to be able to receive UHF channels, channels that traditional vacuum tube tuners often failed to receive reliably. Fairchild built a prototype television using solid state circuits that helped convince manufacturers to buy its integrated circuits. Much as the legislation requiring radios on ships after the Titanic had spurred the region's early radio industry, so too did this new standards mandate by the federal government help expand commercial sales of the Valley's semiconductor products tied to solid state technology.

As the 60s progressed, Fairchild's employees would begin leaving to start up their own companies in the hot emerging market for semiconductors. Partly because of the corporate parent's refusal to spread stock profits among the employees and partly because of the immense profits to be made running their own companies, even the eight founders would flee Fairchild. By the late 1960s the energy of Fairchild was gone, replaced by a slew of new firms, including National Semiconductor, Advanced Micro Devices and Intel, the last led by Noyce himself. The Defense Department contributed to this splintering of Fairchild with its policy of requiring a second source for all computer chips; with second-source contracts to be had, technological startups had ready markets once they secured a bit of financial backing. With Intel's invention in 1972 of the 8008 microprocessor--a device combining all the circuits needed for computer processing on a single chip--the first computer product was created for which the military would not be the first major customer. Its successor, the 8080, produced in 1974, would become the basis for the rise of the home computer industry in the Valley, while the density of engineering talent and electronics production would become an almost irresistible magnet for a whole range of computer startups in the next two decades.

As will be detailed, the key concepts and technologies driving the commercial personal computer industry - from basic design to the computer mouse to icon-based operating systems - were the products of government-associated labs at places like Stanford or Xerox PARC. Another key part of government support in the region was the continued expansion of education and training of local talent to staff these new ventures. While Stanford and Berkeley get most of the press attention, as important was the evolving state and community college system in California (itself enjoying the benefits of expanded federal tuition support for students). By the 1970s, San Jose State University was training as many engineers as either Stanford or Berkeley, and six surrounding community colleges offered technical programs among the best in the nation. Community colleges, particularly, would work with local businesses by contracting to teach private courses for their employees, even holding courses at company plants so employees could attend after hours.[14] These evolving education-to-business links further reinforced the economic networks that were so crucial to collaborative innovation in the region.

Coincidentally, just as the computer industry found its first mass commercial market after decades of dependence on government R&D and purchases, ARPA was expanding government support for computer networking and the Internet that would launch the next computer revolution two decades later.

Augmentation vs. Automation: The Internet's Origins in Silicon Valley

If the earlier waves of technology in the region were driven by the geopolitics of defense, the networking wave of computing that eventually exploded in the 1990s also derived partly from defense support, but it sprang from the most visionary reaches of ARPA rather than from direct weapons procurement. The defense and semiconductor industries already in the Bay Area made it a magnet for both the funds and the top talent unleashed by ARPA; in turn, the open standards fostered and guided by the federal government would spur the growth of companies in the region that would be positioned to dominate the global Internet-related industry when ARPA's work came to fruition decades later.

One of the earliest threads in the origin of the Internet dates back to an Atlantic Monthly article published at the close of World War II by Vannevar Bush, the prominent MIT researcher who defined much of the vision for government support of science in the post-war period. In the article, he posed the problem that many scientists were unable to keep up with all the knowledge they needed and proposed a microfilm-based machine he called the "memex" to enhance the powers of human memory and association. One of his most important readers turned out to be Doug Engelbart, a young sailor who read it as World War II came to an end in 1945. Moving beyond microfilm, he became fixated on the idea of augmenting human capacities through the use of computers to better link people both to stored ideas and to other people working together in groups. He contrasted his vision of using computers to "augment" human knowledge with the more mainstream focus on "automation" of individual tasks.[15]

Coming out of the Navy, Engelbart would work at Ames Research Center from the late 1940s until 1951, when he would leave for a doctorate at UC-Berkeley. He eventually ended up in 1957 at Stanford Research Institute, a creation of Frederick Terman's dedicated to defense-related research and assistance to West Coast businesses. There he would receive a small initial grant from the US Air Force Office of Scientific Research to pursue what Engelbart called an "augmentation laboratory" to explore how people and computers could share knowledge. He would publish his ideas for using computers not just for the traditional computation and data processing for which they were then used but as a tool that could act as an adjunct to individual creativity and as a link between human beings. While broadly ignored, this work paralleled the ideas of J.C.R. Licklider's "Man-Computer Symbiosis" article, which had led to Licklider being recruited to ARPA. In 1963, Bob Taylor, then at NASA, pushed through new funding for Engelbart's project and in 1964, having moved over to ARPA, he was able to equip the proposed laboratory with a million-dollar timesharing computer plus half a million dollars for staff and operations. Out of the resulting Augmentation Research Center (ARC) would come an array of researchers who would go on to lead their own research teams at universities and commercial R&D divisions in the region and across the country.[16]

Considered an odd duck by his SRI colleagues, Engelbart was suddenly running what could arguably be called one of the three most important computer labs in the country--the others being MIT, where explorations of timesharing were already spawning revolutions in the minicomputer industry, and Stanford's Artificial Intelligence Lab (SAIL). Engelbart was building on the tradition of interactive computing funded at MIT largely by a $3 million per year ARPA grant focused on time-sharing called Project MAC. It was MIT hackers at Project MAC who largely designed DEC's breakthrough PDP-6 timesharing minicomputer and would spend endless hours making it perform tasks beyond the expectation of any of its creators.[17] Probably the most important precedent for Engelbart was Ivan Sutherland's SKETCHPAD program, which created the first graphic manipulation of computer images, allowing users to resize and manipulate graphics. And it was Ivan Sutherland who had been hired to run ARPA's Information Processing Techniques Office after J.C.R. Licklider and who brought in Bob Taylor, the man who would take charge of that office and from there develop the ARPAnet project.

Engelbart worked from Sutherland's precedent to concentrate on the manipulation of text and ideas. ARC would combine hardware design, "building the tools," with an "intellectual workshop" to create the breakthrough software needed for Engelbart's vision. Working with seventeen colleagues and going through three rapid cycles of hardware development, by 1968 he was ready to publicly demonstrate the results at the Fall Joint Computer Conference, an ACM/IEEE-sponsored engineering meeting in San Francisco. And the results stunned the audience.

Hooked up by microwave communication to the computers back at SRI, Engelbart would demonstrate the array of tools developed at ARC: the first "mouse" used as an input device, a windowing environment that could rapidly switch between a menu of information sources and models of information, word processing on screen. None of these had ever been seen before and, in an age when most programmers were still interacting with computers through punch cards, the idea of word processing was a revelation.[18] What was demonstrated was only the showiest example of a set of tools developed to facilitate communication and shared information-based work among intellectual collaborators. ARC was already using text-editing to share common data through hypertext storage (the method of linked pages used in the World Wide Web) and ran an electronic mail communication system with dedicated e-mail distribution lists among the researchers--all of this years before these innovations would come to the ARPAnet. ARC would also pioneer video-conferencing years before it was developed commercially.

It was in 1968 as well that Bob Taylor, now in charge of the ARPA office developing the Internet, would publish with J.C.R. Licklider an article entitled "The Computer as a Communication Device," which outlined a vision of ordinary people using new tools to share information, arguing that "When minds interact, new ideas emerge." They would cite ARC as the prototype of their vision.[19] And it was in 1968 that Bob Taylor asked Engelbart to make ARC the Network Information Center (NIC) for the ARPAnet. Engelbart saw the ARPAnet as the perfect vehicle for extending his vision of distributed collaboration, so in 1969, SRI would become the second computer on the ARPAnet. As the NIC, Engelbart would help identify and organize electronic resources on the Internet for easy retrieval. Until 1992 (when the NIC functions were awarded to other companies), the NIC at SRI would administer the assignment of IP network addresses and domain names for all servers, essentially creating the yellow pages of the Internet.[20]
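To give a concrete, if anachronistic, sense of what that directory function amounts to, the short C sketch below (a hypothetical illustration, not part of this history) resolves a host name into a numeric network address, the same name-to-address mapping the NIC once administered centrally before the distributed Domain Name System took over that role. The host name used here is only a placeholder.

    /* Minimal sketch: look up the numeric address behind a host name,
     * the kind of name-to-address mapping the NIC once kept by hand.
     * The host name below is a placeholder, not a reference from the text. */
    #include <stdio.h>
    #include <string.h>
    #include <netdb.h>        /* gethostbyname() */
    #include <arpa/inet.h>    /* inet_ntoa(), struct in_addr */

    int main(void)
    {
        const char *name = "example.com";      /* hypothetical host name */
        struct hostent *host = gethostbyname(name);
        if (host == NULL) {
            fprintf(stderr, "lookup of %s failed\n", name);
            return 1;
        }

        struct in_addr addr;
        memcpy(&addr, host->h_addr_list[0], sizeof(addr));
        printf("%s -> %s\n", name, inet_ntoa(addr));   /* prints a dotted-quad IP */
        return 0;
    }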

Surveying the initial implementation of the ARPAnet in a speech in 1970, Engelbart could already envision the evolution of a networked community where "there will emerge a new 'marketplace,' representing fantastic wealth in commodities of knowledge, service, information, processing, storage, etc."[21] In the early 70s, ARC began collaborating with business managers in envisioning the office of the future. However, when ARPA cut off funding to ARC in 1975, the project rapidly shrank down to Engelbart plus a lot of software. SRI had separated from Stanford in 1970 and had never been that excited about Engelbart's vision in the first place, preferring a vision of "office automation" that would replace, not augment, workers. So SRI sold the software to a company called Tymshare, and Engelbart went along to help sell it as part of Tymshare's general office automation service. He would eventually end up as director of the Bootstrap Project at Stanford, still pursuing his dream of using technology to enhance human potential.[22]

What is startling about Engelbart's achievement, often ignored due to the institutional liquidation of ARC, is how many of the conceptual computing breakthroughs and initial implementations were achieved by his team. To name just a few critical to the networked economy:

* pioneering distributed electronic mail and e-mail lists five to seven years before they appeared on the ARPAnet.

* implementing word processing a decade before it began to appear in offices.

* designing the mouse as an input device sixteen years before Apple introduced it to the world.

* creating a windowing environment twenty years before Microsoft.

* envisioning hypertext-linked documents in a distributed environment a quarter-century before CERN and Mosaic.

All of this was paid for by the federal government due to the vision of Bob Taylor and ARPA. As importantly for Silicon Valley, this federal investment would contribute to making the region a magnet for new visionary talent and a wellspring of the networked economy.

Xerox PARC: Making it Personal

When Xerox CEO Peter McColough announced his intention in 1969 to make Xerox the "architect of information" for the future, the attractions of the already existing Silicon Valley companies and the research energy around Stanford, especially the ARC project and Stanford's Artificial Intelligence Laboratory, were irresistible. When Bob Taylor was hired in 1970 to build the computer science research operation at PARC, he knew whom to recruit. Many of the earliest PARC researchers would come directly from the nearby ARC lab. But Taylor's experience organizing annual meetings of major ARPA researchers and hosting a separate annual conference for key graduate students involved in ARPA projects meant he knew where the best minds were across the country. The establishment of Xerox PARC in Palo Alto became a critical bridge in pulling together in one place fifty to sixty of the hackers and visionaries who had been working in separate streams of research for the previous decade.[23]

If the attractions of Silicon Valley pulled people to PARC, there were significant factors pushing researchers there as well. Federal research funds were becoming increasingly militarized as the Vietnam War dragged on, so funds for pure computer research were drying up. The Congressional passage of the Mansfield Amendment forced ARPA to provide specific military justification for most computer projects, thereby ending the open-ended hiring of hackers at places like MIT that had fueled much of the creative innovation there.[24] On the flip side, many computer researchers with anti-war views found their employment by the Defense Department more and more unacceptable; for those looking for alternatives, Xerox became a top option. Bob Taylor himself had left ARPA in 1969, disgusted with the war after being sent repeatedly to Vietnam to deal with the controversy over the Army's "body count" reports.[25] He spent a year at the University of Utah, but for him, as for the others with anti-war views whom he would recruit, Xerox looked like a golden opportunity to continue the computer and networking research to which he was devoted. Perversely, even as military demand was powering employment in Silicon Valley, the militarization of research was also helping the region by driving a cadre of anti-war hackers from the East Coast to the West.

Out of this concentration of talent would come most of the personal computer technologies and design concepts that exist today. Applying the concepts developed at ARC, PARC researchers in 1973 created the Alto personal computer, a machine with a mouse, graphical icons, and a windowed environment. Given the commitment of ARC and ARPA alumni to collaborative computing, the PARC researchers created Ethernet data connections between all their computers to allow file exchange and electronic mail. PARC also created the first laser printers, usable by any computer on the network, and pioneered what is called client-server computing to maximize the computing power available to every connected machine. PARC-designed software for the Altos included word processing, page layout, and electronic mail management, all using graphical user interfaces for ease of use. In summary, PARC had created the modern personal computer and office system well over a decade before any company would commercially apply these ideas.[26]

PARC's work was fueled by support from government-funded institutions, both through direct funds for a number of projects and through collaboration-inspired breakthroughs. The most important, as described in the last chapter, was the use of the ARPA-funded ALOHANET radio networking system as a model by ARPA alumnus Bob Metcalfe in designing the Alto's Ethernet networking technology. Until ARC disappeared, PARC researchers continued to work with colleagues still at that institution, although PARC's focus on personal computers led to a greater and greater divergence in vision. Where Engelbart at ARC was focused on more centralized systems that would enhance groups working together in universities and industry, PARC became more and more focused on maximizing the power in the hands of each individual computer user.[27] In a sense, this is symbolic of tensions and splits in approach in the computer industry that would persist for the next twenty years. And with the commercialization of the Internet, as will be noted at the end of this chapter, the same fundamental questions are being replayed in the debates over personal computers versus network computers.

PARC contributed not only many of the key technologies and concepts for the personal computer revolution but also many of the key company leaders who would commercialize those concepts. Alan Kay, a key Alto designer, would become chief scientist at Atari and later a senior researcher at Apple. Steve Jobs would visit PARC in 1979, and his guide that day, Larry Tesler, would join Apple in 1980 to work on the Lisa and Macintosh projects that would first bring graphical user interfaces to a wide computing market. Bob Metcalfe would leave to set up 3Com to sell Ethernet technology for networking office and university computers, a critical prelude to broad participation in the Internet.

More indirectly, PARC and the rest of the Palo Alto-based labs helped foster an environment of shared information that helped spawn many of the earliest computer companies. PARC programmers would contribute programming time to community computer projects, many of them run by countercultural and anti-war activists who saw technology as a new tool for liberation. In turn, when the Intel 8080 chip came to market, a number of those engineering-minded community activists, notably Fred Moore and Lee Felsenstein, would in 1975 organize the Homebrew Computer Club, a legendary collection of students and engineers who would come together to hash out many of the ideas needed to assemble personal computers. Meeting bi-weekly in the auditorium of the Stanford Linear Accelerator Center, local hackers would mix with researchers from PARC and Stanford's labs, sharing the ideas that would birth many of the personal computer companies in the Bay Area.[28] In many ways, the Homebrew Computer Club deserves its mythic reputation, since it symbolizes the convergence of the region's political networks, embodied in radical activists like Felsenstein, with the government-funded technology institutions of the defense establishment. With this overdetermined social network blooming in the region, if it had not been Homebrew members Steve Jobs and Steve Wozniak, someone else was inevitably going to lead the personal computer revolution.

One crucial PARC defector from the Silicon Valley region was Charles Simonyi, the man who had designed the breakthrough Bravo word processing program for the Alto. After meeting Bill Gates during a trip to Seattle, he would join Microsoft in 1981 as the company's director of advanced product development. At that point, Microsoft's business was essentially translating computer languages for use on personal computers, its premier product being Gates's version of BASIC (itself the creation of Dartmouth professors working on a National Science Foundation grant back in 1964). It was Simonyi who laid out the business plan for Microsoft to expand into databases, spreadsheets, and, of course, his own specialty, word processing. Microsoft Word, designed by Simonyi and a protégé from Xerox, would become the keystone of Microsoft's push into software applications. Simonyi would describe himself as "the messenger RNA of the PARC virus" at Microsoft.[29]

Xerox itself never really successfully commercialized the early ideas generated at PARC. Partly this is due to the company's separation of its research and manufacturing divisions that analysts like Florida and Kenney have so roundly criticized. But the reality was that the research being done at PARC was deliberately non-commercial. The PARC researchers well understood that computer technology was delivering double the power every year, so they were designing not for what was commercial in the mid-70s but for what would be needed and used in the offices of the 1980s.

What the experience of PARC and SRI's ARC shows is the indispensability of public investment both to the advance of basic technology and to regional economic innovation. PARC has been lauded for its critical contributions to the advance of computing. And every large corporation has vowed never to make the same mistake Xerox made in funding it in the first place (and has cut back its long-range research budget to prove it). From Xerox's financial point of view, PARC was a costly mistake. But from society's perspective, it was an incredibly fortuitous mistake: in a time of cutbacks in non-military research, Xerox PARC prevented the dissipation of the community of ARPA and MIT researchers. By maintaining ties to the national network of ARPA researchers and to the local university institutions while being embedded within the private economic innovation of Silicon Valley (itself the product of previous iterations of government support for research and development), PARC would help set the stage for much of the personal computer explosion in the Silicon Valley region. The US needs to use ARC, PARC and their embeddedness within the ARPA network of public-oriented researchers as a model for public investment in research.

Even as Xerox retreated from its commitment to pure research at PARC in the late 70s and early 80s, its legacy and continued DARPA investment in the region would lead to the birth of a new Internet-driven industry. This new wave of networking companies in the region would not only absorb networking technology but, by actively promoting the ethic of open standards embodied in the Internet, would position the region for economic dominance as the Internet came to replace proprietary networks in the 1990s.

Before we turn to how and why Bay Area firms benefited disproportionately from the Internet, it is worthwhile to revisit some of the theoretical debates on the impact of the government on the high-tech industry and regional development.

Did the Government Create the High Tech Industry?

There has been a large debate about whether government support was the critical factor in the success of high technology in the US and even whether its role has been destructive in a number of instances. Some researchers, like Richard Florida and Martin Kenney, have argued that military spending has been a drain on innovation, diverting companies from the continual improvement needed to bring products to market toward military needs.[30] There is little question that using the military as the main tool for industrial policy in the United States has wasted potential engineering talent, but it is less clear that much of that engineering talent would have been there at all without the existence of military R&D and a military market for the initial products of high-tech innovation. After World War II, money from the Department of Defense plus the Atomic Energy Commission accounted for 96% of all research money going to universities--the key source of training for new engineers.[31]

In the 1950s, the federal government accounted for two-thirds of all computer-related R&D and still accounted for over half of it in the mid-60s. The two most cited commercially funded breakthroughs, the integrated circuit and the microprocessor, were products of companies, like Texas Instruments and the Silicon Valley firms, that were immersed in that publicly funded milieu of technological innovation. Parallel teams at Texas Instruments and Fairchild created the integrated circuit within months of each other, and a bit over a decade later they repeated this breakneck competition in creating similar microprocessors within months of each other. This reflects the fact that most commercial technology is short-term innovation built on publicly funded basic research. Robert Noyce himself argued that at best his talent may have pushed developments forward by a year or so, noting "these were ideas whose time had come."[32]

While the home computer boom commercialized technology and spread it widely over the last two decades, until the Internet (itself again the product of careful federal support) came along, the industry was designing products based on general models drawn from government-funded projects decades old. Paul Allen, Bill Gates's partner in creating Microsoft, has been devoting a large chunk of his wealth (over $100 million) to a "think tank" firm called Interval, founded in 1992, precisely because Allen felt that short-term engineering and commercialization have not been able to deliver the breakthroughs of a generation ago. "Everyone out there," Allen has said, "is doing things in a one- to three-year time frame, and so we're trying not to do that." Citing the tradition of Xerox PARC in the early 1970s (and hiring some of its alumni, including Interval's director David Liddle), he sees Interval as an attempt to do the long-term research that has not been occurring.[33]

Even in the commercialization of computers, the government played a strong role, from mandating the UHF standards that created the first major mass market demand for integrated circuits to the anti-trust pressure on AT&T. Through the latter, the government made sure that the technical knowledge of transistors was broadly enough available in the public domain to allow a proliferation of competitive firms involved in the production of semiconductors.

The deeper criticism of the role of defense spending on R&D is not really about whether it was responsible for many of the technological breakthroughs of the last half-century: the record is relatively clear that it was. Rather, the question is whether that policy indirectly hampered day-to-day "process" innovation in high-tech manufacturing. Florida and Kenney argue that the very emphasis on technological breakthroughs has left US companies at an international disadvantage in the competition to efficiently produce and qualitatively improve already existing products. Where once industrial research labs worked hand-in-hand with company manufacturing facilities, the availability of military R&D contracts caused the "decommercialization" of industrial R&D, with manufacturing facilities oriented to commercial markets while industrial research aimed at producing technological breakthroughs. With half of big corporations' R&D paid for by the Pentagon under its cost-overrun budgeting, such R&D was the ultimate safe bet for corporate profits. However, this helped split corporate cultures between making cutting-edge products for the military and making cheap products for the public. Scientists were separated from production, and manufacturing came to be seen as a "profit center" where time and resources were not to be wasted on developing innovations in production techniques.

This functional separation soon led to a geographic separation; in the words of Florida and Kenney, "R&D grew further and further estranged from factory production, as companies moved their manufacturing plants to new low-wage, non-unionized locations and then relocated their R&D facilities to suburban campuses."[34] They cite Xerox PARC as the poster child for the failure of corporate R&D because of its separation from the main manufacturing culture of Xerox. This left its breakthrough technological innovations ignored within Xerox, so startup companies like 3Com commercialized its Ethernet networking system and the Apple Macintosh raided its mouse and icon software system.

But Florida and Kenney's analysis of Xerox PARC just reinforces the essential difference between the dynamics of R&D for commercialization of technology by individual firms and the more public-oriented research that is needed for the fundamental technological breakthroughs that power whole regional and national economic transformations. And to whatever extent the separation of research labs from manufacturing facilities hurt US industry nationwide, that separation benefited the Bay Area tremendously given the disproportionate placement of such centers there. So once again, national policy, however unintended its regional effects, played a strong role in shaping the strength of Silicon Valley's regional economy.

The Regional Economic Effects of High-Tech

Looking at Silicon Valley in a national context, its pattern of development, while unique in some aspects, still follows broad trends in regional economic development among defense-driven areas. The historical military presence in the Bay Area has already been discussed, but it is worth emphasizing that even through the 70s and 80s, military spending continued to flow to the region (just as it did nationally to a range of fast-growing regions even during the military cutbacks of the 70s). Even as some of the semiconductor firms found more and more commercial markets, the largest employer in the region remained Lockheed Missiles and Space Company, and by the early 80s it was receiving new billion-dollar contracts for its Sunnyvale operation. Other major defense contractors included FMC Corporation of San Jose, which was doing $1 billion of military business a year in that period, along with Ford Aerospace, GTE Sylvania, and ESL Inc. In 1983, as the Reagan defense budget ramped up, Santa Clara County had direct defense contracts of $4 billion on top of the large indirect purchases of its electronics by military subcontractors in other parts of the country.[35]

In a different vein, a number of commentators and researchers have noted the way the serendipitous Palo Alto boyhoods of two key figures, Frederick Terman and William Shockley, were crucial to the emergence of the region as a high-tech center. M.J. Taylor has argued that location theories ignore the basic fact that a majority of new firms appear in their founders' hometowns.[36] While this individualistic view of regional history forces a certain respect for contingency, it has to be examined in the context of the long-term environment that produces innovators interested in a particular area of production and the resources that make it possible for any business to appear. Terman's and Shockley's early interest in technology had been sparked by a South Bay environment alive with technical innovation, an environment underwritten economically by the federal government in World War I; massive World War II federal spending then reinforced their reasons for staying within the region. In the same way, the appearance of Microsoft in Seattle had much to do with an environment of technical excitement powered by Boeing and the federal government in World War II and its aftermath, an environment that entranced a young Bill Gates with technology and offered the technical support that would bring him back to the region when he established his own firm.[37] This reinforces the fact that economic development can rarely be reduced to short-term economic incentives; it is ultimately shaped by national and global economic forces that, as Robert Putnam observes, have long-term and often unexpected effects on specific regions.

The focus on Frederick Terman as a person also helps to obscure Terman's role as a conduit for national industrial policy driven through the Department of Defense. The fact that so many unelected "civic boosters" like Terman ended up being the conduits for industrial policy in the post-war United States has contributed to a certain degree of collective amnesia about its existence, while also making much of the defense-driven suburbanization of industrial development (and the consequent abandonment of many inner city areas) seem a natural phenomenon rather than a policy-driven result. Frederick Terman the person can be placed in the pantheon of indigenous Silicon Valley heroes, but Frederick Terman as the coordinator of national research and defense funds makes clearer the national priorities that helped birth and sustain the region's technological innovation and economic growth.

The other major explanation given for the birth of high technology in the region is the availability of venture capital. Much of the discussion on venture capital is intertwined with debates over the policy of lowering the capital gains tax, since those who emphasize the role of venture capital cite quite impressive variations in its availability tied to the capital gains tax rate. In 1969, there was $171 million in private funds dedicated to venture capital firms nationally. Then the tax law of 1969 raised the capital gains tax from 28 to 49 percent; by 1975, only $10 million in new money was dedicated to venture capital. In 1978, lobbying by Silicon Valley and other high tech firms led to capital gains taxes being scaled back to 28 percent. A decision in 1979 by the Department of Labor allowed more pension fund money to go into venture capital, and in 1981 the capital gains tax was further decreased to 20 percent. By 1982, $1.4 billion in new money was dedicated to venture capital.[38] While impressive as raw numbers, the birth of firms like Atari, Apple and a range of other key companies during the mid-70s, when formal venture capital was scarce, shows that venture capital was hardly the determinant of innovation and business success in the region. The fact that continued innovation and the expansion of venture capital would follow the 1986 increase of the capital gains tax to 28 percent also throws doubt on the causal link between tax rates and business expansion. Other analysts have argued that venture capital was more a result of defense funding infusions into the region, combined with the region's already existing role as a banking center, than an independent cause.[39] And many business leaders, including Andrew Grove and Gordon Moore at Intel, have questioned whether what they termed "vulture capital" hurts or helps the region, since it often just encourages the dismemberment of solid firms as employees churn through different ventures looking for the quick profit.[40]

The interaction of venture capital and defense spending highlights the fact that Silicon Valley evolved not based on internal dynamics alone (although there were some important indigenous forces) but based on the region's links to a global system of research, production, distribution and financial markets. Researchers David Gordon and Linda Kimball see the specific strength of the Silicon Valley region as non-replicable without those broader global linkages; otherwise, attempts to subsidize such high-tech centers merely create "illusory Silicon Valleys" based on limited branch-plant technologies with little chance of generating the diversity of energy that drives innovation in the Bay Area. They go so far as to argue that "objective factors" generating indigenous high-tech agglomerations matter little and analysts should concentrate on "comprehension of a region's social and economic position within the corporately organized integrated circuit of global production."[41]

How true this statement is overall will be explored in the next chapter, which looks at how the Internet as a tool and an economic force is changing business relationships among firms in Silicon Valley and with firms globally. What is clear is that the Internet is strengthening the Bay Area's position as the center of technology nationally and globally. Where proprietary information services would have thrived in a diffuse set of geographical enclaves, the very openness of the global Internet has had the seemingly contradictory effect of reinforcing a few regions' prominence as coordinators of the technology and standards. The Bay Area assumed this role because it was so thoroughly a product of the same public investment that created the Internet in the first place. In turn, its businesses shaped themselves to support that emerging Internet and were therefore positioned to benefit most dramatically as it expanded geometrically in the 1990s.

Sun Microsystems & Open Standards: Making Virtue a Commercial Necessity

Probably no single private company contributed more to sustaining the open standards driving the Internet, or benefited as much from the adoption of those standards, as Sun Microsystems, a seller of high performance computers. Sun would enter, then dominate, the market for what were being called workstations, relatively inexpensive stand-alone machines aimed at engineers that were beginning to replace time-share minicomputers. Started in 1982, Sun would be one of the fastest growing companies in history, making the Fortune 500 within five years and clearing $1 billion in revenues in six years.[42] By 1995, the company had sold 1.5 million high performance computers used as the core systems for networking in government, universities, finance, and engineering. And from the first day of operation, every single Sun computer was shipped with hardware and software designed to be hooked up to the Internet. It was on Sun computers that much of the Internet would be networked in the 1980s, and it was on Sun workstations that Mosaic, the first widely popular Web browser, would be designed.[43]

That Sun was committed to open standards reflected the company founders' emergence out of the milieu of Bay Area graduate students immersed in the ARPAnet. When Stanford M.B.A.s Scott McNealy and Vinod Khosla teamed up with Stanford student Andy Bechtolsheim, who had developed a new high performance computer using off-the-shelf components, it was natural for them to adopt UNIX, the popular university operating system, as the operating system for their new computer. And it was natural for them to bring in as a cofounder Bill Joy, the premier UNIX and ARPAnet programmer at UC-Berkeley in the late 70s and early 80s.

When Sun entered the emerging market for workstations, its main competitor (aside from established minicomputer makers like DEC and Hewlett-Packard) was Route 128-based Apollo, the startup company that in 1980 had pioneered the workstation market. Apollo was committed to proprietary standards in both hardware and software, a traditional strategy in the computer industry. Such a policy guaranteed Apollo high profits on each machine, both from the sale of its own technology (as opposed to Sun having to pay others for components) and from a continual stream of licensing fees for its proprietary software.

Analysts like AnnaLee Saxenian see the adoption of open standards by Sun as a cost-cutting move to break into a market that would never have trusted a proprietary system developed by a company run largely by graduate students: "When Sun Microsystems pioneered open systems in the mid-1980s, it was largely making a competitive virtue out of an economic necessity. As a start-up, Sun lacked the financial resources to develop the broad range of new technologies needed for a computer system."[44] While it is true that Sun took advantage of the diversity of firms in Silicon Valley to quickly and cheaply produce its initial workstation design, it is important to understand that, far beyond making a virtue out of a necessity, Sun's founders saw the virtue of open computing standards as a centerpiece of corporate strategy, even as a religious imperative. And making that virtue of open standards a necessity for all computer customers was Sun's key strategy for defeating rival companies using proprietary systems.

Sun's founders' commitment to UNIX's open standards, especially Bill Joy's, derived from the fact that UNIX was the first operating system developed to be independent of specific hardware, thereby giving users and programmers more freedom from the dictates of hardware designers. It could be "ported" to different machines, allowing the same program to run on completely different hardware. Created at Bell Labs in the late 60s when AT&T was still barred from the computer business, UNIX was widely licensed by AT&T, mostly to universities. UNIX was especially popular with ARPAnet programmers working on a wide variety of computers because they needed to create an integrated set of software tools for managing their emerging network. UNIX had splintered into a number of lackluster variations, so in the late 70s, UC-Berkeley researchers (largely funded by DARPA) developed an improved version that was dubbed UNIX 4.1 BSD (Berkeley Software Distribution). Bill Joy, the lead programmer in the Berkeley UNIX effort, was again funded by ARPA in 1981 to create a new version of UNIX incorporating the TCP/IP networking protocols.[45] With a minimal $150 license fee, Berkeley seeded its UNIX version, with its Internet protocols, throughout the university world.
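What the TCP/IP protocols folded into Berkeley UNIX bought, in practical terms, was a vendor-neutral way for programs on completely different machines to talk to one another. The fragment below is only a rough illustration of that idea, written in modern Java rather than the C of the era, with a placeholder host name rather than any real service; the point is that the shared protocol, not the hardware on either end, defines the conversation.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.Socket;

    // Minimal sketch: open a TCP connection to a (hypothetical) host and
    // exchange one line of text. Either end could be any make of machine;
    // the common protocol is what lets them interoperate.
    public class TcpSketch {
        public static void main(String[] args) throws Exception {
            String host = "host.example.edu";  // placeholder, not a real machine
            int port = 7;                      // the classic "echo" service port
            try (Socket socket = new Socket(host, port);
                 PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(socket.getInputStream()))) {
                out.println("hello from any machine on the network");
                System.out.println("echoed back: " + in.readLine());
            }
        }
    }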

Commercial versions of UNIX, however, were splintered among various incompatible proprietary versions. Far from riding a standard already widely used in business, Bill Joy and the Sun team had to create that standard and sell private industry on the gospel of open computing. They took a number of steps to make the UNIX sold on Sun's computers be seen as a real standard. Sun gave away the BSD UNIX and TCP/IP networking software with every computer it sold. When Sun developed the Network File System (NFS) in 1984, which enhanced network computing by making it possible to share files between different computers, it didn't try to sell this advance as normal software. Instead, Sun licensed it to the industry for a nominal fee and even published the specifications for the software on the Usenet electronic bulletin board so anyone could construct an alternative to the NFS file system if they wanted to avoid the license fee. Usable on DOS, VMS and other operating systems, NFS was a key advance for networking and increased customers' trust that Sun would be an honest guardian of the open standards it was promoting on its hardware. Another key step came in 1985 when Sun approached AT&T, by then allowed back into the computer industry, and worked out an agreement to merge Sun's Berkeley UNIX with AT&T's System V, further enhancing the public view of Sun's UNIX as the standard.

The first sales of the initial workstation model, the Sun-1, had gone nearly all to universities which, not surprisingly, liked the UNIX standard promoted by Sun. These sales gave Sun both the technical feedback and the credibility to help sell its second version, the Sun-2, to a number of commercial buyers, notably resellers in the engineering and Computer-Aided-Design (CAD) field. Based on its commitment to networking, Sun had made some inroads into finance and automated manufacturing markets as well. Still, at the beginning of 1986, Sun was a relatively small ($115.2 million in 1985 sales), if significant, player even in its own workstation niche within the computer industry. Apollo still dominated the workstation market, and both Hewlett-Packard and DEC had introduced new workstations of their own in 1985.

What moved Sun into dominance of the workstation market and into a position to take on even higher-end computer markets was the federal government's decisive move in 1986 in support of Sun's UNIX standard. By the mid-80s, the federal government was faced internally with a mess of different computer systems that needed to be networked together. Because of the close ties of the Department of Defense to university researchers (largely fostered by ARPA/DARPA), the federal government already had an affinity for UNIX. So in 1986, the government adopted regulations that no company could bid on any government computer contract unless its system offered UNIX as an option. This gave Sun a huge advantage in securing a large slice of the $500 million, five-year National Security Agency contract then under bid. Sun's and AT&T's version of UNIX was now the benchmark for selling to the government and university markets (along with many private industry customers who would follow the government's lead on standards). This was reinforced in 1988 when the Air Force declared DEC's proprietary version of UNIX, called Ultrix, ineligible for government contracts.

With the NSA contract under its belt, 1986 would be the year that Sun passed Apollo as the top seller of workstations. Within a few years, Sun would have over 50 percent of the federal workstation market, accounting for twenty percent of Sun's sales.[46] And by 1991, it would add forty state and local governments to its list of customers as all levels of government adopted open computing standards, often passing regulations similar to the federal government's.[47]

Apollo, along with other workstation makers, would do a complete turnabout in 1987 and begin promoting its own "open computing" UNIX systems, but as with Sun's other rivals, it was too late. Sun would use the advantages of close relations with a range of suppliers and collaborators in Silicon Valley (and increasingly around the world) to maximize its advantages in a world of standards that it and the federal government had helped establish. Sun would help establish standards for windowed environments within UNIX and continue to push networking standards of all kinds, marketing its own computers with the slogan, "The Network is the Computer."[48] This standardization would not be complete, however, especially as rivals like DEC and Hewlett-Packard (which absorbed Apollo in 1989) committed to a combined rival UNIX standard. This division over UNIX standards in the early 1990s would come back to haunt all the workstation makers as Microsoft and Intel began invading their turf.

Sun's most radical move came as it developed, working with computer scientists at UC-Berkeley, a new, more powerful processor design called SPARC (Scalable Processor Architecture). As soon as it developed the processor, Sun began licensing it to other vendors in July 1987. Never before had a computer manufacturer unveiled its CPU architecture for the price of a license and a royalty. Working off an open operating system standard while selling advanced hardware was one thing; allowing clone companies to attempt to beat you on price on your own hardware was something else. But Sun not only licensed the technology but also supported the creation of an independent SPARC Vendor Council to set standards, including competitors such as Toshiba. Just as creating the UNIX standard had encouraged a number of other Bay Area firms (notably Silicon Graphics) to jump into the workstation market in the late 80s, Sun was betting that the sacrifice of some sales within the SPARC market would be more than compensated by an enlarging of the overall demand for SPARC machines, especially as Sun pushed deeper into higher-end computers.[49]

What is key to evaluating Sun's commitment to open standards is to understand that its success was not inevitable. An open standard, once established, did give a significant advantage to a company like Sun that was able to collaborate with the myriad of technologically innovative firms in Silicon Valley. But the establishment of that standard in the first place was based not, as Saxenian argues, on an economic determinism of companies driven by the need to respond to rapid product cycles and technological complexity.[50] Rather, it depended on the fact that Sun was riding a standard saturated across the country's universities under the guidance of DARPA, and on the fact that the federal government had destroyed the viability of competing proprietary systems by blocking them out of federal procurement entirely. It was in that national context and application of federal public policy that Silicon Valley's industrial district-style innovation would win out over Apollo and DEC in Massachusetts.

For a counterexample where proprietary standards did quite well in the 1980s, one need only look at the success of the "Wintel" duopoly managed by Intel and Microsoft. Where Sun spent the 80s and 90s licensing the specifications of its hardware to everyone who wanted them, Intel spent the same period in court suing anyone, especially Advanced Micro Devices, who might try to duplicate the workings of its microprocessor. And where Sun worked hard to establish trust among customers and software vendors as an honest steward of the UNIX standard, Microsoft spent the same period being cursed by software vendors and investigated by antitrust lawyers. By the mid-90s, the result was that Intel had driven nearly every competitor out of the personal computer processor business (with a small resurgence by rivals late in the decade), while Microsoft had come to dominate every software category it chose to enter.

The difference was that in the personal computer market there was no preexisting government-backed standard to adopt, and the government never used its purchasing power to punish proprietary standards. In this sense, it is clear that the roots of success of the collaborative system of production in Silicon Valley are not purely indigenous but are extremely dependent on a supportive national system of standards. The virtue of open systems, therefore, much like virtue in society generally, pays off for its practitioners only when society supports its practice and punishes the "immoral."

With Microsoft increasingly dictating proprietary standards, production of personal computers involved little cooperation, so it spread to far-flung enclaves around the country and the world, from Texas to South Dakota to South Korea. While UNIX companies like Sun did negotiate many international contracting deals, the production and design of their workstations, as with competitors like Hewlett-Packard and Silicon Graphics, remained rooted in the collaborative environment of Silicon Valley as standards and innovation moved forward without any one company controlling their evolution.

As we will see at the end of this chapter, only as personal computers have been integrated into the open standards of the Internet has Microsoft had to adjust its tight control of its operating system to more open standards, "embracing and extending" the Internet in the words of Microsoft executives. Not coincidentally, this comes at a time when Intel processor power is approaching that of lower-end workstation servers, pitting Microsoft directly against UNIX-based systems for the first time in competing for the networking market. This may prove to be an example of virtue winning out in the end or, in the absence of continued government support for open standards, the triumph of the false penitent feigning virtue until the law retreats.

Cisco and the Commerce of InterNetworking

If Sun had to struggle to establish the standards that would help sustain the rise of Internet networking, Cisco Systems would ride the explosion of those standards to become by 1996 a company worth $40 billion, rivaling General Motors in market capitalization.[51] With a name less known than any of the other computer titans, Cisco supplies the technology at the heart of the Internet--the computer "routers" which direct data traffic from one local network of computers to another, whether within a large company or across the Internet. Cisco's routers, successors to the original IMP computers set up by ARPA in the 1970s, now account for over 80 percent of routers connected to the Internet.[52]
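The forwarding decision at the heart of a router can be suggested with a small sketch. The class below is a toy, not Cisco's software: addresses are simplified to 32-bit integers and the table is a plain list, where real routers of the period did the same longest-prefix match per packet in highly optimized code and hardware.

    import java.util.ArrayList;
    import java.util.List;

    // Toy forwarding table: compare a packet's destination address against a
    // table of network prefixes and forward toward the most specific match.
    public class ForwardingSketch {
        static class Route {
            final int prefix;      // network address, e.g. 0x80020000 for 128.2.0.0
            final int prefixBits;  // number of significant leading bits
            final String nextHop;  // interface or neighboring router
            Route(int prefix, int prefixBits, String nextHop) {
                this.prefix = prefix; this.prefixBits = prefixBits; this.nextHop = nextHop;
            }
        }

        private final List<Route> table = new ArrayList<>();

        void addRoute(int prefix, int prefixBits, String nextHop) {
            table.add(new Route(prefix, prefixBits, nextHop));
        }

        // Longest-prefix match: the most specific matching route wins.
        String forward(int destination) {
            Route best = null;
            for (Route r : table) {
                int mask = (r.prefixBits == 0) ? 0 : (-1 << (32 - r.prefixBits));
                boolean matches = (destination & mask) == (r.prefix & mask);
                if (matches && (best == null || r.prefixBits > best.prefixBits)) {
                    best = r;
                }
            }
            return (best == null) ? "no route (drop or default gateway)" : best.nextHop;
        }
    }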

As with Sun, Cisco's success came from origins steeped in the culture of the Internet in the Bay Area. Its founders were Stanford academics Leonard Bosack and Sandy Lerner, a husband and wife who in the early 80s were also the computer systems managers for, respectively, the computer science and business schools. The two systems were unconnected, so Bosack set out to develop a device to allow the two local networks of computers to communicate and share data. What he created was the router, a combination of software and hardware that could inexpensively forward data packets from one computer site to another. Stanford had been funding its own networking effort, so it was indifferent, even hostile, to Bosack and Lerner's efforts. Even so, Bosack and Lerner probably would not have launched their own company, but Stanford then refused to allow them to make similar routers for friends at Xerox PARC and Hewlett-Packard. Enraged, they left their jobs.

No established firms were interested in their technology, so in 1986 they mortgaged their house, maxed out their credit cards and persuaded friends and relatives to work in exchange for generously shared stock. But what really made their business possible was the ARPAnet, which allowed them to cheaply let other Net engineers know about their new product. Within a year, Cisco was selling $250,000 of routers a month to universities across the country with no sales force and no paid advertising of any kind. (Cisco didn't buy its first advertisement until 1992.) Cisco would do $1.5 million worth of business in its first fiscal year, which ended in July 1987, even turning an $83,000 profit. Technically in violation of ARPAnet guidelines prohibiting commerce on the Net, Cisco was the first major company to build a market from scratch based on direct Internet marketing. Of course, it helped that Lerner and Bosack came out of the same milieu of the Internet as the university engineers doing the buying and that the National Science Foundation and other federal agencies were supplying much of the cash during this rapid expansion of the Internet at universities.[53]

It also helped that most networking companies were concentrating on the fight to network desktop computers, where a whole mess of proprietary hardware and software systems were battling it out. The original Ethernet desktop networking technology faced new competition from token ring and Arcnet technologies, with each of these technologies splintering further into incompatible variations. Computer companies across the country were fighting to lock customers into proprietary desktop networking systems,[54] even as they ignored the emerging market for networking the networks based on standard Internet protocols. Cisco began finding corporate customers who needed to connect far-flung local networks of desktop computers or who needed their own connections to the emerging Internet for commercial users.

Like Sun, Cisco made sure it had the top technology in order to dominate router networking as other companies entered its wide area networking market, but it was the universal standards of the Internet that made technology, not control of proprietary standards, the measure of success. This is in sharp contrast to the fate of a company like 3Com, a technological leader in desktop networking, which nearly foundered after it allied with Microsoft in the late 80s in a fight over proprietary standards for desktop networking. By 1991, Microsoft had abandoned the fight in that round of network standards (returning a few years later with its Windows NT system), forcing 3Com into massive layoffs and the abandonment of almost all desktop networking. 3Com survived, but only by committing itself to competing in the open standards of the wide area networking market like Cisco.[55]

Cisco continued to grow rapidly (minus its founders, who left in 1990 after a conflict with new management brought in by venture capital investors) and maintained its commitment to staying at the cutting edge of technology. And like Sun, which was happy to port its expertise to whatever processor its customers demanded, Cisco readily adopted new networking technology if customers demanded it. If Cisco didn't have the technology in-house, it bought the companies that did, spending billions beginning in 1993 on fourteen takeovers of both startups and established businesses in technologies where it lacked expertise. This included Cisco's $4 billion acquisition of its Silicon Valley neighbor StrataCom, a leader in the high-speed ATM switching technology that many see as the future of networking.[56]

With all this growth, Cisco's CEO John Chambers has made Microsoft-like statements about Cisco's software becoming "a de facto standard" and about how the company's purpose "is to shape the future of the network industry the way IBM shaped the mainframe and Microsoft did the desktop."[57] Even as competitors like 3Com and Bay Networks complained about the danger, the reality was that, unlike with Microsoft, every Cisco product has to work with every other product using established Internet standards. When Cisco invents new ways to improve on those standards, it has to submit them for approval to the Internet standards bodies before customers will accept them.[58] There are worries that such bodies are losing authority (as the case of Netscape would soon show), but much to Cisco's chagrin, the company has had to keep fighting for dominance the way it has from the beginning: by offering the best product. And it faces ongoing stiff competition from the competitors it hasn't bought out yet, many of them, like 3Com, riding the same Silicon Valley Internet roots.

Destroying the Village to Save It: Netscape and the Web Standards War

It was with the World Wide Web that the Internet broke into national consciousness, and it was around the Web that Netscape Communications would become the central Bay Area firm around which a slew of new Silicon Valley companies would form. But unlike Sun, which rode public UNIX standards to rapid growth, Netscape began its life with a direct assault on the original government-based standards created by the National Center for Supercomputing Applications (NCSA). In this, Netscape would play a three-cornered game against both the NCSA and Microsoft, which it knew would quickly be coming in with its own controlled standards. Netscape's success would be based on the virtual withdrawal of the government from any serious intervention on behalf of Internet standards. Netscape would develop a distinctly regional economic strategy around standards, drawing on the expertise of a wide range of Bay Area companies in a fight that would start over standards for Web software but would explode into a battle over the very architecture of future computer hardware and software.

As described in the last chapter, the Mosaic Web "browser" was created at the University of Illinois at Urbana-Champaign, home of the National Center for Supercomputing Applications (NCSA). The National Science Foundation had officially funded the NSFnet "backbone" of the Internet to link five major supercomputing centers, including NCSA, and NCSA's software development group had concentrated for years on high-performance information-sharing and collaboration software. Even before Mosaic, the NCSA had back in 1985 created software "clients" for PCs and Macs, called Telnet, to allow people to access and use computers connected to the Internet as if the user were locally based. A different computer center at Illinois was responsible, as well, for the popular Eudora client for electronic mail on PCs and Macs. The NCSA had also worked to create a graphics-based collaborative tool for sharing documents called Collage, so it was natural for it to assemble a team to develop a graphical browser for the Web's "HyperText Markup Language" (HTML) protocols created at CERN in Europe.[59] The result of this forty-member team was Mosaic, first introduced on the UNIX platform in January 1993, with Macintosh and PC versions following in August 1993. Copyrighted by the University of Illinois, Mosaic could be downloaded for free by individuals and by companies wishing to use the Internet for internal communications.

However, the NCSA did not want to become a help desk for commercial applications, so in August 1994, the University of Illinois assigned future commercial rights for licensing NCSA Mosaic to Spyglass, Inc., a local company created by NCSA alumni to commercialize NCSA technology. The goal was for university researchers to continue developing longer-term technology and standards to be incorporated into browsers, while Spyglass would license the technology to companies addressing immediate customer needs such as support, speed, and security. Spyglass began widely licensing Mosaic to computer companies including IBM, DEC, AT&T, NEC, and Firefox Inc., which was working to integrate Mosaic standards into Novell networking software for the personal computer.[60]

With the licensing agreement requiring Spyglass to closely share technology with the NCSA, Spyglass President Douglas Colbeth noted that the benefit of the commercial version of the viewer, dubbed Enhanced NCSA Mosaic, would be a "stable and standard" product across multiple computer platforms.[61] And the combination of license fees to the university and the potential economic development benefits to the surrounding community made it appear that the Urbana-Champaign region might be about to experience the same government technology-driven boost that Stanford and Berkeley had given to the Silicon Valley region. "This is the classic example of technology transfer that Congress envisioned in setting up the supercomputer center in 1985," argued Larry Smarr, director of the National Center for Supercomputing Applications, at the time.[62] Stable standards and technology transfer to the community made it appear that Urbana-Champaign was about to appear on the map as an upstart technological rival to Silicon Valley.

But it was not to be. Watching Mosaic from the Bay Area, Silicon Graphics founder Jim Clark, a veteran of the workstation standards wars, understood how much money could be won if a company could take control of the standards of this new Internet tool. So Clark left his company and set out to destroy Mosaic and replace its government-backed standards. He met with Marc Andreessen, a member of the Mosaic team who had been hired at a Bay Area Internet security firm called Enterprise Integration Technologies. Out of that meeting in April 1994 was born Mosaic Communications Corporation (later to be called Netscape). With Clark putting up the capital, Andreessen recruited five other Mosaic team members from NCSA to design what they called in-house "Mozilla," the Mosaic-killer. In six months, Clark's team had created a powerful browser, which the team called Netscape, with easy-to-navigate features that loaded graphic images faster than NCSA's Mosaic. But Netscape did something else--it included the ability to display text formatting that did not even exist in the HTML standards embedded in the NCSA Mosaic browser. This meant that Web pages designed to work with Netscape would not be readable by all the other Mosaic-based browsers. This would encourage people to use Netscape browsers and, as Netscape developed them, would encourage Web designers to pay Netscape for the server software used to develop Web pages with its modified standards. It was this latter market of selling Web design tools, costing from $1,500 to $50,000, where Netscape intended to make its money.[63]

And then Clark and Andreessen compounded their fracturing of the NCSA standard by giving their version away over the Internet. The University of Illinois had demanded that Clark's company pay for a license before selling its version. Clark later said that he refused because the university was demanding an ongoing per-copy royalty: "I didn't tell them, but we had intended to allow people to download it, and they were going to charge me. The amount varied, but nothing is innocuous when you're talking tens of millions of people."[64] The point of the Illinois licenses had been, along with collecting a little revenue, to control the standards and make sure that the only free version available was the official NCSA standard. Netscape would essentially "dump" its version onto the Internet, thereby undercutting the rest of the commercial browser companies, which couldn't duplicate Netscape's actions because they were dutifully paying per-copy license fees. So Netscape, as the maker of the sole enhanced commercial browser flooding the Internet, was able to destroy NCSA-led standards and take over standards creation itself.

Unlike with Sun Microsystems, where the government would decisively support open government-based UNIX standards, the federal government did nothing to support NCSA's standards. Other companies and analysts would immediately condemn Netscape's actions as a monopolistic move,[65] but the government launched no investigations into possible monopoly practices, filed no lawsuit alleging intellectual property infringement, made no announcement that the federal government would use only NCSA-approved code in government Web sites, made no announcement that it would refuse to buy any Web servers (i.e. Netscape's) based on such non-standard formatting, and gave no signal at all that it would oppose Netscape's takeover of the standards. Instead, the University of Illinois, after a bit of public grumbling, threw in the towel. It signed an agreement with Clark in December 1994 that allowed Netscape to be sold without a license in exchange for the minor concessions that the word "Mosaic" be removed from the firm's title and that no mention of Mosaic be made in marketing the browser.[66] Given the moves towards privatization of government Internet functions in recent years, the failure of decisive policy is not surprising. Criticism had already been leveled against the University of Illinois and NCSA for their commercial relationships,[67] and, in the context of December 1994, the month after Newt Gingrich's anti-government message had stormed to a majority in Congress, there was probably even less appetite among government officials to defend the wisdom of government regulation of standards.

In a perverse way, Clark and Netscape would justify their destruction of the government standards based on the expected weakness of the government in defending them. They predicted that Microsoft would soon use its dissemination of the operating system to take control of standards if Netscape didn't do so first through free distribution. Argued Clark:

At some level, standards certainly play a role, but the real issue is that there is a set of people, a set of very powerful companies out there, who don't play the standards game. For the standards game to work, everyone has to play it, everyone has to acknowledge it's the game. Companies such as Microsoft aren't going to sit around and wait for some standards body to tell them, You can do this. If your philosophy is to adhere to the standards, the guy who just does the de facto thing that serves the market need instantly has got an advantage.[68]

And once Netscape had taken control of the standards from NCSA, in order to gain trust in its management of the Web standards, the company drew on many of the same collaborative practices and resources in the Bay Area as Sun had in gaining trust in its stewardship of the UNIX standards. In fact, Sun itself would become a key partner in the alliance against Microsoft over standards for interactive aspects of the Web. Crucially for Internet commerce, Netscape worked closely with Enterprise Integration Technologies (EIT), Andreessen's old firm, to agree on security protocols for on-line transactions, working through an initially regional consortium called CommerceNet (much more about that organization in the next chapter). Netscape would build a "plug-in" architecture into its browser, so an explosion of new firms could easily follow Netscape in the Internet distribution of new products that could instantly be incorporated into individuals' desktops. Netscape, having seized leadership of Web standards, would continue to work with the old Internet fellowship of engineers embodied in the Internet Engineering Task Force (IETF) and the more recent World Wide Web Consortium (W3C), based at MIT and run by Tim Berners-Lee, who had come to MIT in late 1994.

And as Microsoft entered the game with its own Internet Explorer browser to appear on every Windows desktop, the grumblings over Netscape's occasional forays into proprietary advantage would lessen as the alternative fear of Microsoft taking over the whole computing world loomed. Having come late to the Internet, Microsoft initially licensed Mosaic browser technology directly from Spyglass in December 1994--a license netting Spyglass about $13.1 million. But when Microsoft began giving its browser away at the end of 1995, the rest of Spyglass's licensing revenue (amounting to $20 million) disappeared as the browser war settled into a two-company fight between Netscape and Microsoft.[69]

In the end, Netscape would argue that the beloved public village of standards was threatened by Microsoft and that Netscape had only destroyed the village in order to save it. And if saving the village made Jim Clark's Netscape a $9 billion company (at least on paper at its stock market high) and snatched leadership of Internet development away from Illinois back to Silicon Valley--well, this was just returning leadership of Internet-based computing to the region that government support had made the leader in the first place.

But behind the Internet browser war was a more fundamental divide over the future of computing. In the region that birthed the personal computer, these Internet-driven Bay Area companies would make the replacement of the personal computer by "network computers" the path to further centralizing information technology leadership in the region.

Network Computers & Massive Parallel Processing: The Bay Area Challenges the Personal Computer

By 1996, the evidence was clear that the Internet was birthing a very different technological strategy for Bay Area firms. Building on the infrastructure of client-server workstation firms like Sun, HP and Silicon Graphics, networking firms like 3Com and Cisco, and a slew of software firms, a whole new explosion of Internet-based firms was appearing in the Bay Area region. A quarter century of government investment in the Internet was coming to fruition and, with a disproportionate share of those funds and expertise having been channeled to the region, Silicon Valley was in a position to reassert its authority over the direction of technology.

In May 1996, a new alliance of Netscape, Sun Microsystems, Apple Computer, IBM and Oracle jointly announced the specifications for a new standard of computing, the network computer. In many ways, this proposal was the culmination of years of Internet development, with four key Silicon Valley companies (and IBM essentially tagging along) working to ride their long-standing expertise in order to retake control of the desktop. Instead of processing power being concentrated on the desk with software individually installed on each computer, the network computer would be an inexpensive customized processor in a box which would access CPU power over the Internet, downloading software "applets" as needed from a centralized server. Priced under $1,000 (perhaps as low as $500), such network computers would be more economical in both homes and offices than the traditional personal computer. What these companies proposed was nothing less than to kill the paradigm of personal computing that the Bay Area itself had introduced nearly two decades before.

Microsoft was officially invited to join the group but declined for obvious reasons, since the other clear goal was to destroy Microsoft's and Intel's duopoly control over the home and office desktop. If most computing power were located in the network, there would be no need to constantly upgrade to a faster (read Intel) processor; instead, offices or homes could simply subscribe to higher levels of processing power as needed (or as developed by server designers). Similarly, the network itself would be responsible for upgrades in operating systems and application software, so there would be no need to constantly buy upgrades of (read Microsoft) operating systems and applications.[70] The overall goal was, as Sun had argued in its advertising, to make the network the computer, thereby maximizing the resources available to each user and allowing constant upgrading of those resources as technology advanced.

In many ways, this was a return to the vision of Doug Engelbart at ARC, a vision that the Xerox PARC researchers had abandoned in favor of putting maximum processor power directly in the hands of individual users. In the mid-70s, the absence of a broadly available Internet made such a movement to personal computing the natural development, but historically the personal computer may come to be seen as a two-decade-long detour. Seeing the connection between their direction and Engelbart's vision, Sun and Apple have been prime funders of Engelbart's nonprofit Bootstrap Project, where today he continues the research he started decades ago on collaborative computing.[71]

On the other hand, each of the companies involved in the network computer (along with the many more who would immediately sign up in support) had more than a theoretical interest in replacing the personal computer. With its whole business strategy tied up in the Internet, Netscape had the most obvious stake in preventing Microsoft from using its dominance of the personal computer to in turn dominate other Internet-based tools. Apple and IBM had both, in their own ways, lost the personal computer market to Microsoft. Despite four million computers sold worldwide by Apple in the previous year, Microsoft's dominance of the desktop was clear enough that even Apple loyalists were pronouncing the company on its deathbed.[72] With Microsoft's control of the Windows operating system, IBM had been reduced to being a clone maker for a computer once described as "IBM-compatible." With no room to innovate given Microsoft's absolute proprietary hold on the operating system, IBM's $6 billion a year research and development budget was nearly useless in the personal computing market.[73]

Sun Microsystems, in many ways the linchpin of the alliance, saw the faceoff with Microsoft in nearly religious terms. With Windows NT machines increasingly penetrating the network server market that had traditionally been Sun's stronghold, Microsoft was emerging as a clear and present danger to the UNIX standards that Sun had worked so hard to establish. Partly this could be blamed on the bitter battles that had fractured those standards in the early 90s, but much of Microsoft's march was based on the reality of Windows on so many desktops, increasingly with Microsoft's Explorer Web browser, so that using Windows NT on the central server became more and more attractive for many companies. 1996 looked to be the year that NT-based server sales would pass UNIX-based sales.[74]

Sun's weapon against Microsoft and the linchpin of the network computer was Java, a unique programming language invented at Sun that could run the same program on any operating system platform--thereby making Microsoft's control of its operating system irrelevant. Most importantly, Java programs are designed to run seamlessly across networks: where ordinary programs take up megabytes of hard drive space on conventional PCs, little Java "applets" can be delivered as needed to the desktop. With Netscape's incorporation of Java into its browser, the first effect of this is to make Web sites more interactive, but the broader effect could be to harness the Internet to deliver the day-to-day programs people use without needing a full-scale computer to run them.
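The applet model is simple enough to sketch. The toy class below (its name is hypothetical, chosen only for illustration) uses the original java.applet interface of the era: compiled once into bytecode, it can be downloaded by any Java-capable browser or network computer and runs identically whatever the underlying hardware or operating system.

    import java.applet.Applet;
    import java.awt.Graphics;

    // A minimal applet: fetched over the network and run locally, with no
    // installation step and no dependence on the host operating system.
    public class HelloNetworkApplet extends Applet {
        public void paint(Graphics g) {
            g.drawString("Delivered over the network, run anywhere", 20, 20);
        }
    }

A Web page pulls such an applet in simply by pointing at the compiled class file, so distributing or upgrading software becomes a matter of updating the server rather than touching every desktop.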

To accomplish this goal, Sun used every lesson it had learned from its experience with UNIX. It licensed Java for the asking to any company for nominal fees. It began developing a complete operating system around Java, called JavaOS. It launched a subsidiary called JavaSoft whose purpose was less to make money than to work with all licensees to keep the Java standards stable. Along with IBM and other companies, Sun put together a $100 million venture-capital pool called the Java Fund to seed startups around the language.[75] By the end of 1996, almost every major company, excepting Microsoft, had agreed on a single set of Java standards for all software, a level of standards agreement Sun had never achieved with UNIX. Even Microsoft reluctantly licensed Java to develop applications because of customer demand.[76] The respect for Sun (and the fear of Microsoft) was shown in 1997 when the International Organization for Standardization (ISO) made Sun the official guardian of Java standards, a role usually given to independent associations rather than individual companies.[77]

Aside from helping Sun achieve a "halo effect," as one Sun executive put it, this public-minded standards creation would hold off Windows NT and allow Sun to focus the computer industry on which company could design the chips and machines that would best run the Java software created around those standards. With a variety of new SPARC processors in production designed to maximize Java processing speeds, possibly at three to five times the speed of existing platforms like Windows NT, Sun and its allies charged into the competition against Microsoft.[78]

While Network Computers competing head-to-head with personal computers were slow to enter the market, the real promise of network computing and Java soon began appearing in a raft of "smart" consumer appliances, from cellular telephones to stereo equipment to set-top boxes on cable television systems.[79] In early 1998, Cable and Wireless agreed to use network computer technology in cable set-top boxes to be installed in 7 million homes and businesses in the United Kingdom, Australia and Hong Kong.[80] In a real sense, the whole computer, telecommunications and consumer appliance world was increasingly being described as a race between Silicon Valley's model of Java-driven open standards and Microsoft's proprietary Windows maneuvers.

In some ways, the most interesting booster of the network computer was the database software company Oracle, whose CEO Larry Ellison was the main instigator of the alliance. At the simplest level, he was there on the stage announcing the NC because he feared that Microsoft was moving dangerously far into his company's territory. Oracle was the second largest software company in the country, but it had rarely competed directly with Microsoft since Oracle made software for every type of computer except the personal computer. With Microsoft moving increasingly into the server arena, an assault by Microsoft on Oracle's software was inevitable.

However, Ellison had much more than a defensive strategy up his sleeve. For Ellison, the interest in the network computer was in the computers and data that users would be accessing at the other end of the phone line. Ellison had become a billionaire by developing fast relational databases in the late 1970s to assist corporations and government in more easily organizing vast amounts of information (his first contract, typically, had been with the federal government, in this case the CIA). He had used IBM's own research to get a three-year jump on Big Blue, and unlike IBM, which initially produced databases only for its own machines, Ellison wrote versions of his software for every brand of mainframe and networked server possible, ending up dominating the database market. He had never given a thought to the Information Superhighway until British Telecom approached the company in 1993 to help run an experimental interactive television service. Unexpectedly, Ellison realized that being a database creator put Oracle smack in the middle of the interactive age. "Better to be lucky than smart," he deadpanned at the time.

But Ellison was more than lucky; a few years earlier he had invested at least $60 million of his own funds to take control of a struggling Northern California supercomputer firm, nCube, which specialized in "massive parallel processing" linking thousands of microprocessors. Like most supercomputer makers, nCube had built its business selling to government labs and universities, but Ellison saw that a commercial market could be opened up for these machines if he linked Oracle's software to such an endeavor. In 1988, he decreed that all Oracle software would be written to run on nCube supercomputers. nCube was able to score sales to corporate customers like BMW and Shell Oil and ended up capturing 65 percent of the market for massively parallel systems in Japan. Other deals were soon inked with Bell Atlantic for interactive television experiments in the US.[81] And as the Internet took off, Ellison saw the potential to sell nCubes bundled with new high-end Oracle multimedia Web servers for large enterprises needing to manage large numbers of users accessing a range of multimedia information.[82]

What was at work here was an assault not only on the personal computer market but on the lower-end server market, which Microsoft was invading as well. And Ellison was just following Sun and other Bay Area workstation firms in pushing the technology towards supercomputer levels. The dilemma for mainframes had traditionally been that, despite the reliability and central control crucial for managing large databases, computing power had been so much cheaper on smaller machines that systems kept migrating towards networks of smaller machines. However, new research, driven by government funding, had been using massive parallel processing to bring down the costs of supercomputing and allow much more flexible computing levels responding to economic needs, based on what has come to be called "scalable architecture." Supercomputer research had always depended on federal government support and procurement, but after cutbacks in the late 80s, federal support for supercomputing began skyrocketing in the mid-90s. Partly to support simulated testing of nuclear weapons (given the nuclear test ban treaty prohibiting real testing), the Department of Energy's Accelerated Strategic Computing Initiative (ASCI) had its budget ratcheted up from $85 million in 1996 to a planned $295 million by fiscal 1999.[83]
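The economic logic of "scalable architecture" can be suggested with a toy example, written here in Java with ordinary threads rather than the thousands of nodes of a real massively parallel machine: a large scan is split into chunks, each chunk is handled by a separate worker, and the partial results are combined at the end. Capacity grows by adding workers rather than by buying one ever-faster processor.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    // Toy parallel scan: sum a large array (standing in for a big table scan)
    // by dividing it among as many workers as there are processors.
    public class ParallelScanSketch {
        public static void main(String[] args) throws Exception {
            long[] data = new long[10_000_000];
            for (int i = 0; i < data.length; i++) data[i] = i % 100;

            int workers = Runtime.getRuntime().availableProcessors();
            ExecutorService pool = Executors.newFixedThreadPool(workers);
            int chunk = data.length / workers;

            List<Future<Long>> partials = new ArrayList<>();
            for (int w = 0; w < workers; w++) {
                final int start = w * chunk;
                final int end = (w == workers - 1) ? data.length : start + chunk;
                partials.add(pool.submit(() -> {
                    long sum = 0;
                    for (int i = start; i < end; i++) sum += data[i];
                    return sum;
                }));
            }

            long total = 0;
            for (Future<Long> f : partials) total += f.get();  // combine partial results
            pool.shutdown();
            System.out.println("total = " + total);
        }
    }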

In keeping with long-standing trends in the Bay Area, Silicon Valley firms were following government funding of new technology into new markets and, in the end, a new paradigm of computing. In 1996 Silicon Graphics acquired Cray Research, the premier supercomputing firm, and Sun bought key technology from Cray that same year. Both built on Silicon Valley relationships to leverage themselves into competition for the large "enterprise" level computing markets. Intel also held a large share of the supercomputing market and won the first multiyear pact, worth $45 million, from DOE's supercomputing initiative to push its technology. IBM won the second pact, worth $93 million, but Silicon Graphics entered the big leagues with a funding agreement with the DOE as well.[84] All of these technologies would in turn be applied to commercial markets with an increasing thirst for the expanded computing power.

And where Ellison couldn't sell his nCube hardware, he was working to dominate the emerging Internet database markets, which would be dependent on advances in supercomputing technology. The need to keep pushing the technology was clear; where many analysts concentrate on the lack of wired "bandwidth" to the home as the major barrier to multimedia Internet industries, the inability of central computer servers to manage the media information likely to be demanded is equally serious. To put this in perspective, one of the largest commercial databases in existence by 1993 was American Airlines' Sabre computer reservation system, taking up 100 gigabytes of storage (roughly 1000 times that of a typical desktop computer). Yet the same amount of storage could manage an interactive on-line library of only 50 movies. It was this reality that was pushing Ellison towards supercomputing servers.
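A quick back-of-the-envelope check, using only the chapter's own figures, shows why: spreading 100 gigabytes across 50 films leaves roughly 2 gigabytes per film, so even a Sabre-scale store barely dents a multimedia library.

    // Arithmetic from the figures above: 100 gigabytes divided among 50 films.
    public class StorageArithmetic {
        public static void main(String[] args) {
            double databaseGigabytes = 100.0;  // size cited for the Sabre system
            double films = 50.0;               // films such a store could hold
            System.out.printf("Per-film budget: %.1f gigabytes%n", databaseGigabytes / films);
        }
    }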

And at the most visionary level, Ellison was promoting what he called Oracle's Alexandria Project, the name evoking the ancient library at Alexandria, which sought to contain all the world's published works. The Project aims at no less than using computers to change the way human knowledge is gathered and stored, bringing together unfathomably large multimedia databases including books, art, films, and news coverage. Requiring storage 10,000 to one million times larger than the databases Oracle currently sells, such databases would act as centers for global commerce and learning while enabling radically new forms of collaboration.[85] In this vision, the network computer is the first step on the road to this new computing future.

Conclusion: The Engineer's Lament--Technology and Its Discontents

The Alexandria Project would be the fulfillment of Doug Engelbart's dream of augmented collaboration, itself inspired a half-century earlier by Vannevar Bush's vision of a device to rapidly access the ever-escalating information faced daily in modern life. It was Vannevar Bush who contributed mightily to post-war federal support for the research and development of the computer hardware that would make possible Engelbart's laboratory at SRI in the sixties, as well as subsidize the vibrancy of innovation in the region. In turn, Engelbart's work would move forward the tools for networking that would help attract a whole new generation of innovators to Silicon Valley. With that base of innovators, continued government subsidies of the open standards of the Internet would help "lock in" technological leadership by the region's innovators who, themselves so much the product of that public investment, would push forward technological change around the standards that they had themselves helped develop. To this day, the federal role in funding research (a total of $14 billion yearly for California) supports continued economic expansion in the Bay Area.[86]

Yet Engelbart has doubts about whether, despite the ever-expanding number of computers, this technological change is really improving the economic and social lives of most people in society. Engelbart cites an MIT study showing that technology has had dramatic effects on productivity in specific work situations yet little, even negative, effect on productivity in society overall. This reinforces his half-century-long frustration that so much of the focus in implementing new technology has been on automating existing tasks rather than on using the technology to augment human capabilities in truly new ways.[87] Engelbart's lament in many ways echoes his regional predecessor, Lee De Forest, who would write later in life that the result of his invention of the vacuum tube and radio technology was the debasement of culture and the prostitution of his technological child:

What have you gentlemen done with my child? He was conceived as a potent instrumentality for culture, fine music, the uplifting of American mass intelligence. You have debased this child, you have sent him out in the streets...to collect money from all and sundry, for hubba hubba and audio jitterbug...Some day the program director will attain the intelligent skill of the engineers who erected his towers and built the marvel which he now so ineptly uses.[88]

Engelbart's and De Forest's regrets reflect the fact that just as the invention and development of technology is driven by economic and political imperatives often independent of the general welfare, so too the implementation of that technology reflects those broader social forces, often at the expense of both social uplift and equality.

The next chapter will explore how the implementation of Internet-related technology in the Bay Area has reflected the social and economic imperatives that have driven its production, and how the social forces driving that production have in turn shaped the overall political economy of the region. In discussions of creating a "Smart Valley" in the region, political and business leaders have echoed Engelbart's hopes of translating the technology into a broader-based change in economic relationships, but, as we will see, its implementation has more often reflected the social realities of economic inequality both within the workforce and between areas within the region.

-------------------------

Endnotes for Chapter 3

[1] Barbrook, Richard and Andy Cameron. "The California Ideology." Hypermedia Research Center. 1995. http://www.wmin.ac.uk/media/HRC/ci/calif.html.

[2] Rosetto, Louis. "Response To The Californian Ideology." Hypermedia Research Center. 1996. http://www.wmin.ac.uk/media/HRC/ci/calif2.html.

[3] From Rogers, Everett and Judith Larsen. Silicon Valley Fever: Growth of High-Technology Culture. Basic Books. New York. 1986. Along with Everett & Larsen, the other books with good early histories of Silicon Valley include Mahon, Thomas. Charged Bodies: People, Power and Paradox in Silicon Valley. New York. New American Library Books. 1985; Malone, Michael. The Big Score: The Billion-Dollar Story of Silicon Valley. Doubleday & Company, Inc. Garden City, New York. 1985; Hanson, Dirk. The New Alchemists: Silicon Valley and the Microelectronics Revolution. Little, Brown and Company. Boston. 1982.

[4] Malone, p. 15.

[5] Malone, Chap 12.

[6] Hanson and Mahon, p. 160.

[7] Mahon, p. 153.

[8] Everett & Larsen, p. 35-36.

[9] Saxenian 1994.

[10] Malone, Ibid..

[11] Hanson, p. 79.

[12] Mahon, Ibid.

[13] Slomovic, Anna. An Analysis of Military and Commercial Microelectronics: Had DoD's R&D Funding Had the Desired Effect. A Rand Graduate School Dissertation. Rand. Santa Monica, CA. 1991.

[14] Saxenian, 1995. p. 42.

[15] Press, Larry. "Before the Altair: The history of personal computing." Communications of the ACM v36, n9 (Sep 1993):27-33.

[16] Rheingold, Howard. Tools for Thought: The People and Ideas Behind the Next Computer Revolution. Simon & Schuster. New York. 1985. Notes about Doug Engelbart come from Chapter 9.

[17] Levy, Steven. Hackers: Heroes of the Computer Revolution. Anchor Press. Garden City, New York. 1984. Levy's book is one of the most in-depth stories of how the social ethic of free and open computing formed in the bowels of MIT's computer lab and the way it crossed the continent to California.

[18] Saffo, Paul. "Racing change on a merry-go-round: MIT Management in the Nineties program reports industry overall is not more productive because of computing technology." Personal Computing v14, n5 (May 25, 1990):67. Saffo details the revolutionary vision of Engelbart and how little modern business has engaged with the full thrust of Engelbart's vision.

[19] Rheingold, 1985, p. 214.

[20] Baker, Steven. "The evolving Internet backbone- history of the Internet computer network." UNIX Review v11, n9 (Sept, 1993):15.

[21] Rheingold, 1985, p. 199.

[22] Warren, Jim. "We, the people, in the information age: early times in Silicon Valley." Dr. Dobb's Journal v16, n1 (Jan, 1991):96D.

[23] Rheingold. Chapter 10.

[24] Levy, p. 142-143.

[25] Hafner & Lyon, p. 151-152.

[26] Press, 1993.

[27] Rheingold, 1985, p. 203.

[28] Levy, 1984, chapter 10.

[29] Manes & Andrews, 1993, chapter 12.

[30] Florida, Richard and Martin Kenney. The Breakthrough Illusion: Corporate America's Failure to Move from Innovation to Mass Production. US. Basic Books. 1990. See Chapter 2 for the main part of this argument.

[31] See Hanson.

[32] From an interview by Mahon, p. 77.

[33] Markoff, John. "Innovative Computer Firm Plans to Spin Off Three New Companies." New York Times. November 13, 1996.

[34] Florida & Kenney, p. 18.

[35] Mahon.

[36] Taylor, M.J. "Organizational Growth, Spatial Interaction and Locational Decision-making" in Regional Studies. 9, 313-23. 1975.

[37] Manes, Stephen and Paul Andrews. Gates: How Microsoft's Mogul Reinvented an Industry-- and Made Himself the Richest Man in America. Simon & Schuster Inc. New York. 1993.

[38] Rogers & Larsen, p. 63-65.

[39] Gordon, R. and L. Kimball. "The Impact of Industrial Structure on Global High Technology Location." in The Spatial Impact of Technological Change, ed. by John Brotchie, Peter Hall & Peter Newton. Croom Helm. London. 1987.

[40] Florida and Kenney, p. 74.

[41] Gordon & Kimball, p. 168.

[42] Hall, Mark and John Barry. Sunburst: The Ascent of Sun Microsystems. Contemporary Books. Chicago. 1990. Written by two former employees, this book is a broad look at the internal developments of Sun in its first eight years and is the source of much of the information about Sun in this section.

[43] Schlender, Brent. "Whose Internet is it, anyway?" Fortune v132, n12 (Dec 11, 1995):120-142; European 66-73.

[44] Saxenian, 1994, p. 134-35.

[45] See Hafner & Lyon, 1996, p. 250.

[46] Taft, Darryl K. "Opportunity lies in government downsizing." Government Computer News v11, n4 (Feb 17, 1992):S14.

[47] Temin, Thomas R. and Shawn P. McCarthy. "Sun chief sells Unix and minces no words." Government Computer News v12, n26 (Dec 6, 1993):20.

[48] Hall & Barry, 1990, p. 151.

[49] Button, Kate. "Hub of the solar system. Sun Microsystems' Scott McNealy's attitude towards cloning." Computer Weekly (Dec 3, 1992):39.

[50] Saxenian, 1994, p. 134.

[51] "Bigger faster, better ears." The Economist. December 7, 1996: 59-60.

[52] Fisher, Lawrence M. "Routing Makes Cisco Systems a Powerhouse of Computing." The New York Times. November 11, 1996.

[53] See Pitta, Julie. "Long Distance Relationship." Forbes v149, n6 (Mar 16, 1992):136-137 and Nocera, Joseph. "Cooking with Cisco." Fortune v132, n13 (Dec 25, 1995):114-122; European 76-83 for good accounts of Cisco's rise.

[54] Baker, Steven. "Fiber to the desktop." UNIX Review v13, n3 (Mar 1995):19-25.

[55] Kerr, Susan. "3Com Corp. (The Datamation 100)." Datamation v38, n13 (June 15, 1992):141.

[56] Stevens, Tim. "Multiplication by addition." Industry Week v245, n13 (Jul 1, 1996):20-24.

[57] Fisher, 1996.

[58] "Bigger faster, better ears." The Economist. December 7, 1996: 59-60.

[59] Michalski, Jerry. "O pioneers!" RELease 1.0 v94, n1 (Jan 31, 1994):5 (8 pages).

[60] Stevens, Tim. "NCSA: National Center for Supercomputing Applications." Industry Week v243, n23 (Dec 19, 1994):56-58. and Patch, Kimberly. "Spyglass takes on Mosaic licensing: will focus on support and security." PC Week v11, n34 (August 29, 1994):123.

[61] Booker, Ellis. "Spyglass to commercialize future Mosaic versions." Computerworld v28, n35 (Aug 29, 1994):16.

[62] Messmer, Ellen. "Spyglass captures Mosaic licensing." Network World v11, n35 (Aug 29, 1994):4.

[63] Accounts of Netscape's startup from: Holzinger, Albert G. "Netscape founder points, and it clicks." Nation's Business v84, n1 (Jan 1996):32; Nee, Eric. "Jim Clark." Upside v7, n10 (Oct 1995):28-48.

[64] Nee, Oct 1995.

[65] Steinert-Threlkeld, Tom. "The Internet shouldn't be a breeding ground for monopolies- Mosaic Communications' NetScape giveaway could be prelude to market dominance." InterActive Week v1, n2 (Nov 7, 1994):44.

[66] "University of Illinois and Netscape Communications reach agreement." Information Today v12, n3 (Mar 1995):39.

[67] Messmer, Aug 1994.

[68] Nee, 1995.

[69] Lohr, Steve. "Spyglass, a Pioneer, Learns Hard Lessons About Microsoft." New York Times. March 2, 1998.

[70] Edwards, Morris. "The Network Computer comes of age." Communications News v33, n7 (Jul 1996):44-45.

[71] Warren, 1991.

[72] Alsop, Stewart. "Please save me from my editors." Fortune v134, n8 (Oct 28, 1996):203-204; European 113-114.

[73] Moltzen, Edward F. "IBM puts 20% of R&D into networking." Computer Reseller News, n705 (Oct 14, 1996):290.

[74] Moukheiber, Zina. "Windows NT--never!" Forbes v158, n7 (Sep 23, 1996):42-43.

[75] Schlender, Brent. "Sun's Java: The threat to Microsoft is real." Fortune v134, n9 (Nov 11, 1996):165-170; European 87-91.

[76] Lewis, Peter. "Alliance Formed Around Sun's Java Network." New York Times. December 11, 1996.

[77] Lash, Alex. "JavaOS headed to consumer gear." CNET News.Com. March 25, 1998.

[78] "Sun spots its chance." Network World v13, n33 (Aug 12, 1996):I33. Gage, Deborah. "Sun launches new servers, support boost." Computer Reseller News, n681 (Apr 29, 1996):67,70. and Moukheiber, 1996.

[79] Markoff, John. "Several Big Deals Near for Sun's Java Language." New York Times. March 24, 1998.

[80] "A New Player in Set-Top Boxes." Business Week. Mar 12, 1998.

[81] Deutschman, Alan. "The next big info tech battle." Fortune v128, n14 (Nov 29, 1993):38-50; European 24-31. "The house that Larry built." Economist v330, n7851 (Feb 19, 1994):73 (UK 85). and Nash, Kim S. "NCube piggybacks on Oracle plans." Computerworld v28, n5 (Jan 31, 1994):33.

[82] DeVoe, Deborah. "nCube to unveil multimedia servers." InfoWorld v18, n8 (Feb 19, 1996):38.

[83] Rogers, Adam. "Getting Faster by the Second." Newsweek (December 9, 1996), pp. 86-89.

[84] Wolfe, Alexander. "Cray enters race for teraflops computer." Electronic Engineering Times, n922 (Oct 7, 1996):1,16.

[85] Deutschman, 1993.

[86] Petit, Charles. "Praise for California at Science Summit: But leaders worry about low test scores." San Francisco Chronicle. (May 29, 1996) p. A15.

[87] Saffo, Paul. "Racing change on a merry-go-round: MIT Management in the Nineties program reports industry overall is not more productive because of computing technology." Personal Computing v14, n5 (May 25, 1990):67.

[88] Hanson, 1982, p. 37.