Chapter 5: Banks, Electricity and Phones - Technology, Regional Decline and the Marketization of Fixed Capital

When Joint Venture was launched to revitalize the Silicon Valley economy, its founders turned to Bank of America to supply an initial $250,000, a crucial investment that catalyzed the effort. Similarly, one of the largest ongoing supporters of Smart Valley was Pacific Gas & Electric, which contributed over $100,000, far more than any of the computer companies involved. Pacific Bell, beyond the millions spent on its own CalREN networking project, contributed almost $500,000 to Joint Venture's efforts to wire Silicon Valley schools, again a larger contribution than that of any of the computer companies involved.

Why did so much money for this "common good" come from these particular companies, when so many other computer companies might have seemed more obvious angels for these regional efforts?

What all three companies share is a commitment to the region that has spanned almost one hundred years, a commitment tied to the fact that each is a classic regional network conveying financial information, power and communication respectively. Banks, power companies and telephone companies are the classic components of what Harvey Molotch, John Logan and David Harvey have called fixed capital, the bedrock of regional coalitions to promote growth and economic development.1 Add in the Southern Pacific railroad in earlier years and these very same companies were the key parts of regional growth coalitions half a century ago and even seventy-five years ago.

Across the country, local community banks played a key role in recirculating local savings into economic development, backed by government policies that kept banks focused on serving those local customers. Expansion of consumer credit kept local commerce growing, which in turn fed a virtuous cycle of expanding local business and stronger banks able to extend still more credit. In California, that expansion was concentrated in Bank of America, which became the largest bank in the world following World War II by thriving on exactly that cycle of serving local depositors and regional businesses. Its founder A.P. Giannini made it a creed within the bank that its success was inevitably tied to the economic success of working class depositors and the prosperity of the region in which those workers lived.

Similarly, like other utilities across the country, Pacific Gas & Electric came into being as a merger of numerous city and regional power companies finding new economies of scale in integrating regional power production and distribution systems. By its nature, generation and distribution of electricity had to be local, so power companies gained a keen interest in supporting the local expansion of industry and commerce that would in turn spur new energy sales. Hydroelectric and other power projects became key capital investments that spurred regional employment and a range of subsidiary commerce. Of special importance in the West was the fact that these integrated regional utilities delivered subsidized rates to rural agricultural areas, thereby extending economic development and access to the electricity grid to a range of new customers. In a similar way, hundreds of local phone companies were consolidated under AT&T, which acted as a holding company for the separate state phone companies. Each state company had its own budget, strong local public participation on its board of directors, and was governed by broad public interest guidelines set by state utilities commissions.

All of these companies would become fixtures in the political life of their regions, often acting as a political bridge between working class organizations and the rest of the corporate elite. Their constancy of involvement in the civic economic life of the region stemmed from a simple congruence of self-interest and public interest for companies whose customer bases and financial investments were so thoroughly tied to a single region, something most local industrial companies could rarely claim, with global markets and, always in the back of their minds, global production facilities beckoning. The fixed circulation of capital between such local companies and their consumers created the basis for the long-term cross-class cooperation that built the civic economic life of regions across the country. And, however imperfectly, it was those cross-class collaborations that brought many poor and working class families into economic life through commitments to broad banking services and subsidies for universal service by local utilities. It was these subsidies in the post-war era that continued to expand the base of consumers for these basic services and gave those families a connection to the broader economic life of the expanding regional economies.

However, just as new pressures of industrial competition have led many private firms to downgrade their investment in basic research public goods, there is an accelerating trend of technological change, competition and deregionalization eroding these traditional regional anchors of banking, power and telephone service. Subsidies that had flowed from big business and wealthier customers to smaller users of these services have begun to be reversed as deregulation and new technology, particularly the Internet in recent years, have allowed sophisticated industrial customers to skim the benefits of new technological innovation for themselves, even as they abandon the regionally-based markets that had delivered universal service. And as those subsidies disappear and as previously local banks and utilities become global players through both expansion and merger, it is less and less clear from where any private source of public goods not exclusively focused on business itself can and will originate. In that sense, the participation of the traditional anchors of the regional economy in the Joint Venture Internet projects can be seen less as a continuation of that tradition and more as their emergence as key players in the new "gated community" elite politics of the region.

In the end, technological standards and the investments that make them possible are tied as much to pricing and legal arrangements as they are to specific feats of engineering. As this chapter will explore, earlier banking, telecommunication and utility models of accumulation and network building were based on pricing models that allowed those who committed to expanding the network the opportunity to recover the fixed costs of those investments. In turn, that commitment to fixed investments in the community was tied legally and financially to subsidies that would expand the number of people brought into the network, thereby increasing the value of the network for all. The bottom line for such regional actors was that, with the geographic scope of their markets limited and the ability to recover almost the full value of their investments through various government guarantees, it was in their interest to maximize growth in every sector, rich and poor, in that region.

As those geographic constraints have been lifted both politically and technologically, new companies have begun to use new technologies like the Internet to operate in multiple geographic markets and "skim the cream" of high-profit customers by taking advantage of already established public investments. These new companies essentially profit from the long-term inherent value of the network as a whole while only having to spend their own funds on servicing its most valuable members. New technology has gone hand in hand with legal changes in the geographic boundaries of these markets, pushing pricing models away from fixed cost recovery toward a focus on marginal costs regardless of the total public and private investment needed to maintain financial and utility networks.
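
A toy calculation can make the economics of this shift concrete. The sketch below uses purely hypothetical numbers to contrast an incumbent that must recover a large fixed network cost by averaging it across all customers with an entrant, free of that obligation, that serves only the heaviest users at close to marginal cost.

```python
# Illustrative sketch (hypothetical numbers): why "cream skimming" undermines
# fixed-cost recovery. An incumbent must recover a large fixed network cost by
# averaging it across all customers; an entrant free of that obligation can
# profitably serve only the high-usage customers at close to marginal cost.

FIXED_NETWORK_COST = 10_000_000      # annual cost of maintaining the network
MARGINAL_COST_PER_UNIT = 0.02        # cost of serving one extra unit of usage

# Two customer classes: a few heavy (business) users and many light (residential) users.
customers = {
    "heavy": {"count": 1_000, "usage": 50_000},
    "light": {"count": 100_000, "usage": 500},
}

total_usage = sum(c["count"] * c["usage"] for c in customers.values())

# Incumbent's price: marginal cost plus an equal per-unit share of the fixed cost.
incumbent_price = MARGINAL_COST_PER_UNIT + FIXED_NETWORK_COST / total_usage

# Entrant targets only the heavy users, pricing just above marginal cost.
entrant_price = MARGINAL_COST_PER_UNIT * 1.25

print(f"Incumbent average-cost price per unit: {incumbent_price:.4f}")
print(f"Entrant cream-skimming price per unit: {entrant_price:.4f}")

# If the heavy users defect, the same fixed cost must now be recovered
# from the remaining light users alone.
remaining_usage = customers["light"]["count"] * customers["light"]["usage"]
post_defection_price = MARGINAL_COST_PER_UNIT + FIXED_NETWORK_COST / remaining_usage
print(f"Price needed from remaining customers: {post_defection_price:.4f}")
```

The point is not the particular figures but the direction of the pressure: once heavy users defect to the entrant, the customers left behind must cover the same fixed costs at ever higher prices.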

Ironically, since political concerns over new network construction, network interconnection and regional investments have not disappeared, "deregulation" has actually ushered in an era of almost constant tinkering by political leaders and regulators seeking to achieve the regional, economic and social goals that were implicit in the more laissez-faire days of delegating day-to-day management of the regional service networks. Instead of having designated regional actors whose private interests were intertwined with the public interest and who could largely be left to implement directives subject to periodic review, political leaders now have to debate and design the "ideal" market needed for each service, police against predatory rent-seeking, and compensate for the loss of regional subsidies with a whole range of jury-rigged measures to assure some degree of universal service.

As analysts like Steven Vogel2 and Jill Hills3 have argued, the move to "deregulation" in no way decreased the level of government involvement in industries like financial services and telecommunications. In fact, much of this political movement, what Vogel labels "reregulation", was in many cases an attempt by specific regulatory agencies to rationalize and strengthen their control of an industry, wresting it from previously independent monopolies or from other government bodies. In the case of the United States, the creation of new markets in the banking, energy and telecommunication industries has amounted to a massive transfer of political control from the local government authorities who traditionally governed these regional actors to federal regulatory agencies like the Federal Communications Commission. Despite the odes to government decentralization, most of "deregulation" has meant the emasculation of the power of local government in favor of national regulators. This change in political venue translated into an erosion of the cross-class regional collaboration that had been a hallmark of the earlier era of local governance, in favor of a stance that promoted the global competitiveness of these industries relative to other countries. Noting that the ideological promotion of telecommunications deregulation started in the US, Hills argues persuasively that much of "deregulation" globally can be understood as part of a strategy by the US to assert information technology dominance in the world by opening new markets for its corporations.

This chapter will outline how the technology of the Bay Area region would interact both with these elements of regional fixed capital and with the global economic changes initiated by these technological changes. National investments in the Bay Area would contribute to the networking technology that in turn, with specific political arrangements enacted to support this new technological regime, would help launch financial revolutions in the banking industry that would undermine traditional regional banking systems around the country, including in Northern California. In turn, the region would face new challenges for local banking that would push Bay Area banks to use the newest Internet technology not only to compete in the new global banking system but to seek to dominate it. In the area of power companies, computer technology and the Internet would make possible new national systems of power distribution and coordination that are now beginning to undermine regional utilities. And the phone system would go through regulatory earthquakes as telecom technology would both initiate and be buffeted by the swirl of new computer technology. In each of these changes, large business customers would overwhelmingly benefit from the new global systems even as local customers would end up shouldering the burden both of network maintenance and of the economic losses of the transition, losses ranging from bailing out savings and loans to assuming the costs of defunct power stations to figuring out how to upgrade local telephone service in the new era. Technology was again leaving those trapped in local geography with the fewest options and the least power in the new era.

From Frank Capra to Virtual Banking

It's a Wonderful Life pictured the local banker as a fixture of the community--whether as evil oppressor like Mr. Potter or local savior like Jimmy Stewart's George Bailey. Either way, it was hard to think of the community without seeing it as a reflection of the local banks.

These local banks, however, are increasingly a thing of the past, done in by regulatory changes and technology that have encouraged bigger national and even international banks. Electronic banking has emerged as the key strategy for banks in going global and cutting costs, while the disappearance of local bank branches is just the most obvious sign of the deep cost-cutting involved. Automatic teller machines were the first wave of electronic banking to begin to replace the local branch, with Internet banking now coming on fast. With alternatives to the traditional bank branch costing as little as 15% of comparable branch transactions (and with costs dropping quickly), banks are seeking every opportunity to push into electronic banking.4 The two largest California-based banks, Bank of America and Wells Fargo (both of which went through massive mergers in the early and mid-1990s), have closed thousands of traditional branches--in the case of Wells, well over half its branches have been closed since 1980.5

1996 would be the year Internet banking became widely available at banks across the country; only 7 percent of banks were offering Internet banking at the beginning of that year, but by the following year 59 percent of banks were planning to do so.6 If any entrant into Internet banking showed the promise (or the threat) of banking without geography, it was the appearance in 1995 of Security First Network Bank (SFNB), the first independent bank with no physical branches, doing all its business electronically. Created by an otherwise obscure Kentucky savings and loan in October 1995, SFNB built on a collaboration with Hewlett-Packard to launch both the bank and a software subsidiary, Five Paces, to sell Internet-banking software to other banks.7 Within months, SFNB had a few hundred customers, a number that grew to 4000 checking account customers in all fifty states by August 1996. An initial $100 deposit got a customer an ATM card or a VISA debit card and the ability to check balances at any time on Security First's home page on the World Wide Web. By the end of 1996, SFNB was offering customers on-line discount stock brokerage and insurance services and even a few physical branches around the country. With 7000 customers and $201 million in deposits by the third quarter of 1996, SFNB was obviously not yet a big player, but its success was a warning sign to more established banks. Increasing the threat, Security First was using its software relationship with Hewlett-Packard to make more revenue from selling its Internet software to banks across the country than from its own fee income.8 By 1998, banks totaling $800 billion in assets were using Security First's Internet software.9

Nowhere was that threat felt more than in northern California, where a majority of Security First's initial customers lived, showing clearly that while the Internet is global, the Bay Area was both the center of production of the new technology and among the first to feel its global economic impacts.10 Bank of America and Wells Fargo, the largest banks in the region, suddenly realized that their traditional territory was under threat from banks that could reach their customers from anywhere in the world over the Internet. This is why Bank of America's Mack Hicks would argue, in promoting CommerceNet's vision, that "in the information age there is no boundary for the delivery of services so there is no reason for regions."11 Bank of America moved quickly to establish home banking services on the Internet, but would watch its once much smaller rival Wells Fargo move even more aggressively to use the Internet to leverage itself into being a major international player. By April of 1997, Wells Fargo was claiming to add 10,000 new online customers every week as it expanded far beyond its traditional California roots at the expense of many local banks in other regions.12

Even in core areas of loans, smaller banks across the country are losing out to bigger banks like Wells as the new technology cuts out the use of local intermediaries to evaluate the loan--a radical change even for larger banks like Bank of America, where local branch managers once had the central role in evaluations. Home mortgages are increasingly based on national lending criteria determined and managed by central institutions like Fannie Mae and Freddie Mac. Using an infrastructure of fast computers, database software and high speed communication networks, new "data mining" techniques allow lenders to identify key patterns and relationships in customer data to determine whether loans should be granted. All of these technologies require technical and statistical sophistication that small banks cannot afford. And as on-line networks strengthen, any intermediary between customers and the ultimate loan approver could be subject to elimination.13

Home mortgages have been moving towards centralization for the last decade and a half, but even more significant for regional economies is the recent emergence of a national loan market for small business loans--once the quintessential part of local economic development decision-making. Until just a few years ago, large institutions had little advantage in lending to small business; each loan was evaluated based on business plans, balance sheets, cash flow and profits. There were few economies of scale. Because of the uncertainty of small business survival, the assumption was that only local banks had any chance of betting on the right loans, so local bankers with knowledge of an area had pretty much free rein in small business lending.

Computer technology has changed all that as small business loans have begun to be evaluated using the same statistical methods as individual consumer loans. Using a system known as credit scoring, lenders no longer perform detailed financial reviews of each borrower. Instead, they use a few key pieces of information to predict the probability that a borrower will repay, then offer the loan to those deemed worth taking a risk on. The cost of processing each small business loan thereby drops from thousands of dollars to merely a few hundred, opening the way to apply mass market techniques to these loans.
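
A minimal sketch can illustrate the credit scoring approach described here; the inputs, weights and cutoff below are purely illustrative assumptions, not any bank's actual model. A handful of applicant attributes are combined into an estimated probability of repayment, and the loan is offered only if that probability clears a threshold, with no manual financial review.

```python
import math

# A minimal credit-scoring sketch (hypothetical weights and cutoff). A few key
# pieces of information are combined into a score that estimates the probability
# of repayment; applicants above the cutoff are offered a loan automatically.

def repayment_probability(years_in_business, owner_credit_score, prior_delinquencies):
    # Weighted sum of a few inputs, squashed to a 0-1 probability with a
    # logistic function. The weights here are purely illustrative.
    z = (-1.0
         + 0.15 * years_in_business
         + 0.004 * (owner_credit_score - 600)
         - 0.8 * prior_delinquencies)
    return 1.0 / (1.0 + math.exp(-z))

def decide(applicant, cutoff=0.5):
    p = repayment_probability(**applicant)
    return ("approve" if p >= cutoff else "decline"), round(p, 2)

applicants = [
    {"years_in_business": 8, "owner_credit_score": 720, "prior_delinquencies": 0},
    {"years_in_business": 1, "owner_credit_score": 640, "prior_delinquencies": 2},
]
for a in applicants:
    print(a, decide(a))
```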

Big banks are displacing local banks in this game for two reasons. First, the costs of the computers, software and databases involved in this new system are far beyond the resources of most small banks. "The capital cost is significant, and if you are not doing volume, you can't recover it," noted Bank of America small business lending chief Janet Garufis. In that sense, the balance of power is tilting toward large regional or national lenders. "This hits at the fundamental strategy of community banks in the small business market," says Vikram Capoor, a Washington, D.C.-based consultant with the Advisory Board Co. "It shifts success to the lowest cost provider."14 Most banks involved in small business lending are now using off-the-shelf scoring software, such as the system from Robert Morris Associates and Fair Isaac, which was developed from data pooled from a consortium of 17 large regional banks.

However, Wells Fargo was one bank that pioneered the development of an in-house system to give it a dominant edge over rivals. It began the process in the late 80s as it was trying to escape a dangerous concentration of loans in commercial real estate and leveraged buyouts. Turning to its weak small business lending department, the bank decided that with the right concentration on information technology, it could dramatically increase its presence in the national small business market. Using data collected on its own customers to predict losses, it began expanding the database as it expanded small business lending in California beginning in 1991. This was combined with increasing automation of back-office operations to reduce paper and drive costs down. "We engineered out the number of times a human hand touches a loan," said executive vice president Lucy Reid, who heads the bank's national small business direct marketing program. Within California, the results were spectacular. In 1989, Wells had only 1% of all small business loans held by California banks. By 1995, Wells had more than 16% of all small business loans.

By 1994, Wells Fargo was ready to go national using direct marketing techniques to attract customers outside California. By 1995, five million small business loan solicitations were mailed nationally. Outside of California, Wells concentrated solely on loans under $100,000 using credit scoring to sort potential customers into eight risk categories with different loan rates offered to each group. By June 1995, total commercial loans under $1 million at Wells had jumped to 117,392; that year, Wells earned a stellar 32% return on equity in its small business credit portfolio.

Smaller banks are being squeezed not only because they can't afford to invest in the computer technology and in-house databases, but also because they just aren't big enough to treat loans as large as $100,000 as statistical bets that repay or default according to probability tables. David Payne, chief executive of $2.4-billion-asset Westamerica Bancorp in San Rafael, CA, argues "I have to look at each deal to repay as agreed."15 Local knowledge of the economy is no longer the competitive advantage it once was for local banks as low cost, mass market loans become possible with the new computer technology. Big banks end up with the advantage as they can average out good and bad returns over a much broader capital base.

Direct mail, telemarketing and now Internet marketing of such loans nationwide to small businesses are replacing the local community bank as the source of small capital. And as banks' confidence in these computerized techniques grows, they are moving towards targeting larger business loans, further threatening community banks. All of these changes are undermining traditions that linked local businesses and local capital in a shared fate in the growth of their region. As capital nationalizes, local businesses are increasingly able to find loans regardless of the health of local banks, while local banks no longer have to depend on growth in their local region to find places to invest their capital.

Technological Regionalism and Global Banking

There is an irony in the fact that Wells Fargo, once the epitome of banking for the business elite, is emerging as a dominant small business lender. Yet this version of small business lending is in many ways merely a high-tech continuation of earlier elite strategies, using computer technology to move the same centralized approach downscale. And while its lending practices are increasingly nonregional, Wells Fargo has used tight regional relationships with Bay Area technology firms around Internet standards to leverage itself into an emerging dominant position in the new banking order.

The early 90s would see a number of proprietary attempts to establish electronic bank payment standards, chiefly by Intuit (maker of Quicken financial software), which began creating its own network of banks in 1990. However, by 1995, less than 2% of Quicken users were paying their bills using its proprietary network. "It's hard to get a whole culture to accept a new payment scheme," Intuit chairman Scott Cook would note. Microsoft, too, had envisioned using its original proprietary Microsoft Network as a key to dominating on-line banking. A proposed merger with Intuit was blocked on antitrust grounds, after which Microsoft abandoned its whole proprietary approach16 in favor of connecting its Money software customers directly to the Internet and offering banks themselves the server software needed to integrate their banking operations on the World Wide Web.17

While a range of competing standards for electronic payment over the Internet would vie for attention in the next few years, by 1996 a Secure Electronic Transaction (SET) protocol emerged, backed by Visa, MasterCard, a range of banks, and technology companies including Netscape, IBM, Hewlett-Packard and Microsoft. This effort was tied intimately to regional networking through the CommerceNet initiative. Wells Fargo would become the first bank to issue "digital certificates"--a way of authenticating the identity of customers--to businesses wanting to collect payments over the Net. Whenever a sale was made, a merchant using a Wells Fargo digital certificate would relay information on the sale to the bank, which would verify the purchaser's credit card over private networks, then relay that information back to the business instantaneously. Wells Fargo worked closely with Verifone Inc., the Silicon Valley firm best known for producing the merchant devices that "swipe" credit cards, to develop the tools to link merchants to Wells Fargo. Not coincidentally, the marketing director for Verifone's efforts was Cathy Medich, the former executive director of CommerceNet, who had been instrumental in pushing forward agreement between various firms around electronic payments. And in April 1997, Hewlett-Packard announced its buyout of Verifone for $1.15 billion, a way for HP to further integrate its networking technology into financial systems and strengthen its alliances with banking and other financial sectors.18 In a move analysts saw as paralleling HP's involvement in CommerceNet, Hewlett-Packard launched a new consortium called First Global Commerce with credit card companies, EDS and other partners to promote the range of online technologies needed for rapidly expanding e-commerce reaching directly into customers' homes.19 At the same time, Wells Fargo was also working closely with Verifone to create "smart cards" as a replacement for cash in a system called Mondex, using a test program in San Francisco to build up a critical mass of users.20
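
The authorization loop described above can be sketched schematically as follows; the function and field names are hypothetical, and the real SET protocol involves cryptographic certificates, signatures and message formats not modeled here. The point is the division of labor: the merchant presents its certificate and the sale details, the bank validates the certificate and verifies the card over its private network, and the result is relayed back to the merchant.

```python
# Schematic sketch of the merchant-bank authorization loop (hypothetical names;
# the actual SET protocol uses cryptographic certificates and signed messages).

def merchant_submit_sale(certificate_id, card_number, amount):
    """Merchant relays sale details to its bank for authorization."""
    request = {"merchant_cert": certificate_id, "card": card_number, "amount": amount}
    return bank_authorize(request)

def bank_authorize(request):
    """Bank checks the merchant's certificate, verifies the card over its
    private card network, and relays the result back to the merchant."""
    if not certificate_is_valid(request["merchant_cert"]):
        return {"approved": False, "reason": "unknown merchant certificate"}
    approved = card_network_verify(request["card"], request["amount"])
    return {"approved": approved}

# Stand-ins for the bank's internal systems.
KNOWN_MERCHANTS = {"wf-cert-001"}

def certificate_is_valid(cert_id):
    return cert_id in KNOWN_MERCHANTS

def card_network_verify(card, amount):
    return amount < 5000  # toy credit check over the "private network"

print(merchant_submit_sale("wf-cert-001", "4111-1111-1111-1111", 250))
```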

This elite-based approach to dominating global banking is in sharp contrast to the historical legacy of locally-oriented banks, especially the regional traditions of Wells Fargo's arch-rival Bank of America, whose role in the earlier age as a facilitator of the civic organization of economic life was probably unmatched anywhere in the country. The next section will explore the contours of that experience and the political context that shaped it, then turn to the way that technology pioneered by Bank of America itself began to undermine the very structure of regional development the bank had pioneered in California.

Bank of America and the Birth of the American Dream

A.P. Giannini, the founder of what would become Bank of America, achieved legendary status in his own lifetime, incongruously for a banker, as a populist hero against other bankers even as he turned BofA into the largest bank in the world. He was born in 1870, the son of Italian immigrants who had worked in the vineyards of the Santa Clara Valley that would house the semiconductor plants of a later generation. His ambition to start a bank came out of his experience working in his stepfather's produce business, where he watched banks, even those of fellow Italians, refuse to lend to poor, newer immigrants who needed help more than the established businesses on which banks focused their efforts. After a brief stint on the board of one small bank in San Francisco, he quit in 1904 to establish his own, the Bank of Italy, using the proceeds of his stepfather's produce business as initial capital. His first customers were drawn from his old contacts in the fruit and vegetable trade, people who had never been near a bank before--workingmen, fish dealers, grocers, bakers, plumbers and barbers. He pioneered making loans as small as $25, something no other bank did. His goal was explicitly to build the local economy and loyalty to his bank and to share in the growing prosperity of his clients as their expanding deposits circulated through his bank to new businesses.21

The first test of his vision came with the 1906 earthquake that leveled the city of San Francisco. As other bankers held back loans for six months to evaluate what was left worth investing in, Giannini hauled $10,000 in cash from the vault in the rubble of his bank, set up a plank on some barrels on the city's wharf, and reopened his bank immediately for business in the open air. He lent money to shippers to bring back lumber from the Northwest to rebuild homes and urged others to do the same, making him a local hero. The combination of his open loan policy and skillful self-promotion in newspaper advertisements would bring new attention and growth to his enterprise. He sought to expand deposits quickly, opening at night and on Sundays to accommodate the schedules of working families.

His next innovation was branch banking, a practice new to the United States and inspired by its success in Canada. The idea was to use local talent focused on local needs, while making available the capital reserves of a larger banking network--a radical departure from the big banks of the East Coast that lent money to far-flung corporate interests only from central offices. His first branch was in his hometown of San Jose, his second in San Mateo. He would also acquire banks in Southern California and the San Joaquin Valley. By 1928 he presided over 138 branches with $358 million in assets, but in each branch he retained local advisors and employees and drew upon local visibility to run day-to-day lending.

Giannini's expansion could simply be explained by the fact that California legalized branch banking while other states, notably New York, Texas and Illinois, severely restricted the practice. Yet similar fears of big banks in California could have, and almost did, lead to similar legislative restrictions on Giannini's expansion in the 1920s. Even the harshest critics of branch banking expansion, such as the state superintendent of banking Charles Stern, would grudgingly admit to Giannini "the unquestioned benefit your branch offices have brought to many specific localities."22 Combined with nativist attacks on Giannini's immigrant and Catholic background by non-branch bankers, those attitudes could easily have produced the same restrictions on expansion as in those other large states. Yet it was precisely because Giannini was not J.P. Morgan but had built his network around local investment in the bank itself and local economic development that he could marshal the political support to keep expanding. In fact, when Giannini lost control of the holding company for his banks in 1931 to managers tied to J.P. Morgan, he was able to take advantage of the banks' widely diffused stock ownership to mount a legendary grassroots proxy campaign literally up and down the state, rallying a majority of the 200,000 shareholders to wrest back control of his bank. What was clear was that the policy dynamic in every state was one of attempting to focus banks on local economic development, and Giannini's expansion was based on innovating a system that accommodated that public policy regime. In opposition to most large banks, Giannini would strongly support Roosevelt's New Deal, including the Banking Act of 1935. Despite that support, the same New Deal administration would frustrate Giannini's desire to integrate scattered holdings of banks in other Western states directly into Bank of America--reinforcing the focus of the bank on regional development of California.

It was in that regional development role that Bank of America would shine and prosper. The bank would become a key source of startup capital for the largely immigrant-run movie studios of Southern California, an industry largely shunned by other banks, and would use its branches around the state to track the needs of different agricultural sectors to supply capital as needed for regional investments and to stave off various agricultural disasters. The supremacy of local branches would keep the focus on small stores, homebuyers and the growing consumer market for loans--all tying together Giannini's ideas about integrated economic development. By the end of the 1930s, Bank of America was handling over 600,000 small loans each year and would extend startup loans to industries ranging from jewelry to oil wells to furniture to lumber to textiles and clothing. Giannini had almost single-handedly nursed local wineries through Prohibition and would continue to support the expansion of the wine industry throughout the post-Prohibition period. When World War II came, the bank would be a major force in supporting wartime conversion and organizing local civic economic planning. Branches of the bank would organize local pools of manufacturers to apply for wartime contracts and federally guaranteed loans. On the broader regional level, Bank of America strongly supported industrialists like Henry Kaiser in building the new industrial base of the region.

With the end of the war, Bank of America was at the forefront of the new consumer-led boom in credit and spending. This boom was backed by the federal government through a combination of subsidies and a regulatory regime that favored bottom-up, consumer-led growth. The subsidies came overwhelmingly in the form of Federal Housing Administration loans, supplemented after the war by GI loans to veterans. Bank of America alone would extend over $600 million in GI loans and would lead the new mass market in housing and the financing of homes.

More importantly, New Deal regulation would steer banks away from the speculative investment that had led to massive bank failures in the Depression and toward bank services for the broader public. Tied to the separation of different financial institutions serving different financial functions and a ban on interstate banking, the heart of the regulations was Regulation Q, which limited the interest rates paid by banks to depositors. This was linked to anti-usury laws at the federal and state level that in turn limited the interest rates banks could charge borrowers. This set of regulations was passed to prevent banks from overextending themselves competing for deposits, a problem that had contributed heavily to the collapse of banks at the beginning of the Great Depression. Since most competition for deposits would have been focused on upper-income depositors, set rates of interest leveled the playing field between richer and poorer bank customers and kept basic bank fees relatively low. This system fit with Giannini's view of a world of small depositors, and his bank would thrive on those depositors in the post-war period as the largest bank in the world. Fitting with his vision of sharing wealth, Giannini died with an estate worth less than $1 million, most of which he left to a charitable foundation--a minuscule estate for a man who had built from scratch a bank with more than $6 billion in assets.

While Giannini's legacy would live on for a number of decades, new technology combined with a political assault on the New Deal regulatory regime would undermine this whole system of subsidies and uniform bank rates that had benefited both regional development and low-income bank customers. A new generation of Bank of America managers would embark on a global banking vision that abandoned much of the community-oriented banking that had been its lifeblood, even as its regional rivals would join it in both the technological and economic leadership of this emerging new era.

How Bank of America Created the Credit Card--and saw it destroy regional consumer credit

In the post-war period, Bank of America had embraced installment credit to finance a range of new consumer purchases, especially auto loans, where at its peak Bank of America made 85% of such loans in the state. Partly to ease the burden of endlessly evaluating so many individual small loans, Bank of America in the 1950s began exploring the possibility of an all-purpose credit card. Individual companies like gas stations and Sears had introduced company-specific charge cards and a few small banks had tried experiments, but Bank of America would be the first to build a multi-vendor card in wide use. It would build its BankAmericard credit card system (what would eventually become the Visa card we know today) and set the standard for all the other credit cards in use.

In launching the card, the bank faced the dilemma that it needed to create participation by store owners as well as customers, with the catch-22 that customers would only want the card if merchants accepted it, and vice versa. To create this integrated credit system, Bank of America solved the problem in 1958 by mailing out 60,000 credit cards to every resident of Fresno who did business with Bank of America--the city chosen because of its high concentration of customers. (This strategy anticipated the web browser/server division in networking that Netscape would exploit years later through similar Internet "drops" of browser software in order to sell server software to businesses.) With so many cards suddenly in the hands of residents, most merchants, especially the smaller ones, signed up, since the card relieved them of the burden of tracking the small credit accounts they ran for customers all the time. The credit card was made possible by the new computers used by Bank of America, which could alert the bank when cardholders were spending more than their limits allowed. Because of this, it was natural for the credit card to emerge in the technology center of Northern California. Essentially, the credit card transferred the work of small store bookkeeping into the offices of Bank of America and its computers. In 1956, Bank of America had introduced the first electronic check-handling system, MICR, developed by BofA and the Stanford Research Institute. The first fully automated checking account system would follow in 1959, the same year that, with the Fresno success, Bank of America made similar credit card "drops" to two million Californians around the state, using its regional concentrations to leverage the adoption of the credit system city by city. New technology was launching a revolution in how information about financial transactions would be tracked from this point onward.23

Heavy initial financial losses in some areas (especially Los Angeles) due to careless accounting and loose card distribution actually helped the bank by scaring competitors away from the credit card field; by the early 60s, Bank of America was making big profits on a California credit card market it had to itself until 1966. With other companies readying to enter the field both in California and across the country, Bank of America began franchising the BankAmericard (as it was called then) to banks in other states; each could sign up local merchants and reap the profits regionally for a nominal fee to BofA. BofA was happy with this franchise approach: the more universal the card, the more likely California residents would be to want one for travel, and the more likely out-of-state tourists would be to make BankAmericard purchases from California merchants signed up with Bank of America. In this way, each bank could grow its own regional market while taking advantage of a more universal credit card system.

Four of BofA's California rivals would form MasterCharge in 1967 (which would merge with other national banks to become the MasterCard system), and the next three years would see an explosion of credit cards as over 100 million were mailed to customers across the country. Abuses and fraud in the confusion led to a federal ban in 1970 on the practice of mailing physical cards without the customer's permission, but by that time credit cards had become a permanent national fixture.

The system used by BankAmericard banks to approve purchases on each other's cards had become cumbersome and unwieldy by 1970. Few banks were happy that Bank of America, a potential competitor even at its most well-behaved, was running the information exchange system. For these reasons, Bank of America turned over control of the credit card approval system in 1970 to a new organization called National BankAmericard Incorporated (NBI), which was eventually renamed Visa in 1976. NBI would create a non-proprietary exchange system that could be trusted by all involved, a key step in creating a universal financial network. What was clear was that computer technology was needed as the backbone of the new credit card system. Technology could make a complex set of financial transactions between banks simple and invisible to the average user, creating the trust needed for wide use of the credit card. Dee Hock, who left BofA to run the Visa system, set up two monster mainframes in San Mateo, California, to run the national network of computers in the NBI system, managing 200 million transactions in 1974 and nearly 6 billion by 1992. Replacing a cumbersome system of phone calls, NBI drew on technological advice in the Bay Area to create a system costing only $3 million that would save member banks $30 million in its first year of operation. This first broad-based computer network of banks would soon extend to computer devices at individual merchants--the broadest computer network in the US until the advent of the Internet.

With a technological jump on rival MasterCard's Interbank system, the BankAmericard network would catch up quickly on the larger number of banks in the MasterCard camp. But the very success of the BankAmericard system would open opportunities that would soon undermine the regional credit systems of all the banks involved. In the 1970s, Citibank had emerged as the main rival to Bank of America in size; Citibank had even set up an R&D facility in California to draw on the same technological resources available to its rival in innovating new consumer uses of the technology. Citibank had been a longtime member of the MasterCard credit card system, but when lawsuits forced the NBI system to allow banks to issue both BankAmericard and MasterCard cards in 1975, Citibank soon signed up as a member of NBI. And when BankAmericard changed its name to Visa in 1976, Citibank took advantage of the confusion to mail a staggering 26 million pieces of direct mail telling people their "new" Visa card was waiting for them, never mentioning the word "Citibank" as the company grabbed 3 million Visa card customers from regional banks. This one campaign made Citibank the second largest Visa card issuer after Bank of America and, combined with its MasterCard accounts, made Citibank the largest issuer of credit cards in the world.

In one fell swoop, Citibank had used the technological system Bank of America had built with the support of various regional banks to leave its New York base and create a national system of consumer lending. It had steamrolled over the local systems of lending that each of the regional banks had built over decades. These actions would, in the words of financial writer Joseph Nocera, make Citibank among other bankers "the world's most hated bank" and cause bankers "[to spew] unbankerly venom at an institution they had all come to despise."24 Much as Netscape had used technological networks built up by others to steal control of the Web, Citibank had used the Visa technological network to steal customers from the regional banks around the country that had laboriously built the social capital of trust in each regional part of the broader Visa network.

Citibank had been able to bypass the investments needed to build local lending networks itself in favor of wholesale marketing to the already existing customers of the credit card network; by the mid-80s it would emerge as the largest bank in the nation, with one out of five American families having some kind of banking relationship with Citibank. However, because of Citibank's deception and fears that something similar would happen with any new network, when automatic teller machines were introduced, Visa banks refused to integrate them into the Visa system. They preferred proprietary ATM systems that would give them greater control over who could use their ATMs and make sure each bank could collect its fees. After a decade of regional banks building trust in open non-proprietary systems of financial information exchange, Citibank had undermined trust in the network and set back movement toward an integrated financial information system until the Internet appeared two decades later.

How Silicon Valley technology and national political muscle undermined regional banking

Citibank was hardly alone in using the new technology to grab customers from traditional regional banking. A combination of Silicon Valley technological knowledge and political lobbying (which Citibank was one of the few banks to support) was eroding the traditional banking system in the United States. In the 1970s, when inflation was eroding the value of personal savings, the alliance between technology and upper-income investors looking for a better deal for their money became a political steamroller that would wipe away decades of bank regulations. The result would be devastating for lower-income savers in the form of the savings and loan disaster, higher banking fees for average depositors, and the closing of branch offices across the country.

On the West Coast, one of the first money market funds was launched by a San Jose analyst named James Benham, a Merrill Lynch broker who quit to create a fund investing in nothing but government Treasury bills in order to help consumers get a better interest rate than at their local bank. Joined by Henry Brown and Bruce Bent of the small Manhattan investment firm Brown & Bent, Benham in 1972 would use deceptive filings with the Securities and Exchange Commission (SEC) to get approval for his new unregulated money market fund. Their fund could now compete directly with the banks--something that violated the decades-old intention of the Regulation Q restrictions on interest rates. Computer software would be used to make these new bond funds mimic the actions of a bank account, further blurring the difference between money market funds and savings accounts. Such funds would explode from $1.7 billion in assets in 1974 to $200 billion by 1982.

It was in this period that SRI, now independent of Stanford University and looking for corporate clients, took on a $1.5 million study of the financial industry funded by over fifty corporate sponsors. The most surprising result was how conservative families were with their savings: the typical "affluent" savings account had over $40,000 in it, while more average people (those making $15,000 per year) had roughly $10,000 in savings. Savings and checking accounts amounted to 16 percent of total household assets, just behind real estate. Money that was then routinely recycled through banks, mostly into local regional investments, was seen by the SRI study as a prime target for companies investing in global capital markets.25 One of SRI's first major clients was Merrill Lynch, for whom SRI in 1975-76 helped create the Cash Management Account (CMA), which tied money market accounts to stocks, bond funds, checking, and credit cards. The system was incredibly complicated technologically and required millions of dollars invested in new computer systems, but the result was a simple instrument that could attract the savings of the most affluent bank customers. CMA accounts at Merrill Lynch would eventually swell to $250 billion in assets.

One other major Bay Area contribution to the emerging deregionalization of financial assets was the San Francisco-based discount brokerage firm Charles Schwab, which computerized its operations years before other Wall Street firms. By electronically updating each customer's trades, Schwab could offer deep discounts to a new class of upper-income families entering the stock market. And Schwab's success in the Bay Area would be due not only to the technological know-how the firm could tap but to the new class of "computer jocks" from Silicon Valley being paid in stock options. This new breed of equity owner would need to trade their shares over time, creating the critical mass of a market to launch Schwab's kind of firm.

As much as the regional banking system was undermined by non-bank institutions using the new technology, the regional banks themselves contributed to its decline through their abandonment of their local markets and branches. Bank of America by the 1980s would become a symbol of this decline as it teetered on the edge of bankruptcy and jettisoned almost every mark of its Giannini tradition. A new generation of corporate-oriented bankers had entered its career ladder in the late forties following Giannini's death, and by the end of the 1960s, Giannini's prized bank branches were losing out to the ambitions of the new bank leaders. Individual branches no longer handled corporate accounts, weakening the ability of the bank to work closely with rising startups and new industries as the bank had for decades before. Aside from a few investment forays, such as an ill-fated alliance with Memorex, Bank of America would miss most of the opportunities for investment in Silicon Valley in the 60s and 70s as its new corporate leaders sought to compete with non-bank financial institutions in global markets. Instead of recycling regional capital between home mortgages and local business startups as it had in the past, Bank of America would spend much of the 70s trying to recycle Arabian petrodollars into third world loans.26

In this context, the rise of venture capital in the region appears less as a bold innovation than as a mutation of that older Giannini entrepreneurial tradition, albeit a mutation that no longer cycled capital between rich and working families in the state but rather cycled superprofits between elite investors and soon-to-be millionaire "computer jocks" who would invest their proceeds in new brokerage firms like Charles Schwab. This evolution has split the whole process of business creation off from other regional economic development into a separate elite financial world.

Bank of America was becoming even less interested in its traditional mortgage business and was persuaded in 1977 by traders at Salomon Brothers (who would pioneer mortgage trading on Wall Street) to package its home loans as bonds to be sold by Salomon Brothers. This was the first private issue of mortgage securities in the country, a practice which by the 1980s would help lead to the wholesale conversion of local mortgages into over $150 billion in bonds by 1986. It was largely this speculative trading that would help bring on the savings and loan debacle by the end of the decade, a debacle that would hit California especially hard as ground zero for a disproportionate share of S&L failures.27

One other mark of Bank of America's disengagement from its own regional market by the 1970s was its antiquated computer systems, by then a haphazard combination of near-obsolete IBM mainframes connected by a Babel of software programs unable to talk to each other. Unbelievably, it was the London office that had to pioneer the bank's computerized information system for its global investments. When Bank of America finally updated its computer systems in the early 80s, it was in the context of seeking to leverage the $1 trillion a day of transactions moving through its electronic networks into an (ultimately unsuccessful) proprietary fee-for-service electronic network to service other banks. Gone was the tradition that had built the open BankAmericard electronic standard on the assumption that rising use would strengthen the California economy (and thereby Bank of America); now, the explicit view of the bank leadership was that direct loans were losers and the real money was to be made in fees from elite customers, especially other banks. This was also the period when Bank of America, previously a laggard in the deployment of ATMs, would expand its ATM network into the largest in the state even as local branches began to be closed, 134 in 1984 alone. In 1985, Claire Hoffman, the last remaining member of the Giannini family still on the BofA board of directors, resigned her position and publicly denounced the bank's abandonment of its role as a "corporate trustee of great public purpose" and of its human relationship with customers in favor of technology.28 It would be left to the next decade for Bank of America and, even more enthusiastically, its old elite rival Wells Fargo to reconstitute regional civic economic coalitions around the new model of using region-based electronic standards to leverage global economic dominance.

What is worth emphasizing is that while the new Silicon Valley technology made all these changes possible, it was political changes that legitimated them and even pushed them forward. The SRI studies had shown how conservative most people were in their loyalty to their bank accounts, so it was not overwhelming demand that pushed the new financial systems forward. But once money market funds slipped into the marketplace, the federal government left them unregulated, giving them a decisive advantage over traditional banking, which was restricted in the interest rates it could offer. State governments attempted to apply their own regulations but were met with a tough political response by the new investment firms, which could mobilize their elite base of customers to attack the Regulation Q regime of equal interest rates for all consumers. The state of Utah became the test case when it sought to apply its banking regulations to money market funds in 1980; Merrill Lynch and its fellow money market funds launched a $1 million lobbying campaign to defeat the attempt and scare other state governments off similar legislation. And when New York state refused to lift its usury laws, which limited the interest rates that could be charged consumers, Citicorp moved its credit card operation to South Dakota after that state wrote tailor-made legislation for the company abolishing usury laws there. With Supreme Court decisions making the home state of a banking operation the relevant legal standard for customers from other states, this move by Citicorp effectively abolished usury laws across the country, as Maryland and Delaware followed South Dakota's example. Federal legislation in late 1980, which deregulated bank interest rates and freed savings and loans from a range of regulations, would just push the ongoing financial revolution forward. With Merrill Lynch's CEO, Don Regan, assuming the Treasury Secretary position in the incoming Reagan administration, the elimination of the regional banking regime of the New Deal would be completed.

The conversion of local family mortgages into speculative bonds was not a natural event, but one based on legislative changes at the federal level. That first sale of Bank of America mortgage bonds would become a precedent for the wholesale conversion of the sleepy regional mortgage business, once the glue of "fixed capital" in the shared self-interest of regional growth coalitions, into a key part of global speculation. The sale of such bonds was legal in only three states before federal legislation allowed it everywhere. With interest rates uncapped by 1980 legislation and local banks hemorrhaging with old loans, a new tax break in 1981 would pay savings and loan thrifts to sell their mortgages on this new global market. As Michael Lewis, a Salomon trader in the 1980s, would write in his book Liar's Poker: "On September 30, 1981, Congress passed a nifty tax break for its beloved thrift industry...Wall Street hadn't suggested the tax breaks, and indeed, [Salomon's] traders hadn't known about the legislation until after it happened. Still, it amounted to a massive subsidy to Wall Street from Congress...The market [in mortgages] took off because of a simple tax break."29

The Collapse of the Regional Subsidy System for the Poor and Working Families

While some transformation of the financial system was inevitable with the power of the new technology, the abandonment of regional banking systems was a product of political rather than technological sanction. The success of the early electronics-based money market funds was based on the high margins possible in dealing almost entirely with high-income depositors, while leaving the higher-cost low-income depositors to traditional banks. Such "cream skimming" operations had specifically been blocked in the past by Regulation Q, which mandated equal interest rates for all depositors, thereby blocking competition on rates that would overwhelmingly benefit upper-income savers. It was the relaxation, then outright elimination of those regulations that refocused banks away from investment of regional deposits--which had encouraged the continual expansion and deepening of those regional markets--toward a model based on fee-based transactions in the global financial market, a model that focused on upper-income individuals where the margins on fees are the highest.

For individuals, the new technological model of banking has the potential to have banks tailor services to their needs (especially in the case of upper-income savers) as well as the more sinister potential to have banks tailor fees and loan offers to extract the most money possible out of each transaction. By using massive computer databases detailing customer characteristics and purchasing information, along with records of how the customer responds to telemarketing or electronic solicitation, banks can use repeated offers of different services to set a price for each individual for different kinds of transactions. Using this information, banks have a leg up in knowing when, in the words of one bank analyst, a bank's "prospect is likely to need a particular financial product and [when] to present an offer for that product at the right time."30 For society as a whole, this presents the danger of banks and others with whom people conduct day-to-day financial transactions distorting financial markets through privileged access to customer information. With direct knowledge of customer buying patterns, they will be able to undercut other competitors based not necessarily on lower prices but on knowing when to deliver a bid to a customer and when that customer is likely to accept a higher-priced offer. Communications professor Oscar Gandy Jr. observes that banks are approaching the position of "perfect discrimination where the organization captures not one penny less than what a customer might be willing to pay. The consequence is a market that becomes increasingly inefficient as monopolization expands."31 The banks most committed to electronic commerce and smart cards, like Wells Fargo, see little threat in the loss of their revenues from traditional deposit fees since they can make up the difference through increased monitoring of consumer buying habits as smart cards substitute for the use of cash. Wells Fargo's VP Hruska argued that banks now had opportunities either to sell this new massive amount of data on customers to other corporations or to use it for targeting their own in-house products to smart card users.32
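To make the mechanism concrete, the sketch below (in Python) illustrates the kind of individualized offer-pricing logic described above. The customer fields, the acceptance model and the rates are hypothetical illustrations, not a description of any actual bank's system.

# Illustrative sketch only: how a bank might use stored response data to pick
# the highest-priced offer a given customer is still predicted to accept.
# All field names, rates, and the acceptance model are hypothetical.

def predicted_acceptance(customer, rate):
    # Stand-in for a model trained on past solicitations: customers who have
    # accepted high rates before are assumed willing to accept slightly more.
    past = customer.get("accepted_rates", [])
    baseline = max(past) if past else 0.08
    return 1.0 if rate <= baseline + 0.02 else 0.2

def best_offer(customer, candidate_rates):
    """Return the most profitable rate this customer is predicted to accept."""
    acceptable = [r for r in candidate_rates
                  if predicted_acceptance(customer, r) > 0.5]
    return max(acceptable) if acceptable else None

customer = {"accepted_rates": [0.129, 0.139]}              # hypothetical history
print(best_offer(customer, [0.099, 0.129, 0.149, 0.189]))  # -> 0.149

The point of such logic is exactly what Gandy describes: each customer is quoted the highest price the data suggests they will tolerate, rather than a single posted price.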

The deeper fear over the new technology is that banks will have even more detailed information on which customers to avoid altogether. Banks have been closing physical branches where they think customers are too expensive to service and using higher fees to push such customers to use ATMs and other alternative services. The next step has been to evaluate not just groups of customers but each individual customer to determine their overall profitability to the bank. Only with recent technology and the internal networking of different departments within a bank have banks been able to tally whether the net gain in fees, interest payments and use of a customer's deposits pays for the costs of servicing that particular customer. Most bank analysts estimate that only about 20 percent of customers actually make money for a bank, while three out of five customers cost more to service than they generate in revenue. Many banks are hiking fees across the board in the expectation that many customers, especially low-income and less profitable ones, will close their accounts altogether. If a more desirable customer complains, however, their fees are waived automatically by tellers who have been alerted to which customers to nurture.33 In the end, the same services end up being priced differently for different customers, with higher fees being charged in many cases to the poorest customers in order to drive them out of the banking system altogether. The new Internet-only banks are able to offer higher interest rates partly because of their lower costs, but partly, as they acknowledge, because the Internet itself limits their customers to mostly well-off consumers. "We feel we are getting the cream of the crop from the Internet," argued one Internet bank executive in 1998.34
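A back-of-the-envelope version of that per-customer tally might look like the following sketch, in which every category and dollar figure is hypothetical:

# Illustrative sketch: the per-customer profitability tally described above.
# All figures are hypothetical annual amounts for a single account.

def annual_customer_profit(fees, interest_margin, avg_deposit,
                           deposit_spread, servicing_cost):
    """Net gain to the bank: fees, plus loan interest margin, plus the value of
    re-lending the customer's deposits, minus the cost of serving them."""
    deposit_value = avg_deposit * deposit_spread
    return fees + interest_margin + deposit_value - servicing_cost

# A low-balance account: $60 in fees, no loans, an $800 average balance re-lent
# at a 3% spread, and $200 a year in branch, teller and statement costs.
print(annual_customer_profit(60, 0, 800, 0.03, 200))       # -> -116.0, a money loser

# A higher-balance account with a mortgage relationship attached.
print(annual_customer_profit(120, 900, 15000, 0.03, 250))  # -> 1220.0

Once every account can be scored this way, fee schedules and branch closures can be aimed precisely at the accounts that come out negative.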

As banks market to new customers, especially in areas like loans and credit, the implications of using information technology become even clearer. With the right databases, banks will make sure they don't solicit business from lower-income and "less profitable" customers. As an analyst wrote in Bank Marketing, "At this point, the concept that information is power becomes very clear. Competitors that gain access to valuable consumer information aren't going to go after your unprofitable customers--they have enough of their own."35 This will expand redlining in minority and poor communities even more broadly, albeit with a bit more selectivity to avoid the most blatantly racist patterns. But as Gandy argues, even if discrimination in failing to offer services or in offering higher prices for the same service is focused on groups developed through "multivariate clustering techniques", it is discrimination nonetheless. In fact, Gandy sees the discrimination as more invidious since those being discriminated against will lack the information to know how they have been grouped or to easily gain the collective consciousness needed to struggle against this "categorical vulnerability."

The irony of "deregulation" is that it has opened an era of constant tinkering with bank laws and the jurisdictions of various financial entities. The federal government has even occasionally made half-hearted attempts to pursue the same regional financial goals of equity and regional civic economic cooperation that "deregulation" itself has undermined. Even as the government changed the laws that had supported regionally-focused banks, it has enacted new ones to deal with the consequences of failed savings & loans, fiddled with the tax code to spur venture capital to replace lost local business loans, and struggled with the capital flight from crumbling inner cities. The greatest irony was that just as federal government banking rules and technology were undermining regions as relevant lending units for banks, the Congress in 1977 passed the Community Reinvestment Act (CRA) to scrutinize banks even more tightly on the fairness of their lending practices within regions. But if the general banking rules were not naturally channeling deposits back into the region where the deposits were made, the CRA legislation's attack on capital flight from the inner city became as natural a development of the new era as attempts to promote venture capital through capital gains tax cuts. Each was a jerry-rigged attempt to support regional capital accumulation in a new era of increasingly nonregional banking.

CRA regulations would never require that as much be invested in inner-city minority communities as was deposited by their residents. However, they would create an increasing paperwork surveillance of bank actions that would, if banks were found deficient in attention to community needs, slow approval of bank expansions, leading some banks to consider dropping their bank charters purely to escape CRA regulations. No such expansions were ever actually blocked under the act, but at least two banks--Atlanta's Decatur Federal Savings & Loan Association in 1992 and DC's Chevy Chase Federal Savings Bank in 1995--were forced to sign consent decrees committing funds to minority areas based on evidence of discrimination. Those decrees targeted the banks because of historical traditions of where they made loans, making such a regional history a liability for banks.36

When Bank of America merged with Security Pacific Corporation in 1991, it was forced to agree to a goal of $12 billion in loans targeted to communities covered by the Community Reinvestment Act, a total loan volume of 9% of expected new consumer loans (significantly, excluding credit card loans). Similarly, when Wells Fargo merged with First Interstate, it pledged $45 billion in CRA loans over a ten-year period.37 However, with much larger regions over which to spread those loans and with banks using the new technology to market to specific, multivariate categorical groups (to use Oscar Gandy's phrase), the traditional CRA tools for analyzing compliance using census tracts may fail in the face of the statistical power used to create targeted marketing and services. At the deepest level, though, the rise of national marketing of banking outreach over the Internet is undermining the traditional basis for even holding any specific bank responsible for reinvestment in any specific community. However hard it is to evaluate discrimination in the limited regional reach of community banks, it becomes nearly impossible to analyze national marketing and loan approval processes given the sophistication of statistical grouping by banks. When a bank markets over the Internet, what region gets defined as "their" region? When bank executives proclaim the end of regions, they have a strong self-interest in doing so, since the disappearance of regions basically erases most of the political regulations tied to regions that empower communities to demand resources from them. In the end, these billion-dollar CRA commitments by Bank of America and Wells Fargo ended up being gladly paid by the banks as the price of disentangling themselves once and for all from regional responsibilities as they merged their way into global markets.

When Bank of America was bought out by NationsBank in 1998, most of the bank executive functions would be transferred to NationsBank's headquarters in North Carolina, but the combined bank's corporate lending offices would remain in San Francisco. The intimate connection to the California consumer that had been Giannini's creed was now disposable, but the connection of the bank to the region's financial and technological elites was seen as indispensable - a symbol of the collapse of cross-class regional collaboration in favor of the new community of elite business relationships in the region.

Electrifying the Internet: The Marketization of Electric Utility Networks

If bank capital has been the traditional circulatory system that has fueled regional economic growth, power utilities have supplied the energy at the base of industrial expansion, just as that expansion has in turn fueled growth and jobs in regional utilities. Around the world there has been a clear correlation between the consumption of energy, especially electricity in modern economies, and the growth in national incomes. There has been some erosion in this correlation in recent decades as less energy-intensive information-based industries have become more important, but the critical role of energy supplies in the growth of regions remains unabated.38 With $200 billion in retail sales of electricity alone to business and residential customers, energy still plays a massive role in the economy.39

However, what is changing at a rapid pace is the way a market in energy transactions is replacing traditions of public service utilities that were once the "growth statesmen", in the words of John Logan and Harvey Molotch, of local regional growth and universal access.40 Instead, the 90s saw an accelerating trend toward the "deregulation" of electric utilities--the quotes around "deregulation" reflecting the complicated increases in regulation needed to replace the broad functions of the regionally-based utility system with a whole new, much more complex market system of national energy exchange. In the process of this change to a national system of energy distribution, dynamics paralleling the change to a national banking system are emerging: a dependence on new technology to facilitate the change, the replacement of local regulation by national regulation, particular regional dynamics of innovation fueling new national initiatives, and a rise in inequality in the new national market. Universal energy rates are increasingly abandoned in favor of deals tailored to larger, more profitable customers who are cheaper to service, in an era where marginal profit rates replace the fixed rate returns of earlier times.

A key part of new national regulation that facilitated these changes was tied to the new information technology, particularly the Internet itself. In fact, even as the media was hyping the initial hundreds of millions of dollars of commerce being conducted over the Internet by the end of 1996, the federal government and the utilities were quietly launching $50 billion of wholesale electricity sales onto the World Wide Web. Mandated by the Federal Energy Regulatory Commission (FERC) as part of the industry's deregulation, the new Open Access Same-Time Information System (OASIS) would become the marketplace for utilities to reserve energy from producers across the country. The OASIS Internet system would guarantee open access to information about transmission capacity, prices and any other data critical to making energy purchases or sales.41

With a system compared to an airline reservation system, each utility (or, in the new welter of acronyms, Transmission Provider) is required to continually update the total transmission capacity of its individual area while listing the available transmission capacity (ATC) at any moment. Any producer of energy may request a "seat" on a utility's electricity grid from one point to the next, possibly across as many as a dozen grids to a final destination. This request is affected by thousands of other energy producers attempting to place similar transmission reservations, all asking for a similar "seat" to get the best price at the right "departure" time. The continuous nature of power distribution makes the real-time aspects of Internet information exchange critical to the whole system as power producers seek to sell their energy to utilities, which in turn will retail it to their consumers (or will have multiple "power marketers" competing to offer it to consumers). In order to enforce fair competition, utilities are being forced to separate their functions into three kinds of divisions: retail marketing, power production, and transmission provider functions. Utilities are now required to provide transmission information to their other divisions in exactly the same way they do to competitors of those divisions, and must post to OASIS and charge themselves the same way potential customers would be charged.42
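As a rough illustration of that reservation flow, the following Python sketch shows a transmission provider posting capacity for a path and sellers requesting "seats" against it. The real OASIS postings carry far more data elements than this; the class, field and customer names here are purely illustrative.

# Simplified sketch of the OASIS-style flow described above: a transmission
# provider posts available transmission capacity (ATC) for a path, and power
# sellers request reservations against it. Names and numbers are illustrative.

class TransmissionPath:
    def __init__(self, provider, source, sink, total_capacity_mw):
        self.provider = provider
        self.source, self.sink = source, sink
        self.total_capacity_mw = total_capacity_mw
        self.reserved_mw = 0

    def available_mw(self):
        # Posted continuously so all marketers see the same number at the same time.
        return self.total_capacity_mw - self.reserved_mw

    def reserve(self, customer, mw):
        """Grant a 'seat' on the grid if capacity remains, otherwise refuse."""
        if mw <= self.available_mw():
            self.reserved_mw += mw
            return {"customer": customer, "path": (self.source, self.sink),
                    "mw": mw, "status": "CONFIRMED"}
        return {"customer": customer, "mw": mw, "status": "REFUSED"}

path = TransmissionPath("UtilityA", "PlantX", "CityY", total_capacity_mw=500)
print(path.reserve("MarketerOne", 300))   # confirmed
print(path.reserve("MarketerTwo", 300))   # refused: only 200 MW remain
print(path.available_mw())                # 200

Because every marketer is working from the same continuously updated posting, the timing and accuracy of that information, rather than any physical asset, becomes the competitive terrain.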

The motivation for using the Internet, embodied in what is known as FERC Order 889, was to create standardized access to information with no time-based advantages for any competitor. All utilities with transmission capacity would now be required under the rule to post a common set of data about that capacity on the Net in consistent data formats with common transmission protocols.43 To accomplish these goals, national regulation of electricity has increased as utilities have been forced for the first time to make firm calculations on a continuous basis of what their transmission capacity is at any time, calculations that were unnecessary in the past when most of that capacity was tied to long-term internal power production. The "unbundling" of different utility divisions has required greater and greater scrutiny by the FERC to determine exactly what information needs to be available to competitors to assure equal access. In a sense, each decision by the FERC is a decision on what an ideal marketplace for energy should look like. Each regulation is a choice from a range of possibilities to create the information standards that will yield the "ideal" market result, an ideal that has been and continues to be sharply debated as lobbyists from all parts of the industry line up over each phase of the FERC's mandates around creating an energy marketplace.

The creation of a market involving energy utilities dates from 1978 when Congress enacted the Public Utility Regulatory Policies Act (PURPA). PURPA slowly expanded wholesale competition in the sale of electricity by requiring the interconnection into the utility grid of small energy producers, especially "cogeneration" plants (energy produced in the course of other industrial activity). More recently, Congress enacted the Energy Policy Act of 1992 that enhanced FERC's authority to order wholesale "wheeling", the industry term for equal access by all comers to each utility.44

It was also in 1978 that natural gas prices were deregulated and a gradual marketization of the gas pipeline system was instituted, and it was that earlier experience with natural gas contracts that deeply influenced the FERC's movement to the Internet. After years of customers simply purchasing natural gas from the nearest pipeline, the FERC had in its 1993 Order 636 mandated the "unbundling" of marketing and distribution of natural gas, thereby encouraging an explosion of new agreement systems, including electronic bulletin boards that allowed on-line contracting to make purchases of natural gas from distant sources much easier.45 Those with excess capacity from long-term contracts were mandated to post that capacity on the electronic bulletin boards to be resold to the highest bidder. However, while there was some standardization mandated by FERC around information exchange standards promoted by the American National Standards Institute (ANSI), proprietary systems created by different companies had balkanized the industry into conflicting approaches that made information exchange nearly impossible (and extremely expensive) for many participants. With every business transaction requiring from 13 to 30 data elements to compare and evaluate bids, one major distributor of market data on the gas industry, TransCapacity, estimated that the industry spent over 750,000 man-hours each month on transaction information due to the repeated data entry needed to deal with different systems. A trade group representing buyers of natural gas, the Process Gas Consumers Group, described at a 1995 FERC hearing how impossible it was for casual consumers to get accurate information: "Someone not full-time in the natural gas industry has a hell of a time keeping on top of the EBBs [electronic bulletin boards]." Industry working groups, under pressure from FERC for standardization, had a year earlier created a Gas Industry Standards Board, but the complexity of gas transactions and the proprietary information systems had created so much information fragmentation that the industry was worried that many customers might turn to electricity as an alternative. By 1996, the FERC began detailing proposals on how gas companies would have to standardize business transaction information even as many industry analysts began eyeing the Internet as a better alternative.46

These later proposals explicitly cited the electric industry which, moving more recently toward wholesale market competition, had been mandated by a December 1995 FERC decision to move its information exchange to the Internet. FERC had been urged to move in this direction by the Western States Power Pool (WSPP), a group of utilities, federal power marketing agencies and rural electric cooperatives in twenty-two Western states and British Columbia. WSPP had begun using a computer bulletin board with standardized software for selling power between the utilities and saw the Internet as a good next step. With much more complicated transactions looming as the previous cooperation of utilities representing fixed geographic markets gave way to more direct competition, the WSPP saw mandated use of the Internet as the best way to continue successful energy pooling with an expanded set of participants. FERC had established technical task forces on how to implement information exchange, and the Palo Alto-based Electric Power Research Institute (EPRI), the electric utilities' research consortium, proposed using the Internet with communication standards EPRI had developed tied to the Internet's TCP/IP standard.47 By mandating Internet information standards for electricity before competition entered regional electricity markets, the FERC would avoid the problems of the natural gas industry, where already existing competition had undermined free information exchange.

While the technology centers of the West Coast played a strong role in contributing both support and expertise to the movement of the electricity market onto the Internet, it was the traditional energy centers in Texas that would leverage that tradition into a central role in the new electricity market. Houston-based BSG/ITT Inc., a computer system integrator for the energy industry, was able to convince utilities representing half the transmission capacity of the country to hire the company to establish a Web site to integrate their Internet energy transactions. The collection of mostly East Coast and Midwest utilities, called the Joint Transmission Services Information Network (JTSIN), specifically hired BSG because of its experience with information systems in the natural gas industry and its knowledge of what to avoid in proprietary systems. "The industry in aggregate," argued JTSIN co-chair Jeff Geltz, "would have spent a ton of money if they had gone off in their own directions and tried to develop their own OASIS programs. Not only that, they would have ended up with the Tower of Babel syndrome they have in the natural gas industry." Using security software from Tradewave Inc., an Austin-based company, BSG's JTSIN system was designed under FERC mandate to give all customers and utilities access to information that was as equal as possible using just a Web browser and normal 28.8K modem access speeds. Much as technology investments in Silicon Valley led to a strengthening of the position of Bay Area-based banks working with those companies, high tech investments in Austin have helped to engineer a similar strengthening of energy-related companies in Texas as energy sales have gone high-tech and an estimated $25 billion a year of energy commerce flows through BSG's JTSIN Web nodes.48 Companies like Dow Jones' Telerate division and an IBM/Siemens collaboration would divide the rest of the market for integrating other electricity power pools onto the Internet.49

Internet information standards were seen as even more critical as many states moved beyond the federal creation of competition in the sale of energy to utilities themselves (a $50 billion per year enterprise) towards allowing competition in the direct sale of power to retail customers (a $200 billion potential marketplace). With phase one of the FERC mandates for availability of information on transmission capacity on the Internet being implemented in 1997, many companies looked to 1998, when phase two would add information to the OASIS system about retail access, scheduling, and a financial spot market for power exchange. This mandate by FERC would assist states in separating the functions of power production, transmission, and marketing to customers as they drew up the lengthy "deregulation" regulations that introduced competition first into power production and then into marketing in their regional utility areas.

The unanimous passage in August 1996 of California's AB 1890, the law that opened up competition in the state's $23 billion retail electric utility industry, would largely follow prescriptions laid out by California's Public Utilities Commission (CPUC). The law created a non-profit Independent System Operator (ISO) to manage the physical transmission grid and a separate Power Exchange (PX) to manage the state's internal market for buying and selling power on the grid. The PX and ISO would themselves become "nodes" in the national OASIS system of power purchasing and transmission by the time they became operational in January 1998.50 To assure competition in the production of electricity in the state, Pacific Gas and Electric was forced by the CPUC to put four generating plants, totaling 3059 MW of capacity, on the auction block. With competition being introduced for retail customers in 1998, the combination of these policies promised the most radical reshaping of regional energy suppliers since PG&E was formed out of the merger of gas and electric systems throughout the northern part of the state in the first years of the century.51 In the same period, PG&E paid $1.59 billion to purchase 18 power plants in New England in order to position itself as a national player in supplying energy to consumers across the country. Like the previously regional banks, PG&E's self-interest was no longer going to be tied to a specific region but rather to marketing to specific slices of customers in a national market.

The Technological Basis of "Deregulation"

The traditional consolidated regional utility structure had a technological logic that shaped the geography of electric power in the United States. Part of the impetus was the limits of transmission technology and the need to balance seasonal demands between urban areas with heavy power needs in the winter and rural areas with their heaviest demands in the summer. Combined with the long transmission lines made possible by alternating current technology (the technology that had defeated the smaller-scale generation designs pioneered by Thomas Edison), consolidated regional utilities became the norm across the country. Each was built around large generators which required heavy financial investment backed by the guaranteed rates of a fixed market. These integrated utilities delivered subsidized rates to rural agricultural areas, thereby expanding economic development and access to the electricity grid.52

What has changed in recent decades is that new technology has made possible power generators that can be scaled in size to meet marginal demand more flexibly than in the past, even as new information-based technology has dramatically increased the efficiency of energy generation and transmission. Large-scale coal-fired and nuclear plants are increasingly giving way to a new generation of natural gas-fired plants with much higher efficiencies than in the past. Using turbine technology borrowed partly from the aircraft industry, these new gas turbine plants combine fast construction, lower capital costs and the ability to be scaled in size from relatively small to quite large to meet energy demand. These new plants are much more flexible than the older designs whose economies of scale almost automatically demanded mammoth plants with guaranteed markets. Especially with the low cost and plentiful supply of natural gas, this has helped to fundamentally realign the comparative costs of traditional large-scale plants versus newer "modular" gas turbine designs.

With the addition of advances in "renewables" like wind power, biomass and solar technologies, a whole new range of small-scale power has been added to the generator mix. Wind turbines, whose rapid expansion California pioneered beginning in the 1980s, have seen their costs decline by a factor of seven over the last fifteen years to become competitive with fossil fuel sources. New advances in technology promise to make solar thermal and photovoltaic devices competitive as well within a few years. Fundamentally small-scale and, given their intermittent production of electricity, needing to be supplemented with additional power generation, these sources helped challenge the paradigm of large-scale, centralized utility-based energy production.53

Many of these innovations, along with critical advances in the use of information technology, would be directed from the Bay Area through the offices of the Electric Power Research Institute (EPRI). EPRI had been founded in the early 1970s when the real cost of electricity was rising for the first time in a century and the traditional producers of power research, large contractors like General Electric and Westinghouse, no longer seemed able to capitalize the research needed to deal with new energy challenges and rising environmental mandates. Faced with Congressional bills proposing the creation of a government-run power research agency funded by a tax on electricity, the industry reluctantly agreed to create a private consortium that could share the costs of research to supplement the disparate energy programs of the federal government. Jesse Hobson, a successor to Fred Terman as head of SRI, had pushed the idea of such a consortium in the 1950s. Coincidentally, Palo Alto was picked as the EPRI headquarters in 1973 based on many of the same criteria that led Xerox and so many other engineering-based companies to site research centers in the area. The idea was also that being far from Washington, DC would help keep its research agenda out of the political fray. The oil shocks of the 1970s would just redouble the focus on alternatives to traditional fossil fuel plants and on increasing efficiency in all aspects of the industry.54

The results after twenty years of research at EPRI, federal research facilities and borrowed technology from aerospace would lead to dramatic changes in the cost-effectiveness of energy technologies and help to drive down the costs of electricity once again. Along with assisting new technologies, the infusion of microelectronics and "smart software" into older plants increased their efficiency dramatically, and planning software made efficient management of a mix of technologies much easier. But some of the most dramatic changes in technology came not in the production of electricity but in its transmission--a key innovation as bulk power transfers between utilities increased fourfold between 1986 and 1996. With forty percent of electricity generated in the US sold on the wholesale market by 1996 and even more expected with open access to that market by new producers, the need for efficient transmission management of those power transfers became critical. The answer would be a new class of solid state controllers called thyristors, a development sometimes called the "second silicon revolution." Able to control electricity in the grid with the same speed and precision as a microprocessor but at power levels a billion times higher, thyristors allow utilities to safely operate much closer to a system's thermal limits and expand transmission capacity by 20-25 percent. By making transmission more economical, this new system, called the Flexible AC Transmission System (or FACTS), would open transmission capacity to new producers and make it more economical for them to extend the geographic reach of their marketing. For those seeking to marketize electricity transfers, EPRI's director Richard Balzhiser has argued that with these new controllers, "the physical flow of power can be optimally aligned with the commercial, contractual pathway," meaning that the grid can be managed specifically to serve the commercial transaction needs of many power producers. "The electronic controllers of the new FACTS system are indeed setting the stage for widespread competition."55

The Politics of "Deregulation" and the Collapse of Energy Price Equality for Working Families

Having noted how technology has accommodated the changes to a market-driven electricity system, it is worth emphasizing that technology is not destiny. While there was a logic to centralized integrated utilities in the earlier part of the century, the existence and success of federally-owned power systems, rural cooperatives and municipally-owned utilities show that any technology can inspire multiple economic formations. The balance is determined more by politics than engineering. And while the logic of the new Internet-type technology opened new possibilities for the federal government to encourage the wholesaling of power between various utilities, generators and government-run power agencies (even as it raises problems that we explore in this chapter), the rapid movement to retail markets in electricity followed little logic other than the political demands of big business at the economic expense of residential customers. What retail competition for electricity promises is to dissolve regional energy planning and the universally shared energy rates that once bound large industry and working families together in regional economies.

After California passed its legislation opening up retail sales of electricity to competition, one state utilities commissioner reported to the utility industry's Public Utilities Fortnightly publication that, "What defines our whole exercise, our whole restructuring, is our high rates and the strong cry by the large users for customer choice. Those are the two driving forces." In the same article, another state regulator noted that industrial pressure was driving retail competition across the country: "If the 100 largest industrial customers in every state want to have some kind of restructuring, there's going to be tremendous pressure put on the commission. And I think if the commission doesn't respond in some form, there will be pressure put on the legislature and I'm sure the governor's office in every state...Those are the realities. I don't think it's the utilities, it's the large industrial customers driving this."56

Industry Week, the manufacturing trade magazine, was explicit in seeing a payoff for big industries of $80 to $100 billion a year in lower electricity costs throughout industry, with 20% to 50% savings at individual industrial plants. The threats driving states to open up retail competition were clear and written about approvingly by the magazine: "Some large companies are threatening to close or move facilities unless they get rate reductions. Such threats are drawing responses from states. Michigan, for example, is so fearful of losing industry that it now allows new industrial customers to shop for power."57

If big business is saving 20% to 50% on its energy bills--far more than any projected general technological savings--where are those savings going to come from? Part of the answer is that what deals exist may be quickly locked up in long-term contracts before competition is fully opened to small business and residential customers. One critic noted the "elitism" of the California Public Utilities Commission installing the meters needed to manage retail purchases at large customers, while leaving fixed prices for smaller customers: "The more jaded observer will note," wrote Penn State professor Frank Clemente responding to the market competition hype in a utility trade magazine, "that this 'women and children last' approach ensures that large users get a big piece of a pie that may not be further divided."58 Small business magazines were even more explicit in seeing big business winning out at the expense of smaller customers. Inc. magazine noted that small business organizations were attacking the new changes. "What you see is small business getting screwed," said Julie Scofield, executive director of the Smaller Business Association of New England. Inc. condemned the "fine print" that gave large customers freedom to negotiate rates while leaving smaller customers still tied to electric utilities with fixed rates.59 As for residential customers, it is even less clear how they fit into the new market system. "The battles to determine how this all comes out are just starting," said Nicolette Toussaint, a spokeswoman for the main California utility consumer group TURN, The Utility Reform Network. "It may be 20 years before we see real competition" in the marketplace.60

While differential access to new markets is part of the explanation for big industry's enormous gains from retail electricity competition, an even larger reason is the way market competition would affect what is euphemistically referred to as "stranded assets." This refers to all the less profitable power plants owned by utilities that will become nearly worthless once new, lower-cost energy sources are allowed to cannibalize the electricity markets the older plants were built to serve. Planned in the 1970s when energy demand was expected to increase more rapidly than it did and electricity prices were expected to keep rising, these "stranded assets" involve many nuclear power plants costing $90 billion to $150 billion. Someone is going to have to pay for them, and the assumption is that the utilities won't pay and that larger businesses will circumvent paying much of the cost. Fortune magazine stressed that it would be only after "the costs [of the "stranded assets"] are paid down over the next decade or so [that] households should see prices fall." (italics added)61

What this reflects is that market competition is now creating the opportunity for large industry to find its own deals, releasing it from the political imperative of the past that forced it to work with the broader community for energy planning. The California solution to selling this fact to the public was a rather bizarre jerry-rigged system of public subsidies to the utilities to help pay their $20 to $30 billion in stranded assets. The state utilities board had originally planned to freeze residential rates, but the state legislature decided that there had to be a rate cut of 10 percent for consumers to buy the package politically. To pay for this, the state authorized the utilities to issue up to $10 billion in revenue bonds through the State Infrastructure Bank. With the proceeds of that sale in hand and with a long repayment period on these state-backed bonds, the utilities would receive a "competition transition charge" (CTC) of 2 to 4 cents per kilowatt-hour on electricity sold in the state to pay back the bonds. Between the long repayment schedule on the subsidized low-interest loan from the state and the CTC charge, utilities would recover the costs of both their stranded assets and the rate cut given to residential customers (a rate cut consumers would of course be paying for themselves indirectly through their taxes). If this wasn't bad enough, analysts expect that many companies will seek exemptions from the CTC charge, further burdening residential customers.62
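The rough arithmetic of the CTC can be sketched from the figures above--up to $10 billion in bonds and a charge of 2 to 4 cents per kilowatt-hour--with annual statewide retail sales treated as a frankly hypothetical input:

# Rough arithmetic sketch of the competition transition charge (CTC) described
# above. The $10 billion bond figure and the 2-4 cent per kWh charge come from
# the text; annual retail sales and the interest-free simplification are
# hypothetical assumptions for illustration only.

def years_to_recover(amount_dollars, ctc_per_kwh, annual_kwh_sold):
    """How long a per-kWh surcharge takes to recover a given amount, ignoring interest."""
    recovered_per_year = ctc_per_kwh * annual_kwh_sold
    return amount_dollars / recovered_per_year

bond_principal = 10e9        # up to $10 billion in state-backed bonds
annual_sales_kwh = 200e9     # hypothetical statewide retail sales per year

for ctc in (0.02, 0.04):     # 2 and 4 cents per kilowatt-hour
    years = years_to_recover(bond_principal, ctc, annual_sales_kwh)
    print(f"CTC at {ctc*100:.0f} cents/kWh -> about {years:.1f} years to recover the bonds")
# Roughly 2.5 years at 2 cents and 1.25 years at 4 cents under these assumptions,
# before interest and before the $20-30 billion in stranded assets are layered on top.

Whatever the exact inputs, the structure is the same: the charge is collected from everyone still buying power through the system, which is precisely why exemptions for large customers shift the burden onto residential ratepayers.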

What is clear is that the universal rates that once tied everyone in geographic regions together have been eliminated, and, as the marketplace moves forward, it is unclear who will pay the higher costs of servicing rural and some poor areas. And as the same direct marketing and evaluation of which customers are profitable to service moves forward as it did in the financial sector, inequality of services and prices is likely to increase. Already, a number of regulatory commissioners across the country have raised the issue of "information inequality" in an arena where complex issues may leave the old and the less educated vulnerable to deals that look good but deliver less than they promise. Frankly, that description applies to the whole California retail electricity competition law, which cynically sold voters a rate cut that they themselves will have to pay for over the next decade through the subsidized bonds sold by the utilities.

The Loss of Regional Power Planning

There is, however, a deeper meaning to the "stranded assets" issue. Their existence is chalked up to stupidity and cost overruns by the utilities, sanctioned by regulatory commissions over the years. But if energy demand had risen at the rates projected in the 1970s when many of those plants were built, those white elephants would be seen as profitable, smart investments instead of part of an excess of energy supply that is driving down prices. And the fact that energy demand did not rise at the rate expected is largely due to regional power planning that encouraged energy conservation across the country. But instead of the fruits of that energy conservation and lower prices being shared by everyone, the breaking of universal rates in regions was a political method that delivered the fruits of those years of regional planning to the narrower benefit of big industry.

Analysts have noted the irony that since the Calvert Cliffs decision of the early 1970s, courts have required thorough socioeconomic impact statements on every power plant before it is built to assess its effects on jobs, on energy rates and on the environment.63 Yet the whole system of power management is being thrown out the window with virtually no hard numbers on how the changes will affect the regional economies of communities across the nation.

In California, as one example, it was the California Public Utilities Commission that inaugurated many of the most innovative energy conservation programs in the country. Jerry Brown had been elected governor in 1974, and his appointments would promote a new focus on energy conservation. The CPUC eliminated the volume discounts that encouraged high energy consumption by industry and structured energy prices for residential customers in "block rate" systems that discouraged consumption. Where utilities had once received a rate of return on all energy produced, the CPUC began giving the utilities profit incentives to increase conservation and reduce demand, including allowing them to include all expenditures on conservation measures within the rate base. As well, penalties were created for failure to pursue conservation. The result, in combination with federal laws and subsidies for conservation, was a broad program of housing weatherization and other measures supported by utility advice and low-interest loans to homeowners, and a broad decline in energy demand compared to the projections of the mid-1970s.

At the same time, the CPUC promoted unregulated "cogeneration" at industrial plants to expand the domestic energy supply. Along with government legislation requiring utilities to buy power from non-utility energy producers and encouraging research and development into new technologies, the regulators created the expanded energy sources that have now lowered the price of electricity production. The irony is that the very success of the regulators is undermining the system of universal rates and fixed-cost utility returns that would allow such regional planning in the future. Barbara Barkovich, the policy director of the CPUC for the last five years of the Brown administration, argues that a focus only on the lowest marginal rates on energy leaves regions without the power to create incentives and rate structures to deliver conservation and other long-term goals. "Regulators no longer have the flexibility," argues Barkovich, "to make tradeoffs between the present and the future or among customer groups...Rather than having a fixed number of customers and amount of sales over which to spread the utility's revenue requirement, they must face the possibility that sales will disappear as customers leave the utility system."64

It then becomes impossible for utilities to expend resources for non-revenue producing goals like conservation or new technology and expect to make it up in general rates. Consumers will now be able to abandon the utility system for low-cost producers who are concentrating only on day-to-day production and are unburdened by other costs associated with longer-term management of energy consumption. In fact, such new producers are likely to undermine that longer-term management by bringing back the volume discounts and pro-consumption incentives that the regional utility system worked to abolish. Similarly, many of the incentives that were created regionally to discourage use of environmentally destructive production processes may fall victim to national competition that imports high pollution energy to regions that once would have prohibited its production. Regions downwind from heavy polluting power plants may find their residents exacerbating the problem by purchasing power from those very plants.

Utilities are already responding to the collapse of the fixed rate system by slashing their funding of research and development. Under the fixed rate system, the utilities could collaboratively invest in basic research knowing that the results would benefit each in its regional area. As collaboration between utilities becomes competition in the new system and low-cost producers enter the scene using the fruits of research without having had to invest in it themselves, the utilities can no longer expect to fully capture the value of their investments in research by passing them on through their rate base. The result has been that the Electric Power Research Institute has been reorganized to emphasize specific proprietary projects rather than basic research. Utilities now fund only the products from which they expect immediate returns, undermining the long-term research that delivered the technological gains that fueled the low-cost power that inspired competition in the first place.65 Government spending on energy research has been slashed even more dramatically, falling 80 percent between the 1970s and the late 1990s; research dollars dropped by half just between 1992 and 1997.

In practice, large industry is enjoying the fruits of that research and the research gains derived from the regional system of rates of the past while residential customers seem likely to bear the disproportionate costs of competition in paying for "stranded assets". And if the cutback in research and development leads to little new technological improvement over the next few decades, then the immediate cost savings that large industry is enjoying by breaking off from regional universal rates may be the only savings to be had as energy demand expands.

How Regulation Increases with the Expansion of Market Competition

Even if the initial gains from market competition go to big business, the proponents have argued, society as a whole will gain as supply and demand replace the regulatory intervention of bureaucrats and as business savvy replaces the dead hand of political interests. The problem with that argument is that as competition has increased, so has regulation and political lobbying by all the energy interests involved. The real difference is that the ability to regulate is being stripped from regional authorities, such as state utility commissions, and is being transferred to national authorities such as the FERC and directly to the US Congress as they write laws to design the new system. The early part of this chapter detailed the extensive regulatory interventions of the FERC to shape and encourage marketplaces for energy on the OASIS Internet system, but those are just the beginning.

As long as all retail energy sales were done by local utilities, traditional federalism left regulation to the states. However, as energy is increasingly sold across state lines directly to customers, the Interstate Commerce Clause of the Constitution increasingly paralyzes the power of local regulators in favor of the federal government. FERC officially stated in 1996 that "to the extent that retail [competition] involves transmission in interstate commerce by public utilities, the rates, terms and conditions of such service are subject to the exclusive jurisdiction of the Commission [FERC]..." The California system of a separate non-profit power exchange and grid will preserve some jurisdiction for state authorities, but increasing direct sales to customers across state lines will largely transfer regulation of energy charges and rules of distribution to the FERC.66 So in an age of rhetorical decentralization of power to the states, "deregulation" is, ironically, largely a nationalization of power over electricity in which national politics, rather than regional imperatives, will shape who wins and loses in the fight for the $200 billion electricity market.

Already, the industry's combatants are swarming the national Capitol and hiring lobbyists by the bushel. The top two dozen electricity combatants spent over $32 million in 1996 on lobbying, along with millions more spent on research, polling, television advertising and organizing grassroots support for their interests. Enron, one of the largest power companies in the world, launched a political advertising campaign in 1996 costing upwards of $23 million to push for opening up more markets for direct retail sales by the company. The Edison Electric Institute, the main lobbying arm of the utilities, has pulled key Democrats and Republicans onto its payroll, including former Rep. Vin Weber, a close ally of Newt Gingrich, and Haley Barbour, the former chair of the Republican Party. Their spending is being matched by a coalition of independent power producers called the Americans for Affordable Electricity.67

God, or rather profits, will be in the details of how that retail competition is introduced. At issue is exactly how independent power companies will get access to the electricity grid and to customers at the retail level. As the OASIS system is developed, each utility's transmission grid will have to calculate, according to FERC's directives, its available transmission capacity or ATC, a number that utilities had never measured before and for which there is no unified approach to calculation. How ATC is defined by FERC will have major impacts on how much power providers are charged and how much the utilities make on transmission. These new requirements for power companies to provide precise information on a whole range of data, compared to the older fixed rate system where unified management made precise measurement unnecessary, are just one example of the perversity that "deregulation" is increasing bookkeeping for a range of businesses and increasing the complexity of business transactions to a point of near chaos. Market "deregulation" promises a blizzard of rules by the FERC on exactly what functions and data sets will have to be specified in each bid in each round of transactions from production to transmission to retail sale.
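To see why the definition matters, consider two plausible ways a utility might compute ATC for the same line; the categories and megawatt figures below are invented for illustration, not drawn from any FERC rule:

# Illustrative only: two plausible definitions of "available transmission
# capacity" (ATC) for the same line. The margins and numbers are hypothetical;
# the point is that the definition chosen changes how much capacity is offered
# for sale, and therefore who earns what.

line_rating_mw = 1000          # thermal rating of the line
firm_commitments_mw = 600      # capacity already under long-term contract
reliability_margin_mw = 100    # cushion held back for contingencies
expected_loop_flow_mw = 50     # unscheduled flows from neighboring systems

# Definition 1: subtract only firm commitments.
atc_loose = line_rating_mw - firm_commitments_mw                      # 400 MW

# Definition 2: also hold back reliability margins and expected loop flows.
atc_conservative = (line_rating_mw - firm_commitments_mw
                    - reliability_margin_mw - expected_loop_flow_mw)   # 250 MW

print(atc_loose, atc_conservative)   # 400 vs. 250 MW offered to the market

A 150 MW difference in posted capacity on a single line is the kind of detail that lobbyists will fight over, since it translates directly into transmission revenue for the utility and market opportunity for its competitors.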

It is these details and regulations, multiplied tenfold at the national level, that will structure winners and losers at each stage of transactions from generation to distribution to retail sale. Considering how often competition advocates declaim the end of the "natural" monopoly for utilities that integrated generation, distribution and retail sale of energy, it is worth considering the proliferation and expansion of regulation needed to separate those functions to create the "natural" market system. What is clear is that, in a complicated system such as energy delivery, markets are as much a construct of government as monopoly utilities ever were. Barbara Barkovich, the former policy head of the CPUC, sees exactly this continuity from her own pro-conservation interventionism to the market-oriented approach of her successors at the CPUC: "The roots of this new competition-oriented ideology are not inconsistent with those of interventionist regulation. Both market-oriented regulators and interventionist regulators are motivated by doubts about the ability of utilities to manage their businesses to provide the greatest benefit to their customers."68 In both cases, where decision-making on the details of how to structure energy delivery had once been left to designated regional utilities, the new interventionist regulations require constant fine-tuning of how to shape that delivery from production to transmission to retail delivery.

Where the old and new regulators differ is in the control mechanisms and technology that they encourage to circumscribe the behavior of the energy companies involved. With the traditional regional system of monopoly utilities, a relatively fixed demand for energy (which could be modified over time by interventionist policy) at a mandated price helped determine which technologies were deployed and how much capacity to build. Once decisions were made on what rates to charge customers and what the mandated rates of return for utility stockholders would be, regulators had little to fiddle over regarding the actual day-to-day details of maintaining network reliability and allocating costs between the internal functions of the utility. Those could largely be left to the company, since its own long-term financial self-interest required regular maintenance of the transmission grid and long-term investments in upgrading the system as a whole.

Under market competition, power producers treat their own plant capacity as fixed moment-to-moment (since all prices will be based on marginal costs, not on longer-term rates of return as with the utilities) while prices fluctuate across the country as demand adjusts to price changes. Regulation is required because marginal cost decisions will not include calculations relating to maintaining the system as a whole, forcing new regulations at each point in the distribution system in order to bring those market transactions in line with the need for stable service, reliability, access and long-term investments in the transmission grid. It is the shift from a few key macro regulations to a proliferation of micro regulations.
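A toy illustration of that pricing shift, with entirely hypothetical plants and costs, shows how a marginal-cost market price is set by the last plant dispatched and carries none of the system-wide costs that a regulated average-cost rate folds in:

# Toy illustration only: a marginal-cost market price versus a regulated
# cost-of-service rate. All plants, costs and the demand level are hypothetical.

plants = [  # (marginal cost in $/MWh, capacity in MW)
    (15, 400),   # hydro
    (30, 300),   # gas combined-cycle
    (55, 300),   # older gas peaker
]
system_costs_per_mwh = 12       # maintenance, reliability, grid upgrades
demand_mw = 600

# Market: dispatch cheapest plants first; the price is the marginal cost of the
# last unit needed to meet demand, with no charge for system-wide costs.
remaining = demand_mw
market_price = 0
for cost, capacity in sorted(plants):
    if remaining <= 0:
        break
    market_price = cost
    remaining -= capacity
print("marginal-cost market price:", market_price)                      # 30 $/MWh

# Regulated rate: average production cost of the dispatched plants (400 MW of
# hydro plus 200 MW of gas) with system-wide costs spread over all sales.
avg_production = (15 * 400 + 30 * 200) / 600
print("cost-of-service rate:", avg_production + system_costs_per_mwh)   # 32.0 $/MWh

In this hypothetical the market price undercuts the regulated rate precisely by leaving out the grid-maintenance and long-term investment costs, which is why those costs must then be recovered through a separate layer of micro regulation.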

What worries many people is that political pressure will create microregulations that favor short-term profit for producers over reliability of the system, a dangerous proposition for networks transmitting the lifeblood of commerce across the country. Many critics of the move to competition, especially of the movement towards for-profit transmission systems, have pointed to the West Coast blackout of four million homes that occurred in 1996 just weeks before final passage of the California market competition legislation. A July 1996 report by the North American Electric Reliability Council, the umbrella for the nine regional utility councils that manage the national electricity grid, warned that with greater and greater national transmission of power, the thermal limits of power lines will be pushed farther on a day-to-day basis than ever before. All this will happen in an environment where short-term profit will encourage stretching the system to the limit, even as utilities which formerly cooperated in management of the grid increasingly become direct competitors.69

The hope is that through the careful information mandates involved in the OASIS Internet system, the FERC will maintain real-time management of demand in a way that overcomes those dangers. But many worry: if a loose wire can shut down the West Coast in 35 seconds at times of peak demand, what will the swings of market demand do in a system where knowledge of national capacity is not coordinated in any central way? And if a crisis occurs, a number of analysts have expressed concern that the reliability and speed of the Internet's "World Wide Wait" itself may be insufficient for the split-second coordination needed; delays in connecting to the OASIS servers could cascade into delays and power outages across the system.70

There is an irony that the unreliability of connections on the Internet, itself a product of decentralized service based on market regulation, is seen as a key danger to the reliability of the emerging management of the electricity grid of the nation. Utilities are beginning to form alliances for reciprocal transmission rights of electricity over their lines, much as Internet backbone services agree to exchange information freely, but it is precisely those kinds of decentralized agreements that lead to the erratic flows on the Internet that cause congestion and delays in that network system.

As the next section will elaborate, the Internet evolved in exactly the same environment that is pushing forward market competition for energy--new competitors breaking from regional telecommunications systems to find profit servicing wealthy customers at the expense of working families. This created a system based more on politics that serves those elite interests than any abstract idea of "efficiency" and, as part of that policy, promoted a privatization of management of the Internet grid that served those profit-making concerns more than assuring reliability for customers.

Cream-Skimming the Old Bell System, or How Subsidies and Regulation Made Phone "Deregulation" and the Internet Possible

With the privatization of management of the Internet from 1992 to 1995, the industry around the Internet quickly began trumpeting its success as proof of how unregulated market competition had helped explode the size of Internet participation. No players were more likely to trumpet the success of this new free market than the independent Internet Service Providers, or ISPs in the incessant lingo of the industry. From veteran Whole Earth Networks to upstart Netcom to giant America Online, these Internet providers not only beat back proprietary networks like Microsoft's, they delivered to their customers (local businesses and upper-income individuals) an unlimited "all-you-can-eat" flat-rate price for service that made the high prices for long distance phone service seem laughable in the face of the new technology. Internet phone calls, made essentially for free over the Net, began to bypass traditional long distance phone services and the Net seemed to promise limitless connections at a price the mastodons of the old regulated phone system could only dream about.

Two events in April and May 1997 would undermine the "free market" bravado of the ISPs as these Internet free-marketers made loud, extremely public appeals to the Federal Communications Commission to protect them from market prices in order to "save the Internet" (and their own profit margins). The first event was the announcement by Uunet (described in Chapter 2), the national backbone service that was given control by NSFNet of most of the Internet access points used by ISPs, that it would no longer provide free "peering" to the midsize Internet providers, companies carrying roughly twenty percent of Internet traffic. "Peering" is the practice, crucial to the formation of the Internet, of automatically transmitting information from point A to point B whether it originated on your machines or not. However, the unregulated expansion of traffic on the Net rapidly outstripped the infrastructure for carrying that traffic, leading to congestion, lost messages and the "World Wide Wait" of connections. With flat-rate pricing encouraging ever increasing traffic, Vint Cerf, who oversaw Internet projects at MCI, one of Uunet's main backbone competitors, observed that "The hill is overgrazed, there's no more grass, and the sheep die."71

Uunet announced that it would continue the practice of peering with other backbone providers like MCI and other providers that had invested heavily in infrastructure, but that the midsize ISPs, a large proportion of them located in the Silicon Valley area, would have to start paying for the right to exchange messages over Uunet's backbone and connections. Smaller ISPs had been paying Uunet for the right to directly connect their customers to the Internet, but the charging for "peering" emphasized that the traditional cooperative model of infrastructure building on the Net had broken down.

However, the midsize ISPs immediately cried foul and a number of them turned to regulators to settle the score. David Holub, president of San Francisco-based Whole Earth Networks, which had about 20,000 customers in the Bay Area, immediately scheduled a meeting with the state Public Utilities Commission to protest Uunet's decision and indicated he would turn to the Federal Communications Commission to regulate Internet carriers like phone companies, invoking rules that bar phone companies that route other phone companies' calls over their networks from imposing discriminatory fees on different competitors. That type of regulation would prohibit Uunet from peering with MCI while charging a fee to ISPs like Whole Earth--a change that would have thrown the whole system of Internet pricing into chaos and that accentuated how unstable the whole marginal pricing system of Internet competition had become.72

At the same time (and more successfully for the Internet providers), the ISPs along with AT&T, Apple Computer, Netscape, Microsoft, Compaq Computer, IBM, and a host of other computer companies were demanding FCC intervention to prevent market pricing on the other end of the information pipeline, namely the local telephone companies used by ISP customers to reach them in the first place. Since the initial breakup of AT&T back in 1983, the FCC had exempted Internet providers from paying the same kind of per-minute access charges to local phone companies that long distance carriers had to pay to connect their customers. This allowed Internet providers to pay the flat business rate to local phone companies that ordinary local business customers paid--which in turn allowed them to offer flat-rate Internet service to their customers.73 What this meant was that in connecting a customer to an Internet provider, payments to Pacific Bell were roughly one-twentieth of those for connections to normal long distance carriers. This was all despite the fact that the costs for handling each kind of call were exactly the same for the local phone company.
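A back-of-the-envelope comparison suggests the scale of that gap. The dollar figures below are illustrative assumptions rather than Pacific Bell's actual tariffs, chosen only to show how per-minute and flat-rate pricing diverge on heavily used lines:

\[
\text{long distance carrier: } 60 \text{ min} \times \$0.03\text{--}\$0.04/\text{min} \;\approx\; \$2 \text{ per hour of connect time}
\]
\[
\text{ISP on a flat business line: } \frac{\$25/\text{month}}{\sim 300 \text{ hours of dial-up use per month}} \;\approx\; \$0.08 \text{ per hour of connect time}
\]

Under assumptions in that range, the ISP's payment per hour of switch time works out to roughly one-twentieth of the carrier's, which is the order of magnitude described above.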

Even worse for the local phone companies was the fact that Internet calls averaged much longer than either local or long distance calls. According to Pacific Bell, 30% of the total customer time on the phone system generated by dial-up Internet traffic came from calls lasting 3 hours or more, and 7.5% came from calls lasting 24 hours or more, while the average voice call lasted only 4 to 5 minutes.

With the concentration of ISPs in its Northern California region, Pacific Bell was impacted far more than any other local phone system; of Pacific Bell's 772 switches, approximately one-third in 1996 serviced ISP hubs concentrating Internet traffic. By January 1997, 62 of those switches had already exhibited congestion in which voice calls were being degraded or even blocked by the level of Internet traffic.

Pacific Bell cited one Silicon Valley ISP hub where traffic levels in late 1996, driven by a single ISP, undermined service in the whole area. The ISP represented only 3.6% of total office lines, but accounted for about 30% of use during the busiest hours of the day. The result was that 1 out of 6 phone calls was being blocked due to the congestion. Pacific Bell spent $3.1 million to fix that one hub alone, and estimated that it would have spent $100 million on Internet traffic upgrades in 1997 and would spend $300 million on upgrades by the year 2001. Given the ISPs' exemption from paying long distance access charges, Pacific Bell maintained it would earn only $150 million in additional revenues in that period due to ISP traffic.74

Worse than the actual costs of the upgrades was the fact that those investments were being made in traditional analog voice phone lines and switches, instead of moving the ISP phone traffic onto high-speed digital switching systems that would be more efficient and create the basis for upgrading all data traffic. Most of the Baby Bells began offering such high-speed digital services for ISPs in 1997, but the Internet providers had little incentive to pay for such services as long as they could convince the FCC to allow them to use the local phone lines like ordinary business users.75

And in May 1997, the FCC, under intense lobbying from both computer companies and Internet users, agreed to continue the ISP exemption from access charges, although it ordered essentially minor concessions to the local phone companies by raising charges on second phone lines, the logic being that these would likely be used for Internet connections. Some of the smaller Internet providers complained that the additional charges on all their incoming phone lines would hurt them, but larger ISPs like America Online declared victory: "We will see an increase in our charges, but we do see that on balance we need to accept the additional charges because they are flat and they are nominal," said Jill Lesser, America Online's deputy director of law and public policy. "A permanent access charge would have been orders of magnitude worse for AOL. Even at one cent per minute, we would have incurred a charge that would have been in the neighborhood of $100 million and which we would have had to pass on to the customer. So when you look at an increase that is 1/10 of that, that's a fairly modest increase."76 The broad coalition of computer companies had successfully protected the subsidized status of Internet providers.

By spring of 1998, the FCC was in the even more farcical position of refusing to apply access charges to companies using Internet systems for actual telephone calls. A whole new class of companies, such as Qwest Communications and IDT Corp., had grown rapidly by allowing customers to use ordinary phones to place calls over Internet lines more cheaply than traditional long distance carriers could offer. Since these new companies were using IP packet switching, they could claim to be Internet Service Providers and thus exempt from long distance access charges. Because those local access charges add up to about 4 cents a minute (on calls costing 10 to 15 cents), this exemption for IP phone calls gave these new companies a massive window for profit, essentially subsidized by the local phone companies. While IP networks were carrying only 1% of phone calls in 1998, estimates were that more than 13% of phone traffic would be carried on such networks within four years, adding up to a potential loss of $4 billion per year in access charges for local phone infrastructure.77
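The scale of that projected loss follows from simple multiplication; the total traffic figure below is an assumed round number used only to illustrate how an estimate of this size is built, not one taken from the FCC filings:

\[
\text{lost access revenue} \;\approx\; \underbrace{750 \text{ billion min/yr}}_{\text{assumed total long distance traffic}} \times \underbrace{0.13}_{\text{projected IP share}} \times \underbrace{\$0.04/\text{min}}_{\text{access charge}} \;\approx\; \$3.9 \text{ billion/yr}
\]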

The whole Internet industry fought to prevent the FCC from applying the same access charge rules to the new IP phone calls as applied to regular phone calls, with the executive director of the Commercial Internet Exchange arguing that the industry should remain exempt because the Internet is "an industry that [has] grown very quickly and that is critical to the future of our economy."78 The irony of these lobbying campaigns by the Internet industry was that industry spokespeople had pictured the privatization of the Internet as the "end of government subsidies," with the free market having successfully stepped into the gap. The reality, as these FCC decisions highlighted, was that the profits of the private Internet industry had derived substantially from the cannibalization of past and present investments in the local phone infrastructure. Local phone users, many of them lower-income households without a computer in the home, were seeing investments that could have been targeted at upgrading the overall network or delivering new technology to schools, hospitals and other public places serving the whole public diverted instead to industry and higher-income Internet users. The specific private subsidies for the Internet industry helped fracture planning for the overall local phone system and blocked overall upgrading of data traffic.

In the end, where federal investments once fueled overall economic and technological advancement in regional economies, these new "market competition" policies ended up sucking funds from the infrastructure serving low-income and local users to subsidize those using the Internet for national and international purposes. And the forced segmentation of "competition" into separate boxes of long distance, local service, ISP and other regulated divisions has so fragmented phone service as to make comprehensive investments for upgrading the overall system nearly impossible.

Now, if this had been a small sin to help the Internet get off the ground, it might be a minor, even admirable hiccup in regulatory history, but this is a pattern dating back to the first attacks on the integrated AT&T Bell system. And with competition and "deregulation" of telecommunications becoming the metaphor and model for other network-based industries like electricity, it is important to understand how MCI, Sprint and other new telecommunications companies were a product of regulatory subsidy and infrastructure cannibalization rather than the market efficiency their mythmakers hold out. The last part of this chapter will explore that history, its implications for understanding network economies, and how this legacy has thwarted more comprehensive regional economic approaches to digitally-based telecommunications.

Many proponents of competition pooh-pooh concerns over investments in phone infrastructure, noting that in the early decades of this century, full-throated competition led to a massive expansion of phone service across the country. That much is absolutely accurate. AT&T had emerged by 1880 with a monopoly on the telephone industry through its control of the Alexander Graham Bell patents and had expanded service to 260,000 people by the time its patents expired in 1893. Full-scale warfare then broke out between the Bell system and some 3,000 independent competitors building infrastructure across the country, with AT&T retaining only half the market of a vastly expanded 6 million phones by 1907.79

But it was an infrastructure that frustrated most of the customers, since they could not call friends in the same city if they belonged to a competing network and would be unable to call whole cities if those towns were controlled by networks hostile to the hometown service. The Bell system was the only service that provided anything approaching a comprehensive long distance phone network. For the rest, competition made most of that expanded infrastructure unavailable across lines of hostile businesses--a state of affairs that led to pressure towards consolidation and regulated utilities. Facing this dilemma, in 1907 AT&T's board of directors hired as president Theodore Vail, a longtime advocate of universal service within the company, who would largely shape the modern telephone structure and regulatory regime. Vail would change corporate strategy and withdraw from direct competition all over the country, instead favoring interconnection agreements with small operations where AT&T was not competing--de facto absorbing them into the Bell system. As AT&T began to also purchase other phone companies, Vail also reversed Bell policy and accepted government regulation of the industry in order to maintain high-quality technology and uniform pricing. A 1913 consent decree with the Justice Department officially put AT&T purchases of other phone companies under the regulation of the government and required non-Bell companies to be connected into AT&T phone lines, all in the context of negotiated agreements that turned AT&T and the independents from competitors into collaborators in maintaining the phone infrastructure. State utilities commissions strongly supported the movement to consolidation, and federal legislation, the Willis-Graham Act of 1921, placed AT&T under the jurisdiction of the Interstate Commerce Commission and exempted it from antitrust restrictions on purchasing other telephone companies. AT&T would purchase 223 independents in the next thirteen years. The new Bell system was largely decentralized, with AT&T operating the long distance lines and acting as a holding company for separate state telephone companies which would be regulated by state utilities commissions. Each of these state Bell affiliates would become, in the public service vision of Theodore Vail, a fixture in the economic development of its region with strong public participation on its board of directors. This was the structure that would last largely until the breakup of AT&T in 1983.80

Latter-day market advocates argue that all that was needed were regulations to require mandatory interconnection between services, and the country could have preserved the benefits of both competition and interconnection (much as is promised today with phone competition).81 The problem with this retrospective viewpoint (and present advocacy) is that it ignores the basic economic implications of Metcalfe's Law--the rule-of-thumb that the value of a network increases not arithmetically but geometrically with the number of participants in that network. What this means is that the economic value of interconnection for small networks to much larger systems is astronomically high, while the main value of the investments in infrastructure by large networks is precisely the fact that they can offer such a large geometric network value where smaller networks cannot. Mandate interconnection and much of the value of that larger network's infrastructure (and the incentive to create it in the first place) disappears. Regulation of customer phone service rates may be eliminated, but government regulation will still be required to establish the rates paid for interconnection--an intervention likely to set a price either too high to encourage new entrants to the marketplace or, more likely given larger networks' preference for no interconnection at all (i.e. an infinite price), too low for the larger network to maintain the quality and breadth of its infrastructure for all users. In such a situation, the most profitable position is to be a smaller network servicing high-income, high-profit individuals or businesses who can, as needed, reach the low-profit customers of the larger network thanks to mandatory interconnection regulations.
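A rough way to see the asymmetry, using the standard back-of-the-envelope reading of Metcalfe's Law rather than any calculation from the sources cited here: a network of n subscribers supports on the order of n-squared possible connections,

\[
V(n) \;\propto\; \frac{n(n-1)}{2} \;\approx\; \frac{n^2}{2},
\]

so when a small network of m subscribers interconnects with a large network of n subscribers (m much smaller than n),

\[
\frac{V(m+n)}{V(m)} \;\approx\; \left(\frac{m+n}{m}\right)^{2} \gg 1
\qquad\text{while}\qquad
\frac{V(m+n)}{V(n)} \;\approx\; \left(1+\frac{m}{n}\right)^{2} \;\approx\; 1+\frac{2m}{n}.
\]

With an illustrative small network of 10,000 subscribers interconnecting to a large one of 1,000,000, the small network's potential connections multiply roughly ten-thousand-fold while the large network's grow by only about 2 percent--which is why mandated interconnection transfers most of the value of the larger network's investment to the entrant.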

This is the cannibalizing position in which Internet providers were placed, but it is also the position that opened up competition for MCI and Sprint. That competition would service large industry at the expense of the funds that assured access to and investments in the local phone networks serving working families.

Subsidies and Separations: How MCI cannibalized the Bell System and sold the myth of market efficiency

There was no technological innovation, no business efficiency that made MCI into a multi-billion dollar competitor. That is the major fact to understand about telephone deregulation in the 1970s.

MCI's profits and growth (as well as those of Sprint and other new competitors) came purely from convincing regulators in the 1970s to give them discounted interconnection to the Bell system and allow them to shift resources from ordinary ratepayers into the hands of their business customers. It was that simple. Since AT&T was bound by regulation to continue investing in local phone infrastructure while the new competitors were not, the result was predictable: the new competitors would underprice AT&T's rates. But the true success of MCI and other competitors was that they sold this regulatory success as a triumph of the free market rather than of skillful political and legal maneuvering. Alan Stone in his book Wrong Number has noted the irony that:

AT&T, the largest firm of all, was committed through goals such as universal service and rate averaging to the interests of the small subscriber, whereas those with whom AT&T came into conflict, such as MCI, were primarily concerned with the interests of large business subscribers. The main political achievement of AT&T's rivals was their collective ability to portray their own interests as the 'public interest,' and in the process to gain important allies at every decision-making level.82

The key to MCI's success was the confused and fragmented nature of the US government in dealing with precisely the issue of regional development and investments. Even as the official policy of the government was to favor investments in local expansion of the phone network, universal access, and low-cost service to rural and poor communities, MCI would use divided regulatory structures and economic confusion to expand its markets at the expense of such investments and universal rates. Backed by an alliance of populist liberals, who distrusted AT&T's size, and ideological market libertarians, MCI would help push forward the breakup of AT&T in favor of market competition, a policy opposed by large majorities of the public and large sectors of the government.

Beginning in World War II, the Federal Communications Commission (the regulator of AT&T under the Communications Act of 1934) had required the AT&T Long Lines long distance division to pay a portion of every long distance call to local phone company exchanges based on the portion attributable to using that local infrastructure. This allowed local regulatory boards to keep local rates low and expand service to rural and low-income users. This created an extremely complex accounting system for the Bell network (its complexity making it easier for competitors to sow confusion among regulators later on) based on "separations" between these two parts of any call. In the 1950s, the National Association of Railroad and Utilities Commissioners, the national lobbying arm of state regulatory agencies, pushed for changes in the "separations" that would allocate a larger portion to the local phone company divisions. This increase in long distance rates occurred as new technology was reducing the costs of long distance and rising wages were actually increasing the costs of labor-intensive local services. Essentially, public policy was diverting the technological dividends of long distance to the costly investments needed to insure universal service at the local level. The FCC would approve a series of increased separations payments to local service in the coming decades, creating an ever increasing gap between the price charged for long distance and the actual costs of delivering the service.83

The first challenge to universal phone rates had come in a 1957 FCC decision as private industry, promoted by Motorola, which provided the equipment needed, sought to bypass AT&T by using newly available radio spectrum to construct private phone lines over microwave equipment between different corporate locations. AT&T opposed these private lines since they would undermine universal rates for all customers, contribute nothing to local infrastructure investments and drain off some of the most profitable customers of the long distance system. Worse yet, AT&T was barred by the FCC from offering competitive rates to those business customers in order to salvage some income for the system from those customers. When AT&T tried to drop rates, it was accused of predatory pricing, while the regulatory rules against it dropping its rates would lead critics to accuse the AT&T system of inefficiency since it didn't match the rates of new rivals. This "damned if you do, damned if you don't" straitjacket would define the ideological vise that would eventually destroy the integrated phone system over the next quarter century.

This private line exception would not impact the Bell system directly, but in 1963, a small, almost bankrupt company called Microwave Communications Inc. (MCI) sought to establish a private line service from Chicago to St. Louis that would resell capacity to individual corporations with offices in both cities. The key to their proposed service, however, was that local corporate offices would need to use local Bell telephone services to connect to the MCI microwave transmitter. The transmitter would relay their call to the other city and then use local Bell service connections at the other end to connect the call to each company's other corporate office. This was a dramatic extension of the 1957 private line decision, but in 1969 MCI's application was approved by a close 4-3 vote of the FCC commissioners, although it was treated as an experiment for a single line that would be closely watched. AT&T officials protested loudly that the "experiment" was an inevitable slippery slope towards full long distance competition that would undermine the whole system.

The recently hired CEO of MCI, Bill McGowan, would aggressively run with the application and expand MCI through a series of interlocked companies creating private lines city-by-city. McGowan came to MCI not as a technology person but as a finance and marketing expert, and he knew that managing politics would be the key to MCI's expansion. While McGowan would sell MCI to investors and regulators as a "high-tech" company, the technology was just the microwave radar technology of World War II. The key was politics. In his authorized biography of the company, business journalist Larry Kahaner gave a sympathetic view of McGowan but related that:

McGowan knew that the future of MCI hinged on managing the regulators, the FCC. He had learned...that you could build a business by thinking of regulatory bodies such as the FCC as something that had to be managed, the way you would manage any other item that affected your business. Lumber companies managed natural resources like forests; MCI would manage the FCC.84

In the case of MCI, the goal was to clear-cut the traditional AT&T phone system with the hesitant support of the FCC. To facilitate continuing intervention with the regulators, MCI hired FCC commissioner Kenneth Cox, who had cast the deciding vote in the 1969 decision--a legal if unseemly offer of employment following a decision in MCI's favor. By mid-1970, the FCC faced over 2000 applications for private line services, most of them from MCI companies set up in different towns.

Contributing to the original FCC decision against AT&T were recent phone service problems due to unexpected surges in demand for phone services, specifically the well-publicized problems in New York's Wall Street district, where technological changes in the finance sector detailed earlier in this chapter were driving the need for multiple phone lines and increased data transfer. New consumer activism at state regulatory agencies blocking local rate increases and a shortage of new capital due to high interest rates combined to further burden the local phone companies within the Bell system. The surging demand for telecommunications services increased pressure on the phone system just as MCI was leading the charge to help the most profitable customers of the system avoid paying the long distance charges that helped finance that local infrastructure.

The irony was that MCI could not turn a profit on the private line business and by 1973 was facing bankruptcy (helped along by AT&T dragging its feet on interconnection in many cities). So MCI created a service called Execunet in 1974 to provide its customers with the ability to call anyone in a city serviced by an MCI line, essentially providing traditional long distance phone service in direct competition with AT&T. Much as the original creators of money market funds had gotten approval from the SEC by obscuring what they were doing, MCI filed the new service not with the FCC commissioners themselves but with the routine tariff division. When the FCC commissioners found out what MCI was doing, they were extremely angry and ordered the company to stop. The commissioners had hit the tolerance limit for their experiment and decided to draw the line at full competition in long distance. State regulatory agencies were already up in arms over the FCC's previous approval of MCI's private line services, and their national lobbying arm, NARUC, had called in 1973 for the FCC to be stripped of jurisdiction over long distance rates and over how those rates subsidized local phone operations.

But decisions over the phone system were spinning out of the control of both the state regulators and the FCC. In 1974, crusading liberal anti-big business lawyers at the Justice Department filed an anti-trust suit against AT&T. Downplaying the big business customer base of MCI and the complex subsidies for local phone companies, the Justice Department liberals saw the case as one of defending scrappy upstarts like MCI against the "largest corporation in the world." Federal courts overruled the FCC's attempts to, belatedly, rein in MCI, with a major 1977 decision arguing (as AT&T had initially worried) that once the FCC had granted competitive access, it could not then limit the range of services offered by companies like MCI. The end result was that MCI and other competitors were now in direct competition with AT&T for long distance customers--but with discounted access to the local phone system, they could undercut AT&T on price while actually spending more to deliver the service.

Perversely, the FCC in 1970 had dramatically increased the "separations" payments by AT&T to local phone operations under what was called the Ozark Plan, even as the FCC was allowing MCI to start cannibalizing the long distance service that made those payments possible. While long distance rates had fallen over the years, the separations payments had channeled most of the improved efficiency in the system into lower rates and more universal access in the local operating companies of the Bell system. The cost of monthly residential telephone service would by 1980 be one-third in real dollars of what it cost back in 1940, even as quality increased and the percentage of households with phones grew from 37 percent in 1940 to over 93 percent by 1980.85

By the time MCI offered its Execunet service, AT&T separations charges accounted for two-fifths of long distance revenues. MCI and other specialized carriers negotiated a rate of payment for interconnection equal to only half that amount, so their rates could be a mammoth 20 percent lower than AT&T's even if their costs were exactly the same as the Bell System's. That regulation-imposed difference in costs would allow MCI to reach $1 billion in revenues by 1983, when the Bell system was broken up by the anti-trust court decree. And that regulatory difference in costs would attract entry by companies with significantly higher costs than the Bell system, a complete undermining of the old anti-trust argument that competition breeds the most efficiency. Yet many of AT&T's opponents would watch the growth of competing firms and argue that their growth showed the Bell System was inefficient and that competition was desirable. The reality was that productivity gains in the Bell system during the 1970s exceeded all but one of sixty-three industries surveyed by the Department of Labor, including fifty-one industry groups with productivity gains one-half that of Bell's.86
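The arithmetic behind that 20 percent figure is simple, taking the two fractions reported above at face value:

\[
\underbrace{\tfrac{2}{5}}_{\text{AT\&T's separations share of revenue}} \;-\; \underbrace{\tfrac{1}{2}\times\tfrac{2}{5}}_{\text{MCI's negotiated rate}} \;=\; \tfrac{1}{5} \;=\; 20\%\ \text{of revenues}.
\]

MCI thus avoided payments equal to roughly a fifth of revenues, a built-in price advantage available even to a competitor whose own operating costs exactly matched the Bell System's.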

Technology and Interconnection

While long distance competition fueled by the separations subsidies was one pressure on the Bell system, the other side of the escalating conflict leading to breakup was the battle over interconnecting new technology to the local phone systems. It was this fight that would fundamentally hobble the ability of the phone system to respond comprehensively to the emerging computer revolution and instead precipitate the proliferation of proprietary networking technology which the Internet would only overcome two decades later.

Fights over computerization and equipment had actually been the longest-running battle between regulators and AT&T in the post-war period. The Long Lines long distance service and the local operating companies may have been deploying the technology that drove down phone rates and improved efficiency, but it was AT&T's Bell Labs division's research and development and its Western Electric manufacturing unit that were building the technology that drove breakthroughs not only in the telephone industry but, as mentioned earlier, in the computer industry as a whole. Back in 1949, the Justice Department had challenged AT&T requirements that customers use equipment from its Western Electric division and had sought to break the company up in an antitrust suit. The 1956 Consent Decree that emerged out of this lawsuit kept AT&T intact, but prohibited Western Electric from producing non-telephone equipment and forced AT&T to license its computer-related patents (thereby spurring the migration of technology development to Silicon Valley). With massive expansion of the phone system in this post-war period, AT&T was willing to concentrate its energy on fulfilling phone service orders, even as Bell Labs continued to churn out innovation after innovation that AT&T could not develop itself.

The conflicts emerged as computer technology began to converge with telephones in the late 60s. Where "equipment" stopped and new additions to the telephone network started became increasingly unclear. AT&T had long restricted what equipment, labeled "foreign attachments" in Bellspeak, could be linked to the phone system in order to maintain consistency in the network (and, of course, maintain its revenues from equipment sales). But in 1968, the FCC ruled that a device called the Carterfone, which relayed sounds from a telephone to a mobile radio, was a legal attachment to the phone system, thereby opening up the market for attached equipment from answering machines to Private Branch Exchanges (PBXs) to modems.

Opposed to the FCC decision were the state regulatory agencies, which feared that lost revenues from equipment sales would drive phone rates higher and that additions to the phone network would create a massive, complicated regulatory mess (a point on which they were absolutely correct). The new equipment opt-outs allowed companies to create their own internal private phone infrastructure without paying the costs of expanding access to expensive rural customers and lower-income users. While MCI was undermining subsidies from national phone use to local infrastructure, equipment opt-outs were undermining intraregional subsidies.

But beyond the economic issues was the way equipment opt-outs were undermining technological standards and fragmenting interconnection of data on the network. As computer use of the telephone advanced, AT&T found itself in the perverse position of being one of the foremost innovators in computer technology while being barred from selling equipment in that area. As other computer companies began processing information using the phone system, they inevitably began performing quasi-telephone company functions as they switched data between different customers, opening the question of whether those information services should be regulated by the FCC and what their contributions should be to the overall network. Time-sharing computers were at the heart of these early disputes and, in response to a stock information service called Telequote IV, the FCC made a ruling in March 1971, known as Computer I, which removed regulation from most computer interaction with the network. AT&T itself was barred from offering such data processing services and the FCC even ruled that the company could not provide data processing to its own divisions (although this was overruled by the courts on appeal). But it was clear that any step by AT&T towards computerization of its networks would be met by FCC challenge, so it could do little to upgrade its own interconnection operations to provide an alternative to the proprietary versions then appearing. It was in this context that AT&T had refused to explore computerized packet switching networks when asked to by ARPA in the late 60s and 1970s.

It was fears of just such fracturing that led the National Academy of Sciences, through its Computer Science and Engineering Board, to publish a report opposing uncontrolled connection to the telephone system precisely for fear that it would undermine network performance and standards for compatibility of data exchange.87 The rise of private PBX systems and other proprietary devices would fulfill those fears in the next decade. AT&T was working to build a digital backbone for the whole phone system and slowly extend it to smaller and smaller units, which meant it wanted to offer analogue switching devices to customers looking to install PBX equipment in the 1970s. AT&T, looking over the long run, saw proprietary digital devices tied to an overall analogue system as a waste until it could optimize the whole system. However, customers wanted the newest digital equipment for immediate needs and they turned to other suppliers. Those short-run decisions would increase long-term costs in connecting disparate parts of the phone network and would also retard upgrading the system as a whole to digital standards, given the technological "lock-in" of individual proprietary digital devices using the analogue phone system. When a 1980 decision barred AT&T from selling data terminals to customers, this further undermined the ability of the phone system to coordinate information standards with its underlying digital architecture.88

The Breakup of AT&T

The breakup of AT&T in 1983, led by the Justice Department, would put the final nail in the coffin of coordinated data standards promoted through the phone system, even as other parts of the government would strive to build an Internet to put the humpty-dumpty of fragmented networks back together again.

Whether to break up AT&T divided the government to the last moment. State regulators, most of the scientific community, and the Defense Department acted as the loudest defenders of the AT&T system, while the hybrid coalition of populist liberals and libertarian conservatives that led deregulation efforts in a host of industries in the 1970s pushed forward the partitioning of the system. An ironic twist for AT&T, in a battle with endless perverse twists, was that the phone system's very connections to economic and civic life and its engagement with public service would be a decisive, if unintended, factor in its destruction. When the Reagan Administration won the White House in 1980, the new Attorney General William French Smith was dead-set against continuing the antitrust suit. However, Smith was a long-time board member of Pacific Telephone from his work in Los Angeles and had to recuse himself from any consideration of the case, while his Deputy Attorney General's law firm had done work for AT&T, so he had to recuse himself as well. This left almost total control of the lawsuit to the new head of Justice's anti-trust division, Stanford Law Professor Bill Baxter, a longtime ideological opponent of the AT&T monopoly. The lead opponent of the breakup in the Reagan administration was Bernie Wunder, the head of the National Telecommunications and Information Administration (NTIA)--an agency that would play a crucial role throughout the 80s in building the Internet. Wunder's opposition was joined by the Secretary of Commerce and the Secretary of Defense, whose department would issue a 1981 document that argued:

the Department of Justice does not understand the industry it seeks to restructure...All that divestiture as outlined by Justice could possibly cause is a serious loss of efficiency in the manner the network operates today...We believe that [divestiture] would have a serious short-term effect, and a lethal long-time effect, since effective network planning would eventually become virtually non-existent.89

But fears that any dismissal of the lawsuit would lead to accusations of conflict of interest on the part of William French Smith left Baxter in control, and the result, in the end, was a partitioning of AT&T, announced on January 8, 1982, beyond anything Justice had originally envisioned. All local telephone exchanges (two-thirds of the million employees in the system) would be grouped into seven "Baby Bell" companies that would remain regulated common carriers, while long distance service would be combined with Bell Labs and Western Electric as a purely profit-oriented company competing with MCI and all the other new telecommunications companies on an equal footing. Partly, this was the result that AT&T wanted by this point, since with the financial noose of regulatory restraints slowly strangling the company, its executives saw fully embracing competition and abandoning its public service tradition as the only way to survive. Unfortunately for AT&T, lobbying by the American Newspaper Publishers Association (which feared a threat to its classified ads) added provisions to the settlement that barred the company from electronic publishing for seven years--a further retardation of innovation in the name of competition. This divorce of the system's innovative heart from the day-to-day maintenance of local phone infrastructure would be the final shredding of the fabric that had funneled the fruits of technological innovation into the needs of average phone consumers.

While access charges on long distance telephone calls would preserve some of the funds flowing to local service, the aftermath of the consent decree would dramatically increase the price of local phone service, and the costs of installation and repair of equipment would increase tenfold in some regions. As market competition increased, a range of services rose in price for local residents. By 1997, even local pay phone rates were skyrocketing as the FCC overruled local regulatory controls on their prices--a particular burden on the 20 percent of low-income families without their own phones who depended on pay phones as their only source of phone service.90

Before the breakup of AT&T, 80% of the public in a New York Times poll had expressed satisfaction with their phone service, and in 1985, 64% of the public declared the breakup a mistake.91 All that was left of the old system would be the jury-rigged system of access charges that would funnel a (decreasing) amount of money to the local phone companies for infrastructure but would prevent any fundamental technological upgrading of the system.

Was there an alternative? Minitel versus the Internet

Before turning to the other effects of the Consent Decree on regional telecommunications, it is worth considering what opportunities were lost in the manic push to competition in telephone services, especially when compared to the experience of the French Minitel system--the most developed public information network in the world before the expansion of the Internet. In looking at the French experience, it is clear that the regulatory push in the US for competition in telephone service undermined the ability to deliver an integrated electronic information network and delayed its availability to the public by a decade or more.

Many Americans do not realize that in 1994, as most people were hearing about the existence of the Internet for the first time and Netscape was just being founded, France had for ten years been running an integrated electronic information network that pervaded homes and workplaces across that country. With over 6 million subscribers and 20,000 commercial services, the Minitel system in France completely dwarfed the Internet and private information services like Prodigy and Compuserve. Almost 50 percent of the French population, encompassing a wide diversity of the nation, had access to Minitel terminals at home or at work, while 30 percent of businesses (and 95 percent of businesses with more than 500 employees) were connected to the system. Almost every bank had developed home banking services allowing customers to check their accounts, pay bills and trade stocks. Travel agencies, insurance companies and retailers had all created services on the system, all of these services generating $1.4 billion in revenue by 1993.92

Minitel consoles were found in bars and hotels, libraries and offices, rural post offices and mountain railway stations, making any debate about information "haves" and "have-nots" almost irrelevant. As one US analyst observed in 1994, "The real difference which I noticed in France is the French system's omnipresence. You see it everywhere; or rather, you know it is there but you don't see it--the system has achieved the technological goal of invisibility currently being set by product developers for the Internet."93

What is clear is that it was the integrated nature of the French telecommunications system, and a financial structure built around fixed returns on capital rather than marginal market transactions, that allowed this broad information network to emerge so much earlier in France than in the US. Where business and US policy pushed for clear market segmentation between the telephone and computer industries, the French government early on committed to integrating them. In 1975, two researchers were commissioned by the French President to develop a strategy for computerizing French society. The report by those researchers, Simon Nora and Alain Minc, became a bestseller and coined a new word, "Telematique," which defined the merger of computers and communication technologies as the key to innovation in government and business. France in the early 70s had one of the least developed telecommunications systems in the developed world, with only 7 million telephone lines serving a population of 47 million people. With massive investments needed to finance the expansion of the system to achieve universal access, the promotion of information services became not just a visionary goal but a financial necessity to subsidize the expansion of traditional phone service.

An initial concept was to have an electronic phone directory substitute for the rising costs of printing phone books and staffing directory assistance. With prototype terminals installed in test cities from 1980 to 1983, the system seemed to be a technical and sociological success. A full-scale distribution of free terminals to customers was begun, with over a million terminals in place by the first year and over 3 million distributed by 1987. The free distribution of the terminals, hotly debated within the government-run phone company initially, was a key innovation in attracting a critical mass of users quickly and giving the enterprise the aura of democracy that generated quick acceptance. The other key component was adopting a common, standard protocol that interfaced the telephone network easily and cheaply with the packet-switching data network. The phone company would remain a common carrier, while a range of other businesses could easily "set up shop" on the system, with Minitel collecting fees in a simple and cost-effective way that showed up on customers' phone bills. As would happen with the US's Internet, electronic mail and chat would unexpectedly explode use of the network, as would the controversial "messageries roses" sex-related services. This traffic would overload the system in 1985 and give both fame and notoriety to the new system as it was reengineered for the continually expanding traffic.

Financially, the system could only succeed because France Telecom could subsidize the start-up costs and the free distribution of terminals with the expectation of long-term returns due to its monopoly position. Given the link between the expansion of basic phone service and the Minitel service, it has been broadly disputed exactly how well the Minitel system has done in returns on investment, but audits have shown that the system reached the break-even point in 1989, with Minitel terminals paying for themselves within 5.7 years. By the year 2000, the Minitel system will have earned an internal rate of return of 11.3 percent over the 1984 to 2000 period. But only an integrated telecom monopoly generating fixed-cost returns could have invested with that long-term a view of financial returns and with the confidence that it could capture the revenues from its investments over that period.94

In the US, the hemorrhaging of fixed-cost returns for AT&T as new businesses were allowed to grab high-end telecom markets meant that there was little chance of such comprehensive investments, especially as AT&T faced the increasing reality of its dismemberment. And even if AT&T had been willing to take the financial risk, the regulatory segmentation of phone and computer markets (in the name of competition) meant that the company was legally barred from even offering, much less subsidizing, Minitel-style terminals to its customers. Without a central phone system enforcing standards, the result in the US was a slew of niche networking markets, each with its own proprietary technology and standards. These proprietary technologies fragmented the electronic information services market, a fragmentation that, compared to France, delayed the mass availability of an electronic information network by over a decade.

Even as one set of government officials fragmented the telecommunications marketplace in the name of competition, other government leaders centered at ARPA would create the Internet system of standards and finance the backbone lines that would, belatedly, create an alternative data exchange system as an overlay on the phone system. As detailed back in chapter 2, the Internet leadership would use government and university facilities to create the critical mass of users needed to make Internet standards compelling to other private networking companies. As the system grew, the exemption of Internet providers from paying access fees to local phone companies would create the subsidies that rapidly expanded it.

The Internet was a brilliant technological solution to that networking fragmentation, but its effects on regional investments in overall telecom infrastructure and on general equity were the reverse of those of the Minitel system. France's Minitel system subsidized the expansion of traditional phone service for that country's lower-income users, while in the US traditional phone system users had to subsidize the Internet. And where the Minitel system made information services available on a democratic basis through the free availability of terminals, in the US you had to own a computer to get access to the early Internet, so subsidies in the system invariably flowed to upper-income and business users.

If the Internet had a relative strength, it was the fact that in having to overcome the patchwork of proprietary systems within the US, it became a strong vehicle for overcoming the different electronic standards used between countries and therefore emerged, with the economic and political strength of the US government behind it, as the international networking standard. Even France was promoting the use of the Internet by 1997, starting to introduce dual-use terminals that could access both Minitel and Internet services.95 But if the push in the United States for networking had been combined with the long-term investments that AT&T could have pursued with the technological innovation of Bell Labs behind it, there seems a high likelihood that electronic networking could have expanded earlier and more democratically within the United States.

Expanded Regulation and the Aftermath of the AT&T Breakup

While the integrated AT&T system was obliterated with the Consent Decree, the need to regulate not only did not diminish but expanded as the FCC and other bodies had to microregulate the charges paid at every step of the new market in telecommunications. As well, while the subsidies from long distance to local operating companies diminished in the new system, they did not end and the FCC and local regulatory bodies would have an increasing role in detailing exactly how to allocate those subsidies. This intimate involvement by regulators was in contrast to the period of the integrated AT&T system that had left those day-to-day decisions to AT&T management.

The breakup of AT&T would unleash a flood of new regulations and lobbying, as the telecommunications division of the DC Bar Association, previously a nonexistent category, doubled in size each year while the splinters of the telecommunications industry each demanded regulation in its own self-interest.96 Exactly how much to charge long distance companies for access to local phone companies' networks would remain a contentious issue. Initial FCC proposals to raise local rates $8 per year to replace previous payments by AT&T provoked a firestorm of opposition and encouraged 1984 legislation by Congress to maintain at least part of the previous subsidy system from long distance. Local rates would rise, but continued regulatory battles would determine by how much.

While the original conception of the local operating companies was that of pure common carriers who would engage in no equipment-related or other competitive markets, the reality was that the Baby Bells were not economically viable with lessened subsidies, so they were given ownership of local Yellow Pages, the right to sell phone equipment (but not to manufacture it), and entry into a host of competitive markets in order to bolster revenues. Of course, with local phone monopolies operating in these local competitive markets, this just reopened all the issues that had led to regulatory controversy around AT&T, so a new host of regulations was needed to govern the entry of the Baby Bells into these competitive markets. With cellular phones, the Internet and a host of more specialized services demanding interconnection into the phone system, a continual stream of microregulations would be needed to govern fair rates for the mandatory interconnection that local operating companies would have to accept. The FCC would spend years embroiled in endless debates over which economic theory should be consulted in order to set rates for interconnection. In a sense, the question was how to simulate the market price for a good (i.e. interconnection) that the phone company would refuse to offer if given a free choice in the marketplace--an amusing theological question that would consume endless hours of regulatory time.

As the Telecommunications Act of 1996 opened up competition in local phone service, a blizzard of new regulations at the FCC and at state regulatory bodies began to govern where interconnection would occur, what network elements would be provided to new market competitors in individual "unbundled" markets, and how to arbitrate prices for each element.97 New regulations would mandate that geographic distance rather than density of traffic or costs of service would govern rates--a practice meant to benefit hard-to-access rural areas but one that created a new round of regulatory contention as wireless phone companies complained that this undercut their potential competitive advantage in serving such rural areas.98 As the ISP access charge exemption showed, regulations on whether and how interconnection rates were set would fundamentally shape which technology was deployed. With each regulatory decision representing a choice between economic competitors pushing their own technology, "deregulation" has made politics, rather than cost, more of a determining factor in technological direction than it was in the days of AT&T's dominance.

As many of these disputes ended up in federal court, it also became clear that what was at stake was less whether regulation was expanding (since it obviously was) than who would control that regulation, federal or local authorities. When a federal court ruled in 1997 that the FCC had overstepped its powers in limiting the power of state regulators, you had the spectacle of one set of companies, largely long distance carriers, lining up to support federal regulation by the FCC, and another set of businesses, largely the regional Bell companies, lining up to support local regulators.99 Despite the rhetoric popular in Washington, DC of "returning power to the states," the reality was that in telecommunications, as in so many other areas, federal regulators were stripping power from local authorities while radically expanding their own jurisdiction.

Adding to the morass of new regulations was the issue of what should be funded with the access fees dedicated to "universal service" and how to assure that new entrants to the marketplace were paying their share of the overall infrastructure costs. As competition entered the local market, a whole host of companies began seeing themselves as possible recipients of those funds (and could thereby score an economic advantage if their rates could be supplemented with the universal service fund). Additionally, while the extra costs of subsidizing phone service in rural communities and for the poor had been the traditional focus for such funding in the days of AT&T, the 1996 Telecommunications Act mandated that universal service be defined in terms of "evolving levels of telecommunications services."100 The FCC implemented this mandate by creating a new assessment on all phone services to give public schools and libraries discounts of up to 90 percent off local phone service. This was combined with mandates from local regulators to offer those institutions high-speed lines for Internet access. Exactly how these requirements would be applied created a whirlwind of lawsuits and ad-hoc decisions by regulators.101 "I don't know any economist who thinks the current system makes sense in any terms other than politics," said Michael Katz, a professor at the Haas School of Business and former chief economist at the FCC, in surveying the 1997 decisions of the FCC.

The End of Regional Phone Companies

One key to the old Bell system was that it combined an integrated phone system with strong decentralized operating companies in each state. Since each operating company had to raise its own debt and operate within its own regulatory environment, those local companies became fixtures of their communities, representing the Bell system's commitment to universal service.

However, the post-breakup Baby Bells increasingly abandoned their local focus in favor of new, more profitable ventures serving higher-profit customers. Even before the mega-mergers between four of the Baby Bells (NYNEX and Bell Atlantic, Southwestern Bell Corporation and Pacific Telesis), a merger and investment frenzy had permeated the operating companies. US West Inc. invested $2.5 billion to acquire a 25 percent interest in Time Warner Inc.; NYNEX Corp. put $1 billion into Viacom International in its fight to take over Paramount, receiving preferred stock and two seats on the board. SBC Communications made a $1 billion joint venture with Cox Enterprises. The failed merger of Bell Atlantic and cable giant Tele-Communications Inc. (TCI) just pushed the merger frenzy further.102

This was all quite a change from the desperate financial straits the operating companies found themselves in immediately following divestiture. But in many ways it was a logical result of an economic environment that left their core business open to economic cannibalization by other market competitors, thereby making investments in that core business unattractive from a shareholder perspective. The change for California customers was probably the most dramatic, since Pacific Telephone had been one of the prime beneficiaries of the old AT&T system. With its rapidly expanding population and its tight pro-consumer regulatory environment in the 60s and 70s, California had received a disproportionate share of the long distance "separation" subsidies. (AT&T saw one of the greatest advantages of divestiture as escaping its financial support for keeping up with California's expanding phone infrastructure.) Post-breakup, Pacific Telesis was left so overloaded with debt from those expansion costs that it remained financially solvent only because Congress erased $1.5 billion of tax liability in 1982 legislation.103 With its new solvency, it quickly began leveraging its regulated phone infrastructure assets into new ventures in competitive telecommunications markets. An analysis by Boston-based Economic and Technology Inc. estimated that between 1984 and 1993, 95.7 percent of the capital for those new ventures came from local phone assets.104

One result of those competitive investments was Pacific Telesis's cellular phone system, which it spun off as a separate company, AirTouch, in 1994. With the aggregate value of the separated companies $6 billion more than before the spinoff, the shareholders of Pacific Telesis had done well off the investments diverted from regulated infrastructure to competitive markets. The utility consumer advocacy group TURN demanded that ratepayers receive $1 billion as their share of these returns, but the increasingly conservative Public Utilities Commission granted only a $50 million payback.105 What Pacific Telesis also delivered to its customers was lagging investment in upgrading the outdated infrastructure needed to accommodate advanced networking technologies. Despite having its headquarters in the expanding technology centers of Northern California, Pacific Telesis ranked dead last among the Baby Bells in use of fiber optic cables, the key to delivering high-bandwidth applications into homes and small businesses across the region. In 1995, only 6 percent of its access lines were fiber optic (compared to 11 percent for leader Bell Atlantic), and it ranked near the bottom of the Bells in installation of digital switches as well.106 Between mandated subsidies to private Internet providers and its own diversion of resources into market ventures like the AirTouch spinoff, Pacific Telesis had essentially looted the infrastructure serving average consumers in favor of ventures that met the needs of private telecommunications systems serving high-income customers.

The final death knell for regional telecommunications in California would come on March 31, 1997, as the state's utilities commission approved the takeover of Pacific Telesis by its fellow Baby Bell, Texas-based SBC Communications. The new entity would have a combined market value of $47.9 billion and serve seven of the ten largest markets in the country; through SBC, it would have global telecommunications stakes in Mexico, Chile, South Korea, Taiwan, France, South Africa, and Israel. Clearly, this new telecommunications colossus had outgrown the day-to-day regional universal service concerns of its heritage.

There would be one last-ditch attempt to claim some funding for the community before the once-proud community institution disappeared into its multinational merger. Based on California law requiring that at least half the profits from any utility merger be returned to ratepayers, the Public Utilities Commission's Office of Ratepayer Advocacy and the consumer advocacy group TURN estimated those merger profits to be at least $2 billion, with the public's share thereby being at least $1 billion. The California Telecommunications Policy Forum, an advocacy organization based largely in communities of color, would mobilize a coalition of public interest, community and labor organizations to demand that the state utilities commission take the merger as an opportunity to "ensure that all sectors of California's diverse economy and population derive measurable and substantive benefits." To this end, in addition to ensuring that long-term infrastructure investments were made, the coalition advocated that half of the estimated $1 billion be given back as a rate cut and the other half be dedicated to a "telecommunications trust fund" that would finance community-based technology centers in sixty of the poorest communities in the state, consumer education and advocacy, additional funds for infrastructure in low-wealth school and library districts, and scholarships for low-income students to enter the telecommunications field.107

However, ignoring its own staff's recommendation of a $590 million ratepayer refund, the state utilities commission ordered only a $248 million refund and ignored the recommendations for a telecommunications trust fund. As Pacific Telesis was swallowed by SBC, Pac Bell customers could expect a rebate of 25 cents per month. TURN would declare that the decision "sells out California's autonomy for a very few pieces of gold."108 Much as Wells Fargo and Bank of America were escaping regional obligations through merger (and as PG&E would look to do in coming years), the regional phone company that had anchored civic economic life in California had ceased to exist.

Economic Standards/Technological Standards: The Collapse of Regional Economic Models

What is clear across the three sectors of banking, energy utilities and telecommunications is how the radical shift in networking technology has been inextricably tied to changing economic models. Technology largely originating in Silicon Valley would cascade across these sectors, facilitating the expansion of models of market competition while undermining the regional reinvestment models that were once a key part of the cross-class agreement that stabilized income and equality for all regional actors.

Previous technological regimes had been backed by economic pricing models that allowed key regional actors to expand their networks of customers and recover the fixed costs of the investments needed for that expansion. Banking would depend on Regulation Q and other restrictions on interest rates to prevent a focus only on high-income borrowers, while both energy utilities and the phone system would mandate universal rates for all customers. With fixed rates of return on capital investment more or less guaranteed under these political regimes, each regional industry's fate would depend on the expansion of its customer base and the general prosperity of the region--a critical element in these companies' commitment to the civic economic life of their regions.
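The logic of that older pricing model can be seen in the standard rate-of-return formula regulators used to set a utility's prices (a textbook simplification here, not any particular commission's rule):

$$ \text{Revenue requirement} \;=\; \text{Operating expenses} \;+\; r \times \text{Rate base} $$

where the rate base is the depreciated value of the capital the utility has invested in its network and $r$ is the rate of return the regulator allows on that capital. Because $r$ was essentially fixed, total earnings could grow only as the rate base and the customer base grew--which is why, under this regime, a regional utility's self-interest lay in expanding service and regional prosperity rather than in skimming its most profitable customers.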

New technology, combined with aggressive political lobbying, would break open that older regulatory regime in favor of new market-oriented pricing regimes. The new technology would cascade across the networks: computer-managed money market funds would open the first major breach in universal interest rates; the finance industry's use of technology would put new pressures on the phone infrastructure (as in the dramatic failures on Wall Street in 1969), encouraging businesses and high-income customers to bypass the regular phone network; and the emerging telecommunications networks would allow big business to begin wholesale direct purchases of energy for the first time. These new technological possibilities were seized on by big business and the wealthy to push for regulatory changes that would let them escape the cross-class investments that had been at the heart of post-war regional growth coalitions. Usury laws were repealed, new long distance companies like MCI were largely exempted from obligations to fund local phone infrastructure, and large industry was increasingly allowed to bypass fixed utility rates, even as these broad macro regulations were replaced by increasingly detailed rules governing the proliferating segments of each marketplace.

In the process, while business absorbed the economic fruits of technological innovation (largely created through the investments promoted under the older fixed-return system), it managed to shift the costs of the transition to market-based regulatory systems onto average working families. Subsidies that had flowed from big business and the wealthy to smaller users began to be reversed as market competition allowed sophisticated users to cream the benefits of technology for themselves. This would leave the rest of the residents of the region with failing savings and loans and discriminatory lending, "stranded assets" in the power utilities, and higher phone bills for average users, even as wealthy phone customers found a new array of options, including an Internet that consumed greater and greater portions of the regional phone utility's resources.

The fraud of "deregulation" in each of these industries was that government intervention, rather than disappearing, would become even more crucial and, in many ways, more intrusive. The fixed capital model of regional development had largely been one of delegating economic development to designated regional actors, whether in finance, energy or telecommunications. With their markets broadly defined and restricted, relatively light regulatory oversight could assure that the public interest was served and that the infrastructure needed to undergird commerce was sustained and well managed. With the rise of market competition, however, the maintenance of that infrastructure, especially in the area of technology standards, becomes less automatic, and higher levels of government intervention become necessary. These interventions may be supplemented by private consortia trying to collectively sustain the infrastructure, as with Silicon Valley firms creating alliances around electronic banking standards or energy and technology companies in Texas underpinning energy management structures. But the underlying rules of commerce--how to assure fair access to each market, and even where one market ends and another begins--all require government to intervene with careful regulations to maintain the infrastructure of each underlying network of transactions. And with the traditional alignment of self-interest and public interest largely dissolved under the pressure of market competition, whatever shreds of commitment to regional development and economic equity remain now require even greater regulatory intervention to address those goals.

Many of the strongest proponents of market competition have seen the solution to this increasing regulatory morass as moving even more decisively to cut the link between private profit and public goals. End any attempt to achieve equity or access within the industries themselves and just let the government issue cash vouchers to the rural and urban poor. Let the poor buy their goods on the open market without the government trying to engineer indirect subsidies.

While such proposals deliberately ignore the regulatory structures that would still be needed to calculate the rates for interconnection to and maintenance of these networks, the real hypocrisy of such proposals is that most of those promoting direct subsidies are precisely the same political actors who have supported the elimination of other direct subsidies to the poor, from school lunches to heating oil to health care. Even as subsidies for business interconnection to networks would remain implicit in public policy, the subsidies for the poor would be laid out as a line item in budgets, ripe for political attack. As one example, when the California PUC proposed listing universal service funds as a line item on customer bills, reporters produced headlines about a "new" customer tax on toll service, creating a political focus on those subsidies even as the subsidies for business and upper-income users went ignored.109

What is even more politically duplicitous is that even as the bases of local regional economies are torn asunder by the marketization of traditional regional economic anchors, these same market advocates promote the "decentralization" of government spending to local governments. This is doubly destructive to economic equity since, beyond the loss of political support for equity at the local level, networking technology is helping to undermine the fiscal survival of local government as global commerce erodes its tax base. The next chapter will outline the Internet's effect on regional governments' tax revenue as a backdrop to the fiscal vise limiting the public information systems governments are able to create for their citizens.

----------------------------------------------

Endnotes for Chapter 5

1 See Harvey, The Urban Experience, 1989 and Molotch, 1976.

2 Vogel, Steven K. Freer Markets, More Rules: Regulatory Reform in Advanced Countries. Cornell University Press. Ithaca. 1996.

3 Hills, Jill. Deregulating Telecoms: Competition and Control in the United States, Japan and Britain. Quorum Books. Westport, CT. 1986.

4 Meister, Charles K. "Converging trends portend dynamic changes on the banking horizon." Bank Marketing v28, n7 (Jul 1996):15-21.

5 Lubove, Seth. "Cyberbanking." Forbes v158, n10 (Oct 21, 1996):108-116.

6 "Banks worldwide plan to increase Internet services in 1997." Bank Marketing v29, n1 (Jan 1997):9.

7 Orr, Bill. 'We're not in Kansas anymore'. ABA Banking Journal v88, n7 (Jul 1996):72.

8 Martin, Frances. "Banking on Internet banking's success." Credit Card Management v9, n5 (Aug 1996):50; "SFNB marks one year on the frontline of internet banking." ABA Banking Journal v88, n12 (Dec 1996):62 and Rule, Bruce. "Internet bank decides the real money's in consulting." Investment Dealers Digest v63, n2 (Jan 13, 1997):12.

9 Junnarkar, Sandeep. "Fewer Bricks Mean Higher Returns At New Internet Banks." New York Times. February 25, 1998.

10 "More banks follow the Internet route." Banker v146, n846 (Aug 1996):20.

11 Hicks interview, Ibid.

12 Morgan, David. "Internet Begins To Demonstrate Its Value To Business." Reuters New Media. April 11, 1997.

13 Lebowitz, Jeff. "The dawning of a new era." Mortgage Banking v56, n9 (Jun 1996):54-66.

14 Zuckerman, Sam. "Taking small business competition nationwide." US Banker v106, n8 (Aug 1996):24-28+.

15 Zuckerman, Ibid.

16 Booker, Ellis. " 'Net to reshape business." Computerworld v29, n11 (Mar 13, 1995):14 and Orr, Bill. 'We're not in Kansas anymore'. ABA Banking Journal v88, n7 (Jul 1996):72.

17 Bers, Joanna Smith. "Banks must decide whether to be a catalyst or catatonic in the face of e-commerce." Bank Systems & Technology v33, n11 (Nov 1996):16.

18 Davis, Beth. "Wells Fargo to certify Net payments." Informationweek, n610 (Dec 16, 1996):30; McGann, Michael. "Wells Fargo gets SET-compliant certificates." Bank Systems & Technology v34, n3 (Mar 1997):20; Fisher, Lawrence. "Hewlett in Deal for Maker of Credit-Card Devices." New York Times. April 24, 1997.

19 Clark, Tim. "HP leads e-commerce initiative." NEWS.COM. December 2, 1997.

20 Winkler, Connie. "Wells Fargo stakes out new frontiers." Computerworld, Financial Services Journal Supplement (Nov 1996):F14-F18.

21 Early history of Bank of America comes from Nash, Gerald D. A.P. Giannini and the Bank of America. University of Oklahoma Press. Norman and London. 1992 and Nocera, Joseph. A Piece of the Action: How the Middle Class Joined the Money Class. Simon and Schuster. New York. 1994.

22 Nash, p. 57.

23 Johnston, Moira. Roller Coaster: The Bank of America and the Future of American Banking. Ticknor & Fields. New York. 1990, p. 53.

24 Nocera, p. 139.

25 Nocera, p. 136.

26 Johnston's Roller Coaster carefully traces the fall of Bank of America in the 70s and 80s.

27 Lewis, Michael. Liar's Poker: Rising through the Wreckage on Wall Street. Penguin Books. New York, 1989.

28 Johnston, p. 229.

29 Lewis, p. 104.

30 Meister, Charles K. "Converging trends portend dynamic changes on the banking horizon." Bank Marketing v28, n7 (Jul 1996):15-21.

31 Gandy Jr., Oscar. "It's Discrimination, Stupid!" in Resisting the Virtual Life: The Culture and Politics of Information, ed. James Brook and Ian Boal (City Lights, San Francisco:1995).

32 Prince, Cheryl J. "Last chance to recapture payments." Bank Systems & Technology v33, n7 (Jul 1996):28-32.

33 Levinson, Marc. "Get Out of Here!" in Business Week, June 3, 1996.

34 Junnarkar, 1998, Ibid.

35 Meister, Ibid.

36 Ritter, Richard J. "Redlining: The Justice Department cases." Mortgage Banking v55, n12 (Sep 1995):16-28 and Garwood, Griffith L.; Smith, Dolores S. "The Community Reinvestment Act: evolution and current issues." Federal Reserve Bulletin v79, n4 (April, 1993):251.

37 "Wells Fargo announces $45 billion CRA commitment in First Interstate bid." ABA Bank Compliance v17, n1, Regulatory & Legislative Advisory (Jan 1996):5-6; and Monck, Ellen Rowley. "How a megabank is answering the CRA challenge." Journal of Commercial Lending v76, n1 (Sep 1993):47-54.

38 Rainbow, Roger. "Global forces shape the electricity industry." Electricity Journal v9, n4 (May 1996):14-20.

39 Spiers, Joe. "Upheaval in the electricity business." Fortune v133, n12 (Jun 24, 1996):26-30.

40 Logan, John R. and Harvey Molotch. Urban Fortunes: The Political Economy of Place. University of California Press. Berkeley. 1987.

41 Frye, Colleen. "Electric utilities have Oasis in sight." Software Magazine v16, n10 (Oct 1996):19.

42 Schuler, Joseph F Jr. "Oasis: Networking on the grid." Public Utilities Fortnightly v134, n20 (Nov 1, 1996):32-36; Ramesh, V C. "Information matters: Beyond OASIS." Electricity Journal v10, n2 (Mar 1997):78-82; Gawlicki, Scott M. "The OASIS horizon." Independent Energy v26, n9 (Nov 1996):26-29.

43 "FERC says use Web to post transmission access data." Electricity Journal v9, n1 (Jan/Feb 1996):5-6.

44 Jubien, Sidney Mannheim. "The regulatory divide: Federal and state jurisdiction in a restructured electricity industry." Electricity Journal v9, n9 (Nov 1996):68-79.

45 Katz, Marvin. "Beyond Order 636: Making the most of existing pipeline capacity." American Gas v77, n5 (Jun 1995):26-29.

46 Lander, Greg M. "Just in time: EDI for gas nominations." Public Utilities Fortnightly v134, n3 (Feb 1, 1996):20-23; Hollis, Sheila S; Katz, Andrew S. "Wired or mired? Electronic information for the gas industry." Public Utilities Fortnightly v134, n3 (Feb 1, 1996):15-18; White, Brian. "Electronic bulletin board standardization resulting from FERC EBB working group actions." Gas Energy Review v22, n8 (Aug 1994):2-11.

47 Wallace, Scott. "Power to the People: A Complex Web of Computerized Transactions Is Involved in Delivery of Electric Power." Computerworld v25, n22 (Jun 3, 1991):79-80; Weisul, Kimberly. "Electricity trading mart moves toward Internet." Investment Dealers Digest v63, n4 (Jan 27, 1997):13; "FERC says use Web to post transmission access data." Electricity Journal v9, n1 (Jan/Feb 1996):5-6.

48 Schuler, Nov 1996; Gawlicki, Nov 1996; Frye, Oct 1996; Dunlap, Charlotte. "Integrator spins Web sites for utility outlets." Computer Reseller News, n698 (Aug 26, 1996):65-66.

49 Weisul, Jan 1997; Marshall, Jonathan . "PG&E Is First To Use the Net To Sell Surplus." San Francisco Chronicle. November 12, 1996.

50 Richard, Dan; Lavinson, Melissa. "Something for everyone: The politics of California's new law on electronic restructuring." Public Utilities Fortnightly v134, n21 (Nov 15, 1996):37-41; Gawlicki, Nov 1996.

51 "Pacific Gas & Electric to unload 4 plants, 3,059 MW." Electricity Journal v9, n10 (Dec 1996):4-5.

52 See Coleman, Charles. PGandE of California: The Centennial Story of Pacific Gas and Electric Company 1852-1952. McGraw-Hill Book Company. New York. 1952, for an early history of PG&E from the company's viewpoint.

53 Rainbow, Roger. "Global forces shape the electricity industry." Electricity Journal v9, n4 (May 1996):14-20.

54 "The First Two Decades." EPRI Journal. January 1993.

55 Balzhiser, Richard E. "Technology - It's only begun to make a difference." Electricity Journal v9, n4 (May 1996):32-45.

56 Schuler, Joseph F Jr. "1996 Regulators' Forum: Consensus & controversy." Public Utilities Fortnightly v134, n21 (Nov 15, 1996):14-24.

57 Miller, William H. "Electrifying momentum." Industry Week v246, n4 (Feb 17, 1997):69-74.

58 Clemente, Frank. "The dark side of deregulation." Public Utilities Fortnightly v134, n10 (May 15, 1996):13-15.

59 Plotkin, Hal. "Small biz gets no charge from electricity deregulation." Inc. v18, n18 (Dec 1996):32.

60 Ingram, Erik. "Deregulation -- Devil's in the Details: How competitive market in electricity may shape up." San Francisco Chronicle. May 8, 1997.

61 Spiers, Joe. "Upheaval in the electricity business." Fortune v133, n12 (Jun 24, 1996):26-30.

62 Richard and Lavinson, Nov 15, 1996.

63 Clemente, Ibid.

64 Barkovich, Barbara R. Regulatory Interventionism in the Utility Industry: Fairness, Efficiency and the Pursuit of Energy Conservation. Quorum Books. New York. 1989, p. 157.

65 Hirsch, Robert L. "Technology for a competitive industry." Electricity Journal v9, n4 (May 1996):81,80; Jones, Cate. "R&D bracing for economic fallout from deregulation." Electrical World v211, n1 (Jan 1997):22-23.

66 Jubien, Nov 1996.

67 Drinkard, Jim. "Utility Deregulation Fierce Battle." Associated Press. April 24, 1997.

68 Barkovich, Ibid, p. 156.

69 Marshall, Jonathan. "More Failures Expected for Power Network: Increased demand strains regional transmission lines." San Francisco Chronicle. August 13, 1996.

70 Ramesh, V C. "Information matters: Beyond OASIS." Electricity Journal v10, n2 (Mar 1997):78-82; Hoag, John C. "Oasis: A mirage of reliability." Public Utilities Fortnightly v134, n20 (Nov 1, 1996):38-40.

71 Gurley, J William; Martin, Michael H. "The price isn't right on the Internet." Fortune v135, n1 (Jan 13, 1997):152-154.

72 Marshall, Jonathan; Jon Swartz, "Net Service Providers Facing Fees From Uunet." San Francisco Chronicle. April 25, 1997.

73 Marshall, Jonathan; Jon Swartz. "Coalition Fights to Keep Net Fees Low; Phone companies want higher rates." San Francisco Chronicle. November 9, 1996.

74 Surfing the "Second-Wave": Sustainable Internet Growth and Public Policy. Pacific Telesis. 1997.

75 Junnarkar, Sandeep. "Regional Phone Companies to Offer New Access Technologies for ISPs." The New York Times. April 22, 1997.

76 Junnarkar, Sandeep. "New Phone Rules Will Have Mixed Effect on Net." The New York Times. May 8, 1997.

77 Elstrom, Peter. "Telecom's new trailblazers." Business Week. March 26, 1998.

78 Schiesel, Seth. "FCC Urges That Internet Phone Service Be Fee-Based." New York Times. April 11, 1998.

79 Fischer, Claude S. America Calling: A Social History of the Telephone to 1940. University of California Press. Berkeley. 1992.

80 Stone, Alan. Wrong Number: The Breakup of AT&T. Basic Books, Inc. New York. 1989, p. 52.

81 See Mueller, Milton. "Universal service and the telecommunications act: Myth made law." Communications of the ACM v40, n3 (Mar 1997):39-47 for a broad analysis and retrospective attack on the consolidation of the Bell utility system.

82 Stone, p. 10.

83 Along with Alan Stone, Temin, Peter and Louis Galambos. The Fall of the Bell System: A Study in Prices and Politics. Cambridge University Press. Cambridge. 1987 gives a good economic account of the changes in the Bell system, while Coll, Steven. The Deal of the Century: The Breakup of AT&T. Atheneum. New York. 1986 gives a good blow-by-blow history of the regulatory breakup of AT&T.

84 Kahaner, Larry. On the Line: The Men of MCI Who Took on AT&T, Risked Everything, and Won! Warner Books. New York. 1986.

85 Temin, p. 59.

86 Stone, p. 53.

87 Stone, Ibid.

88 Temin, pp. 149-150.

89 Temin, p. 227.

90 Minton, Torri, Jon Swartz. "35 cents Charge For Pac Bell Pay Phones." San Francisco Chronicle. October 18, 1997.

91 Coll, p. 367.

92 Preston, Holly Hubbard. "Minitel reigns in Paris with key French connection." Computer Reseller News, n594 (Sep 5, 1994):49-50 and Cats-Baril, William L; Jelassi, Tawfik. "The French videotex system Minitel: A successful implementation of a national information technology infrastructure." MIS Quarterly v18, n1 (Mar 1994):1-20.

93 Kessler, Jack. "Electronic networks: A view from Europe." American Society for Information Science. Bulletin v20, n4 (Apr/May 1994):26-27.

94 Cats-Baril, Ibid.; and Poirot, Gerard. "Minitel: Oui! Multimedia: Non!" Communications International v22, n7 (Jul 1995):23-24.

95 Giussani, Bruno. "France Gets Along With Pre-Web Technology." New York Times. September 23, 1997.

96 Coll, p. 366.

97 Fusting, Pamela. "Will universal service be preserved?" Rural Telecommunications v15, n6 (Nov/Dec 1996):14-22; Schroeder, Erica. "Telecom Act fuels regulatory wars." PC Week v13, n14 (April 8, 1996):51.

98 Barrett, Amy. "But Do Aspen and Vail Really Need Phone Subsidies?" Business Week. May 12, 1997.

99 Greenhouse, Linda. "High Court to Hear Dispute on Opening Phone Markets." New York Times. January 27, 1998.

100 Mueller, Milton. "Universal service and the telecommunications act: Myth made law." Communications of the ACM v40, n3 (Mar 1997):39-47.

101 Petersen, Melody. "Trenton Tells Bell Atlantic To Speed Up Urban Cable Connections." The New York Times. April 22, 1997; Marshall, Jonathan; Jon Swartz. "Sweeping Changes in Phone Rates Long-distance cuts to be passed on to customers." San Francisco Chronicle. May 8, 1997.

102 Southwick, Karen. "California, here I come." Upside v6, n12 (Dec 1994):34-45.

103 Temin, p. 305.

104 Staking Out the Public Interest in the Merger Between Pacific Telesis and Southwestern Bell Corporation. A White Paper prepared by the California Telecommunications Policy Forum. February 1997.

105 Southwick, Ibid.

106 Staking Out the Public Interest, Ibid.

107 Staking Out the Public Interest, Ibid.

108 Howe, Kenneth. "Green Light For Takeover Of Pac Bell; Refunds cut in deal with Texas firm." San Francisco Chronicle. April 1, 1997.

109 Bucholtz, Chris. "Battle lines drawn for universal service." Telephony v230, n19 (May 6, 1996):22.