Vol. 14, No. 3,339 - The American Reporter - January 17, 2008



Andy Oram Reports
WHO OWNS THE INTERNET NOW?

by Andy Oram
American Reporter Correspondent
Cambridge, Mass.


CAMBRIDGE, Mass. -- The Internet was the great non-commercial success story of our time. Commissioned by the government, built on open-source software, promulgated initially through research and academic facilities - the Internet was the crowning example of a public good, a resource without an owner, a self-regulating convocation of equals that had the power and reach to help all mankind.

All that seems threatened now. This month, local phone companies revealed a far-reaching change to Internet access. These companies, which control the line into Internet users' homes (usually through ADSL connections over traditional telephone wires), want to create varying levels of service for Internet content of their choice.

They plan to reserve high-speed connections for content they serve up, or that they accept from entertainment firms and other commercial companies willing to pay. All other content (originating from sites such as this one, the American Reporter) will receive poorer service.

And if the phone companies can do it, cable companies (the other major providers of Internet service to end-users) could very well start doing it too.

Those who hail the open Internet cringe at this initiative, which exploits the Internet to build and market private, premium content. But this is by no means the first time companies have tried to bend technology to favor their services. In fact, it's an old story.

As I'll show in this article, companies have been trying to position themselves at choke holds and manipulate the Internet since it became commercialized in the early 1990s. Such shenanigans are simply an exercise of market power. Up to now they have failed to change the essential nature of the Internet. If they threaten to do so, opponents can invoke regulatory power and antitrust law to fight them.

Case One: Walled Gardens

Parallel to the Internet, in the 1980s and 1990s, grew several commercial networks whose names are mostly part of computing history: Prodigy, CompuServe, and the one that managed to beat the odds, America Online. These sites offered email, forums, and special content to their users; they were often termed "walled gardens" because they existed only for paid subscribers, and because the companies used their content in bidding wars to win users to their exclusive service.

There was one form of competition, though, that none of the commercial companies could beat. That was the Internet, a completely uncontrolled repository of every imaginable thing anybody wanted to put up in digital form. During the mid-90s, the users of the commercial services demanded access to Internet riches, and soon there was little interest in the special, limited-access forums. The companies gambled that they could use the Internet as a lure to keep users in the walled gardens - and they lost the gamble.

The functions of Prodigy and the rest are now split into two types of business, both of which are thriving. One side of the split is pure connectivity, the other pure content.

Internet service providers (ISPs) offer end-users raw physical access to the Internet. Meanwhile, portals - which are experiencing a resurgence, and of which Yahoo! is the most successful - offer high-quality content attractions such as news and discussion forums.

Both businesses are becoming concentrated in fewer and larger corporations, which is typical for maturing markets. And as the phone company announcement showed, some companies are trying once again to combine these functions. We'll see later in the article whether this attempt to create a new choke hold can succeed.

Case Two: Peering and transit charges

The Internet grew because companies strung lines between their routers and connected to each other. No connections, no Internet.

This principle, in fact, lies at the heart of the term "Internet." For a long time, computer administrators have been running networks that cover a department, a building, or a small campus. Each network might be built on Ethernet, wireless, or some other local area network technology. Whenever an administrator connects two of these networks, the result is called an "internet" (small "i").

The vision of connecting all these networks globally led to the capital-I Internet. It was brought to fruition by the simplicity and flexibility of the TCP/IP protocols (and, some say, government requirements that these protocols be used for communications with government agencies).
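That simplicity is easy to demonstrate from a programmer's chair. The sketch below - my illustration, in Python, not anything drawn from the Internet's history - fetches a page over TCP/IP. The same handful of calls works whether the server sits on the next desk or across a dozen interconnected networks, because the IP layer handles the hops in between; the host example.com is just a stand-in.

    import socket

    # Open a TCP connection. The IP layer, not the program, works out
    # which networks the traffic must cross to reach the destination.
    with socket.create_connection(("example.com", 80), timeout=10) as conn:
        # Ask for the front page in bare HTTP/1.0.
        conn.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
        reply = b""
        while True:
            chunk = conn.recv(4096)
            if not chunk:          # server closed the connection
                break
            reply += chunk

    # Print just the status line, e.g. "HTTP/1.0 200 OK".
    print(reply.split(b"\r\n", 1)[0].decode())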

At first, ISPs carried each other's traffic for free. How else could they imagine doing it? If they put up any barriers to connection, they'd slow the growth of the miraculous Internet that increased the value all providers could offer to their users. Furthermore, the effort and cost of counting traffic, working out pricing systems, and collecting payment didn't seem worth the extra revenues they might bring. Because everybody was equal in these halcyon days, building connections was called "peering" (as in the modern term Peer-to-Peer).

By the late 1990s, though, hard-headed bean counters had taken over, and a major change ensued. The largest ISPs and backbone owners announced they would peer only with companies that could provide comparable service to them - other companies would have to pay.

What was considered comparable? Comparable companies had to have a certain geographic spread, accept a certain volume of traffic, and meet various other criteria for reliability and service.

A lot of small providers complained, but this change was economically necessary. End-users paid for the connections to the ISPs, but who would pay for the lines that stretched for thousands of miles across continents and between continents, carrying the Internet from one far-flung ISP to another? The large ISPs who owned these thick bundles of optical fibers, known as backbones, needed to charge to cover both their sunk costs and their maintenance.

According to Fred Goldstein, principal at ionary Consulting, "Major backbone operators (Tier 1, as they are called) were a new market that had to create itself from the early noncommercial Internet. Not only was there no dominant player, it was a cut-throat business in which huge operators went bankrupt. Transit charges helped make the wider Internet possible."

Still, a whiff of oligopoly hangs over the issue. The large backbone companies gambled that they could maintain a common front and force smaller companies to pay extra. And this gamble, unlike the earlier gamble of the walled garden companies, succeeded.

At that time, people also worried that large ISPs would employ technical measures to make service for users on the same ISP better than service for users on different ISPs. Certain ways to transmit streaming data (audio and video) work better if a single company has control over the whole route. Therefore, an ISP might be able to market a "quality of service" that requires users at both ends to sign up with that ISP.

This has not yet happened, perhaps because the need was not felt by users (Voice over IP, or VoIP, works pretty well on the current Internet, while few people do video teleconferencing), and perhaps because the market did not emerge for social and business reasons.

The peering controversy mostly died down in the 1990s, but it can still pop up. In October 2005, a controversy between two providers - Level 3 and Cogent - burst into public view. Level 3 wanted Cogent to start paying for its connection, and to show its muscle, cut off the connection to Cogent for three days. Subsequently they signed a new agreement. But people on each provider who were trying to reach the other's email or Web pages found out that peering and transit remain a live controversy. (Some commentators attribute the dispute to other business conflicts as well.)

Ironically, back in 1998 it was Level 3 who complained that larger companies were charging it instead of peering. What's fair or unfair looks different from the two ends of a cable.

The only policy argument over ISP transit currently lies in the international realm. ISPs in North America and Europe require the much smaller ISPs in less developed regions, notably Latin America and Africa, to pay for transit.

This has been a major bone of contention in international communications policy for years. It comes up repeatedly at meetings of that well-publicized United Nations body on Internet issues, the World Summit on the Information Society (WSIS). In fact, WSIS participants consider peering and transit arrangements more important than the issue that grabbed the headlines in the U.S., that of domain names and ICANN. So transit is now a digital divide issue.

But independent analysts back the backbone operators. They consider peering and transit not as policy but purely as business, privately negotiated and covered by non-disclosure agreements. Chris Savage, head of the Telecom/Internet practice at the law firm Cole, Raywid & Braverman, says, "To avoid transit charges, an Internet provider has to bring to the table (a) a lot of users, and/or (b) a lot of highly valued content. The providers in the underdeveloped countries, at least historically, have had neither."

So the worldwide Internet is not the seamless universality that idealists like to talk about, but rigidly segmented. The cost of my accessing a Web page in Brazil, or even in some rural parts of the U.S., is greater than my cost of accessing a Web page in Menlo Park, Calif. It is not I, however, who pays the difference (though I may well pay in the form of a longer delay during the download).

Transit charges raised costs for small ISPs in the U.S., but they didn't make much difference in their profitability. What killed most of these ISPs was the cost and difficulty of a very different kind of connection: the links between small phone carriers and the established local phone companies. The battle over the last mile had begun.

Case Three: Last-mile Legerdemain

Aside from the transit charge controversies, Internet backbones present little to fight over. This is because they have ample bandwidth for current needs, partly because of the over-optimistic investments of the dot-com boom.

Trouble arises only in the wires that connect the backbones to individual homes and businesses: the so-called "last mile." This is what our traffic passes over when we sign up with a local Internet service provider.

Originally, an ISP was just a company with a connection to an Internet backbone. Customers dialed up the ISP just like they dialed up a friend, and the phone company treated the call the same way. In the early days, ISPs were often Mom-and-Pop operations; a computer programmer might offer service as an adjunct to managing his or her own Internet connection.

But as new technologies with higher-speed access emerged, ISPs realized they had to start acting like phone companies. Some formed close relationships with small, upstart phone companies, while others created their own companies that traversed the regulatory maze to offer phone service. The upstarts ran their own lines, or more often rented lines from the old Bell phone company, the incumbent.

Once incumbent phone companies woke up and realized Internet business was big business - both because the upstarts were successful, and because cable companies started offering the Internet over cable modems - they started marketing their own service, and redoubled their efforts to cut off the competitive phone companies. These could not survive without connecting to the incumbents. Who would sign up for phone service or Internet service from a small company, if that service reached only customers of that company?

In a dozen ways, incumbents made it hard for competing phone companies to connect. Their numbers dropped precipitously during the late 1990s; few exist today.

Now the ISPs themselves are in the incumbents' direct sights. When the incumbents build new, high-speed lines, they are no longer forced by regulation to lease or share them with competing ISPs.

As for cable television companies, U.S. regulations have ruled out any requirement for them to serve competing ISPs, although Canadian regulators have taken the opposite tack. ISPs in Canada still need to buy service from companies with which they are in natural competition.

So the open Internet - the Internet cited at the beginning of this article as an exemplary achievement of noncommercialism - now ends, ironically, in choke holds. Incumbent phone companies and cable television companies both hold considerable market power, enforced by regulation. The incumbent phone companies are the children of the break-up of AT&T, a regulated monopoly; they still face only minimal competition. The cable television companies get franchises from cities and towns, and often enjoy the sole cable franchise in each community.

The incumbents and cable companies are gambling that they can re-establish walled gardens; that they can leverage the Internet to tie customers to their high-revenue offerings. Goldstein says, "It's no coincidence that the companies are rolling out these plans after most of the alternative phone companies and ISPs have disappeared." A key part of their gamble is that users won't find viable competition to move to.

So Who Now Owns The Internet?

With this background we are almost ready to tackle the historic (and perhaps histrionic) question asked at the beginning of this article.

First, we have to recognize that the Internet access offered by incumbents and cable companies to home users is notably different from Internet access as it was understood originally. In the early days, bandwidth was equal in both directions. A typical Internet site was an institution owning file, mail, and news servers; it hosted content.

When sites hosting content pushed it down their fat pipes (high-bandwidth lines) and home users downloaded it on their small pipes (dial-up lines), the users experienced the notorious "World Wide Wait."

The next step up in Internet access was ADSL (from phone companies) and cable modems (from cable companies). But both are asymmetric (that's the A in ADSL). This is part of their design.

The providers expect you to request a Web page (a very small transmission in the upstream direction, perhaps just a couple dozen bytes) and use most of your bandwidth downstream (which can easily be tens of thousands of bytes, if the page contains images or animations). Bandwidth is divided up accordingly. The model of Internet access, ensconced in current ADSL or cable lines, is a consumer model.
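A bit of arithmetic, sketched in Python below, shows how well the split fits that pattern. The figures are illustrative assumptions - a 500-byte request, a 60,000-byte page, and an ADSL profile of 1.5 megabits per second down and 128 kilobits per second up - not measurements of any particular provider.

    # Illustrative ADSL profile - assumed figures, in bits per second.
    downstream_bps = 1_500_000    # 1.5 Mbit/s toward the home
    upstream_bps = 128_000        # 128 kbit/s away from the home

    request_bytes = 500           # a small HTTP request going upstream
    page_bytes = 60_000           # a page with images coming downstream

    up_time = request_bytes * 8 / upstream_bps       # about 0.031 seconds
    down_time = page_bytes * 8 / downstream_bps      # about 0.320 seconds

    print(f"request upstream:  {up_time:.3f} s")
    print(f"page downstream:   {down_time:.3f} s")
    # Even with barely a tenth of the bandwidth, the upstream side finishes
    # first - the split matches a consumer's request-and-response pattern.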

Markets in tandem with technology can often overcome limitations. So perhaps, despite being relegated to the status of a consumer, you are merrily blogging, putting up photos, and even posting songs and videos (legally, I presume) on the Web. Most individuals do these things by forming some kind of relationship with a hub on the Internet that has fat pipes, powerful services, and terabytes of disk space. The individual remains a consumer, but can piggyback on a producer.

Meanwhile, this market fuels the growth of portals, mentioned earlier. Two examples ready at hand are the popular site for posting photos, Flickr, and the site for sharing favorite Websites, del.icio.us. Yahoo! acquired both in 2005.

Because of bandwidth restrictions, and the physical nature of the cable as a medium shared by hundreds of users, the terms of service published by most cable companies rule out servers and peer-to-peer applications. Some place absolute limits on traffic usage.

We should not be surprised that a cable company's idea of Internet access differs from the original meaning of the term. Cable companies have always existed to deliver canned content of their choice at graduated prices. When they discovered the Internet, they set aside one channel for Internet traffic; the Internet became an incentive to sign up for cable service, just as it had served the Prodigies and CompuServes of the 1980s.

In other words, the cable company leopard never changed its spots; it just let a monkey hop on its back for a ride. The lifespan of the monkey is up for debate.

Phone companies have been watching the premiums charged by cable companies for decades; now they see their opportunity to do the same. Hence the plan mentioned at the start of this article to allow different tiers of service.

Home users and small businesses have been waiting decades for faster lines (generally optical fiber instead of copper). Imagine, for instance, if you could carry on a videoconference with decent quality from home. This would cut down on a lot of ground and air travel, an ecological boon.

Phone companies are finally ramping up better connections. But the new plans would dedicate the new fat pipes to commercial vendors who pay to use them. Personal, small-business, and community-organization Internet sites would be ghettoized onto the current aging wires. And the promise of innovative applications such as video teleconferencing would remain a pipe dream.

In fact, such a policy would actually reduce incentives to build faster connections. The phone companies would be able to keep using the old ADSL lines, just marking traffic by its origin and favoring the highest bidder. The change would increase revenue without improving service.

Goldstein says, "The incumbent phone companies want to apply a 'message unit' model to Websites, who must either pay up ('800 model') or become harder to reach ('hobo class'). And perhaps they'll even block all access outside of the walled garden. This is what they set up on mobile phones, whose data services were never regulated."

The goal of favoring one type of content over another can be fulfilled through a technology called differentiated service. This is not something new, nor is it the result of oligopolistic conspiracy. Research into this area has gone on for many years, and many Internet tools support differentiated service.

Differentiated service lets administrators choose routes for data by multiple criteria, passing traffic between certain users while holding up other traffic. The important criterion might be how fast a single request gets to its destination, or how fast a heavy stream of traffic gets through in the aggregate. Reliability and cost can also be factors; each factor calls for a different way of handling traffic.
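On today's IP networks, one common mechanism behind such choices is marking each packet's DSCP field (the old Type of Service byte) and letting routers queue traffic by the mark. Here is a minimal Python sketch of the marking half, using the standard sockets interface on a Unix-like system; DSCP value 46 is the "expedited forwarding" class defined in RFC 3246. Whether any router along the path honors the mark is entirely up to the network operators.

    import socket

    EF_DSCP = 46               # "expedited forwarding" class (RFC 3246)
    tos = EF_DSCP << 2         # DSCP sits in the top six bits of the ToS byte

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # Ask the operating system to mark this socket's outgoing packets.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, tos)
    sock.connect(("example.com", 80))
    # Every packet in this flow now carries the EF mark; routers configured
    # for differentiated service may favor it - or ignore the mark entirely.
    sock.close()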

For a long time, the business goal behind much of this research was to allow ISPs to provide different quality of service to different customers, and to charge for the difference. The attempt has mostly been a failure, as I mentioned earlier.

But differentiated service has a new lease on life, and it's much more closely targeted to users and content. Particular types of traffic (identified, for instance, by port number) and particular sources and destinations can be either favored or penalized.
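The classification step is simple enough to sketch. The rule table below is entirely hypothetical - my invention, not any carrier's actual policy - but a real router applies the same logic in hardware: match each packet against an ordered list of rules and assign it a service class.

    import ipaddress

    # Hypothetical rules: (source prefix or None, destination port) -> class.
    RULES = [
        (ipaddress.ip_network("203.0.113.0/24"), 80, "premium"),  # paying partner
        (None, 5060, "penalty"),                                  # VoIP rival (SIP)
    ]

    def classify(src_ip: str, dst_port: int) -> str:
        src = ipaddress.ip_address(src_ip)
        for prefix, port, service_class in RULES:
            if (prefix is None or src in prefix) and dst_port == port:
                return service_class
        return "best-effort"    # everyone else waits in the ordinary queue

    print(classify("203.0.113.9", 80))      # premium
    print(classify("198.51.100.7", 5060))   # penalty
    print(classify("198.51.100.7", 25))     # best-effort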

The first suggestion that cable and phone companies could employ differentiated service to prefer particular content came in 1999, when Cisco Systems, the leading maker of Internet routing equipment, introduced a router specifically marketed to these companies and promising sophisticated ways to enforce preferential treatment.

Public interest groups such as the Consumer Project on Technology jumped on this development. They criticized Cisco, and by extension its potential customers, heavily. But it's hard to criticize a technology developed over many years, with support from standards and many useful applications. It's also hard to criticize companies for using technology to direct consumers to their own content. That would be asking the leopard to change its spots.

So now we can make a stab at predicting the outcome of the trend toward creating new Internet haves and have-nots. The question should be what constitutes an anti-competitive practice.

What forced the issue into public view is a bill in Congress that would explicitly stop preferential treatment and mandate "neutrality" in Internet service. The phone companies want this clause removed.

Historically, the Federal Communications Commission has tried to leave the Internet unregulated, but at key moments it has laid down rules governing the interactions between Internet services and the larger communications environment the FCC is responsible for. Most recently, it fined a phone company for blocking a Voice-over-IP provider; the phone company had clearly seen the provider as a competitor and was using its position as a choke point to curb that competition.

The FCC has freed incumbent phone companies, in one ruling after another, from the need to support competitors. The trend in Congress seems to approve. As mentioned before, cable companies have always had that freedom in the United States. But discriminating in Internet access may be a drastic change the FCC cannot stomach, a bait-and-switch approach to offering Internet service - and Congress may feel the same way.

Savage says, "It would not surprise me if, regarding Internet access, the FCC will matter more over the next three to five years than it has in the past. This is because the two kinds of entities that will now be providing the overwhelming majority of consumer Internet access are incumbent telephone companies and cable operators, which the FCC has traditionally viewed as generally within its regulatory ambit."

We should not sing a dirge over competition, either. Old competition has been vanquished, but new forms poke their shoots up.

In some areas, a second cable company may offer competition. Cellular phone companies (some owned by the incumbent phone companies and some independent) are rolling out Internet services, although not very fast ones, in North America. And in rural areas many people connect to wireless ISPs. Wireless is expected to become a more and more common solution to the last mile. In some areas it may be offered by a powerful new standard called WiMAX.

Municipalities are also getting into the free wireless act. The more games companies play with access, the more public pressure will grow for municipal governments to provide alternatives. And while the phone companies anticipated this movement and worked hard to pass laws in many states prohibiting municipal networks (a Philadelphia case made particularly big news), more and more public officials and experts are coming out in favor of them. In New Orleans (free access citywide), Philadelphia (tiered free and paid access), San Francisco (Google's wireless venture), and Tempe, Ariz. (some free, some paid access), with phone company obstructionism exposed to public view, compromises were arranged.

I don't think we need to panic over the two-tier Internet. Attempts to monopolize the Internet have failed before, and there are many factors in both the business and the legal frameworks to prevent it from happening again. We will always experience tensions between business models and the public good. But it's clear that, around the world, people want their Internet. Ultimately, they'll get it.

AR Webmaster Andy Oram is a member of Computer Professionals for Social Responsibility and an editor at O'Reilly Media.

Copyright 2008 Joe Shea, The American Reporter. All Rights Reserved.
