Don’t regulate, collaborate

FCC Commissioner Robert McDowell has written one of the most sober and sensible essays on the Internet’s present technical crisis in today’s Washington Post. With so many members of the Commission willing to jump into the breach with ex post facto rules and regulations, it’s good to see that there are some on the inside of the regulatory machine who have a sense of the Internet’s history. See Who Should Solve This Internet Crisis?

The Internet was in crisis. Its electronic “pipes” were clogged with new bandwidth-hogging software. Engineers faced a choice: Allow the Net to succumb to fatal gridlock or find a solution.

The year was 1987. About 35,000 people, mainly academics and some government employees, used the Internet.

This story, of course, had a happy ending. The loosely knit Internet engineering community rallied to improve an automated data “traffic cop” that prioritized applications and content needing “real time” delivery over those that would not suffer from delay. Their efforts unclogged the Internet and laid the foundation for what has become the greatest deregulatory success story of all time.

The Internet has since weathered several such crises. Each time, engineers, academics, software developers, Web infrastructure builders and others have worked together to fix the problems. Over the years, some groups have become more formalized — such as the Internet Society, the Internet Engineering Task Force and the Internet Architecture Board. They have remained largely self-governing, self-funded and nonprofit, with volunteers acting on their own and not on behalf of their employers. No government owns or regulates them.

The Internet has flourished because it has operated under the principle that engineers, not politicians or bureaucrats, should solve engineering problems.

Today, a new challenge is upon us. Pipes are filling rapidly with “peer-to-peer” (“P2P”) file-sharing applications that crowd out other content and slow speeds for millions. Just as Napster produced an explosion of shared (largely pirated) music files in 1999, today’s P2P applications allow consumers to share movies. P2P providers store movies on users’ home and office computers to avoid building huge “server farms” of giant computers for this bandwidth-intensive data. When consumers download these videos, they call on thousands of computers across the Web to upload each of their small pieces. As a result, some consumers’ “last-mile” connections, especially connections over cable and wireless networks, get clogged. These electronic traffic jams slow the Internet for most consumers, a majority of whom do not use P2P software to watch videos or surf the Web.

At peak times, 5 percent of Internet consumers are using 90 percent of the available bandwidth because of the P2P explosion. This flood of data has created a tyranny by a minority. Slower speeds degrade the quality of the service that consumers have paid for and ultimately diminish America’s competitiveness globally.
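
The swarm mechanics McDowell describes are easy to picture in miniature. Here is a toy sketch, my own illustration with invented peer names and data rather than any real client's code, of how a BitTorrent-style downloader assembles a file piece by piece from many peers. This is exactly why the upload burden lands on thousands of ordinary last-mile connections:

```python
import random

def download(pieces_needed, peers):
    """pieces_needed: set of piece indices; peers: dict of peer -> set of pieces held."""
    assembled = {}
    while pieces_needed:
        piece = random.choice(sorted(pieces_needed))
        holders = [p for p, held in peers.items() if piece in held]
        if not holders:
            raise RuntimeError("no peer has piece %d" % piece)
        peer = random.choice(holders)           # real clients prefer rare pieces and fast peers
        assembled[piece] = "data-from-" + peer  # placeholder for the transferred bytes
        pieces_needed.discard(piece)
    return assembled

# A tiny three-peer swarm; every peer uploads some of the pieces.
swarm = {"peer-a": {0, 1, 2}, "peer-b": {1, 2, 3}, "peer-c": {0, 3}}
print(download({0, 1, 2, 3}, swarm))
```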

The Commissioner makes many of the points that those of us who've been involved in developing and refining Internet protocols since the Internet Meltdown have made: new applications have broken the Internet before. After the FTP congestion crisis was averted by Van Jacobson's patch, the Internet very nearly ground to gridlock in the early 1990s when HTTP 1.0 came along and opened too many TCP virtual circuits. That problem was averted by HTTP 1.1, which used fewer VCs more efficiently. We didn't need government mandates to solve it; everyone was motivated already.
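
For readers who want the flavor of the HTTP fix, here is a minimal sketch using only Python's standard library (example.com stands in for any server). HTTP 1.0-era browsers opened a fresh TCP connection, with its own slow-start ramp, for every object on a page; HTTP 1.1's persistent connections carry many requests over one circuit:

```python
import http.client

URLS = ["/", "/index.html", "/about.html"]

# HTTP 1.0 style: a fresh TCP connection (and slow-start ramp) per object.
for path in URLS:
    conn = http.client.HTTPConnection("example.com")
    conn.request("GET", path)
    conn.getresponse().read()
    conn.close()

# HTTP 1.1 style: one persistent connection carries all three requests.
conn = http.client.HTTPConnection("example.com")
for path in URLS:
    conn.request("GET", path)
    conn.getresponse().read()   # drain the body so the connection can be reused
conn.close()
```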

The P2P crisis is already the focus of intense industry collaboration in the P4P Working Group sponsored by the DCIA and in the IETF. Whatever orders the FCC issues on the complaints against Comcast are going to be less helpful than these collaborative efforts, and will in all likelihood retard the course of the Internet’s technical evolution.

Don’t regulate, collaborate.

Regulate first, ask questions later

Press reports on the FCC’s vote on the Vuze/Free Press petitions against Comcast suggest a peculiar outcome, where FCC orders Comcast to stop managing BitTorrent and to also tell the FCC how and when it manages BitTorrent:

The FCC would require Comcast to stop slowing or blocking access to certain online applications, mostly video file-sharing services such as BitTorrent. The company would also be required to provide more disclosure to consumers about its network management practices and provide more details to the FCC about how it’s blocked or slowed traffic in the past.

If the FCC is convinced the management is wrong, why ask for the data? And why only ask for the data after nearly a year of investigating and three raucous public spectacles?

Vuze recently changed its business model, providing a search service that covers piracy sites such as Mininova and The Pirate Bay:

In addition to Vuze.com, the new search box gives users the option to search third-party web sites, with Mininova, Sumotorrent, BTJunkie and Jamendo being preselected. With the exception of Jamendo, all of these also feature unlicensed content. In fact, Mininova was sued by Dutch rights holders just a few weeks ago. But Vuze CEO Gilles BianRosa told me that he doesn’t think his company could run into trouble by searching these sources. “We have considered the existing legal framework and feel comfortable about the addition of this feature to our new release,” he told me, adding that rights holders could use the search to add their platforms to the mix as well.

We have a curious outcome where the FCC is ordering carriers to provide free bandwidth to pirates.

Small wireless ISPs are hit harder by this order than the large corporations are. If they can't manage BitTorrent, they're out of business; Brett Glass is in exactly that situation.

See my recent FCC Comments here.

More as this develops, but for now enjoy the debate at DSL Reports, where the nefarious scheme to allocate bandwidth fairly first emerged.

John Dunbar’s AP story is here. Pretty straight coverage.

Nate Anderson's Ars Technica piece is not so straight, tilting toward editorial.

Blog talk is here, thanks to the good folks at Google™.

Adam Thierer gives props to the big-government, pro-regulation team, writing at TLF:

It is a difficult thing for me to say, but I am man enough to do it: I must congratulate our intellectual opponents on their amazing victory in the battle to impose Net neutrality regulations on the Internet. With the Wall Street Journal reporting last night that the FCC is on the verge of acting against Comcast based on the agency’s amorphous Net neutrality principles, it is now clear that the folks at the Free Press, Public Knowledge, and the many other advocates of comprehensive Internet regulation have succeeded in convincing a Republican-led FCC to get on the books what is, in essence, the nation’s first Net Neutrality law. It is quite an accomplishment when you think about it.

Indeed.

Bob Fernandez covers the story in the Philly Inquirer, Comcast’s hometown paper:

Consumer and advocacy groups say action by Martin is necessary to preserve First Amendment protections on the Internet and to protect broadband consumers. Free Press, an advocacy group opposed to media consolidation, filed the complaint with the FCC. It was disappointed that Martin wouldn’t fine Comcast to send a message to the industry.

But others warn that Martin’s decision, announced at a Washington news conference, advances the FCC’s powers on the Internet without new laws.

“This is the foot in the door for big government to regulate the Internet,” said Adam Thierer, a senior fellow at the Progress and Freedom Foundation, a free-market think tank in Washington. “This is the beginning of a serious regulatory regime. For the first time, the FCC is making law around net neutrality.”

Net neutrality refers to the concept that Internet operators should treat all data traffic the same and not interfere with it – a subject hotly debated in recent years on Capitol Hill. Companies say they sometimes interfere with Internet traffic for practical reasons, like easing data jams.

Nobody ever mentions that unmanaged traffic causes more delay for users than managed traffic.

Comcast sets the record straight

In the course of pursuing their grievance with the FCC over broadband traffic management, Free Press and its allies have developed an annoying tendency to overstate the qualifications of their “experts” and to make wild technical assertions unsupported by empirical data. They pass Robb Topolski off as a “network engineer” when he was, while employed, a low-level tester of PC software. David Reed, who was in the design loop for TCP/IP in the 1970s but has gone in other directions since then, is represented as having worked continuously for 35 years on the advancement of Internet protocols. Free Press now employs Topolski and increasingly relies on him for analysis.

Comcast has finally said “enough is enough” and filed a document with the FCC addressing the inaccuracy of Free Press and Topolski’s claims about its management systems:

• First, Comcast’s High-Speed Internet customers can and do access any content, run any application, and use any service that they wish.

• Second, our network management practices are similar to those deployed by other Internet service providers in the United States and around the world, and are reasonably designed to enable, not hinder, the high-quality user experience that the Internet Policy Statement contemplates and that competitive marketplace considerations require.

• Third, although Free Press and its consultants believe they know and understand Comcast’s network and how it manages that network, they do not, and they have made no legitimate effort to gain such an understanding (as others have recently done).

• Fourth, Comcast’s network management practices are not discriminatory and are entirely agnostic as to the content being transmitted, where it is being sent from or to, or the identity of the sender or receiver.

• Finally, Comcast’s customer service agreements and policies have long disclosed that broadband capacity is not unlimited, and that the network is managed for the benefit of all customers. Comcast’s disclosures have always been comparable to — and are now far more detailed than — almost any other Internet service provider’s disclosures.

The bottom line is this: the Internet is a web of shared communication links provisioned according to statistical predictions about traffic. Any application or user that consumes more bandwidth than the typical profile takes it away from others. The owner/manager of every link has a responsibility to assure fair access, and allowing applications with enormous bandwidth appetites to gobble up an unfair share of communication opportunities is a failure to live up to that responsibility.
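
To make "statistical provisioning" concrete, here is a toy binomial model; the numbers are invented for illustration. A link sized for 25 simultaneously active users out of 100 is essentially never congested when users are active 10 percent of the time, and congested most of the time when always-on applications push activity toward 30 percent:

```python
from math import comb

def p_overload(users, p_active, capacity):
    """Probability that more than `capacity` of `users` transmit at once
    (independent-activity binomial model; a deliberate simplification)."""
    return sum(comb(users, k) * p_active**k * (1 - p_active)**(users - k)
               for k in range(capacity + 1, users + 1))

print(p_overload(100, 0.10, 25))   # bursty web users: overload is vanishingly rare
print(p_overload(100, 0.30, 25))   # always-on P2P: overload most of the time (~0.84)
```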

Comcast has been charged with degrading an innovative new application, but the facts don’t support the charge. Actually, the innovative new application – P2P as presently implemented – has the effect of degrading traditional applications. Hence, P2P has to be managed.

So the only interesting question is how. There are several reasonable ways to manage P2P traffic. The burden is on the FCC and the petitioners to show that the Sandvine system isn’t one of them, and they haven’t seriously attempted to do so.

Hiding behind wild claims and overblown rhetoric doesn’t help consumers, doesn’t protect free speech, and doesn’t improve the nature of broadband networking.

Sober analysis does, and that’s what we try to do here. Kudos to Comcast for standing up to these bullies.

Does the FCC have the authority after all?

MAP attorney Harold Feld has put together an interesting argument on the FCC’s authority to sanction Comcast over the BitTorrent management question:

If the FCC had said directly to Comcast: “If in the future evidence arises that any company is willfully blocking or degrading Internet content, affected parties may file a complaint with the Commission.” I would think we could all agree that this constituted “notice,” yes? Perhaps not notice of whether or not the behavior at issue constituted blocking or degrading — that is, after all — what the Commission determines in a complaint. But certainly if the FCC had told Comcast directly, to its face, no ifs and or buts, the above quoted line, I would hope we could all agree that Comcast had received reasonable notice that parties could bring complaints to the Commission, asking the Commission to determine whether the parties had behaved in an inappropriate manner.

Comcast rebuts this argument in a recent FCC filing.

IANAL, so I don’t have an opinion on the soundness of Harold’s legal argument or Comcast’s rebuttal, but if Harold were correct, the argument would simply shift from authority to reasonable network management. The system Comcast was using made an a priori judgment that P2P running unattended under load was less deserving of bandwidth than interactive applications. That is not an unreasonable judgment, as a) P2P is a bandwidth hog by design; and b) it was asking for more than the typical user was getting. But there are certainly cases in which a particular instance of P2P is not hogging, and those cases lead us to the empirical question about the state of the link at the time of the throttling. And that’s what I told the FCC in both my filing and my oral testimony.
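
To be clear about what that empirical test would look like, here is a sketch of a congestion-sensitive throttling rule. This is my own illustration of the principle, not Sandvine's or Comcast's actual algorithm; all the names and thresholds are hypothetical:

```python
def should_throttle(link_utilization, flow_kbps, fair_share_kbps,
                    is_unattended_bulk, threshold=0.85):
    """Throttle only under measured load, and only flows above a fair share."""
    if link_utilization < threshold:      # link has headroom: leave everything alone
        return False
    if not is_unattended_bulk:            # interactive traffic keeps priority
        return False
    return flow_kbps > fair_share_kbps    # the empirical question: is this flow hogging?

# An attended web session on a busy link is untouched; a heavy unattended
# P2P upload on the same link is not.
print(should_throttle(0.92, 300, 500, is_unattended_bulk=False))   # False
print(should_throttle(0.92, 2000, 500, is_unattended_bulk=True))   # True
```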

I’m glad we have all these clever lawyers to resolve these devious questions of authority, but wouldn’t it be simpler to know the rules in advance? That matters because terms like “degrade” are hopelessly vague when we’re talking about a shared wire. Not managing P2P means that web browsing gets degraded, and it makes no difference to the user whether the degradation comes from ISP action or ISP inaction, so Comcast is screwed either way. That’s un-American.

Liveblogging the FCC, Panel 2

See First Panel here, and the live video here.

David Farber, former FCC chief technologist, now at CMU:
What would you need 300 baud for? It motivated faster TTYs. We’re moving to faster networks, and that will stimulate new applications. Is this going to lead to a better world, or to 1984? Don’t cut off the future with bad regulations. Big rush to restrict P2P traffic. It’s not all illegal, but it’s hard to tell. Peak loads are hard to restrict with monthly caps. Three-dollar surcharge on video downloads.

Court calls FCC “arbitrary and capricious”

The Third Circuit delivered the big smackdown to the FCC over the wardrobe malfunction incident:

The court said the FCC is free to change its policy without “judicial second-guessing,” but only with sufficient notice. “Because the FCC failed to satisfy this requirement,” the court added, “we find its new policy arbitrary and capricious under the Administrative Procedure Act as applied to CBS.”

It also found that CBS could not be held strictly liable for the actions of independent contractors — another argument the FCC made for its finding. “The FCC cannot impose liability on CBS for the acts of Janet Jackson and Justin Timberlake, independent contractors hired for the limited purposes of the halftime show,” the court said.

This ruling has implications for the proposed sanctions against Comcast: both involve post-hoc rules, and both involve sticking it to someone other than the bad actor. The court doesn’t approve of the FCC making rules after an incident has occurred, which is exactly what the FCC proposes to do in the case of Comcast’s management of P2P. Notice and rule-making have to precede sanctions, not follow them.

And the bad actor notion also applies. The Court found that Jackson and Timberlake were the bad actors, not CBS. In the P2P case, the users who congested the network are the bad actors, not the operator who sought to rein them in.

Chairman Martin, note this well.

Also of interest: the Court noted that most of the complaints against CBS were junk:

The Opinion notes CBS’s research indicating that over 85 percent of those complaints came from forms produced by activist groups. Many of the protests were filed in duplicate, “with some individual complaints appearing in the record up to 37 times,” CBS asserted.

The same can be said of the junk comments manufactured by Free Press against Comcast, of course. Free Press employed the electronic equivalent of seat-warmers to flood the FCC with junk comments, to the tune of 30,000 duplicate complaints.

Recommended reading

Brett Glass has filed a very good letter with the FCC regarding the current controversy. Of particular interest is one of the “Four Freedoms”, the freedom to run any application you want:

It’s important to step back and think about the implications of this clause – the one which Comcast has been accused in the current proceeding of having “violated.”

An application (a technical term for any computer program which is not an operating system) encodes and embodies behavior — any behavior at all that the author wants. And anyone can write one. So, insisting that an ISP allow a user to run any application means that anyone can program his or her computer to behave any way at all — no matter how destructively — on the Internet, and the ISP is not allowed to intervene. In short, such a requirement means that no network provider can have an enforceable Acceptable Use Policy or Terms of Service.

This is a recipe for disaster. Anyone who engages in destructive behavior, hogs bandwidth, or even takes down the network could say, “I was just running an application… and I have the right to run any application I want, so you can’t stop me.”

The application freedom, like the others, is limited by “reasonable network management,” which is undefined. So the real exercise is defining that term, because the operative essence of the four freedoms is “you can do any damn thing you want, except for what you can’t do, and here’s what you can’t do.” Rather than enumerate freedoms, Michael Powell should have enumerated restrictions on users, carriers, and services.

That’s hard work, but it’s the kind of thing that serious policy-makers do. Restrictions should start with the following list:

1. You can’t lie to your customers or the public, nor can you be economical with the truth:
– You have to fully disclose terms of service in language as plain as it can be, using standard metrics and terminology.

2. The Internet is a shared facility, and no one is entitled to overload any portion of it.

3. You can’t exploit dominant market share to fix prices or eliminate competition.

4. You can’t terminate services arbitrarily or without notice.

5. You can’t operate equipment on the public Internet with doors and windows open to malware, viruses, and bots. If your equipment is hijacked, you will summarily be cut off.

6. No stealing.

Etc.

Some of these apply to carriers, some to users, and some to services. In a mature Internet, we all have responsibilities, not just freedoms. With great power, etc.

Hyperventilating in New York

As one would expect, the New York Times editorial page is not happy with the Supreme Court’s decision upholding the right to keep arms. But the language of their editorial is quite a bit over the top:

This is a decision that will cost innocent lives, cause immeasurable pain and suffering and turn America into a more dangerous country. It will also diminish our standing in the world, sending yet another message that the United States values gun rights over human life.

I doubt that the effects of this decision will be that far-reaching. It’s mainly just a slap in the face to jurisdictions that practice a particularly paternalistic form of government, where incomes are high, crime is low, and symbolism trumps substance. Criminals still commit most of the crime, and the criminal’s relationship with his weapon isn’t altered by the law.

For a more sensible analysis, see the Sacramento Bee’s Gun ban reversal has limited reach:

WASHINGTON – The U.S. Supreme Court’s historic decision Thursday on the right to bear arms was a sweeping pronouncement of constitutional principles that will nonetheless have little practical impact in most of the country, legal experts said.

Now that’s more like it.

Genarlow Wilson

Mark Cuban is thumping the drum on the Genarlow Wilson case, a gross miscarriage of justice in Georgia:

For those who don’t know. Genarlow Wilson was sentenced to 10 years in jail for doing something every 17 year old I knew, including me, tried to do. He is two years into this nightmare that only makes the State of Georgia a posterchild for mistrust in government.

It’s a fine example of legislating morality, blind punishment, and the oppression of mankind’s most basic instincts. The way to help is to contribute to a Georgia fathers’ rights or men’s rights organization because they’re the only ones working to reform the laws that Genarlow was prosecuted under. Read the ESPN story here and the New York Times story here.

Genarlow Wilson may not be a model citizen in the Disneyland sense, but his failings are perfectly normal for a teenaged boy, and considerably less significant than those of Pastor Ted Haggard or Congressman Mark Foley. Neither of them is behind bars and Genarlow shouldn’t be either.

The anguish of regulation

Note: This post isn’t clear. I’m trying to say that the notion of “layering” in network protocol design doesn’t mean there’s some kind of firewall of ignorance between layers. In layered architectures, protocol layers advertise services to their higher-layer consumers, and notions of regulation built on the notion of layering have to take that fact into account. Crawford misunderstands protocol layering and attempts to build a regulatory framework on the back of her mistaken idea.

Some of the fans of network neutrality regulations are sincere but misguided, such as law professor Susan Crawford. She’s in a lot of anguish about how to sell the regulators’ agenda*:

If the only economic and cultural justifications you have for the need for a layered approach to internet regulation (an approach that treats transport differently from applications) are (1) the explosive innovation that competition among applications would produce and (2) the appropriate mapping between the “actual” architecture of the internet and the regulatory approach to be taken to it, you’ll lose.

But she never questions whether the “layered approach to regulation” is a good thing or even a well-understood thing. I see this a lot among the legal academics, who seem to base most of their regulatory model on a defective model of protocol layering. Lessig is the prototype for this misunderstanding, as he wants to extract architectural features from the Internet of the Past and use them to constrain the development of the Internet of the Future.

I work with layered protocols, and have for more years than I can remember, so please allow me to explain what layering means in real network systems. We divide network functions between abstract layers (physical, link, network, session, application) so we can mix and match combinations for real systems. So the IP network layer can run on the Ethernet link layer or the WiFi link layer, and work pretty much the same. And we can run Ethernet over a fiber-optic physical layer or a copper-pair physical layer, and have it work pretty much the same.

The key here is understanding what “pretty much the same” means. Each protocol at each layer has its own constraints, and higher layers have to be able to accommodate them. For example, Ethernet packets can’t be more than 1500 bytes long, but WiFi packets are bigger and ATM packets (cells) are smaller. So IP needs to know what the size constraints of the link layer are so it can adjust to them and operate efficiently.

The way this is done is through a service interface between the network layer and the link layer that allows the higher layer protocol to discover the capabilities of the lower layer protocol and behave accordingly. So while these two layers are defined and built separately, they’re intimately connected through a shared interface that allows them to operate together smoothly.
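
A minimal sketch of such a service interface, in Python for illustration only (the class names are mine, not from any real stack): the network layer asks the link layer for its capabilities, here just the MTU, instead of knowing how any particular link technology works internally.

```python
class LinkLayer:
    def mtu(self) -> int: ...
    def send(self, frame: bytes) -> None: ...

class Ethernet(LinkLayer):
    def mtu(self) -> int: return 1500
    def send(self, frame: bytes) -> None: print("eth frame, %d bytes" % len(frame))

class WiFi(LinkLayer):
    def mtu(self) -> int: return 2304          # 802.11 MSDU limit
    def send(self, frame: bytes) -> None: print("wifi frame, %d bytes" % len(frame))

def ip_send(payload: bytes, link: LinkLayer) -> None:
    """The network layer adapts to whatever link it is handed: fragments are
    sized by the advertised MTU, not by knowledge of the link's internals."""
    size = link.mtu()
    for i in range(0, len(payload), size):
        link.send(payload[i:i + size])

ip_send(b"x" * 4000, Ethernet())   # 3 fragments
ip_send(b"x" * 4000, WiFi())       # 2 fragments
```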

At the link layer, many protocols can offer different services, each appropriate to a different set of applications. WiFi, for example, has a voice service that handles short packets needing transmission and reception at regular intervals differently from long packets that are less sensitive to delay but more sensitive to corruption and loss. The network lingo for this selection of services is Quality of Service, or QoS. Note that it’s not really correct to say that voice QoS is “better” than the bulk-data QoS called “Best Effort”; it’s simply different. It would not be in your interest to use voice-grade QoS for downloading files from Netflix, even if those files contained movies, because it actually constrains total bandwidth: you essentially trade off moving a lot of data for moving a little data very quickly.

The tragedy of the Internet is that the IP layer doesn’t have good facilities for selecting QoS options from the layers below it, and this makes it difficult for applications to get the service they need from the network top-to-bottom and end-to-end. So in real systems we bypass IP through something called a “control plane” and tell the link layer how to fit QoS around the data that need it.
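
What applications can do today, absent a clean QoS interface at the IP layer, is mark their packets and hope the links along the path honor the marking. A sketch in standard-library Python (the socket option is POSIX-flavored, so assume Linux/Unix; the address is an example):

```python
import socket

EF_DSCP = 46               # "Expedited Forwarding", the usual voice marking
TOS_VALUE = EF_DSCP << 2   # DSCP occupies the top 6 bits of the old TOS byte

voice = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
voice.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

bulk = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)   # left at best effort

# voice.sendto(b"20ms audio frame", ("192.0.2.1", 5004))  # documentation address
```

A control plane that actually reserves link-layer resources goes further than a ToS mark, but the marking shows the shape of the request: the application names a service class, and the layers below decide how to honor it.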

But the main point is that the segregation of functions into protocol layers doesn’t mean that each layer doesn’t know what the other layers are doing. In fact, the layers must know what their options are and how to use them, even though they don’t need to know how the other layers make these options available. So the layered approach to protocol design doesn’t preclude diversity of services, it in fact facilitates it by sharing the important information and hiding the unimportant details.

In the real world, a layered approach to regulation would begin by identifying service options and the means for requesting them. The neuts don’t get this; they begin by banning service-level communication between layers. That’s what “just move the bits, stupid” means. It’s bad network design and it’s bad regulation.

*Crawford blocks referrals from this blog. She’s at: http://scrawford.blogware.com/blog.