Regulation and the Internet

Here’s a little speech I gave to members of the EU Parliament in Brussels on Oct. 14th. The cousins are contemplating a set of Internet access account regulations that would mandate a minimum QoS level and also ban most forms of stream discrimination. This explains why such rules are a bad (and utterly impractical) idea.

The Internet is a global network, and regulating it properly is a matter of global concern. I’d like to share a view of the technical underpinnings of the question, to better inform the legal and political discussion that follows and to point out some of the pitfalls that lie in wait.

Why manage network traffic?

Network management, or more properly network traffic management, is a central focus of the current controversy. The consumer-friendly statements of policy, such as the Four Freedoms crafted by former FCC Chairman Michael Powell (now Senator McCain’s technology adviser), represent lofty goals, but they’re constrained by the all-important exception for network management. In fact, you could easily summarize the Four Freedoms as “you can do anything you want except break the law or break the network.” Network management prevents you from breaking the network, which you principally do by using up network resources.

Every networking technology has to deal with the fact that the demand for resources often exceeds supply. On the circuit-switched PSTN, resources are allocated when a call is set up, and if they aren’t available your call doesn’t get connected. This is a very inefficient technology: it allocates bandwidth in fixed amounts, regardless of the consumer’s need or his usage once the call is connected. A modem connected over the PSTN sends and receives at the same time, but people talking generally take turns. The network doesn’t allow you to save up bandwidth and use it later, for example. Telecom regulations are based on the PSTN and its unique properties. In network engineering, we call it an “isochronous network” to distinguish it from technologies like the old Ethernet, the model link-layer technology when the DoD protocol suite was designed.

The Internet uses packet switching technology, where users share communications facilities and bandwidth is allocated dynamically. Dynamic bandwidth allocation, wire-sharing, and asynchrony mean that congestion appears and disappears on random, sub-second intervals. Packets don’t always arrive at switching points at the most convenient times, just as cars don’t run on the same rigorous schedules as trains.
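
To see how transient that congestion is, consider a toy simulation of statistical multiplexing; the parameters are invented for illustration, not drawn from any real network:

```python
# Toy illustration: bursty senders sharing one link produce brief,
# random queue build-ups even though average demand is below capacity.
import random

LINK_PKTS_PER_MS = 10   # the link drains 10 packets per millisecond
SENDERS = 40            # each offers 0.2 pkt/ms on average -> 80% load
queue = 0

for ms in range(20):
    arrivals = sum(random.random() < 0.2 for _ in range(SENDERS))
    queue = max(0, queue + arrivals - LINK_PKTS_PER_MS)
    print(f"t={ms:2d} ms  arrivals={arrivals:2d}  queued={queue:2d}")
# Any slot where queued > 0 is a moment of congestion; it appears and
# vanishes on sub-second intervals, just as described above.
```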

The Trouble with White Spaces

Like several other engineers, I’m disturbed by the white spaces debate. The White Space Coalition, and its para-technical boosters, argue something like this: “The NAB is a tiger, therefore the White Spaces must be unlicensed.” And they go on to offer the comparison with Wi-Fi and Bluetooth, arguing as Tom Evslin does on CircleID today that “If we got a lot of innovation from just a little unlicensed spectrum, it’s reasonable to assume that we’ll get a lot more innovation if there’s a lot more [unlicensed] spectrum available.”

According to this argument, Wi-Fi has been an unqualified success in every dimension. People who make this argument haven’t worked with Wi-Fi or Bluetooth systems in a serious way, or they would be aware that there are in fact problems, serious problems, with Wi-Fi deployments.

For one thing, Wi-Fi systems are affected by sources of interference they can’t detect directly, such as FM baby monitors, cordless phones, and wireless security cameras. Running Wi-Fi on the same channel as one of these devices causes extremely high error rates. If 2.4 GHz and 5 GHz devices were required to emit a universally detectable frame preamble, much of this nonsense could be avoided.

And for another, we have the problem of newer Wi-Fi devices producing frames that aren’t detectable by older gear (especially original 802.11 and 802.11b equipment) without a protection frame that reduces throughput substantially. If we could declare anything older than 802.11a and .11g illegal, we could use the spectrum we have much more efficiently.
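
Here’s a rough, assumption-laden estimate of that protection overhead (it ignores ACKs, inter-frame spacing beyond one SIFS, and contention, and the timings are textbook figures rather than measurements):

```python
# Estimated throughput cost of an 802.11g CTS-to-self protection frame
# sent for the benefit of legacy 802.11b listeners. Simplified model.
FRAME_BITS = 1500 * 8          # one data frame
OFDM_RATE = 54e6               # 802.11g data rate, bits/s
OFDM_PREAMBLE = 20e-6          # OFDM preamble, seconds
CTS_TIME = 192e-6 + 10.2e-6    # long DSSS preamble + 14-byte CTS at 11 Mb/s
SIFS = 10e-6

data_time = OFDM_PREAMBLE + FRAME_BITS / OFDM_RATE
plain = FRAME_BITS / data_time
protected = FRAME_BITS / (CTS_TIME + SIFS + data_time)

print(f"no protection:  {plain / 1e6:.1f} Mb/s effective")
print(f"with protection: {protected / 1e6:.1f} Mb/s effective")
# The legacy-compatibility overhead roughly halves per-frame throughput.
```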

For another, we don’t have enough adjacent channel spectrum to use the newest version of Wi-Fi, 40 MHz 802.11n, effectively in the 2.4 GHz band. Speed inevitably depends on channel width, and the white spaces offer little dribs and drabs of spectrum all over the place, much of it in non-adjacent frequencies.
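
The dependence of speed on channel width is just Shannon’s limit, C = B log2(1 + SNR): capacity scales linearly with bandwidth. A quick computation (the 25 dB SNR is an assumption for illustration, not a model of real 802.11n modulation) makes the point:

```python
# Back-of-the-envelope Shannon capacity for various channel widths.
import math

def capacity_mbps(bandwidth_hz: float, snr_db: float) -> float:
    snr = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr) / 1e6

for width in (5e6, 20e6, 40e6):   # a white-spaces sliver vs. Wi-Fi channels
    print(f"{width / 1e6:4.0f} MHz at 25 dB SNR: "
          f"{capacity_mbps(width, 25):6.1f} Mb/s theoretical ceiling")
```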

But most importantly, Wi-Fi is the victim of its own success. As more people use Wi-Fi, we have to share the limited number of channels across more Access Points, and they are not required to share channel space with each other in a particularly efficient way. We can certainly expect a lot of collisions, and therefore packet loss, from any uncoordinated channel access scheme such as Wi-Fi’s, operating on a large geographic scale. This is the old “tragedy of the commons” scenario.
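
A toy Monte Carlo run shows the shape of the problem; the slotted-time model and fixed transmit probability are simplifying assumptions of mine, not a Wi-Fi simulation:

```python
# How uncoordinated access degrades as more APs pile onto one channel.
import random

def useful_slot_rate(n_aps: int, p_tx: float = 0.1,
                     slots: int = 100_000) -> float:
    """Fraction of slots in which exactly one AP transmits (no collision)."""
    ok = 0
    for _ in range(slots):
        transmitters = sum(random.random() < p_tx for _ in range(n_aps))
        ok += (transmitters == 1)
    return ok / slots

for n in (2, 5, 10, 20):
    print(f"{n:2d} APs sharing a channel: {useful_slot_rate(n):.0%} useful slots")
```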

The problem of deploying wireless broadband is mainly a tradeoff of propagation, population, and bandwidth. The larger the population your signal covers, the greater the bandwidth needs to be in order to provide good performance. The nice thing about Wi-Fi is its limited propagation, because it permits extensive channel re-use without collisions. If the Wi-Fi signal in your neighbor’s house propagated twice as far, it would have four times as many chances to collide with other users. So high power and great propagation aren’t an unmitigated good.
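
The square law in that sentence is just circle geometry, as a two-line check shows:

```python
# Coverage area, and with it the population of potential colliders,
# grows with the square of propagation radius (area = pi * r^2).
for r in (1, 2, 4):   # relative propagation radius
    print(f"radius x{r} -> collision opportunities x{r * r}")
```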

The advantage of licensing is that the license holder can apply authoritarian rules that ensure the spectrum is used efficiently. The disadvantage is that the license holder can over-charge for the use of such tightly-managed spectrum, and needs to in order to pay off the cost of his license.

The FCC needs to move into the 21st century and develop some digital rules for the use of unlicensed or lightly-licensed spectrum. The experiment I want to see concerns the development of these modern rules. We don’t need another Wi-Fi; we know how that worked out.

So let’s not squander the White Spaces opportunity with another knee-jerk response to the spectre of capitalism. I fully believe that people like Evslin, the White Space Coalition, and Susan Crawford are sincere in their belief that unlicensed White Spaces would be a boon to democracy; it’s just that their technical grasp of the subject matter is insufficient for their beliefs to amount to serious policy.

Comcast files their compliance plan

Today was the deadline for Comcast to tell the FCC how its existing congestion management system works, as well as how its “protocol agnostic” replacement is going to work. To the dismay of some critics, they’ve done just that in a filing that was hand-delivered as well as electronically filed today. It will be posted to the Comcast web site shortly.

The filing corrects some of the false allegations made by critics with respect to privacy, making it very clear that the existing system simply inspects protocol headers (“envelopes”) and not personal data. David Reed in particular got himself worked into a tizzy over the idea that Comcast was deciding which streams to delay based on content, but this is clearly not the case. Inside the IP envelope sits a TCP envelope, and inside that sits a BitTorrent envelope. User data is inside the BitTorrent (or equivalent) envelope, and Comcast doesn’t look at it.

The current system sets a bandwidth quota for P2P, and prevents P2P as a group from exceeding this quota (about 50% of total upstream bandwidth) by tearing down requested new uni-directional upload (i.e., file-server-like) streams with the TCP Reset bit. The system is a bit heavy-handed, but reserving 50% of the network for one class of application seems pretty reasonable, given that no more than 20% of customers use P2P at all.
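
To make that concrete, here’s a minimal sketch of the decision the old system makes, reconstructed from the description above; the function and parameter names are mine, not Comcast’s code:

```python
# Reconstruction of the old protocol-aware policy's core test.
P2P_QUOTA = 0.50  # share of upstream bandwidth P2P may occupy

def should_reset(protocol: str, unidirectional_upload: bool,
                 p2p_upstream_share: float) -> bool:
    """True if a newly requested stream gets torn down with a TCP RST."""
    return (protocol == "bittorrent"      # or an equivalent P2P protocol
            and unidirectional_upload
            and p2p_upstream_share >= P2P_QUOTA)

# A new seeding session arriving while P2P already holds 52% of upstream:
print(should_reset("bittorrent", True, 0.52))   # True: send the Reset
print(should_reset("bittorrent", True, 0.40))   # False: under quota
```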

Nonetheless, the new system will not look at any headers, and will simply be triggered by the volume of traffic each user puts on the network and the overall congestion state of the network segment. If the segment goes over 70% utilization in the upload direction for a fifteen-minute sample period, congestion management will take effect.

In the management state, traffic volume measurement will determine which users are causing the near-congestion, and only those using high amounts of bandwidth will be managed. The way they’re going to be managed is going to raise some eyebrows, but it’s perfectly consistent with the FCC’s order.

High-traffic users – those who’ve used over 70% of their provisioned bandwidth for the last fifteen minutes – will have all of their traffic de-prioritized for the next fifteen minutes. While de-prioritized, they still have access to the network, but only after the conforming users have transmitted their packets. So instead of bidding on the first 70% of network bandwidth, they’ll essentially bid on the 30% that remains. This will be a bummer for people who are banging out files as fast as they can, only to have a Skype call come in: even if they stop BitTorrent, the first fifteen minutes of Skyping are going to be rough. A more pleasant approach would be to let excessive users out of QoS jail with credit for good behavior – if their utilization drops to Skype level, let them out in a few seconds, because it’s clear they’ve turned off their file-sharing program. This may be easier said than done, and it may raise the ire of Kevin Martin, given how irrational he has been in his anti-cable vendetta.
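
Here is the fifteen/fifteen logic in code form, a sketch based on my reading of the filing; the thresholds come from the description above, while the function shape and names are my own simplification:

```python
# Sketch of Comcast's "fifteen/fifteen" classification as described.
SEGMENT_TRIGGER = 0.70   # segment upstream utilization that arms management
USER_TRIGGER = 0.70      # share of a user's provisioned bandwidth
WINDOW_MIN = 15          # measurement (and penalty) window, minutes

def classify(segment_utilization: float, user_usage_share: float,
             minutes_sustained: int) -> str:
    """Priority for the next window: 'best-effort' traffic is served
    only after 'priority' users' packets have been transmitted."""
    congested = segment_utilization > SEGMENT_TRIGGER
    heavy = (user_usage_share > USER_TRIGGER
             and minutes_sustained >= WINDOW_MIN)
    return "best-effort" if (congested and heavy) else "priority"

print(classify(0.82, 0.90, 15))   # heavy user on a hot segment: demoted
print(classify(0.82, 0.30, 15))   # light user on the same segment: unaffected
```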

The user can prevent this situation from arising, of course, if he wants to. All he has to do is set the upload and download limits in BitTorrent low enough that he doesn’t consume enough bandwidth to land in the “heavy user” classification, and he won’t have to put up with bad VoIP quality. I predict that P2P applications and home gateways are going to incorporate controls to enforce “Comcast friendly” operation to prevent de-prioritization. There are other more refined approaches to this problem, of course.
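
For instance, a client or home gateway could enforce such a cap with a standard token bucket, sketched here from scratch; the rates are hypothetical numbers of my own choosing:

```python
# A token bucket that meters upload bytes to a configured rate,
# keeping the user under the "heavy user" threshold.
import time

class TokenBucket:
    def __init__(self, rate_bytes_per_s: float, burst_bytes: float):
        self.rate, self.capacity = rate_bytes_per_s, burst_bytes
        self.tokens, self.stamp = burst_bytes, time.monotonic()

    def try_send(self, nbytes: int) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.stamp) * self.rate)
        self.stamp = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False          # caller backs off and retries later

# Cap uploads at ~60% of a hypothetical 1 Mb/s provisioned upstream:
bucket = TokenBucket(rate_bytes_per_s=75_000, burst_bytes=150_000)
print(bucket.try_send(1500))  # True: this packet fits under the cap
```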

At the end of the day, Comcast’s fifteen/fifteen system provides users with the incentive to control their own bandwidth appetites, which makes it an “end-to-end” solution. The neutralitarians should be happy about that, but it remains to be seen how they’re going to react.

It looks pretty cool to me.

UPDATE: Comcast-hater Nate Anderson tries to explain the system at Ars Technica. He has some of it right, but doesn’t seem to appreciate any of its implications. While the new system will not look at protocol headers (the evil “Deep Packet Inspection” that gets network neophytes and cranks so excited), and it won’t use TCP Resets, that doesn’t mean that P2P won’t be throttled; it will be.

That’s simply because P2P contributes most of the load on residential networks. So if you throttle the heaviest users, you’re in effect throttling the heaviest P2P users, because the set of heavy users and the set of heavy P2P users is the same set. So the “disparate impact” will remain even though the “disparate treatment” will end.

But the FCC has to like it, because it conforms to all of Kevin Martin’s rabbit-out-of-the-hat rules. The equipment Comcast has had to purchase for this exercise in aesthetic reform will have utility down the road, but for now it’s simply a tax imposed by out-of-control regulators.

FCC finally issues Comcast memo

Kevin Martin and his Democratic Party colleagues at the FCC have issued their Comcast order, available at this link. They find some novel sources of authority and apply some interesting interpretations of the facts. I’ll have some detailed commentary after I’ve read it all and checked the footnotes. It’s an amusing exercise, if you like that sort of thing.

For a good summary of the order, see IP Democracy.

BitTorrent Soap Opera continues

Valleywag’s outstanding reporting on the BitTorrent collapse continues with a detailed account of the tussle:

BitTorrent has denied our report that the company laid off 12 out of 55 employees. That may be true: While our source told us 12 employees were on the layoff list, we’ve learned that, at the last minute, the jobs of two sales engineers, an HR manager, and an office manager were spared. Another tipster — “you can guess as to whether I’m an insider or not” — says that the BitTorrent layoffs aren’t the fault of new CEO Doug Walker, who came to the those-crazy-kids file-sharing startup to add some enterprise-software gravitas. Instead, the elimination of BitTorrent’s sales and marketing departments amounts to a coup by cofounders Bram Cohen and Ashwin Navin, pictured here to Walker’s right and left, who are giving up on the notion of marketing BitTorrent’s file-sharing technology to businesses and hardware makers, and instead pinning their hopes on becoming an “Internet peace corps.”

One part that I can confirm is the lack of enthusiasm for DNA (BitTorrent’s Delivery Network Accelerator) on the part of the tech people. I’ve asked them why anybody should care about DNA, and all I got was silence.

How long until we hear about the equally vexing woes at Vuze? They won their battle with Comcast before the FCC, at the expense of their corporate viability. Peer-to-peer needs to be domesticated, but the FCC has forbidden that. The only other choice is extermination, and metered pricing will take care of that quite efficiently.

Sad.

Previous entry here.


Kevin Martin’s secret regulations

As the crescendo of criticism builds against the FCC’s pending publication of its new rules for Internet access providers, the New York Times emerges as the sole source of pro-FCC coverage. They publish a bizarre Op-Ed by Free Press chairman Tim Wu equating competing carriers with OPEC and confusing the general trend in broadband prices – sharply down – with the trend for gas prices, which runs in the opposite direction entirely:

AMERICANS today spend almost as much on bandwidth — the capacity to move information — as we do on energy. A family of four likely spends several hundred dollars a month on cellphones, cable television and Internet connections, which is about what we spend on gas and heating oil.

Here’s what’s happening to broadband prices at Comcast:

High-speed Internet revenue increased 10% to $1.8 billion in the second quarter of 2008 from $1.6 billion in 2007 reflecting a 12% increase in subscribers and a 3% decline in average monthly revenue per subscriber to $42.01, reflecting the impact of additional bundling and the recent introduction of new offers and speed tiers.

I’d love to see a 3% year-over-year decline in the average monthly gas bill, even at the same volume level. But the Comcast figures show consumers upgrading to higher speed tiers (like Blast, which I measure at 28 Mb/s download speed) and still seeing an average decline in prices. Wu isn’t talking about life in the Real World™.
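
The arithmetic in the quoted figures hangs together, too, once you allow for rounding in the percentages:

```python
# Sanity check on the quoted Comcast figures: subscriber growth times
# ARPU change should approximate the reported revenue growth.
subs_growth, arpu_change = 1.12, 0.97   # +12% subscribers, -3% ARPU
print(f"implied revenue growth: {subs_growth * arpu_change - 1:.1%}")
# -> about 8.6%, in the neighborhood of the reported ~10% given that
#    all of the quoted percentages are themselves rounded.
```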

Martin himself held a pow-wow with Times reporters, hoping to evoke some of that old-time populism that the nation’s elite daily is so good at. BITS blogger Saul Hansell reports on Martin’s faulty facts and shoddy analysis:

“The network operators can recoup their investment in the network and can charge for access to network services, but consumers have complete control over the devices and content that don’t have anything to do with investment in the underlying network,” he said.

I asked about reports that AT&T now bans all use of peer-to-peer networking software on its wireless data network. It also bans some video services, like the Slingbox feature that lets you watch your home television signal on your cellphone.

Mr. Martin declined to answer. His view is that the commission should not publish explicit regulations. Rather, it should address complaints that are made, as it did with the Comcast case.

“The commission is very careful in that we look at the particular facts that are in front of us. We are not judging the next case,” he said. “Hard and fast rules can actually be over- and under-inclusive, and they can also have adverse impact.”

Mr. Martin was asked whether the commission’s approach will push more Internet providers to start to impose caps on how much bandwidth consumers can use.

He said he wanted to reserve judgment on that trend. He seemed comfortable with Internet providers offering services with limits, so long as they are clearly stated.

So we have this new regime for Internet access providers where every move they make is to be judged according to a list of secret regulations. If ever there was a recipe for stalemate, this is it.
