Has the FCC Created a Stone Too Heavy for It to Lift?

After five years of bickering, the FCC passed an Open Internet Report & Order on a partisan 3-2 vote this week. The order is meant to guarantee that the Internet of the future will be just as free and open as the Internet of the past. Its success depends on how fast the Commission can transform itself from an old school telecom regulator wired to resist change into an innovation stimulator embracing opportunity. One thing we can be sure about is that the order hasn’t tamped down the hyperbole that’s fueled the fight to control the Internet’s constituent parts for all these years.

Advocates of net neutrality professed deep disappointment that the FCC’s rules weren’t more proscriptive and severe. Free Press called the order “fake net neutrality,” Public Knowledge said it “fell far short,” Media Access Project called it “inadequate and riddled with loopholes,” and New America Foundation accused the FCC of “caving to telecom lobbyists.” These were their official statements to the press; their Tweets were even harsher.

Free marketers were almost as angry: Cato denounced the order as “speech control,” Washington Policy Center said it “fundamentally changes many aspects of the infrastructure of the Internet,” and the Reason Foundation said it will lead to “quagmire after quagmire of technicalities, which as they add up will have a toll on investment, service and development.”

Republican Congressional leaders made no secret of their displeasure with the FCC’s disregard for their will: Rep. Fred Upton (R-Michigan), the incoming Commerce Committee Chairman, called it a “hostile action against innovation that can’t be allowed to stand”; Rep. Greg Walden (R-Oregon), incoming Chairman of the Subcommittee on Communications and Technology, called it a “power grab” and vowed to hold hearings to overturn it; while Sen. Kay Bailey Hutchison (R-Texas), Ranking Member of the Senate Commerce, Science, and Transportation Committee, said the order “threatens the future economic growth of the Internet.” Setting Internet policy is indeed a Congressional prerogative rather than an agency matter, so the longer-term solution must come from the Hill, and sooner would be better than later.

Contrary to this criticism and to snarky blogger claims, not everyone was upset with the FCC’s action, coming as it did after a year-long proceeding on Internet regulation meant to fulfill an Obama campaign pledge to advance net neutrality. The President himself declared the FCC action an important part of his strategy to “advance American innovation, economic growth, and job creation,” and Senator John Kerry (D-Massachusetts) applauded the FCC for reaching consensus.

Technology industry reaction ranged from positive to resigned: Information Technology Industry Council President and CEO Dean Garfield declared the measure “ensures continued innovation and investment in the Internet,” TechNet supported it, and National Cable and Telecommunications Association head Kyle McSlarrow said it could have been much worse. At the Information Technology and Innovation Foundation, we were pleased by the promises of a relatively humble set of rules, less so with the final details; we remain encouraged by the robust process the FCC intends to create for judging complaints, one that puts technical people on the front lines. In the end, the order got the support of the only majority that counts, three FCC commissioners.

Most of us who reacted favorably acknowledged the FCC’s order wasn’t exactly as we would have written it, but accepted it as a pragmatic political compromise that produces more positives than negatives. The hoped-for closing of the raucous debate will have immense benefits on its own, as simply bringing this distracting chapter in the Internet’s story to an end will allow more time for sober discussion about the directions we’d like the Internet to take in its future development. There is no shortage of policy issues that have been cramped by the tendency to view net neutrality as the one great magic wand with the power to solve all the Internet’s problems: The FCC has work to do on freeing up spectrum for mobile networking, the Universal Service Fund needs to be reformed, and the National Broadband Plan needs to be implemented.

If the FCC’s approach proves sound, it might well be exported to other countries, forming the basis of a consistent international approach to the oversight of an international network developed on consistent standards of its own. Such an outcome would have positive consequences for the Internet standards community, which has its own backlog of unfinished business such as scalable routing, congestion management, security, and the domestication of peer-to-peer file sharing and content delivery networks to resolve. This outcome is far from inevitable; last minute rule changes make it less likely than it might have been.

The most important thing the FCC can do in implementing its system of Internet oversight is to elevate process over proscriptive rules. The traditional approach to telecom regulation is to develop a thick sheath of regulations that govern everything from the insignias on the telephone repair person’s uniform to the colors of the insulators on RJ11 cables and apply them in top-down, command-and-control fashion. Many of those on the pro-net neutrality side are steeped in telecom tradition, and they expected such an approach from the FCC for the Internet; theirs are the angry reactions.

But the Internet isn’t a telecom network, and a foot-high stack of regulations certainly would produce the negative consequences for innovation and progress the FCC’s critics have forecast. The appropriate way to address Internet regulation is to follow the model that the Internet has developed for itself, based on a small number of abstract but meaningful principles (each of which is subject to change for good reason) applied by a broad-based community of experts in a collaborative, consultative setting. Internet standards are not devised in an adversarial setting populated by angels and devils locked into mortal combat; they come from a process that values “rough consensus and running code.”

The specifics of the FCC’s order nevertheless give pause to those well-schooled in networking. A few hours before the Commission’s vote, Commissioner Copps persuaded Chairman Genachowski to reverse the Waxman Bill’s presumption regarding the premium transport services that enable Internet TV and video conferencing to enjoy the same level of quality as cable TV. Where the early drafts permitted these services as long as they were offered for sale on a non-discriminatory basis, the final rule arbitrarily presumes them harmful.

The order makes a hash of the relationship between the content accelerators provided by Akamai and others and the presumptively impermissible communication accelerators that ISPs might provide one day in order to enable HD group video conferencing and similar emerging applications. The Commission majority fears that allowing network operators to offer premium transport to leading edge apps will put the squeeze on generic transport, but fails to consider that such potential downsides of well-accepted technical practices for Quality of Service can be prevented by applying a simple quota limit on the percentage of a pipe that can be sold as “premium.” This fact, which is obvious to skilled protocol engineers, goes unmentioned in the order.
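The quota idea can be sketched in a few lines. This is a minimal illustration, not anything in the order: the link capacity, the 30% cap, and the function name are all invented assumptions.

```python
# Illustrative sketch: cap the share of a link that may be sold as "premium"
# transport, so generic best-effort traffic always keeps a guaranteed floor.
# The 100 Mbps capacity and 30% quota are assumed figures, not from the order.

LINK_CAPACITY_MBPS = 100.0
PREMIUM_QUOTA = 0.30  # at most 30% of the pipe may carry premium traffic

def admit_premium_flow(requested_mbps, premium_in_use_mbps,
                       capacity=LINK_CAPACITY_MBPS, quota=PREMIUM_QUOTA):
    """Admit a new premium flow only if the premium quota is not exceeded."""
    return premium_in_use_mbps + requested_mbps <= capacity * quota

# With 25 Mbps of premium already sold on a 100 Mbps link and a 30% quota,
# a 4 Mbps video-conference flow fits, but a 10 Mbps flow does not.
print(admit_premium_flow(4.0, 25.0))   # True
print(admit_premium_flow(10.0, 25.0))  # False
```

Whatever the exact percentage, an admission-control rule of this shape means premium services can never starve the generic transport the Commission wants to protect.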

The poor reasoning for this rule casts doubt on the FCC’s ability to enforce it effectively without outside expertise. By rejecting Internet standards such as RFC 2475 and IEEE standards such as 802.1Q that don’t conform to the telecom activists’ nostalgic, “all packets are equal” vision of the Internet, the FCC chose to blind itself to one of the central points in Tim Wu’s “Network Neutrality, Broadband Discrimination” paper that started the fight: A neutral Internet favors content applications, as a class, over communication applications and is therefore not truly an open network. The only way to make a network neutral among all applications is to differentiate loss and delay among applications; preferably, this is done by user-controlled means. That’s not always possible, so other means are sometimes necessary as well.
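The user-controlled differentiation mentioned above is already expressible with the standard DiffServ machinery of RFC 2475: the application itself marks its packets with a DSCP codepoint, and the network may or may not honor the marking. The codepoint values below are standard; whether any given ISP acts on them is an assumption, and the helper function is invented for illustration.

```python
# Sketch of endpoint-controlled service differentiation via DiffServ (RFC 2475).
# The application marks its own traffic; honoring the marks is up to the network.
import socket

DSCP_EF = 46    # Expedited Forwarding: low-loss, low-delay (e.g., VoIP)
DSCP_AF11 = 10  # Assured Forwarding class 1 (e.g., loss-tolerant bulk transfer)

def mark_socket(sock, dscp):
    """Set the DSCP field on outgoing IPv4 packets."""
    # The IP TOS byte carries the 6-bit DSCP in its upper bits.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, dscp << 2)

voip_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
mark_socket(voip_sock, DSCP_EF)     # latency-sensitive traffic
bulk_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
mark_socket(bulk_sock, DSCP_AF11)   # throughput-oriented traffic
```

This is exactly the kind of user-controlled means the paragraph above describes: the differentiation decision sits at the endpoint, not inside the network.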

All in all, the Commission has built a stone too heavy for it to lift all by itself. The rules have just enough flexibility that the outside technical advisory groups that will examine complaints may be able to correct the order’s errors, but to be effective, the advisors need much deeper technical knowledge than the FCC staffers who wrote the order can provide.

It’s difficult to ask the FCC – an institution with its own 75-year tradition in which it has served as the battleground for bitter disputes between monopolists and public interest warriors – to turn on a dime and embrace a new spirit of collaboration, but without such a far-reaching institutional transformation its Internet regulation project will not be successful. Those of us who work with the FCC must take it on faith that the Commission is committed to transforming itself from a hidebound analog regulator into a digital age shepherd of innovation. Now that the Open Internet Report & Order has passed, we have no choice but to put our shoulders to the rock to help push it along. There’s no turning back now.

[cross-posted from the Innovation Policy Blog]

Speaking today in DC

This event will be webcast today:

ITIF: Events

ITIF Event: Designed for Change: End-to-End Arguments, Internet Innovation, and the Net Neutrality Debate

Many advocates of strict net neutrality regulation argue that the Internet has always been a “dumb pipe” and that Congress should require that it remain so. A new report by ITIF Research Fellow Richard Bennett reviews the historical development of the Internet architecture and finds that contrary to such claims, an extraordinarily high degree of intelligence is embedded in the network core. Indeed, the fact that the Internet was originally built to serve the needs of the network research community but has grown into a global platform of commerce and communications was only made possible by continuous and innovative Internet engineering. In the new ITIF report “End-to-End Arguments, Internet Innovation, and the Net Neutrality Debate,” Bennett traces the development of the Internet architecture from the CYCLADES network in France to the present, highlighting developments that have implications for Internet policy. This review will help both engineers and policy makers separate the essentials from the incidentals, identify challenges to continued evolution, and develop appropriate policy frameworks.

See you there.

New Broadband Czar

Trusted sources tell me Blair Levin is headed back to the FCC to be the Commissar of the People’s Glorious Five Year Plan for the Production of Bandwidth. He’d be a wonderful choice, of course, because he’s a bright and humorous fellow with no particular delusions about what he knows and what he doesn’t know.

I haven’t been enthusiastic about this National Broadband Plan business myself, but if we’re going to have one, we’re going to have one, and it should be the best one on the planet. And no, that doesn’t mean that the object of the exercise is for America’s broadband users to have big foam number 1 fingers, it means we do something sensible with the people’s tax dollars.

The plan should figure out a meaningful way to measure progress, and it should fund some of the efforts to create the next-generation network that will one day supersede the TCP/IP Internet. We all love TCP/IP, mind you, but it’s a 35-year-old solution to a problem we understand a lot better today than we did in 1974. We’ll get a chance to see just how much vision the New FCC has by their reaction to this proposal.

UPDATE: Press reports are dribbling out about the appointment.

Finally, nominees for the FCC

Amy Schatz of the WSJ reports that a deal has been struck to move the new nominees into the FCC:

Work has slowed to a crawl at the Federal Communications Commission, since President Barack Obama’s pick to be chairman, Julius Genachowski, is still awaiting Senate confirmation.

But the logjam could be broken soon: Republicans appear to have settled on two people to fill the GOP seats on the five-member board, paving the way for a confirmation hearing in June. Senate Republicans have agreed on former Commerce Department official Meredith Attwell Baker and current FCC Commissioner Robert McDowell, officials close to the process say.

This is good news. McDowell has been the best of the FCC commissioners since his appointment, and allowing him a second term is a very bright move. Uncertainty over McDowell’s future was the cause of the slowdown in confirmation hearings, since these things go forward with the whole slate of nominees. So the new FCC is going to look like this:

Chairman Genachowski, new blood
Dem Copps, old hand
Dem Mignon Clyburn, new blood
Rep McDowell
Rep Meredith Baker, new blood

It’s interesting that Baker and Clyburn are both nepotism candidates, as Clyburn is the daughter of powerful Congressman James Clyburn and Baker is the daughter-in-law of the Bush family’s consigliere, James Baker. That’s not necessarily a bad thing, as the best Chairman of recent times was Colin Powell’s son, and neither of the daughters is particularly unqualified. But if you want to get a laugh out of Blair Levin, the former “sixth commissioner” who wasn’t nominated, tell him you understand that he’s not qualified to serve on the FCC because his daddy’s not in politics. You won’t get a laugh exactly, more like a moan.

The first item of business for the nominees, once they’re confirmed, will be the list of 120 questions Copps put to the world. Good luck to the Commission with that.

FCC Comments due in National Broadband Plan

See IEEE Spectrum for a few observations on the FCC’s request for comments on the National Broadband Plan:

Comments are due Monday, June 8, at the FCC on the National Broadband Plan (NBP). The Notice of Inquiry lists some 120 questions that the Commission would like filers to address, running the gamut from goals and benchmarks to open access to privacy to entrepreneurial activity to job creation. Anyone who compiles a list of so many questions clearly hasn’t given much thought to the problem under discussion, so it’s clear upon reading the NOI that we’re many years away from a good NBP, although we may have some vague and probably counter-productive guidelines much sooner: the FCC is supposed to report a plan to Congress by next February. Bear in mind that it took the US 20 years to convert from analog to digital TV, and we’re not even there yet.

There’s more.

At long last, Genachowski

The long-awaited nomination of Julius Genachowski to the FCC chair finally came to pass yesterday, raising questions about the delay. If everybody with an interest in telecom and Internet regulation knew he was the choice months ago, why did the official announcement take so long? I have no inside information, so I’ll leave it to those who do to enlighten us on that question. Perhaps the Administration was just being extra-cautious after the debacles around a Commerce Secretary and others.

Neutralists are excited about the choice, naturally, as they view Genachowski as one of their own. And indeed, if network neutrality were actually a coherent policy and not just a rag-tag collection of Christmas wishes, they would have cause to be exhilarated. But given the range of restrictions that the movement seeks, it’s less than clear that any particular raft of regulations would satisfy them and leave broadband networks the ability to function, so we’ll see how this pans out. We’re already hearing rumblings from Boucher that there may not be any Congressional action on network neutrality this year in any case.

Genachowski brings an interesting (and potentially very dangerous) set of qualifications to the job. A college buddy of the President, he’s an inner circle member with the power to wield enormous influence. As a former FCC staffer, he’s imbued with the Agency’s culture, and as a former venture capitalist funding fluffy applications software, he’s something of a tech buff. But he resembles Kevin Martin in most of the important respects: he’s a Harvard lawyer who’s worked inside the regulatory system for most of his life, and he has strong ties to an industry that seeks to exercise control over the nation’s network infrastructure for its own purposes. Whether those purposes resemble the public interest remains to be seen.

The largest problem with the FCC and similar agencies is the knowledge gap between regulators and the modern broadband networks that are the subject of their regulatory power. Martin didn’t have the training to appreciate the effect that his orders would have on the infrastructure, and neither does Genachowski. So the new Chairman is just as likely as the old chairman to make things worse while trying to make them better.

In a perfect world, the commissioners would be able to rely on the expert judgment of the Chief Technologist to stay out of trouble, but the current occupant of that job, Jon Peha, has a penchant for playing politics that renders him ineffective. The bizarre, quixotic inquiry the FCC made recently into the quality of service variations between Comcast’s voice service and over-the-top VoIP is an example. This isn’t a serious line of inquiry for a serious Commission, and Peha never should have let it happen. But it did, and that fact should remind us that the FCC is more a creature of politics than of technology.

Court protects the right to bluff

In a rare move, the DC Circuit has upheld an FCC decision

The cable industry has won a big legal victory in the fiercely competitive phone services market. An appeals court has supported the Federal Communications Commission in its ruling that phone carriers—in this case Verizon—can’t try to lure back customers after they’ve initiated a service switch but before their number has been transferred.

The FCC rarely prevails in court, of course, so this may be a sign that we’re living in the End Times. But we can take some comfort from the fact that it wasn’t totally unpredictable, given that Kevin Martin was on the losing side.

The case involved Verizon’s efforts to win back customers when notified by the new carrier that they had to release the phone number. Verizon took this as an occasion to offer sweeter deals, which the court ruled an unlawful violation of the customer’s privacy, despite the fact that Google’s entire business is based on this kind of snooping.

It’s a win for consumers because it preserves the right to bluff. In today’s economy, consumers can frequently get better deals on subscription services merely by threatening to cancel, whether we’re serious or not. As it happens, I got lower prices from Sports Illustrated and Illy Coffee by calling up to cancel my subscriptions, and in both cases the discounts were substantial. DirecTV refused to offer me a sweetener last year when I was tired of their crappy DVR, so they lost my TV business to Comcast. It’s not entirely clear to the business whether any of these threats are serious, of course, so it’s in their interest to err on the side of caution and offer the customer a better deal when they have the chance. Efforts to win back a customer who’s already made a switch should be harder to pull off.

But the Verizon deal stacked the cards a little too far in the company’s favor, because it allowed them to play hardball until it was absolutely clear that the customer wasn’t bluffing. The old carrier only learns of a phone-service switchover once the customer has struck a deal with the new carrier and scheduled a hookup date.

No deal: we all have the right to bluff, and the company is going to have to guess just like any other poker player. That’s a good deal for the consumer.

Damned if you do, screwed if you don’t

The FCC has finally noticed that reducing the Quality of Service of an Internet access service affects all the applications that use it, including VoIP. They’ve sent a harsh letter to Comcast seeking ammunition with which to pillory the cable giant, in one of Kevin Martin’s parting shots:

Does Comcast give its own Internet phone service special treatment compared to VoIP competitors who use the ISP’s network? That’s basically the question that the Federal Communications Commission posed in a letter sent to the cable giant on Sunday. The agency has asked Comcast to provide “a detailed justification for Comcast’s disparate treatment of its own VoIP service as compared to that offered by other VoIP providers on its network.” The latest knock on the door comes from FCC Wireline Bureau Chief Dana Shaffer and agency General Counsel Matthew Berry.

Readers of this blog will remember that I raised this issue with the “protocol-agnostic” management scheme Comcast adopted in order to comply with the FCC’s over-reaction to the former application-aware scheme, which prevented P2P from over-consuming bandwidth needed by more latency-sensitive applications. My argument is that network management needs to operate in two stages, one that allocates bandwidth fairly among users, and a second that allocates it sensibly among the applications in use by each user. The old Comcast scheme did one part of this, and the new scheme does the other part. I’d like to see both at the same time, but it’s not at all clear that the FCC will allow that. So we’re left with various forms of compromise.
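The two-stage scheme described above can be sketched roughly as follows. The user names, application classes, and weights are invented purely for illustration; this is one simple way to realize the idea, not Comcast’s or anyone else’s actual algorithm.

```python
# Sketch of two-stage bandwidth allocation: stage 1 divides link capacity
# fairly among users; stage 2 divides each user's share among that user's
# application classes by weight. All names and weights are illustrative.

def two_stage_allocation(capacity_mbps, users):
    """users: {user: {app_class: weight}} -> {user: {app_class: mbps}}"""
    per_user = capacity_mbps / len(users)        # stage 1: fair among users
    alloc = {}
    for user, apps in users.items():
        total_weight = sum(apps.values())
        alloc[user] = {app: per_user * w / total_weight  # stage 2: per-app weights
                       for app, w in apps.items()}
    return alloc

demand = {
    "alice": {"voip": 3, "p2p": 1},  # latency-sensitive VoIP weighted higher
    "bob":   {"web": 1},
}
print(two_stage_allocation(100.0, demand))
# alice's 50 Mbps splits 37.5 (voip) / 12.5 (p2p); bob's web gets all 50
```

The point of the sketch is that fairness among users and sensible treatment of applications are separable decisions; the old Comcast scheme handled only the second, the new one only the first.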

The fundamental error the FCC is making in this instance is misidentifying the “service” it seeks to regulate, under a new approach that regulates services (skip to 13:30) rather than technologies.

Comcast sells Internet service, telephone service, and TV service. It doesn’t sell “VoIP service” so there’s no basis to this complaint. The Commission has made it very difficult for Comcast to even identify applications running over the Internet service, and the Net Neuts have typically insisted it refrain from even trying to do so; recall David Reed’s fanatical envelope-waving exercise at the Harvard hearing last year.

The telephone service that Comcast and the telephone companies sell uses dedicated bandwidth, while the over-the-top VoIP service that Vonage and Skype offer uses shared bandwidth. I certainly hope that native phone service outperforms ad hoc VoIP; I pay good money to ensure that it does.

This action says a lot about what’s wrong with the FCC. Regardless of the regulatory model it brings to broadband, it lacks the technical expertise to apply it correctly. The result is “damned if you do, damned if you don’t” enforcement actions.

This is just plain silly. The only party the FCC has any right to take to task in this matter is itself.

The pirates who congregate at DSL Reports are in a big tizzy over this, naturally.

Canadian regulators smarter than Americans

Canada’s Internet users have won a measure of victory over bandwidth hogs. In a ruling from the CRTC, Canada’s FCC, Bell Canada is permitted to continue managing network over-use:

Bell Canada today won a largely clear victory in an anti-throttling lawsuit filed with the Canadian Radio-television and Telecommunications Commission (CRTC). The government body has issued a ruling dismissing claims by Internet providers using part of Bell’s network that accused the carrier of unfairly throttling the connection speeds of their services while also constricting its own. These rivals, represented by the Canadian Association of Internet Providers (CAIP), had accused Bell of trying to hinder competition and violating the basic concepts of net neutrality by discouraging large transfers.

The CRTC’s dismissal is based on the observation that peer-to-peer usage does appear to have a detrimental impact on Bell’s network and so requires at least some level of control to keep service running properly for all users. It also rejects neutrality concerns by claiming that Bell’s throttling system, which uses deep packet inspection to investigate traffic, is adjusting speed and doesn’t restrict the content itself.

Bell hails its successful defense as proof that those running online networks are “in the best position” to judge how their networks are managed.

Canada’s Larry Lessig, a populist/demagogue law professor named Michael Geist, was heart-broken over the decision, and pro-piracy web site Ars Technica shed a few tears as well:

The proceeding was also notable for the frank admissions from other large ISPs like Rogers—they admitted that they throttle traffic on a discriminatory basis, too. It also produced wild allegations from companies like Cisco that “even if more bandwidth were added to the network, P2P file-sharing applications are designed to use up that bandwidth.” Such assertions allow the ISPs to claim that they must be able to throttle specific protocols simply to stay afloat—survival is at stake.

This is (to put it politely) highly debatable.

Actually it’s not debatable, not by sane people anyhow. Residential broadband is as cheap as it is only because ISPs can count on people sharing the wires in a civilized fashion. People who keep their broadband pipes constantly saturated take resources away from their neighbors. There are alternatives, of course. You can buy a T-1 line with a Service Level Agreement that you can saturate with all the traffic you want. In the US, count on paying $400/mo for 1.5 Mb/s upload and download. Want something cheaper? Learn to share.
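A back-of-envelope calculation shows why sharing matters. The subscriber count, uplink size, and activity fractions below are illustrative assumptions, not any ISP’s real provisioning figures.

```python
# Illustrative oversubscription arithmetic: a shared aggregation link is cheap
# per subscriber precisely because most subscribers are idle most of the time.
# All figures here are assumed for the sake of the example.

def fair_share_mbps(uplink_mbps, subscribers, active_fraction):
    """Per-user share when only a fraction of subscribers transmit at once."""
    active = max(1, round(subscribers * active_fraction))
    return uplink_mbps / active

# Typical evening: ~5% of 200 subscribers active on a 1 Gbps uplink.
print(fair_share_mbps(1000, 200, 0.05))  # 100.0 Mbps each
# Every pipe saturated around the clock (e.g., unmanaged P2P).
print(fair_share_mbps(1000, 200, 1.0))   # 5.0 Mbps each
```

The dedicated alternative, a T-1 with a service level agreement, prices in the absence of sharing, which is exactly why it costs an order of magnitude more per megabit.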

Canada is widely regarded as a more left wing, business-hostile country than the US. How to account for the fact that the CRTC got this issue right while Bush’s FCC got it wrong in the Comcast case?


Thirty Profiles

Dave Burstein of DSL Prime has posted profiles of 30 FCC candidates to his web site, including one transition team member:

Susan Crawford, now teaching at Michigan, also has enormous respect from her peers and would bring international perspective from her role at ICANN setting world Internet policy

The selection of Crawford to join Kevin Werbach on the FCC transition team has already gotten some of my colleagues on the deregulatory side pretty excited, as she has the image of being a fierce advocate of a highly-regulated Internet. And indeed, she has written some strong stuff in favor of the “stupid network” construct that demands all packets be treated as equals inside the network. The critics are missing something that’s very important, however: both Werbach and Crawford are “Internet people” rather than “telecom people” and that’s a very important thing. While we may not like Crawford’s willingness to embrace a neutral routing mandate in the past, the more interesting question is how she comes down on a couple of issues that trump neutral routing: network management and multi-service routing.

We all know by now that the network management exception is more powerful than Powell’s “Four Freedoms” where the rubber meets the road, but we lack any clear guidance to ISPs as to how their management practices will be evaluated. Clarification of the rules is as much a benefit to carriers as it is to consumers. The one way to ensure that we all lose is to keep lumbering along in the murk of uncertain authority and secret rules. Internet people are going to ask the right questions of their candidates, and anybody who can satisfy both Werbach and Crawford will have to be a good choice. Check Werbach’s web site for his papers. Unfortunately, the most interesting of them is not yet in print, “The Centripetal Network: How the Internet Holds Itself Together, and the Forces Tearing it Apart”, UC Davis Law Review, forthcoming 2008. Perhaps he’ll post a draft.

The question of multi-service routing is also very important. Crawford has written and testified to the effect that the Internet is the first global, digital, multi-service network, which is substantially correct. The Internet is not fully multi-service today, however, and can’t be unless it exposes multiple service levels at the end points for applications to use easily. The generic public Internet has a single transport service that has to meet the needs of diverse applications, which is not really an achievable goal in the peer-to-peer world.