After five years of bickering, the FCC passed an Open Internet Report & Order on a partisan 3-2 vote this week. The order is meant to guarantee that the Internet of the future will be just as free and open as the Internet of the past. Its success depends on how fast the Commission can transform itself from an old-school telecom regulator wired to resist change into an innovation stimulator embracing opportunity. One thing we can be sure about is that the order hasn’t tamped down the hyperbole that’s fueled the fight to control the Internet’s constituent parts for all these years.
Advocates of net neutrality professed deep disappointment that the FCC’s rules weren’t more proscriptive and severe. Free Press called the order “fake net neutrality,” Public Knowledge said it “fell far short,” Media Access Project called it “inadequate and riddled with loopholes,” and New America Foundation accused the FCC of “caving to telecom lobbyists.” These were their official statements to the press; their Tweets were even harsher.
Free marketers were almost as angry: Cato denounced the order as “speech control,” Washington Policy Center said it “fundamentally changes many aspects of the infrastructure of the Internet,” and the Reason Foundation said it will lead to “quagmire after quagmire of technicalities, which as they add up will have a toll on investment, service and development.”
Republican Congressional leaders made no secret of their displeasure with the FCC’s disregard for their will: Rep. Fred Upton (R-Michigan), the incoming Commerce Committee Chairman, called it a “hostile action against innovation that can’t be allowed to stand”; Rep. Greg Walden (R-Oregon), incoming Chairman of the Subcommittee on Communications and Technology, called it a “power grab” and vowed to hold hearings to overturn it; and Sen. Kay Bailey Hutchison (R-Texas), Ranking Member of the Senate Commerce, Science, and Transportation Committee, said the order “threatens the future economic growth of the Internet.” Setting Internet policy is indeed a Congressional prerogative rather than an agency matter, so the longer-term solution must come from the Hill, and sooner would be better than later.
Contrary to this criticism and to snarky blogger claims, not everyone was upset with the FCC’s action, coming as it did after a year-long proceeding on Internet regulation meant to fulfill an Obama campaign pledge to advance net neutrality. The President himself declared the FCC action an important part of his strategy to “advance American innovation, economic growth, and job creation,” and Senator John Kerry (D-Massachusetts) applauded the FCC for reaching consensus.
Technology industry reaction ranged from positive to resigned: Information Technology Industry Council President and CEO Dean Garfield declared the measure “ensures continued innovation and investment in the Internet,” TechNet supported it, and National Cable and Telecommunications Association head Kyle McSlarrow said it could have been much worse. At the Information Technology and Innovation Foundation, we were pleased by the promise of a relatively humble set of rules, less so with the final details; we remain encouraged by the robust process the FCC intends to create for judging complaints, one that puts technical people on the front lines. In the end, the order got the support of the only majority that counts, three FCC commissioners.
Most of us who reacted favorably acknowledged the FCC’s order wasn’t exactly as we would have written it, but accepted it as a pragmatic political compromise that produces more positives than negatives. The hoped-for closing of the raucous debate will have immense benefits on its own: simply bringing this distracting chapter in the Internet’s story to an end will allow more time for sober discussion about the directions we’d like the Internet to take in its future development. There is no shortage of policy issues that have been crowded out by the tendency to view net neutrality as the one great magic wand with the power to solve all the Internet’s problems: the FCC has work to do on freeing up spectrum for mobile networking, the Universal Service Fund needs to be reformed, and the National Broadband Plan needs to be implemented.
If the FCC’s approach proves sound, it might well be exported to other countries, forming the basis of a consistent international approach to the oversight of an international network developed on consistent standards of its own. Such an outcome would have positive consequences for the Internet standards community, which has its own backlog of unfinished business to resolve, such as scalable routing, congestion management, security, and the domestication of peer-to-peer file sharing and content delivery networks. This outcome is far from inevitable; last-minute rule changes make it less likely than it might have been.
The most important thing the FCC can do in implementing its system of Internet oversight is to elevate process over proscriptive rules. The traditional approach to telecom regulation is to develop a thick sheaf of regulations that govern everything from the insignias on the telephone repair person’s uniform to the colors of the insulators on RJ11 cables and apply them in top-down, command-and-control fashion. Many of those on the pro-net neutrality side are steeped in telecom tradition, and they expected such an approach from the FCC for the Internet; theirs are the angry reactions.
But the Internet isn’t a telecom network, and a foot-high stack of regulations certainly would produce the negative consequences for innovation and progress that the FCC’s critics have forecast. The appropriate way to address Internet regulation is to follow the model the Internet has developed for itself, based on a small number of abstract but meaningful principles (each of which is subject to change for good reason) applied by a broad-based community of experts in a collaborative, consultative setting. Internet standards are not devised in an adversarial setting populated by angels and devils locked in mortal combat; they come from a process that values “rough consensus and running code.”
The specifics of the FCC’s order nevertheless give pause to those well-schooled in networking. A few hours before the Commission’s vote, Commissioner Copps persuaded Chairman Genachowski to reverse the Waxman Bill’s presumption regarding the premium transport services that enable Internet TV and video conferencing to enjoy the same level of quality as cable TV. Where the early drafts permitted these services as long as they were offered for sale on a non-discriminatory basis, the final rule arbitrarily presumes them harmful.
The order makes a hash of the relationship between the content accelerators provided by Akamai and others and the presumptively impermissible communication accelerators that ISPs might one day provide to enable HD group video conferencing and similar emerging applications. The Commission majority fears that allowing network operators to offer premium transport to leading-edge apps will put the squeeze on generic transport, but it fails to consider that such potential downsides of well-accepted technical practices for Quality of Service can be prevented by a simple quota limit on the percentage of a pipe that can be sold as “premium.” This fact, obvious to skilled protocol engineers, goes unmentioned in the order.
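To make that concrete, here is a minimal sketch of such a quota check in Python; the 20% cap, the function name, and the traffic numbers are all invented for illustration and appear nowhere in the order.

```python
# Hypothetical sketch: cap the share of a pipe that may be sold as
# premium (QoS-assured) transport, leaving the rest as generic capacity.
# The 20% figure is an assumption for illustration, not a proposal.

PREMIUM_QUOTA = 0.20  # at most 20% of link capacity may carry premium flows

def can_admit_premium(link_capacity_mbps, premium_in_use_mbps, request_mbps):
    """Admit a new premium flow only if the quota would not be exceeded."""
    return premium_in_use_mbps + request_mbps <= PREMIUM_QUOTA * link_capacity_mbps

# A 100 Mbps pipe with 15 Mbps already sold as premium can take a
# 4 Mbps video-conference flow (19 <= 20) but not a 6 Mbps one (21 > 20).
print(can_admit_premium(100, 15, 4))  # True
print(can_admit_premium(100, 15, 6))  # False
```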
The poor reasoning behind this rule casts doubt on the FCC’s ability to enforce it effectively without outside expertise. By rejecting Internet standards such as RFC 2475 and IEEE standards such as 802.1Q that don’t conform to the telecom activists’ nostalgic “all packets are equal” vision of the Internet, the FCC chose to blind itself to one of the central points in Tim Wu’s “Network Neutrality, Broadband Discrimination” paper that started the fight: a neutral Internet favors content applications, as a class, over communication applications, and is therefore not truly an open network. The only way to make a network neutral among all applications is to differentiate loss and delay among applications; preferably this is done by user-controlled means. That’s not always possible, so other means are sometimes necessary as well.
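As a small illustration of what user-controlled differentiation can look like, an application can mark its own packets with a DiffServ code point so the network can treat latency-sensitive traffic differently. The sketch below assumes a POSIX-style UDP socket and picks the Expedited Forwarding class purely for example’s sake.

```python
# Illustrative sketch of user-controlled differentiation under DiffServ
# (RFC 2475): the application marks its own packets so routers that honor
# DSCP can give its conferencing flow low-loss, low-delay treatment.
# Choosing EF for this flow is an assumption for illustration.

import socket

DSCP_EF = 46              # Expedited Forwarding: low-loss, low-delay class
TOS_VALUE = DSCP_EF << 2  # DSCP occupies the upper six bits of the TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)
# Datagrams sent on this socket now carry DSCP 46; a DiffServ-aware
# network can queue them ahead of bulk best-effort traffic.
```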
All in all, the Commission has built a stone too heavy for it to lift all by itself. The rules have just enough flexibility that the outside technical advisory groups that will examine complaints may be able to correct the order’s errors, but to be effective, the advisors need much deeper technical knowledge than the FCC staffers who wrote the order can provide.
It’s difficult to ask the FCC – an institution with its own 75-year tradition in which it has served as the battleground for bitter disputes between monopolists and public interest warriors – to turn on a dime and embrace a new spirit of collaboration, but without such a far-reaching institutional transformation its Internet regulation project will not be successful. Those of us who work with the FCC must take a leap of faith that the Commission is committed to transforming itself from a hidebound analog regulator into a digital-age shepherd of innovation. Now that the Open Internet Report & Order has passed, we have no choice but to put our shoulders to the rock and help push it along. There’s no turning back now.
[cross-posted from the Innovation Policy Blog]
Richard,
Congratulations on the first article or blog post I’ve read (and I’ve been reading a lot of them) that shows actual engineering and policy clue. I’ll admit that I’m struggling a bit with this distinction between “content applications” and “communication applications” (the latter seems like an oxymoron to this old OSI bigot), but I agree with your larger point. It’s also kind of difficult to think of the Freepers or Public Knowledge crowd as being steeped in telecom culture — that would require more technical sophistication and bias toward centralized architectures than I’ve seen evidenced in their pronouncements.
One slight disagreement, though. Admission control is more complex than a “simple quota limit”. More than one good PhD dissertation has been written on admission control algorithms for multi-service networks. A quota would be overly prescriptive. Instead, I’d have the FCC build on their transparency principle and require broadband operators to publish worst-case performance objectives for best-effort service. The requirement for new flows with QoS guarantees would be that they not be admitted if doing so would push packet loss and/or delay variance for best-effort service beyond the published objective.
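Concretely, the test I have in mind looks something like this toy sketch, where the published objectives and the load model are made-up numbers standing in for real measurement:

```python
# Toy sketch of the proposed admission test: a new guaranteed-QoS flow is
# admitted only if best-effort service is predicted to stay within the
# operator's published worst-case objectives. The predictor and all the
# numbers are stand-ins; a real system would use measured load.

PUBLISHED_BE_OBJECTIVE = {"max_loss": 0.01, "max_jitter_ms": 30.0}

def predict_be_after_admission(current_be, new_flow_mbps):
    """Made-up load model: each reserved Mbps nudges loss and jitter up."""
    return {
        "max_loss": current_be["max_loss"] + 0.0005 * new_flow_mbps,
        "max_jitter_ms": current_be["max_jitter_ms"] + 0.5 * new_flow_mbps,
    }

def admit(current_be, new_flow_mbps):
    predicted = predict_be_after_admission(current_be, new_flow_mbps)
    return (predicted["max_loss"] <= PUBLISHED_BE_OBJECTIVE["max_loss"]
            and predicted["max_jitter_ms"] <= PUBLISHED_BE_OBJECTIVE["max_jitter_ms"])

current_be = {"max_loss": 0.005, "max_jitter_ms": 25.0}
print(admit(current_be, 5))   # True: predicted 0.75% loss, 27.5 ms jitter
print(admit(current_be, 15))  # False: 1.25% loss and 32.5 ms jitter breach both objectives
```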
But to your larger point, I strongly agree that the FCC has gotten it mostly right from a policy perspective. However, I remain slightly skeptical that this will stand up in the DC Circuit. Time will tell.
Thanks for the comment, Dan. The distinction between content and communications apps is meant to capture the difference between apps that can be accelerated by a CDN and those (like video conferencing) that can’t. The means of accelerating communications apps are overlay networks and QoS.
You’re right that admission control isn’t strictly a quota-based system, although all the running systems I’ve seen do work that way. If the rules said “quota limit or a similar system,” I think we’d get where we need to be.
The comment about the telecom background of the Freepers and PKs was meant to suggest that they’re steeped in telecom law, not technology. I don’t get a good feeling that they understand any of the technologies.
Thanks for the clarification on content vs communications. Now that I understand it, your framework is useful. I was thinking about the same thing in a different way, but working under the framework articulated in the ATM Forum Traffic Management spec, amplified by the fact that we don’t live in an “end-to-end arguments” world. My more concrete argument has been that prioritization – in the context of content apps, under your framework – has an inconsequential effect on end users. The common complaint about Fox News colluding with network providers to somehow make their content more easily accessed than anybody else’s is moot: that content is already accelerated by Akamai, and any incremental number of microseconds of latency reduction from priority queueing is not going to be perceptible.
On admission control, I still think you’re getting into the weeds of policy vs mechanism. I don’t think that either of us wants the FCC micromanaging mechanism (noting how good a job they did on CableCARD, for example). If the policy is that new “AS”- or “GS”-like flows don’t get admitted if they will impact the advertised minimum performance of “BE” flows, then it doesn’t matter whether the mechanism to implement the policy is pre-computed quotas or some more sophisticated algorithm. Whatever it is, the FCC doesn’t have to be in the business of engineering or adjudicating it.
Net neuts argue, naively, that allowing operators to sell prioritized flows reduces their incentive to invest in raw bandwidth.
A quota system is a fair approximation of a system in which the sale of prioritized flows increases the incentive to invest in raw capacity, as the more bandwidth in the system, the more prioritized flows the operator can sell.
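The arithmetic is simple enough to put in a toy script; the 20% quota and the price are assumptions for illustration only:

```python
# Toy arithmetic: under a fixed quota, salable premium capacity scales
# linearly with total capacity, so adding raw bandwidth adds premium
# revenue. The quota fraction and price are made up for illustration.

QUOTA = 0.20
PRICE_PER_PREMIUM_MBPS = 2.0  # hypothetical monthly price per Mbps

def premium_revenue(capacity_mbps):
    return QUOTA * capacity_mbps * PRICE_PER_PREMIUM_MBPS

print(premium_revenue(100))   # 40.0  on a 100 Mbps pipe
print(premium_revenue(1000))  # 400.0 after a tenfold capacity upgrade
```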
The NN args are full of either/or choices like this that come from network engineering ignorance.