Has the FCC Created a Stone Too Heavy for It to Lift?

After five years of bickering, the FCC passed an Open Internet Report & Order on a partisan 3-2 vote this week. The order is meant to guarantee that the Internet of the future will be just as free and open as the Internet of the past. Its success depends on how fast the Commission can transform itself from an old school telecom regulator wired to resist change into an innovation stimulator embracing opportunity. One thing we can be sure about is that the order hasn’t tamped down the hyperbole that’s fueled the fight to control the Internet’s constituent parts for all these years.

Advocates of net neutrality professed deep disappointment that the FCC’s rules weren’t more proscriptive and severe. Free Press called the order “fake net neutrality,” Public Knowledge said it “fell far short,” Media Access Project called it “inadequate and riddled with loopholes,” and New America Foundation accused the FCC of “caving to telecom lobbyists.” These were their official statements to the press; their Tweets were even harsher.

Free marketers were almost as angry: Cato denounced the order as “speech control,” Washington Policy Center said it “fundamentally changes many aspects of the infrastructure of the Internet,” and the Reason Foundation said it will lead to “quagmire after quagmire of technicalities, which as they add up will have a toll on investment, service and development.”

Republican Congressional leaders made no secret of their displeasure with the FCC’s disregard for their will: Rep. Fred Upton (R-Michigan), the incoming Commerce Committee Chairman, called it a “hostile action against innovation that can’t be allowed to stand”; Rep. Greg Walden (R-Oregon), incoming Chairman of the Subcommittee on Communications and Technology, called it a “power grab” and vowed to hold hearings to overturn it; and Sen. Kay Bailey Hutchison (R-Texas), Ranking Member of the Senate Commerce, Science, and Transportation Committee, said the order “threatens the future economic growth of the Internet.” Setting Internet policy is indeed a Congressional prerogative rather than an agency matter, so the longer-term solution must come from the Hill, and sooner would be better than later.

Contrary to this criticism and to snarky blogger claims, not everyone was upset with the FCC’s action, coming as it did after a year-long proceeding on Internet regulation meant to fulfill an Obama campaign pledge to advance net neutrality. The President himself declared the FCC action an important part of his strategy to “advance American innovation, economic growth, and job creation,” and Senator John Kerry (D, Massachusetts) applauded the FCC for reaching consensus.

Technology industry reaction ranged from positive to resigned: Information Technology Industry Council President and CEO Dean Garfield declared the measure “ensures continued innovation and investment in the Internet,” TechNet supported it, and National Cable and Telecommunications Association head Kyle McSlarrow said it could have been much worse. At the Information Technology and Innovation Foundation, we were pleased by the promises of a relatively humble set of rules, less so with the final details; we remain encouraged by the robust process the FCC intends to create for judging complaints, one that puts technical people on the front lines. In the end, the order got the support of the only majority that counts, three FCC commissioners.

Most of us who reacted favorably acknowledged the FCC’s order wasn’t exactly as we would have written it, but accepted it as a pragmatic political compromise that produces more positives than negatives. The hoped-for closing of the raucous debate will have immense benefits on its own, as simply bringing this distracting chapter in the Internet’s story to an end will allow more time for sober discussion about the directions we’d like the Internet to take in its future development. There is no shortage of policy issues that have been cramped by the tendency to view net neutrality as the one great magic wand with the power to solve all the Internet’s problems: The FCC has work to do on freeing up spectrum for mobile networking, the Universal Service Fund needs to be reformed, and the National Broadband Plan needs to be implemented.

If the FCC’s approach proves sound, it might well be exported to other countries, forming the basis of a consistent international approach to the oversight of an international network developed on consistent standards of its own. Such an outcome would have positive consequences for the Internet standards community, which has its own backlog of unfinished business such as scalable routing, congestion management, security, and the domestication of peer-to-peer file sharing and content delivery networks to resolve. This outcome is far from inevitable; last minute rule changes make it less likely than it might have been.

The most important thing the FCC can do in implementing its system of Internet oversight is to elevate process over proscriptive rules. The traditional approach to telecom regulation is to develop a thick sheath of regulations that govern everything from the insignias on the telephone repair person’s uniform to the colors of the insulators on RJ11 cables and apply them in top-down, command-and-control fashion. Many of those on the pro-net neutrality side are steeped in telecom tradition, and they expected such an approach from the FCC for the Internet; theirs are the angry reactions.

But the Internet isn’t a telecom network, and a foot-high stack of regulations certainly would produce the negative consequences for innovation and progress that the FCC’s critics have forecast. The appropriate way to address Internet regulation is to follow the model the Internet has developed for itself, based on a small number of abstract but meaningful principles (each of which is subject to change for good reason) applied by a broad-based community of experts in a collaborative, consultative setting. Internet standards are not devised in an adversarial setting populated by angels and devils locked in mortal combat; they come from a process that values “rough consensus and running code.”

The specifics of the FCC’s order nevertheless give pause to those well-schooled in networking. A few hours before the Commission’s vote, Commissioner Copps persuaded Chairman Genachowski to reverse the Waxman Bill’s presumption regarding the premium transport services that enable Internet TV and video conferencing to enjoy the same level of quality as cable TV. Where the early drafts permitted these services as long as they were offered for sale on a non-discriminatory basis, the final rule arbitrarily presumes them harmful.

The order makes hash of the relationship of the content accelerators provided by Akamai and others to the presumptively impermissible communication accelerators that ISPs might provide one day in order to enable HD group video conferencing and similar emerging applications. The Commission majority fears that allowing network operators to offer premium transport to leading edge apps will put the squeeze on generic transport, but fails to consider that such potential downsides of well-accepted technical practices for Quality of Service can be prevented by applying a simple quota limit on the percentage of a pipe that can be sold as “premium.” This fact, which is obvious to skilled protocol engineers, goes unmentioned in the order.
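To make the quota idea concrete, here is a minimal sketch of the sort of admission-control check a protocol engineer might have in mind; the class, the 100 Mb/s link, and the 25% quota are all hypothetical illustrations, not anything specified in the order:

```python
# Illustrative sketch only: caps the bandwidth sold as "premium"
# at a fixed fraction of link capacity. The capacity and quota
# figures are hypothetical, not from the FCC order.

class PremiumAdmission:
    def __init__(self, capacity_mbps: float, quota: float):
        self.capacity_mbps = capacity_mbps   # total link capacity
        self.quota = quota                   # max fraction sellable as premium
        self.premium_sold_mbps = 0.0         # running total of premium sales

    def admit(self, requested_mbps: float) -> bool:
        """Admit a premium reservation only if the running total stays
        under the quota; best-effort traffic keeps the rest of the pipe."""
        if self.premium_sold_mbps + requested_mbps <= self.quota * self.capacity_mbps:
            self.premium_sold_mbps += requested_mbps
            return True
        return False

link = PremiumAdmission(capacity_mbps=100.0, quota=0.25)
print(link.admit(20.0))  # True: 20 of the 25 Mb/s premium budget used
print(link.admit(10.0))  # False: would exceed the 25 Mb/s premium budget
```

The point is simply that a quota turns “premium squeezes out best-effort” from an open-ended fear into a bounded, checkable property.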

The poor reasoning for this rule casts doubt on the FCC’s ability to enforce it effectively without outside expertise. By rejecting Internet standards such as RFC 2475 and IEEE standards such as 802.1Q that don’t conform to the telecom activists’ nostalgic, “all packets are equal” vision of the Internet, the FCC chose to blind itself to one of the central points in Tim Wu’s “Network Neutrality, Broadband Discrimination” paper that started the fight: A neutral Internet favors content applications, as a class, over communication applications and is therefore not truly an open network. The only way to make a network neutral among all applications is to differentiate loss and delay among applications; preferably, this is done by user-controlled means. That’s not always possible, so other means are sometimes necessary as well.
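As an example of the user-controlled means mentioned above: DiffServ (RFC 2475) lets an application mark its own packets with a code point the network may honor or ignore. Here is a minimal sketch using the standard sockets API; the address and port are placeholders, and DSCP 46 is the conventional Expedited Forwarding value for latency-sensitive traffic:

```python
import socket

# Sketch: an application marking its own traffic with a DiffServ
# code point (RFC 2475), i.e., user-controlled differentiation.
# Whether the network honors the mark is up to each operator.

EF_DSCP = 46              # Expedited Forwarding, suited to latency-sensitive flows
TOS_VALUE = EF_DSCP << 2  # DSCP occupies the upper 6 bits of the TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)
sock.sendto(b"latency-sensitive payload", ("192.0.2.1", 5004))  # placeholder address
```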

All in all, the Commission has built a stone too heavy for it to lift all by itself. The rules have just enough flexibility that the outside technical advisory groups that will examine complaints may be able to correct the order’s errors, but to be effective, the advisors need much deeper technical knowledge than the FCC staffers who wrote the order can provide.

It’s difficult to ask the FCC – an institution with its own 75-year tradition in which it has served as the battleground for bitter disputes between monopolists and public interest warriors – to turn on a dime and embrace a new spirit of collaboration, but without such a far-reaching institutional transformation its Internet regulation project will not succeed. Those of us who work with the FCC must take a leap of faith that the Commission is committed to transforming itself from a hidebound analog regulator into a digital-age shepherd of innovation. Now that the Open Internet Report & Order has passed, we have no choice but to put our shoulders to the rock and help push it along. There’s no turning back now.

[cross-posted from the Innovation Policy Blog]

Speaking today in DC

This event will be webcast today:

ITIF Event: Designed for Change: End-to-End Arguments, Internet Innovation, and the Net Neutrality Debate

Many advocates of strict net neutrality regulation argue that the Internet has always been a “dumb pipe” and that Congress should require that it remain so. A new report by ITIF Research Fellow Richard Bennett reviews the historical development of the Internet architecture and finds that, contrary to such claims, an extraordinarily high degree of intelligence is embedded in the network core. Indeed, the Internet was originally built to serve the needs of the network research community, and its growth into a global platform for commerce and communications was only made possible by continuous and innovative Internet engineering. In the new ITIF report “End-to-End Arguments, Internet Innovation, and the Net Neutrality Debate,” Bennett traces the development of the Internet architecture from the CYCLADES network in France to the present, highlighting developments that have implications for Internet policy. This review will help both engineers and policy makers separate the essentials from the incidentals, identify challenges to continued evolution, and develop appropriate policy frameworks.

See you there.

Nostalgia Blues

San Jose Mercury News columnist Troy Wolverton engaged in a bit of nostalgia in Friday’s paper. He pines for the Golden Age of dial-up Internet access, when Internet users had a plethora of choices:

A decade ago, when dial-up Internet access was the norm, you could choose from dozens of providers. With so many rivals, you could find Internet access at a reasonable price all by itself, without having to buy a bundle of other services with it.

There was competition because regulators forced the local phone giants to allow such services on their networks. But regulators backed away from open-access rules as the broadband era got under way. While local phone and cable companies could permit other companies to use their networks to offer competing services, regulators didn’t require them to do so and cable providers typically didn’t.

Wolverton’s chief complaint is that the DSL service he buys from Earthlink is slow and unreliable. He acknowledges that he could get cheaper service from AT&T and faster service from Comcast, but doesn’t choose to switch because he doesn’t want to “pay through the nose.”

The trouble with nostalgia is that the past never really was as rosy as we tend to remember it, and the present is rarely as bad as it appears through the lens of imagination. Let’s consider the facts.

Back in the dial-up days, there were no more than three first-class ISPs in the Bay Area: Best Internet, Netcom, and Rahul. They charged $25-30/month, on top of the $15-20 we also paid for a phone line dedicated to Internet access; we didn’t want our friends to get a busy signal when we were on-line. So we paid roughly $45/month to access the Internet at 40 Kb/s downstream and 14 Kb/s or so upstream.

Now that the nirvana of dial-up competition (read: several companies selling Twinkies and nobody selling steak) has ended, what can we get for $45/month? One choice in the Bay Area is Comcast, which will gladly provide you with a 15 Mb/s service for a bit less than $45 ($42.95 after the promotion ends), or a 20 Mb/s service for a bit more, $52.95. If this is “paying through the nose,” then what were we doing when we paid the same prices for 400 times less performance back in the Golden Age? And if you don’t want or need that much speed, a number of ISPs will sell you reasonable DSL-class service that’s 40 times faster than dial-up at roughly half the price.
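For the skeptical, the arithmetic is easy to check. Here’s a quick back-of-envelope calculation using the prices and speeds quoted above; the 1.5 Mb/s, $20/month DSL tier is my illustrative assumption:

```python
# Back-of-envelope check of the dial-up vs. broadband comparison.
# Figures are those quoted in the post; the DSL tier is an assumption.

dialup_price, dialup_kbps = 45.00, 40.0      # ~$45/mo, ~40 Kb/s downstream
cable_price, cable_kbps = 42.95, 15_000.0    # Comcast 15 Mb/s tier
dsl_price, dsl_kbps = 20.00, 1_500.0         # assumed 1.5 Mb/s DSL tier

print(f"Cable vs. dial-up speedup: {cable_kbps / dialup_kbps:.0f}x")  # ~375x, "roughly 400"
print(f"DSL vs. dial-up speedup:   {dsl_kbps / dialup_kbps:.0f}x")    # ~38x, "roughly 40"
print(f"DSL price vs. dial-up:     {dsl_price / dialup_price:.0%}")   # ~44%, "roughly half"
```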

Wolverton’s column is making the rounds of the Internet mailing lists and blogs where broadband service is discussed, to mixed reviews. Selective memory fails to provide a sound basis for broadband policy, and that’s really all that Wolverton provides.


My New Job

Incidentally, I’ve started working for the Information Technology and Innovation Foundation in DC as of this week as a Research Fellow. I’ll be working on the issues that I’ve been working on as a consultant for the past few years: pro-innovation Internet regulation, the National Broadband Plan, and regulatory and policy considerations in the wireless networking space. I’m staying in Silicon Valley for the time being, but I will be making more regular visits to DC.

I like ITIF because their policy line is pragmatic and moderate: they appreciate the fact that sound regulatory policy makes good things happen, and don’t support or oppose any particular line on reflex.

Second Hearing in Internet Privacy tomorrow

From House Energy and Commerce:

Energy and Commerce Subcommittee Hearing on “Behavioral Advertising: Industry Practices and Consumers’ Expectations”

June 16, 2009

The Subcommittee on Communications, Technology and the Internet and the Subcommittee on Commerce, Trade, and Consumer Protection will hold a joint hearing titled, “Behavioral Advertising: Industry Practices and Consumers’ Expectations” on Thursday, June 18, 2009, in 2123 Rayburn House Office Building. The hearing will examine the potential privacy implications of behavioral advertising.

INVITED WITNESSES:

* Jeffrey Chester, Executive Director, Center for Digital Democracy
* Scott Cleland, President, Precursor LLC
* Charles D. Curran, Executive Director, Network Advertising Initiative
* Christopher M. Kelly, Chief Privacy Officer, Facebook
* Edward W. Felten, Professor of Computer Science and Public Affairs, Princeton University
* Anne Toth, Vice President of Policy, Head of Privacy, Yahoo! Inc.
* Nicole Wong, Deputy General Counsel, Google Inc.

WHEN: 10:00 a.m. on Thursday, June 18

WHERE: 2123 Rayburn House Office Building



This is the second in a series of hearings on the subject of behavioral advertising. I’ll predict that the Democrats will praise Google, the Republicans will criticize them, and nobody will pay much notice to Yahoo.

I only know four of the seven personally; I need to get out more.

FCC Comments due in National Broadband Plan

See IEEE Spectrum for a few observations on the FCC’s request for comments on the National Broadband Plan:

Comments are due Monday, June 8, at the FCC on the National Broadband Plan (NBP). The Notice of Inquiry lists some 120 questions that the Commission would like filers to address, running the gamut from goals and benchmarks to open access to privacy to entrepreneurial activity to job creation. Anyone who compiles a list of so many questions clearly hasn’t given much thought to the problem under discussion, so it’s clear upon reading the NOI that we’re many years away from a good NBP, although we may have some vague and probably counter-productive guidelines much sooner: the FCC is supposed to report a plan to Congress by next February. Bear in mind that it took the US 20 years to convert from analog to digital TV, and we’re not even there yet.

There’s more.

Catching up

I’ve been too busy to blog lately, what with the conferences, a white paper I’m writing about protocols and regulation, a recalcitrant editor (at a local paper), and a new gig blogging for IEEE Spectrum’s Tech Talk. My observations on networking and policy will be appearing there for a while.

The focus over here is going to be pure politics and pure technology, with a little bit of baseball.

We’re in a silly season for politics at the moment:

* Pro-lifers committing murder
* Intellectuals practicing tribal politics
* Critics of tribal politics complaining about pronunciation
* Morons playing with statistics

The funniest among these (please note, there’s nothing funny about murder) is the conspiracy theory about Hillary fans among the car dealers getting off the shutdown hook. It comes as no big surprise that the source of the rumor is Doug Ross, the big net neutrality booster who used to comment here as “Director Blue” until I shut him off. The common thread is conspiracy theory, the essential philosophical basis of American politics.

What slows down your Wi-Fi?

The Register stumbled upon an eye-opening report commissioned by the UK telecom regulator, Ofcom, on sources of Wi-Fi interference in the UK:

What Mass discovered (pdf) is that while Wi-Fi users blame nearby networks for slowing down their connectivity, in reality the problem is people watching retransmitted TV in the bedroom while listening to their offspring sleeping, and there’s not a lot the regulator can do about it.

Outside central London that is: in the middle of The Smoke there really are too many networks, with resends, beacons and housekeeping filling 90 per cent of the data frames sent over Wi-Fi. This leaves only 10 per cent for users’ data. In fact, the study found that operating overheads for wireless Ethernet were much higher than anticipated, except in Bournemouth for some reason: down on the south coast 44 per cent of frames contain user data.

When 90% of the frames are overhead, the technology itself has a problem, and in this case the problem is largely Wi-Fi’s heavy backward-compatibility burden. Older versions of the protocol weren’t designed for obsolescence, so newer systems have to take expensive steps to ensure that older systems can see them; otherwise collisions happen, and that’s not good for anybody. Licensed spectrum can deal with the obsolescence problem by replacing older equipment; open spectrum has to bear the costs of compatibility forever. So this is one more example of the fact that “open” is not always better.
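To put those percentages in perspective, here’s a rough calculation treating data-frame share as a crude proxy for useful airtime; the 54 Mb/s raw rate is my assumed 802.11g figure, not something from the report:

```python
# Rough illustration: useful capacity implied by the Ofcom figures,
# treating data-frame share as a crude proxy for airtime share.
# The 54 Mb/s raw rate (802.11g) is an assumption, not from the report.

raw_rate_mbps = 54.0

for place, data_frame_share in [("central London", 0.10), ("Bournemouth", 0.44)]:
    useful = raw_rate_mbps * data_frame_share
    print(f"{place}: ~{useful:.0f} of {raw_rate_mbps:.0f} Mb/s available for user data")
```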

What Policy Framework Will Further Enable Innovation on the Mobile Net?

Here’s the video of the panel I was on at the Congressional Internet Caucus Advisory Committee’s “State of the Mobile Net” conference in DC last Thursday. This was the closing panel of the conference, where all the loose ends were tied together. For those who don’t live and breathe Washington politics, I should do what moderator Blair Levin didn’t do and introduce the panel. Levin was the head of the TIGR task force for the Obama transition, the master group for the review of the regulatory agencies and the administration’s use of technology. Kevin Werbach is a professor at the Wharton School and took part in the FCC review for the transition along with Susan Crawford; he runs the Supernova conference. Larry Irving was part of the review of NTIA for the transition and is a former Assistant Secretary of Commerce. Ben Scott is the policy guy at Free Press, and Alex Hoehn-Saric is legal counsel to the Senate Committee on Commerce, Science and Transportation.

Regulatory policy needs to be technically grounded, so I emphasized the tech side of things.