Nostalgia Blues

San Jose Mercury News columnist Troy Wolverton engaged in a bit of nostalgia in Friday’s paper. He pines for the Golden Age of dial-up Internet access, when Internet users had a plethora of choices:

A decade ago, when dial-up Internet access was the norm, you could choose from dozens of providers. With so many rivals, you could find Internet access at a reasonable price all by itself, without having to buy a bundle of other services with it.

There was competition because regulators forced the local phone giants to allow such services on their networks. But regulators backed away from open-access rules as the broadband era got under way. While local phone and cable companies could permit other companies to use their networks to offer competing services, regulators didn’t require them to do so and cable providers typically didn’t.

Wolverton’s chief complaint is that the DSL service he buys from Earthlink is slow and unreliable. He acknowledges that he could get cheaper service from AT&T and faster service from Comcast, but doesn’t choose to switch because he doesn’t want to “pay through the nose.”

The trouble with nostalgia is that the past never really was as rosy as we tend to remember it, and the present is rarely as bad as it appears through the lens of imagination. Let’s consider the facts.

Back in the dial-up days, there were no more than three first-class ISPs in the Bay Area: Best Internet, Netcom, and Rahul. They charged $25-30/month, over the $15-20 we also paid for a phone line dedicated to Internet access; we didn’t want our friends to get a busy signal when we were on-line. So we paid roughly $45/month to access the Internet at 40 Kb/s download and 14 Kb/s or so upstream.

Now that the nirvana of dial-up competition (read: several companies selling Twinkies and nobody selling steak) has ended, what can we get for $45/month? One choice in the Bay Area is Comcast, who will gladly provide you with a 15 Mb/s service for a bit less than $45 ($42.95 after the promotion ends) or a 20 Mb/s service for a bit more, $52.95. If this is “paying through the nose,” then what were we doing when we paid the same prices for 400 times less performance back in the Golden Age? And if you don’t want or need this much speed, you can get reasonable DSL-class service from a number of ISPs that’s 40 times faster than dial-up and roughly half the price.
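The back-of-the-envelope comparison above is easy to check; here's a minimal sketch using the prices and speeds quoted in this post (rounded, as in the text):

```python
# Rough price/performance comparison, figures as quoted above.
dialup_price = 45.0           # $/month: ISP fee plus a dedicated phone line
dialup_down_kbps = 40.0       # typical dial-up download speed

comcast_price = 42.95         # $/month after the promotion, 15 Mb/s tier
comcast_down_kbps = 15_000.0

# Raw speedup: about 375x, which the post rounds to "400 times"
speedup = comcast_down_kbps / dialup_down_kbps
print(f"Speedup: {speedup:.0f}x")

# Cost per Mb/s of download capacity, then and now
dialup_per_mbps = dialup_price / (dialup_down_kbps / 1000)
comcast_per_mbps = comcast_price / (comcast_down_kbps / 1000)
print(f"Dial-up: ${dialup_per_mbps:.2f} per Mb/s; cable: ${comcast_per_mbps:.2f} per Mb/s")
```

Measured in dollars per megabit of capacity, the "Golden Age" price was over a thousand dollars; today's is a few dollars.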

Wolverton’s column is making the rounds of the Internet mailing lists and blogs where broadband service is discussed, to mixed reviews. Selective memory fails to provide a sound basis for broadband policy, and that’s really all that Wolverton provides.


Second Hearing in Internet Privacy tomorrow

From House Energy and Commerce:

Energy and Commerce Subcommittee Hearing on “Behavioral Advertising: Industry Practices and Consumers’ Expectations”

June 16, 2009

The Subcommittee on Communications, Technology and the Internet and the Subcommittee on Commerce, Trade, and Consumer Protection will hold a joint hearing titled, “Behavioral Advertising: Industry Practices and Consumers’ Expectations” on Thursday, June 18, 2009, in 2123 Rayburn House Office Building. The hearing will examine the potential privacy implications of behavioral advertising.

INVITED WITNESSES:

* Jeffrey Chester, Executive Director, Center for Digital Democracy
* Scott Cleland, President, Precursor LLC
* Charles D. Curran, Executive Director, Network Advertising Initiative
* Christopher M. Kelly, Chief Privacy Officer, Facebook
* Edward W. Felten, Professor of Computer Science and Public Affairs, Princeton University
* Anne Toth, Vice President of Policy, Head of Privacy, Yahoo! Inc.
* Nicole Wong, Deputy General Counsel, Google Inc.

WHEN: 10:00 a.m. on Thursday, June 18

WHERE: 2123 Rayburn House Office Building



This is the second in a series of hearings on the subject of behavioral advertising. I’ll predict that the Democrats will praise Google, the Republicans will criticize them, and nobody will pay much notice to Yahoo.

I only know four of the seven personally; I need to get out more.

New Broadband Czar

Trusted sources tell me Blair Levin is headed back to the FCC to be the Commissar of the People’s Glorious Five Year Plan for the Production of Bandwidth. He’d be a wonderful choice, of course, because he’s a bright and humorous fellow with no particular delusions about what he knows and what he doesn’t know.

I haven’t been enthusiastic about this National Broadband Plan business myself, but if we’re going to have one, we’re going to have one, and it should be the best one on the planet. And no, that doesn’t mean that the object of the exercise is for America’s broadband users to have big foam number 1 fingers, it means we do something sensible with the people’s tax dollars.

The plan should figure out a meaningful way to measure progress, and it should fund some of the efforts to create the next-generation network that will one day supersede the TCP/IP Internet. We all love TCP/IP, mind you, but it’s a 35-year-old solution to a problem we understand a lot better today than we did in 1974. We’ll get a chance to see just how much vision the New FCC has by their reaction to this proposal.

UPDATE: Press reports are dribbling out about the appointment.

Interlocking Directorates

The New York Times reports that regulators have an interest in the structure of the Apple and Google boards of directors:

The Federal Trade Commission has begun an inquiry into whether the close ties between the boards of two of technology’s most prominent companies, Apple and Google, amount to a violation of antitrust laws, according to several people briefed on the inquiry.

I doubt this will go very far, as the interlocking directors (Eric Schmidt and former Genentech CEO Arthur Levinson) will simply resign before any enforcement action is imminent, but it does raise some interesting questions about the market for mobile phone operating systems, currently split among Apple, Google, Microsoft, Palm, and a few others. These systems are rife with limitations, each of which could be considered a network neutrality violation when viewed in just the right way.

I imagine Apple itself might wish to give Dr. Schmidt his walking papers before he becomes an anti-trust problem, which he actually isn’t at this point. The FTC’s interest in this obscure situation is probably a signal that the Administration wants to be viewed as an anti-trust hawk without doing anything substantial.

But this is what the law calls an “occasion of sin.” Dear me.

What Policy Framework Will Further Enable Innovation on the Mobile Net?

Here’s the video of the panel I was on at the Congressional Internet Caucus Advisory Committee’s “State of the Mobile Net” conference in DC last Thursday. This was the closing panel of the conference, where all the loose ends were tied together. For those who don’t live and breathe Washington politics, I should do what moderator Blair Levin didn’t do and introduce the panel. Levin was the head of the TIGR task force for the Obama transition, the master group for the review of the regulatory agencies and the administration’s use of technology. Kevin Werbach is a professor at the Wharton School, and took part in the FCC review for the transition along with Susan Crawford. He runs the Supernova conference. Larry Irving was part of the review of NTIA for the transition, and is a former Assistant Secretary of Commerce. Ben Scott is the policy guy at Free Press, and Alex Hoehn-Saric is legal counsel to the Senate Committee on Commerce, Science, and Transportation.

Regulatory policy needs to be technically grounded, so I emphasized the tech side of things.

What I Did This Morning

While California was sleeping, I enjoyed a bit of broadband politics in the heart of the beast, testifying at the House Subcommittee on Communications, Technology, and the Internet hearing on “Communications Networks and Consumer Privacy: Recent Developments”:

The Subcommittee on Communications, Technology, and the Internet held a hearing titled, “Communications Networks and Consumer Privacy: Recent Developments” on Thursday, April 23, 2009, in 2322 Rayburn House Office Building. The hearing focused on technologies that network operators utilize to monitor consumer usage and how those technologies intersect with consumer privacy. The hearing explored three ways to monitor consumer usage on broadband and wireless networks: deep packet inspection (DPI); new uses for digital set-top boxes; and wireless Global Positioning System (GPS) tracking.

Witness List

* Ben Scott, Policy Director, Free Press
* Leslie Harris, President and CEO, Center for Democracy and Technology
* Kyle McSlarrow, President and CEO, National Cable and Telecommunications Association
* Dorothy Attwood, Chief Privacy Officer and Senior Vice President, Public Policy, AT&T Services, Inc.
* Brian R. Knapp, Chief Operating Officer, Loopt, Inc.
* Marc Rotenberg, Executive Director, The Electronic Privacy Information Center
* Richard Bennett, Publisher, BroadbandPolitics.com

It went pretty well, all in all; it’s really good to be last on a panel, and the Reps aren’t as snarky as California legislators. I’ll have more on this later.


eComm Spectrum 2.0 Panel Video

Here’s the licensing panel from eComm live and in color. Seeing yourself on TV is weird; my immediate reaction is to fast for about a month.

On a related note, see Saul Hansell’s musings on spectrum.

The issue I wanted to raise at eComm, and couldn’t due to lack of time and the meandering speculations about collision-free networks, is spectrum sharing. Two-way communications systems all need a shared pipe at some level, and the means by which access to the pipe is mediated distinguish one system from another. So far, the debate on white spaces in particular and open spectrum in general is about coding and power levels, the easy parts of the problem. The hard part is how the system decides which of a number of competing transmitters can access the pipe at any given time. The fact that speculative coding systems might permit multiple simultaneous connections on the same frequency in the same space/time moment doesn’t make this question go away, since they only help point-to-point communications. Internet access is inherently a point-to-multipoint problem, as these systems all aggregate wireless traffic in order to move it to the fiber backbone.

The advantage of licensing is that it provides the spectrum with an authorized bandwidth manager who can mediate among the desires of competing users and ensure fairness per dollar (or some similar policy). The idea that we can simply dispense with a bandwidth manager in a wide-area network access system remains to be proved.
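To make the efficiency point concrete, here's a minimal simulation (all parameters are illustrative, not drawn from any real radio system) comparing uncoordinated slotted-ALOHA-style access with a scheduled "bandwidth manager" that grants each time slot to exactly one station:

```python
import random

def aloha_throughput(n_stations, p_transmit, n_slots, seed=42):
    """Fraction of slots carrying exactly one transmission when every
    station independently decides to send (slotted-ALOHA-style access).
    Slots with two or more senders are collisions; zero senders is idle."""
    rng = random.Random(seed)
    good = 0
    for _ in range(n_slots):
        senders = sum(1 for _ in range(n_stations) if rng.random() < p_transmit)
        if senders == 1:
            good += 1
    return good / n_slots

def scheduled_throughput(n_stations, n_slots):
    """A bandwidth manager grants each slot to one station (round-robin),
    so under saturation every slot is useful: no collisions, no idle air."""
    return 1.0

if __name__ == "__main__":
    # p = 1/n is the optimum transmit probability for slotted ALOHA;
    # expected throughput is (1 - 1/n)**(n - 1), about 0.39 for n = 10.
    contention = aloha_throughput(n_stations=10, p_transmit=0.1, n_slots=100_000)
    print(f"Contention-based: {contention:.2f} of slots useful")
    print(f"Scheduled:        {scheduled_throughput(10, 100_000):.2f} of slots useful")
```

Even at its optimum transmit probability, uncoordinated access wastes most of the shared pipe on collisions and idle slots, which is the efficiency argument for a bandwidth manager in a nutshell.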

So I would submit that one of the principles that regulators need to consider when deciding between licensed and unlicensed uses is the efficiency of access. The notion that efficiency can be discarded in favor of ever-fatter pipes is obviously problematic in relation to wireless systems; they’re not making more spectrum.

Obama’s Missed Opportunity

According to National Journal, Susan Crawford is joining the Obama administration in a significant new role:

Internet law expert Susan Crawford has joined President Barack Obama’s lineup of tech policy experts at the White House, according to several sources. She will likely hold the title of special assistant to the president for science, technology, and innovation policy, they said.

This does not make me happy. Crawford is not a scientist, technologist, or innovator, and the job that’s been created for her needs to be filled by someone who is, and an exceptional one at that: a person with deep knowledge of technology, the technology business, and the dynamics of research and business that promote innovation. A life as a legal academic is not good preparation for this kind of job. Crawford is a sweet and well-meaning person who fervently believes that the policy agenda she’s been promoting is good for the average citizen and the general health of the democracy and that sort of thing, but she illustrates the adage that a little knowledge is a dangerous thing.

As much as she loves the Internet and all that it’s done for modern society, she has precious little knowledge of the practical realities of its operation. Her principal background is service on the ICANN Board, where she listened to debates on the number of TLDs that can dance on the head of a pin and similarly weighty matters. IETF engineers generally scoff at ICANN as a bloated, inefficient, and ineffective organization that deals with issues no serious engineer wants anything to do with. Her other qualification is an advisory role at Public Knowledge, a big player on the Google side of the net neutrality and copyright debates.

At my recent net neutrality panel discussion at MAAWG, I warned the audience that Crawford’s selection to co-manage the Obama transition team’s FCC oversight was an indication that extreme views on Internet regulation might become mainstream. It appears that my worst fears have been realized. Crawford has said that Internet traffic must not be shaped, managed, or prioritized by ISPs and core networking providers, which is a mistake of the worst kind. While work is being done all over the world to adapt the Internet to the needs of a more diverse mix of applications than it’s traditionally handled, Crawford harbors the seriously misguided belief that it already handles diverse applications well enough. Nothing could be farther from the truth, of course: P2P has interesting uses, but it degrades the performance of VoIP and video calling unless managed.

This is an engineering problem that can be solved, but which won’t be if the constraints on traffic management are too severe. People who harbor the religious approach to network management that Crawford professes have so far been an interesting sideshow in the network management wars, but if their views come to dominate the regulatory framework, the Internet will be in serious danger.

Creating a position for a special adviser on science, technology, and innovation gave President Obama the opportunity to lay the foundation of a strong policy in a significant area. Filling it with a law professor instead of an actual scientist, technologist, or innovator simply reinforces the creeping suspicion that Obama is less about transformational change than about business as usual. That’s a shame.

Cross-posted at CircleID.


Opting-out of Adsense

Regular readers are aware that this blog used to feature Google ads. We never made serious money from Adsense, so it was easy to decide to drop it when the Terms and Conditions of Google’s new behavioral advertising campaign were released. Here’s what Google suggests re: a privacy disclosure:

What should I put in my privacy policy?

Your posted privacy policy should include the following information about Google and the DoubleClick DART cookie:

* Google, as a third party vendor, uses cookies to serve ads on your site.
* Google’s use of the DART cookie enables it to serve ads to your users based on their visit to your sites and other sites on the Internet.
* Users may opt out of the use of the DART cookie by visiting the Google ad and content network privacy policy.

Because publisher sites and laws across countries vary, we’re unable to suggest specific privacy policy language. However, you may wish to review resources such as the Network Advertising Initiative, or NAI, which suggests the following language for data collection of non-personally identifying information:

We use third-party advertising companies to serve ads when you visit our website. These companies may use information (not including your name, address, email address, or telephone number) about your visits to this and other websites in order to provide advertisements about goods and services of interest to you. If you would like more information about this practice and to know your choices about not having this information used by these companies, click here.

You can find additional information in Appendix A of the NAI Self-Regulatory principles for publishers (PDF). Please note that the NAI may change this sample language at any time.

People don’t come to this site to buy stuff, and they shouldn’t have to undergo a vexing decision-making process before visiting this blog, so we’ve dropped Google as an advertiser. Not because Google is Evil, but simply because this is one too many hoops for our readers to jump through. Plus, the commission rate sucks.

So please continue to read Broadband Politics without fear of being reported to Big Brother.

Digital Britain and Hokey Tools

It’s helpful to see how other countries deal with the typically over-excited accusations of our colleagues regarding ISP management practices. Case in point is the Digital Britain Interim Report from the UK’s Department for Culture, Media and Sport and Department for Business, Enterprise and Regulatory Reform, which says (p. 27):

Internet Service Providers can take action to manage the flow of data – the traffic – on their networks to retain levels of service to users or for other reasons. The concept of so-called ‘net neutrality’, requires those managing a network to refrain from taking action to manage traffic on that network. It also prevents giving to the delivery of any one service preference over the delivery of others. Net neutrality is sometimes cited by various parties in defence of internet freedom, innovation and consumer choice. The debate over possible legislation in pursuit of this goal has been stronger in the US than in the UK. Ofcom has in the past acknowledged the claims in the debate but have also acknowledged that ISPs might in future wish to offer guaranteed service levels to content providers in exchange for increased fees. In turn this could lead to differentiation of offers and promote investment in higher-speed access networks. Net neutrality regulation might prevent this sort of innovation.

Ofcom has stated that provided consumers are properly informed, such new business models could be an important part of the investment case for Next Generation Access.

On the same basis, the Government has yet to see a case for legislation in favour of net neutrality. In consequence, unless Ofcom find network operators or ISPs to have Significant Market Power and justify intervention on competition grounds, traffic management will not be prevented.

(Ofcom is the UK’s FCC). Net neutrality is, in essence, a movement driven by fears of hypothetical harm that might be visited upon the Internet given a highly unlikely set of circumstances. Given that 1.4 billion people use the Internet every day, and that the actual instances of harmful discrimination by ISPs can be counted on one hand (and pale in comparison to the harm caused by malicious software and deliberate bandwidth hogging in any case), Ofcom’s stance is the only one that makes any sense: keep an eye on things, and don’t act without provocation. This position would have kept us out of Iraq, BTW.

Yet we have lawmakers in the US drafting bills full of nebulous language and undefined terms aimed at stemming this invisible menace.

Are Americans that much less educated than Brits, or are we just stupid? In fact, we have a net neutrality movement in the US simply because we have some well-funded interests manipulating a gullible public and a system of government that responds to emotion.

A good example of these forces at work is the freshly released suite of network test tools on some of Google’s servers. Measurement Lab checks how quickly interested users can reach Google’s complex in Mountain View, breaking down the process into hops. As far as I can tell, this is essentially a dolled-up version of the Unix “traceroute” which speculates about link congestion and takes a very long time to run.
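The core of what such a tool measures is something any interested user can approximate. Here's a minimal sketch, with the caveat that it times only TCP connection setup, a far cruder probe than either traceroute or Measurement Lab's actual tests, and the target host is just a placeholder:

```python
import socket
import statistics
import time

def connect_latency_ms(host, port, samples=5, timeout=2.0):
    """Median and spread of TCP connect times to a host, in milliseconds.
    A crude stand-in for the latency component of a broadband test."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=timeout):
            pass  # we only care about handshake time, so close immediately
        times.append((time.perf_counter() - start) * 1000)
    return statistics.median(times), max(times) - min(times)

if __name__ == "__main__":
    median_ms, spread_ms = connect_latency_ms("www.google.com", 80)
    print(f"median {median_ms:.1f} ms, spread {spread_ms:.1f} ms")
```

The median tells you how far away the destination effectively is; the spread hints at queueing along the way. Neither tells you *who* is responsible for any congestion, which is precisely the inferential leap the breathless coverage of these tools tends to make.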

The speed, latency, and consistency of access to Google is certainly an important part of the Internet experience, but it’s hardly definitive regarding who’s doing what to whom. But the tech press loves this sort of thing because it’s just mysterious enough in its operation to invite speculation and sweeping enough in its conclusions to get users excited. It’s early days for Measurement Lab, but I don’t have high expectations for its validity.