The Fall of Joi Ito

Back in 2003 I got into a blog tussle with Joi Ito, the disgraced former director of the MIT Media Lab who was forced to resign from the Lab and a number of corporate boards over ethical lapses related to Jeffrey Epstein. I was fairly amazed that Ito was hired by the Media Lab in the first place.

It seems like a job for a futurist, a technologist, or an intellectual, and Ito is none of those things. But he is well-connected, which is great for fundraising if nothing else.

Emergent Democracy

I’m not going to rehash the issues at MIT because they’ve been well covered by Ronan Farrow, Andrew Orlowski, and Evgeny Morozov. I’d like to share a post I wrote about Ito’s ideas about something he called “emergent democracy”, my reaction to them, and Ito’s reaction to my commentary. This is about schadenfreude, in other words.

Ito was one of the first to jump aboard the blog train in the days when we still called blogs “weblogs”. He tried to put together an essay mashing up the ideas Steven Johnson laid out in his 2001 book Emergence: The Connected Lives of Ants, Brains, Cities, and Software with Howard Rheingold’s 1993 musings about the Internet in The Virtual Community.

In essence, Ito claimed that the Internet could, given the creation of new tools, revolutionize the ways societies govern themselves. Instead of the musty old top-down, command-and-control model of representative democracy, the Internet could expand the circle of participation in governmental decision-making and usher in a new era of direct democracy.

Since [1993, Rheingold] has been criticized as being naive about his views. This is because the tools and protocols of the Internet have not yet evolved enough to allow the emergence of Internet democracy to create a higher-level order. As these tools evolve we are on the verge of an awakening of the Internet. This awakening will facilitate a political model enabled by technology to support those basic attributes of democracy which have eroded as power has become concentrated within corporations and governments. It is possible that new technologies may enable a higher-level order, which in turn will enable a form of emergent democracy able to manage complex issues and support, change or replace our current representative democracy. It is also possible that new technologies will empower terrorists or totalitarian regimes. These tools will have the ability to either enhance or deteriorate democracy and we must do what is possible to influence the development of the tools for better democracy.

Emergent Democracy, Joi Ito, 2003.

Mixed Results

To Ito’s and Rheingold’s credit, they didn’t see a future that was all peaches and cream. But it was fairly obvious even in those days that the popularization of the Internet was going to bring forth both good and bad results.

We can learn obscure subjects quickly, we can shop at the world’s largest store without leaving our desks, and we can learn how to fix things. But we also have Trump in the White House and a networked terror cult known as ISIS tearing it up in the Middle East.

It wasn’t either/or, it was both/and: new conveniences and new threats at the same time. Ito didn’t anticipate this, but it was always the most likely future. He also found himself unable to complete the essay, so he turned it over to one of his Wellbert friends, Jon Lebkowsky, to finish.

My Criticism

I addressed an early draft of Emergent Democracy in this post, Emergence Fantasies. It appeared to me that Ito was effectively touting a form of government like the California initiative process that would be informed by blog posts and effectively controlled by a blogger elite. The elite bar was pretty low among the blogs in 2003, so this didn’t look like progress to me.

The larger problem was the essential incoherence of Ito’s reasoning. Well-connected as he is socially, Ito is no intellectual. He also lacks a reasonable understanding of the ways legislative bodies work, at least according to my frame of reference as someone who’s been working with them for twenty years or so.

The emergence thing is also suspect. At the time, it was a fixation among the crowd that thinks of Jared Diamond, Steven Pinker, and Nassim Taleb as great thinkers, but it’s little more than trivia about the behavior of animal groups. Ant colonies are far from grass-roots democracies in any case, and they’ve fascinated political thinkers for thousands of years. I’d be happy to read a book on the biochemistry of ant colonies, but Emergence is not it.

So I said this:

Emergent democracy apparently differs from representative democracy by virtue of being unmediated, and is claimed by the author to offer superior solutions to complex social problems because governments don’t scale, or something. Emergent democracy belief requires us to abandon notions of intellectual property and corporations, apparently because such old-fashioned constructs would prevent democratic ants from figuring out where to bury their dead partners, I think. One thing that is clear is that weblogs are an essential tool for bringing emergent democracy to its full development, and another is that the cross-blog debate on the liberation of Iraq is a really cool example of the kind of advanced discourse that will solve all these problems we’ve had over the years as soon as we evolve our tools to the ant colony level.

Emergence fantasies, me, 2003.

The conversation continued on Ito’s blog under a post ironically titled Can we control our lust for power? The answer to that question was obviously “no”.

The Black List

Ito was not amused, so he black-listed me:

Mr. Bennett has a very dismissive and insulting way of engaging and is a good example of “noise” when we talk about the “signal to noise ratio”. Adam has recently taken over the fight for me on my blog. My Bennett filter is now officially on so I won’t link to his site or engage directly with the fellow any more. At moments he seems to have a point, but it’s very tiring engaging with him and I would recommend others from wasting as much time as I have.

So that’s Joi Ito for you: a man who loves Jeffrey Epstein so much that he’s willing to lie to his bosses to keep him in the Media Lab social network but can’t take honest criticism. His fall from grace was long overdue, and I’m proud to have such enemies.

Has the FCC Created a Stone Too Heavy for It to Lift?

After five years of bickering, the FCC passed an Open Internet Report & Order on a partisan 3-2 vote this week. The order is meant to guarantee that the Internet of the future will be just as free and open as the Internet of the past. Its success depends on how fast the Commission can transform itself from an old school telecom regulator wired to resist change into an innovation stimulator embracing opportunity. One thing we can be sure about is that the order hasn’t tamped down the hyperbole that’s fueled the fight to control the Internet’s constituent parts for all these years.

Advocates of net neutrality professed deep disappointment that the FCC’s rules weren’t more proscriptive and severe. Free Press called the order “fake net neutrality,” Public Knowledge said it “fell far short,” Media Access Project called it “inadequate and riddled with loopholes,” and New America Foundation accused the FCC of “caving to telecom lobbyists.” These were their official statements to the press; their Tweets were even harsher.

Free marketers were almost as angry: Cato denounced the order as “speech control,” Washington Policy Center said it “fundamentally changes many aspects of the infrastructure of the Internet,” and the Reason Foundation said it will lead to “quagmire after quagmire of technicalities, which as they add up will have a toll on investment, service and development.”

Republican Congressional leaders made no secret of their displeasure with the FCC’s disregard for their will: Rep. Fred Upton (R-Michigan), the incoming Commerce Committee Chairman, called it a “hostile action against innovation that can’t be allowed to stand”; Rep. Greg Walden (R-Oregon), incoming Chairman of the Subcommittee on Communications and Technology, called it a “power grab” and vowed to hold hearings to overturn it; and Sen. Kay Bailey Hutchison (R-Texas), Ranking Member of the Senate Commerce, Science, and Transportation Committee, said the order “threatens the future economic growth of the Internet.” Setting Internet policy is indeed a Congressional prerogative rather than an agency matter, so the longer-term solution must come from the Hill, and sooner would be better than later.

Contrary to this criticism and to snarky blogger claims, not everyone was upset with the FCC’s action, coming as it did after a year-long proceeding on Internet regulation meant to fulfill an Obama campaign pledge to advance net neutrality. The President himself declared the FCC action an important part of his strategy to “advance American innovation, economic growth, and job creation,” and Senator John Kerry (D-Massachusetts) applauded the FCC for reaching consensus.

Technology industry reaction ranged from positive to resigned: Information Technology Industry Council President and CEO Dean Garfield declared the measure “ensures continued innovation and investment in the Internet,” TechNet supported it, and National Cable and Telecommunications Association head Kyle McSlarrow said it could have been much worse. At the Information Technology and Innovation Foundation, we were pleased by the promises of a relatively humble set of rules, less so with the final details; we remain encouraged by the robust process the FCC intends to create for judging complaints, one that puts technical people on the front lines. In the end, the order got the support of the only majority that counts: three FCC commissioners.

Most of us who reacted favorably acknowledged the FCC’s order wasn’t exactly as we would have written it, but accepted it as a pragmatic political compromise that produces more positives than negatives. The hoped-for closing of the raucous debate will have immense benefits on its own, as simply bringing this distracting chapter in the Internet’s story to an end will allow more time for sober discussion about the directions we’d like the Internet to take in its future development. There is no shortage of policy issues that have been cramped by the tendency to view net neutrality as the one great magic wand with the power to solve all the Internet’s problems: The FCC has work to do on freeing up spectrum for mobile networking, the Universal Service Fund needs to be reformed, and the National Broadband Plan needs to be implemented.

If the FCC’s approach proves sound, it might well be exported to other countries, forming the basis of a consistent international approach to the oversight of an international network developed on consistent standards of its own. Such an outcome would have positive consequences for the Internet standards community, which has its own backlog of unfinished business such as scalable routing, congestion management, security, and the domestication of peer-to-peer file sharing and content delivery networks to resolve. This outcome is far from inevitable; last minute rule changes make it less likely than it might have been.

The most important thing the FCC can do in implementing its system of Internet oversight is to elevate process over proscriptive rules. The traditional approach to telecom regulation is to develop a thick sheath of regulations that govern everything from the insignias on the telephone repair person’s uniform to the colors of the insulators on RJ11 cables and apply them in top-down, command-and-control fashion. Many of those on the pro-net neutrality side are steeped in telecom tradition, and they expected such an approach from the FCC for the Internet; theirs are the angry reactions.

But the Internet isn’t a telecom network, and a foot-high stack of regulations certainly would produce the negative consequences for innovation and progress the FCC’s critics have forecast. The appropriate way to address Internet regulation is to follow the model that the Internet has developed for itself, based on a small number of abstract but meaningful principles (each of which is subject to change for good reason) applied by a broad-based community of experts in a collaborative, consultative setting. Internet standards are not devised in an adversarial setting populated by angels and devils locked into mortal combat; they come from a process that values “rough consensus and running code.”

The specifics of the FCC’s order nevertheless give pause to those well-schooled in networking. A few hours before the Commission’s vote, Commissioner Copps persuaded Chairman Genachowski to reverse the Waxman Bill’s presumption regarding the premium transport services that enable Internet TV and video conferencing to enjoy the same level of quality as cable TV. Where the early drafts permitted these services as long as they were offered for sale on a non-discriminatory basis, the final rule arbitrarily presumes them harmful.

The order makes hash of the relationship of the content accelerators provided by Akamai and others to the presumptively impermissible communication accelerators that ISPs might provide one day in order to enable HD group video conferencing and similar emerging applications. The Commission majority fears that allowing network operators to offer premium transport to leading edge apps will put the squeeze on generic transport, but fails to consider that such potential downsides of well-accepted technical practices for Quality of Service can be prevented by applying a simple quota limit on the percentage of a pipe that can be sold as “premium.” This fact, which is obvious to skilled protocol engineers, goes unmentioned in the order.
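The quota idea mentioned above can be sketched in a few lines. This is a hypothetical illustration of admission control, not anything from the FCC order or a real ISP system; the class, constants, and numbers are all invented for the example. The point is simply that a hard cap on the premium share of a link mechanically guarantees a floor for generic transport:

```python
# Hypothetical sketch: cap the share of a link's capacity that can be
# sold as premium (prioritized) transport. All names and numbers here
# are illustrative assumptions, not drawn from the FCC order.

LINK_CAPACITY_MBPS = 1000   # total capacity of the pipe
PREMIUM_QUOTA = 0.2         # at most 20% may be sold as premium

class PremiumAdmission:
    def __init__(self, capacity_mbps: float, quota_fraction: float):
        self.capacity = capacity_mbps
        self.quota = quota_fraction
        self.allocated = 0.0  # premium bandwidth already sold, in Mbps

    def can_admit(self, requested_mbps: float) -> bool:
        """True if admitting this flow keeps the premium share under quota."""
        return self.allocated + requested_mbps <= self.capacity * self.quota

    def admit(self, requested_mbps: float) -> bool:
        """Admit a premium flow, or reject it to preserve generic capacity."""
        if not self.can_admit(requested_mbps):
            return False
        self.allocated += requested_mbps
        return True

pipe = PremiumAdmission(LINK_CAPACITY_MBPS, PREMIUM_QUOTA)
assert pipe.admit(150)        # 150 of the 200 Mbps quota: admitted
assert not pipe.admit(100)    # would push premium past 20%: rejected
```

However the cap is chosen, generic traffic always retains at least 80% of the link in this sketch, which is the "squeeze" protection the order's reasoning overlooks.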

The poor reasoning for this rule casts doubt on the FCC’s ability to enforce it effectively without outside expertise. By rejecting Internet standards such as RFC 2475 and IEEE standards such as 802.1Q that don’t conform to the telecom activists’ nostalgic, “all packets are equal” vision of the Internet, the FCC chose to blind itself to one of the central points in Tim Wu’s “Network Neutrality, Broadband Discrimination” paper that started the fight: A neutral Internet favors content applications, as a class, over communication applications and is therefore not truly an open network. The only way to make a network neutral among all applications is to differentiate loss and delay among applications; preferably, this is done by user-controlled means. That’s not always possible, so other means are sometimes necessary as well.
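The "user-controlled means" of differentiation mentioned above already exist at the sockets layer: an application can mark its own packets with a DiffServ code point (RFC 2475) and let the network decide how to honor it. The sketch below is a minimal illustration, assuming a Linux-style sockets API; the EF value (46) is the standard Expedited Forwarding class used for low-delay traffic such as VoIP, though nothing obliges any router to act on it:

```python
import socket

# Minimal sketch of application-controlled QoS marking under RFC 2475.
# The app, not the ISP, asks for low-delay treatment by setting the
# DiffServ field in its own packets' IP headers.

DSCP_EF = 46            # Expedited Forwarding: the low-loss, low-delay class
TOS_EF = DSCP_EF << 2   # DSCP occupies the top six bits of the TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_EF)

# Every datagram sent on this socket now carries the EF mark; routers
# along the path may map it to a priority queue, or ignore it entirely.
assert sock.getsockopt(socket.IPPROTO_IP, socket.IP_TOS) == TOS_EF
sock.close()
```

Whether networks honor such marks is exactly the policy question at issue; the mechanism itself is old, standardized, and under the user's control.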

All in all, the Commission has built a stone too heavy for it to lift all by itself. The rules have just enough flexibility that the outside technical advisory groups that will examine complaints may be able to correct the order’s errors, but to be effective, the advisors need much deeper technical knowledge than the FCC staffers who wrote the order can provide.

It’s difficult to ask the FCC – an institution with its own 75 year tradition in which it has served as the battleground for bitter disputes between monopolists and public interest warriors – to turn on a dime and embrace a new spirit of collaboration, but without such a far-reaching institutional transformation its Internet regulation project will not be successful. Those of us who work with the FCC are required to take a leap of faith to the effect that the Commission is committed to transforming itself from a hidebound analog regulator into a digital age shepherd of innovation. Now that the Open Internet Report & Order has passed, we have no choice but to put our shoulders to the rock to help push it along. There’s no turning back now.

[cross-posted from the Innovation Policy Blog]

Iranian Protests

Andrew Sullivan is the one-man, citizen journalism aggregator of the protests in Iran today. His collection of Tweets and YouTube videos convey the impression of a large-scale uprising that the government is trying to control with riot police, chemical weapons, and propaganda. It certainly appears that the uprising is gathering steam and that the government is out-matched. Given that the Supreme Leader relies on his moral authority to govern, and that authority is now shot full of holes, it seems unlikely that he can hang on to power.

Twitter and YouTube are certainly playing a role in getting the news out of the blackout the Iranian government has sought to impose.

What’s happening in Iran?

BusinessWeek isn’t buying the story that Twitter is the essential organizing tool for the protests in Iran over suspicious election results:

“I think the idea of a Twitter revolution is very suspect,” says Gaurav Mishra, co-founder of 20:20 WebTech, a company that analyzes the effects of social media. “The amount of people who use these tools in Iran is very small and could not support protests that size.”

Their assessment is that people are organizing the old-fashioned way, by word-of-mouth and SMS. Ancient technology, that SMS. But it is a great story, either way.

Finally, nominees for the FCC

Amy Schatz of the WSJ reports that a deal has been struck to move the new nominees into the FCC:

Work has slowed to a crawl at the Federal Communications Commission, since President Barack Obama’s pick to be chairman, Julius Genachowski, is still awaiting Senate confirmation.

But the logjam could be broken soon: Republicans appear to have settled on two people to fill the GOP seats on the five-member board, paving the way for a confirmation hearing in June. Senate Republicans have agreed on former Commerce Department official Meredith Attwell Baker and current FCC Commissioner Robert McDowell, officials close to the process say.

This is good news. McDowell has been the best of the FCC commissioners since his appointment, and allowing him a second term is a very bright move. Uncertainty over McDowell’s future was the cause of the slowdown in confirmation hearings, since these things go forward with the whole slate of nominees. So the new FCC is going to look like this:

Chairman Genachowski, new blood
Dem Copps, old hand
Dem Mignon Clyburn, new blood
Rep McDowell
Rep Meredith Baker, new blood

It’s interesting that Baker and Clyburn are both nepotism candidates, as Clyburn is the daughter of powerful Congressman James Clyburn and Baker is the daughter-in-law of the Bush family’s consigliere, James Baker. That’s not necessarily a bad thing, as the best Chairman of recent times was Colin Powell’s son, and neither of the daughters is particularly unqualified. But if you want to get a laugh out of Blair Levin, the former “sixth commissioner” who wasn’t nominated, tell him you understand that he’s not qualified to serve on the FCC because his daddy’s not in politics. You won’t get a laugh exactly, more like a moan.

The first item of business for the nominees, once they’re confirmed, will be the list of 120 questions Copps put to the world. Good luck to the Commission with that.

The Privacy Hearing

Here’s some news on Boucher’s privacy campaign:

It’s not clear how broad a law Boucher has in mind, though it’s likely to be some codification of generally accepted data-privacy practices. Those include telling people when you collect data and why, letting them choose to join in or not, using the data only for the reason you collected it, letting people see and correct the information and destroying it when it’s no longer needed.

But engineer Richard Bennett argued that DPI and network management techniques were getting a bad name and are simply the logical extension of the tools used in the early days of the internet.

Hoping to convince the subcommittee not to write legislation, AT&T’s chief privacy officer Dorothy Atwood said that the committee’s previous hearings and investigations have led to “robust self-regulation,” code-words for “no laws needed.” There’s some truth in that statement, since last summer, the subcommittee single-handedly ended ISPs dreams of letting outside companies spy on their subscribers in exchange for a little more revenue.

If privacy is the problem, it needs to be the focus of the bill, not one of many techniques that may be used to compromise it, of course.

What I Did This Morning

While California was sleeping, I enjoyed a bit of broadband politics in the heart of the beast, testifying at the House Subcommittee on Communications, Technology, and the Internet on Communications Networks and Consumer Privacy: Recent Developments.

The Subcommittee on Communications, Technology, and the Internet held a hearing titled, “Communications Networks and Consumer Privacy: Recent Developments” on Thursday, April 23, 2009, in 2322 Rayburn House Office Building. The hearing focused on technologies that network operators utilize to monitor consumer usage and how those technologies intersect with consumer privacy. The hearing explored three ways to monitor consumer usage on broadband and wireless networks: deep packet inspection (DPI); new uses for digital set-top boxes; and wireless Global Positioning System (GPS) tracking.

Witness List

* Ben Scott, Policy Director, Free Press
* Leslie Harris, President and CEO, Center for Democracy and Technology
* Kyle McSlarrow, President and CEO, National Cable and Telecommunications Association
* Dorothy Attwood, Chief Privacy Officer and Senior Vice President, Public Policy, AT&T Services, Inc.
* Brian R. Knapp, Chief Operating Officer, Loopt, Inc.
* Marc Rotenberg, Executive Director, The Electronic Privacy Information Center
* Richard Bennett, Publisher, BroadbandPolitics.com

It went pretty well, all in all; it’s really good to be last on a panel, and the Reps aren’t as snarky as California legislators. I’ll have more on this later.


Obama’s Missed Opportunity

According to National Journal, Susan Crawford is joining the Obama administration in a significant new role:

Internet law expert Susan Crawford has joined President Barack Obama’s lineup of tech policy experts at the White House, according to several sources. She will likely hold the title of special assistant to the president for science, technology, and innovation policy, they said.

This does not make me happy. Crawford is not a scientist, technologist, or innovator, and the job that’s been created for her needs to be filled by someone who is, and an exceptional one at that: a person with deep knowledge of technology, the technology business, and the dynamics of research and business that promote innovation. A life as a legal academic is not good preparation for this kind of a job. Crawford is a sweet and well-meaning person, who fervently believes that the policy agenda she’s been promoting is good for the average citizen and the general health of the democracy and that sort of thing, but she illustrates the adage that a little knowledge is a dangerous thing.

As much as she loves the Internet and all that it’s done for modern society, she has precious little knowledge about the practical realities of its operation. Her principal background is service on the ICANN Board, where she listened to debates on the number of TLDs that can dance on the head of pin and similarly weighty matters. IETF engineers generally scoff at ICANN as a bloated, inefficient, and ineffective organization that deals with issues no serious engineer wants anything to do with. Her other qualification is an advisory role at Public Knowledge, a big player on the Google side of the net neutrality and copyright debates.

At my recent net neutrality panel discussion at MAAWG, I warned the audience that Crawford’s selection to co-manage the Obama transition team’s FCC oversight was an indication that extreme views on Internet regulation might become mainstream. It appears that my worst fears have been realized. Crawford has said that Internet traffic must not be shaped, managed, or prioritized by ISPs and core networking providers, which is a mistake of the worst kind. While work is being done all over the world to adapt the Internet to the needs of a more diverse mix of applications than it’s traditionally handled, Crawford harbors the seriously misguided belief that it already handles diverse applications well enough. Nothing could be farther from the truth, of course: P2P has interesting uses, but it degrades the performance of VoIP and video calling unless managed.

This is an engineering problem that can be solved, but which won’t be if the constraints on traffic management are too severe. People who harbor the religious approach to network management that Crawford professes have so far been an interesting sideshow in the network management wars, but if their views come to dominate the regulatory framework, the Internet will be in serious danger.

Creating a position for a special adviser on science, technology and innovation gave President Obama the opportunity to lay the foundation of a strong policy in a significant area. Filling it with a law professor instead of an actual scientist, technologist, or innovator simply reinforces the creeping suspicion that Obama is less about transformational change than about business as usual. That’s a shame.

Cross-posted at CircleID.


Thirty Profiles

Dave Burstein of DSL Prime has posted profiles of 30 FCC candidates to his web site, including one transition team member:

Susan Crawford, now teaching at Michigan, also has enormous respect from her peers and would bring international perspective from her role at ICANN setting world Internet policy

The selection of Crawford to join Kevin Werbach on the FCC transition team has already gotten some of my colleagues on the deregulatory side pretty excited, as she has the image of being a fierce advocate of a highly-regulated Internet. And indeed, she has written some strong stuff in favor of the “stupid network” construct that demands all packets be treated as equals inside the network. The critics are missing something that’s very important, however: both Werbach and Crawford are “Internet people” rather than “telecom people,” and that’s a very important thing. While we may not like Crawford’s past willingness to embrace a neutral routing mandate, the more interesting question is how she comes down on a couple of issues that trump neutral routing: network management and multi-service routing.

We all know by now that the network management exception is more powerful than Powell’s “Four Freedoms” where the rubber meets the road, but we lack any clear guidance to ISPs as to how their management practices will be evaluated. Clarification of the rules is as much a benefit to carriers as it is to consumers. The one way to ensure that we all lose is to keep lumbering along in the murk of uncertain authority and secret rules. Internet people are going to ask the right questions to their candidates, and anybody who can satisfy both Werbach and Crawford will have to be a good choice. Check Werbach’s web site for his papers. Unfortunately, the most interesting of them is not yet in print, “The Centripetal Network: How the Internet Holds Itself Together, and the Forces Tearing it Apart”, UC Davis Law Review, forthcoming 2008. Perhaps he’ll post a draft.

The question of multi-service routing is also very important. Crawford has written and testified to the effect that the Internet is the first global, digital, multi-service network, which is substantially correct. The Internet is not fully multi-service today, however, and can’t be unless it exposes multiple service levels at the end points for applications to use easily. The generic public Internet has a single transport service which has to meet the needs of diverse applications today, which is not really an achievable goal in the peer-to-peer world.

Election Story

This little gem is from FiveThirtyEight.com

So a canvasser goes to a woman’s door in Washington, Pennsylvania. Knocks. Woman answers. Knocker asks who she’s planning to vote for. She isn’t sure, has to ask her husband who she’s voting for. Husband is off in another room watching some game. Canvasser hears him yell back, “We’re votin’ for the n***er!”

Woman turns back to canvasser, and says brightly and matter of factly: “We’re voting for the n***er.”

In this economy, racism is officially a luxury. How is John McCain going to win if he can’t win those voters?

I surmise that Tuesday night’s election night coverage isn’t going to take very long.