What’s happening in Iran?

BusinessWeek isn’t buying the story that Twitter is the essential organizing tool for the protests in Iran over suspicious election results:

“I think the idea of a Twitter revolution is very suspect,” says Gaurav Mishra, co-founder of 20:20 WebTech, a company that analyzes the effects of social media. “The amount of people who use these tools in Iran is very small and could not support protests that size.”

Their assessment is that people are organizing the old-fashioned way, by word-of-mouth and SMS. Ancient technology, that SMS. But it is a great story, either way.

What slows down your Wi-Fi?

The Register stumbled upon an eye-opening report commissioned by the UK telecom regulator, Ofcom, on sources of Wi-Fi interference in the UK:

What Mass discovered (pdf) is that while Wi-Fi users blame nearby networks for slowing down their connectivity, in reality the problem is people watching retransmitted TV in the bedroom while listening to their offspring sleeping, and there’s not a lot the regulator can do about it.

Outside central London that is: in the middle of The Smoke there really are too many networks, with resends, beacons and housekeeping filling 90 per cent of the data frames sent over Wi-Fi. This leaves only 10 per cent for users’ data. In fact, the study found that operating overheads for wireless Ethernet were much higher than anticipated, except in Bournemouth for some reason: down on the south coast 44 per cent of frames contain user data.

When 90% of the frames are overhead, the technology itself has a problem, and in this case the problem is largely Wi-Fi’s heavy backward-compatibility burden. Older versions of the protocol weren’t designed for obsolescence, so newer systems have to take expensive steps to make their transmissions visible to older ones; otherwise collisions happen, and that’s not good for anybody. Licensed spectrum can deal with the obsolescence problem by replacing older equipment; open spectrum has to bear the costs of compatibility forever. So this is one more example of the fact that “open” is not always better.
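To put the Ofcom numbers in perspective, here’s a back-of-the-envelope sketch of how much throughput is left for users when only a fraction of the frames on the air carry their data. The 54 Mbps nominal rate is my own assumption for illustration, not a figure from the study:

```python
def user_goodput(phy_rate_mbps, data_frame_fraction):
    """Rough estimate of the throughput left for user data when only
    data_frame_fraction of transmitted frames carry user payload."""
    return phy_rate_mbps * data_frame_fraction

phy = 54.0  # assumed 802.11g nominal rate in Mbps (my example)
print(user_goodput(phy, 0.10))  # central London: ~5.4 Mbps of 54
print(user_goodput(phy, 0.44))  # Bournemouth: ~23.8 Mbps of 54
```

The real picture is messier (overhead frames are shorter than data frames, and rates adapt), but the order of magnitude is the point: in central London, overhead eats roughly nine-tenths of the channel.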

What Policy Framework Will Further Enable Innovation on the Mobile Net?

Here’s the video of the panel I was on at the Congressional Internet Caucus Advisory Committee’s “State of the Mobile Net” conference in DC last Thursday. This was the closing panel of the conference, where all the loose ends were tied together. For those who don’t live and breathe Washington politics, I should do what moderator Blair Levin didn’t do and introduce the panel. Levin was the head of the TIGR task force for the Obama transition, the master group for the review of the regulatory agencies and the administration’s use of technology. Kevin Werbach is a professor at the Wharton School, and took part in the FCC review for the transition along with Susan Crawford. He runs the Supernova conference. Larry Irving was part of the review of NTIA for the transition, and is a former Assistant Secretary of Commerce. Ben Scott is the policy guy at Free Press, and Alex Hoehn-Saric is legal counsel to the Senate Committee on Commerce, Science and Transportation.

Regulatory policy needs to be technically grounded, so I emphasized the tech side of things.

eComm Spectrum 2.0 Panel Video

Here’s the licensing panel from eComm live and in color. Seeing yourself on TV is weird; my immediate reaction is to fast for about a month.

On a related note, see Saul Hansell’s musings on spectrum.

The issue I wanted to raise at eComm and couldn’t, due to lack of time and the meandering speculations about collision-free networks, is spectrum sharing. Two-way communications systems all need a shared pipe at some level, and the means by which access to the pipe is mediated distinguishes one system from another. So far, the debate on white spaces in particular and open spectrum in general is about coding and power levels, the easy parts of the problem. The hard part is how the system decides which of a number of competing transmitters can access the pipe at any given time. The fact that speculative coding systems might permit multiple simultaneous connections on the same frequency in the same space/time moment doesn’t make this question go away, since they only help point-to-point communications. Internet access is inherently a point-to-multipoint problem, as these systems all aggregate wireless links in order to move traffic to the fiber backbone.
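To see why mediation matters, consider the simplest uncoordinated sharing scheme, slotted ALOHA (my illustration, not anything proposed for the white spaces): every station transmits whenever it likes, and a slot is wasted whenever zero or two-plus stations transmit at once. A short Monte Carlo sketch shows how little of the pipe survives:

```python
import random

def slotted_aloha_throughput(n_stations, p_transmit, n_slots=100_000, seed=1):
    """Monte Carlo sketch of an uncoordinated shared pipe: each station
    independently transmits in a slot with probability p_transmit; a slot
    carries data only when exactly one station transmits (otherwise it's
    idle or a collision). Returns the fraction of useful slots."""
    rng = random.Random(seed)
    useful = 0
    for _ in range(n_slots):
        transmitters = sum(rng.random() < p_transmit for _ in range(n_stations))
        if transmitters == 1:
            useful += 1
    return useful / n_slots

# With 10 contenders at a near-optimal transmit probability, the best
# an uncoordinated scheme can do hovers near 1/e, i.e. under 40%:
print(slotted_aloha_throughput(10, 0.1))
```

Real protocols do better than ALOHA, but the shape of the problem is the same: without someone or something arbitrating access, a large share of the pipe is burned on contention.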

The advantage of licensing is that it provides the spectrum with an authorized bandwidth manager who can mediate among the desires of competing users and ensure fairness per dollar (or some similar policy). The idea that we can simply dispense with a bandwidth manager in a wide-area network access system remains to be proved.

So I would submit that one of the principles that regulators need to consider when deciding between licensed and unlicensed uses is the efficiency of access. The notion that efficiency can be discarded in favor of ever-fatter pipes is obviously problematic in relation to wireless systems; they’re not making more spectrum.

SxSW Wireless Meltdown

There’s nothing like a horde of iPhone users to kill access to AT&T’s wireless network: my AT&T Blackberry Bold was nearly unusable at eComm because of the large number of iPhones in the room, and the situation at SxSW is roughly the same. The silver lining in Austin this week is that the show’s Wi-Fi network is working well. Part of the trick is the deployment of Cisco 1252 Access Points with 5 GHz support. Unlike the Bold, iPhones can’t operate on 5 GHz channels, so all that spectrum is free for the taking by Bolds and laptops that can operate on it. In a concession to MacBook users, who aren’t allowed to select a Wi-Fi band, the show net had different ESSIDs for 2.4 and 5 GHz operation. It also has a load of reasonable restrictions:

Acceptable Use Policy

The Wireless network at the Convention Center is designed for blogging, e-mail, surfing and other general low bandwidth applications. It is not intended for streaming of any sort.

a) Peer-to-peer traffic such as bittorrent and the like, use a disproportionate amount of bandwidth and are unfair to other attendees. Please refrain from non-conference related peer-to-peer activities to minimize this effect.

b) Please be considerate and share the bandwidth with your fellow attendees. Downloading all of the videos from a video sharing service for example, is being a hog.

c) Please do not actively scan the network. Many of the tools for scanning an address range are too efficient at using as much bandwidth as possible, this will likely be noticed.

Despite this AUP, I can confidently predict that speakers will demand unrestricted use of wireless spectrum.

Slight disconnect, eh?

UPDATE: Om of GigaOm reports that AT&T is addressing the problems in Austin by switching on the 850 MHz band in their downtown Austin towers:

AT&T’s network choked and suddenly everyone was up in arms. And then Ma Bell got in touch with Stacey, who reported that AT&T was boosting its network capacity.

How did they do this? By switching on 850 MHz band on eight cell towers to blanket the downtown Austin area. This was in addition to the existing capacity on the 900 MHz band. AT&T is going to make the same arrangements in San Francisco and New York by end of 2009, AT&T Mobility CEO Ralph de la Vega told Engadget.

Not all AT&T devices support the 850 MHz band, but the Bold does. The larger takeaway, however, is that all wireless systems become victims of their own success. The more people use them, the worse they get. C’est la vie.


Spectrum 2.0 panel from eComm

Courtesy of James Duncan Davidson, here’s a snap from the Spectrum 2.0 panel at eComm09.

Maura Corbett, Rick Whitt, Peter Ecclesine, Darrin Mylet, and Richard Bennett at eComm

The general discussion was about the lessons learned from light licensing of wireless spectrum in the US, on the success of Wi-Fi and the failure of UWB, and what we can realistically hope to gain from the White Spaces licensing regime. As a person with a foot in both camps – technical and regulatory – I found it an interesting exercise in the contrast between the ways engineers and policy people deal with these issues. In general, hard-core RF engineer Peter Ecclesine and I were the most pessimistic about White Space futures, while the policy folks still see the FCC’s Report and Order as a victory.

In lobbying, you frequently run into circumstances where the bill you’re trying to pass becomes so heavily encumbered with amendments that it’s not worth passing. Rather than get your policy vehicle adopted in a crippled form, it’s better in such circumstances to take it off the table and work with the decision-makers to revive it in a future session without the shackles. While this is a judgment call – sometimes you go ahead and take the victory hoping to fix it later – it’s dangerous to pass crippled bills in a tit-for-tat system because you’re conceding a win in the next round to the other side.

I suggested that the FCC’s order was so badly flawed that the best thing for White Space Liberation would be to have the court void the order and the FCC start over. This message wasn’t well-received by Rick Whitt, but I had the feeling Peter was on board with it.

The problem with the White Spaces is that the FCC couldn’t make up its mind whether these bands are best used for home networking or for a Third (or is it fourth or fifth?) pipe. The power limits (40 milliwatts to 1 watt) doom it to home networking use only, which simply leads to more fragmentation in the home net market and no additional WAN pipes. That’s not the outcome the champions of open networks wanted, but it’s what they got.

eComm, incidentally, is a terrific conference. The focus is very much on the applications people are developing for mobile phones, and it’s essential for people like me who build networks to see what people want to do with them, especially the things they can’t do very well today. Lee Dryburgh did a fantastic job of organization and selecting speakers, and is to be congratulated for putting on such a stellar meeting of the minds.

Storm not winning any raves

Om Malik isn’t impressed by the BlackBerry Storm and neither am I:

The Storm reminds me of the St. Louis Cardinals phenom Rich Ankiel, who was an awesome pitcher till he flamed out, got hurt and came back as an outfielder and a hitter. He scored a lot of runs last seasons, but he isn’t a center fielder like Mickey Mantle. He is just another player. Storm will be that — just another touch-screen smartphone.

He points out that Blackberry excels at text, which is merely adequate on a touch screen. The omission of Wi-Fi makes the Storm unacceptable for me, so I reluctantly got a G1 to replace my lost Blackberry Curve, and I’m not exactly Google’s biggest fan (see next post.)

Ankiel’s OPS, .843, ranks 78th in the National League, BTW, which is the definition of mediocre.

The Trouble with White Spaces

Like several other engineers, I’m disturbed by the white spaces debate. The White Space Coalition, and its para-technical boosters, argue something like this: “The NAB is a tiger, therefore the White Spaces must be unlicensed.” And they go on to offer the comparison with Wi-Fi and Bluetooth, arguing as Tom Evslin does on CircleID today that “If we got a lot of innovation from just a little unlicensed spectrum, it’s reasonable to assume that we’ll get a lot more innovation if there’s a lot more [unlicensed] spectrum available.”

According to this argument, Wi-Fi has been an unqualified success in every dimension. People who make this argument haven’t worked with Wi-Fi or Bluetooth systems in a serious way, or they would be aware that there are in fact problems, serious problems, with Wi-Fi deployments.

For one thing, Wi-Fi systems are affected by sources of interference they can’t detect directly, such as FM Baby Monitors, cordless phones, and wireless security cameras. Running Wi-Fi on the same channel as one of these devices causes extremely high error rates. If 2.4 and 5.x GHz devices were required to emit a universally detectable frame preamble much of this nonsense could be avoided.

And for another, we have the problem of newer Wi-Fi devices producing frames that aren’t detectable by older gear (esp. original 802.11 and 802.11b) without a protection overhead that reduces throughput substantially. If we could declare anything older than 802.11a and .11g illegal, we could use the spectrum we have much more efficiently.

For another, we don’t have enough adjacent channel spectrum to use the newest version of Wi-Fi, 40 MHz 802.11n, effectively in the 2.4 GHz band. Speed inevitably depends on channel width, and the white spaces offer little dribs and drabs of spectrum all over the place, much of it in non-adjacent frequencies.
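The dependence of speed on channel width is fundamental: Shannon capacity scales linearly with bandwidth at a fixed signal-to-noise ratio, which is why a 40 MHz 802.11n channel has twice the ceiling of a 20 MHz one. A quick sketch (the 25 dB SNR is an arbitrary example of mine):

```python
import math

def shannon_capacity_mbps(bandwidth_mhz, snr_db):
    """Shannon capacity C = B * log2(1 + SNR). At a fixed SNR, capacity
    scales linearly with channel bandwidth."""
    snr = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
    return bandwidth_mhz * math.log2(1 + snr)

print(shannon_capacity_mbps(20, 25))  # 20 MHz channel
print(shannon_capacity_mbps(40, 25))  # 40 MHz channel: exactly double
```

Scattered, non-adjacent slivers of white space can’t be bonded into one wide channel, so they can’t buy this linear speedup.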

But most importantly, Wi-Fi is the victim of its own success. As more people use Wi-Fi, we have to share the limited number of channels across more Access Points, and they are not required to share channel space with each other in a particularly efficient way. We can certainly expect a lot of collisions, and therefore packet loss, from any uncoordinated channel access scheme such as Wi-Fi’s when it operates on a large geographic scale. This is the old “tragedy of the commons” scenario.
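The arithmetic of channel scarcity is unforgiving. The 2.4 GHz band has only three non-overlapping 20 MHz channels (1, 6, and 11 in the US), so the odds that independently configured Access Points within earshot of one another end up sharing a channel climb very fast. A small illustrative calculation of mine, birthday-problem style:

```python
def prob_some_channel_shared(n_aps, n_channels=3):
    """Probability that at least two of n_aps independently configured
    access points land on the same channel, assuming each picks
    uniformly from n_channels non-overlapping channels."""
    p_all_distinct = 1.0
    for k in range(n_aps):
        p_all_distinct *= max(0.0, (n_channels - k) / n_channels)
    return 1.0 - p_all_distinct

for n in (2, 3, 4):
    print(n, round(prob_some_channel_shared(n), 3))
# two APs already collide a third of the time; four in range guarantee it
```

Real APs often scan and pick the least-busy channel rather than choosing at random, but once the number of neighbors exceeds three, sharing is unavoidable no matter how clever the selection.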

The problem of deploying wireless broadband is mainly a tradeoff of propagation, population, and bandwidth. The larger the population your signal covers, the greater the bandwidth needs to be in order to provide good performance. The nice thing about Wi-Fi is its limited propagation, because it permits extensive channel re-use without collisions. If the Wi-Fi signal in your neighbor’s house propagated twice as far, it would have four times as many chances to collide with other users. So high power and great propagation isn’t an unmitigated good.
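The four-times figure is just geometry: coverage grows with the square of the propagation radius, and so does the pool of potential colliders. A one-liner makes the point (the ranges are illustrative, not measured):

```python
import math

def coverage_area(radius_m):
    """Disk area covered by a signal that propagates radius_m in all
    directions on flat, open ground -- a deliberately crude model."""
    return math.pi * radius_m ** 2

# Doubling the propagation range quadruples the covered area,
# and with it the number of stations you can collide with:
print(coverage_area(60) / coverage_area(30))
```

Assuming users are spread evenly over the area, double the range means four times the contenders for the same channel.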

The advantage of licensing is that the license holder can apply authoritarian rules that ensure the spectrum is used efficiently. The disadvantage is that the license holder can over-charge for the use of such tightly-managed spectrum, and needs to in order to pay off the cost of his license.

The FCC needs to move into the 21st century and develop some digital rules for the use of unlicensed or lightly-licensed spectrum. The experiment I want to see concerns the development of these modern rules. We don’t need another Wi-Fi; we know how that worked out.

So let’s not squander the White Spaces opportunity with another knee-jerk response to the spectre of capitalism. I fully believe that people like Evslin, the White Space Coalition, and Susan Crawford are sincere in their belief that unlicensed White Spaces would be a boon to democracy; it’s just that their technical grasp of the subject matter is insufficient for their beliefs to amount to serious policy.

Google open-sources Android

I lost my Blackberry Curve somewhere in England last week, so I ordered an HTC G1 from T-Mobile as a replacement. The Curve doesn’t do 3G, so it’s an obsolete product at this point. And as I’m already a T-Mobile customer (I chose them for the Wi-Fi capability of their Curves), the path of least resistance to 3G goes through the G1. Just yesterday I was explaining to somebody that Android wasn’t really open source, but Google was apparently listening and decided to make a liar of me by open-sourcing Android:

With the availability of Android to the open-source community, consumers will soon start to see more applications like location-based travel tools, games and social networking offerings being made available to them directly; cheaper and faster phones at lower costs; and a better mobile web experience through 3G networks with richer screens. The easy access to the mobile platform will not only allow handset makers to download the code, but to build devices around it. Those not looking to build a device from scratch will be able to take the code and modify it to give their devices more of a unique flavor.

“Now OEMs and ODMs who are interested in building Android-based handsets can do so without our involvement,” Rich Miner, Google’s group manager for mobile platforms, told us earlier today. Some of these equipment makers are going to expand the role of Android beyond handsets.

This is good news, of course. I haven’t enjoyed the fact that T-Mobile sat between me and RIM for Blackberry software upgrades. The first add-on app that I’d like to see for the G1 is something to allow tethering a laptop to 3G via Bluetooth. I could tether the Curve, but as it only supports Edge it wasn’t incredibly useful.

In a more perfect world, I’d prefer the Treo Pro over the G1, but it doesn’t work on T-Mobile’s crazy array of AWS and normal frequencies, and is also not subsidized, so the G1 is a better deal. The Blackberry Storm is probably a better overall device than the G1, but it’s exclusive to Verizon so I would have had to pay a $200 early termination fee to get it. These phones are mainly for fun, so paying a fee to leave a carrier I basically like makes it all too serious.