SxSW Wireless Meltdown

There’s nothing like a horde of iPhone users to kill access to AT&T’s wireless network: my AT&T BlackBerry Bold was nearly unusable at eComm because of the large number of iPhones in the room, and the situation at SxSW is roughly the same. The silver lining in Austin this week is that the show’s Wi-Fi network is working well. Part of the trick is the deployment of Cisco 1252 Access Points with 5 GHz support. Unlike the Bold, iPhones can’t operate on 5 GHz channels, so all that spectrum is free for the taking by Bolds and laptops that can operate on it. In a concession to MacBook users, who aren’t allowed to select a Wi-Fi band, the show net had different ESSIDs for 2.4 and 5 GHz operation. It also has a load of reasonable restrictions:

Acceptable Use Policy

The Wireless network at the Convention Center is designed for blogging, e-mail, surfing and other general low bandwidth applications. It is not intended for streaming of any sort.

a) Peer-to-peer traffic such as bittorrent and the like, use a disproportionate amount of bandwidth and are unfair to other attendees. Please refrain from non-conference related peer-to-peer activities to minimize this effect.

b) Please be considerate and share the bandwidth with your fellow attendees. Downloading all of the videos from a video sharing service for example, is being a hog.

c) Please do not actively scan the network. Many of the tools for scanning an address range are too efficient at using as much bandwidth as possible, this will likely be noticed.

Despite this AUP, I can confidently predict that speakers will demand unrestricted use of wireless spectrum.

Slight disconnect, eh?

UPDATE: Om of GigaOm reports that AT&T is addressing the problems in Austin by switching on the 850 MHz band in their downtown Austin towers:

AT&T’s network choked and suddenly everyone was up in arms. And then Ma Bell got in touch with Stacey, who reported that AT&T was boosting its network capacity.

How did they do this? By switching on the 850 MHz band on eight cell towers to blanket the downtown Austin area. This was in addition to the existing capacity on the 1900 MHz band. AT&T is going to make the same arrangements in San Francisco and New York by the end of 2009, AT&T Mobility CEO Ralph de la Vega told Engadget.

Not all AT&T devices support the 850 MHz band, but the Bold does. The larger takeaway, however, is that all wireless systems become victims of their own success: the more people use them, the worse they get. C’est la vie.

Notable debates in the House of Lords

We’re quite fond of Sir Tim Berners-Lee. As the first web designer, he personally converted the Internet from an odd curiosity of network engineering into a generally useful vehicle for social intercourse, changing the world. That this was a contribution of inestimable value goes without saying. It’s therefore distressing to read that he’s been mumbling nonsense in public fora about Internet management practices.

For all his brilliance, Sir Tim has never really been on top of the whole traffic thing. His invention, HTTP 1.0, did strange things to the Internet’s traffic-handling system: his decision to chunk segments into 512-byte pieces tripled the number of packets the Internet had to carry per unit of information transferred, and his decision to open a unique TCP stream for every object (section of text or graphic image) on a web page required each part of each page to load in TCP’s “slow start” mode. Carriers massively expanded the capacity of their pipes in a vain attempt to speed up web pages, since the poor performance was designed into Sir Tim’s protocol. Hence the term “world-wide wait” was coined to describe the system, and more experienced engineers had to produce HTTP 1.1 to eliminate the tortured delay. This is not to bash His Eminence, but rather to point out that all of us, even the geniuses, have limited knowledge.
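
To make the contrast concrete, here’s a minimal Python sketch of the two connection strategies; the host and asset paths are stand-ins invented for illustration. The HTTP 1.0 pattern pays a TCP handshake and a fresh slow-start ramp for every object, while the HTTP 1.1 pattern ramps up once and reuses the connection.

    import http.client

    objects = ["/", "/style.css", "/logo.png"]  # hypothetical page assets

    # HTTP 1.0 style: a new TCP connection (and a new slow-start ramp)
    # for every object on the page.
    for path in objects:
        conn = http.client.HTTPConnection("example.com")
        conn.request("GET", path, headers={"Connection": "close"})
        conn.getresponse().read()
        conn.close()  # full TCP teardown; the next object starts cold

    # HTTP 1.1 style: one persistent connection reused for every object,
    # so the congestion window only has to open up once.
    conn = http.client.HTTPConnection("example.com")
    for path in objects:
        conn.request("GET", path)
        conn.getresponse().read()  # drain the response; connection stays open
    conn.close()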

At a House of Lords roundtable last week, Sir Tim took up a new cause by way of complaining about one of the ways that personal information may be obtained on the Internet:

Speaking at a House of Lords event on the 20th anniversary of the invention of the World Wide Web, Berners-Lee said that deep packet inspection was the electronic equivalent of opening people’s mail.

“This is very important to me, as what is at stake is the integrity of the internet as a communications medium,” Berners-Lee said on Wednesday. “Clearly we must not interfere with the internet, and we must not snoop on the internet. If we snoop on clicks and data, we can find out a lot more information about people than if we listen to their conversations.”

Deep packet inspection involves examining both the data and the header of an information packet as it passes a ‘black box’ on a network, in order to reveal the content of the communication.

Like many opponents of the scary-sounding “deep packet inspection,” His Eminence confuses means and ends. There are many ways to obtain personal information on the Internet; the preceding post was about one of them. Given the choice, most of us would gladly surrender some level of information in order to obtain free services or simply better-targeted ads. As long as the Internet is considered a bastion of “free” (actually, “advertising-supported”) culture and information, personal information gathering will be the coin of the realm. So it doesn’t much matter whether my privacy is violated by a silly packet-snooping system that I can easily thwart by encrypting my data or by an overly invasive ad placement system; it’s gone either way. If he’s manic about privacy, he should address the practice of information-gathering itself and not simply one means of doing it.
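
To illustrate how easily the packet-snooping variety is thwarted, here’s a minimal Python sketch using example.com as a stand-in host: once the session is wrapped in TLS, a box in the middle of the network sees only ciphertext.

    import socket
    import ssl

    # Open a TCP connection and wrap it in TLS; everything sent after the
    # handshake is encrypted on the wire.
    ctx = ssl.create_default_context()
    with socket.create_connection(("example.com", 443)) as raw:
        with ctx.wrap_socket(raw, server_hostname="example.com") as tls:
            tls.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n"
                        b"Connection: close\r\n\r\n")
            page = tls.recv(4096)  # readable here, opaque to the network

    # A DPI box can still see packet sizes, timing, and the server name in
    # the handshake, but not the clicks and data themselves.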

Nonsense is not unknown in the House of Lords, however. One of the most entertaining debates in the history of Western democracy took place in that august body, the infamous UFO debate:

The big day came on 18 January 1979 in the middle of a national rail strike. But the industrial crisis did nothing to dampen interest in UFOs. The debate was one of the best attended ever held in the Lords, with sixty peers and hundreds of onlookers – including several famous UFOlogists – packing the public gallery.

Lord Clancarty opened the three hour session at 7pm “to call attention to the increasing number of sightings and landings on a world wide scale of UFOs, and to the need for an intra-governmental study of UFOs.” He wound up his speech by asking the Government reveal publicly what they knew about the phenomenon. And he appealed to the Labour Minister of Defence, Fred Mulley, to give a TV broadcast on the issue in the same way his French counterpart, M. Robert Galley, had done in 1974.

The pro-UFO lobby was supported eloquently by the Earl of Kimberley, a former Liberal spokesman on aerospace, who drew upon a briefing by the Aetherius Society for his UFO facts (see obituary, FT 199:24). Kimberley’s views were evident from an intervention he made when a Tory peer referred to the Jodrell Bank radio telescope’s failure to detect a single UFO: “Does the noble Lord not think it conceivable that Jodrell Bank says there are no UFOs because that is what it has been told to say?”

More than a dozen peers, including two eminent retired scientists, made contributions to the debate. Several reported their own sightings including Lord Gainford who gave a good description of the Cosmos rocket, “a bright white ball” like a comet flying low over the Scottish hills on New Year’s Eve. Others referred to the link between belief in UFOs and religious cults. In his contribution the Bishop of Norwich said he was concerned the UFO mystery “is in danger of producing a 20th century superstition” that sought to undermine the Christian faith.

Perhaps their Lordships will invite His Eminence to observe an actual debate on Internet privacy, now that he’s set the stage with the roundtable. I think it would be absolutely smashing to see 40 of Bertie Wooster’s elderly uncles re-design the Web. Maybe they can add a comprehensive security model to the darned thing.

On a related note, Robb Topolski presented the worthies with a vision of the Web in a parallel universe that sent many scurrying back to their country estates to look after their hedgehogs. Topolski actually spoke about North American gophers, but the general discussion brings to mind the hedgehog’s dilemma of an open, advertising-supported Internet: a system that depends on making the private public is easily exploited.

UPDATE: Incidentally, Topolski’s revisionist history of the Web has been harshly slapped down by Boing Boing readers, who should be a friendly audience:

Huh? What a bizarre claim. Is he saying that network admins weren’t capable of blocking port 80 when HTTP was getting off its feet?!?

Wha? Even ignoring the fact that network admins at the time _did_ have the tools to block/filter this kind of traffic, this would still have little or nothing to do with endpoint computing power.

Oh, man. This is defintely junk.

Revisionist history in the name of greater freedom is still a lie.

Follow this link to a discussion from 1993 about how to make a Cisco firewall block or permit access to various Internet services by port. HTTP isn’t in the example, but the same rules apply. The power was clearly there.

Welcome to the NAF, Robb, do your homework next time.

Opting out of AdSense

Regular readers are aware that this blog used to feature Google ads. We never made serious money from AdSense, so it was easy to decide to drop it when the Terms and Conditions of Google’s new behavioral advertising campaign were released. Here’s what Google suggests re: a privacy disclosure:

What should I put in my privacy policy?

Your posted privacy policy should include the following information about Google and the DoubleClick DART cookie:

* Google, as a third party vendor, uses cookies to serve ads on your site.
* Google’s use of the DART cookie enables it to serve ads to your users based on their visit to your sites and other sites on the Internet.
* Users may opt out of the use of the DART cookie by visiting the Google ad and content network privacy policy.

Because publisher sites and laws across countries vary, we’re unable to suggest specific privacy policy language. However, you may wish to review resources such as the Network Advertising Initiative, or NAI, which suggests the following language for data collection of non-personally identifying information:

We use third-party advertising companies to serve ads when you visit our website. These companies may use information (not including your name, address, email address, or telephone number) about your visits to this and other websites in order to provide advertisements about goods and services of interest to you. If you would like more information about this practice and to know your choices about not having this information used by these companies, click here.

You can find additional information in Appendix A of the NAI Self-Regulatory principles for publishers (PDF). Please note that the NAI may change this sample language at any time.

People don’t come to this site to buy stuff, and they shouldn’t have to undergo a vexing decision-making process before visiting this blog, so we’ve dropped Google as an advertiser. Not because Google is Evil, but simply because this is one too many hoops for our readers to jump through. Plus, the commission rate sucks.

So please continue to read Broadband Politics without fear of being reported to Big Brother.

This is not a misprint

How many companies are actually increasing headcount these days? Not many, but AT&T is adding 3,000 jobs to expand its 3G network:

Despite a capex cut of up to $3 billion this year, AT&T Inc. (NYSE: T) made it clear today that it intends to spend to improve and expand its 3G network — adding 3,000 jobs in the process to support “mobility, broadband, and video.”

This comes on the heels of an announcement last December of a cut of 12,000 jobs, so it’s not quite as against-the-grain as it might seem. Still, it’s good news for 3,000 people and a counter-indicator of permanent global economic collapse.

A little bit breathless

The UK has offered some language to the EU regulators on Internet services that would clarify the relationship between users and providers and require full disclosure of management practices by the latter. The measure addresses the prime source of friction between the package of end-user freedoms and the network management exception that we currently have in the US, absent a coherent regulatory framework for Internet services.

Most of us would probably say, after reading the whole package, that consumer rights are advanced by it. But most of us aren’t fire-breathing neutrality monsters who can’t be bothered with the practical realities of network operation. The actual document the Brits are circulating is here; pay special attention to the Rationale.

The operative language establishes the principle that there are in fact limits to “running the application of your choice” and “accessing and sharing the information of your choice” on the Internet, which is simply stating some of the facts of life. If you’re not allowed to engage in identity theft in real life, you’re also not allowed to do so on the Internet; if you’re not allowed to violate copyright in real life, you’re also not allowed to do so on the Internet; and so on. Similarly, while you’re allowed to access the legal content and services of your choice, you’re not allowed to access them at rates that exceed the capacity of the Internet or any of its component links at any given moment, nor without the finite delays inherent in moving a packet through a mesh of switches, nor with such frequency as to pose a nuisance to the Internet Community as a whole or to your immediate neighbors. Such is life.
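
For a sense of scale on those finite delays, here’s a back-of-the-envelope Python sketch; the link speed, hop count, and distance are invented for illustration, not measurements of any real path.

    # Store-and-forward delay across a mesh of switches, before any
    # queuing under load is even considered.
    PACKET_BITS = 1500 * 8      # one full-size Ethernet frame
    LINK_RATE = 100e6           # 100 Mbps links (assumed)
    HOPS = 10                   # switches/routers on the path (assumed)
    DISTANCE_M = 3_000_000      # ~3,000 km end to end (assumed)
    SPEED_IN_FIBER = 2e8        # roughly 2/3 the speed of light, in m/s

    serialization = HOPS * (PACKET_BITS / LINK_RATE)  # re-clocked at each hop
    propagation = DISTANCE_M / SPEED_IN_FIBER
    print(f"serialization: {serialization * 1e3:.2f} ms")  # ~1.2 ms
    print(f"propagation:   {propagation * 1e3:.2f} ms")    # ~15 ms
    # No packet can arrive sooner than this; congestion only adds to it.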

In place of the current text, which touts the freedoms without acknowledging the existing legal and practical limits on them, the amendment would require the carriers to disclose service-plan limits and actual management practices.

So essentially what you have here is the replacement of a statement that does not accurately describe reasonable expectations of the Internet experience with one that does. You can call it the adoption of a reality-based policy statement over a faith-based one. Who could be upset about this?

Plenty of people, as it turns out. A blog called IPtegrity is hopping mad:

Amendments to the Telecoms Package circulated in Brussels by the UK government, seek to cross out users’ rights to access and distribute Internet content and services. And they want to replace it with a ‘principle’ that users can be told not only the conditions for access, but also the conditions for the use of applications and services.

…as is science fiction writer and blogger Cory Doctorow:

The UK government’s reps in the European Union are pushing to gut the right of Internet users to access and contribute to networked services, replacing it with the “right” to abide by EULAs.

…and Slashdot contributor Glyn Moody:

UK Government Wants To Kill Net Neutrality In EU
…The amendments, if carried, would reverse the principle of end-to-end connectivity which has underpinned not only the Internet, but also European telecommunications policy, to date.’

The general argument these folks make is that the Internet’s magic end-to-end argument isn’t just a guideline for developers of experimental protocols (as I’ve always thought it was), but an all-powerful axiom that confers immunity from the laws of physics and economics as well as those of human legislative bodies. Seriously.

So what would you rather have, a policy statement that grants more freedoms to you than any carrier can actually provide, or one that honestly and truthfully discloses the actual limits to you? This, my friends, is a fundamental choice: live amongst the clouds railing at the facts or in a real world where up is up and down is down. Sometimes you have to choose.

H/T Hit and Run.

The Fiber Formula

In part three of Saul Hansell’s series on broadband in the Rest of the World, we learn that taxpayers in the fiber havens are doing all the heavy lifting:

But the biggest question is whether the country needs to actually provide subsidies or tax breaks to the telephone and cable companies to increase the speeds of their existing broadband service, other than in rural areas. Many people served by Verizon and Comcast are likely to have the option to get super-fast service very soon. But people whose cable and phone companies are in more financial trouble, such as Qwest Communications and Charter Communications, may well be in the slow lane to fast surfing. Still, it’s a good bet that all the cable companies will eventually get around to upgrading to the faster Docsis 3 standard and the phone companies will be forced to upgrade their networks to compete.

The lesson from the rest of the world is that if the Obama administration really wants to bring very-high-speed Internet access to most people faster than the leisurely pace of the market, it will most likely have to bring out the taxpayers’ checkbook.

None of this should come as a surprise to our regular readers. Businesses invest in fiber infrastructure on a 20-year basis, and government subsidies can compress the investment timeline to one tenth of that. And Hansell finds that a lot of the foreign spending is driven by nationalist pride rather than more prudent factors. The problem I have with massive government spending on ultra-highspeed fiber projects is the conflicting priorities. I like fast networks, but I know that my tastes and interests aren’t the universal ones. And then there’s the question of utility: mobile networks aren’t as fast as locked-down fiber, but they’re an order of magnitude more useful.

So why don’t we strive to make the US number one in wireless, and leave the fiber race to the smaller nations? The long-term benefits of pervasive, high-speed wireless are much greater than those of heavily subsidized (and therefore heavily regulated) stationary networks.

Explaining the Price Gap

This is old news to those of you who read the other sources of broadband politics news on the new-fangled world wide computernet, but the esteemed Saul Hansell (a sometime reader of this blog) has released the second part of his analysis of American broadband, addressing the pricing issue. Broadband is cheaper in other countries due to subsidies and differences in demographics, but also because of unbundling, the practice of requiring carriers to offer competitors wholesale access to their networks:

Unbundling can be seen as a slightly disguised form of price regulation. Profits dropped. Many of the new entrants have found it difficult to build sustainable businesses, while margins for the incumbent phone companies have been squeezed as well.

It’s not exactly clear, however, that this approach is in the public’s long-term interest. Phone companies have less incentive to invest and upgrade their networks if they are going to be forced to share their networks.

Some argue that this is the main reason that there is little investment in bringing fiber to homes in Europe. “Investing in fiber is a huge risk,” Kalyan Dasgupta, a London-based consultant with LECG, wrote me in an e-mail, “and the prospect of taking that risk alone, but having to ’share’ the rewards with other players, is not a prospect that most rational businesses would consider.”

Britain, which has been the biggest proponent of line sharing, has decided to deregulate the wholesale price BT can charge for fiber, so long as it doesn’t favor its own brand of Internet service.

Like any form of price control, unbundling produces short-term gains in access diversity at the expense of long-term investment. Adopting this approach ultimately requires the government to bear the cost of infrastructure improvements, as it ceases to be a rational use of investor dollars to build out enhancements that don’t produce substantial returns in a non-monopoly market. Many of the folks seeking net neutrality regard broadband as a utility, and this becomes a self-fulfilling prophecy: if we treat it that way, that’s what it becomes.

Just as our electric utility networks include less-efficient generating plants that belch excessive amounts of CO2 into the air because the regulators won’t approve rate hikes to pay replacement costs, so too will price-capping broadband stifle innovation in transport networks.

Debunking the Broadband Gap

Today we learn, via Saul Hansell at Bits Blog, that the US isn’t as far behind the Rest of the World with broadband as was previously thought:

Even without any change in government policies, Internet speeds in the United States are getting faster. Verizon is wiring half its territory with its FiOS service, which strings fiber optic cable to people’s homes. FiOS now offers 50 Mbps service and has the capacity to offer much faster speeds. As of the end of 2008, 4.1 million homes in the United States had fiber service, which puts the United States right behind Japan, which has brought fiber directly to 8.2 million homes, according to the Fiber to the Home Council. Much of what is called fiber broadband in Korea, Sweden and until recently Japan, only brings the fiber to the basement of apartment buildings or street-corner switch boxes.

Actual download speeds are more important than raw signaling rates: the United States averages 5.2 Mbps, Japan 16.7 Mbps, Sweden 8.8 Mbps, and Korea 7.2 Mbps. There’s a gap all right, but it’s not nearly as large as we’ve been led to believe.

In fact, the gap is entirely consistent with population density and the extent of government subsidies.

Spectrum 2.0 panel from eComm

Courtesy of James Duncan Davidson, here’s a snap from the Spectrum 2.0 panel at eComm09.

Maura Corbett, Rick Whitt, Peter Ecclesine, Darrin Mylet, and Richard Bennett at eComm

The general discussion was about the lessons learned from light licensing of wireless spectrum in the US, the success of Wi-Fi and the failure of UWB, and what we can realistically hope to gain from the White Spaces licensing regime. As a person with a foot in both camps, technical and regulatory, I found it an interesting exercise in contrasting the ways engineers and policy people deal with these issues. In general, hard-core RF engineer Peter Ecclesine and I were the most pessimistic about White Space futures, while the policy folks still see the FCC’s Report and Order as a victory.

In lobbying, you frequently run into circumstances where the bill you’re trying to pass becomes so heavily encumbered with amendments that it’s not worth passing. Rather than get your policy vehicle adopted in a crippled form, it’s better in such circumstances to take it off the table and work with the decision-makers to revive it in a future session without the shackles. While this is a judgment call – sometimes you go ahead and take the victory hoping to fix it later – it’s dangerous to pass crippled bills in a tit-for-tat system because you’re conceding a win in the next round to the other side.

I suggested that the FCC’s order was so badly flawed that the best thing for White Space Liberation would be for the court to void the order and the FCC to start over. This message wasn’t well received by Rick Whitt, but I had the feeling Peter was on board with it.

The problem with the White Spaces is that the FCC couldn’t make up its mind whether these bands are best used for home networking or for a Third (or is it fourth or fifth?) pipe. The power limits (40 milliwatts to 1 watt) doom them to home networking use only, which simply leads to more fragmentation in the home net market and no additional WAN pipes. That’s not the outcome the champions of open networks wanted, but it’s what they got.
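
A rough link budget shows why. The sketch below applies the standard free-space path loss formula at a nominal 600 MHz white-space channel; the distances, the 0 dBi antennas, and the free-space assumption itself are simplifications, so treat the numbers as illustrative only.

    import math

    def fspl_db(distance_km: float, freq_mhz: float) -> float:
        """Free-space path loss in dB (distance in km, frequency in MHz)."""
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

    def rx_dbm(tx_mw: float, distance_km: float, freq_mhz: float = 600.0) -> float:
        """Received power in dBm, assuming 0 dBi antennas at both ends."""
        return 10 * math.log10(tx_mw) - fspl_db(distance_km, freq_mhz)

    for tx_mw in (40, 1000):            # 40 mW portable vs. 1 W fixed
        for d_km in (0.05, 1.0, 5.0):   # 50 m, 1 km, 5 km
            print(f"{tx_mw:5} mW at {d_km:4} km -> {rx_dbm(tx_mw, d_km):6.1f} dBm")
    # At 40 mW the signal fades to marginal levels beyond home-network
    # distances, which is why the limits favor in-home use over a new pipe.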

eComm, incidentally, is a terrific conference. The focus is very much on the applications people are developing for mobile phones, and it’s essential for people like me who build networks to see what people want to do with them, especially the things they can’t do very well today. Lee Dryburgh did a fantastic job of organizing the conference and selecting speakers, and is to be congratulated for putting on such a stellar meeting of the minds.

At long last, Genachowski

The long-awaited nomination of Julius Genachowski to the FCC chair finally came to pass yesterday, raising questions about the delay. If everybody with an interest in telecom and Internet regulation knew he was the choice months ago, why did the official announcement take so long? I have no inside information, so I’ll leave it to those who do to enlighten us on that question. Perhaps the Administration was just being extra cautious after the debacles around its Commerce Secretary nominees and other appointments.

Neutralists are excited about the choice, naturally, as they view Genachowski as one of their own. And indeed, if network neutrality were actually a coherent policy and not just a rag-tag collection of Christmas wishes, they would have cause to be exhilarated. But given the range of restrictions the movement seeks, it’s less than clear that any particular raft of regulations would satisfy them while leaving broadband networks able to function, so we’ll see how this pans out. We’re already hearing rumblings from Boucher that there may not be any Congressional action on network neutrality this year in any case.

Genachowski brings an interesting (and potentially very dangerous) set of qualifications to the job. A college buddy of the President, he’s an inner-circle member with the power to wield enormous influence. As a former FCC staffer, he’s imbued with the Agency’s culture, and as a former venture capitalist funding fluffy applications software, he’s something of a tech buff. But he resembles Kevin Martin in most of the important respects: he’s a Harvard lawyer who’s worked inside the regulatory system for most of his life, and he has strong alliances with an industry that seeks to exercise control over the nation’s network infrastructure for its own purposes. Whether those purposes resemble the public interest remains to be seen.

The largest problem with the FCC and similar agencies is the knowledge gap between regulators and the modern broadband networks that are the subject of their regulatory power. Martin didn’t have the training to appreciate the effect that his orders would have on the infrastructure, and neither does Genachowski. So the new Chairman is just as likely as the old chairman to make things worse while trying to make them better.

In a perfect world, the commissioners would be able to rely on the expert judgment of the Chief Technologist to stay out of trouble, but the current occupant of that job, Jon Peha, has a penchant for playing politics that renders him ineffective. The bizarre, quixotic inquiry the FCC made recently into the quality of service variations between Comcast’s voice service and over-the-top VoIP is an example. This isn’t a serious line of inquiry for a serious Commission, and Peha never should have let it happen. But it did, and that fact should remind us that the FCC is more a creature of politics than of technology.