The Register stumbled upon an eye-opening report commissioned by the UK telecom regulator, Ofcom, on sources of Wi-Fi interference in the UK:
What Mass discovered (pdf) is that while Wi-Fi users blame nearby networks for slowing down their connectivity, in reality the problem is people watching retransmitted TV in the bedroom while listening to their offspring sleeping, and there’s not a lot the regulator can do about it.
Outside central London that is: in the middle of The Smoke there really are too many networks, with resends, beacons and housekeeping filling 90 per cent of the data frames sent over Wi-Fi. This leaves only 10 per cent for users’ data. In fact, the study found that operating overheads for wireless Ethernet were much higher than anticipated, except in Bournemouth for some reason: down on the south coast 44 per cent of frames contain user data.
When 90% of the frames are overhead, the technology itself has a problem, and in this case it’s largely the fact that there’s such a high backward-compatibility burden in Wi-Fi. Older versions of the protocol weren’t designed for obsolescence, so newer systems have to take expensive steps to make sure the older ones can see them, or collisions happen, and that’s not good for anybody. Licensed spectrum can deal with the obsolescence problem by replacing older equipment; open spectrum has to bear the costs of compatibility forever. So this is one more example of the fact that “open” is not always better.
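To get a feel for the size of that tax, here’s a rough back-of-envelope sketch. The frame sizes, rates and preamble times below are generic illustrative assumptions, not figures from the Ofcom report, and it ignores ACKs, SIFS and contention, but it shows why a single protection frame sent at legacy rates is so costly.

```python
# Back-of-envelope airtime cost of 802.11b "protection" before an 802.11g frame.
# Rough sketch only: ignores SIFS, ACKs, contention and OFDM padding, and the
# rates/sizes below are illustrative assumptions, not measurements.

def dsss_airtime_us(payload_bytes, rate_mbps, preamble_us=192):
    """802.11b (DSSS) frame airtime: long preamble/PLCP header plus payload bits."""
    return preamble_us + payload_bytes * 8 / rate_mbps

def ofdm_airtime_us(payload_bytes, rate_mbps, preamble_us=20):
    """802.11g (OFDM) frame airtime: preamble/SIGNAL plus payload bits (no padding)."""
    return preamble_us + payload_bytes * 8 / rate_mbps

cts = dsss_airtime_us(14, 1)       # CTS-to-self, 14 bytes, sent at a 1 Mb/s basic rate
data = ofdm_airtime_us(1538, 54)   # ~1500-byte payload plus MAC header at 54 Mb/s

print(f"CTS-to-self protection: {cts:.0f} us")
print(f"1500-byte data frame at 54 Mb/s: {data:.0f} us")
print(f"Protection share of that exchange: {cts / (cts + data):.0%}")
# With these assumptions the protection frame alone eats roughly half the airtime,
# which is the backward-compatibility tax described above.
```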
This is why my main home wireless network is 5 GHz-only, 802.11n-only.
I have a separate b/g router purely for the Wii/DS/etc. that can’t use 5 GHz-band n.
I use a similar setup – a Linksys WRT610N dual-band concurrent access point with the 5 GHz radio as my preferred SSID. The 5 GHz network is for my own stuff, and the 2.4 GHz one is for whoever else may be around. No interference at all, since my neighbors are stuck in the 20th century.
I wonder if it would be wise to turn beacons down to once every 500 ms as opposed to the default once every 100 ms.
Also, would putting APs in 802.11g (and above) mode fix this problem? Or will it require a full move to 802.11n to fix it?
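On the beacon interval question: a rough airtime estimate suggests beacons are a fairly small slice of the problem per AP, though it adds up when dozens of networks share a channel. The beacon size and rate below are illustrative assumptions, not measured values.

```python
# Rough estimate of what beacon traffic costs in airtime, to put 100 ms vs 500 ms
# in perspective. Sizes and rates are illustrative assumptions.

def beacon_airtime_fraction(interval_ms, beacon_bytes=250, rate_mbps=1, preamble_us=192):
    """Fraction of channel time one AP's beacons occupy at a given interval."""
    airtime_us = preamble_us + beacon_bytes * 8 / rate_mbps   # one beacon on the air
    beacons_per_sec = 1000 / interval_ms
    return airtime_us * beacons_per_sec / 1_000_000

for interval in (100, 500):
    print(f"{interval} ms interval: ~{beacon_airtime_fraction(interval):.2%} of airtime per AP")
# ~2% per AP at 100 ms vs ~0.4% at 500 ms -- noticeable only when many APs share a
# channel, while a long interval makes scanning and power-save clients more sluggish.
```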
There are a FEW 802.11g implementations in which “802.11b protection” — the sending of CTS packets prior to transmission of 802.11g packets by the AP — can be turned off. But this is, technically, a standards violation and most APs can’t do it.
Brett, every 11g AP I’ve tested can run CTS protection in “auto” mode, where it’s based on 11b sensing via beacons, or in “off” mode, where it doesn’t care. You can’t get Wi-Fi certification if you can’t sense 11b nodes and act accordingly. I don’t recommend turning this feature off, as it really does help. 11g-only mode just prevents 11b nodes from associating. The real fix is to retire 11b nodes entirely.
Richard: You’re correct that every standards-conforming 802.11g AP can do “802.11b protection” by sending CTS frames. The problem is that not every one can turn it off. This means that if you get a single 802.11b node on your network, it slows down dramatically.
So even when you put an AP in 802.11g-only mode and choose not to support 802.11b, you’re telling me that it will still send the wasteful CTS frames unless you’re running some non-Wi-Fi-compliant firmware?
The protection frames aren’t “wasteful” if you’re sharing a channel with an active .11b node – they prevent collisions, which is a good thing. But if the level of traffic from the .11b node is low and yours is high, they do become an excessive overhead. All the .11g/n routers I’ve checked – around a dozen or so – allow this feature to be turned off, so it’s not a big deal either way. The real issue is the presence of .11b nodes, because they transmit at such low rates that they hog the channel.
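To put rough numbers on that last point, here’s a minimal sketch of how long the same 1500-byte frame sits on the air at legacy versus OFDM rates (illustrative preamble and rate values, ignoring ACKs and contention).

```python
# Why a .11b node "hogs the channel": airtime for the same 1500-byte frame at
# legacy vs OFDM rates. Illustrative numbers only (long preamble, no ACK/backoff).

FRAME_BYTES = 1500

for label, rate_mbps, preamble_us in [("802.11b @ 1 Mb/s", 1, 192),
                                      ("802.11b @ 11 Mb/s", 11, 192),
                                      ("802.11g @ 54 Mb/s", 54, 20)]:
    airtime_ms = (preamble_us + FRAME_BYTES * 8 / rate_mbps) / 1000
    print(f"{label}: {airtime_ms:.2f} ms on the air")
# Roughly 12 ms vs 0.24 ms: one low-rate transmission occupies the channel about as
# long as ~50 fast ones, which is why retiring .11b nodes matters more than any
# protection-frame setting.
```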
The protection frames are wasteful, because there should be no need for them. The original standard should have anticipated more efficient modulation techniques and created a spectrum etiquette that was independent of them.
In any event, some of the points made in the interference paper are spot on. I’ve seen Wi-Fi disrupted in a surprisingly large area by a single wireless “security” camera.

And my ISP has had to do some serious engineering to work around massive interference from outdoor video and TDM links which created continuous interference over a wide section of an unlicensed band. In one case, we had to change the polarization of the antenna, move to the other end of the band, drop from 802.11g to 802.11b (which allows a little more output power and has more robust modulation) and then add a high-Q channel filter to keep an access point from being totally “blinded” by the interference. And then, we discovered that even the clients of that access point — which had highly directional antennas — were still being bamboozled by the noise, even though they too were cross-polarized to the noise and were on the most distant possible channel within the band. (We finally found a single brand and model of radio which had good enough filtering to work… and then the manufacturer discontinued it because they didn’t want to make radios that were 802.11b-only.)

Bottom line: the FCC’s failure to establish a common spectrum etiquette almost kept us from serving a large rural area that really needed us.