About polluting the 5 GHz WiFi band

by Michael Tremer, May 10, 2014


It is time for another rant about some emerging technology. I do not apologise for it in advance. It feels rather liberating to explain what (in my opinion) is going wrong, and it makes me happy that I am sometimes able to open the eyes of some of you who are reading this.

Today: Don’t believe what marketing is telling you – or Gigabit WiFi 802.11ac

Since the beginning of WiFi, we have all known that we cannot believe what is printed on the boxes of access points. 802.11g claimed to transfer 54 MBit/s, but the usable rate was roughly half of that. We knew it right from the beginning and were fine with the fact that CSMA/CA eats up about half of the available bandwidth. Now there are so-called “Dual Band” routers in the shops with “up to 900 MBit/s”, where the vendors simply add up the bandwidth that is theoretically possible on each band – although a single client can never use that combined bandwidth, because it only transmits on one band at a time.
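
To make the marketing arithmetic explicit, here is a tiny Python sketch. The figures are the nominal rates quoted above, not measurements:

```python
# Nominal rates from the discussion above; not measurements.
G_PHY_RATE = 54.0         # MBit/s, what the 802.11g box promises
CSMA_CA_EFFICIENCY = 0.5  # roughly half is lost to medium access overhead

print(f"802.11g in practice: ~{G_PHY_RATE * CSMA_CA_EFFICIENCY:.0f} MBit/s")

# "Dual Band" marketing simply adds both bands together, although a
# single client only ever transmits on one of them at a time.
band_2g4 = 450.0  # MBit/s, e.g. 3-stream 802.11n on 2.4 GHz
band_5g = 450.0   # MBit/s, e.g. 3-stream 802.11n on 5 GHz
print(f"On the box: up to {band_2g4 + band_5g:.0f} MBit/s")
print(f"One client sees at most: {max(band_2g4, band_5g):.0f} MBit/s (before overhead)")
```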

No wonder that vendors finally want to break through the wall that is Gigabit WiFi. It is “up to 1.75 GBit/s”, or “1.3 GBit/s” if you leave out the 2.4 GHz band, and it is supposed to carry all sorts of HD television streams around your home network.

The broken 2.4 GHz band

The magic word is 802.11ac, the successor of the 802.11n standard, which finally made wireless LAN usable. 802.11b and 802.11g were limited to the 2.4 GHz band, which on the one hand is license-free in most parts of the world, and on the other hand is used for loads of other applications like garage door openers, Bluetooth and microwave ovens – precisely because it is license-free. In a typical household, this band is “polluted” with radio waves. If you live in a bigger city, it is almost unusable, because there are hundreds of wireless networks you can receive, people warming up meals, and what not.

The wireless radio devices then do a very funny thing: they stop sending data and wait until the channel they are transmitting on is free (this is part of CSMA/CA). If it is not free, they will wait… and wait… and wait… until the meal is cooked. It is easy to see that the solution is to move out of that frequency band and leave it to the other applications. Wireless networks have become more important, and more spectrum is needed for them.
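
As a rough illustration of that “wait… and wait…” behaviour, here is a toy model of the CSMA/CA listen-before-talk logic in Python. It is a deliberately simplified sketch (real 802.11 timing is more involved), and the channel_is_busy callback merely stands in for the actual carrier-sense hardware:

```python
import random
import time

def csma_ca_send(channel_is_busy, max_retries=7, slot_time_us=9):
    """Toy model of CSMA/CA listen-before-talk: wait for a free channel,
    back off a random number of slots, and double the window on failure."""
    contention_window = 15  # initial contention window for 802.11 OFDM PHYs
    for _ in range(max_retries):
        # Carrier sense: as long as someone else transmits (or a microwave
        # oven radiates into the band), we simply keep waiting.
        while channel_is_busy():
            time.sleep(slot_time_us / 1e6)
        # Random backoff spreads out stations that were all waiting.
        time.sleep(random.randint(0, contention_window) * slot_time_us / 1e6)
        if not channel_is_busy():
            return True  # medium is free: transmit the frame now
        contention_window = min(2 * contention_window + 1, 1023)
    return False  # give up after too many retries

# Example: a channel that is busy 70% of the time.
busy = lambda: random.random() < 0.7
print("frame sent" if csma_ca_send(busy) else "gave up")
```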

The alternative, 802.11a, has been around for about as long as 802.11g and is basically the same, except that it uses another frequency band at about 5 GHz. Unfortunately, only a few devices were able to use it, and therefore it never gained any real market share. There are also some technical reasons for that, to which we will get in a minute.

It was in 2007 that draft-802.11n devices finally arrived (the standard itself was not ratified until 2009). The engineers saw that it was no longer an option to stick with the 2.4 GHz band and therefore added the 5 GHz band to the standard, so that two very similar standards like 802.11g and 802.11a would no longer have to co-exist. Devices do not necessarily need to support the 5 GHz band, but as of writing this article, I think the vast majority does. There are only very few devices left in the cheap market sector that support the 2.4 GHz band alone.

So, why didn’t they make a complete move to the 5 GHz band if it is so much better? Well, it isn’t. There are many downsides to using a higher frequency. The most basic one is that the range gets considerably shorter: free-space path loss grows with the square of the frequency. Higher frequencies also do not pass as easily through solid objects like concrete walls. To compensate for both, the transmit power is increased – up to 40 times the maximum allowed on the 2.4 GHz band. While this is not a real issue for access points, which can draw as much power as they want from the socket in the wall, it is a real struggle for mobile devices. The permitted radiated power for 802.11n in the 5 GHz band goes up to 4 W in the US – almost impossible to get this much out of the battery of a mobile phone.
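
That range penalty can be put into rough numbers. Even ignoring walls entirely, the free-space path loss alone costs about 7 dB when moving from 2.4 GHz to 5 GHz. A quick back-of-the-envelope calculation:

```python
from math import log10, pi

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * log10(distance_m) + 20 * log10(freq_hz) + 20 * log10(4 * pi / C)

d = 20.0  # metres; an arbitrary indoor-ish distance
loss_24 = fspl_db(d, 2.437e9)  # channel 6 in the 2.4 GHz band
loss_5 = fspl_db(d, 5.500e9)   # a mid-band 5 GHz channel
print(f"2.4 GHz: {loss_24:.1f} dB, 5 GHz: {loss_5:.1f} dB")
print(f"5 GHz penalty: {loss_5 - loss_24:.1f} dB")  # ~7 dB, i.e. ~5x the power
```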

The engineers had high throughput in mind when they designed 802.11n. It had been under development since 2002, and so the standard has a bit of history: when the group first met, it was called the “High Throughput Study Group” (HTSG). With that goal already in the name, they invented solutions and tricks to push the throughput as high as possible: technologies like beamforming were introduced, and MIMO (multiple input, multiple output) was added, so that access points and stations now have multiple antennas that can all transmit at the same time. The outcome is a maximum of 72.2 MBit/s per stream. With up to four MIMO streams, that makes a total of 288.9 MBit/s. Pretty great, right?
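
Those 72.2 MBit/s are not a magic number – they follow directly from the OFDM parameters of the standard. A small sketch of the arithmetic, using the top 802.11n modulation (64-QAM at coding rate 5/6) and the short guard interval:

```python
# PHY data rate = data subcarriers x bits per symbol x coding rate / symbol time.
# The figures below are the standard 802.11n parameters for the top rate,
# with the short guard interval (symbol time 3.6 microseconds).

def phy_rate_mbps(data_subcarriers, bits_per_symbol, coding_rate,
                  symbol_time_us=3.6):
    return data_subcarriers * bits_per_symbol * coding_rate / symbol_time_us

# 802.11n, 20 MHz channel: 52 data subcarriers, 64-QAM (6 bits), rate 5/6
per_stream = phy_rate_mbps(52, 6, 5 / 6)
print(f"per stream: {per_stream:.1f} MBit/s")          # 72.2
print(f"4 MIMO streams: {4 * per_stream:.1f} MBit/s")  # 288.9
```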

This theoretical throughput is the same in both bands. In practice, it will be much better in the 5 GHz band, because there are not as many other services causing interference. Maybe it is the sexiness of all that empty spectrum that made the designers want to use more of it. And that is basically done in a breeze: combining two channels.

Channel bandwidth

A common (802.11a/b/g/n) wireless channel is 20 MHz wide. In the 5 GHz band, the channels do not overlap, so you can simply pick the next one to the left or right and use it, too. In the 2.4 GHz band, however, the channel centres are only 5 MHz apart while a transmission is 20 MHz wide, so neighbouring channels overlap heavily: someone transmitting on channel 6 interferes with every channel from about 3 to 9, and only channels at least five numbers apart (such as 1, 6 and 11) stay clear of each other. Combining two channels therefore occupies 40 MHz – roughly half of the whole 2.4 GHz band. Doubling the bandwidth of the channel doubles the throughput as well, to 150 MBit/s per stream or 600 MBit/s in total. Even more impressive now, right?
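
To see the overlap concretely, here is a small, hypothetical helper that computes which 2.4 GHz channels a transmission touches, given the 5 MHz channel grid and the actual signal width (the 40 MHz example assumes a bonded channel centred between channels 6 and 10):

```python
# A 2.4 GHz channel n is centred at 2407 + 5*n MHz. Two signals overlap
# when their centres are closer than the sum of their half-widths.

def blocked_channels(centre_ch, width_mhz=20):
    """Return the 2.4 GHz channels (1-13) whose 20 MHz spectrum would
    overlap a transmission of the given width centred on centre_ch."""
    centre = 2407 + 5 * centre_ch  # MHz
    return [ch for ch in range(1, 14)
            if abs((2407 + 5 * ch) - centre) < (width_mhz + 20) / 2]

print(blocked_channels(6))                # 20 MHz on ch 6: [3, 4, ..., 9]
print(blocked_channels(8, width_mhz=40))  # bonded 40 MHz:  [3, 4, ..., 13]
```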

With those 600 MBit/s, it should be possible to outperform an ordinary 100 MBit/s Ethernet connection, and applications like large file transfers or streaming high-definition video come within reach. But all of that comes at a great cost: it uses up almost the entire 2.4 GHz frequency spectrum. The other applications and the wireless networks of your neighbours take their share and massively decrease the throughput again, because wireless networks are highly sensitive to interference and data loss on the radio link.

But hey, even if we could get 100 or 200 MBit/s of actual throughput, that would still be nice. The good news is that we can! Let’s just move to the 5 GHz band, where there is plenty of the spectrum we need. The shorter range is even beneficial, because we interfere less with the nearby wireless networks of our neighbours (so don’t buy bigger antennas). What else could we want?

Backwards-compatibility

There is always one device in your network that cannot use the 5 GHz band, so operating a second access point on the 2.4 GHz band is still necessary to connect these devices to the LAN. Some devices do not support the optional 40 MHz channel bandwidth and can only connect with 20 MHz. That is no big trouble, as it merely reduces the usable bandwidth for the other devices.

We will probably have to carry this legacy around for a long, long time. I suspect that 2.4 GHz won’t ever go away as long as there is an interest in building small, battery-powered devices that connect to the network: WiFi-enabled light bulbs, your fridge, temperature sensors and so on. I also count mobile phones and tablets into this category of devices that do not need high-throughput wireless networks.

We don’t need high-throughput wireless networks

How much data does your phone transfer in a day? How much data does your light bulb transfer in a day? Do you need hundreds of megabits per second for that? The answer is probably no. However, wireless networks are tuned to deliver exactly that, and only that.

The designers completely forgot to think about this class of devices – devices that do not transfer a lot of data and should therefore be able to save the energy that constant transmitting and receiving costs. Keep that in mind when I tell you that it will get much, much worse: 802.11ac.

802.11ac – The Gigabit WiFi standard

Up to this point, I have only talked about the older WiFi standards – how they evolved and which problems they tried to solve. While 802.11b was the first standard to make wireless networks usable in the home, 802.11g added much more bandwidth. It was 802.11n that made them really usable and stable, massively increased the bandwidth that was still too low with 802.11g, and basically made wireless networks what they are today. Now 802.11ac is coming, and it will ruin all of this.

Finally, 802.11ac drops the 2.4 GHz band entirely and uses only the 5 GHz band. That is everything positive I can say about 802.11ac.

While 802.11a/b/g came with a default channel bandwidth of 20 MHz (which equals one channel) and 802.11n is able to use 40 MHz (two channels), the minimum and default channel bandwidth of 802.11ac is 80(!) MHz – which equals four non-overlapping channels.

The authorities in Europe allow the unlicensed use of 16 channels in the 5 GHz band. With 802.11a, that is enough frequency spectrum for 16 individual wireless networks in the same spot without one interfering with another; with 802.11n and 40 MHz channels, up to eight. With 802.11ac, that leaves us with only four wireless networks in one spot that do not interfere with each other – and interference is really bad, especially for 802.11ac, as we will see in a moment. For comparison: in the 2.4 GHz band, three (20 MHz wide) channels fit without interfering with each other. Apparently that was not enough for most users, and now we are up to four in the 5 GHz band. This does not look like real progress to me…

It gets worse (of course): 802.11ac may also use 160 MHz wide channels. Doing the maths: with eight spatial streams and 160 MHz channel bandwidth, the aggregate data rate is 6,933 MBit/s, or roughly 7 GBit/s – at the price of blocking half of the available spectrum. With 80 MHz channels, it is about 3.5 GBit/s.
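
The same rate formula as in the 802.11n example above reproduces these numbers, now with 802.11ac’s 256-QAM and wider channels:

```python
# PHY rate = data subcarriers x bits per symbol x coding rate / symbol time.
# 802.11ac top rate: 256-QAM (8 bits) at coding rate 5/6, short guard interval.

def phy_rate_mbps(data_subcarriers, bits_per_symbol=8, coding_rate=5 / 6,
                  symbol_time_us=3.6):
    return data_subcarriers * bits_per_symbol * coding_rate / symbol_time_us

for width, subcarriers in [(80, 234), (160, 468)]:
    per_stream = phy_rate_mbps(subcarriers)
    print(f"{width} MHz: {per_stream:.1f} MBit/s per stream, "
          f"{8 * per_stream:.0f} MBit/s with 8 streams")

# 80 MHz:  433.3 MBit/s per stream -> ~3467 MBit/s with 8 streams
# 160 MHz: 866.7 MBit/s per stream -> ~6933 MBit/s with 8 streams
```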

Modulation

Great numbers. We do not even achieve these rates on wired connections.

If you are dreaming of the wonderful things you could do with your 802.11ac gear, then I am here to tell you the truth. It probably won’t surprise you that these bandwidths are theoretical maximums. The actually usable bandwidth is far below that – partly because of CSMA/CA, and partly because the station (your laptop, tablet, phone) would need eight antennas just like the access point, which is probably not going to happen.

When CSMA/CA decides that the time is right to start transmitting a packet, there are a lot of other radios listening on the same channel, trying to receive that packet (called a frame). A frame is nothing more than a string of bits, but those bits are not transmitted one after another – a so-called modulation is used. In the case of 802.11ac, this is 256-QAM in the optimal case, as opposed to 64-QAM in 802.11n. Without going into much detail, I want to point out that this is a pretty bad choice:

The modulation makes it possible to transmit a group of bits at once. The number indicates how many distinct symbols there are – 256 for 256-QAM, for instance. The more symbols there are, the more precisely the radio signal must be received in order to pick the right one. If there is interference during the transmission, a wrong symbol will be selected and the frame gets corrupted. Error-correction mechanisms will try to repair such errors, but they will not always succeed. In that case, a retransmission of the entire frame is necessary.
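
This trade-off can be quantified. For square M-QAM at equal average transmit power, the minimum distance between constellation points is d = sqrt(6 / (M - 1)) with the symbol energy normalised to 1 – a standard result. A short sketch:

```python
from math import log2, sqrt

# Closer constellation points mean less tolerance for noise: a small
# disturbance pushes the received signal onto a neighbouring symbol.
for m in (64, 256):
    bits = log2(m)
    d_min = sqrt(6 / (m - 1))  # minimum distance at unit average power
    print(f"{m}-QAM: {bits:.0f} bits/symbol, d_min = {d_min:.3f}")

# 64-QAM:  6 bits/symbol, d_min = 0.309
# 256-QAM: 8 bits/symbol, d_min = 0.153 -> half the spacing, i.e. roughly
# 6 dB more SNR required for only 2 extra bits per symbol.
```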

We know from TCP that retransmissions massively decrease throughput, and it is just the same with wireless networks. So the radio modules must be of very high quality – which we do not have yet. On top of that, the quality of the received signal gets worse with increasing channel bandwidth. 80 MHz is already very wide; now we are going up to 160 MHz.
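
A toy model shows how quickly this eats into the throughput: if every corrupted frame has to be retransmitted until it gets through, only a fraction of the airtime carries new data, and the goodput drops accordingly:

```python
# With retransmit-until-success, a frame needs 1/(1 - p) transmissions on
# average at frame error rate p, so only a (1 - p) share of the airtime
# carries new data. A crude model that ignores backoff and ACK overhead.

def goodput_mbps(phy_rate_mbps, frame_error_rate):
    return phy_rate_mbps * (1 - frame_error_rate)

for fer in (0.0, 0.1, 0.3, 0.5):
    print(f"FER {fer:.0%}: {goodput_mbps(433.3, fer):.0f} MBit/s "
          f"out of a 433.3 MBit/s (80 MHz) stream")
```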

There is no tolerance left for overdriven radios or for poor signal quality when one communication partner is nearly out of range. So 802.11ac is practically unusable outdoors, and the magnificent bandwidths printed on the boxes fall to bits and pieces – quite literally.

I am sure that we will very soon see faster access points everywhere. When that happens, the 5 GHz band will die the same death the 2.4 GHz band is dying now.

So the WiFi hardware that is available today does not (yet) achieve these phenomenal bandwidths, because we do not have the radio modules for it. The vendors advertise their access points with “up to 1.3 GBit/s” while only supporting three spatial streams and 80 MHz channels. A box full of compromises, not the bleeding edge of new technology that the advertising promises.

What for?

Devices with many antennas will now be able to transfer a few gigabits per second. TVs that can stream – yeah – nearly uncompressed HD video? Backups of large data collections over the air? Is it worth optimising our wireless networks for these use cases? Is it worth using so much of the frequency spectrum for a single wireless network?

I think it is not. The disadvantages are quite obvious: the majority of the devices in my wireless networks is of the mobile phone/tablet class – battery-powered devices that spend most of the time asleep in my pocket and use hardly any bandwidth at all. Nevertheless, the channel is blocked, which slows down and interferes with all the other wireless networks on the same channel. That causes not only reduced bandwidth, but also long latencies and a bad user experience. I guess you have figured out where I am going with this…

In the future, I want to see the much-praised Internet of Things. I want light bulbs that I can configure over the network to pick the colour I am in the mood for right now. I want a fridge that knows what is inside and sends me a message when I am running out of milk, and I want to read emails on my phone while lying on the sofa. We currently have a great wireless network standard for all of that – but 802.11ac is moving in the completely opposite direction.

802.11ac is not the successor of 802.11n. I am not saying that it is a technical failure; it just is one for the purpose I use wireless networks for – and, I think, for the purpose the majority of people use them. 802.11n provides us with enough bandwidth for now and strikes a great compromise between performance and power consumption on mobile devices. 802.11ac does not. For applications that need high bandwidth, a piece of cable – whether copper or fibre – will forever be the best solution. We simply do not have enough spectrum (or, as yet, the means to use it better) to afford transmitting everything over the air. It is exactly the same problem with LTE and 3G networks, which are running out of spectrum, too.

Maybe this failed design of 802.11ac will force us to support the old standards for a long time. I do not consider it a proper successor, and others do not either. There are so many problems that we will not see 802.11ac chipsets in many devices for a long time – maybe never, because for many applications it is completely unusable and makes no sense. Hence there will be 802.11n/a/g/b networks to support those devices, using even more spectrum in both bands.

If that is the future of WiFi, we are in severe trouble.

I hope you have learned something from this longish article. Apart from some details about how wireless networks work, I hope that you are now able to make your own decision and will reconsider deploying 802.11ac networks in your offices and homes. The marketing departments will tell you that it is the “new” one and that you want to have it. Their motivation is probably just selling new devices where there is in fact no benefit for the consumers. Don’t believe what marketing is telling you. And if you can, please stop your neighbours from buying 802.11ac devices, because those will slow down your networks, too.