I'm trying to create perfect screen captures from an SDR to explain the world of radio around us. In this blog post, I'm going to discuss some of the imperfect captures I'm getting so far, specifically some notes about WiFi and Bluetooth.
An SDR is a "software defined radio", which digitally samples radio waves and uses number crunching to decode the signal into data. Among the simplest things an SDR can do is look at a chunk of spectrum and measure signal strength. This is shown below, where I'm monitoring part of the famous 2.4 GHz spectrum used by WiFi/Bluetooth/microwave-ovens:
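That "number crunching" is mostly an FFT: the SDR hands you a stream of complex IQ samples, and a Fourier transform turns a block of them into the signal-strength-vs-frequency curve in the top pane. Here's a minimal sketch of that math using NumPy; the function name and the fake test signal are my own, just for illustration:

```python
import numpy as np

def power_spectrum_db(iq, sample_rate, center_hz):
    """Turn a block of complex IQ samples into (frequency, dB) points,
    essentially the math behind an SDR GUI's signal-strength pane."""
    n = len(iq)
    window = np.hanning(n)                     # taper to reduce spectral leakage
    spectrum = np.fft.fftshift(np.fft.fft(iq * window))
    power_db = 20 * np.log10(np.abs(spectrum) + 1e-12)
    # Bin frequencies are relative to the tuner's center frequency.
    freqs = center_hz + np.fft.fftshift(np.fft.fftfreq(n, d=1 / sample_rate))
    return freqs, power_db

# Fake IQ data: a tone 2 MHz above a 2.42 GHz center, plus a little noise.
rate, center = 20e6, 2.42e9
t = np.arange(4096) / rate
iq = np.exp(2j * np.pi * 2e6 * t) \
     + 0.01 * (np.random.randn(4096) + 1j * np.random.randn(4096))
freqs, power = power_spectrum_db(iq, rate, center)
peak_hz = freqs[np.argmax(power)]   # lands near 2.422 GHz
```

The waterfall pane is just many of these spectra stacked vertically over time, with dB mapped to color.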
There are two panes. The top shows the current signal strength as a graph. The bottom pane is the "waterfall" graph showing signal strength over time, displaying strength as colors: black means almost no signal, blue means some, and yellow means a strong signal.
The signal strength graph is a bowl shape, because we are actually sampling at a specific frequency of 2.42 GHz, and the further away from this "center", the less accurate the analysis. Thus, the algorithms think there is more signal the further away from the center we are.
What we do see here are two peaks, at 2.402 GHz toward the left and 2.426 GHz toward the right (which I've marked with the red line). These are "Bluetooth beacon" channels. I was able to capture the screen at the moment some packets were sent, showing signal at these points. Below in the waterfall chart, we see packets constantly being sent at these frequencies.
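Those two peaks match the Bluetooth Low Energy channel map: BLE carves the 2.4 GHz band into 40 channels, 2 MHz apart, and reserves three of them (37, 38, 39) for advertising. A small sketch of that mapping (the function name is mine):

```python
# BLE advertising ("beacon") channels and their center frequencies in MHz.
# The three are deliberately spread across the band to dodge the most
# popular WiFi channels (1, 6, and 11).
ADVERTISING_CHANNELS = {37: 2402, 38: 2426, 39: 2480}

def ble_channel_freq(channel):
    """Center frequency (MHz) for a BLE channel index 0-39."""
    if channel in ADVERTISING_CHANNELS:
        return ADVERTISING_CHANNELS[channel]
    # Data channels, each 2 MHz wide: 0-10 sit between advertising
    # channels 37 and 38, and 11-36 between 38 and 39.
    if channel <= 10:
        return 2404 + 2 * channel
    return 2428 + 2 * (channel - 11)
```

The 2.402 GHz and 2.426 GHz peaks in the capture are advertising channels 37 and 38; the third, channel 39 at 2.480 GHz, sits outside this capture's view.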
We are surrounded by devices giving off packets here: our phones, our watches, "tags" attached to devices, televisions, remote controls, speakers, computers, and so on. This is a picture from my home, showing only my devices and perhaps my neighbors'. In a crowded area, these two bands are saturated with traffic.
The 2.4 GHz region also includes WiFi. So I connected to a WiFi access-point to watch the signal.
WiFi uses more bandwidth than Bluetooth. The term "bandwidth" is used today to mean "faster speeds", but it comes from the world of radio where it quite literally means the width of the band. The width of the Bluetooth transmissions seen above is 2 MHz, the width of the WiFi band shown here is 20 MHz.
It took about 50 screenshots before getting these two. I had to hit the "capture" button right at the moment things were being transmitted. An easier way is a setting that graphs the current signal strength compared to the maximum recently seen as a separate line. That's shown below: the instant it was taken, there was no signal, but it shows the maximum of recent signals as a separate line:
You can see there is WiFi traffic on multiple channels. My traffic is on channel #1 at 2.412 GHz. My neighbor has traffic on channel #6 at 2.437 GHz. Another neighbor has traffic on channel #8 at 2.447 GHz. WiFi splits the spectrum assigned to it into 11 overlapping channels (in the US) set 5 MHz apart.
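The 2.4 GHz channel numbering follows a simple formula: channel n is centered at 2407 + 5·n MHz. A quick sketch (my own helper name, not from any library):

```python
def wifi_24_center_mhz(channel):
    """Center frequency (MHz) of a 2.4 GHz WiFi channel (1-13).
    Channels step by only 5 MHz but each transmission is 20 MHz wide,
    which is why only channels 1, 6, and 11 fit without overlapping."""
    if not 1 <= channel <= 13:
        raise ValueError("2.4 GHz channels are 1-13 (14 is a Japan-only case)")
    return 2407 + 5 * channel

# The three channels seen in the capture:
centers = [wifi_24_center_mhz(c) for c in (1, 6, 8)]   # [2412, 2437, 2447]
```

This also shows why my neighbor on channel #8 is a bad citizen: a 20 MHz transmission centered at 2.447 GHz overlaps both channel 6 and channel 11.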
Now the reason I wanted to take these pictures was to highlight the difference between old WiFi (802.11b) and new WiFi (802.11n). The newer standard uses the spectrum more efficiently. Notice in the picture above how signal strength for a WiFi channel is strongest in the center but gets weaker toward the edges. That means it's not fully using all the band.
Newer WiFi uses a different scheme to encode data into radio waves, using all the band given to it. We can see the difference in shape below, when I change from 802.11b to 802.11n:
Instead of a curve it's more of a square block. It fills its entire 20 MHz bandwidth instead of only using the center.
What we see here is the limits of math and physics, known as the Shannon Limit, that govern the maximum possible speed for something like WiFi (or mobile phone radios like LTE). It's roughly the size of that box: its width times its height. The width is measured in frequency, 20 MHz wide. Its height is signal strength measured above the noise floor (which should be a straight line across the bottom of our graph, but as I mentioned before, is shown by this SDR as a curve, increasingly inaccurate near the edges).
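The "width times height" picture is the intuition; the exact statement is the Shannon-Hartley theorem, C = B·log2(1 + S/N), where bandwidth B is the width of the box and the signal-to-noise ratio S/N is the height. A quick worked example (the SNR value here is just an illustrative assumption, not a measurement from my captures):

```python
from math import log2

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon-Hartley limit: C = B * log2(1 + S/N).
    The "height" of the box enters through the SNR term."""
    snr_linear = 10 ** (snr_db / 10)        # convert dB to a linear ratio
    return bandwidth_hz * log2(1 + snr_linear)

# A 20 MHz channel at an assumed 25 dB SNR tops out around 166 Mbps,
# no matter how clever the encoding scheme gets.
capacity = shannon_capacity_bps(20e6, 25)
```

Note the logarithm: doubling the width doubles capacity, but doubling the signal strength only adds roughly one more bit per symbol. That's why the rest of this post is about getting more width and more antennas, not just more power.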
As we move toward faster and faster speeds, we cannot exceed this theoretical limit.
One solution is directional antennas, such as the yagi antennas you see on top of houses or satellite dishes. A directional antenna or dish means getting a stronger signal with less noise -- thus, increasing the "height" of the box.
The same effect can be achieved with something called "phased arrays", using multiple antennas that transmit/receive at (very) slightly different times, such that the waves they produce reinforce each other in one direction but cancel each other out in other directions. This is how SpaceX "Starlink" space-based Internet works. The low Earth orbit satellites whizzing by overhead travel too fast to keep an antenna pointed at them, so their antenna is a phased array instead. The antennas are fixed, but the timing is slightly altered to aim the beam toward the satellite.
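The "slightly different times" amount to per-element phase offsets. A minimal sketch of the geometry for a simple linear array (all the parameter values below are illustrative assumptions, not Starlink's actual design):

```python
import math

def steering_phases_deg(n_elements, spacing_m, freq_hz, angle_deg):
    """Per-element phase offsets that steer a linear phased array
    toward angle_deg off boresight. Each element is delayed just enough
    that the wavefronts add up in the chosen direction."""
    c = 3e8                                   # speed of light, m/s
    wavelength = c / freq_hz
    phases = []
    for i in range(n_elements):
        # Extra path length from element i toward the target direction.
        delta = i * spacing_m * math.sin(math.radians(angle_deg))
        phases.append((360 * delta / wavelength) % 360)
    return phases

# Four elements at half-wavelength spacing at 12 GHz (Ku band),
# steering the beam 30 degrees off boresight:
phases = steering_phases_deg(4, 0.0125, 12e9, 30)
```

Changing the steering angle means only changing these phase values electronically, which is why a phased array can track a fast-moving satellite with no moving parts.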
What's even more interesting is MIMO: receiving different signals on different antennas. With fancy circuits and math, doubling the number of antennas doubles the effective bandwidth.
The latest mobile phones and WiFi use MIMO and phased arrays to increase bandwidth.
But mostly, higher frequencies give more bandwidth. That's why WiFi at 5 GHz is better -- bands are a minimum of 40 MHz (instead of 20 MHz as in 2.4 GHz WiFi), are more commonly 80 MHz, and can go up to 160 MHz.
Anyway, these are more imperfect pictures I'm creating to explain WiFi and Bluetooth. At some point in time, I'll generate more perfect ones.
There is no 40 MHz minimum requirement for 5 GHz. 40 MHz is just more common at 5 GHz. You can still use 20 MHz channels at 5 GHz to maximize channel efficiency since you can get more bits/sec/MHz using smaller width channels.