The Bane of Backwards Compatibility

I’m a huge fan of video games. I love playing them, especially on my old consoles from my formative years. The original Nintendo consoles were my childhood friends as much as anything else. By the time I graduated from high school, everyone had started moving toward the Sony Playstation. I didn’t end up buying into that ecosystem as I started college. Instead, I just waited for my brother to pick up a new console and give me his old one.

This meant I was always behind the curve on getting to play the latest games. I was fine with that, since the games I wanted to play were on the old console. The new one didn’t have anything that interested me. And by the time the games I wanted to play did come out, it wouldn’t be long until my brother got a new one anyway. But one thing I kept hearing was that the Playstation was backwards compatible with the old generation of games. I could buy a current console and play most of the older games on it. I wondered how they managed to pull that off, since Nintendo never did.

When I was older, I did some research into how they managed to build backwards compatibility into the newer consoles. I always assumed it was some kind of translation engine or enhanced capabilities. Instead, I found out it was something much less complicated. The PS2 reused the PS1’s processor as its I/O controller chip, which ensured backwards compatibility. For the PS3, they essentially built the guts of a PS2 into the main board. It was about as elegant as you could get. However, later in the life of those consoles, system redesigns made them less compatible. Turns out that it isn’t easy to maintain backwards compatibility when you redesign things to remove the extra hardware you added.

Bringing It Back To The Old School

Cool story, but what does it have to do with enterprise technology? Well, the odds are good that you’re about to fight a backwards compatibility nightmare on two fronts. The first is with WPA3, the newest security protocol from the Wi-Fi Alliance. WPA3 fixes a lot of holes that were present in the ancient WPA2 and includes options to protect traffic on open public networks and to guard against the handshake and key exchange exploits that plagued its predecessor. You’d think it was designed to be more secure and would take a long time to break, right? Well, you’d be wrong. That’s because WPA3 was exploited last year thanks to a vulnerability in the WPA3-Transition mode designed to enhance backwards compatibility.

WPA3-Transition mode is designed to keep people from needing to upgrade their wireless cards and client software in one fell swoop. It lets you configure a WPA3 SSID that WPA2 clients can still connect to without meeting all the new requirements. Practically, it means you don’t have to run two separate SSIDs for all your devices as you move from older to newer. But practical doesn’t change the fact that security vulnerabilities exist in the transition mechanism. Enterprising attackers can exploit the weaknesses in the transition setup to crack your security.
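To make that concrete, here’s roughly what transition mode looks like on an access point driven by hostapd. Treat this as a minimal sketch under the assumption that you’re using hostapd’s WPA3-Personal transition options; the interface, SSID, and passphrase are placeholders, not anything from a real deployment.

```
# Sketch of a WPA3-Personal transition mode SSID in hostapd (illustrative values)
interface=wlan0
ssid=ExampleCorp           # hypothetical SSID
hw_mode=a
channel=36

wpa=2                      # RSN only (WPA2/WPA3), no legacy WPA
wpa_key_mgmt=WPA-PSK SAE   # advertise both WPA2-PSK and WPA3-SAE on one SSID
rsn_pairwise=CCMP          # AES for every client
wpa_passphrase=ChangeMe123 # placeholder passphrase
ieee80211w=1               # management frame protection optional, so WPA2 clients can join
sae_require_mfp=1          # but still required for clients that negotiate SAE
```

That single SSID advertising both WPA-PSK and SAE is exactly the accommodation the researchers went after: an attacker can stand up a rogue WPA2-only copy of the network and downgrade a WPA3-capable client into the older handshake, which can then be attacked offline.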

It’s not unlike the old vulnerabilities in WPA when it used TKIP. TKIP was found to have weaknesses that attackers could exploit, and people were advised to move to WPA with AES as soon as possible. But if you allowed older, non-AES-capable clients to connect to your SSIDs for compatibility reasons, you invalidated all that extra security. In a mixed-mode configuration, the network had to keep TKIP in play so those clients could join, and because plenty of newer clients were happy to negotiate TKIP instead of AES, you were stuck running a vulnerable cipher. The only real solution was to run a WPA-AES SSID for your newer, more secure clients and leave a separate WPA-TKIP SSID active for the clients that had to use it until they could be upgraded.
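For completeness, the split-SSID workaround from that era would have looked something like the sketch below in hostapd terms: one AES-only network for capable clients and a separate legacy network slated for retirement. The interface and SSID names are made up for illustration, and nobody should be standing up TKIP networks today.

```
# Illustrative hostapd sketch of the split-SSID approach (hypothetical names)
interface=wlan0
ssid=Example-Secure        # modern clients only
wpa=2                      # WPA2/RSN
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP          # AES only, no TKIP fallback
wpa_passphrase=ChangeMe123

bss=wlan0_legacy           # second BSS on the same radio
ssid=Example-Legacy        # old TKIP-only clients, to be retired
wpa=1                      # original WPA
wpa_key_mgmt=WPA-PSK
wpa_pairwise=TKIP
wpa_passphrase=ChangeMe456
```

The point of splitting things this way was that the vulnerable cipher never touched the traffic of the clients that didn’t need it, which is the same tradeoff we’re arguing about with WPA3 transition mode today.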

4Gs for the Price of 5

The second major area where we’re going to see issues with backwards compatibility is 5G networking. We’re hearing about the move to using 5G everywhere. We’ve no doubt heard by now that 5G is going to replace enterprise wireless or change the way we connect to things. Honestly, I’m surprised no one has tried to claim that 5G can make waffles and coffee yet. But 5G is rife with the same backwards compatibility issues present in enterprise wireless too.

5G is an evolution of the 4G standards. Phones issued today are going to have 4G and 5G radios, and the base stations are going to mix the radio types to ensure those phones can connect. Just like any new technology, they’re going to maximize the connectivity of the existing infrastructure and hope that it’s enough to keep things running as they build out the new setup. But when devices run two radios, or when the older network offers the stronger connection, you set yourself up to have your shiny new protocol undermined by vulnerabilities in the older versions it has to fall back to. It’s already projected that governments are going to take advantage of this for a variety of purposes.

We find ourselves in the same boat as we do with WPA3. Because we have to ensure maximum compatibility, we make sacrifices. We keep two different versions running at the same time, which increases complexity. We even mark a lot of necessary security upgrades as optional in order to keep people from refusing to implement them or falling behind because they don’t understand them.1

The biggest failing for me is that we’re pushing for backwards compatibility and performance over security. We’re not willing to make the hard choices to reduce functionality in order to save our privacy and security. We want things to be backwards compatible so we can buy one device today and have it work on everything. We’ll just make the next one more secure. Or the one after that. Until we realize that we’re still running old 802.11 data rates in our newest protocols because no one bothered to remove them. We have to make hard choices sometimes and sacrifice some compatibility in order to ensure that we’re safe and secure with the newer technology.


Tom’s Take

Backwards compatibility is like the worst kind of nostalgia. I want the old thing, but I want it on a new thing that runs faster. I want the glowing warmth of my youth but with the convenience of modern technology. It’s like buying an old sports car. Sure, you get all the look and feel of an old powerful engine. You also lose the safety features of a modern body along with the comforts you’ve become accustomed to. You have to make a hard choice. Do you keep the old car original and give up the modern features you’ve come to rely on? Or do you create some kind of hybrid that has exactly what you want and need but isn’t what you started with? It’s a tough choice to make. In the world of technology, there’s no right answer. But we need to remember that every compromise we make for compatibility or performance can lead to compromises in security.


  1. I’m looking at you, OWE.

Will Spectrum Hunger Kill Weather Forecasting?

If you are a fan of the work we do each week with our Gestalt IT Rundown on Facebook, you probably saw a story in this week’s episode about the race for 5G spectrum causing some potential problems with weather forecasting. I didn’t have the time to dig into the details behind the story on that episode, so I wanted to take a few minutes and explain why it’s such a big deal.

First, you have to know that 5G speeds (and many others) depend heavily on the amount of spectrum available for communication. The more spectrum available, the more channels there are to communicate on, which increases the speed at which devices can exchange information and reduces the amount of interference between them. Sounds simple, right?
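To put a rough number on that intuition, the Shannon-Hartley limit says channel capacity scales with the bandwidth you can use. This is a back-of-the-envelope illustration, not a claim about any particular carrier’s deployment:

$$ C = B \log_2\!\left(1 + \frac{S}{N}\right) $$

At the same signal-to-noise ratio, a carrier that can light up 400 MHz of millimeter-wave spectrum has roughly twenty times the raw capacity ceiling of one stuck on a 20 MHz channel. That’s why everyone is so hungry for every sliver of spectrum they can get, including bands the weather community cares about.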

Except mobile devices aren’t the only things that are using the spectrum. We have all kinds of other devices out there that use radio waves to communicate. We’ve known for several years that there are a lot of other transmitters in the 5 GHz spectrum used by 802.11 that interfere with Wi-Fi devices, things like ISM radios for industrial and medical applications or government radar systems. The government has instituted many regulations in those frequency ranges to ensure that critical infrastructure isn’t interfered with.

When Nature Calls

However, sometimes you can’t regulate away interference. According to this Wired article, the FCC opened up auctions for the 24 GHz frequency band back in March. This was over strenuous objections from NASA, NOAA, and the American Meteorological Society (AMS). Why is 24 GHz so important? Well, as it turns out, there’s a natural phenomenon that exists in that range.

Recall your kitchen microwave. How does it work? Basically, it uses microwave radiation to heat the water in the food you’re cooking. How does it do that? It turns out water molecules are very good at absorbing energy in the microwave range, which is why your oven runs at roughly 2.4 GHz. Water vapor in the atmosphere has a natural absorption feature a little higher up the dial, in the 22-24 GHz neighborhood, and 23.8 GHz is the spot weather instruments use to measure it. Which means that anything broadcasting at 23.8 GHz is going to have issues with water, such as the water in tree leaves, in the air, or even in water pipes.

So, why are NOAA and the AMS freaking out about auctioning off spectrum in the 23.8 GHz range? Because anything broadcasting in that range isn’t just going to have issues with water interference; it’s also going to look like water to sensitive equipment. That means the orbiting weather satellites that listen at 23.8 GHz to measure water vapor in the air are going to encounter co-channel interference from 5G radio sources.
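Here’s a simplified way to see why the interference looks like weather instead of obvious noise; this is a conceptual sketch, not the actual retrieval math NOAA runs. The satellite’s passive sensor reports a brightness temperature for the 23.8 GHz channel, and anything radiating in that band adds to the reading:

$$ T_{\text{measured}} \approx T_{\text{water vapor}} + T_{\text{other scene terms}} + T_{\text{interference}} $$

The retrieval software interprets a warmer reading as more water vapor in the column below, so stray 5G emissions don’t get flagged as a glitch. They show up as moisture that isn’t really there, and that phantom moisture flows straight into the forecast models.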

You might say to yourself, “So what? It’s just a little buzz, right?” Well, except that that little buzz creates interference in the data being fed into forecast prediction models. Those models are the basis for the weather forecasts we have today. And if you haven’t noticed, the reliability of our long-range forecasts has been steadily improving for the past 30 years or so. Today’s 7-day forecasts are almost 80% accurate, which is pretty good compared to how bad things were in the 80s, when you could only guarantee 80% accuracy from a 3-day forecast.

Guess what? NOAA says that if the 24 GHz spectrum gets auctioned off for 5G use, we could see the accuracy of our forecasting regress almost 30%, which would push our models back to where they were in the 80s. Now, for those of you that live in places fortunate enough to only get sun and the occasional rain shower, that doesn’t sound too bad, right? Just make sure to pack an umbrella. But for those that live in places where there is a significant chance for severe weather, it’s a bit more problematic.

I live in Oklahoma. We’re right in the middle of Tornado Alley. In the spring, between April 1 and June 1, my state becomes a fun place full of nasty weather that can destroy homes and cause widespread devastation. It’s never boring, for sure. But in the last 30 years we’ve managed to go from being able to give people a few minutes’ warning about an impending tornado to being able to issue Particularly Dangerous Situation (PDS) Tornado Watches up to 48 hours in advance. While a PDS Tornado Watch doesn’t mean that a specific area is going to get a tornado, it does mean that you need to be on the lookout for something that day. It gives enough warning to make sure you’re not going to get caught flat-footed when things get nasty.

Yes Man

The easiest way to avoid this problem is probably the least likely to happen. The FCC needs to hold back the auction of the spectrum range identified by NOAA and NASA until it can be proven that there won’t be any interference or that forecast accuracy isn’t going to be impacted. 5G rollouts are still far enough in the future that leaving one part of the spectrum out of the equation isn’t going to cause huge issues for the next few years. But if we end up having to write rules forcing device manufacturers to change power settings, or push updates to fixed-position sensors and aging satellites, we’re going to have a lot more issues down the road than slightly slower mobile devices.

The reason this is hard is that an FCC focused on opening things up for corporations doesn’t care about the forecast accuracy of a farmer in Iowa. They care about money. They care about progress. And ultimately they care about looking good over saving lives. There’s no guarantee that reducing forecast accuracy will cost lives, but the odds are that better forecasts help people make better decisions. When you boil it down to the actual choices, the appearance is that the FCC is picking money over lives. And that’s a pretty easy choice for most people to make.


Tom’s Take

If I’m a bit passionate about weather tech, it’s because I live in one of the most weather-active places on the planet. The National Severe Storms Laboratory and the National Weather Center are both about 5-6 miles from my house. I see the use of this tech every day. And I know that it saves lives. It’s saved lives for years for people that need to know when dangerous weather is on the way.

Generation Lost

I’m not trying to cause a big sensation (talking about my generation) – The Who

Naming products is an art form. When you let the technical engineering staff figure out what to call something, you end up with a model number like X440 or 8086. When the marketing people first get involved, you tend to get more order in the naming of things, usually in a series like the 6500 series or the MX series. The idea that you can easily identify a product’s specs based on its name or model number is nice for those that try to figure out which widget to use. However, that’s all changing.

It started with mobile telephones. Cellular technology has been around in identifiable form since the late 1970s. The original analog signals worked on specific frequencies and didn’t have great coverage. It wasn’t until the second generation of this technology moved entirely to digital transmission with superior encoding that the technology really started to take off. In order to differentiate this new technology from the older analog version, many people made sure to market it as “second generation”, often shortening this to “2G” to save syllables. When it came time to introduce a successor to the second generation personal communications service (PCS) systems, many carriers started marketing their offerings with the moniker of “3G”, skipping straight past the phrase “third generation offering” in favor of the catchier marketing term. AT&T especially loved touting the call quality and data transmission rate of 3G in advertising. The 3G campaigns were so successful that when the successor to 3G was being decided, many companies started trying to market their incremental improvements as “4G” to get consumers to adopt them quickly.

Famously, the incremental improvement to high speed packet access (HSPA) that was being deployed en masse before the adoption of Long Term Evolution (LTE) as the official standard was known as high speed downlink packet access (HSDPA). AT&T petitioned Apple to allow the carrier badge on the iPhone to say “4G” when HSDPA was active. Even though speeds weren’t nearly as fast as the true 4G LTE standard, AT&T wanted a bit of marketing clout with customers over their Verizon rivals. When the third generation iPad was released with a true LTE radio later on, Apple made sure to use the “LTE” carrier badge for it. With a later iOS 5 software update, Apple finally acquiesced to AT&T’s demands and rebranded the HSDPA network as “4G” via a carrier update. In fact, to this day my iPhone 4S still tells me I’m on 4G no matter where I am. Only when I drop down to 2G does it say anything different.

The fact that we have started referring to carrier standards as “xG” something means the marketing is working. And when marketing works, you naturally have to copy it in other fields. The two most recent entries in the Generation Marketing contest come from Broadcom and Brocade. Broadcom has started marketing their 802.11ac chipsets as “5G wireless.” It’s somewhat accurate when you consider the original 802.11 standard through 802.11b, 802.11g, 802.11n, and now 802.11ac. However, most wireless professionals see this more as an attempt to cash in on the current market trend of “G” naming than as true differentiation. In Brocade’s case, they recently changed the name of their 16G fibre channel solution to “Gen 5” in an attempt to shift the marketing message away from a pure speed measurement (16 gigabit), especially when putting it up against the coming 40 gigabit fibre channel over Ethernet (FCoE) offerings from their competitor Cisco.

In both of these cases, the shift has moved away from strict protocol references or speed ratings. That’s not necessarily a bad thing. However, the rush to name everything “something G” reeks, quite frankly. Are we as consumers and purchasers so jaded by the idea of 3G/4G/5G that we won’t respond to any other marketing campaign? What if they’d called it Revision Five or Fifth Iteration instead? Doesn’t that convey the same point? Perhaps it does, but I doubt more than a handful of CxO-type people know what iteration means without help from a pocket dictionary. Those same CxOs know what 4G/5G mean because they can look down at their phone and see it. More Gs are better, right?

Generational naming should only be used in the broadest sense of the idea. It should only be taken seriously when more than one company uses it. Is Aruba going to jump on the 5G wireless bandwagon? Will EMC release a 6G FC array? If you’re shaking your head in answer to these questions, you probably aren’t the only one. Also of note in this discussion: what determines a generation? IT people have trouble keeping track of what constitutes the difference between a major version change and a point release update. Why did 3 features cause this to be software version 8.0 when the 97 new features in the last version only made it go from 7.0 to 7.5? Also, what’s to say that a company doesn’t just skip over a generation? Why wasn’t HSDPA good enough to be 4G? Because the ITU said it was just an iteration of 3G and not truly a new generation. How many companies would have looked at the advantage of jumping straight to 5G by “counting” HSDPA as the fourth generation, absent oversight from the ITU?


Tom’s Take

My mom always told me to “call a spade a spade.” I don’t like the idea of randomly changing the name of something just to give it a competitive edge. Fibre channel has made it this far as 2 gig, 4 gig, and 8 gig. Why the sudden shift away from 16 gig? Especially if you’re going to have to say it runs at 16 gig so people will know what you’re talking about? Is it a smoke-and-mirrors play? Why does Broadcom insist on calling their wireless 5G? 802.11a/b/g/n has worked just fine up to now. Is this just an attempt to confuse the consumer? We may never know. What we need to do in the meantime is hold some feet to the fire and make sure, as consumers and purchasers, that this current naming “generation” gets lost.