Getting Blasted by Backdoors

I wanted to take a minute to talk about a story I’ve been following that’s had some new developments this week. You may have seen an article talking about a backdoor in Juniper equipment that caused some issues. The issue at hand is complicated, but the linked article does a good job of explaining the nuance. Here’s the short version:

  • The NSA develops a version of Dual EC random number generation that includes a pretty substantial flaw.
  • That flaw? If you know the pseudorandom seed value used to start the process you can predict the generator’s output, which means you can decrypt any traffic that uses the algorithm.
  • NIST proposes the use of Dual EC and makes supporting it a requirement for vendors to be considered for future government work. Don’t support this one? You don’t even get considered.
  • Vendors adopt the standard per the requirement but don’t make it the default for some pretty obvious reasons.
  • Netscreen, a part of Juniper, does use Dual EC as part of their default setup.
  • The Chinese APT 5 hacking group figures out the vulnerability and breaks into Juniper to add code to Netscreen’s OS.
  • They use their own seed value, which allows them to decrypt packets being encrypted through the firewall.
  • Hilarity does not ensue and we spend the better part of a decade figuring out what has happened.
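The bullet points above boil down to one property: a PRNG’s output is completely determined by its internal state. Here’s a minimal sketch using a hypothetical, trivially simple generator (a stand-in, not Dual EC) to show why knowing the seed is as good as knowing the keystream:

```python
# Toy illustration: NOT Dual EC, just a simple linear congruential
# generator standing in for any PRNG whose output is fully
# determined by its internal state.

def toy_prng(seed, n):
    """Yield n pseudorandom bytes from a known seed."""
    state = seed
    out = []
    for _ in range(n):
        state = (1103515245 * state + 12345) % 2**31  # classic LCG step
        out.append(state & 0xFF)
    return bytes(out)

def xor_cipher(data, keystream):
    """'Encrypt' by XORing data with the keystream."""
    return bytes(a ^ b for a, b in zip(data, keystream))

# The victim encrypts with a keystream derived from the PRNG.
secret_seed = 123456789
plaintext = b"routed packet payload"
ciphertext = xor_cipher(plaintext, toy_prng(secret_seed, len(plaintext)))

# An attacker who knows (or planted) the seed regenerates the exact
# same keystream and recovers the plaintext without any key exchange.
recovered = xor_cipher(ciphertext, toy_prng(secret_seed, len(plaintext)))
assert recovered == plaintext
```

The real attack is far more involved — Dual EC leaks enough state in its output for the holder of the backdoor constant to reconstruct future outputs — but the end result is the same: whoever controls the seed material can read the traffic.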

That any of this even came to light is impressive considering the government agencies involved have stonewalled reporters and it took a probe from a US Senator, Ron Wyden, to get as far as we have in the investigation.

Protecting Your Platform

My readers know that I’m a pretty staunch advocate for not weakening encryption. Backdoors and “special” keys for organizations that claim they need them are a horrible idea. The safest lock is one that can’t be bypassed. The best secret is one that no one knows about. Likewise, the best encryption algorithms are ones that can’t be reversed or calculated by anyone other than the people using them to send messages.

I get that the flood of encrypted communications today is making life difficult for law enforcement agencies all over the world. It’s tempting to make it a requirement to allow them a special code that will decrypt messages to keep us safe and secure. That’s the messaging I see every time a politician wants to compel a security company to create a vulnerability in their software just for them. It’s all about being safe.

Once you create that special key you’ve already lost. As we saw above, the intention behind creating a backdoor into an OS so that we could spy on other people using it backfired spectacularly. Once someone else figured out that the values could be guessed and the traffic decrypted, they set about doing it for themselves. I can only imagine the surprise at the NSA when they realized that someone had changed the values in the OS and that, while they themselves were no longer able to spy with impunity, someone else could be decrypting their communications at that very moment. If you make a key for a lock, someone will figure out how to make a copy. It’s that simple.

We focus so much on the responsible use of these backdoors that we miss the bigger picture. Sure, maybe we can make it extremely difficult for someone in law enforcement to get the information needed to access the backdoor in the name of national security. But what about other nations? What about actors not tied to a political process or bound by oversight from the populace? I’m more scared that someone who actively wishes to do me harm could find a way to exploit something I was told had to be there for my own safety.

The Juniper story gets worse the more we read into it, but they were just the unlucky player left standing when the music stopped. Any one of the other companies that were compelled to include Dual EC by government order could have drawn the short straw here. It’s one thing to create a known-bad version of software and hope that someone installs it. It’s an entirely different matter to force people to include it. I’m honestly shocked the government didn’t try to mandate that it be used to the exclusion of other algorithms. In some other timeline Cisco or Palo Alto or even Fortinet are having very bad days unwinding what happened.

Tom’s Take

The easiest way to avoid having your software exploited is not to create your own exploit for it. Bugs happen. Strange things occur in development. Even the most powerful algorithms must eventually yield to Moore’s Law or Shor’s Algorithm. Why accelerate the process by cutting a master key? Why weaken yourself on purpose by repeating over and over again that this is “for the greater good”? Remember that the greater good may not include people that want the best for you. If you’re willing to hand them a key to unlock the chaos that we’re seeing in this case then you have overestimated your value to the process and become the very bad actor you hoped to stop.

Thoughts On Encryption


The debate on encryption has heated up significantly in the last couple of months. Most of the recent discussion has revolved around a particular device in a specific case, but encryption is older than that. Modern encryption systems represent the culmination of centuries of work on keeping things from being seen.

Encryption As A Weapon

Did you know that twenty years ago the U.S. Government classified encryption as a munition? Data encryption was classified as a military asset and placed on the U.S. Munitions List as an auxiliary asset. The control of encryption as a military asset meant that exporting strong encryption to foreign countries was against the law. For a number of years the only thing that could be exported without fear of legal impact was regular old Data Encryption Standard (DES) methods. Even 3DES, which is theoretically much stronger but practically not much better than its older counterpart, was restricted for export to foreign countries.

While the rules around encryption export have been relaxed since the early 2000s, there are still some restrictions in place. Those rules apply to countries on U.S. Government watch lists: state sponsors of terror and governments deemed “rogue” states. This is why you must attest that you do not live in or do business with one of those named countries when you download any software that contains encryption protocols. The problem today is that almost every piece of software includes some form of encryption thanks to ubiquitous libraries like OpenSSL.

Even the father of Pretty Good Privacy (PGP) was forced to circumvent U.S. law to get PGP into the hands of the world. Phil Zimmermann took the novel approach of printing the entirety of PGP’s source code in book form and selling it far and wide. Since books are protected speech, no one could stop them from being sold. The only barrier to recreating PGP from the text was how fast one could type. Today, PGP is one of the most popular forms of encrypting written communications, such as emails.

Encryption As A Target

Today’s issues with encryption are rooted in the idea that it shouldn’t be available to people that would use it for “bad” reasons. However, instead of being able to draw a line around a place on a map and say “The people inside this line can’t get access to strong encryption”, we now live in a society where strong encryption is available on virtually any device to protect the growing amount of data we store there. Twenty years ago no one would have guessed that we could use a watch to pay for coffee, board an airplane, or communicate with loved ones.

All of that capability comes with a high information cost. Our devices need to know more and more about us in order to seem so smart. The amount of data contained in innocuous things causes no end of trouble should that data become public. Take the amount of data contained on the average boarding pass. That information is enough to know more about you than is publicly available in most places. All from one little piece of paper.

Keeping that information hidden from prying eyes is the mission of encryption. The spotlight right now is on the government and its predilection for looking at communications. Even the NSA once stated that strong encryption abroad would weaken the ability of its own technology to crack signals intelligence (SIGINT) communications. Instead, the NSA embarked on a policy of sniffing data before it was ever encrypted by installing backdoors at ISPs and other points to grab the data in flight. Add in the recent vulnerabilities found in the key exchange process and you can see why robust encryption is critical to protecting data.

Weakening encryption so that it can be easily overcome by brute force is asking for a huge Pandora’s box to be opened. Perhaps in the early nineties it was unthinkable for someone to command enough compute resources to overcome large-number theory. Today it’s not unheard of to control resources vast enough to reverse engineer simple problems in a matter of hours or days instead of weeks or years. Every time a new vulnerability comes out that uses vast computing power to break the theory, it weakens us all.
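As a toy illustration of that point, here’s a contrived cipher with a deliberately tiny 16-bit key falling to an exhaustive known-plaintext search in a fraction of a second (a made-up example, not a real cipher — but the same search works against any keyspace small enough for your compute budget):

```python
# Toy brute force: a deliberately weakened 16-bit "key" falls
# instantly, showing that any reduced keyspace is just a compute
# budget away from being searched exhaustively.

def toy_encrypt(plaintext: bytes, key: int) -> bytes:
    """XOR each byte with alternating bytes of a 16-bit key (toy cipher)."""
    k = key.to_bytes(2, "big")
    return bytes(b ^ k[i % 2] for i, b in enumerate(plaintext))

known_plaintext = b"HELLO"
secret_key = 0xBEEF
ciphertext = toy_encrypt(known_plaintext, secret_key)

# Exhaustive search over all 2**16 keys using a known-plaintext check.
found = next(k for k in range(2**16)
             if toy_encrypt(known_plaintext, k) == ciphertext)
assert found == secret_key
```

Doubling the key length squares the work, which is why a healthy margin of key bits — not trust in restraint — is what actually keeps a brute-force attacker out.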

Tom’s Take

Encryption isn’t about one device. It’s not about one person. It’s not even about a group of people that live in a place with lines drawn on a map that believe a certain way. It’s about the need for people to protect their information from exposure and harm. It’s about the need to ensure that that information can’t be stolen or modified or rerouted. It’s not about setting precedents or looking good for people in the press.

Encryption comes down to one simple question: if you dropped your phone on the floor of DefCon or BlackHat, would you feel comfortable knowing that breaking into it would take longer than the average attacker would care to spend before moving on to an easier target or a more reliable method? If the answer to that question is “yes”, then perhaps you know which side of the encryption debate you’re on without even asking the question.