In this week’s episode of the Gestalt IT Rundown, I jumped on my soapbox a bit regarding the latest Pegasus exploit. If you’re not familiar with Pegasus you should catch up with the latest news.
Pegasus is a toolkit developed by NSO Group, an Israeli firm, for counterterrorism investigations. It's essentially a piece of malware that can be dropped on a mobile phone through a series of unpatched exploits, allowing the operator to capture text messages, photos, and phone calls and send them off to a remote location for analysis. On the surface it sounds like a tool for covertly gathering intelligence on a person of interest, ensuring they're known to law enforcement agencies so they can be stopped in the event of some kind of criminal activity.
Letting the Horses Out
If that’s where Pegasus stopped, I’d probably not care one way or the other. A tool used by law enforcement to figure out how to stop things that are tough to defend against. But because you’re reading this post you know that’s not where it stopped. Pegasus wasn’t merely a tool developed by intelligence agencies for targeted use. If I had to guess, I’d say the groundwork for it was laid when the creators did work in some intelligence capacity. Where things went off the rails was when they no longer did.
I’m sure that all of the development work on the tool that was done for the government they worked for stayed there. However, things like Pegasus evolve all the time. Exploits get patched. Avenues of installation get closed. And some smart targets figure out how to avoid getting caught, or even how to detect that they’ve been compromised. That means the work has to continue for the tool to stay effective. And if the government isn’t paying for it, who is?
If you guessed interested parties, you’d be right! Pegasus is for sale to anyone who wants to buy it. I’m sure there are cursory checks done to ensure that people who aren’t supposed to be using it can’t buy it. But I also know that in those cases a few extra zeros at the end of a wire transfer can work wonders to alleviate those concerns. Whether it was supposed to be sold to everyone or just to a select group of people, it got out.
Here’s where my hackles get raised a bit. The best way to prevent a tool like this from escaping is never to have created it in the first place. Just like a biological or nuclear weapon, the only way to be sure it can never be used is to never have it. Weapons are a temptation. Bombs were built to be dropped. Pegasus was built to be installed somewhere. Sure, the original intentions were pure. This tool was designed to save lives. What happens when the intentions aren’t so pure? What happens when your enemies aren’t terrorists but politicians with different views? You might scoff at the suggestion of using a counterterrorism tool to spy on your ideological opponents, but look around the world today and ask yourself if your opponents are so inclined.
Once Pegasus was more widely available, I’m sure it became a very tempting way to eavesdrop on people you wanted to know more about. A journalist getting leaks from someone in your government? Just drop Pegasus on their phone and find out who it is. An annoying activist making the media hate you? Text him the Pegasus installer and dump his phone looking for incriminating evidence to shut him up. Suspect your girlfriend of being unfaithful? Pegasus can tell you for sure! See how quickly we went from “necessary evil to protect the people” to “petty personal reasons”?
The danger of the slippery slope is that once you’re on it you can’t stop. Pegasus may have saved some lives, but it has undoubtedly cost many others too. It has been detected in the wild as far back as 2014. That means every source that has been compromised, and every journalist killed doing their work, could have been found out thanks to this tool. That’s an awful lot of unknowns to carry on your shoulders. I’m sure that NSO Group will protest and say that they never knowingly sold it to someone who used it for less-than-honorable purposes. But can they say for sure that their clients never shared it? Or that it was never stolen and used by the very people it was designed to be deployed against?
Closing the Barn Door
The escalation of digital espionage is only going to increase. In the US we already have political leaders calling on manufacturers and developers to create special backdoors for law enforcement to use to detect criminals and arrest them as needed. This is along the same lines as Pegasus, just formalized and legislated. It’s a terrible idea. If the backdoor is created it will be misused. Count on that. Even if the people that developed it never intended to use it improperly someone without the same moral fortitude will eventually. Oppenheimer and Einstein may have regretted the development of nuclear weapons but you can believe that by 1983 the powers that held onto them weren’t so opposed to using them if the need should arise.
I’m also not so naive as to believe for an instant that the governments of the world are just going to agree to play nice and not develop these tools any longer. They represent a competitive advantage over their opponents, and that’s not something they’re going to give up easily. The only thing holding them back is oversight and accountability to the people they protect.
What about commercial entities, though? If governments are restrained by the people, then businesses are only restrained by their stakeholders and shareholders. And those people only seem to care about making money. So if the best tool for the job appears and it can make them a fortune, would they forgo their profits to take a stand against categorically evil behavior? Can you say for certain that would always be the case?
Tom’s Take
Governments may not ever stop making these weapons, but perhaps it’s time for the private sector to stop. The best way to keep the barn doors closed so the horses can’t get out is not to build doors in the first place. If you build a tool like Pegasus, it will get out. If you sell it, even to the most elite clientele, someone you don’t want to have it will end up with it. That sounds like a pretty optimistic viewpoint, for sure. So maybe the other solution is to have the developers install their tool on their own devices and send the keys to a random person. That way they will know they are being watched, and that whoever is watching them can decide when and where to expose the things they don’t want known. And if that doesn’t scare them into no longer developing tools like this, then nothing will.