If you grew up in the 80s watching movies like me, you'll remember WarGames. I could spend hours lauding this movie, but for the purposes of this post I want to call out the sequence at the beginning when the two airmen are trying to operate the nuclear missile launch computer. It requires the use of two keys, one in the possession of each airman, inserted into two locks located more than ten feet apart. The reason is that launching the missile requires two people to agree to do something at the same time. The two-key scene appears in a number of movies as a way to show that so much power needs to have controls.
However, one thing I wanted to talk about in this post is the notion that those controls need to be visible to be effective. The two-key solution is pretty visible. You carry a key with you, but you can also see the locks that are situated apart from each other. There is a bit of challenge in getting the keys into the locks and turning them simultaneously. That not only shows that the process has controls but also ensures the people doing the turning understand what they're about to do.
Consider a facility that is so secure that you must leave your devices in a locker or secured container before entering. I’ve been in a couple before and it’s a weird feeling to be disconnected from the world for a while. Could the facility do something to ensure that the device didn’t work inside? Sure they could. Technology has progressed to the point where we can do just about anything. But leaving the device behind is as much about informing the user that they aren’t supposed to be sharing things as it is about controlling the device. Controlling a device is easy. Controlling a person isn’t. Sometimes you have to be visible.
Discomfort Design
Security solutions that force the user out of a place of comfort are important. Whether it's a SCIF for sharing sensitive data or a requirement to log in with a more secure method, the purpose is attention. You need the user to know they're doing something important and understand the risks. If the user doesn't know they're doing something that could cause problems or expose something crucial, you will end up doing damage control at some point.
Think of something as simple as sitting in the exit row on an airplane. In my case, it’s for Southwest Airlines. There’s more leg room but there’s also a responsibility to open the door and assist in evacuation if needed. That’s why the flight attendants need to hear you acknowledge that warning with a verbal “yes” before you’re allowed to sit in those seats. You have admitted you understand the risks and responsibilities of sitting there and you’re ready to do the job if needed.
Security has tried to become unobtrusive in recent years to reduce user friction. I'm all about features like using SSL/TLS by default on websites or easing restrictions on account sharing or even using passkeys in place of passwords. But there also comes a point when encapsulating the security reduces its effectiveness. What about phishing emails that put lock emojis next to URLs to make them seem secure even when they aren't? How about cleverly crafted login screens for services that are almost indistinguishable from the real thing unless you bother to check the URL? It could even be the tried-and-true cloned account on Facebook or Instagram asking a friend for help unlocking their account, only to steal your login info and start scamming everyone on your friends list.
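The "check the URL" advice above is concrete enough to sketch. Here's a minimal, illustrative check (the function name and sample URLs are my own inventions, not from any real mail filter) that flags a link whose visible text points at one hostname while the actual target points somewhere else — the classic phishing tell the lock emoji is meant to distract you from:

```python
from urllib.parse import urlparse

def link_mismatch(visible_url: str, href: str) -> bool:
    """True when the hostname the user *sees* in the link text differs
    from the hostname the link actually targets -- a phishing tell."""
    return urlparse(visible_url).hostname != urlparse(href).hostname

# A lock emoji next to the text proves nothing; only the target matters.
print(link_mismatch("https://mybank.com/login",
                    "https://mybank.com.evil.example/login"))  # True
print(link_mismatch("https://mybank.com/login",
                    "https://mybank.com/login?from=email"))    # False
```

A production filter would go further (subdomain tricks, unicode homographs, URL shorteners), but even this toy version shows why the security signal has to be the destination itself, not decoration around it.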
The solution is to make sure users know when they're secure. Make it uncomfortable for them so they are acutely aware of heightened security. We deal with it all the time in other areas of our lives outside of IT. Airport screenings are a great example. So are heightened security measures at federal buildings. You know you're going somewhere that has placed an emphasis on security.
Why do we try to hide it in IT? Is it because IT is advanced technology that already causes enough stress? Are we worried that users are going to drop our service if the security controls are too cumbersome to use? Or do we think that the investment in making that security front and center isn't worth the risk of debugging it when it goes wrong? I would argue that these are solved problems in other areas of the world and we have just accepted them over time. IT shouldn't be any different.
Note that discomfort shouldn't lead to a complete lack of usability. It's very easy to engineer a system that needs you to reconfirm your credentials every 10 minutes to ensure that no one has hacked you. And you'd quit using it because you don't want to type in a password that often. You have to strike the right balance between user friendliness and user friction. You want users to notice when they're doing something that requires their attention to security, but not so much friction that they're unable to do their job or use the service. That's where the attention should be placed, not in cleverly hiding a biometric scanning solution or certificate-based service for the sake of saying it's secure.
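One common way to strike that balance is step-up authentication: routine actions ride the existing session, but sensitive ones demand a fresh login. Here's a minimal sketch of the idea; the action names, the five-minute freshness window, and the function itself are assumptions for illustration, not any particular product's policy:

```python
import time

# Hypothetical policy: only these actions trigger visible friction.
SENSITIVE = {"change_password", "wire_transfer", "delete_account"}
MAX_AUTH_AGE = 5 * 60  # seconds a prior login stays "fresh" (assumed value)

def needs_reauth(action, last_auth_ts, now=None):
    """Require a fresh authentication only for sensitive actions
    performed on a stale session -- friction where attention matters."""
    now = time.time() if now is None else now
    return action in SENSITIVE and (now - last_auth_ts) > MAX_AUTH_AGE

# Reading mail ten minutes after login: no prompt.
print(needs_reauth("read_inbox", last_auth_ts=0, now=600))     # False
# Wiring money ten minutes after login: stop and reconfirm.
print(needs_reauth("wire_transfer", last_auth_ts=0, now=600))  # True
```

The deliberate re-prompt is the modern two-key moment: it appears exactly when the user is about to do something consequential, and nowhere else.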
Tom’s Take
I'll admit that I tend to take things for granted. I had to deal with a cloned Facebook profile this past weekend and I worried that someone might try to log in and do something with my account. Then I remembered that I have two-factor authentication turned on and my devices are trusted, so no one can impersonate me. But that made me wonder if the "trust this device" setting was a bit too easy to trust. I think making sure that your users know they're protected is more critical, even if it means they have to do something more performative from time to time. They may gripe about changing a password every 30 days or having to pull out a security token, but I promise you that discomfort will go away when it saves them from a very bad security day.