Risking It All


When’s the last time you thought about risk? It’s something we deal with every day but hardly ever try to quantify unless we work in finance or some other high-stakes job. When it comes to IT work, we take risks all the time. Some are small, like deleting files or emails we think we won’t need again. Others are bigger, like deploying software to production or making a change that could take a site down. But risk is a part of our lives, even when we can’t see it.

Mitigation Revelations

Mitigating risk is the most common thing we have to do when we analyze situations where risk is involved. Think about all the times you’ve had to create a backout plan for a change that you’re checking in. Even having a maintenance window is a form of risk mitigation. I was once involved in a cutover for a metro fiber deployment that had to happen between midnight and 2 am. When I asked why, the tech said, “Well, we don’t usually have any problems, but sometimes there’s a hiccup that takes the whole network down until we fix it. This way, there isn’t as much traffic on it.”

Risk is easier to manage when you compartmentalize it. That’s why we’re always trying to push risk aside or contain the potential damage from it. In some cases, like a low-impact office that doesn’t rely on IT, the risk is minimal. Who cares if we deploy a new wireless AP or delete some files? The impact is laughable if one computer goes offline. For other organizations, like financial trading or healthcare, the risks of downtime are far greater. Things that could induce downtime, such as patches or changes, must be submitted, analyzed, approved, and implemented in such a way as to ensure there is no downtime and no chance of failure.

Risk behaves this way no matter what we do. Sometimes our risks are hidden because we don’t know everything. Think about bugs in release code, for example. If we upgrade to a new code train to fix an existing bug or implement a new feature, we assume the code passed some QA checks at some point. However, we’re still accepting the risk that the new code will contain a worse bug or force us to deal with new issues down the road. New code upgrades often have even more stringent risk mitigation, such as bake-in periods or redundancy requirements before the code is brought online. Those safeguards are there to protect us.

Invisible Investments In Problems

But what about risks that we don’t know about? What if those risks were minimized in some shady way before we ever had a chance to look for them?

For example, when I had LASIK surgery many years ago, I was handed a pamphlet that the doctor was legally required to give me. I read through the procedure, which included a list of possible side effects and complications. Some of them were downright scary, even with a low percentage chance of occurring. I was told I had to know the risks before proceeding with the surgery. That way, I knew what I was getting into in case one of those complications happened.

Now, legal reasons aside, why would the doctor want to inform me of the risks? If there are significant risks, it makes it more likely that I won’t go through with the procedure. So why say anything at all unless you’re forced to? Many of you might say that the doctor should say something out of the goodness or morality of their own heart, but the fact that a law exists to require disclosure should tell you something about the goodness in people’s hearts.

Medical providers are required to reveal risk. So are financial planners and companies that issue forward-looking statements. But when was the last time a software company disclosed the potential risks in its platform? After all, their equipment or programs could have significant issues if they go offline or are made to break somehow. What if there is a bug that allows someone to hack your system or crash your network? Who assumes the risk?

If your provider doesn’t tell you about the risks, or tries to hide them in the sales or installation process, they’re essentially accepting that unknown risk on your behalf. If they know there is a bug in the code that could cause a hospital core network to melt down, or one that requires rebooting a server every 180 days like we had to do with an unpatched CallManager 6.0 server, then they’ve silently accepted that risk and passed it along to you. And if you think you’re going to be able to sue them or get compensation from them, you really need to read those EULAs that you agree to when you install things.

Risky Responsibility

The question now becomes one of ultimate responsibility. These are the “unknown unknowns”. We can’t ask about things we don’t know about. How could we? So it’s up to the people with knowledge of the risk to disclose it. In turn, that opens them up to some kinds of risk too. If my method of mitigating the risk in your code is to choose not to purchase your product, then you have to know that it was less risky for me to choose that route. Sure, it’s riskier for you to disclose it, but the alternative could lead to a big exposure.

Risk is a balance. To strike the best balance, we need as much information as possible so we can put plans in place to mitigate the risk to our satisfaction. Some risks can’t be entirely mitigated. But failing to disclose risks just to close a sale or get a technology implemented is a huge issue. And if you find out that it happened to you, then you absolutely need to push back on it. Because letting someone else accept risk on your behalf in secret will only lead to problems down the road.


Tom’s Take

Every change I checked into a network during production hours was a risk. Some of them were minor. Others were major. And still others were the kind that burned me. I accepted those risks for myself, but I always made sure to let the customer I was working for know about them. There was no use in hiding information about a change that could take down the network or delete data. Sure, it sometimes meant holding off on a change or finding an alternative method. But the alternative was hurt feelings at best and legal trouble at worst. Risk should never be a surprise.
