The human element of cybersecurity

If humans aren’t included in the design process of new technologies and we don’t train analysts to work together, why are we still blaming humans for our cybersecurity woes?

In this Aug. 16, 2016, file photo, a worker is silhouetted against a computer display showing a live visualization of online phishing and fraudulent phone calls across China during the 4th China Internet Security Conference (ISC) in Beijing. | Ng Han Guan

If you’re inclined to think of cybersecurity as lending itself to clean, elegant, better-than-human, extremely secure solutions, you probably don’t work in the field.

But one bias held by many in information security is that humans — not hackers, shoddy software, or poorly built devices — are the source of the vast majority of our digital vulnerabilities. Why expend the time and energy to hack into a heavily guarded system, security experts might opine, if you can simply trick a user into clicking a link laden with malware?

If businesses didn’t have to deal with the “end user” (that is, you and me), this reasoning goes, all our problems would be solved.

This quiet bias against users runs through nearly every conversation about cybersecurity. Unfortunately, it means that humans have become an afterthought in the design of our technology. Unraveling that entanglement is part of what makes cybersecurity a “wicked problem,” or a problem that resists resolution and can’t be solved without a multidisciplinary approach.

Just because a problem resists resolution doesn’t mean that it can’t be broken down into smaller parts in order to make progress on the whole.

That’s where the work of Nancy Cooke, a professor of human systems engineering at Arizona State University, comes in. By remembering people in the design of our technology and by training cybersecurity analysts to work together, we can put people back into our digital security tools and make the world safer.

The ghost in the machine

As we develop new technology, we often begin with a novel technological approach instead of a focus on who is supposed to be using our new tool.

“It’s a mess of technology that’s out there with good intentions, but doesn’t work well with humans,” says Dr. Cooke.

Cooke advocates that developers include users in the planning stages as they design and build new security technologies.

Her colleague, Jamie Winterton, director of strategy at ASU’s Global Security Initiative, concurs.

“At first, engineers think, ‘This would be a really great technological solution to this problem,’ and then rush forward and build it. Then we say, ‘Now, how do we make it secure?’ But it’s a lot harder to secure something after you’ve already built it than if you start to think about security and the way that real people are going to use the technology in the design process,” says Ms. Winterton.

The human element that does seep into our current development process? Implicit biases. Machine-learning algorithms, for example, can reflect the biases of the engineers who write them or biases within the training data fed to them.

“We like to romanticize this idea of machines being pure and perfect, but we are in the machines because we made them,” says Winterton.
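
Winterton’s point is concrete: a model trained on skewed data reproduces the skew. The minimal Python sketch below (the dataset, group labels, and predicted_risk function are invented for illustration, not drawn from the article or any real system) shows how a group that is overrepresented among “fraud” labels in training data ends up flagged as high risk by the model itself.

```python
# Hypothetical sketch of how bias in training data surfaces in predictions.
# The "model" is just per-group fraud rates learned from a skewed sample --
# no real library, dataset, or production system is implied.
from collections import defaultdict

# Invented training data: (group, is_fraud). Group B is overrepresented
# among fraud labels purely because of how the sample was collected.
training_data = [
    ("A", False), ("A", False), ("A", False), ("A", True),
    ("B", True), ("B", True), ("B", False), ("B", True),
]

# "Training": tally fraud counts per group.
counts = defaultdict(lambda: [0, 0])  # group -> [fraud_count, total_count]
for group, is_fraud in training_data:
    counts[group][0] += int(is_fraud)
    counts[group][1] += 1

def predicted_risk(group: str) -> float:
    """Return the learned fraud rate for a group (0.0 if unseen)."""
    fraud, total = counts[group]
    return fraud / total if total else 0.0

# The model now scores any member of group B as high risk, reflecting
# the skew in the sample rather than anything about the individual.
for g in ("A", "B"):
    print(f"group {g}: predicted risk {predicted_risk(g):.2f}")
```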

Lone wolves need packs

In defending against digital threats, too, thinking about how to make technology and humans work well together would do immense good. Many cyber analysts work alone and are not incentivized to work in teams, says Cooke. That can be a weakness: according to Cooke’s research, lone wolves can’t address complex problems nearly as well as a team with varied backgrounds can.

And even when there are teams, as in the capture-the-flag competitions that have become popular with public and private cybersecurity recruiters, Cooke says that the work students and competitors do in these competitions is closer to individual work than it is to teamwork.

The distinction is that sitting close to coworkers does not guarantee communication among members, while a true team approaches a problem more deliberately, assigning tasks based on individual strengths and clearly defining roles in order to collaborate more effectively.

“It’s a much tighter kind of collaboration,” says Cooke. “You can put people together and you will get a group, but it doesn’t necessarily make them a good team.”

Some of the “lone wolf” mentality in cybersecurity is cultural: the mythos of the individual who can hack through difficult challenges alone is the stuff of hacker lore. Some of it is self-selection: the kinds of people who enjoy long hours of coding tend to be less team-oriented, says Cooke.

Moving forward

Cybersecurity itself is a thorny, knotted issue, but both Cooke and Winterton say that including the end user in the design process from the beginning is one important way to reduce or eliminate the flaws that appear when a device or piece of software is put into a user’s hands. Other solutions include incentivizing collaboration and better training for analysts.

“Broad groups of stakeholders should be engaged to make sure systems are secure, fair, and as useful as they can be,” says Winterton.

Incorporating the human element will reduce the risk inherent in cybersecurity systems and build a stronger framework for the development of new technology in the future.
