Person-First Defense

Written by Madoline Markham
Photo by Beau Gustafson

In an age where something as simple as an email can hold a company’s server hostage, one thing is certain about information security: uncertainty. That’s why UAB Associate Professor Dr. Allen Johnston is working to reduce the question marks that surround the field he studies. What he has found is that the key lies not just in policies but in people.

Traditionally, companies tend to take what Johnston calls a “less mindful approach” to information security, relying on well-articulated policies and procedures. “The problem you see is that there tends to be a moving field because hackers are always innovating, but the policies are static,” Johnston says. “We almost get on autopilot.” With a mindful approach to security, by contrast, employees know they need to be alert and more creative in their approach to solutions.

Another area of concern Johnston sees is the traditional use of “fear appeals”: warning messages sent to employees that say a security breach has occurred, the threat is real, and here is what to do to avoid it. “The fear part doesn’t really work but instead backfires,” Johnston says. “Employees sometimes don’t feel capable, so they put their blinders on.”

Instead, Johnston says the best way to get employees to comply with a warning is still to tell them about the threat, but to focus on encouraging them to trust their training and judgment in how to proceed. “The employee feels more capable to help resolve it,” Johnston says. “That’s what you want—engaged, confident employees.”

Human reaction also comes into play in how corporate policy enforcement affects certain personality types. Many companies enforce sanctions on anyone who violates policy—if you don’t reset your computer password every six months, for instance, you will be locked out of your computer. Johnston has found that certain personalities grow more agitated in response to policies like these, which affects other parts of their job. Before long, disgruntlement that started with having to change a password can bleed into everything they do at work. “Most IT professionals don’t think about the impact of their rules in that way,” Johnston notes.

To investigate this dynamic, Johnston is using app-based games that test people’s behavioral traits, collecting data along the way. Through this game testing, Johnston hopes to identify people who might get inside an organization and let a hacker in. After all, about 75 percent of hacks are the result of insider information being leaked. Deviants tend to keep to themselves and have problems with authority, among other traits.

Johnston’s colleagues Dr. Mark Keith at Brigham Young University and Dr. Sam Thompson at UAB found that when students played games for bonus points, they were more likely to cheat if the rules were relaxed. From this gamification, the researchers looked at the common characteristics of those who cheated and found that they matched those of hackers.

But the researchers aren’t sure yet how these findings could best be applied to real-world scenarios. They could inform the hiring process, but there is a fine line. Companies want to know this information but not necessarily act on it, Johnston says. It is possible to screen for traits that might lead an employee to do something deviant—the key word is might. “It’s easy to do academic research, but applying it to business is new,” Johnston says. “And I would imagine it’s scary too.”

One thing is for sure, Johnston says: Students in UAB’s business school will be familiar with this research upon their graduation and entrance to the local workforce, and Johnston believes they will have a heightened awareness of how to be secure with the technology they use.

Where Bias Meets Innovation

Dr. Allen Johnston collaborates with colleagues, including Dr. Paul Di Gangi, in the Collat School of Business to research how new ideas are vetted. In one study, they asked UAB students about what services they wanted to see from the university, and a group of faculty met to come to a consensus on which of the ideas were most important.

What the researchers initially determined was that in each group of three, an alpha always arose, but it wasn’t always the highest-ranking person or the one who spoke the most. Instead, this person was unconsciously elected.

From this, the researchers concluded that how you construct decision-making teams makes a big impact, and they shared the study’s initial results in presentations to companies like Regions and Protective Life. Who does the vetting, and what biases those people bring, affects how a team finds the best—as opposed to the most popular—idea.