
How to embed cybersecurity into the user experience of your operators

July 13, 2022
"No operator is going to speak up in a conference room and tell the CEO that having to type in a password is a problem."

A mature cybersecurity program has established leadership, formalized policies, procedures, and processes. It promotes awareness and is continuously improving. Much like a safety program is designed to protect workers from injury, illness, or death, a cybersecurity program protects people, assets, and data from harmful cyber-threats. Just as operators know to wear steel-toed boots, hard hats, and safety glasses on the manufacturing floor, protecting systems and data should be a matter of habit.


Unfortunately, we often treat cybersecurity as a stand-alone issue rather than something that needs to be integrated into an operator’s workflow. When our people are already being asked to do more with less and we then layer on cybersecurity protocols, we create inefficiencies and resistance to secure practices.

Cybersecurity professionals often say that company culture needs to start with leadership. That is true, but it’s not the company leadership that stares at a system screen for an eight-hour shift. To truly make cybersecurity part of the company culture we need to focus on the user experience of the operators on the floor when implementing security tools and controls. If the tool or process is too difficult or time-consuming to use or follow, operators will find alternative methods that could potentially create more risks. 

Reframe the perspective 

Cybersecurity tools and technologies are often designed to address specific security requirements or perform a critical function. How the tool or technology will actually be used by security professionals, employees, and operators is seldom considered.

When usability is not considered, the person behind the keyboard may create greater risk for an organization by working around controls that are perceived to make tasks more difficult or to prevent efficient work. This is a common occurrence when cybersecurity protections limit a user’s access or permissions, which can impact an operator’s ability to do her job.

Employees represent both value and risk to an organization. By focusing on usability, we increase the organization’s value by enabling workers to perform their jobs securely.

We need to remember that our people are also our first—and deepest—line of defense. Instead of restricting them, we need to find ways to enable them. Users of cybersecurity tools or technologies should be an active part of the process when designing, selecting, and implementing solutions. Understanding how job performance will be affected once controls are implemented is critical to ensure the new control will work.   

It is important to understand what users need in order to do their jobs better, so that security measures can be built into the workflow. This means truly understanding the challenges they face throughout their shifts. 

As an example, let us imagine that a plant operator is at the end of her shift when the system starts to trigger alarms. With alarm bells ringing, the operator attempts to log in. Security practices call for a 16-character password with special characters, and in her haste, our operator mistypes the password. Then she mistypes it again. At this point she is tired, she is stressed because the system is going haywire, and she can’t log in to do her job.

When the operator is locked out of the system because of several mistyped passwords, the organization’s efforts to protect itself have backfired. Yes, this is an extreme and oversimplified example, but we can be sure similar real-world examples exist.

Use technology to enable the process 

No operator is going to speak up in a conference room and tell the CEO that having to type in a password is a problem. Yet when we look at real-world situations, we see how requiring strong passwords can prevent an operator from working efficiently. 

That insight could allow the organization to alter its security practices to use a fingerprint scan, PIN, or smartcard to give the user access. Those authentication options can be just as secure, and they are far more efficient for the operator.

We also need to remember that cybersecurity goes beyond strong passwords and guarding against malicious links in emails. Cybersecurity is a process of risk management that helps identify—and mitigate—threats before they become real problems. To do that successfully, we need the entire organization to take responsibility for cybersecurity, rather than putting the onus on operators on the floor.

As another example, any industrial-automation system can be programmed to provide important self-troubleshooting information and display error logs, which enables technicians to quickly understand whether an error is a malicious attack or merely a faulty part. The logic knows why it is or is not doing something, so why isn’t that communicated to the operator as well?

Most systems are programmed as black boxes that require someone to log on to the controllers and tell operations why something isn’t working. All computers and network hardware have status files that can easily be exposed on an HMI screen so anyone can see if there are anomalies. We don’t need expensive artificial-intelligence-driven tools; we simply need to unlock the data that already exists in our control systems. Most operations folks are unaware these displays can exist because the system has not been configured to show them.
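As a minimal sketch of what "unlocking the data" might look like, consider polling each controller's existing status word and translating it into plain-language messages an HMI diagnostics page could display. The `read_status_word` function and the bit meanings below are hypothetical placeholders for whatever driver and diagnostics a given control system already provides (OPC UA, Modbus, a vendor API); they are illustrative assumptions, not a description of any particular product.

```python
# Sketch: translate controller status words that already exist into
# plain-language messages an HMI diagnostics page could display.
# `read_status_word` is a hypothetical stand-in for the site's actual
# driver call (OPC UA, Modbus, vendor API, etc.).

import random
import time

# Map individual status bits to operator-readable explanations.
STATUS_BITS = {
    0: "Interlock open - check guard doors",
    1: "Sensor fault - drive feedback lost",
    2: "Comms timeout - remote I/O rack not responding",
    3: "Unexpected logon attempt on controller",  # possible security event
}

def read_status_word(device: str) -> int:
    """Hypothetical read; replace with the real driver call for `device`."""
    return random.getrandbits(4)  # simulated status bits for this sketch

def explain(status: int) -> list[str]:
    """Turn raw status bits into the messages an operator actually needs."""
    return [msg for bit, msg in STATUS_BITS.items() if status & (1 << bit)]

def poll(devices: list[str], interval_s: float = 5.0) -> None:
    """Poll each device and print what an HMI diagnostics page would show."""
    while True:
        for device in devices:
            messages = explain(read_status_word(device))
            if messages:
                for msg in messages:
                    print(f"{device}: {msg}")
            else:
                print(f"{device}: OK")
        time.sleep(interval_s)

if __name__ == "__main__":
    poll(["PLC-01", "PLC-02"])
```

The value here is not the code itself; it is the design choice of surfacing diagnostic data the controller already produces in the operator's own language, rather than waiting for someone with controller access to interpret it.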

Often, the root cause is siloed departments: a big divide between IT and OT, which are not working in concert with each other for the good of the organization.

Users are the key stakeholders

It is critical that we allow departments to focus on what they are good at. IT is great at servers, Ethernet networks, and databases. OT must leverage that knowledge, relieve itself of that burden, and focus on its own challenges: operational improvement, equipment maintenance and obsolescence, and so on.

However, when it comes to cybersecurity, we must be all-inclusive; we must be tightly orchestrated to balance effectiveness with risk mitigation.

It is unclear to us why the folks most impacted by cybersecurity measures are rarely included as key stakeholders. What we do know is that if you want a truly well-managed and secure environment, the operators must be at the table from the beginning. We’ve found that most operators are aware of these issues simply from watching news reports. They are eager to help and want to see their organizations implement protective measures. They must be at the table to work out the most effective plans.

If we know that our people are our biggest risk, then we must include them. Usability considers how well a system or process can be used or adopted by the actual user. If we implement cybersecurity in a manner that doesn’t factor in usability, then adoption will be slow or non-existent. Users will seek work-arounds that are less secure and create more risk for their organizations.

By Dean Ford, CAP, PE, COO and managing principal engineer of Luminary Automation, Cybersecurity and Engineering, member of the Control System Integrators Association (CSIA), and Don Wells, CEO and managing principal of Luminary Automation, Cybersecurity and Engineering.