I'm very excited to share my latest research on best practices for successfully influencing employee cybersecurity behavior. Excited may not be the right word, exactly, as this research was born out of the disappointment I felt on hearing of security leaders and teams imposing disciplinary sanctions on employees who fail phishing simulations or cybersecurity quizzes, or who fall victim to scams such as business email compromise.
These punishments range from extreme sanctions, such as disciplining or terminating the offenders or victims, to less severe measures, such as forcing employees to sit through more training. While the latter may sound acceptable, employees disagree. The debate over the ethics and effectiveness of these practices has raged on, and it took me a while to put pen to paper because I understand all sides of this dilemma.
This is what I decided: Sure, there is a time and place for disciplinary action, but leaders seem to jump to it too readily. We often fail to see that some of the interventions we put in place reinforce negative perceptions and resentment of security, humiliate employees, cause psychological damage, and encourage employees to hide failures and mistakes. Education and shame are not synonyms. You may win the battle, but the war is much bigger. As a security leader, your bigger opportunity is to engage, influence, and benefit your employees, your organization's customers, and even society. To do this, you need to:
Be aware of the impact of each security intervention. When weighing consequences for negative security behaviors, security leaders often think of extreme punishments like formal disciplinary action or dismissal as deterrents. However, employees also view many well-meaning interventions as punitive, particularly if they overtax employee time and productivity and seem to lack empathy. Tread that fine line between engagement (e.g., quizzes), empathy (e.g., ask-and-listen hours), and punishment (e.g., dismissals).
Start by designing an environment tolerant of human fallibility -- this isn't purely an awareness or training problem. Before proceeding to punishment -- or indeed any sort of intervention -- you need to be very clear that you've done all that you can to support employees who have made a mistake or have become a victim. Your employees fall for scams -- real or simulated -- for many reasons, including: your test or simulation is too difficult to detect; your security awareness training is dull and tedious; you're not helping employees avoid errors; or you failed to design security processes and technologies that stop people from making errors.
Find positive ways to influence good security behavior and creativity. Instead of scaring employees into complying with your security rules, use empathy and recognition to create engagement. Employees who feel empowered can focus on solutions without fear. Forrester's Employee Experience Index shows that empowerment is the most significant predictor of engagement. Initiate positive reporting and messaging (e.g., communicate successes such as "X% completed the exercise this month, up from Y%" and "Clicks are down by Z%, and nonreporting is down by X%") so employees feel encouraged; respond constructively to self-reported mistakes; nudge behaviors toward the correct action; and recognize and reward positive behaviors as they occur. Look to safety culture, where organizations celebrate success and change behavior via initiatives such as incentives, leaderboards, safety moments, and walls of fame.
Choose the appropriate behavior modification action. Outside of gross negligence, employees should never suffer when their employer falls victim to a data breach, cyberattack, fraud, or scam. Before deciding which intervention to use, determine whether your employee is a victim or has been blatantly and repeatedly breaching the rules. Use our severity-versus-repetition framework to segment offenders and create different interventions for each type of offender.
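The severity-versus-repetition idea can be pictured as a simple two-axis decision function. The sketch below is purely illustrative -- Forrester's actual framework, its thresholds, and its intervention labels are not spelled out in this post, so the tiers and the `repeat_threshold` parameter here are assumptions:

```python
from enum import Enum


class Severity(Enum):
    LOW = 1
    HIGH = 2


def segment_offender(severity: Severity, repeat_count: int,
                     repeat_threshold: int = 3) -> str:
    """Place a security lapse on two axes (severity, repetition)
    and return an illustrative intervention tier."""
    repeated = repeat_count >= repeat_threshold
    if severity is Severity.LOW and not repeated:
        # First-time, low-severity lapse: support, don't punish.
        return "support: coaching and process review"
    if severity is Severity.LOW and repeated:
        return "engage: targeted training and follow-up"
    if severity is Severity.HIGH and not repeated:
        return "investigate: root-cause analysis before any sanction"
    # High severity, repeated: the "tough call" quadrant.
    return "escalate: formal review with HR, handled ethically"


# Example: a first-time phishing-simulation click lands in the support tier.
print(segment_offender(Severity.LOW, 1))
# → support: coaching and process review
```

The point of segmenting first is that the response is chosen by quadrant, not by reflex: only the high-severity, repeated quadrant reaches formal discipline, while the other three default to support, engagement, or investigation.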
Make the tough calls when necessary, and always do so ethically. Listening, coaching, and changing processes are all well and good, but at some point, you need to face reality and discipline anyone who has been maliciously flouting the rules. To know when you've reached the point of making the tough call, consider these questions: Is their intent malicious? Are they bypassing processes repeatedly for inappropriate reasons, such as relying on their seniority in the organization? If the answer to either of these is yes, you have every reason to act with ethics, integrity, empathy, candor, and transparency.
My key takeaway? Make empathy your new superpower in all the big and small things that you do. All of this recognition and behavioral change requires you to become a coach -- not a boss -- not only for your team, but also for all employees and stakeholders within your organization. Level up your leadership skills by eliminating passive management practices and fostering a strong coaching mindset. It is through this mindset that the suggestions above will feel less like a chore or a checklist and more like a lifestyle that you and your team can embrace.
As organizations seek to leverage emerging technology and intelligent automation in new ways, employees need to feel they can innovate and experiment in ways consistent with the security and privacy values of the enterprise. But many organizations manage human risk through a model of control, coercion, and punishment -- from penalizing users who fail simulations or training to terminating offenders or victims of breaches. Security programs founded on fear not only drive down employee engagement and inspiration, but also stifle creativity. Instead, organizations must learn how to nurture positive behavior to foster a security culture that deals with human fallibility with positivity instead of distress, reprimand, and shame.
To learn more, register for Forrester's virtual events: Technology & Innovation APAC and Security & Risk.
This post was written by Principal Analyst Jinan Budge, and it originally appeared here.