How cyberpsychology helps prevent human errors leading to data leaks

Thursday, February 15, 2024

Hackers know how to manipulate people

According to a Stanford University study, 88% of data security breaches begin with a human error. Cybercriminals know it is much easier to trick a person, and thereby gain access to data, than to break through ever-improving security technology. They use social engineering to get an employee to click a link or open an attachment, letting malware into the system.

It is estimated that a successful ransomware attack occurs every 11 seconds. Activated through inattention or ignorance, the malware takes control of a computer and network, and the hackers demand a ransom in return for unlocking it. Anyone can fall victim to the increasingly sophisticated methods of hackers who send emails impersonating trusted institutions, banks, and couriers.

How can psychology help?

Cybersecurity training is by far the best safeguard against hackers. Nevertheless, training is not always as effective as expected: in the flurry of responsibilities and stress, people quickly forget what they have learned and make errors.

Engineers and psychologists study human-technology interactions to identify sensitive situations and reduce human errors. Based on the latest research, a new concept of "human-centered cybersecurity" is emerging, built on understanding how employees actually use computers. Cyberpsychology combines technical knowledge of cybersecurity with behavioral science to develop new principles for working safely with new technologies.

Following this new discipline, when building a data protection policy, you need to pay attention to several elements:

Change employees' attitudes toward technology. People tend to cede responsibility to technology: "The computer should always work well, and the IT department is responsible for security, so I don't need to bother with hackers." With such an attitude, cybersecurity is seen as a burden because it demands attention rather than routine use of technology on autopilot.

Training should emphasize a different approach, such as zero trust toward incoming emails, any of which may be phishing. This, of course, does not remove the IT department's responsibility to apply technical safeguards. Doctors do not blame the equipment when a stethoscope malfunctions; they know how a working one behaves and can react. The same awareness should apply to IT equipment and systems.
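As a rough illustration of what "zero trust toward incoming email" can mean in practice, the hedged Python sketch below flags two classic phishing signals: a sender domain outside the organization and link text that does not match the actual link target. The domain names and helper function are hypothetical; real filtering is handled by dedicated email security tooling.

```python
import re
from urllib.parse import urlparse

# Hypothetical sketch only: two simple heuristics that reflect a
# "zero trust" stance toward incoming mail. Production filtering is
# handled by dedicated email security tools, not scripts like this.

TRUSTED_DOMAINS = {"hospital.example.org"}  # assumed internal domain

# Matches <a href="...">link text</a> pairs in an HTML body.
LINK_RE = re.compile(r'<a\s+[^>]*href="([^"]+)"[^>]*>(.*?)</a>',
                     re.IGNORECASE | re.DOTALL)


def suspicious_signals(sender: str, html_body: str) -> list[str]:
    """Return reasons why a message deserves extra scrutiny before clicking."""
    reasons = []

    sender_domain = sender.rsplit("@", 1)[-1].lower()
    if sender_domain not in TRUSTED_DOMAINS:
        reasons.append(f"external sender domain: {sender_domain}")

    for href, text in LINK_RE.findall(html_body):
        link_host = urlparse(href).netloc.lower()
        # Link text that looks like an address but points elsewhere is a
        # common phishing pattern ("bank.example.com" hiding another host).
        if "." in text and link_host and link_host not in text:
            reasons.append(f"link text '{text.strip()}' hides target {link_host}")

    return reasons


if __name__ == "__main__":
    body = '<a href="http://malicious.example.net/login">bank.example.com</a>'
    print(suspicious_signals("alerts@bank-example.co", body))
```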

Make employees identify with their IT infrastructure. Risky cyber behavior among employees often stems from treating the computer as the company's equipment rather than their own.

Experiments conducted by Professor Phil Morgan and his team at Cardiff University have shown that personalizing a device increases the sense of responsibility. Professor Morgan spent three years working with Airbus, studying the causes of risky behavior in cyberspace. Across several studies, he concluded that the assumption that "a computer is the company's hardware" automatically shifts responsibility for data protection to the organization and undermines any sense of shared responsibility.

Build a human-centered approach to cybersecurity based on the triad of knowledge, awareness, and understanding. Knowledge is gained through systematic training on data protection and the new methods used by hackers. Understanding is strengthened by creating a culture of learning from mistakes: the organization must communicate that an employee's mistake that leads to a hacker attack will not be grounds for dismissal or punishment but is part of a culture of continuous process improvement.

Such an approach makes employees more willing to report errors quickly, allowing the organization to respond to an incident and minimize its impact. Hiding an incident, on the other hand, increases the harm in proportion to the time that passes before it is discovered.

Professor Morgan argues that instead of seeing a security mistake as something to be ashamed of, we should see it as something normal that happens to people. A culture that encourages error reporting also helps build trust within an organization. Just as reporting medical errors is essential to improving patient safety, reporting security incidents makes it easier to strengthen protection against hackers.

Personalize the user interface so that the computer is treated as a user-friendly tool. People who have fallen victim to cyberattacks often claim they were well aware of the threat beforehand and cannot explain how they fell into such a trap.

The problem may be an incorrectly configured IT system. People forget to follow procedures because, over time, they perform repetitive actions mechanically. That is why systems should remind them, for example, to log out (or do it automatically after a set time), to change their passwords regularly, and so on. Another method is to introduce a few seconds of delay before data can be accessed, or to require additional verification through multi-factor authentication, so that an employee does not act on impulse but has time to think carefully about whether to open an attachment or share data.
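A minimal sketch of that "friction" idea, under the assumption of a hypothetical desktop hook: before a risky action such as opening an external attachment, the user gets a short enforced pause and must confirm explicitly. In a real deployment this would be wired into the mail client or endpoint tooling and use the organization's multi-factor authentication, not a console prompt.

```python
import time

# Hypothetical sketch: a short enforced pause plus an explicit confirmation
# before a risky action, so the employee acts deliberately rather than on
# autopilot. Real systems would integrate with MFA and endpoint tooling.

CONFIRMATION_DELAY_SECONDS = 5


def confirm_risky_action(description: str) -> bool:
    """Pause for a few seconds, then ask the user to confirm the action."""
    print(f"You are about to: {description}")
    print("Take a moment to check the sender, the file name, and the context.")
    time.sleep(CONFIRMATION_DELAY_SECONDS)  # the deliberate few seconds of friction

    answer = input("Type YES to continue, anything else to cancel: ")
    return answer.strip().upper() == "YES"


if __name__ == "__main__":
    if confirm_risky_action("open the attachment 'invoice.zip' from an external sender"):
        print("Proceeding with the action.")
    else:
        print("Action cancelled.")
```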

Where should you start to implement cybersecurity principles?

Cybersecurity training once a year is no longer enough, because protecting data is a process that needs attention every day. Managing data security starts when a new employee is hired and onboarded. People need to know that they are responsible for the computers they work on, and that if they make a mistake, they are encouraged to report any data security incident.

Training should be conducted in groups according to organizational roles, since medical and administrative staff interact with computers differently.

Risks identified by the IT department should be analyzed and communicated. It is worth adding an emotional element so that the information is better remembered, for example by showing a case study of a data leak and its consequences or by citing figures from current reports.

Raw numbers are not enough; use examples that show what happens when an attack occurs, how patients, personnel, and the entire healthcare facility are affected, and how much stress it takes to return to normal operations.

The level of cybersecurity also needs to be monitored by running simulated cyberattacks. One idea is to introduce an element of gamification: a group of employees takes on the role of hackers and is tasked with crafting a malicious phishing message. The IT department then distributes it among the workforce, checks on which computers the email was opened, and analyzes why it happened.
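One way such an exercise can be tracked, sketched here with hypothetical names and only the Python standard library: each employee receives a unique token embedded in the simulated phishing link, and opening the link records who clicked. Real campaigns are usually run with dedicated tools (for example Gophish); this only illustrates the mechanics.

```python
import secrets
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical sketch of the tracking side of a simulated phishing campaign.
# Each employee gets a unique token; opening the link records who clicked.

employees = ["anna.k", "piotr.m", "ewa.z"]   # assumed account names
tokens = {secrets.token_urlsafe(8): name for name in employees}
clicks = {}                                  # token -> employee who clicked


class TrackingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        token = self.path.lstrip("/")
        if token in tokens:
            clicks[token] = tokens[token]
            print(f"Simulated phishing link opened by {tokens[token]}")
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        # The landing page turns the moment of failure into a teaching moment.
        self.wfile.write(b"<h1>This was a phishing simulation. "
                         b"Please review the training material.</h1>")


if __name__ == "__main__":
    for token, name in tokens.items():
        print(f"Link for {name}: http://training.example.local:8080/{token}")
    HTTPServer(("", 8080), TrackingHandler).serve_forever()
```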

Upgrade your cybersecurity standards

The methods used by hackers are evolving rapidly, and cybercriminals now have new tools at their disposal, including artificial intelligence and deepfakes, that let them operate even more effectively. Healthcare entities must therefore keep their own methods up to date to avoid falling into criminals' traps.

Critical data protection incidents resulting from human error are often rooted in stress, routine, negative attitudes toward IT, and weak identification of employees with the healthcare facility. The first line of defense should always be technology: antivirus systems, software updates, and real-time backups. The second is people who are informed about the threats and understand their role in defending against hackers.