The Threat from Within
On May 21, 2014, the accounting director at AFGlobal Corp. in Texas read an email from his CEO asking him to work with the attorney for an outside auditor in “a strictly confidential financial operation.”
The attorney soon contacted the accounting director and said that $480,000 was needed for due diligence costs pursuant to a pending acquisition of a Chinese company. The attorney sent an email with the wiring instructions, which the director followed.
About a week later, the attorney requested $18 million, at which point the accounting director became suspicious and told his supervisors.
It is probably no surprise that the “attorney” was an imposter and the email was not from the CEO.
Risk managers schooled on the threat of social engineering would also guess that the imposter knew a great deal about the company’s processes and procedures.
He in fact knew that the accounting director had a “long-standing, very personal and familiar relationship” with the CEO, according to a lawsuit filed by AFGlobal seeking to force its insurance company to repay it for its losses.
“We are now way past the old style of people trying to crack codes to get in through firewalls. The hacking community realized that the weak point in any defense system is the people element,” said Roger Miles, who teaches risk-related psychology at Cambridge University and the UK Defence Academy.
“Ordinary employees simply do not have insight into the risk coming at them. The hackers are pretty smart at understanding that,” said Miles, who also researches and consults on risk perception, regulatory design and governance.
“Given the option between the effort of hacking code or getting the average employee in the organization to hand you the key, they clearly see a better return on time spent,” he said.
The risk is staggering.
The Experian Data Breach Resolution and Ponemon Institute found that about 80 percent of all data breaches began with employee activity. Verizon’s Data Breach Investigations Report found that of the top five attack patterns behind 95 percent of security breaches, four directly involve employee behavior.
Some employee activities are malicious: the IT employee at the American College of Education who allegedly changed the system password before being fired, then offered to sell it back to the organization for $200,000; or the 20 percent of employees who admitted in a Market Pulse Survey by SailPoint that they would sell their company passwords, some for as little as $150.
But most employees are guilty only of not being wary enough.
“You have to be more pessimistic, more mistrusting and more suspicious of intentions,” said Miles. “That’s not a natural behavior of humans. Social engineering exploits ordinary people’s natural goodwill.”
It doesn’t have to be as blatant as the bogus CEO’s email to his accounting department. And that scam probably didn’t start there. That’s where it ended.
It starts with easy questions, by phone or email, from an apparent co-worker or vendor asking for a name or a title. Then, the hackers dive deeper, pulling together corporate hierarchies, co-worker relationships and personal activities.
Sometimes the emails, apparently from inside the organization, ask users to click on a link to review a file or log on to a training session.
Or it could be an email from a vendor with an attached invoice that has to be paid or a message from a merchant with instructions on when a package is expected to be delivered, said Larry Lidz, chief information security officer at CNA. When the attachments are clicked, the embedded malicious materials are used to access systems.
“If the program isn’t structured correctly, sometimes these types of claims can fall within the cracks.” —Rob Rosenzweig, vice president and national cyber risk practice leader, Risk Strategies Co.
In the day-to-day time crunch at work, employees may not take the time to view such seemingly innocuous emails with suspicion. And once the link is clicked, the hackers are inside the system.
“As far as my experience,” said Austin Berglas, senior managing director and cyber defense practice head at K2 Intelligence, “with few exceptions, the majority of the successful breaches start because a cyber criminal exploits an employee or third party who has connectivity inside the target network.”
He said 11 percent to 15 percent of employees will click on an infected email attachment.
“That’s a pretty significant number,” he said. “It often just takes one.”
Martin Frappolli, senior director of knowledge resources at The Institutes, which provides training for insurance and risk management professionals, said that corporations are sometimes too distracted by the never-ending news of big data breaches and external cyber liability risks to focus on the risks of employee behavior.
“I don’t want to say [employee behavior] is more important [than external risks] but it’s not getting its full share of attention,” he said. “It’s not top of mind for many organizations.”
He cited an example where criminals left USB memory sticks inside various restrooms of a corporation. They were labeled “confidential salary information.” Not surprisingly, employees who found them plugged the USBs into their computers.
That launched programs that captured and transmitted sensitive data to a criminal organization, Frappolli said.
“That’s a really good example of how easy it is to exploit employees,” he said. “Rarely is it deliberate. It’s not malicious. … It is behaviors they could be educated about.”
Cost may be one reason organizations have not focused more on employee training, but, Frappolli said, it’s also a belief that the “threat always seems a little bit more remote than it is.”
Quarterbacking Cyber Risk
Rob Rosenzweig, vice president and national cyber risk practice leader at Risk Strategies Company, said risk managers must take ownership of cyber security.
“In many ways, risk managers are the quarterbacks,” said Rosenzweig, a 2017 Risk & Insurance® Power Broker® winner in the Technology category.
“The coordination of those stakeholders internally often falls on risk management at our clients,” he said. “They are able to drive home to those various stakeholders what the risk is and why everyone should be involved in the process.”
Plus, he said, risk managers are more aware of resources, such as training or other proactive measures, which can be provided either at no cost or at a discounted rate by insurance companies or brokers.
“It’s a win-win for everybody,” Rosenzweig said. “For clients, it’s prevention and for insurers, it makes their clients better risks. I expect to see more in the coming years.”
Prevention, of course, depends on whether the breach was due to carelessness or malfeasance.
“If you have an internal person who knows you really well and has gone over to the dark side for criminal acts, that’s a tough one to deal with,” said Bob Parisi, managing director, Marsh FINPRO.
But companies can make it more difficult for them. It can be as simple as automatically canceling log-in credentials when employees leave or when vendors complete their work, said Berglas of K2 Intelligence.
A survey by SailPoint found that two in five former employees could still access their old company’s computer systems after leaving.
It’s also crucial to segregate data, so that files are available only to employees who need the information to do their jobs, Berglas said. Requiring two-factor authentication, such as an additional password or a thumbprint on an iPhone, is also important, he said.
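The one-time codes behind the two-factor authentication Berglas recommends are typically generated with the TOTP standard (RFC 6238). The sketch below, using only Python’s standard library, shows the mechanics; the function name `totp` is illustrative, not from any product.

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, interval=30, digits=6, now=None):
    """Time-based one-time password per RFC 6238 (SHA-1, the common default).

    secret_b32 is the shared secret in base32, as issued by most
    authenticator apps; `now` defaults to the current Unix time.
    """
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if now is None else now) // interval)
    msg = struct.pack(">Q", counter)            # 8-byte big-endian time step
    digest = hmac.new(key, msg, "sha1").digest()
    offset = digest[-1] & 0x0F                  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because the code is derived from a secret the phisher never sees and expires within the time step, a stolen password alone is not enough to log in.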
Michael Kaiser, executive director of the National Cyber Security Alliance, said companies should determine their “most critical or crown jewel assets. What would harm you the most if stolen, lost or destroyed and how do you build protection layers around that?”
Segregating information, he said, can be more challenging in small to mid-size organizations where responsibilities are more diverse.
According to a survey by Kaspersky Lab and B2B International, intentional fraud by employees in enterprise companies amounts to more than $1.3 million in costs, said Andrey Pozhogin, cyber security expert at Kaspersky Lab, in an email.
For small to medium-size businesses, it results in over $40,000 in costs per incident, on average. The cost of falling for phishing is more than $48,000 per incident, he said.
Since the report was published, Pozhogin said, Kaspersky Lab has seen an eightfold jump in ransomware attacks on companies, which mostly result from phishing.
Beazley reported in January that ransomware attacks quadrupled in 2016 over the previous year, and it expects the attacks to double again in 2017.
According to the FBI, business email compromise, which it defines as sophisticated scams targeting businesses working with foreign suppliers and/or businesses that regularly perform wire transfer payments, affected 7,066 businesses from October 2013 to August 2015, for a total loss to U.S. companies of $750 million.
The loss increases to $1.2 billion when international victims are included.
Pozhogin said the biggest challenges in implementing employee training “are underestimating the risk (both probability and potential impact of a cyber incident) and significant friction when implementing yet another employee education program.”
Tom Dunbar, senior vice president and head of information risk management at XL Catlin, said employees take educational efforts seriously when organizations discuss the consequences and risks of lax cyber security.
“When you demonstrate why you are doing it, why it has meaning, then you get the cooperation,” he said.
Dunbar, who earned a 2014 Risk & Insurance® Risk All Star award for his innovative cyber security work, said his company engages employees by using humorous gamification in its online training to focus on specific cyber security risks, and then it tests them — throughout the year — to keep the messages fresh.
“Given the option between the effort of hacking code or getting the average employee in the organization to hand you the key, they clearly see a better return on time spent.” —Roger Miles, professor of risk-related psychology, Cambridge University and the UK Defence Academy
Last year, it also ran a video campaign on cyber security risks and responses. For every view by an employee, the company donated $1 to charity, he said. His department also uses blogging as well as computer screen savers and wall posters to reinforce the messages.
The training teaches employees, for example, to “mouse over” a link in an email or the firm’s name and address to see if there are clues to a phishing attempt.
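The “mouse over” check, comparing where a link says it goes with where it actually points, can also be automated in mail-filtering tools. Below is a small sketch using only Python’s standard library (the class and function names are illustrative); it flags anchor tags whose visible text is a URL on a different domain than the real destination, a common phishing tell.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAuditor(HTMLParser):
    """Collect (href, visible text) pairs from an HTML email body."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

def mismatched_links(html):
    """Flag links whose visible text looks like a URL on a different
    domain than the actual href destination."""
    auditor = LinkAuditor()
    auditor.feed(html)
    flagged = []
    for href, text in auditor.links:
        if text.startswith(("http://", "https://", "www.")):
            shown = urlparse(text if "://" in text else "http://" + text).hostname
            actual = urlparse(href).hostname
            if shown and actual and shown != actual:
                flagged.append((text, href))
    return flagged
```

A filter like this catches only the crudest lookalike links; it is a complement to, not a substitute for, the employee habit the training builds.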
The company also sends out false emails to employees to see “how many we hooked and how many swam away” from a phishing attempt, Dunbar said. Then it sends out the email again highlighting the elements that should have clued in employees that the email was phony.
It does the same with phone calls: through a third party, it arranges for employees to get a call purporting to be from the help desk or a vendor asking for information.
“It’s really trying to get colleagues to understand that attackers, the phishers, will try to come from any angle,” he said.
Dunbar said his team partners with legal, compliance, HR, marketing and other areas “to make sure we have support and things resonate but the actual program — creating it, designing it, is done by us.”
The phishing “exercises create a lot of awareness,” said John Coletti, chief underwriting officer for cyber and technology at XL Catlin. “They are very highly discussed internally and people will say, ‘Hey, did it get you?’ ”
“A good [training] program,” said Frappolli of The Institutes, “is repeated and updated on a regular basis. It’s a big mistake to do one-time training and then they are off chasing the next fire.”
It’s also a mistake, he said, to give total responsibility for cyber security to the IT department.
“I think it primarily belongs to the risk manager,” Frappolli said. “The risk manager, the HR department and the IT folks should be in lock step on how to educate employees and how to close off threats.”
Companies must also create an environment that is open, said Kaiser of the National Cyber Security Alliance.
“If the response to [clicking on a phishing email] is, ‘How could you be so stupid?’ people aren’t going to tell you,” he said.
And that knowledge is important; without it, hackers could be in a company’s network for months without the company being aware.
Marsh’s Parisi said his favorite testing exercise is when companies send out fake emails and when the link is clicked, the worker’s computer displays a note that the system was taken over by a hacker. After 10 seconds or so, the display changes to a notification that the worker failed the cyber security exercise and must sign up for the next training class.
“I find it odd at times that a lot of companies aren’t as energetic or enthusiastic about training as they are about building the latest and newest firewall,” Parisi said.
Crime or Cyber?
When underwriting policies, XL Catlin’s Coletti looks at not only IT system defenses, but also turnover within key areas and outsourcing changes that may result in disgruntled employees. He also looks at segregation of data access, to ensure that employees are limited to necessary data only.
“I think most of the companies we look at have very good employee training, particularly around phishing campaigns. Companies are extremely sensitive to the fact that phishing is the easiest way for a hacker to get a foothold in your organization,” he said.
When it comes to social engineering schemes, such as when an employee wires funds to an imposter, that can create problems with insurance coverage, he said.
“My sense is that doesn’t sound like cyber coverage to me,” Coletti said. “If I trick you into wiring funds to somebody and you do, I’m not sure why that becomes a cyber claim. There’s nothing cyber about that. … To me, that’s crime coverage.”
“From a coverage perspective,” said Marsh’s Parisi, “it doesn’t matter if a person has criminal motivation or did something stupid. They will cover it. There’s no stupidity exclusion in cyber policies.”
But it depends on the type of loss that results from a social engineering scheme, he said. If it leads to a breach of privacy or data breach, that would be covered by a cyber policy. If the scheme results in an employee transferring funds, then it would be a crime or fidelity policy.
He noted there also is an “intentional acts limitation” in policies that relates to “the control group,” which generally is seen as the C-suite. If a C-suite executive engaged in fraudulent activity, that may not be covered by insurance, while an act by a “rogue employee” would be covered.
In the last 18 months or so, social engineering fraud endorsements have been available for crime coverage, he said.
“Cyber is not a panacea for all things that involve a computer. There are a lot of ways technology can cause loss or harm that isn’t necessarily picked up under a cyber policy,” Parisi said. “But what we have seen is an increasing reluctance of traditional P&C markets to cover cyber-related perils, creating a vacuum that the cyber markets can fill.”
Rosenzweig of Risk Strategies said there “is not a consistent response across the marketplace” as to whether a social engineering claim is covered under a cyber or crime policy.
“There is still a bit of finger-pointing on this,” he said. “That’s a point of frustration for clients.
“If the program isn’t structured correctly, sometimes these types of claims can fall within the cracks.”
Andy Lea, vice president, underwriting for E&O, media and cyber, at CNA, said that while cyber policies are becoming broader as they relate to social engineering and data, the “available policy language and philosophy differ from carrier to carrier, and coverage can be very fact and circumstance specific, if they provide coverage at all.”