Data Diving to Improve Comp

More companies are harnessing industry data to cut down claim duration and overall cost.
By: Katie Dwyer | September 14, 2015 • 7 min read

Predictive analytics in the workers’ compensation space is having an evolutionary moment.

Dean Foods, a food and beverage company, first developed an analytical model for its return-to-work program back in 2010.

“The model was very simple at that time. It was basically an indicator of lost time or no lost time,” said Kevin Lutzke, director of risk management at Dean Foods.


The company’s TPA uploaded daily updates to a data warehouse, where return-to-work coordinators could see which claims were indicated as possible lost-time claims and required their attention.

That early model, built in-house by a company actuary, yielded an 80 percent accuracy rate, and “moved the needle” on the company’s transitional duty program, getting more injured workers back in the game faster.
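
The article doesn't disclose the model's internals, but an indicator like this is essentially a binary classifier. A minimal sketch, assuming a logistic regression over basic claim attributes (all file, field and feature names here are hypothetical):

```python
# A minimal sketch of a binary lost-time indicator, assuming a logistic
# regression over basic claim attributes. All file, field and feature
# names are hypothetical; the article does not disclose the model's form.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

claims = pd.read_csv("claims_history.csv")  # hypothetical extract of the TPA's daily feed

# One-hot encode the categorical fields; age and comorbidity count stay numeric.
features = ["worker_age", "injury_type_code", "body_part_code", "comorbidity_count"]
X = pd.get_dummies(claims[features], columns=["injury_type_code", "body_part_code"])
y = claims["lost_time"]  # 1 = claim became lost time, 0 = medical only

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# An indicator like this would be judged on held-out claims; the article
# reports roughly 80 percent accuracy for Dean Foods' early model.
print(f"accuracy: {accuracy_score(y_test, model.predict(X_test)):.0%}")
```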

“It got to a point where everybody was so good at identifying which claims were going to be lost time or not, and the RTW coordinators were able to facilitate transitional duty so well, that we actually changed our culture at Dean,” Lutzke said. “The attitude used to be, ‘They’re not coming back until they’re 100 percent.’ Now we always try to keep people working with transitional duty.”

Three years later, Dean Foods implemented an updated version with third party administrator Helmsman Management Services, which factors in a wider variety of details about each case, including the worker’s age, type of injury, classification as surgical or non-surgical, and comorbidities.

“The attitude used to be, ‘They’re not coming back until they’re 100 percent.’ Now we always try to keep people working with transitional duty.” — Kevin Lutzke, director of risk management, Dean Foods

It also asks for input on social factors, such as the worker’s family situation, personal finances, and the status of his or her relationship with the employer.

Those elements may indicate tendencies toward depression and isolation, which can prolong recovery, or suggest that the worker might purposely lengthen the time away from work.

“Those details are kept confidential, to protect the employee’s privacy,” Lutzke said. “We don’t see that information as the end user, but we receive an overall score from the TPA.”

The more detailed model also allows Dean Foods to expand its applications beyond a simple lost-time indicator. The company now uses its output to measure reserve levels against what it typically spends on a claim of a particular severity — a figure based on Helmsman’s database and book of business.
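
In practice, that benchmarking reduces to comparing each claim's posted reserve against the typical ultimate cost for claims of similar severity. A minimal sketch, with invented severity bands and dollar figures standing in for Helmsman's actual benchmarks:

```python
# Illustrative reserve benchmark: compare each claim's posted reserve to the
# typical spend for its severity band. Bands and dollar figures are invented;
# the real benchmarks come from the TPA's database and book of business.
TYPICAL_SPEND = {  # severity band -> typical ultimate cost (hypothetical)
    "low": 5_000,
    "medium": 25_000,
    "high": 90_000,
}

def reserve_gap(posted_reserve: float, severity: str) -> float:
    """Positive result: reserve exceeds the benchmark; negative: under-reserved."""
    return posted_reserve - TYPICAL_SPEND[severity]

print(reserve_gap(12_000, "medium"))  # -13000.0 -> flag as potentially under-reserved
```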

“We’re only seven or eight months into this, so we don’t have enough information yet,” Lutzke said. “We’ll have to wait a few more months to see if we’re focusing on the right things.”

The evolution of the food company’s predictive model and its applications reflects an industrywide trend. Less than a decade ago, modeling programs for workers’ comp simply spit out a score on a scale of one to 10, ranking a claim’s likelihood of becoming long-term and expensive based on a few factors specific to the injury in question. Standard metrics have traditionally included type of injury, body part injured, age of the worker, gender and comorbidities.

But today’s models draw on a much wider scope of demographic variables that can impact an injured worker’s path to full recovery, such as the worker’s ZIP code, the average income of that area and the level of access to health care institutions.

Some models also incorporate information about prescribers — what type of pain medication they dispense, their general prescribing patterns and who refers patients to that prescriber.
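
A sketch of how such prescriber-level features might be aggregated from pharmacy transaction data; the column names and chosen measures are illustrative assumptions, not any vendor's actual schema:

```python
# Illustrative aggregation of prescriber-level features from pharmacy
# transactions. Column names are hypothetical; real models may also use
# inputs such as referral networks.
import pandas as pd

rx = pd.read_csv("pharmacy_transactions.csv")  # one row per dispensed prescription
prescriber_profile = rx.groupby("prescriber_id").agg(
    total_scripts=("ndc_code", "count"),
    opioid_share=("is_opioid", "mean"),          # fraction of scripts that are opioids
    avg_morphine_equiv=("mme_per_day", "mean"),  # average daily morphine-equivalent dose
)
# These per-prescriber features can then be joined onto each claim record,
# so the claim-level model "knows" who is prescribing.
```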

Companies can adapt a predictive model’s various functions to address more specific areas of interest and pinpoint trouble spots.

For some, a model might highlight chronic overspend on pharmacy costs, while for others it might indicate a need for a better way to triage claims up-front and get the right resources on the case more quickly. Models exist to determine which claims would benefit from nurse case management and which might be subject to subrogation.

Clinical Applications

Helios, for example, launched a pilot predictive analytics program in 2011 that identified claims likely to result in high pharmacy costs, which would then be targeted with specific interventions. Measured against a control group that received no interventions, the pilot group saw a 26 percent total cost reduction, as well as a shorter duration of pharmacy utilization.


“We know there are some prescribers who, if an injured worker with a higher level of severity goes to see them, there’s a correlation with higher long term pharmacy costs and longer claim duration,” said Joe Anderson, director of analytics at Helios.

In a situation where an injured worker has several prescriptions for opioids, Helios launches a fairly non-invasive intervention: sending a letter to the prescribers to alert them of possible drug abuse.

“It can go all the way to medication review, or peer-to-peer intervention,” Anderson said. “Sometimes the doctor will talk to the treating prescriber and figure out how to change the regimen.”
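
The escalation Anderson describes amounts to a tiered rule over a claim's pharmacy activity. A minimal sketch, with invented thresholds (Helios does not publish its actual triggers):

```python
# Illustrative tiered intervention rule for opioid-heavy claims.
# All thresholds are invented for demonstration.
def choose_intervention(opioid_rx_count: int, distinct_prescribers: int) -> str:
    if opioid_rx_count >= 5 or distinct_prescribers >= 3:
        return "peer-to-peer intervention"  # doctor-to-doctor regimen review
    if opioid_rx_count >= 3:
        return "medication review"
    if opioid_rx_count >= 2:
        return "prescriber letter"          # least invasive: alert the prescribers
    return "no action"

print(choose_intervention(opioid_rx_count=2, distinct_prescribers=1))  # prescriber letter
```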


That model has also been enhanced several times.

“Over the past few years, we’ve had an expanded data set with more customers, which gives us opportunities to fill in gaps of information and measure things we weren’t previously able to measure,” Anderson said.

Other models help detect fraud, said Scott Henck, senior vice president and actuary, claim actuarial and advanced analytics, at Chubb Insurance.

Some claims are flagged for potential “soft fraud” if there are indicators that the worker is malingering and simply not progressing at the rate the model predicts he or she should.

Medical codes on the claim bill can also tip off an adjuster when a worker may be taking advantage of workers’ comp care to get treatment for a separate, unrelated injury or illness.

“Codes for significant but unrelated conditions might give us reason to investigate further,” Henck said.
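
Both flags reduce to simple comparisons: observed recovery against the model's prediction, and billed diagnosis codes against those consistent with the compensable injury. A hedged sketch, with an invented tolerance and code set:

```python
# Illustrative "soft fraud" flags. The 1.5x tolerance and the related-code
# set are invented for demonstration.
RELATED_ICD10 = {"S83.2", "M23.2"}  # e.g., codes consistent with a knee injury

def flag_claim(actual_days_out: int, predicted_days_out: int,
               billed_codes: set) -> list:
    flags = []
    if actual_days_out > 1.5 * predicted_days_out:  # recovery far slower than predicted
        flags.append("possible malingering: review progress")
    unrelated = billed_codes - RELATED_ICD10
    if unrelated:                                   # treatment outside the injury's scope
        flags.append(f"unrelated condition codes billed: {sorted(unrelated)}")
    return flags

print(flag_claim(120, 60, {"S83.2", "E11.9"}))  # slow recovery + a diabetes code billed
```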

Chubb’s original model hit the market in 2008, and has since undergone several updates. Currently, the insurer sees average cost savings in the 5 percent to 10 percent range, which stems from identifying fraud early as well as more effectively directing the right resources toward potentially risky claims.

“Sometimes it’s very obvious which claims are high or low severity. It’s more about identifying which claims are likely to develop in their severity and address it early to mitigate the exposure,” Henck said.

Developing targeted strategies early in the claim life cycle is a key benefit, said Stephen Allen, managing director of commercial insurance services at Crystal & Company.

“The strength of modeling is that you know early on that a claim has the potential to be dangerous, and can get a senior adjuster involved to make sure more resources are used for high risk, high severity claims,” Allen said.

“Any time you can get earlier involvement, there’s a much higher likelihood you get a worker back to work quickly,” he said.

Data Depth and Detail

Of course, a model is only as good as the data fed into it. The length and richness of claim history varies from provider to provider, but larger TPAs and insurers with developed tools typically have a large data set with 10 to 15 years’ worth of history.

“The customization and client-specificity are very important in determining what’s predictive and relevant,” said Chris Flatt, leader of Marsh’s workers’ compensation center of excellence.

Origami Risk’s analytics combine data sets from multiple sources: its own aggregate data, the client’s data including claims history, and third-party information such as evidence-based disability guidelines, said Aaron Shapiro, executive vice president of Origami Risk.
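
A sketch of that kind of blend, joining a client's claims to industry aggregates and published disability guidelines; all table and column names are hypothetical, as Origami Risk's actual schema isn't public:

```python
# Illustrative join of the three data sources Shapiro describes. Table and
# column names are hypothetical.
import pandas as pd

client_claims = pd.read_csv("client_claims.csv")       # the client's own history
industry = pd.read_csv("aggregate_benchmarks.csv")     # pooled cross-client data
guidelines = pd.read_csv("disability_guidelines.csv")  # evidence-based durations

enriched = (client_claims
            .merge(industry, on=["injury_type", "body_part"], how="left")
            .merge(guidelines, on="injury_type", how="left"))
# 'enriched' now carries benchmark cost and guideline duration alongside
# each claim, ready for scoring or outlier reporting.
```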

Henck said that data sources could even expand to include information gleaned from social media.

Expansion of data sources “should be able to increase the depth and precision of solutions as well as open up possibilities for new solutions,” he said.


One challenge that remains for predictive models is making their output easily consumable and actionable by a broad base of users, including claims adjusters, insurers, data scientists and risk managers.

“We present metrics on the individual claim that indicate how quickly a particular injured worker should be back to work,” Shapiro said. “We also produce trend lines based on an aggregate of claims so that an individual case can be compared to the average claim duration and cost for a particular injury.” That allows users to identify outliers and intervention opportunities.
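
The outlier comparison Shapiro describes can be as simple as checking each open claim against the aggregate trend for its injury type. A minimal sketch, with an invented 1.5x threshold:

```python
# Illustrative outlier check: compare each open claim's duration against the
# aggregate trend for its injury type. The 1.5x threshold is invented.
import pandas as pd

closed = pd.read_csv("closed_claims.csv")
trend = closed.groupby("injury_type")["days_to_rtw"].mean().rename("avg_days")

open_claims = pd.read_csv("open_claims.csv").join(trend, on="injury_type")
outliers = open_claims[open_claims["days_open"] > 1.5 * open_claims["avg_days"]]
print(outliers[["claim_id", "injury_type", "days_open", "avg_days"]])
```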

“We present metrics on the individual claim that indicate how quickly a particular injured worker should be back to work.” — Aaron Shapiro, executive vice president of Origami Risk.

Interventions implemented based on a model’s output might involve deciding who would benefit from nurse case management, sending a letter or otherwise intervening in the treatment plan set forth by a prescriber, or adjusting reserve levels.


“Analytics is still a very complex subject and there’s still a lot of confusion in the marketplace, due to different terms being used in different ways,” Helios’ Anderson said.

The varied availability of data and the many ways analytics can be used likely add to the confusion. Savings can also be hard to quantify, because doing so involves estimating costs that were never actually incurred.

But given the pace at which companies have developed analytical programs, the role of predictive analytics in workers’ comp seems bound to grow.

Katie Dwyer is an associate editor at Risk & Insurance®. She can be reached at [email protected]

More from Risk & Insurance


Risk Focus: Cyber

Expanding Cyber BI

Cyber business interruption insurance is a thriving market, but growth carries the threat of a mega-loss. 
By: Graham Buck | March 5, 2018 • 7 min read

Lingering hopes that a large-scale cyber attack might be a once-in-a-lifetime event were dashed last year. The four-day WannaCry ransomware strike in May across 150 countries targeted more than 300,000 computers running Microsoft Windows. A month later, NotPetya hit multinationals ranging from Danish shipping firm Maersk to pharmaceutical giant Merck.


Maersk’s chairman, Jim Hagemann Snabe, revealed at this year’s Davos summit that NotPetya shut down most of the group’s network. While it was replacing 45,000 PCs and 4,000 servers, freight transactions had to be completed manually. The combined cost of business interruption and rebuilding the system was up to $300 million.

Merck’s CFO Robert Davis told investors that its NotPetya bill included $135 million in lost sales plus $175 million in additional costs. Fellow victims FedEx and French construction group Saint Gobain reported similar financial hits from lost business and clean-up costs.

The fast-expanding world of cryptocurrencies is also increasingly targeted. Echoes of the 2014 hack that triggered the collapse of Bitcoin exchange Mt. Gox emerged this January when Japanese cryptocurrency exchange Coincheck pledged to repay customers $500 million stolen by hackers in a cyber heist.

The size and scope of last summer’s attacks accelerated discussions on both sides of the Atlantic between risk managers and brokers seeking more comprehensive cyber business interruption insurance products.

It also recently persuaded Pool Re, the UK’s terrorism reinsurance pool set up 25 years ago after bomb attacks in London’s financial quarter, to announce that from April its cover will extend to include material damage and direct BI resulting from acts of terrorism using a cyber trigger.

“The threat from a cyber attack is evident, and businesses have become increasingly concerned about the extensive repercussions these types of attacks could have on them,” said Pool Re’s chief, Julian Enoizi. “This was a clear gap in our coverage which left businesses potentially exposed.”

Shifting Focus

Development of cyber BI insurance to date reveals something of a transatlantic divide, said Hans Allnutt, head of cyber and data risk at international law firm DAC Beachcroft. The first U.S. mainstream cyber insurance products were a response to California’s data security and breach notification legislation in 2003.


Europe’s first cyber policies, of more recent vintage, initially reflected U.S. wordings, with the focus on data breaches. “So underwriters had to innovate and push hard on other areas of cyber cover, particularly BI and cyber crimes such as ransomware demands and distributed denial of service attacks,” said Allnutt.

“Europe now has regulation coming up this May in the form of the General Data Protection Regulation across the EU, so the focus has essentially come full circle.”

Cyber insurance policies also provide a degree of cover for BI resulting from one of three main triggers, said Jimaan Sané, technology underwriter for specialist insurer Beazley. “First is the malicious-type trigger, where the system goes down or an outage results directly from a hack.

“Second is any incident involving negligence — the so-called ‘fat finger’ — where human or operational error causes a loss or there has been failure to upgrade or maintain the system. Third is any broader unplanned outage that hits either the company or anyone on which it relies, such as a service provider.”

The importance of cyber BI covering negligent acts, in addition to phishing and social engineering attacks, was underlined by the IT meltdown British Airways suffered last May.

This was triggered by a technician who switched off and then reconnected the power supply to BA’s data center, physically damaging servers and distribution panels.

Compensating delayed passengers cost the company around $80 million, although the bill fell short of the $461 million operational error loss suffered by Knight Capital in 2012, which pushed it close to bankruptcy and decimated its share price.

Mistaken Assumption

Awareness of potentially huge BI losses resulting from cyber attack was heightened by well-publicized hacks suffered by retailers such as Target and Home Depot in late 2013 and 2014, said Matt Kletzli, SVP and head of management liability at Victor O. Schinnerer & Company.


However, the incidents didn’t initially alarm smaller, less high-profile businesses, which assumed they wouldn’t be similarly targeted.

“But perpetrators employing bots and ransomware set out to expose any firms with weaknesses in their system,” he added.

“Suddenly, smaller firms found that even when they weren’t themselves targeted, many of those around them had fallen victim to attacks. Awareness started to lift, as the focus moved from large, headline-grabbing attacks to more everyday incidents.”

Publications such as the Director’s Handbook of Cyber-Risk Oversight, issued by the National Association of Corporate Directors and the Internet Security Alliance, fixed the issue firmly on boardroom agendas.

“What’s possibly of greater concern is the sheer number of different businesses that can be affected by a single cyber attack and the cost of getting them up and running again quickly.” — Jimaan Sané, technology underwriter, Beazley

Reformed ex-hackers were recruited to offer board members their insights into the most vulnerable points across the company’s systems — in much the same way as forger-turned-security-expert Frank Abagnale Jr., subject of the Spielberg biopic “Catch Me If You Can.”

There also has been an increasing focus on systemic risk related to cyber attacks. Allnutt cites “Business Blackout,” a July 2015 study by Lloyd’s of London and Cambridge University’s Centre for Risk Studies.

This detailed analysis of what could result from a major cyber attack on America’s power grid predicted a cost to the U.S. economy of hundreds of billions of dollars and claims to the insurance industry totaling upwards of $21.4 billion.

Lloyd’s described the scenario as both “technologically possible” and “improbable.” Three years on, however, it appears less fanciful.

In January, the head of the UK’s National Cyber Security Centre, Ciaran Martin, said the UK had so far been fortunate in averting a ‘category one’ (C1) attack. A C1 would shut down the financial services sector, on which the country relies heavily, and other vital infrastructure. It was a case of “when, not if” such an assault would be launched, he warned.

AI: Friend or Foe?

Despite daunting potential financial losses, pioneers of cyber BI insurance such as Beazley, Zurich, AIG and Chubb now see new competitors in the market. Capacity is growing steadily, said Allnutt.

“Not only is cyber insurance a new product, it also offers a new source of premium revenue, so there is considerable appetite for taking it on,” he added. “However, whilst most insurers are comfortable with the liability aspects of cyber risk, not all insurers are covering loss of income.”


Kletzli added that available products include several well-written, broad cyber coverages that take into account all types of potential cyber attack and don’t attempt to limit cover by applying a narrow definition of BI loss.

“It’s a rapidly-evolving coverage — and needs to be — in order to keep up with changing circumstances,” he said.

The good news, according to a Fitch report, is that the cyber loss ratio has been reduced to 45 percent as more companies buy cover and the market continues to expand, bringing down the size of the average loss.
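
For context, a loss ratio is simply incurred losses divided by earned premium, so a 45 percent ratio means insurers are paying out 45 cents in losses for every premium dollar collected (the figures below are invented):

```python
# Loss ratio = incurred losses / earned premium. Figures are invented.
incurred_losses = 45_000_000
earned_premium = 100_000_000
print(f"loss ratio: {incurred_losses / earned_premium:.0%}")  # 45%
```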

“The bad news is that at cyber events, talk is regularly turning to ‘what will be the Hurricane Katrina-type event’ for the cyber market,” said Kletzli.

“What’s worse is that with hurricane losses, underwriters know which regions are most at risk, whereas cyber is a global risk and insurers potentially face huge aggregation.”


Nor is the advent of robotics and artificial intelligence (AI) necessarily cause for optimism. As Allnutt noted, while AI can potentially be used to decode malware, by the same token sophisticated criminals can employ it to develop new malware and escalate the ‘computer versus computer’ battle.

“The trend towards greater automation of business means that we can expect more incidents involving loss of income,” said Sané. “What’s possibly of greater concern is the sheer number of different businesses that can be affected by a single cyber attack and the cost of getting them up and running again quickly.

“We’re likely to see a growing number of attacks where the aim is to cause disruption, rather than demand a ransom.

“The paradox of cyber BI is that the more sophisticated your organization and the more it embraces automation, the bigger the potential impact when an outage does occur. Those old-fashioned businesses still reliant on traditional processes generally aren’t affected as much and incur smaller losses.”

Graham Buck is editor of He can be reached at