Analytics

Data Diving to Improve Comp

More companies are harnessing industry data to cut down claim duration and overall cost.
By: Katie Dwyer | September 14, 2015

Predictive analytics in the workers’ compensation space is having an evolutionary moment.

Dean Foods, a food and beverage company, first developed an analytical model for its return-to-work program back in 2010.

“The model was very simple at that time. It was basically an indicator of lost time or no lost time,” said Kevin Lutzke, director of risk management at Dean Foods.

The company’s TPA uploaded daily updates to a data warehouse, where return-to-work coordinators could see which claims were indicated as possible lost-time claims and required their attention.

That early model, built in-house by a company actuary, yielded an 80 percent accuracy rate, and “moved the needle” on the company’s transitional duty program, getting more injured workers back in the game faster.

“It got to a point where everybody was so good at identifying which claims were going to be lost time or not, and the RTW coordinators were able to facilitate transitional duty so well, that we actually changed our culture at Dean,” Lutzke said. “The attitude used to be, ‘They’re not coming back until they’re 100 percent.’ Now we always try to keep people working with transitional duty.”

Three years later, Dean Foods implemented an updated version with third-party administrator Helmsman Management Services, which factors in a wider variety of details about each case, including the worker’s age, type of injury, classification as surgical or non-surgical, and comorbidities.


It also asks for input on social factors, such as the worker’s family situation, personal finances, and the status of his or her relationship with the employer.

Those elements may indicate tendencies toward depression and isolation, which can prolong recovery, or suggest that the worker might purposely lengthen time away from work.

“Those details are kept confidential, to protect the employee’s privacy,” Lutzke said. “We don’t see that information as the end user, but we receive an overall score from the TPA.”

The more detailed model also allows Dean Foods to expand its applications beyond a simple lost-time indicator. The company now uses its output to measure reserve levels against what it typically spends on a claim of a particular severity — a figure based on Helmsman’s database and book of business.
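The reserve check described above amounts to comparing a claim’s posted reserve against a typical-spend benchmark for its severity band. A minimal sketch, with entirely invented benchmark figures:

```python
# Illustrative only: benchmark figures and severity bands are hypothetical,
# standing in for values an insurer would derive from its own book of business.
typical_spend = {"low": 5_000, "medium": 25_000, "high": 90_000}

def reserve_variance(reserve, severity):
    """Return the posted reserve as a fraction of the typical spend
    for that severity band (1.0 means reserved exactly at benchmark)."""
    return reserve / typical_spend[severity]

print(reserve_variance(30_000, "medium"))  # 1.2 -> reserved 20% above benchmark
```

A ratio well above or below 1.0 would prompt a closer look at whether the reserve reflects how the claim is actually developing.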

“We’re only seven or eight months into this, so we don’t have enough information yet,” Lutzke said. “We’ll have to wait a few more months to see if we’re focusing on the right things.”

The evolution of the food company’s predictive model and its applications reflects an industrywide trend. Less than a decade ago, modeling programs for workers’ comp simply spit out a score on a scale of one to 10, ranking a claim’s likelihood of becoming long-term and expensive based on a few factors specific to the injury in question. Standard metrics have traditionally included type of injury, body part injured, age of the worker, gender and comorbidities.

But today’s models include a much wider scope of demographic variables that can impact an injured worker’s path to full recovery, such as the worker’s ZIP code, the average income of that area and the level of access to health care institutions.
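The scoring approach the last few paragraphs describe can be sketched in a few lines. Every feature name, weight and threshold below is invented for illustration; a production model would be fit statistically to years of claim history rather than hand-coded.

```python
# Hypothetical sketch of a 1-10 claim risk score combining traditional
# injury factors with newer demographic inputs. Weights are invented.
def score_claim(injury_type, body_part, age, comorbidities, zip_median_income):
    """Return a 1-10 score ranking a claim's likelihood of becoming
    long-term and expensive."""
    score = 1.0
    # Traditional injury-specific factors
    if injury_type in {"back strain", "shoulder tear"}:
        score += 3
    if body_part in {"back", "knee", "shoulder"}:
        score += 1
    if age >= 55:
        score += 2
    score += min(len(comorbidities), 3)  # cap the comorbidity contribution
    # Newer demographic factor: low area income as a proxy for limited care access
    if zip_median_income < 40_000:
        score += 1
    return min(round(score), 10)

print(score_claim("back strain", "back", 58, ["diabetes"], 35_000))  # prints 9
print(score_claim("laceration", "hand", 30, [], 60_000))             # prints 1
```

The point of the sketch is the shape of the inputs, not the weights: the same claim scores very differently once demographic context is layered onto the injury facts.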

Some models also incorporate information about prescribers — what type of pain medication they dispense, their general prescribing patterns and who refers patients to that prescriber.

Companies can adapt a predictive model’s various functions to address more specific areas of interest and pinpoint trouble spots.

For some, a model might highlight chronic overspend on pharmacy costs, while for others it might indicate a need for a better way to triage claims up-front and get the right resources on the case more quickly. Models exist to determine which claims would benefit from nurse case management and which might be subject to subrogation.

Clinical Applications

Helios, for example, launched a pilot predictive analytics program in 2011 that identified claims likely to result in high pharmacy costs, which would then be targeted with specific interventions. Measured against a control group that received no interventions, the pilot group saw a 26 percent total cost reduction, as well as a shorter duration of pharmacy utilization.

Joe Anderson, director of analytics, Helios

“We know there are some prescribers who, if an injured worker with a higher level of severity goes to see them, there’s a correlation with higher long-term pharmacy costs and longer claim duration,” said Joe Anderson, director of analytics at Helios.

In a situation where an injured worker has several prescriptions for opioids, Helios launches a fairly non-invasive intervention: sending a letter to the prescribers to alert them to possible drug abuse.

“It can go all the way to medication review, or peer-to-peer intervention,” Anderson said. “Sometimes the doctor will talk to the treating prescriber and figure out how to change the regimen.”

That model has also been enhanced several times.

“Over the past few years, we’ve had an expanded data set with more customers, which gives us opportunities to fill in gaps of information and measure things we weren’t previously able to measure,” Anderson said.

Other models help detect fraud, said Scott Henck, senior vice president and actuary, claim actuarial and advanced analytics, at Chubb Insurance.

Some claims are flagged for potential “soft fraud” if there are indicators that the worker is malingering and simply not progressing at the rate the model predicts he or she should.

Medical codes on the claim bill can also tip off an adjuster when a worker may be taking advantage of workers’ comp care to get treatment for a separate, unrelated injury or illness.

“Codes for significant but unrelated conditions might give us reason to investigate further,” Henck said.

Chubb’s original model hit the market in 2008, and has since undergone several updates. Currently, the insurer sees average cost savings in the 5 percent to 10 percent range, which stems from identifying fraud early as well as more effectively directing the right resources toward potentially risky claims.

“Sometimes it’s very obvious which claims are high or low severity. It’s more about identifying which claims are likely to develop in their severity and address it early to mitigate the exposure,” Henck said.

Developing targeted strategies early in the claim life cycle is a key benefit, said Stephen Allen, managing director of commercial insurance services at Crystal & Company.

“The strength of modeling is that you know early on that a claim has the potential to be dangerous, and can get a senior adjuster involved to make sure more resources are used for high risk, high severity claims,” Allen said.

“Any time you can get earlier involvement, there’s a much higher likelihood you get a worker back to work quickly,” he said.

Data Depth and Detail

Of course, a model is only as good as the data fed into it. The length and richness of claim history varies from provider to provider, but larger TPAs and insurers with developed tools typically have a large data set with 10 to 15 years’ worth of history.

“The customization and client-specificity are very important in determining what’s predictive and relevant,” said Chris Flatt, leader of Marsh’s workers’ compensation center of excellence.

Origami Risk’s analytics combine data sets from multiple sources: its own aggregate data, the client’s data including claims history, and third-party information such as evidence-based disability guidelines, said Aaron Shapiro, executive vice president of Origami Risk.

Henck said that data sources could even expand to include information gleaned from social media.

Expansion of data sources “should be able to increase the depth and precision of solutions as well as open up possibilities for new solutions,” he said.

Interventions

One challenge that remains for predictive analytics models is making the data output easily consumable and actionable for a broad user base of claims adjusters, insurers, data scientists and risk managers.

“We present metrics on the individual claim that indicate how quickly a particular injured worker should be back to work,” Shapiro said. “We also produce trend lines based on an aggregate of claims so that an individual case can be compared to the average claim duration and cost for a particular injury.” That allows users to identify outliers and intervention opportunities.
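The outlier check Shapiro describes — comparing an individual claim against the aggregate trend for its injury type — can be sketched simply. The claim data below are invented for illustration:

```python
# Hypothetical sketch: flag claims open far longer than the average
# duration for their injury type. Claim records are invented.
from statistics import mean

claims = [
    {"id": "C1", "injury": "back strain", "days_open": 30},
    {"id": "C2", "injury": "back strain", "days_open": 95},
    {"id": "C3", "injury": "back strain", "days_open": 25},
    {"id": "C4", "injury": "laceration",  "days_open": 10},
]

def flag_outliers(claims, threshold=1.5):
    """Return IDs of claims open longer than `threshold` times the
    average duration for their injury type."""
    by_injury = {}
    for c in claims:
        by_injury.setdefault(c["injury"], []).append(c["days_open"])
    averages = {k: mean(v) for k, v in by_injury.items()}
    return [c["id"] for c in claims
            if c["days_open"] > threshold * averages[c["injury"]]]

print(flag_outliers(claims))  # prints ['C2']
```

Here the back-strain average is 50 days, so the 95-day claim stands out as an intervention opportunity while the others track the trend line.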


Interventions implemented based on a model’s output might involve deciding who would benefit from nurse case management, sending a letter or otherwise intervening in the treatment plan set forth by a prescriber, or adjusting reserve levels.

“Analytics is still a very complex subject and there’s still a lot of confusion in the marketplace, due to different terms being used in different ways,” Helios’ Anderson said.

The varied availability of data and the many ways analytics can be used likely add to the confusion. Savings can also be hard to quantify, because doing so involves estimating costs that were never actually incurred.

But given the pace at which companies have developed analytical programs, the role of predictive analytics in workers’ comp seems bound to grow.

Katie Dwyer is a freelance editor and writer based out of Philadelphia. She can be reached at [email protected].
