A Problem of Trust
I relish reports comparing one state’s workers’ compensation system with another’s. Yet the more popular comparative reports trouble me. To be blunt, some of the most widely read ones are flawed. Do not believe their numbers. Ever since statistical data was invented (it first appeared in the West as a mortality table, in 1662), producers and users have been caught up in endless, sometimes infernal arguments over usefulness, credibility and access.
Two very careful publishers produce, at a growing pace, comparative analyses that are quickly shared across flourishing online venues. The National Council on Compensation Insurance (NCCI) and the Workers Compensation Research Institute (WCRI) have, by analogy, been upgrading the quality of the dinner dishes, making the rest of the table look a little tawdry.
The state of Oregon publishes biennial state rankings of workers’ comp insurance costs. The federal Bureau of Labor Statistics annually issues state injury statistics.
A reform advocate in New York recently used Oregon’s ranking data to assert that New York’s workers’ comp system was expensive and inferior. The advocate’s critics pointed out an obvious flaw in the ranking data: Oregon compares the states using a standard industry mix based on Oregon’s own most important employment classifications. It should long ago have switched to a mix that represents national employment.
Oregon’s publishing venture is short on resources. And it does not seem terribly interested in how others have come to use it. The interests of the national workers’ comp community likely have as much impact on it as a snowflake on the Pacific.
At least two for-profit firms have published comparative work safety rankings of states, drawing from Bureau of Labor Statistics data on injury types, rates and duration. Anyone who looks at state BLS data over time notices persistently high rates in some states that go unexplained. Neither the BLS nor anyone else has tried to account for these persistent disparities, even in industry sectors with extremely uniform business models nationwide. Maine’s figures, for instance, are always high, to the chagrin of its state work safety officials.
Comparative data should be — in fact, must be — a resource for our industry. Many use the data to advocate a point of view; before that, they should use it to understand how a workers’ comp system actually performs. This kind of data is especially useful for testing hypotheses.
The industry has a problem with parochialism: focusing only on a state’s existing system rather than questioning it in light of other systems. The leadership of workers’ comp organizations is strengthened by knowledge of other states. We need to encourage creative, imaginative questioning, which comparative analysis both enables and inspires. Sometimes comparative analysis leads one into strange propositions — all the better, as it indicates intellectual energy.
We need to measure the right things. Up through 2009, an actuarial firm published a series comparing manufacturers’ insurance costs against injured workers’ benefits. I used that series to argue that Massachusetts bested the other states in its mix of low insurance costs and high benefits.
We need to compare, at the very least, employers’ workers’ comp costs, benefits paid to workers, injury rates and the duration of disability. The publishers need to account for major employment shifts, both regional and sectoral, such as those in residential construction. They need to account for privatization in Texas and Oklahoma. They need to earn our trust.