Risk Insider: Nir Kossovsky

New Tools to Bolster Reputation Risk Management

By: Nir Kossovsky | July 5, 2016 • 3 min read
Nir Kossovsky is the Chief Executive Officer of Steel City Re. He has been developing solutions for measuring, managing, monetizing, and transferring risks to intangible assets since 1997. He is also a published author, and can be reached at [email protected]

The American Law Institute (ALI) is formalizing governance law principles that will include legal strategies to signal stakeholders and enhance institutional reputations for governance, risk management and compliance.

According to the Institute, ALI’s principles carry authority nearly comparable to that accorded to judicial decisions. No board, once advised of these guidelines, could safely ignore them.

One example of these emerging standards involves monitorships: court-ordered corporate oversight mechanisms that enforce compliance after an ethics or safety breach, or any other shortcoming that could damage a company’s reputation and, by extension, its reputation value.

As reported in the blog JOTWELL, a review of legal scholarship sponsored by the University of Miami School of Law, monitorships are also now recognized as a legal means of signaling ongoing reputation rehabilitation.

The modern-day monitor, like a gatekeeper, may lend reputational capital to the wrongdoer, but in this context to facilitate rehabilitation or … (other) public relations benefit(s).

According to Notre Dame’s Veronica Root, Associate Professor of Law, examples of modern-day monitorships include those imposed following Penn State’s sexual abuse scandal, Apple’s anti-competitive conduct, and Bank of America’s improper foreclosures on hundreds of thousands of homeowners.

They provide outsiders a unique source of information about the efficacy of the tarnished organization’s efforts to remediate misconduct.

Responding to a need for a set of recommended standards and best practices on the law of governance, risk management and compliance (GRC), ALI’s recommendations are primarily addressed to legislatures, administrative agencies, and private actors such as the law firms that advise boards.

The legal community’s recognition of the value of actions designed to send signals to stakeholders and impact reputation is a major sea change only 35 years in the making.

It affirms that behavioral economic principles are becoming mainstream among leading lawyers working in the governance, risk and compliance area. It is a formal acknowledgement that governance battles in the courts of public opinion are as important as battles in the courts of law.

The prior legal mindset was solidified in 1947 by Judge Learned Hand’s formula: if the burden of taking a precaution (B) is less than the probability of harm (P) multiplied by the magnitude of the loss (L), the company must take the precaution; if the burden exceeds the expected harm, it need not.

This standard, named “BPL,” was severely tested in 1981 with the legendary Ford Pinto case. Based on a BPL analysis, Ford legally chose not to make the Pinto safer.

The BPL defense led to Ford being lambasted as insensitive for weighing the cost of a fix against the risk of immolation in rear-impact crashes of the Pinto.

Yet, tied to the BPL formula, the mainstream legal community found it difficult to factor in the cost of lost reputation — the added costs of harm due to lost sales, weakened talent retention, damaged supplier relationships and other damage.
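The arithmetic makes the point concrete. Here is a minimal sketch of the Hand calculus, using purely hypothetical figures (none of these are Ford’s actual numbers), showing how folding reputation losses into L can flip the BPL conclusion:

```python
def must_take_precaution(B, P, L):
    """Hand's BPL test: the precaution is required when its burden B
    is less than the expected harm, probability P times loss L."""
    return B < P * L

# Purely hypothetical figures, for illustration only.
B = 137_000_000               # burden: fleet-wide cost of the safety fix
P = 0.33                      # probability the harm occurs
L_direct = 300_000_000        # direct harm: settlements, recalls
L_reputation = 450_000_000    # lost sales, weakened talent retention,
                              # damaged supplier relationships

# Direct harm alone: 0.33 * 300M = 99M < 137M, so BPL imposes no duty.
print(must_take_precaution(B, P, L_direct))                 # False
# With reputation losses: 0.33 * 750M = 247.5M > 137M, the duty flips.
print(must_take_precaution(B, P, L_direct + L_reputation))  # True
```

The same precaution that a court would excuse under a narrow reading of BPL becomes mandatory once reputation value enters the loss term, which is exactly the shift the ALI guidelines formalize.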

For years, the legal community railed at a legal system that allowed companies to be punished in situations in which they undertook “responsible risk analyses.”

The ALI’s forthcoming guidelines redefine “responsible risk analysis” to acknowledge that in matters of governance, risk management and compliance, all of which depend on trust, reputation has a significant economic value, to paraphrase former Federal Reserve Chairman Alan Greenspan.

Robotics Risk

Rise of the Cobots

Collaborative robots, known as cobots, are rapidly expanding in the workforce due to their versatility. But they bring with them liability concerns.
By: Juliann Walsh | May 2, 2017 • 5 min read

When the Stanford Shopping Center in Palo Alto hired mobile collaborative robots to bolster security patrols, the goal was to cut costs and improve safety.

Once the autonomous robotic guards took up their beats — bedecked with alarms, motion sensors, live video streaming and forensics capabilities — no one imagined what would happen next.

For some reason, one cobot’s sensors didn’t pick up the movement of a toddler on the sidewalk who was trying to play with the 5-foot-tall, egg-shaped figure.

The 300-pound robot was programmed to stop for shoppers, but it knocked down the child and then ran over his feet while his parents helplessly watched.

Engaged to help, this cobot instead did harm, yet the use of cobots is growing rapidly.

Cobots are the fastest growing segment of the robotics industry, which is projected to hit $135.4 billion in 2019, according to tech research firm IDC.

“Robots are embedding themselves more and more into our lives every day,” said Morgan Kyte, a senior vice president at Marsh.

“Collaborative robots have taken the robotics industry by storm over the past several years,” said Bob Doyle, director of communications at the Robotic Industries Association (RIA).

When traditional robots joined the U.S. workforce in the 1960s, they were often assigned one specific task and put to work safely away from humans in a fenced area.

Today, they are rapidly being deployed in the automotive, plastics, electronics assembly, machine tooling and health care industries due to their ability to function in tandem with human co-workers.

More than 24,000 robots valued at $1.3 billion were ordered from North American companies last year, according to the RIA.

Cobots Rapidly Gain Popularity

Cobots are cheaper, more versatile and lighter, and often have a faster return on investment compared to traditional robots. Some cobots even employ artificial intelligence (AI) so they can adapt to their environment, learn new tasks and improve on their skills.

Their software is simple to program, so companies don’t need an outside specialist, known as a robotics integrator, to come on site to tweak duties. Most employees can learn how to program the machines themselves.

While the introduction of cobots into the workplace can bring great productivity gains, it also introduces risk mitigation challenges.

“Where does the problem lie when accidents happen and which insurance covers it?” asked attorney Garry Mathiason, co-chair of the robotics, AI and automation industry group at the law firm Littler Mendelson PC in San Francisco.

“Cobots are still machines and things can go awry in many ways,” Marsh’s Kyte said.

“The robot can fail. A subcomponent can fail. It can draw the wrong conclusions.”

If something goes amiss, exposure may fall on many different parties: the manufacturer of the cobot, the software developer and/or the purchaser of the cobot, to name a few.

Is it a product defect? Was it an issue in the base code or in the design? Was something done in the cobot’s training? Was it user error?

Is it a workers’ compensation case or a liability issue?

“If you get injured in the workplace, there’s no debate as to liability,” Mathiason said.

But if the employee attributes the injury to a poorly designed or programmed machine and sues the manufacturer of the equipment, that’s not limited by workers’ comp, he added.

In the case of a worker killed by a cobot in Grand Rapids, Mich., in 2015, the worker’s spouse filed suit against five of the companies responsible for manufacturing the machine.

“It’s going to be unique each time,” Kyte said.

“The issue that keeps me awake at night is that people are so impressed with what a cobot can do, and so they ask it to do a task that it wasn’t meant to perform,” Mathiason said.

Privacy is another consideration.

If the cobot records what is happening around it, takes pictures of its environment and the people in it, an employee or customer might claim a privacy violation.

A public sign disclosing the cobot’s ability to record video or take pictures may be a simple solution. And yet, it is often overlooked, Mathiason said.

Growing Pains in the Industry

There are going to be growing pains as the industry blossoms in advance of any legal and regulatory systems, Mathiason said.

He suggests companies take several mitigation steps before introducing cobots to the workplace.

First, conduct a safety audit that specifically covers robotics. Make sure to properly investigate the use of the technology and consider all options. Run a pilot program to test it out.

Most importantly, he said, assign someone in the organization to get up to speed on the technology and then continuously follow it for updates and new uses.

The Robotic Industries Association has been working with the government to set up safety standards. An employee can also join a robotics member association to receive the latest information on regulations.

“I think there’s a lot of confusion about this technology and people see so many things that could go wrong,” Mathiason said.

“But if you handle it properly with the safety audit, the robotics audit, and pay attention to what the standards are, it’s going to be the opposite; there will be fewer problems.

“And you might even see in your experience rating that you are going to [get] a better price to the policy,” he added.

Without forethought, coverage may slip through the cracks. General liability, E&O, business interruption, personal injury, cyber and privacy claims can all be involved.

AIG’s Lexington Insurance introduced an insurance product in 2015 to address the gray areas cobots and robots create. The coverage brings together general and products liability, robotics errors and omissions, and risk management services, all three of which are tailored for the robotics industry. Minimum premium is $25,000.

Insurers are taking lessons learned from the creation of cyber liability policies and applying them to robotics coverage, Kyte said.

“The robotics industry has been very safe for the last 30 years,” RIA’s Doyle said. “It really does have a good track record and we want that to continue.” &

Juliann Walsh is a staff writer at Risk & Insurance. She can be reached at [email protected]