Risk Insider: Paula Vene Smith

Does ERM Fall Apart in Volatile Times?

By: Paula Vene Smith | March 15, 2017 • 2 min read
Paula Vene Smith directs the Purposeful Risk Engagement Project (PREP) and is a professor at Grinnell College. Paula consults on risk in higher education, and has written Engaging Risk: A Guide for College Leaders. She can be reached at [email protected]
Topics: Education | ERM | Risk Insider

Today’s headlines in higher education report high uncertainty and rapid change. Campus controversies erupt over free speech, prospective foreign students express newfound reluctance to study in the United States, and research funding is threatened in science, arts, and humanities. Even the principle of evidence-based decisions, fundamental to the academic enterprise, is questioned by national leaders. Do these times call for a new look at risk?

Colleges and universities only recently started to see value in developing a way to identify, evaluate, and address risks at the institutional level. By the time Enterprise Risk Management began to take hold on campus, most businesses and government agencies had already established ERM systems. Realizing it wouldn’t work simply to replicate standard ERM frameworks, academic leaders developed their own systems, grounded in shared governance and the sustainability of their mission.

But in recent months, uncertainty in academic decision-making has risen steeply. The landscape for higher education and for nonprofit organizations grows harder to navigate. The new presidential administration has issued statements, directives, and orders on issues from immigration policy to transgender rights; nearly all have implications for the academic realm. What happens to ERM programs assailed by so much change?

While this question affects any organization that practices ERM, it looms largest for those with fledgling programs, and that includes most academic institutions, especially the smaller ones. I’ve noticed three ways academic institutions are responding to these times of intensified uncertainty:

  • “Things right now are too volatile to do ERM.” Struggling to manage stepped-up risks in their own areas, administrators stop making institutional risk meetings a priority, and ERM goes on hold.
  • “Same old ERM.” The system proceeds on auto-pilot, without reference to the current political situation. A quarterly meeting of the ERM Council looks like the same meeting held a year and a half ago. Each risk owner summarizes what’s happened since last time and says, “We’re still working on it.” No one wants to acknowledge a new climate.
  • “Wait, let’s re-evaluate.” A risk leader makes the group face a new reality. The meeting agenda is rewritten; risk owners are asked to directly address potential and recent changes in the legal, political, and cultural environment.

This third approach enables new, crucial questions: How can we protect access for students whose immigration status makes them vulnerable? How does the rise of bigoted speech in the public sphere affect campus discourse? How might we respond to changing legal interpretations of Title IX?

Such adjustment to new circumstances should occur naturally as part of ERM. But in practice, people are so accustomed to their routines that unless it is explicitly named, even very large-scale change can be minimized or overlooked until it’s too late. Neither “ERM on hold” nor “ERM as usual” is a wise option. Risk leaders should address big change head-on. Only then can Enterprise Risk Management, especially if newly established and still fragile, continue to drive good institutional decisions.

More from Risk & Insurance

Robotics Risk

Rise of the Cobots

Collaborative robots, known as cobots, are rapidly expanding in the workforce due to their versatility. But they bring with them liability concerns.
By: Juliann Walsh | May 2, 2017 • 5 min read

When the Stanford Shopping Center in Palo Alto hired mobile collaborative robots to bolster security patrols, the goal was to cut costs and improve safety.

Once the autonomous robotic guards took up their beats — bedecked with alarms, motion sensors, live video streaming and forensics capabilities — no one imagined what would happen next.

For some reason, the cobot’s sensors didn’t pick up the movement of a toddler on the sidewalk who was trying to play with the 5-foot-tall, egg-shaped figure.

The 300-pound robot was programmed to stop for shoppers, but it knocked the child down and then ran over his feet while his parents watched helplessly.

Engaged to help, this cobot instead did harm, yet the use of cobots is growing rapidly.

Cobots are the fastest-growing segment of the robotics industry, which is projected to hit $135.4 billion in 2019, according to tech research firm IDC.

“Robots are embedding themselves more and more into our lives every day,” said Morgan Kyte, a senior vice president at Marsh.

“Collaborative robots have taken the robotics industry by storm over the past several years,” said Bob Doyle, director of communications at the Robotic Industries Association (RIA).

When traditional robots joined the U.S. workforce in the 1960s, they were often assigned one specific task and put to work safely away from humans in a fenced area.

Today, cobots are rapidly being deployed in the automotive, plastics, electronics assembly, machine tooling and health care industries due to their ability to function in tandem with human co-workers.

More than 24,000 robots valued at $1.3 billion were ordered from North American companies last year, according to the RIA.

Cobots Rapidly Gain Popularity

Cobots are cheaper, lighter and more versatile than traditional robots, and they often deliver a faster return on investment. Some cobots even employ artificial intelligence (AI) so they can adapt to their environment, learn new tasks and improve on their skills.

Their software is simple to program, so companies don’t need to bring in a specialist programmer, known as a robotic integrator, to come on site and tweak duties. Most employees can learn how to program them.

While the introduction of cobots into the workplace can bring great productivity gains, it also introduces risk mitigation challenges.

“Where does the problem lie when accidents happen and which insurance covers it?” asked attorney Garry Mathiason, co-chair of the robotics, AI and automation industry group at the law firm Littler Mendelson PC in San Francisco.

“Cobots are still machines and things can go awry in many ways,” Marsh’s Kyte said.

“The robot can fail. A subcomponent can fail. It can draw the wrong conclusions.”

If something goes amiss, exposure may fall to many different parties: the manufacturer of the cobot, the software developer and/or the purchaser of the cobot, to name a few.

Is it a product defect? Was it an issue in the base code or in the design? Was something done in the cobot’s training? Was it user error?

Is it a workers’ compensation case or a liability issue?

“If you get injured in the workplace, there’s no debate as to liability,” Mathiason said.

But if the employee attributes the injury to a poorly designed or programmed machine and sues the manufacturer of the equipment, that’s not limited by workers’ comp, he added.

In the case of a worker killed by a cobot in Grand Rapids, Mich., in 2015, the worker’s spouse filed suit against five of the companies involved in manufacturing the machine.

“It’s going to be unique each time,” Kyte said.

“The issue that keeps me awake at night is that people are so impressed with what a cobot can do, and so they ask it to do a task that it wasn’t meant to perform,” Mathiason said.

Privacy is another consideration.

If the cobot records what is happening around it, taking pictures of its environment and the people in it, an employee or customer might claim a privacy violation.

A public sign disclosing the cobot’s ability to record video or take pictures may be a simple solution. And yet, it is often overlooked, Mathiason said.

Growing Pains in the Industry

There are going to be growing pains as the industry blossoms ahead of the legal and regulatory systems that will govern it, Mathiason said.

He suggests companies take several mitigation steps before introducing cobots to the workplace.

First, conduct a safety audit that specifically covers robotics. Make sure to properly investigate the use of the technology and consider all options. Run a pilot program to test it out.

Most importantly, he said, assign someone in the organization to get up to speed on the technology and then continuously follow it for updates and new uses.

The Robotic Industries Association has been working with the government to set up safety standards. An employee can join such a member association to receive the latest information on regulations.

“I think there’s a lot of confusion about this technology and people see so many things that could go wrong,” Mathiason said.

“But if you handle it properly with the safety audit, the robotics audit, and pay attention to what the standards are, it’s going to be the opposite; there will be fewer problems.

“And you might even see in your experience rating that you are going to [get] a better price to the policy,” he added.

Without forethought, coverage may slip through the cracks. General liability, E&O, business interruption, personal injury, cyber and privacy claims can all be involved.

AIG’s Lexington Insurance introduced an insurance product in 2015 to address the gray areas cobots and robots create. The coverage brings together general and products liability, robotics errors and omissions, and risk management services, all tailored for the robotics industry. The minimum premium is $25,000.

Insurers are applying lessons learned from the creation of cyber liability policies to robotics coverage, Kyte said.

“The robotics industry has been very safe for the last 30 years,” RIA’s Doyle said. “It really does have a good track record and we want that to continue.” &

Juliann Walsh is a staff writer at Risk & Insurance. She can be reached at [email protected]