Risk Insider: Ernie Feirer

A Clearer View of Location Risk

By: Ernie Feirer | April 28, 2017 • 3 min read
Ernie Feirer, CPCU, is Vice President and General Manager, Commercial Insurance, at LexisNexis Risk Solutions, where he is responsible for developing a suite of solutions for the commercial insurance market. He can be reached at [email protected]

Location information is critical for assessing risk and underwriting commercial property insurance effectively.

Most commercial property underwriters today gather risk-specific location information manually. In addition, they are limited by having incomplete data on existing exposure and property-specific peril by location when they underwrite.


Fortunately, access to geospatial technology and new data sources is allowing carriers to improve the breadth and efficacy of risk peril data in commercial property underwriting.

Access to raw data is useless if you can’t interpret it. New geospatial technology solves this problem by converting raw peril data into machine-readable risk scoring indices that can be integrated into an automated underwriting workflow.

Peril-specific risk scoring indices can be produced by taking historical peril-specific data for an address and assigning it a number from one to five, indicating the intensity of its peril exposure. For example, a property with a long history of hail events will have a hail score of five. A property with little or no hail event history may have a hail score of two.
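To make the idea concrete, here is a minimal sketch of such an index in Python. The event-count bucket boundaries are hypothetical assumptions for illustration, not an actual scoring methodology:

```python
def peril_score(event_count: int) -> int:
    """Map a property's historical peril event count to a 1-5 index.

    The bucket boundaries below are illustrative assumptions; a real
    scoring model would be calibrated against loss data.
    """
    buckets = [0, 2, 5, 10, 20]  # hypothetical event-count thresholds
    score = 1
    for threshold in buckets[1:]:
        if event_count >= threshold:
            score += 1
    return score

# A property with a long hail history scores high; one with little
# history scores low.
print(peril_score(25))  # 5
print(peril_score(3))   # 2
print(peril_score(0))   # 1
```

The same bucketing can be run per peril (hail, wind, flood, theft) to give each address a small vector of indices.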

Carriers can then establish thresholds based on these peril indices to easily sort potential risks into a definite yes, a definite no, or those requiring further scrutiny. For example, a carrier may not have the appetite for a property with a wind peril score of four or higher, and it may want to apply additional underwriting attention to a wind peril score of three.

Having these tools in place positions the carrier to implement better location-specific underwriting along with a process we’ve called Strategic Visualization.

How Strategic Visualization Works

Underwriters don’t have the time to visualize every property across dozens of parameters. Strategic Visualization helps by automating a process that leverages risk-specific indices to identify risks that require additional underwriter attention.

An automated scoring engine can use indices across different risks to identify those that are below acceptable thresholds and automatically send them through to bind. Risks that exceed underwriting thresholds are referred for further action and underwriter attention.
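A sketch of that routing logic, assuming hypothetical peril names and threshold values (a production engine would pull indices and appetite rules from carrier systems):

```python
from dataclasses import dataclass

@dataclass
class Submission:
    """A prospective risk; peril_scores maps peril name -> 1-5 index."""
    address: str
    peril_scores: dict

def route(submission: Submission, thresholds: dict) -> str:
    """Auto-bind if every peril index is below its threshold;
    otherwise refer for underwriter attention."""
    for peril, limit in thresholds.items():
        if submission.peril_scores.get(peril, 1) >= limit:
            return "refer"
    return "bind"

thresholds = {"wind": 4, "hail": 4, "theft": 4}  # hypothetical appetite
clean = Submission("123 Main St", {"wind": 2, "hail": 3, "theft": 1})
flagged = Submission("456 Coast Rd", {"wind": 5, "hail": 2, "theft": 1})
print(route(clean, thresholds))    # bind
print(route(flagged, thresholds))  # refer
```

Submissions that come back "bind" flow straight through; only the "refer" cases land on an underwriter's desk.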

Scoring can be fine-tuned to match organizational risk tolerances at the SIC or business category level. Carriers can implement peril-specific indices or thresholds by business type into existing underwriting workflows, visualizing only the risks that are above the threshold and, in turn, improving productivity and the bottom line.

A carrier underwriting a car dealership, for instance, may set a lower risk threshold for hail while jewelry store underwriting may necessitate a lower threshold for theft.
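That kind of business-type tuning can be expressed as a simple lookup table. The business categories and threshold values below are hypothetical illustrations:

```python
# Hypothetical per-business-type peril thresholds (SIC-level tuning).
# A car dealership's lot inventory makes it hail-sensitive; a jewelry
# store is theft-sensitive.
THRESHOLDS_BY_BUSINESS = {
    "car_dealership": {"hail": 3, "wind": 4, "theft": 4},
    "jewelry_store":  {"hail": 4, "wind": 4, "theft": 2},
    "default":        {"hail": 4, "wind": 4, "theft": 4},
}

def thresholds_for(business_type: str) -> dict:
    """Return the peril thresholds for a business type, falling back
    to the default appetite when the type is not specifically tuned."""
    return THRESHOLDS_BY_BUSINESS.get(business_type,
                                      THRESHOLDS_BY_BUSINESS["default"])

print(thresholds_for("car_dealership")["hail"])  # 3
print(thresholds_for("jewelry_store")["theft"])  # 2
```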

What types of risk-relevant geospatial information should underwriters look at? The options are constantly expanding, but the following five categories of geospatial data should be considered:

Location-Based Risk Factors are business-related information at the geocode level, used to assess a location’s risk level based on the neighborhood business climate, including bankruptcies, foreclosures, vacancies, business failures, business creation and business change rates. Location risk factors help the underwriter understand the type of neighborhood and environment in which the business is operating.

Natural Hazard Data provides insight on the propensity and potential future damage from natural hazard perils by location. These perils include floods, high winds, tornadoes, hail, brush fires and earthquakes. While the availability of natural disaster data is not new, the ability to distill this data into a set of property-specific indices, along with analysis of existing exposure and visualization on a single platform, is new.

Firmographic Data assesses the types of businesses that surround a particular property location that can have impact on risk. For example, a day care located next to a bookstore and elementary school will have a different level of risk than a day care sandwiched between a liquor store and a pawn shop.


Historical Loss Data is perhaps the most effective predictor of future risk. Aggregated loss history indices from industry-wide contributory claims databases reveal valuable insights regarding historic theft, wind, hail, lightning, fire, water and liability claims by location.

Existing Policy in Force Data provides the current exposure context based on policies already on the books and allows the carrier to assess whether underwriting that new business — in that particular location — will create exposure beyond its appetite for risk.
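As a rough sketch, that exposure check amounts to aggregating the total insured value (TIV) already written at a location and comparing it, plus the prospective policy, against a cap. The field names and cap value are assumptions for illustration:

```python
def total_exposure(policies_in_force: list, new_tiv: float) -> float:
    """Sum the total insured value already on the books at a location,
    plus a prospective new policy. Field names are illustrative."""
    return sum(p["tiv"] for p in policies_in_force) + new_tiv

def within_appetite(policies_in_force: list, new_tiv: float,
                    cap: float) -> bool:
    """True if writing the new policy keeps aggregate exposure at the
    location within the carrier's (hypothetical) appetite cap."""
    return total_exposure(policies_in_force, new_tiv) <= cap

existing = [{"tiv": 4_000_000}, {"tiv": 3_500_000}]  # 7.5M on the books
print(within_appetite(existing, 2_000_000, cap=10_000_000))  # True
print(within_appetite(existing, 3_000_000, cap=10_000_000))  # False
```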

Real-time access to geospatial insights at the point of underwriting can significantly improve the commercial underwriting process. Underwriters touch only those risks that require more attention. At any point, carriers can customize and fine-tune the technology to better match their risk tolerance.

Geospatial visualization is shaping the future of commercial underwriting. Carriers that incorporate the technology in their underwriting processes will gain a significant competitive advantage.

More from Risk & Insurance

Cyber Liability

Fresh Worries for Boards of Directors

New cyber security regulations increase exposure for directors and officers at financial institutions.
By: Alex Wright | June 1, 2017 • 6 min read

Boards of directors could face a fresh wave of directors and officers (D&O) claims following the introduction of tough new cybersecurity rules for financial institutions by The New York State Department of Financial Services (DFS).


Prompted by recent high profile cyber attacks on JPMorgan Chase, Sony, Target, and others, the state regulations are the first of their kind and went into effect on March 1.

The new rules require banks, insurers and other financial institutions to establish an enterprise-wide cybersecurity program and adopt a written policy that must be reviewed by the board and approved by a senior officer annually.

The regulation also requires the more than 3,000 financial services firms operating in the state to appoint a chief information security officer to oversee the program, to report possible breaches within 72 hours, and to ensure that third-party vendors meet the new standards.

Companies will have until September 1 to comply with most of the new requirements, and beginning February 15, 2018, they will have to submit an annual certification of compliance.

The responsibility for cybersecurity will now fall squarely on the board and senior management actively overseeing the entity’s overall program. Some experts fear that the D&O insurance market is far from prepared to absorb this risk.

“The new rules could raise compliance risks for financial institutions and, in turn, premiums and loss potential for D&O insurance underwriters,” warned Fitch Ratings in a statement. “If management and directors of financial institutions that experience future cyber incidents are subsequently found to be noncompliant with the New York regulations, then they will be more exposed to litigation that would be covered under professional liability policies.”

D&O Challenge

Judy Selby, managing director in BDO Consulting’s technology advisory services practice, said that while many directors and officers rely on a CISO to deal with cybersecurity, under the new rules the buck stops with the board.

“The common refrain I hear from directors and officers is ‘we have a great IT guy or CIO,’ and while it’s important to have them in place, as the board, they are ultimately responsible for cybersecurity oversight,” she said.

William Kelly, senior vice president, underwriting at Argo Pro, said that unknown cyber threats, untested policy language and developing case laws would all make it more difficult for the D&O market to respond accurately to any such new claims.

“Insurers will need to account for the increased exposures presented by these new regulations and charge appropriately for such added exposure,” he said.

Going forward, said Larry Hamilton, partner at Mayer Brown, D&O underwriters also need to scrutinize a company’s compliance with the regulations.

“To the extent that this risk was not adequately taken into account in the first place in the underwriting of in-force D&O policies, there could be unanticipated additional exposure for the D&O insurers,” he said.

Michelle Lopilato, Hub International’s director of cyber and technology solutions, added that some carriers may offer more coverage, while others may pull back.

“How the markets react will evolve as we see how involved the department becomes in investigating and fining financial institutions for noncompliance and its result on the balance sheet and dividends,” she said.

Christopher Keegan, senior managing director at Beecher Carlson, said that by setting a benchmark, the new rules would make it easier for claimants to make a case that the company had been negligent.

“If stock prices drop, then this makes it easier for class action lawyers to make their cases in D&O situations,” he said. “As a result, D&O carriers may see an uptick in cases against their insureds and an easier path for plaintiffs to show that the company did not meet its duty of care.”


One area that regulators and plaintiffs might seize upon is the certification compliance requirement, according to Rob Yellen, executive vice president, D&O and fiduciary liability product leader, FINEX at Willis Towers Watson.

“A mere inaccuracy in a certification could result in criminal enforcement, in which case it would then become a boardroom issue,” he said.

A big grey area, however, said Shiraz Saeed, national practice leader for cyber risk at Starr Companies, is determining if a violation is a cyber or management liability issue in the first place.

“The complication arises when a company only has D&O coverage, but it doesn’t have a cyber policy and then they have to try and push all the claims down the D&O route, irrespective of their nature,” he said.

Jim McCue, managing director at Aon’s financial services group, said many small and mid-size businesses may struggle to comply with the new rules in time.

“It’s going to be a steep learning curve and a lot of work in terms of preparedness and the implementation of a highly detailed cyber security program, risk assessment and response plan, all by September 2017,” he said.

The new regulation also has the potential to impact third parties including accounting, law, IT and even maintenance and repair firms who have access to a company’s information systems and personal data, said Keegan.

“That can include everyone from IT vendors to the people who maintain the building’s air conditioning,” he said.

New Models

Others have followed New York’s lead, with similar regulations being considered by federal, state and non-governmental regulators.

The National Association of Insurance Commissioners’ Cybersecurity Task Force has proposed an insurance data security model law that establishes exclusive standards for data security, and for the investigation and notification of data security breaches, for insurance providers.

Once enacted, each state would be free to adopt the new law. However, “our main concern is if regulators in different states start to adopt different standards from each other,” said Alex Hageli, director, personal lines policy at the Property Casualty Insurers Association of America.

“It would only serve to make compliance harder, increase the cost burden on companies, and at the end of the day it doesn’t really help anybody.”


Richard Morris, partner at law firm Herrick, Feinstein LLP, said companies need to review their current cybersecurity program with their chief technology officer or IT provider.

“Companies should assess whether their current technology budget is adequate and consider what investments will be required in 2017 to keep up with regulatory and market expectations,” he said. “They should also review and assess the adequacy of insurance policies with respect to coverages, deductibles and other limitations.”

Adam Hamm, former NAIC chair and MD of Protiviti’s risk and compliance practice, added: “With New York’s new cyber regulation, this is a sea change from where we were a couple of years ago and it’s soon going to become the new norm for regulating cyber security.” &

Alex Wright is a U.K.-based business journalist, who previously was deputy business editor at The Royal Gazette in Bermuda. You can reach him at [email protected]