The high cost of low-quality reinsurance data

Supercede’s Jerad Leigh highlights how bad data can lead to higher costs and lower trust.

A newly released report from Supercede sheds light on the detrimental effects of inconsistent and inaccurate data on the reinsurance industry. Garnering insights from candid discussions with underwriters, actuaries and brokers, the report reveals the issues arising from unreliable data exchanged between cedants, brokers and reinsurers.

The overarching consensus: bad data erodes efficiency in the global reinsurance value chain, leading to higher costs and lower trust.

The price tag of uncertainty

One of the most immediate consequences of sub-par data becomes evident during renewal season. Faced with the uncertainty such data creates, reinsurers apply significant “pricing loads”, often of 5-10 percent or more. Reinsurers have little choice but to penalise bad data, and cedants end up paying more as a result.

The very integrity of submissions also comes under scrutiny, leading underwriters at times to withhold capacity. This data dilemma likewise impedes the growth of new tools designed to improve the industry – tools which need good data to work effectively.

Looking closer, we see practitioners spending more time fixing mistakes than drawing valuable insights from their data, ensnared instead in the daunting task of wrestling with spreadsheets.

The result is excruciating cycles of error detection and rectification that test both patience and professional relationships. And the ones bearing the brunt of this work are often understaffed underwriting teams who, forced to prioritise, delay addressing problematic accounts and naturally focus their attention on the cedants with cleaner data submissions.

Reinsurtech and revolution

While there's excitement about emerging data platforms, technology in isolation isn't a magic bullet. Cedants should be diligent about the data they share with their partners, and reinsurers and brokers must clearly and transparently articulate what they need in order to secure the best terms and pricing.

The newer generation of professionals, innately more attuned to data-driven processes than to manual drudgery, are frustrated. They see the huge potential of AI-enabled platforms but are held back by the bad data they receive and the systems used to organise it. This issue affects everyone across the reinsurance value chain. Clear, easy-to-use data would unlock efficiency orders of magnitude greater than what practitioners achieve today.

Elevating standards for collective gain

The main message is clear: good data is not just nice to have; it is essential for the best results. And cedants are in a key position to drive this change. By championing data integrity, they can pave the way for brokers and reinsurers to step in as true collaborators. Alternatively, cedants can continue to send poorly structured, opaque data into the market, absorbing the inflated premiums and reduced value that follow as their most trusted partners move them down the to-do list in favour of their competitors. The choice seems obvious: the rewards of upholding elevated standards promise a brighter, more efficient future for all stakeholders.

The repercussions of poor reinsurance data are far-reaching, affecting every aspect of the industry. With increased costs, strained relationships and hindered technological advancement, it is evident that quality data is vital going forward. As the digital age evolves, it is crucial for all players in the reinsurance sector, especially cedants, to prioritise data integrity. By doing so, they not only streamline operations but also pave the way for innovation and collaboration, ensuring a prosperous future for the entire industry.

Access the full whitepaper The Reinsurance Data Crisis at

Jerad Leigh is CEO at Supercede