Why instant, self-service catastrophe modelling is transforming underwriting in 2026

Callum Eastwood
May 13, 2026

The exposure management challenge

At Carbon, we’ve used the power of our proprietary analytics platform, ‘Graphene’, to create a market-leading workflow for exposure management (EM) insights.

Exposure management is fundamental to any property portfolio. It’s the discipline of understanding where risk accumulates across geographies and perils, ensuring that an insurer’s portfolio stays within its risk appetite.

For delegated authority businesses like Carbon, this becomes even more important as we rely on coverholders across the world to underwrite risks on our behalf, without us seeing every individual risk upfront.

Verisk AIR is one of the world's leading catastrophe modelling firms. Touchstone is their flagship modelling platform - an industry-standard tool used to simulate natural catastrophes over real portfolio data and generate loss estimates.

In my role at Carbon I work closely with the property underwriting team to produce analytics and insight into our exposure and what drives it. Day-to-day, I use Graphene to understand how major natural catastrophes would affect our book - an important function that’s made simple through our use of data and technology, and enabled by our ongoing partnership with Verisk. We've made catastrophe model outputs a self-service resource for the whole business, at location level.

The traditional workflow problem

All insurers can run catastrophe models. They can model quotes and in-force portfolios to estimate losses, either from a single event or as an annual aggregate. Losses are typically modelled all the way from an average year up to remote 1-in-10,000-year simulations, and insurers will have appetites for the amount they are willing to lose across these different hypothetical return periods. A standard workflow (without Graphene) could take this shape:
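The link between simulated annual losses and a return-period loss can be sketched empirically: sort the simulated years by loss and read off the loss exceeded with probability 1/N per year. This is a minimal illustration of the idea, not Touchstone's actual methodology, and the toy loss distribution is invented.

```python
import random

def aep_at_return_period(annual_losses, return_period):
    """Loss exceeded with probability 1/return_period per year,
    estimated empirically from simulated annual aggregate losses."""
    n = len(annual_losses)
    ranked = sorted(annual_losses, reverse=True)
    # The k-th largest year has empirical exceedance probability k / n,
    # so the 1-in-RP loss sits at rank n / RP.
    k = max(1, round(n / return_period))
    return ranked[k - 1]

# Toy example: 10,000 simulated years of aggregate losses
# drawn from a heavy-tailed (Pareto) distribution.
random.seed(42)
sim_years = [random.paretovariate(1.5) * 1e6 for _ in range(10_000)]

for rp in (10, 100, 1000):
    print(f"1-in-{rp} year AEP loss: {aep_at_return_period(sim_years, rp):,.0f}")
```

Appetite checks then reduce to comparing these return-period losses against the amounts the insurer is willing to lose at each level.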

Underwriter question → exposure team analysis → modelling → exporting results → formatting → sharing insights.

If an underwriter wants to explore further - perhaps identifying which coverholder is driving Canadian earthquake AEP increases - the process starts again. It’s data exploration, hampered by a middleman.

The issue here is really a visibility problem. Underwriters and other key people in these businesses don’t have direct access to this data, which slows down decision making and limits proactiveness.

How we fixed it

Step 1: building an intuitive modelling interface

Our quantitative analyst team built a user interface directly into Graphene, using the Streamlit application framework, to sit between our teams and Touchstone. Users upload risk-level data in a simple CSV format, and the code then transforms that data into the contract and location files that Touchstone needs.

Before these files are sent to Touchstone, our application provides users with:

• A preview of the input data

• The generated contract and location files

• Summary statistics such as total premium

• An interactive map showing the geographic distribution of risks

This ensures users understand exactly what is about to be modelled before submitting the run.
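The CSV-to-Touchstone transformation at the heart of this step might look something like the sketch below. The column names, the contract/location split, and the summary statistics are illustrative assumptions, not Carbon's real schema.

```python
# Split a flat risk-level CSV into the contract and location files a
# modelling platform expects. Column names (ContractID, TIV, ...) are
# assumptions for illustration only.
import csv
import io

RAW = """ContractID,Premium,Latitude,Longitude,TIV
C001,1200,49.28,-123.12,500000
C001,1200,49.30,-123.10,250000
C002,800,43.65,-79.38,400000
"""

def split_risks(raw_csv):
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    contracts = {}   # one contract record per unique ContractID
    locations = []   # one location record per CSV row
    for i, row in enumerate(rows, start=1):
        contracts.setdefault(row["ContractID"], {
            "ContractID": row["ContractID"],
            "Premium": float(row["Premium"]),
        })
        locations.append({
            "LocationID": f"L{i:04d}",
            "ContractID": row["ContractID"],
            "Latitude": float(row["Latitude"]),
            "Longitude": float(row["Longitude"]),
            "TIV": float(row["TIV"]),
        })
    return list(contracts.values()), locations

contracts, locations = split_risks(RAW)
print(len(contracts), "contracts,", len(locations), "locations")
# Summary statistics surfaced to the user before submission:
print("Total premium:", sum(c["Premium"] for c in contracts))
print("Total TIV:", sum(l["TIV"] for l in locations))
```

The same parsed records feed the preview table and the interactive map, so what the user reviews is exactly what gets submitted.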

Step 2: automated catastrophe modelling

Upon hitting go, the system uses Touchstone’s API endpoints to send the data off for modelling. Our premade workflow, built using Touchstone’s workflow builder, handles the whole process, removing the need for manual input.

The modelling happens as usual, with Touchstone calculating the losses for each location and event.
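In outline, kicking off a run over HTTP looks like the sketch below. The endpoint path, payload fields, and workflow name here are hypothetical placeholders, not Touchstone's actual API contract; the real endpoints are documented by Verisk.

```python
# Hypothetical sketch of submitting a premade workflow run via an HTTP
# API. Host, paths, and field names are placeholders, not the real
# Touchstone API.
import json
import urllib.request

BASE_URL = "https://touchstone.example.com/api"  # placeholder host

def build_run_request(portfolio_id, workflow="standard-cat-run"):
    """Payload for a premade workflow run; field names are illustrative."""
    return {
        "portfolioId": portfolio_id,
        "workflow": workflow,
        "outputs": ["location-level-losses"],
    }

def submit_run(payload, token):
    """POST the run request and return the new run's id (assumed shape)."""
    req = urllib.request.Request(
        f"{BASE_URL}/runs",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["runId"]

payload = build_run_request("pf-2026-q2-inforce")
print(json.dumps(payload, indent=2))
```

Because the workflow itself lives in Touchstone's workflow builder, the client only needs to name it and point it at the uploaded portfolio.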

Step 3: bringing location-level losses into the data platform

When the modelling completes, results are automatically pulled back by API into our data warehouse, hosted on Google BigQuery. The end result is more than summary statistics: it is the full loss table at location level for every simulated year and event. That granularity gives us complete flexibility to build analytics on top of the model outputs. It means catastrophe model results aren’t just reports - they are a rich mine of data, running to billions of rows, that can be explored in seconds.
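The load step can be sketched as flattening the nested results into one row per event/location pair and handing them to BigQuery. The results JSON shape below is an assumption about what a modelling API returns; the load call uses the google-cloud-bigquery client, and the table name is a placeholder.

```python
# Flatten nested model results into rows and load them into BigQuery.
# The input JSON shape and the table name are illustrative assumptions.

def flatten_losses(results):
    """One row per (simulated year, event, location)."""
    rows = []
    for event in results["events"]:
        for loc in event["locations"]:
            rows.append({
                "year": event["year"],
                "event_id": event["eventId"],
                "location_id": loc["locationId"],
                "gross_loss": loc["grossLoss"],
            })
    return rows

def load_to_bigquery(rows, table="my-project.cat_model.location_losses"):
    # Requires the google-cloud-bigquery package and GCP credentials.
    from google.cloud import bigquery
    client = bigquery.Client()
    job = client.load_table_from_json(rows, table)
    job.result()  # block until the load job finishes

sample = {"events": [
    {"year": 1, "eventId": "EQ-001", "locations": [
        {"locationId": "L0001", "grossLoss": 120000.0},
        {"locationId": "L0002", "grossLoss": 45000.0},
    ]},
]}
rows = flatten_losses(sample)
print(len(rows), "rows ready to load")
```

Keeping every simulated year and event as its own row is what makes the later drill-downs possible: any dimension in the row is a dimension users can filter or aggregate on.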

Step 4: making the insights accessible

The next step of our process was to create dashboards in Graphene that were accessible to everyone and that allowed users to dynamically explore our loss data.

(screenshot)

Now, anyone with access can answer questions like:

• Which coverholders are driving our windstorm exposure?

• Which postcodes drive our Canadian earthquake 1 in 30 year AEPs?

• How does adding this new quote impact our portfolio LCM5 losses?

• What would a 1-in-200 year North Atlantic hurricane do to our book?

Because we store such detailed data, users can drill down or aggregate across any dimension they choose.
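A drill-down like "which postcodes drive this coverholder's earthquake losses?" is just an aggregation over that location-level loss table. Here is a minimal in-memory sketch of the idea; in practice this is a query against BigQuery behind a Graphene dashboard filter, and the field names and figures are invented.

```python
# Rank postcodes by their share of a coverholder's gross losses for one
# peril. Rows, field names, and values are illustrative.
from collections import defaultdict

loss_rows = [
    {"coverholder": "A", "peril": "EQ", "postcode": "V6B", "gross_loss": 500.0},
    {"coverholder": "A", "peril": "EQ", "postcode": "V6B", "gross_loss": 300.0},
    {"coverholder": "A", "peril": "EQ", "postcode": "M5V", "gross_loss": 150.0},
    {"coverholder": "A", "peril": "WS", "postcode": "V6B", "gross_loss": 900.0},
    {"coverholder": "B", "peril": "EQ", "postcode": "V6B", "gross_loss": 700.0},
]

def losses_by_postcode(rows, coverholder, peril):
    totals = defaultdict(float)
    for r in rows:
        if r["coverholder"] == coverholder and r["peril"] == peril:
            totals[r["postcode"]] += r["gross_loss"]
    grand_total = sum(totals.values())
    # (postcode, gross loss, share of this coverholder's peril losses)
    return sorted(
        ((pc, loss, loss / grand_total) for pc, loss in totals.items()),
        key=lambda t: t[1], reverse=True,
    )

for postcode, loss, share in losses_by_postcode(loss_rows, "A", "EQ"):
    print(f"{postcode}: {loss:,.0f} ({share:.0%})")
```

Swapping the grouping key from postcode to coverholder, peril, or return period answers each of the questions above from the same underlying table.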

Now the workflow is far simpler:

Underwriter question → upload data to Touchstone → accessible model outputs in Graphene dashboard

Putting it into practice: Canadian earthquake exposure

The impact becomes clear in practice. When Gary Clark, our Head of Property, was preparing for a meeting with one of our coverholders, we had identified that this partner was contributing significantly to our overall losses - driven primarily by their Canadian earthquake exposure.

Gary needed specifics. Which postcodes were driving this? Where was the concentration coming from?

Using our most recent in-force portfolio results, I filtered on Touchstone's US & Canada earthquake model, targeted the specific coverholder, and opened the loss-by-location table.

From there, I could see gross losses by postcode and connect this back to our risk-level data to pull in premiums and Total Insured Values.

The insight? The top four postcodes accounted for roughly 50% of this coverholder's gross earthquake losses.

That's the kind of actionable information that changes conversations. Gary went into that meeting knowing exactly where the concentration was, backed by specific numbers - not just aggregate figures.

“Before we had these tools, getting this kind of detail would have taken days and much back-and-forth. Now I can walk into a coverholder meeting with precise, location-level data and have a completely different conversation. It's changed how we manage our portfolio.” – Gary Clark, Head of Property

By embedding Verisk AIR modelling directly into our Graphene analytics platform, we’ve turned catastrophe modelling into something the whole business can explore — not just exposure specialists. Because the real power of catastrophe models isn’t the numbers they produce, it’s what happens when underwriters can interact with them.

To find out more, contact us here for a Graphene demo.

Callum Eastwood, Quantitative Engineer
