Data: Ascribing meaning to risk / Deriving meaning from risk

The reinsurance industry is awash with discussions about data these days, not least thanks to the insurtech wave that continues to drive interest from the venture and private equity community.

But what are reinsurance and insurance-linked securities (ILS) market participants trying to do with data?

Yes, there’s a push for straight-through processing of claims along the market chain, which makes having one true view of underlying data absolutely critical (we’re not there yet).

Yes, there’s increasing sophistication in risk modelling and an expanding array of models, covering ever widening numbers of risks and perils as well (much more to do here).

Then there are initiatives and products related to marketplaces, electronic trading, following form using still relatively unsophisticated algorithms, and the still-early days of a machine learning wave that promises to deliver greater visibility of and transparency into risks, and a whole lot more.

But what is the industry trying to do with all of this, and where will it lead?

Using data, I believe the industry is trying to ascribe meaning to risk.

By this I mean wrapping more understanding around risk, as well as the flows of risk and capital in the industry.

It’s also about redrawing, or installing, the boundaries around risk that will also aid in understanding it, but perhaps more importantly in structuring it, developing new product opportunities, classes of business and contract forms.

It's also about deriving meaning from risks that have previously lacked understanding, a key opportunity for the data-ready re/insurer or ILS fund.

The industry is also aspiring to derive more meaning from its risk data, of course.

But, by putting some additional meaning (or understanding) around the risks it covers, the reinsurance market can greatly expand its usefulness, by being able to manage and handle its risks, and portfolios of them, a whole lot more effectively.

Data comes in many forms and in ever-increasing quantities.

It’s important to be able to warehouse this, move it around your organisation, share it with your counterparties and generally leverage it to your advantage.

That doesn’t mean it has to be standardised industry-wide; that’s not what I mean.

Trying to get the industry to put its many systems and strategies behind single standardised data formats is not going to happen anytime soon, aside from in certain specific or niche business use-cases.

Ascribing meaning to something enables us to gain a greater or higher understanding of it.

This enables us to make more intelligent decisions, which is precisely what the reinsurance and ILS sectors need to begin to do.

There are going to be data winners and losers, as a result, with your ability to wrangle data in ways that allow you to extract more meaning and value from it set to become a key differentiator.

Data equals power, in a number of ways.

In reinsurance, the major globally active broking and underwriting groups hold a lot of this power, given they see a significant proportion of the data related to all of the business ceded across the market.

These major players have had a head start in putting data to work usefully, providing them with a competitive advantage over the last few years.

A number have done this particularly well and will prosper because of it. But some others have clearly squandered this opportunity, as they now languish behind in terms of their data abilities.

The top-tier of reinsurance brokers, in particular, have done a very good job of maximising value from data and their access to it and control of it.

You can see this not just in their pure data and technology-related initiatives, but also in some of the ways they’ve put their data-advantage to work to create underwriting facilities and capital pools for their clients.

An enhanced understanding of risk, capital and market activity gives an edge that can help brokers bridge the divide to the underwriters.

As such, it’s another area in which we, at Artemis.bm and Reinsurance News, see reinsurance brokers and reinsurers becoming increasingly competitive.

It would be remiss not to mention the promise of machine learning and artificial intelligence, in helping risks be better understood, portfolios be better managed, and generally enabling a greater understanding of the predictability, or otherwise, of insurance risks.

Maximising your data advantage may be critical in reinsurance in years to come, so seeing more business can give an edge.

But, more important is having the technical plumbing and infrastructure to gather, arrange, sort, and understand this data lake, so you can better ascribe meaning to it and enhance your understanding of it.

Walled-gardens vs setting it free

In some industries it’s not just owning and securing data that’s important, it’s also having an understanding of how to use it to clients’ advantage, setting it free to be used in ways that ultimately deliver value and earnings back to you.

We’ve seen tech giants profit massively by allowing their clients to ride on their data coat-tails.

Reinsurance brokers can do this too, letting both cedents and capital providers surf their waves of data.

Which in turn will deliver more insights back to them and further enhance their ability to ascribe meaning to risk and market data, as well as to understand it (both how it’s used and what it means).

At the moment though, the data flow is really quite one-way and lacks transparency, unless you’re paying a lot for it, likely on top of all your other brokerage costs.

Pricing (and clearing) data is king

In future, marketplaces could be one of the ultimate data-winners, as those that gather and own the true data insights on the pricing of risk, plus the technical ability to understand that data, will own one of the biggest differentiators in reinsurance.

There’s the market price: the fixed cost, or range of costs, attached to a risk or group of risks.

But there’s also the actual price it/they clear and trade at.

Knowing both, and having the deep technical ability to harness data to the benefit of clients’ placement and trading of risk, while also extracting data insights for your own needs and those of marketplace users, are significant areas of advantage in risk transfer.

This has played out in capital markets and one day it will in reinsurance too.

The push-back is increasing

This is already playing out to a degree, as reinsurance brokers send letters to clients cautioning them against using marketplaces, effectively saying “if you do, we can’t guarantee you best-execution.”

Which is a fallacy of course, as using technology to find the best price to clear trades at is accepted as the optimal mechanism, superior to human trading, in almost every asset class, except for risk (so far).

With a clear data advantage, if they can gain broad market adoption, a reinsurance marketplace or trading platform would also be the best venue for beta-style capital deployment, or following-form, or establishing facilities and the like.

Something reinsurance brokers are all too aware of, which is one of the factors behind the letters to clients that we’ve seen, and also something now holding back broad adoption of reinsurance marketplace tech, I believe.

So this area, of reinsurance brokers vs marketplace tech, could be where the data battle comes to a head.

Although the broker-reinsurer battle is set to accelerate as well.

Data wrangling = opportunity

Which brings me onto where the startup opportunities could be.

Helping cedents understand their data, apply meaning to it, manage it better, derive insights from it, is going to be key.

Once upon a time (25 years ago) I thought capital (how it’s managed, structured, deployed) would be an efficiency driver in reinsurance.

The capital markets and ILS have made a huge difference, but still haven’t gone as far as I’d thought they would (yet).

Data, and the technology to use and understand it, have similar potential as an efficiency driver in reinsurance.

The industry is still in its infancy when it comes to utilising data in really effective ways.

While reinsurance is a data heavy market, the future could see enormous gains made as streams of data are effectively put to work.

It’s advancing fast and is going to be a real point of both differentiation and competition in the reinsurance market.

Ability with data has the potential to give some players additional leverage over their competition and an edge that others may struggle to match.

The bloated middleware of the risk transfer (insurance) market

While everyone in insurtech circles is so focused on front-end customer experience and distribution, and the insurance-linked securities (ILS) market is so focused on providing the lowest-cost and most efficient capital, I thought it was time to talk about the rest of the chain, the bit in the middle.

I’m going to call this middleware.

Before I start: yes, this is simplistic, and no, I’m not offering any definitive answers. But it might get some people thinking…

Coming originally from a software-focused background, developing and managing insurance and reinsurance technology solutions since 1995, as well as managing large, data-intensive e-commerce operations, thinking of the market as having front, middle and back-ends makes a lot of sense to me, as a paradigm that’s understandable and simplified, and so allows room for bigger thinking.

Middleware is, clearly, the bit in the middle.

No prizes for guessing that in the insurance and reinsurance market this middle piece is severely bloated and full of redundancy.

In fact there is so much duplication of effort and roles that could be highlighted as surplus to requirements, that I’m not even going to bother trying to distinguish between them here.

I’m just going to lump it all together as middleware and talk a bit about what its function should (or could) be.

The front-end of risk transfer and risk financing (an easier and more all-encompassing name than insurance and reinsurance, since it includes hedging) is all about customer interaction and originating, understanding and pricing risks.

The back-end is all about allocating capital to the risks and (in the future) trading it.

So what does the middleware bit do?

It’s where the many actors of the risk to capital chain come in.

Broker, agent, underwriter, wholesaler, MGA, insurer, reinsurer, retrocessionaire, etc.

All the different people that touch, interact with, or hold a risk, likely expressing a view on its ultimate cost or value, during its journey down (or around) the market chain to capital.

Also in the middle are a range of different activities, such as risk bundling, pooling, aggregating and sharing, as well as activities that are essentially secondary and tertiary replications of the front-end, or secondary and tertiary transfers, to satisfy different players’ confidence in the risk and their various appetites for it.

Now don’t get me wrong, there’s an awful lot of value in an awful lot of this activity. This really is an industry steeped in expertise.

But there’s also a lot of cost, friction and waste as well.

Much of it sounds like it could be a bit redundant in a world where efficient risk technology converges with efficient risk capital (a future I for one subscribe to).

That’s because it largely is, and it finds itself increasingly so.

There’s a whole chunk of the market that aggregates and then passes risk on down the line, via different ways, means and structures, while different actors (agents, brokers, wholesalers, consultants, underwriters, modellers and so on) all get paid.

That’s not even considering hedging (reinsurance and retrocession, secondary markets, plus layers beyond in the lifecycle of a risk).

These add additional layers of complexity and sometimes duplication as well. They certainly add to the cost of taking a risk from source to its ideal home attached to the right capital.

How did it all develop like this?

Everything has a purpose and is valid, depending on your role in (and view on) this marketplace.

Justification for the existence of certain roles, procedures, processes, actors and actions has always been found by highlighting complexity, expertise, ability, connections and of course relationships.

But none of those factors actually mean it’s needed, especially not when you’re the third, fourth or fifth actor to touch, analyse, express a view on, or transfer a risk.

As a result this middleware is bloated, overly complex and adds significant cost to the process of taking a risk and matching it with capital.

It also adds risk to market participants, given the middleware’s habit of obfuscating the clarity surrounding risk.

We all know data magically disappears as risks travel further down the market chain, especially in secondary and tertiary trades (reinsurance, retro etc).

But there’s also a natural obfuscation of risk clarity as more and more people touch it, analyse it, and apply their own view, data formats and underwriting standards, meaning risk information can be significantly less clear, and embedded risks harder to recognise in a contract or portfolio, the further down the chain you sit.

That’s a problem, requiring more expenditure and effort, ultimately making the market less effective than it could be and risk transfer less efficient.

In my view that’s a key issue here.

The market has naturally developed to a point where layer upon layer of complexity and duplication has made it harder and harder to unpick.

Which suits some incumbents, as this complexity equals greater security and longevity, for some roles in the chain right now.

That means roles and responsibilities won’t be given up readily. Protectionism is alive and well in risk markets (and always has been).

Depending on where you sit (in the market landscape), I bet you can draw a diagram and pick out pieces of the traditional market chain and highlight some that are redundant, in your view.

But of course your own seat is vital and valid, and you deserve to get paid?

As I’ve said before, only if you’re bringing value to the risk to capital chain.

If you’re not? Perhaps you’re a piece of the middleware problem.

The discussion that needs to be had has to begin with taking a simple view of risk and capital, asking how best to connect the two in the most efficient manner.

Stripping back the market to its most basic of functions and roles means this middleware can be reimagined entirely.

Which could result in something new, more efficient, more effective and ultimately fit for the future.

My gut feel is this is going to happen in niches, where risks are more readily streamlined (commoditised) along an efficient chain to capital at first.

Or where the risk is so new there’s a chance to do things differently from the off.

But that will have ramifications for the wider market, as signals of efficiency emerging from niches that have a chance to redesign the chain show incumbents how to redesign their own niche, while maintaining market share and, most importantly, margin and profits.

As that spreads, so too will market efficiency.

There is an issue coming in some ‘innovation’ and ‘insurtech’ initiatives though, and it’s worth highlighting a risk that may emerge.

Some tech and innovation initiatives are not really making anything more efficient.

Rather they are adding control, designed for the few, perpetuating the status-quo, all under a shiny layer of unicorn dust.

That doesn’t actually help anyone (except some investors & incumbents).

However, the industry does need to go through these phases of innovation and iteration, which will range (often wildly) between control and openness, before a better paradigm (and balance) for efficiency is found.

Some will find it sooner than others and these may be the ultimate winners, at either end of the chain or those seeking to provide an efficient and new form of middleware to connect them.

Bifurcation of the middleware is entirely possible as a result of this iterative approach (I expect we’ll find).

But this change will be positive for everyone adding value and able to demonstrate it.

Those unable to (demonstrate the value they bring to the chain) might have to think again.

As a side note, consider the travel industry, a space I worked in for a number of years.

Incumbents ended up disrupting their own business models as they tried to fight back against start-ups offering a better and more efficient way to connect product and user together.

Playing a zero-sum game while trying to exert ownership and control has resulted in some very large companies disappearing altogether (not just in travel).

As risk industries look to bring risk to capital (or capacity) ever more efficiently, to provide a better product at the front-end, there’s a good chance some may undermine their very reason for existence in this market, by trying to do so while still exerting their control.

Just as you can compete yourself into irrelevance, so too can you disintermediate and disrupt yourself to the same end, commoditising the original value proposition you once had.

Why responsive risk transfer (or insurance)? An example…

Insurance and reinsurance, as a product set, are not particularly responsive today.

Yes, it (the product set) can meet the broader expectations of billions of consumers and hundreds of millions of businesses around the globe, as a financial tool to transfer risk.

Or at least they think it does, based on what little they often know about it.

But is the insurance or risk transfer product actually serving their needs, when they really need it most?

Following on from my thoughts on rethinking and redesigning re/insurance for the modern age, where I questioned whether insurance, reinsurance and risk transfer really responds to its users’ needs (at the right time and in the right way), I thought it might be interesting to dive a bit deeper into the responsive angle.

In that post I wrote:

Given the way our lives and businesses work, in this fast-paced and rapidly changing environment, we need something new and more responsive than this.

Something more responsive to our needs, that dovetails into the cycle of our lives, businesses and communities.

What we need are shock-absorbers: financial and risk protection products that smooth out the bumps in the road that might otherwise have knocked our life or business off course.

Products that respond right when we need it, providing just enough in terms of recovery to push us back on track, helping us to help ourselves right at the point it’s most needed.

Insurance can become this shock-absorber for our lives.

Insurance and reinsurance are often more like a time-delayed source of risk capital, with benefits only coming at the point the pain has already become so significant that it can often prove too late anyway.

But we’re used to this now, particularly in the business world, where insurance can payout after it’s too late and too little to help a firm avoid significant financial impacts and sometimes even bankruptcy.

In the majority of use-cases, insurance and risk transfer should be about responding at precisely the right point in time (when it’s first needed and can be most helpful) and in precisely the right way/amount required (no more, just enough to steer you back on-course).

The right time and right amount/way are both key.

Get that wrong and you’re over-paying (expensive), or over-complicating the product itself (confusing & disappointing for the customer), and likely also over-burdening the insured during its time of recovery (cognitive load is high).

Better to deliver only what is required, but most importantly at precisely the point it can be of most use to the insured.
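To make the right-time, right-amount idea a little more concrete, here is a minimal sketch of a graduated parametric payout. The function, thresholds and figures are all hypothetical, invented purely for illustration; a real product would calibrate attachment and exhaustion points from data and define the measured index contractually.

```python
# Hypothetical sketch of a graduated parametric payout, illustrating the
# "right amount, right time" idea. All names and thresholds are invented
# for illustration only.

def parametric_payout(index_value: float,
                      attachment: float,
                      exhaustion: float,
                      limit: float) -> float:
    """Return the payout for a measured index value.

    Below the attachment point nothing is paid; between attachment and
    exhaustion the payout scales linearly; at or above exhaustion the
    full limit is paid. This avoids the all-or-nothing response of many
    traditional covers.
    """
    if index_value <= attachment:
        return 0.0
    if index_value >= exhaustion:
        return float(limit)
    fraction = (index_value - attachment) / (exhaustion - attachment)
    return round(limit * fraction, 2)

# Example: a cover with a $1m limit, attaching at index 50, exhausting at 100.
print(parametric_payout(40, 50, 100, 1_000_000))   # below attachment: no payout
print(parametric_payout(75, 50, 100, 1_000_000))   # partial, proportional payout
print(parametric_payout(120, 50, 100, 1_000_000))  # full limit
```

The point of the linear ramp is the "just enough" response the paragraphs above describe: the payout grows with the severity of the measured event, rather than paying everything or nothing.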

But how to be this responsive?

To have an almost sixth-sense for when a claim is set to be needed/made and then delivering just what is required to smooth out the volatility (to life or business) that is being experienced?

Of course it largely comes down to data, access to it and the ability to understand/use it.

The more of it (data) the better. The more granular the better. The more real-time it’s delivered, on an ongoing basis throughout the policy term, the better. The more localised and personalised it is to the particular use-case in question, the better.

This is where I get excited (nerd alert) about anything that can provide real-time data insights to inform insurance, reinsurance and risk transfer responsiveness.

Enter the sensor.

Sensors and the data they provide are going to become key tools and inputs that allow for better risk transfer product design and development in the first place.

Insurance and risk transfer products are often created in what seems like the dark, with little visibility of what could or may happen. So decisions on pricing, triggers, responsiveness, are all taken using historical data and information derived from analogues and synthetic models of reality.

But, with sensors and the data they provide, you could be updating underwriting information, risk metrics, pricing, triggers, tweaking the responsiveness of the risk transfer product all in real-time, creating something that really does offer the kind of responsive experience users demand (or should demand) from finance today.

Enter Cloudleaf (https://www.cloudleaf.com/about), an interesting start-up that I first heard of a while back, but it caught my eye again the other day.

Cloudleaf just raised a $26m round of funding (congrats!) and provides sophisticated digital and analytical solutions for supply-chain improvement.

Internet of Things (IoT), artificial intelligence, machine learning: Cloudleaf uses them all.

Buzzword heavy but for the right reasons, as these advanced technologies enable its services to map and understand, even predict or forecast, where the pain points are and how to optimise supply-chains for large organisations.

That’s interesting alone.

But given the IoT angle, which involves sensors and the resulting data that flows from them (don’t you know), Cloudleaf can deliver real-time intelligence into how a supply-chain is performing for a single organisation, and of course, extrapolated out, that could also provide intelligence on an entire industry chain as well.

Which leads me back to the future of insurance and risk transfer, as I strongly believe supply-chain disruption related business interruption coverages can be better designed and made responsive to organisations’ needs, through the use of sensors and advanced data analytics services, plus parametric inputs to risk transfer triggers.

Cloudleaf could (should), if it isn’t already, speak to the likes of the world’s largest reinsurance firms in this regard (they may reach out to it anyway after reading this).

As an integrated data analytics, sensors, AI and risk transfer approach to delivering business insurance coverage for supply-chain related risks could really be a responsive solution fit for the future.

Imagine a system that can forecast where pain is set to emerge in the system, calculate the potential impacts, release capital based on triggers calibrated using the data and supply-chain network health information, releasing just the right amount of insurance payment at just the right time for the customer.
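As a thought experiment, the kind of system described above might look something like this sketch: aggregate sensor readings into a supply-chain health index, then release capital in proportion to how far that index falls below a pre-agreed trigger. Every name, threshold and formula here is my own assumption for illustration, not anything drawn from Cloudleaf’s actual products or APIs.

```python
# Hypothetical sketch of a sensor-driven trigger monitor for a responsive
# supply-chain cover. The health index, trigger level and payout schedule
# are invented for illustration only.

from dataclasses import dataclass

@dataclass
class SensorReading:
    site: str
    delay_hours: float  # measured shipment delay at this node

def health_index(readings: list) -> float:
    """Aggregate node-level delays into a simple 0-100 health score
    (100 = fully healthy, lower = more disrupted)."""
    if not readings:
        return 100.0
    avg_delay = sum(r.delay_hours for r in readings) / len(readings)
    return max(0.0, 100.0 - avg_delay)

def capital_release(index: float, trigger: float, limit: float) -> float:
    """Release capital in proportion to how far the health index has
    fallen below the trigger, capped at the limit."""
    if index >= trigger:
        return 0.0
    shortfall = (trigger - index) / trigger
    return min(float(limit), limit * shortfall)

# Example: two disrupted nodes drag the index below an agreed trigger of 80,
# releasing a proportional slice of a $5m limit to the insured.
readings = [SensorReading("port_a", 30.0), SensorReading("warehouse_b", 50.0)]
idx = health_index(readings)  # average delay of 40h gives an index of 60
payment = capital_release(idx, trigger=80.0, limit=5_000_000)
```

The design choice worth noting is that the trigger monitor runs continuously on live sensor data, so capital is released as disruption emerges, rather than after a loss-adjusted claim months later.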

That’s the idea of responsive risk transfer as a volatility-smoother, responding when it’s needed to even out the cycle of business (or our lives). A shock absorber, even a predictive and preventative one, for our lives and businesses.

That’s of course just one idea on the future responsiveness of insurance, reinsurance and risk transfer. But to me it’s a particularly compelling one that solves a coverage gap that exists today.

It also shows how data can enable responsive risk transfer and insurance product design, to deliver entirely new solutions that better meet client needs.

Thinking laterally, similar models could fit other challenging areas of the business world.

But, taking a step back, it’s the responsive model of delivering the financial protection in the right amount and at the right time (instead of the all or nothing of many re/insurance products) that I find a particularly interesting concept for the future of the industry.

That doesn’t always need sensors or ‘big data’ to achieve; sometimes it just needs someone to go back to basics, look at user needs and design products appropriately.

It means more efficient use of capital for re/insurers, as well as opportunities to open up entirely new sectors and really work on closing gaps in protection.

More responsive re/insurance and risk transfer products can be created today.

In the future responsiveness should be a design tenet of every insurance and risk transfer product this industry creates, as it’s just a more satisfactory delivery model than the ‘claim and pray’ process we see today.

Risk financing based on intelligence, simplicity and user needs; intelligence furnished with data, to design products that better meet the demands of our businesses and lives today.

More on the concept of risk transfer and insurance becoming more responsive in posts to come…

Profiles in Risk podcast interview

I was lucky enough to be invited to talk about the history of http://www.artemis.bm with Nick Lamparelli of Insurance Nerds for his Profiles in Risk podcast series.

We discussed how I got into the internet, some of the work we did at the first start-up I worked at in the mid-1990s, how I discovered catastrophe bonds and insurance-linked securities (ILS), the labour of love that is http://www.artemis.bm, as well as my thoughts on the development of industry trends, including InsurTech.

Thanks to Nick Lamparelli for inviting me, it was a fun discussion. Check out Insurance Nerds and the epic Profiles in Risk podcast series.