By Lucy Small, senior data scientist, Insurance, LexisNexis® Risk Solutions
With changes to general insurance pricing practices commencing in January 2022 (1), we have seen the appetite for data from the insurance market increase. Insurance providers will need to understand their customers to a far greater degree than before, at renewal and at new business, to meet the new pricing rules (2). They will also need to ensure they are delivering fair value and fair outcomes in the products and services they provide.
With the data available to the insurance market growing rapidly, choosing the right data to enrich the pricing and underwriting process is vital. Too much data, delivered from multiple sources rather than through one gateway, could slow down quotation times rather than aid the assessment process, allowing more agile players to take advantage. Too little data could leave gaps in knowledge that may lead to higher claims costs and an overall poor customer experience.
In this new world, picking the data that supports fair and accurate pricing and helps provide a smooth customer experience is critical. At LexisNexis Risk Solutions, we assess the value of new data for the insurance market all day, every day. Our team of data scientists is invested in truly understanding the value of new data to our business, to our clients and to the market overall.
Naturally, we first consider if the new data makes sense and is fair to use as a predictor of risk. We then assess the power of the new data. We look to understand its predictive capability for different outcomes such as claims, cancellations and fraud in different markets such as home, motor and commercial.
If an insurance provider is assessing new data sources, it is best not to focus on one area alone; instead, it is better to evaluate how they could benefit different lines of business and functions. New identity verification solutions using Email Intelligence, for example, could help spot ghost broking activity in motor insurance but could also support fraud prevention across home or travel. Some data solutions, such as highly granular property data, can work to prefill applications, support quoting for home and may also be used to validate information pre-quote for landlord portfolios.
We then measure the power of the data so that, in turn, we can prove its value to customers. This can be done in several ways; gains charts, for example, evaluate and visualise the predictiveness of the data. Insurance providers should seek this validation to support their own business case for investment in new data.
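To illustrate the idea, the sketch below builds a simple gains (cumulative lift) chart for a claims indicator in Python. The data is simulated and the column names (risk_score, had_claim) are hypothetical; this is a minimal illustration of the technique rather than a description of our own tooling.

```python
# Minimal, hypothetical gains chart: simulated data and made-up column names.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
n = 10_000
scores = rng.uniform(size=n)                    # stand-in for a candidate attribute or score
claims = rng.binomial(1, 0.05 + 0.10 * scores)  # claim frequency rises with the score

df = pd.DataFrame({"risk_score": scores, "had_claim": claims})
df = df.sort_values("risk_score", ascending=False).reset_index(drop=True)

# Cumulative share of policies (x) versus cumulative share of claims captured (y)
df["cum_policies"] = (df.index + 1) / len(df)
df["cum_claims"] = df["had_claim"].cumsum() / df["had_claim"].sum()

plt.plot(df["cum_policies"], df["cum_claims"], label="Ranked by candidate data")
plt.plot([0, 1], [0, 1], linestyle="--", label="Random ordering")
plt.xlabel("Cumulative share of policies")
plt.ylabel("Cumulative share of claims captured")
plt.title("Gains chart for a candidate data attribute")
plt.legend()
plt.show()
```

The further the curve sits above the diagonal, the more predictive the data: a steep early rise means the highest-scoring policies capture a disproportionate share of claims.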
At the same time, we will consider how the data aligns with the existing data we hold – is it different, better or the same? We ask ourselves, what is it offering over and above the existing data held or called on at quote? We will also consider the uniqueness of the data relative to what’s available in the market. For example, LexisNexis® Policy Insights and LexisNexis® Vehicle Build are both unique datasets.
We think about the target market for the data. Brokers and insurers have different data needs – for example, brokers tend to be more concerned with cancellations and any data insight that can predict that risk. Insurance providers will have more interest in how the data can reduce claims costs. Some data solutions offer answers to more than one problem - a good example is motor policy history data, which can predict the risk of cancellation but can also predict claims loss. It is important insurance providers are clear on how they intend to use the data and what challenge they want to solve.
We also look at how granular the data needs to be: for example, would insurance providers want the data at individual address level, or would postcode level suffice?
Last, but not least, we look at the compliance and operational aspects of the data – how should it be used? How can it be maintained? How can it be delivered, from a technology perspective, into insurance workflows?
In essence, the questions we ask of data are the same questions insurance providers need to ask themselves when considering new data sets.
Data has to deliver real insight; it has to inform the next best action for the customer. This principle applies to both existing customer data and any new data being considered to enrich the core customer record.
We have discussed previously in Actuarial Post that the key to understanding new and existing customers starts with leveraging the customer data insurance providers already hold. Linking and matching customer data from all parts of the business can build a single customer view – this ‘golden’ record for the customer, with its own unique ID, then provides the basis for accurate data enrichment.
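As a purely illustrative sketch of the linking and matching idea, the Python below collapses records from different lines of business onto one customer ID using a normalised match key built from surname, date of birth and postcode. The field names and the deterministic rule are hypothetical and far simpler than any real linking and matching technology.

```python
# Illustrative only: deterministic matching on normalised surname, date of
# birth and postcode to assign a shared 'golden record' customer ID.
import pandas as pd

records = pd.DataFrame({
    "source":   ["motor", "home", "travel"],
    "name":     ["Jane Smith", "JANE SMITH ", "J. Smith"],
    "dob":      ["1985-03-12", "1985-03-12", "1985-03-12"],
    "postcode": ["AB1 2CD", "ab12cd", "AB1 2CD"],
})

# Normalise the fields used for matching
records["surname_key"] = (
    records["name"].str.strip().str.split().str[-1]
    .str.upper().str.replace(r"[^A-Z]", "", regex=True)
)
records["postcode_key"] = records["postcode"].str.upper().str.replace(" ", "", regex=False)

# Records sharing a match key collapse onto one customer ID
records["match_key"] = records["surname_key"] + "|" + records["dob"] + "|" + records["postcode_key"]
records["customer_id"] = records.groupby("match_key").ngroup()

print(records[["source", "name", "customer_id"]])  # all three rows share customer_id 0
```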
Historical policy and claims data is another important part of this picture and becomes invaluable when collected over time from across the market. Attributes built from this data can help insurance providers understand more about the customer based on the market’s experience, not just their own.
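To make that concrete, the hypothetical snippet below derives a few simple attributes (policies held, cancellations, years known) from contributed policy history keyed by a customer ID; the fields and rules are examples only, not actual market attributes.

```python
# Hypothetical attribute building from contributed policy history.
import pandas as pd

policy_history = pd.DataFrame({
    "customer_id": [0, 0, 0, 1, 1],
    "start_date":  pd.to_datetime(["2019-01-01", "2020-01-01", "2021-02-01",
                                   "2020-06-01", "2021-06-01"]),
    "cancelled":   [False, True, False, False, False],
})

attributes = (
    policy_history.groupby("customer_id")
    .agg(
        policies_held=("start_date", "count"),
        cancellations=("cancelled", "sum"),
        years_known=("start_date", lambda s: (s.max() - s.min()).days / 365.25),
    )
    .reset_index()
)
print(attributes)
```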
With the unique insights offered through data enrichment, insurance providers are empowered to offer fair and accurate pricing; they can have more certainty that the products are right for the risk and ultimately, they are in a stronger position to provide a more personalised and streamlined service that customers will appreciate.
(1) https://www.fca.org.uk/publications/policy-statements/ps21-11-general-insurance-pricing-practices-amendments
(2) https://www.fca.org.uk/publications/policy-statements/ps21-11-general-insurance-pricing-practices-amendments