By Nick Corrie, Solvexia
Disruptive innovation has already reshaped industry after industry: Netflix v Blockbuster, Amazon v traditional bookstores, Uber v taxis, Airbnb v hotels. Is insurance immune from this wave of innovation? That would be wishful thinking.
To understand why, let’s investigate the enablers of these disruptive forces: connectivity and cloud computing.
In 2015 there were 3.2 billion people worldwide connected to the internet, compared with fewer than 500 million in 2007. That is just a count of the internet of people; if you also count the internet of things, Gartner estimates there will be 21 billion connected devices by 2020. All of these people and sensors generate vast amounts of data: currently 2.5 quintillion (10¹⁸) bytes a day, according to IBM.
This tsunami of data presents an opportunity to any organisation able to interpret it and build a service or product on the insights it yields. Before cloud computing, that would have required a sizeable investment in computer hardware and the resources to maintain it. With cloud computing, these barriers to entry are significantly reduced: new entrants can sign up for cloud services and be using them within minutes. These services can start small and scale up in line with business growth, and they benefit from the expertise and economies of scale that the likes of Amazon, Microsoft, Google, IBM and others are able to offer.
In general insurance, we have already seen new products come to market under the telematics umbrella, where the evaluation of a driver’s risk takes into account a form of data that was previously unavailable: sensor data indicating when, where and how they drive. A parallel exists in health and life insurance if you consider the sensor data now being generated by wearable devices. Fitness trackers and smart watches with heart rate monitors are already readily available, and other devices are under development, for example contact lenses that measure blood glucose levels.
It is quite possible that these devices will have an impact on mortality itself: for example, a smart watch that detects a heart rate pattern resembling a heart attack could automatically call an ambulance. Products could also come to market which reward users for healthy lifestyle choices, such as evidence that they exercise several times a week. As has been shown by the Vitality product (created by Discovery), the benefit to the insurer is often a better level of engagement with its clients and therefore reduced lapse rates.
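To give a flavour of what such detection logic might involve, here is a minimal sketch in Python. The heart-rate stream, the threshold values and the alert_emergency_services hook are all illustrative assumptions, not a clinical algorithm.

```python
from collections import deque

# Illustrative thresholds only -- a real detector would use clinically
# validated models, not fixed cut-offs.
SUSTAINED_BPM = 170      # sustained rate suggesting distress
WINDOW_SIZE = 60         # readings (e.g. one per second)

def alert_emergency_services(reading):
    """Hypothetical hook: on a real device this would place the call."""
    print(f"ALERT: sustained abnormal heart rate detected: {reading} bpm")

def monitor(heart_rate_stream):
    """Flag a sustained abnormal heart rate over a rolling window."""
    window = deque(maxlen=WINDOW_SIZE)
    for bpm in heart_rate_stream:
        window.append(bpm)
        if len(window) == WINDOW_SIZE and min(window) > SUSTAINED_BPM:
            alert_emergency_services(bpm)
            window.clear()  # avoid repeated alerts for the same episode

# Example: a simulated stream where the rate spikes and stays high.
simulated = [72] * 30 + [180] * 90 + [75] * 30
monitor(iter(simulated))
```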
It is difficult to predict what the disruption vector will be in any one industry or product category. It is clear, however, that the enablers of disruption are here to stay, so being prepared is vital. Part of that preparation is learning about and adopting toolsets that can handle the data and draw out the information it contains. Gartner, a research and advisory firm, says that by 2020 more than half of major new business processes and systems will incorporate some element of the internet of things.
Toolsets that can handle this type of data will have to provide functionality that addresses Gartner’s “3 Vs of big data”, now commonly extended with a fourth:
• Volume – expect large volumes of data, several orders of magnitude larger than your current data sets;
• Velocity – data will arrive at speed, so you won’t necessarily store all of it; instead you listen for specific events and store those (see the sketch after this list);
• Variety – many different data formats, including unstructured data, images, video and sound; and
• Veracity – data sets are likely to be noisy and incomplete, so will require cleansing and validation.
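To make the velocity point concrete, the following sketch filters a high-velocity telematics stream and stores only the events of interest rather than every raw reading. The field names, the hard-braking rule and the save_event function are assumptions made for illustration.

```python
# Minimal event-filtering sketch: rather than persisting every raw
# telematics reading, listen for specific events and store only those.
HARD_BRAKING_THRESHOLD = -0.5  # g; illustrative cut-off

def save_event(event):
    """Placeholder for writing to durable storage (database, queue, etc.)."""
    print("stored:", event)

def filter_stream(readings):
    for reading in readings:
        # Keep only readings that represent a hard-braking event.
        if reading["longitudinal_g"] < HARD_BRAKING_THRESHOLD:
            save_event({
                "driver_id": reading["driver_id"],
                "timestamp": reading["timestamp"],
                "g_force": reading["longitudinal_g"],
            })
        # All other readings are discarded rather than stored.

# Example: three readings, only one of which qualifies as an event.
sample = [
    {"driver_id": 1, "timestamp": "08:00:00", "longitudinal_g": -0.1},
    {"driver_id": 1, "timestamp": "08:00:01", "longitudinal_g": -0.7},
    {"driver_id": 1, "timestamp": "08:00:02", "longitudinal_g": 0.2},
]
filter_stream(sample)
```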
To take advantage of ongoing streams of data and gain competitive advantage, toolsets have to fully automate the production of any necessary analytics and perform the analysis very rapidly. Doing so may require large amounts of computing power, hence the value of cloud computing. Out-of-date information will open the door to competitors who will exploit such information gaps. In a rapidly changing environment, the toolset used for the automation also needs to be quick to set up and easy to adapt to changing circumstances, perhaps by utilising the data equivalent of “robots” that can be inserted into and removed from virtual data analytics production lines, as sketched below. It is important that such toolsets enable actuaries and analysts to make the transition from performing analysis manually to becoming the designers and managers of near real-time analytics production lines.
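One way to picture such a production line is as a pipeline of interchangeable processing steps. The sketch below, with invented step names, shows how “robots” might be inserted into or removed from the line without rebuilding it.

```python
# A minimal analytics "production line": each robot is a function that
# takes a dataset and returns a transformed dataset. The steps and
# field names are invented for illustration.

def cleanse(data):
    """Drop records with missing values."""
    return [row for row in data if None not in row.values()]

def enrich(data):
    """Add a derived field (illustrative risk score)."""
    return [{**row, "risk_score": row["claims"] / max(row["exposure"], 1)}
            for row in data]

def run_pipeline(data, robots):
    for robot in robots:
        data = robot(data)
    return data

pipeline = [cleanse, enrich]      # the current production line
# Inserting or removing a robot is just a list operation:
# pipeline.insert(1, validate)    # add a new step
# pipeline.remove(enrich)         # retire a step

records = [
    {"claims": 2, "exposure": 10},
    {"claims": None, "exposure": 5},  # dropped by cleanse
]
print(run_pipeline(records, pipeline))
```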
In order to provide flexibility and future-proofing, your toolset should embrace a philosophy of being open and connected.
This will enable you to get the best combination of the services you require at a competitive price. The connectivity might include links to cloud services, extraction of information from images or sound files, machine learning services, sentiment analysis, data cleansing, data mapping and live data feeds.
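As a simple illustration of the open-and-connected pattern, the sketch below wraps a call to a hypothetical sentiment-analysis service behind a small function. The URL, request shape and response field are invented, but swapping in a real provider would only change this one function.

```python
import requests

# Hypothetical endpoint -- any cloud sentiment service with a JSON API
# could sit behind this function.
SENTIMENT_URL = "https://api.example.com/v1/sentiment"

def sentiment(text, api_key):
    """Send text to an external sentiment service and return its score."""
    response = requests.post(
        SENTIMENT_URL,
        json={"text": text},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["score"]  # assumed response field

# Usage (with a real service and key):
# score = sentiment("The claims process was quick and painless", API_KEY)
```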
These are exciting times, in which disruption creates both threats and opportunities. To avoid becoming a casualty of the data tsunami, actuaries, like all professionals, need to start investigating these new toolsets and learning how to use them, so that they are prepared to ride the underlying wave of innovation.