Insurance companies are losing millions of dollars because their adjusters often pay out more than is owed to the claimant simply to avoid confrontation and close the file. By applying comparative negligence to even a small percentage of these claims, insurance companies can increase warranted denials and partial settlements, yielding significant savings.
If you work in financial services or, more specifically, Capital Markets, then you are likely familiar with the concept of VaR, or “value at risk.” VaR is a statistical measure used in finance to quantify the risk that an asset (or a portfolio of assets) will decline in value. The details of VaR can be complicated, but at a high level it estimates the maximum loss an investment is expected not to exceed over a given time period, at a given confidence level.
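As an illustration of the idea (not part of the original post), here is a minimal historical-simulation VaR sketch in Python. The return series is simulated stand-in data, and the function names are my own:

```python
import numpy as np

# Hypothetical daily returns for a single asset (simulated, illustrative only).
rng = np.random.default_rng(42)
daily_returns = rng.normal(loc=0.0005, scale=0.02, size=1000)

def historical_var(returns, confidence=0.95):
    """One-day VaR via historical simulation: the loss that observed
    returns stayed above with the given confidence level."""
    # The (1 - confidence) percentile of returns is the cutoff;
    # negate it so VaR is reported as a positive loss figure.
    return -np.percentile(returns, 100 * (1 - confidence))

var_95 = historical_var(daily_returns, confidence=0.95)
# Interpretation: with 95% confidence, the one-day loss should not
# exceed var_95 (expressed as a fraction of portfolio value).
```

Historical simulation is only one of several common approaches (parametric and Monte Carlo methods are the others); it is shown here because it makes the "loss at a given confidence level" definition concrete.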
If you're a marketer and Big Data is keeping you up at night, take comfort in knowing you're not alone. According to a recent IBM survey, the percentage of CMOs who feel underprepared to deal with the "data explosion" increased from 71% to 82% between 2011 and 2014. Big Data is a source of unrelenting angst among marketers everywhere.
For insurance companies today, claims management is all about speed. How quickly you can create, handle, pay and close a claim directly affects the level of service you provide to customers. However, while a lightning-fast claims process can make customers happy, in some cases it can actually hurt your bottom line. Why does this happen, and what can insurance companies do to prevent it?
Organizations are drowning in their Big Data and are unsure of how to even begin finding the value in it. While many have set up “data lakes” as an initial step, they still struggle to understand what data they have. The fact is, the large warehouses and repositories that store the endless streams of high-volume, diverse data flowing into organizations every day may provide a good way to collect Big Data, but they do nothing to help organizations drive real business benefits.
A usually law-abiding citizen gets into a minor car accident and decides that it’s time to cash in. He exaggerates his injuries and, with the help of some unscrupulous medical providers, files a bodily injury claim. While this may seem harmless to some, this type of “soft fraud” happens all the time. It is a contributor to the total cost of insurance fraud (excluding health insurance), which is estimated at more than $40 billion per year. This costs the average U.S. family between $400 and $700 per year in the form of increased premiums. The bottom line is that fraud, soft or not, impacts all of us.
To better attract and retain customers, leading insurance companies must establish strategies that focus on improving their customers’ experience and optimizing their underwriting and pricing practices. This is easier in theory than in practice: as a recent study from Bain & Company concludes, insurance carriers are usually good at only one of these strategies, not both. However, I believe that insurance companies can accomplish both objectives by systematically accessing and analyzing their full universe of data, including free-form text and scanned documents, to gain new insights into their customers, business and market environment.
by Len Dubois, Sr. Vice President, Marketing & Sales Support, Trillium Software
(Part 2: Empowering the right people with the tools they need to get the job done)
In the previous post in this series, Jon Asprey, VP of Strategic Consulting at Trillium Software, detailed how cross-department collaboration to reach an understanding of data requirements, interdependencies, data quality issues and gaps is essential to developing a structured data governance process.
(Part 1: Establishing the required organizational structure)
Creating a culture of data awareness can help firms comply with complex regulations and enhance business performance. As financial services firms continue to struggle with the effective management of data quality, ingraining the right culture into the organization is critical. For firms to successfully implement structured data governance and control, it is imperative that all stakeholder groups involved in data management be considered. Stakeholders from across the organization need to look beyond their departmental silos and work together to understand the data requirements of each group, interdependencies based on how data is being used and what data quality issues or gaps exist. This understanding is the basis for developing a culture of data awareness within and across an organization.
After London, Birmingham is the UK’s second largest city. With a population of around one million, it grew rapidly during the Industrial Revolution to become a powerhouse of British industry. Much of the heavy industry has long since vanished, but it remains a vibrant and diverse place with a rich cultural heritage. For instance, many of Britain’s best-known rock bands – Black Sabbath, the Moody Blues, Duran Duran and ELO – all started life in Birmingham. And between them they created some lasting records – for instance Nights in White Satin, Paranoid, Mr. Blue Sky, and Save a Prayer.