For better or worse, the data management world is witnessing a grand process of moving intelligence closer to the people on the front lines.
Adrian Bridgwater, one of our new favorite tech writers, recently posted this graphic, generating some water cooler chat here at Modern Systems. The use cases for big data generally apply to large, Fortune 1000-type customers. However, the operational and transactional data needed to support big data architectures and reporting is often trapped in legacy databases that don't integrate with SQL Server, Oracle Database, DB2, or Hadoop.
Legacy Data Management: What’s In There?
Pre-relational databases, from VSAM to Adabas or IDMS, can be a treasure trove for data scientists. Details of user and supply chain interactions, how fast they occur, how often, when, where, and with whom: it's all there. Some refer to this as dark data: underutilized information assets that were collected for a single purpose and then archived. But given the right circumstances, that data can be mined for other purposes.
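What does "mining" such data actually involve? The first step is usually just decoding the records, since pre-relational files store fixed-width, EBCDIC-encoded data with no self-describing schema. Here is a minimal sketch; the field names and offsets are invented for illustration (in practice the layout comes from the application's COBOL copybooks):

```python
# Sketch: decoding one fixed-width, EBCDIC-encoded VSAM-style record.
# The layout below is hypothetical; real layouts come from COBOL copybooks.

LAYOUT = [              # (field name, start offset, end offset)
    ("cust_id",   0, 8),
    ("txn_date",  8, 16),
    ("branch",   16, 20),
    ("amount",   20, 29),
]

def parse_record(raw: bytes) -> dict:
    """Decode one fixed-width record into a Python dict."""
    text = raw.decode("cp037")   # cp037 is a common EBCDIC code page
    return {name: text[start:end].strip() for name, start, end in LAYOUT}

# A fabricated sample record, stored the way a mainframe file would hold it
sample = "00012345201407010042000012750".encode("cp037")
row = parse_record(sample)
print(row["cust_id"], row["txn_date"], row["amount"])
```

Once records are decoded into a row-per-record form like this, they can be loaded into any relational store or Hadoop cluster for analysis.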
Infinity Property & Casualty Corp., for example, realized it had years of adjusters’ reports that could be analyzed and correlated to instances of fraud. It built an algorithm out of that project and used the data to reap $12 million in subrogation recoveries.
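Infinity's actual model is not public, but the general idea, scoring free-text adjuster reports against known fraud indicators, can be shown with a toy example (the keywords and weights below are entirely invented):

```python
# Toy illustration only: Infinity's real algorithm is not public.
# Score free-text adjuster reports against invented fraud-indicator phrases.

FRAUD_SIGNALS = {            # phrase -> weight (all values fabricated)
    "no witnesses": 3,
    "prior claim": 2,
    "cash settlement": 2,
    "soft tissue": 1,
}

def fraud_score(report: str) -> int:
    """Sum the weights of every indicator phrase found in the report."""
    text = report.lower()
    return sum(w for phrase, w in FRAUD_SIGNALS.items() if phrase in text)

reports = [
    "Rear-end collision, police report filed, two witnesses present.",
    "Claimant requests cash settlement; soft tissue injury, no witnesses.",
]
flagged = [r for r in reports if fraud_score(r) >= 4]
print(len(flagged))   # the second report scores 2 + 1 + 3 = 6
```

A production model would of course use statistical or machine-learned scoring rather than a keyword list, but the workflow is the same: turn archived text into features, then correlate against known outcomes.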
The Immovable Object vs. The Irresistible Force
According to IBM, 92 of the top 100 banks use mainframes, and 71% of Fortune 500 companies run their core businesses on a mainframe. Most, if not all, of these entities still run pre-relational databases. The process of migrating data can be risky, expensive, and disruptive. At the same time, the granular insight locked in this data is what will keep these companies alive, enabling efficiencies, new product development, and closer relationships with their customers. As old data meets big data, something has to give.
It seems that the insurance and financial verticals will help us find out what. These companies have the most data available, as well as the biggest challenges around new product development, customer churn, and integration. Some choose to modernize legacy applications while keeping them on the mainframe, like Progressive Insurance. Legacy data management was a key aspect of Desjardins General Insurance Group's strategy for growth, spurring a $45M project that analysts call the largest data modernization project in insurance history.
However, many shy away from large modernization projects, fearful of disruption or failure. Companies cannot switch off those systems or simply import the data into modern Hadoop platforms. Due to the many mergers and acquisitions in the finance world, banks sometimes run dozens of separate legacy systems, often cobbled together across payment and credit card systems, ATMs, and branch or channel solutions. Deutsche Bank illustrates the headache: its big data plans have been held back by exactly these legacy systems.
So What To Do?
The good news is that technology has caught up, and there are ways to get legacy data out of the mainframe without a migration. Companies House, the UK government agency responsible for recording and storing corporate reporting data, used this approach to legacy data management, syncing pre-relational IDMS databases to Oracle data warehouses for better visibility and reporting.
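The sync-instead-of-migrate pattern boils down to repeatedly upserting changed legacy records into a relational copy, so the source system keeps running untouched. A rough sketch, using SQLite as a stand-in for the warehouse target and an invented filings schema (a real deployment would capture changes from IDMS directly rather than polling):

```python
# Rough sketch of sync-instead-of-migrate: upsert changed legacy records
# into a relational copy. sqlite3 stands in for the real warehouse, and
# the company_filings schema is invented for illustration.
import sqlite3

def sync_changes(changed_rows, conn):
    """Upsert a batch of changed legacy records into the relational copy."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS company_filings "
        "(company_no TEXT PRIMARY KEY, filing_date TEXT, status TEXT)"
    )
    conn.executemany(
        "INSERT INTO company_filings VALUES (?, ?, ?) "
        "ON CONFLICT(company_no) DO UPDATE SET "
        "filing_date = excluded.filing_date, status = excluded.status",
        changed_rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
sync_changes([("00445790", "2014-06-30", "filed")], conn)
sync_changes([("00445790", "2015-06-30", "filed")], conn)  # updates, no duplicate
count, = conn.execute("SELECT COUNT(*) FROM company_filings").fetchone()
print(count)
```

Because each sync pass is idempotent, the relational copy stays current for reporting while the mainframe remains the system of record.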
The big change now is not that everyone is an IT manager (there are still plenty of ways companies will control devices, access to computers, and data) but that everyone is a consumer of a lot of data. Making that easy on them will most likely be a winning strategy for data management moving forward.