The number of companies deploying Big Data software, infrastructure, and analytics tools continues to increase. Wikibon, which began tracking total revenue related to "Big Data" back in 2013, released its forecast this March that the Big Data market will hit $92.2B by 2026.
Given the opportunity, there has also been no shortage of new vendors, all offering some flavor of the latest in Apache open source projects (Hadoop, Spark, and more), NoSQL, deep learning, and in-memory processing. The landscape continues to expand, leading to evaluation paralysis, or Franken-research IT projects destined for an unhappy ending. On this never-ending technology treadmill, you can hear business teams continuing to ask IT the grating but very reasonable question: "Are we there yet?"
RELIABILITY BEFORE INSIGHT
But there has always been a nagging problem. For years, companies and their business users have struggled with the reliability of their data, an essential foundation before any analytics and insights can be drawn. Gaining this level of data reliability used to be purely an IT discipline called master data management (MDM). But MDM itself used to cost millions in hardware, on-premises software, and consulting services.
Data holds the key to streamlining processes, optimizing collaboration, improving customer satisfaction, and meeting compliance reporting goals, but it also unlocks the formula for competitive advantage. Witness the rise of high-paying job titles such as data scientist, while in the C-suite, chief data officers (CDOs) are gaining in significance and popularity.
If anything, the Big Data hype has just begun. It's true that the term "Big Data" makes people think primarily of size (the volume of the data), but it's the variety, velocity, and veracity (quality) of the data that has IT tied up in knots. Rather than focusing on the size of the data, business users are rightly focusing on the size of the "big problems" that they need to solve.
But even IT- and data-savvy users are struggling to get value out of Big Data projects. According to a report titled "CIO and Big Data – what your IT team wants you to know," 55% of Big Data projects fail because of a lack of communication between the top managers who had the overall project vision and those who were in charge of actually implementing it.
GARTNER PREDICTED THE FALL
Gartner Research, to its credit, was telegraphing this in its July 2015 Enterprise Information Management (EIM) Hype Cycle report.
Big Data had started the descent into the “Trough of Disillusionment”. The report summarized the trend as follows:
In another Gartner study on Hadoop adoption, 70% of respondents reported having only between 1 and 20 users accessing Hadoop, and 4% reported having no users at all.
MASTER DATA MANAGEMENT IN ITS LEGACY FORM FARES NO BETTER
If you glance down the curve from Big Data, you'll actually see MDM leading the way into the Trough of Disillusionment. Gartner explains this scenario as follows:
For years, businesses have funded standalone MDM initiatives with very little to show for it. Clearly, then, simply putting MDM and Big Data together in their current form is the equivalent of chaining two boat anchors together.
BEING DATA-DRIVEN FOCUSES ON THE BUSINESS PROBLEMS
Being "data-driven" is a business term that has also been hyped to the max over the last few years, with many experts encouraging companies to use analytics and visualization to gain the insights they need to succeed. Data mania is reaching a fever pitch, with the BI and analytics market forecast by some to be $18B+, and the enterprise applications market at $32B+. That's a lot of money being invested by business teams and IT, often in uncoordinated efforts. Unquestionably there are successes that can be tied to these investments, but the ROI of software has always been hard to pin down. Without a closed loop to correlate actions back to the insights provided, it's a continued guessing game at best.
The hype and noise show no signs of abating, so buyers of technology want to focus more on time-to-value and accuracy of outcomes. Business users in particular, who as consumers experience the ease of use of products such as LinkedIn and Facebook, are asking tough questions of their IT counterparts. "Are we there yet?" is now morphing into "Why can't I get it, and why does it take so long?" and "Why can't my application deliver recommended actions to me like LinkedIn does, and why does it make me search for something it already should know?" This powerful voice of the user is putting pressure on IT, already faced with shrinking budgets and legacy systems to maintain, not to mention continuing issues with data quality and security.
BIG DATA DOESN’T BREAK DOWN SILOS
Many of the problems also stem from the fact that, despite the volume of data being captured, data is still siloed and unrelated. So the elusive 360-degree view of a customer or product has never been further from reality. Individual departments and organizations are still left making business decisions based solely on their own data. Self-service BI is a billion-dollar market, but business users looking at poor-quality data, without the complete picture across the enterprise, are a ticking time bomb.
We are now on the cusp of a revolution in what it really means to be data-driven: the size of the data doesn't matter as much as its veracity, and traditional/legacy MDM isn't helping with that veracity piece.
Today, a holistic "modern" data management strategy is needed, implemented through cloud platforms that can easily combine and clean master data across internal and external sources, correlate those profiles with transactions (big data or small), and maintain those relationships to provide a core repository from which predictive analytics can be drawn.
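To make the "combine, clean, correlate" idea concrete, here is a minimal, purely illustrative sketch in Python. It assumes two hypothetical record sources (a CRM and an ERP) and uses a deliberately simple match rule (normalized email address) to consolidate master profiles and link transactions back to them; real MDM platforms use far richer matching and survivorship logic.

```python
def normalize(record):
    """Clean a raw record: trim whitespace, tidy the name, lowercase the email match key."""
    return {
        "name": record["name"].strip().title(),
        "email": record["email"].strip().lower(),
    }

def consolidate(*sources):
    """Combine records across sources into one profile per normalized email key."""
    golden = {}
    for source in sources:
        for raw in source:
            rec = normalize(raw)
            golden.setdefault(rec["email"], rec)  # first source wins on conflict
    return golden

def correlate(golden, transactions):
    """Attach each transaction amount to its consolidated profile by the same key."""
    linked = {email: [] for email in golden}
    for txn in transactions:
        key = txn["email"].strip().lower()
        if key in linked:
            linked[key].append(txn["amount"])
    return linked

# Hypothetical source data: the same customer appears in both systems,
# with inconsistent casing and whitespace.
crm = [{"name": " pat lee ", "email": "Pat@Example.com"}]
erp = [
    {"name": "Pat Lee", "email": "pat@example.com "},
    {"name": "Sam Wu", "email": "sam@example.com"},
]
txns = [
    {"email": "PAT@example.com", "amount": 120.0},
    {"email": "sam@example.com", "amount": 80.0},
]

profiles = consolidate(crm, erp)   # two unique profiles, not three raw records
spend = correlate(profiles, txns)  # transactions tied back to master profiles
```

The point of the sketch is the closed loop: once duplicate records collapse into one profile, downstream transactions correlate cleanly, which is exactly the foundation the analytics layer depends on.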
The enterprises that will thrive will be those that can best use those platforms to deliver and share the right information on demand, anytime, anywhere across the enterprise.