Late in 2015, PwC and Iron Mountain published a report on “How organizations can unlock value and insight from the information they hold”. The report was based on an exhaustive survey of 1,800 senior executives at medium and large companies. The results were deflating for the supporters of big data and analytics: 43% of the organizations surveyed said they got “little tangible benefit from their information”, and a further 23% said they “derived no benefit whatsoever”. That is two out of every three organizations reporting disappointing impact from their data-related initiatives. Richard Petley, director of PwC Risk and Assurance, gave voice to the concern: “Data is the lifeblood of the digital economy, it can give insight, inform decisions and deepen relationships. Yet when we conducted our research very few organizations can attribute a value and, more concerning, many do not yet have the capabilities we would expect to manage, protect and extract that value.”
With so much talk of organizations adopting big data and analytics initiatives, these numbers suggest that many such initiatives are simply not delivering – failing, in other words. So why do big data initiatives fail? Depending on who you talk to, you will get different answers. From our position as a software-development-focused company that helps client organizations put big data and cloud initiatives in place, here are four factors we would like to highlight.
Disconnect between the data and the systems that use it: Clearly there is no problem with the volume of data. It floods into the organization from all sides – every customer interaction, and even every expression of intent, is monitored, logged and made available for analysis. The same is true of data relating to operational efficiency and to the broader ecosystem in which the organization operates. But is the relevant data being collected, and is it being channeled toward the insights that matter? Beyond that, we have observed substantial gaps between the mathematics that data scientists apply to the data and the software and systems that must then run those mathematical models to produce insights the organization can actually use. Without a bridge between the science of data and the computing technologies of big data, the data cannot deliver.
Not designing for business impact: Ironically, many big data and analytics initiatives are doomed to fail even before they start. In these cases, the problem is not in how the initiative is implemented but rather in how it is envisioned. What are the key business questions you want your data to answer? What is the business impact you are seeking? Framing and then answering these questions accurately will help you define what data to collect, how to collect it and what treatment to put it through to get those insights. Starting without this end in mind will make your big data initiative a costly IT Department experiment in everything but name.
The technology tangle: In most instances organizations know their business but lack visibility into the technology landscape. This gets in the way of putting together a coherent, complete solution that turns input into insight, and it often prompts inappropriate “fad of the day” technology choices. More than once we have seen initiatives run aground because of the technology choice, with significant time expended trying to prove that the technology works rather than making the technology work for you. The technology choices are many, and we always recommend taking a considered decision – keeping your specific business needs in mind, including factors like future scalability and security – and only after reasonable trials.
The money shot: In case you had not noticed, big data initiatives cost money – sometimes lots of it. Discussions about big data often ignore the fact that, apart from the costs associated with software, there are likely to be significant costs in hardware and infrastructure – all that data has to be stored, secured, transmitted and processed somewhere. While using the cloud can reduce this burden (and shift costs from the Capex to the Opex budget), the cost is still not trivial. On top of this, there are the costs of the changes the organization will have to make to systems and processes, training costs, and even hiring for new skill-sets. This is why big data and analytics initiatives need sponsorship from the very top of the organization. How else will you get the money and organizational commitment required to pull this off?
Admittedly this list is not complete – our view is obviously coloured by our experiences, and there is a whole host of other factors that can make or break big data initiatives. We do think, though, that it is important to stay positive. A Capgemini report from last year found that only 8% of the managers surveyed considered their own big data initiatives “very successful”, and a further 27% thought their initiatives were “successful”. For our part, we would like to focus on the 60% of managers in that same report who believed that over the next three years big data would change the world, including the industries they themselves were in. For organizations with similar hopes for their big data initiatives, we would suggest a long hard look at the factors listed here!