Navigating on the Big Data Raft
Volume 2 Issue 1 June 2018 : Value Creation - by Kumar Abhinav Srivastava
Frisson is a word rarely used but frequently experienced by senior executives in Information Technology. It is a feeling somewhere between thrill and fear - a good example is how one feels during a river rafting session. Sifting through the rapid explosion of evolving technologies and figuring out what makes business sense for an organization is no less than an adventure ride.
On one hand is the vast untapped potential of a nascent technology and the prospect of it becoming a game changer. IT executives are deluged with white papers about how a technology can unlock business value for organizations. A visit to a hackathon at a technical institute reinforces that idea by taking it to the prototype level. Conferences by market research companies targeting business leaders and IT executives are usually the last straw.
What happens next is either a CEO asking IT, “What are we doing on Big Data?”, or a CIO telling their team that a vendor wants to do a POC (Proof of Concept) of a big data tool. If neither of these happens, an IT executive is anyway left wondering whether they are being left behind in the latest game.
One possible outcome at this point is a technically successful pilot. Most things went as expected, and there were some new insights. Unfortunately, it was not clear how these insights would translate into revenue for the company. In other words, the insights were interesting but not actionable.
Another possible outcome is going down the path of a big bang implementation - for example, trying to replace the traditional data warehouse with Hadoop. Two years down the line, data for only a handful of systems has been acquired, and even those business teams are complaining. Meanwhile, there is simmering tension within the organization about the return on sizeable investments in Big Data.
While the two outcomes look different on the surface, both leave business users disillusioned with the promise of Big Data. Yet even looking past the shiny sales pitches and use-case-specific success stories, there is no denying the potential of this technology. So, what went wrong?
The answer could lie in analyzing the problem to be solved. Not all data problems within an organization are good candidates for a big data application. Taking the financial services industry as an example, map its most common data analysis use cases and ask: were these really big data problems a few years back?
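One way to make this triage concrete is a simple scoring sketch: rate each candidate use case against the classic "three Vs" of big data (volume, variety, velocity) and shortlist only those that score high. The use cases, ratings, and threshold below are illustrative assumptions, not figures from any actual analysis.

```python
# Illustrative sketch: triage candidate use cases by the "three Vs".
# All ratings (1 = low, 3 = high) and the threshold are hypothetical.
use_cases = {
    "regulatory reporting": {"volume": 2, "variety": 1, "velocity": 1},
    "cross-sell offers":    {"volume": 3, "variety": 3, "velocity": 3},
    "monthly P&L rollup":   {"volume": 1, "variety": 1, "velocity": 1},
}

def is_big_data_candidate(scores, threshold=7):
    # Sum the three Vs; only a high combined score suggests a genuine
    # big data problem rather than a classic data warehouse workload.
    return sum(scores.values()) >= threshold

shortlist = [name for name, s in use_cases.items() if is_big_data_candidate(s)]
print(shortlist)  # only the cross-sell use case clears the bar
```

Under these assumed ratings, regulatory reporting stays on the existing warehouse while cross-selling surfaces as the big data candidate - mirroring the reasoning that follows.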
Most executives would agree that they need reliable data for every use case. But for regulatory reporting purposes, the existing data warehouse was serving well, and no problem was foreseen for at least the next two to three years. It takes multiple years to build a robust data warehouse, and it would make no sense to dismantle it and replace it with a big data platform. For an investment of millions of dollars in technology and resources, where is the business value?
At the same time, cross-selling products to customers seemed a good candidate for a big data application. It made sense to build a single view of all the products used by a customer through Master Data Management. Integrating a variety of data from social media could reveal behavioral information, and all these pieces could be put together to provide customized product offers or advice in real time.
If we want a person or an organization to change their behavior, the incentive always has to outweigh the resistance to change. For organizations, technology can drive change only when there is sufficient business value creation. The crucial point here is that this analysis should not stop at the CIO or enterprise architect level. It has to be done in coordination with each business owner, by figuring out the problems they are trying to solve and how the features of a new technology can help towards that goal.
The mantra in financial services is often not to fix anything that is not broken. So, instead of orchestrating a big bang change and declaring that all data will be hosted in Hadoop, the wiser option is to say that this is not a big data problem today, though it may become one in the next three to five years. The current, robust data warehouse should therefore continue to co-exist with an experimental Hadoop platform, with a plan to revisit the problem at a defined point in the future.
Now that we have looked at what the organization does today, the other vital analysis is to map the future business focus against Big Data capabilities and assess whether they would help. For example, an intelligent assistant can be lifelike only with real-time analytics, and integrating social media data for behavioral analysis requires factoring in unstructured data.
Essentially, the question to be asked is: is Big Data crucial to meeting our future objectives? Again, the analysis must be done by IT leaders in consultation with each business line. The answers will vary, and some business lines may have no use case or insufficient justification for a Big Data implementation. That should be a perfectly acceptable outcome of the analysis exercise for IT leaders. If, instead, business lines are forced to adopt Big Data as an organizational policy or an architecture standard, it starts to be perceived as a liability rather than an asset. Prioritization of initiatives based on business value creation is the only happy path to Big Data adoption.
Kumar Abhinav is a Technical Program Manager with experience working with financial services organizations of different sizes across geographies. He holds a Master of Science degree from MIT, a B.Tech in Computer Science & Engineering from MNNIT, Allahabad, India, and an MBA in Finance & Marketing from NMIMS, Mumbai.