Unless you’ve been living under a rock, you’ve heard the chatter about big data. The term describes the tools and practices associated with unlocking the value in large amounts of structured and unstructured data. If that sounds confusing, just think of how much information is being created daily. Now think of what it takes to find the nuggets that matter. Big data means big business.
According to analyst firm IDC, the big data market will grow from $3.2 billion in 2010 to $16.9 billion in 2015, growing 7 times faster than the information and communications technology (ICT) market.
More than 200 people on 107 teams submitted 1,290 entries to try to win the $10,000 prize. What’s more striking is that the winning entry was 340% more accurate than Allstate’s existing methodology for predicting claims based on vehicle characteristics. Oh, and it took only 90 days. [chart below]
McAfee says that prediction models, tools, and techniques have changed so rapidly over the last decade that large companies are simply outgunned. Predictably, that’s forcing those same organizations to rethink their core competencies.
“If a bunch of kids and outsiders, who don’t know our customers, markets, and histories at all, can make much better predictions than we can in critically important areas, what does this imply? Do we need to completely rethink how we make our predictions? Do we need to outsource or crowdsource the making of predictions, even if we currently think of it as a ‘core competence’?”
So what other areas are ripe for the reach of big data? Booming industries like oil and gas are among the obvious candidates. There’s tons of activity, an ecosystem of companies providing complementary services, and of course cash on hand to hire the data scientists and number crunchers. Austin’s own Drilling Info is a case in point. It closed a big round of funding recently, one that shows that when industry processes can be transformed, the money will likely follow.
“If you go back 10 years, oil and gas companies would hire consultants to go to repositories of information at the state level, and manually collect data, put it into a spreadsheet and analyze it,” says Deven Parekh, a managing director with Insight. “What we’re really doing now is making that data available through regular workflows on a daily basis.”
Insight’s Parekh also mentions the importance of domain expertise, something often overlooked as organizations dive deeper into analytics.
“In three or four years, the technology itself may become fairly standard, but the key will be in having the deep vertical knowledge.” Of course, as we saw with Allstate, data-driven insights might actually alter what some companies see as their real expertise.
Lastly, McKinsey published a report last fall on the big data movement. One of its notable excerpts was the image below, showing an industry-specific view of big data projects as well as which segments stand to gain the most.
The report also identified one of the biggest challenges lurking in large organizations — the dreaded silo.
“One big challenge is the fact that the mountains of data many companies are amassing often lurk in departmental ‘silos,’ such as R&D, engineering, manufacturing, or service operations—impeding timely exploitation.”
But surely we can unlock all this data with all the technology we have, right? Sure we can. But just in case, let’s hear it for big data’s next hero, the system integrator.