As big data surges around us, you might fear losing your analytical footing. New data types, larger data sets, and faster time to results can challenge even the most seasoned professionals. This big-data phenomenon is big stuff.
But fear not, says Bill Franks, author of Taming the Big Data Tidal Wave. If you have a grounded perspective on your company's business needs and understand how to make data work in meeting those needs, you should do just fine in this new world of big data. Big data fits right into the general analytical trends that have been unfolding over the last decade or two, he explained during our e-chat with him yesterday.
"There really are some commonalities," said Franks, pointing to one major one: the perpetual struggle with the data at hand. "It is always too big and tough to analyze."
But, somehow, the analytics work gets done. The industry cranks out new tools, and analysts get more adept at handling the new requirements. Yet even as the technology advances and approaches evolve, you can always count on the persistence of a few underlying analytical principles, Franks said. None of this changes with the advent of big data.
He pointed to his own career's worth of experience as an example. "As an analytical professional, I have always wanted to get all the data I could in order to address a given problem. I now have to add big data to the mix. It may require some extra work in some cases, but the goal is still to extract meaningful insights from it."
And, as always, what's most important is an understanding of what you will do with big data to drive value, Franks added.
"The fact it is big or unstructured really doesn't matter when it comes to deciding if you need to use the data and what value it will drive. It only matters to the extent that it impacts what tools and techniques you may have to use. But the important decision is [whether] the data has value or not."
One approach in determining value is to identify a business problem and then brainstorm on whether a given data source can help address it. "If you find a match, do some experimentation," Franks recommended.
Another approach is to explore data proactively and experiment to see what analytics it can drive. The idea behind such work, captured in an "innovation center," is to let the experimentation drive the requirement, he said. "You don't have it figured out up front, so you experiment as a starting point."
Also remember that big data doesn't translate to "big in scope."
Some projects may be small in scope yet require a lot of data, he noted, citing a retailer that used big-data to identify people who browsed products but didn't buy. "That required processing through a lot of data, but the actual analytics and mechanics were simple. They got a huge ROI. Starting small makes a lot of sense."
Have you gotten started with a big-data project yet, big or small? Share on the message boards below.