Big Data Goes Bust

Back in 2012, Steve Lohr, writing for The New York Times, did average folks a favor and introduced us to “Big Data.”

Yes, analysts, data scientists and the people in Silicon Valley had already heard of it, but thanks to Steve and the Times, the rest of us found out that it’s a “meme and a marketing term, for sure, but also shorthand for advancing trends in technology that open the door to a new approach to understanding the world and making decisions.”

Sounds intriguing, huh?

Despite a few years of boom for the concept of “Big Data,” the term itself is old news. Big Data came with a “measure it all” attitude that eventually gave the phrase a bad rap, but smart companies didn’t throw the baby out with the bathwater. Data plays an expanded role in our daily lives, including how we do business. Now, instead of worrying about how big the data is, the winning trend is to focus on only what’s actually relevant.

Now fast forward to today: leveraging data for higher productivity and reduced costs in the oil patch is a reality. Not Big Data, but the right data. And to capitalize on this reality, we must think much, much smaller...

Forget Big Data

Why small data? In part, that’s because Big Data is a pain in the you-know-what. It’s hard work, plagued by data quality issues, and it’s expensive to boot.

As one commentator points out, an organization that buys into Big Data (and yes, it has to be the whole organization) needs to “capture data, store data, clean data, query data, analyze data and visualize data.” And while software can handle much of it, “some of it will be done by humans,” and all of it needs to be seamlessly integrated.

If this doesn’t sound easy, it’s because it’s not.

After you analyze anything and everything to squeeze out whatever information is available and come up with correlations, your takeaways can be not just unexpected but off the wall. For many businesses, coincidences get mistaken for cause and effect, leading to some expensive wild goose chases.

Will Oremus points out that Big Data’s problem isn’t that the data is bad; it’s the over-enthusiastic, fetishistic application of data to everything, as often as possible. During the short-lived Big Data boom, data simply wasn’t being used in a careful, critical way.

Really, all of the data being collected was hard to interpret. When you’re collecting billions of data points – clicks or cursor positions on a website, turns of the drillbit or strokes of the pump – the actual importance of any single data point is lost.

So, what may seem to be a big, important high-level trend might not be a trend at all. There could be problems in the data, an issue with the methodology or some kind of human error at the well site.

Simply put, Big Data alone doesn’t always add up.

Oil, Gas and Data

While most oil and gas producers are relatively late to the Big Data party, we might consider ourselves lucky.

Why? To start, the wider the gap between the proxy and the thing you’re actually trying to measure, the more dangerous it is to place too much weight on it.

For example, well performance indicators are a function of numerous important factors outside an engineer’s or supervisor’s control.

Part of the draw of Big Data was the idea that you could find meaningful correlations even in very noisy (and seemingly completely unrelated) data sets, thanks to the sheer volume of data coupled with powerful software algorithms that can theoretically control for confounding variables. The model we’re describing would draw upon a very wide range of well production correlations from across many basins and production environments to generate an “expected” set of production outcomes against which actual results could be compared.
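To make that idea concrete, here is a minimal, hypothetical sketch of such a system in Python. The well attributes, the synthetic numbers and the simple least-squares fit are all illustrative assumptions standing in for a far more elaborate proprietary model; the point is only that a fitted model produces an “expected” production figure for each well, and actuals get scored against it.

```python
import numpy as np

# Hypothetical, synthetic illustration of a cross-basin "expected vs. actual"
# model. Feature names and numbers are assumptions, not a real workflow.
rng = np.random.default_rng(0)
n_wells = 500

# Assumed well attributes pooled from many basins
depth_ft = rng.uniform(6_000, 12_000, n_wells)
lateral_ft = rng.uniform(4_000, 10_000, n_wells)
proppant_lb = rng.uniform(5e6, 15e6, n_wells)

# Synthetic "actual" 90-day production, with noise
actual_bbl = (0.5 * lateral_ft + 2e-3 * proppant_lb
              + 0.3 * depth_ft + rng.normal(0, 5_000, n_wells))

# Fit a simple linear model (a stand-in for the proprietary algorithm)
X = np.column_stack([depth_ft, lateral_ft, proppant_lb, np.ones(n_wells)])
coef, *_ = np.linalg.lstsq(X, actual_bbl, rcond=None)

# "Expected" production and the gap between expectation and reality
expected_bbl = X @ coef
residual = actual_bbl - expected_bbl

# Wells flagged as underperforming the model by more than two sigma
flagged = np.where(residual < -2 * residual.std())[0]
print(f"{len(flagged)} of {n_wells} wells flagged as underperformers")
```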

Now, imagine for a minute that such a system were applied within the context of a single company or field – with just the wells in a particular area compared with one another.

Without the “magic” of Big Data, anomalies in total production in a given timeframe would be glaring. Thank you, small data!

No intelligent oilman (or woman) examining the numbers would be under the illusion that each well’s performance corresponded neatly with that well’s historical trend, let alone with the field as a whole.

Moreover, it would be relatively easy to investigate each well on a case-by-case basis and figure out what was going on.
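Here is a minimal small-data sketch of that kind of case-by-case check (the well names and monthly production figures are made up): each well is compared only against its own recent history, so any flag can be traced straight back to the handful of numbers that produced it.

```python
# Hypothetical small-data check: compare each well's latest month to its
# own trailing average. Well names and production figures are made up.
monthly_bbl = {
    "Well-A1": [3100, 3050, 2980, 2950, 2900],
    "Well-A2": [4200, 4150, 4100, 4080, 2700],   # sudden drop
    "Well-B7": [1500, 1480, 1470, 1460, 1455],
}

def flag_anomalies(history, drop_threshold=0.20):
    """Flag wells whose latest month falls more than drop_threshold
    below the average of the preceding months."""
    flagged = {}
    for well, series in history.items():
        baseline = sum(series[:-1]) / len(series[:-1])
        change = (series[-1] - baseline) / baseline
        if change < -drop_threshold:
            flagged[well] = round(change, 3)
    return flagged

print(flag_anomalies(monthly_bbl))
# -> {'Well-A2': -0.347}  Well-A2's drop stands out and can be
#    investigated well by well (pump failure, skimming, bad ticket...)
```

Every step is visible, and every flag points at a specific well and a specific month.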

However, a system built on the idea of Big Data would be far more opaque. Because the data set is big rather than small, it would typically be crunched and interpreted by a third party using a proprietary mathematical model.

This lends a veneer of objectivity, but it forecloses the possibility of closely interrogating any given output to see exactly how the model arrives at its conclusions.

For example, some wells may have underperformed not because of a technical issue, but because of a skimming vacuum truck operator or a pencil-whipping pumper – common occurrences that are apparent to humans but lost on the data.
