If you think extracting data for decision making in your organization can be tough, consider the task facing Tom Moroney and his team as they work to provide usable data from oil reservoirs tens of thousands of feet below the ocean's surface. Talk about a daunting challenge.
(Source: Shell Upstream Americas)
Moroney is deepwater technology deployment manager for Shell Upstream Americas, charged with providing geoscience expertise for Shell's deepwater portfolio in the Gulf of Mexico and Brazil. "Whatever the range of technology our deepwater portfolio demands, my team is leading efforts to qualify, de-risk, work with scientists internal to Shell and from external vendors, and partner to come up with creative, innovative systems," Moroney explained for All Analytics in a recent phone interview.
Sometimes, this is about hardware and building physical systems. Other times, it's about software and know-how. Either way, the goal is to find "cutting-edge technology solutions that will drive us forward and keep us being able to secure and expand our resource frontier in the deepwater," he said.
The technologies run the gamut. Moroney's organization works on well technologies for construction and drilling, as well as on operational technologies that, for example, deal with the processing of fluids. Once assets are in production, its work involves figuring out how to monitor, surveil, and optimize those assets. "We look at how we instrument them, how we take all that information in, how we process that data, and how we interpret, analyze, and conduct all the key diagnostics that are needed to make sense of these massive integrated production systems so that we can keep them running optimally and our uptime maximized," he said.
Mind you, he said, "These assets aren't sitting in your driveway."
Not being able to climb into the assets, or to see, feel, and touch them, is one big challenge, Moroney said. In particular, he pointed to the fluids coming into installations floating in the Gulf of Mexico.
The fluids are coming from reservoirs that are sitting in 3,000 to 10,000 feet of water, that are then several thousands of feet -- tens of thousands of feet -- below the sea floor. We have ways through seismic [imaging] to have interpretations of what the subsurface looks like, and certainly we've taken measurements on the key variables that describe the containers that hold the hydrocarbons. But these are very distant, deep, and difficult… and that is a challenge.
What's more, the information delivered up is imperfect and incomplete. Using its understanding of physics, Moroney said, his team can fill in the blanks through mathematical modeling and simulations to build a model of what it believes is occurring downhole in the reservoir and how the hydrocarbons move from the reservoir into the well bore, and ultimately how they are processed and brought topside. "These are complex systems that we have highly instrumented in many cases. But again, it's imperfect information and incomplete information and it's a big challenge trying to take all that real-time information -- actual measurements, however imperfect they are -- and compare against models of that system."
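The model-versus-measurement comparison Moroney describes can be sketched in miniature. The code below is an illustration only, not Shell's actual tooling: it computes the residuals between noisy real-time readings and a model's prediction, then smooths them with a rolling mean so that a persistent model/measurement mismatch stands out from ordinary sensor noise. All values are invented.

```python
# Illustrative only: compare noisy "real-time" measurements against a
# model prediction by smoothing the residuals with a rolling mean.
from statistics import mean

def residuals(measured, modeled):
    """Pointwise difference between measurements and model output."""
    return [m - p for m, p in zip(measured, modeled)]

def rolling_mean(values, window=3):
    """Trailing rolling mean; smooths out one-off sensor noise."""
    return [mean(values[i - window + 1:i + 1])
            for i in range(window - 1, len(values))]

measured = [101.2, 99.8, 100.5, 104.9, 105.3, 106.1]   # noisy sensor data
modeled  = [100.0, 100.0, 100.0, 100.0, 100.0, 100.0]  # model prediction

smoothed = rolling_mean(residuals(measured, modeled))
print(smoothed)  # the later values drift well above zero: a sustained mismatch
```

The point of the smoothing step is that a single noisy reading does not trigger concern, but a residual that stays elevated across the window does, which is exactly the "imperfect measurements versus model" comparison described above.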
In some cases, Moroney's organization isn't necessarily using a model, he said. Sometimes, his team runs physics-based calculations and analytics. "If we know that pressure should drop X over this distance, and it's not, then we know perhaps that we have a plug or something restricting flow."
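A minimal sketch of that kind of physics-based check, with all names, formulas, and thresholds invented for illustration: predict the pressure drop a segment should show, compare it with what was measured, and flag a likely restriction when the measured drop is too large.

```python
# Hypothetical sketch of the physics-based check described above.
# The "model" is a toy friction relation; real calculations would use
# proper flow correlations. All numbers are illustrative.

def expected_pressure_drop(flow_rate, length_ft, k=0.002):
    """Toy model: drop grows with segment length and the square of flow rate."""
    return k * length_ft * flow_rate ** 2

def check_segment(measured_drop_psi, flow_rate, length_ft, tolerance=0.25):
    """Flag a segment whose measured drop deviates from the model
    by more than `tolerance` (as a fraction of the expected drop)."""
    expected = expected_pressure_drop(flow_rate, length_ft)
    deviation = (measured_drop_psi - expected) / expected
    if deviation > tolerance:
        return "excess drop: possible plug or flow restriction"
    if deviation < -tolerance:
        return "drop too low: check sensors or model inputs"
    return "within expected range"

print(check_segment(measured_drop_psi=95.0, flow_rate=10.0, length_ft=300))
```

Here the toy model expects a drop of about 60 psi over the segment, so a measured 95 psi triggers the "possible plug" diagnostic, mirroring the "pressure should drop X over this distance, and it's not" logic in the quote.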
With production at stake, Moroney's team understandably wants to get ahead of such events using predictive analytics and forward-looking asset management. Two years ago, it began using SAS Predictive Asset Maintenance (PAM) to help it apply statistics and predictive analytics to find out what's occurring and what kinds of patterns it's seeing.
We want to move from picking up events to spotting the patterns that those events potentially represent and then, before those patterns become trends that put us on a path of jeopardizing performance and allowing a well to become impaired or a facility to be thrown offline, we want to be able to intervene -- to take the necessary proactive steps to keep everything running at optimum or near-optimum efficiency.
It's about getting ahead of an event hours -- up to a full day, even -- before it manifests itself at topside, Moroney said. The event, say a separation issue occurring within the systems on the seafloor, will still show itself at the surface eventually. But using a predictive asset manager, the team can say, "We've already conducted extensive analytics using real-time data and we know the issue is imperfect separation, so maybe we have to increase the amount of foamer we're pumping down the hole or what have you." The goal is shrinking time to decision-making.
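The "spot the pattern before it becomes a trend" idea can be illustrated with a simple slope alarm. This is a made-up sketch, not PAM's method: fit a line to the most recent readings of some quality proxy and raise an early alert when the rate of decline crosses a threshold, well before the value itself breaches an operating limit. The variable names and thresholds are invented.

```python
# Hedged illustration: raise an early alert on a deteriorating trend,
# hours before the absolute value would breach a hard limit.

def slope(readings):
    """Least-squares slope of readings taken at unit time steps."""
    n = len(readings)
    x_mean = (n - 1) / 2
    y_mean = sum(readings) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(readings))
    den = sum((x - x_mean) ** 2 for x in range(n))
    return num / den

# A hypothetical separation-quality proxy: still above a 0.85 operating
# limit, but sliding downward reading after reading.
quality = [0.97, 0.96, 0.96, 0.94, 0.93, 0.91]

if slope(quality[-5:]) < -0.01:
    print("early alert: separation quality trending down; consider "
          "proactive intervention before the issue shows topside")
```

Nothing here has breached a limit yet; the alarm fires on the direction and speed of change, which is the window in which a proactive step (more foamer down the hole, in Moroney's example) is still cheap.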
"What PAM is teaching us is that you can use statistics and advanced analytical solutions to actually de-convolve the problem and get down to an understanding and separate causal and correlated variables, and you can even do quite a lot with an imperfect data set," Moroney said.
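One standard statistical tool for separating causal from merely correlated variables is partial correlation, offered here as a generic illustration rather than a claim about what PAM does internally. In this invented example, a common driver (think of a choke opening) moves both flow rate and line pressure drop, so the latter two correlate strongly with each other even though neither causes the other; controlling for the common driver collapses the spurious relationship.

```python
# Made-up data illustrating partial correlation: flow and drop are both
# driven by choke, so their raw correlation is strong but not causal.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

def partial_corr(x, y, z):
    """Correlation of x and y after controlling for z."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

choke = [1, 2, 3, 4, 5, 6, 7, 8]  # hypothetical common driver
flow = [2 * c + n for c, n in
        zip(choke, [0.1, -0.1, 0.1, -0.1, 0.1, -0.1, 0.1, -0.1])]
drop = [-c + n for c, n in
        zip(choke, [0.1, 0.1, -0.1, -0.1, 0.1, 0.1, -0.1, -0.1])]

print(pearson(flow, drop))              # strong raw correlation
print(partial_corr(flow, drop, choke))  # far weaker once choke is controlled
```

The raw correlation between flow and drop is nearly perfect, while the partial correlation controlling for the choke is close to zero: the de-convolution step Moroney describes is about making exactly this distinction, even on an imperfect data set.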
Shell's deepwater technology team has always known the value of information, but this project has helped quantify it even further, he added.
I'll be sharing more about how the use of predictive analytics for asset management is evolving at Shell later this week. In the meantime, share your own deep-data experiences below. Have any projects you've worked on helped further quantify the value of information for your organization?