Predictive analytics hit the midstream
Analytical software now at the disposal of midstream engineers and operators is a far cry from the trending tools and process historians that have long been relied upon to monitor and fine-tune operations. Current software is more predictive and model-based, pulling in data from multiple control-level sources and presenting results through Web dashboards. SCADA and plant historians remain important, and are themselves changing to extract more relevant information for users and to serve as key data sources for the new breed of predictive analytics.
Analyst firm IDC Energy Insights expects that by 2016, 50% of industry companies will have advanced analytics in place, with common business cases including predictive analytics and optimization of drilling, production, and asset integrity. Major technology and consulting providers to the industry also see opportunity in predictive analytics: GE Oil & Gas and technology consulting company Accenture announced a partnership in September 2014 to apply predictive analytics to pipeline integrity.
In short, the industry is looking at ways to move data up a level for advanced analytics, while looking at ways to improve on solutions at the measurement and SCADA levels. "The idea is that you have all these data from the control and sensor level, why not model some scenarios and make some predictions?" said Chris Niven, research director for IDC Energy Insights.
Pipeline integrity is the midstream area perhaps ripest for predictive analytics, but companies also face compliance and customer service pressures to maintain close, accurate tabs on production flow and product quality. "The thought of analytics is great, but you need to have a problem to solve before analytics adds value," said Niven. "The main thing is to look at analytics in terms of the outcomes it can help you achieve."
Unique midstream challenges
The midstream faces unique challenges. Not only must companies gather hydrocarbons from upstream producers, they may operate processing facilities to "sweeten" the gas and remove impurities. The core challenge is to distribute product to customers via pipelines and storage facilities while keeping close track of how much product is flowing to customers.
Pipeline networks may span hundreds of miles, and often have a wide mix of meters, gas chromatographs, and SCADA systems used within one network. In fact, the majority of natural gas pipelines in the U.S. were built prior to 1970, according to a 2012 report commissioned by the Interstate Natural Gas Association of America. With proper maintenance, these older pipelines can remain safe.
"Complexities such as the linear nature of the assets, buried pipe, the age of the assets, the difficulties to replace them, and also the changeover in ownership of pipelines, have created an environment that is challenging from a data perspective," said Brad Smith, product line leader at GE Oil & Gas. "The ability of midstream companies to digitize information and understand what is going on within their networks is a major challenge." The emerging remedy is to gather data from multiple sources and feed it into an analytics platform to make predictions on risk and how to best allocate spending for maintenance, Smith said. The data sources for pipeline integrity would include data from inline inspection or smart "pigging" systems, data from enterprise asset management systems on repairs, leak detection systems, as well as data from SCADA, metering, and instrumentation.
A predictive analytics platform can take in disparate data, apply it against a model, and identify asset sections or conditions of highest risk. According to Smith, "we’re trying to build a better real-time view of risk and where it might occur in a pipeline. The idea is not necessarily to reduce pipeline integrity costs, but to drive better effectiveness of that spending."
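To make the idea concrete, here is a minimal sketch of how such a platform might combine disparate indicators into a per-segment risk ranking. The indicator names, weights, and segment data are illustrative assumptions, not GE's actual model.

```python
# Hypothetical sketch: combine inspection- and SCADA-derived indicators
# into a per-segment risk score, then rank segments by that score.
# Feature names and weights are invented for illustration.

def risk_score(segment, weights):
    """Weighted sum of normalized risk indicators for one pipe segment."""
    return sum(weights[k] * segment.get(k, 0.0) for k in weights)

# Each indicator is pre-normalized to 0..1 (0 = benign, 1 = worst observed).
WEIGHTS = {
    "wall_loss":       0.35,  # from inline inspection ("pigging") runs
    "leak_alarms":     0.25,  # from leak detection systems
    "repair_history":  0.20,  # from enterprise asset management records
    "pressure_cycles": 0.20,  # from SCADA operating data
}

segments = [
    {"id": "MP-012", "wall_loss": 0.6, "leak_alarms": 0.1,
     "repair_history": 0.4, "pressure_cycles": 0.3},
    {"id": "MP-087", "wall_loss": 0.2, "leak_alarms": 0.0,
     "repair_history": 0.1, "pressure_cycles": 0.9},
]

ranked = sorted(segments, key=lambda s: risk_score(s, WEIGHTS), reverse=True)
highest_risk = ranked[0]["id"]
```

In practice the scoring function would be a trained statistical model over far more inputs, but the shape of the problem, many normalized data sources reduced to a ranked view of risk, is the same.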
Columbia Pipeline Group (CPG), with operations based in the Marcellus and Utica shale plays, is the first customer to implement predictive analytics from GE and Accenture. The platform is being used to analyze pipeline integrity across CPG’s 15,000-mile network of interstate natural gas pipelines.
Engineers typically don’t take part in the setup of a predictive analytics engine, according to Smith, but they typically do get involved in specifying the dashboards and key performance indicators they want the platform to generate. Engineers also need to work with analytics vendors to identify missing data that could yield better predictions.
While there is work involved in helping vendors establish a predictive analytics solution, the result is a dashboard and geospatial view of pipeline risk that cuts the time that would otherwise be spent gathering data. "With an analytics platform, the users can concentrate on driving efficiencies and improvements," said Smith.
Scott Strandberg, Canadian midstream product manager with IHS, also sees predictive, model-based analytics gaining favor in the midstream. Configuring these analytics starts with establishing good data: feeding the engine accurate asset information, such as pipeline diameters and materials, in addition to the history of the product types and volumes that have moved through a section of pipe.
There might be more than 100 data points that an analytics engine crunches to come up with a prediction of pipeline integrity risk. "You start to look at things such as leak detection data and data coming off of pigging systems. The analytics software is now able to pinpoint which sections of pipeline are more susceptible to risk," said Strandberg.
Operational analytics evolve
Aside from predictive platforms, SCADA solutions also are evolving to become better at spotting bad data, flagging deviations, and providing a better foundation for production reporting, according to Strandberg.
Vendors are using monikers such as "intelligent" SCADA to denote the ability of these systems to track more data sources and spot deviations. "Some SCADA systems are starting to become almost the production volume record as well, and generally, they are getting more sophisticated than the SCADA systems of the past," said Strandberg.
SCADA vendors are starting to use rules and heuristic techniques to spot deviations impacting the accurate understanding of flow, production, and gas quality. "On the SCADA side of it, the vendors are starting to enable rules that manage more stringent data values so that you can’t have a fluid analysis that is out of range," said Strandberg. "It comes down to better variance reporting by being able to establish what your tolerances are."
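A simple sketch of the kind of range rules Strandberg describes: each tag gets an allowed band, and out-of-range samples are flagged for variance reporting rather than silently accepted. The tag names and limits below are invented examples, not values from any particular SCADA product.

```python
# Illustrative rule table: allowed (low, high) band per measurement tag.
# Tags and limits are made-up assumptions for the sketch.
RULES = {
    "gas_heating_value_btu": (950.0, 1150.0),   # Btu/scf
    "h2s_ppm":               (0.0, 4.0),
    "line_pressure_psig":    (400.0, 900.0),
}

def validate(sample):
    """Return a list of (tag, value, low, high) for each out-of-range value."""
    violations = []
    for tag, value in sample.items():
        if tag in RULES:
            low, high = RULES[tag]
            if not (low <= value <= high):
                violations.append((tag, value, low, high))
    return violations

sample = {"gas_heating_value_btu": 1032.0, "h2s_ppm": 6.5,
          "line_pressure_psig": 615.0}
flags = validate(sample)  # only the H2S reading is out of range
```

The point of such rules is exactly the variance reporting Strandberg mentions: tolerances are declared up front, so a fluid analysis that is out of range is caught at ingestion rather than discovered downstream.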
SCADA and plant historians are essential for monitoring real-time process data, but tapping into data from electronic flow meters (EFMs), and keeping track of measurement and flow data is equally important in the midstream, according to Steve May, president of Computerized Processes Unlimited. "Because you are moving so much product in a midstream environment, if your measurement is off by a little, you could be losing lots of money. Therefore, in midstream, you also need to be able to monitor and analyze measurement data, not just the real-time process data," said May.
Measurement software taps into data from EFMs, storing the data for analysis and production reporting. Through the ability to set thresholds on deviations, measurement software can help spot problems such as missing data, suspect data, or uncollected data. For example, May noted that repeating data on a differential meter could indicate the meter is stuck or frozen.
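The "stuck meter" check May describes can be sketched as a simple run-length test: if a differential-pressure reading repeats identically across several consecutive polls, the meter is flagged as suspect. The repeat threshold here is an assumed, configurable value.

```python
# Sketch of a stuck/frozen meter check on differential-pressure readings.
# A physical DP measurement normally wanders slightly; identical repeats
# across several polls suggest the transmitter or meter run is frozen.

def stuck_readings(values, repeats=4):
    """True if any value repeats `repeats` or more times consecutively."""
    run = 1
    for prev, cur in zip(values, values[1:]):
        run = run + 1 if cur == prev else 1
        if run >= repeats:
            return True
    return False

hourly_dp = [12.4, 12.6, 18.1, 18.1, 18.1, 18.1, 18.1]  # inches of water
suspect = stuck_readings(hourly_dp)  # flagged: 18.1 repeats five times
```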
Midstream companies also use measurement software for government reporting and compliance with standards such as American Petroleum Institute (API) 21.1 for gas measurement and 21.2 for liquid measurement. Customers of midstream companies also typically want periodic reports that show them how much gas they are consuming under monthly contracts. Without measurement software that can quickly generate consumption reports, operators and control room engineers must scramble to compile such reports by analyzing EFM data. "Automatic reports make life a lot easier," May said. "What used to happen is that customers would call the control room and say, ‘how much gas have I taken?’ Compiling those reports distracts the control room operators."
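The automatic consumption report May mentions amounts to rolling up EFM volumes by customer for a contract period. A minimal sketch, with made-up customer IDs and volumes:

```python
# Roll up daily EFM volume records into per-customer monthly totals,
# the kind of report a control room would otherwise compile by hand.
from collections import defaultdict

def monthly_consumption(daily_volumes):
    """Sum (customer, mcf) daily records into per-customer totals."""
    totals = defaultdict(float)
    for customer, mcf in daily_volumes:
        totals[customer] += mcf
    return dict(totals)

records = [("ACME", 1_250.0), ("BRAVO", 980.0), ("ACME", 1_310.0)]
report = monthly_consumption(records)
```

With totals maintained continuously as EFM data arrives, the answer to "how much gas have I taken?" is a lookup rather than an ad hoc analysis.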
Measurement software also can keep track of lost and unaccounted product, and this forms the basis for knowing the balance of product entering and leaving pipelines and other assets. "Midstream organizations want to know that the overall system is balanced across all of the pipelines, and they want to know if any meters or measurements are out of calibration or have issues," said May. "They need solutions that can spot deviations, quickly recalculate volumes, and basically can turn data into information, rather than just capturing data so that you can figure it out on your own."
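The balance check described above can be sketched as receipts into the system versus deliveries out, with lost-and-unaccounted (L&U) product as the difference. The tolerance, expressed as a fraction of receipts, is an assumed operator setting.

```python
# Minimal system-balance sketch: compare total receipts against total
# deliveries for one accounting period and flag an out-of-tolerance L&U.

def system_balance(receipts_mcf, deliveries_mcf, tolerance=0.005):
    """Return (lau_mcf, lau_fraction, in_balance) for one period."""
    total_in = sum(receipts_mcf)
    total_out = sum(deliveries_mcf)
    lau = total_in - total_out
    lau_frac = lau / total_in if total_in else 0.0
    return lau, lau_frac, abs(lau_frac) <= tolerance

receipts = [52_000.0, 48_500.0]    # meter volumes entering the system, mcf
deliveries = [50_900.0, 49_100.0]  # meter volumes leaving the system, mcf
lau, lau_frac, ok = system_balance(receipts, deliveries)
```

A persistent imbalance beyond tolerance points back at the meters themselves: the software's job is then to identify which measurement is out of calibration, matching May's description of turning data into information.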
Basics still apply
Major technology providers such as GE and Honeywell are using some of the same "big data" analytics they use to monitor assets such as jet engines to comb through data generated by pipeline compressors. Meanwhile, business intelligence software packages that have rich data visualization capabilities are being used by integrators in the midstream.
According to Munsoor ur-Rahmaan, a business development lead with INTECH Process Automation, much of INTECH’s work in midstream analytics is to create dashboards for business intelligence tools with data visualization and mapping capability to highlight and analyze trends derived from multiple process-level systems. One of INTECH’s recent projects was for a large oil and gas producer in the Middle East, pulling in data from dozens of sources spanning upstream and midstream operations into a Web-based dashboard.
The dashboard can also be accessed via tablets. "The users want to look at their data uniformly, and analyze it accordingly," ur-Rahmaan said. "They want a single view of exceptions and issues."
IHS’s Strandberg said he sees a mix of approaches in midstream. Some large organizations are pursuing enterprise-class dashboards for SCADA and plant intelligence, while others continue to use more of a point solution approach, with separate SCADA systems dedicated to particular types of equipment. Regardless of the approach, the end-user organization still must pay close attention to variances, data quality, and determine if it makes sense to invest in better instrumentation, such as replacing analog flow recorders with EFMs.
Setting a solid foundation for analytics should be part of the facility design process, according to Lindel R. Larison, COO and founding partner with Tall Oak Midstream, an Oklahoma City-based midstream company. Tall Oak has two natural gas gathering and processing systems in Oklahoma, the Tall Oak CNOW system and the Tall Oak STACK system.
The CNOW system will ultimately include 250 miles of low-pressure gas-gathering pipelines, inclusive of 60 miles of recently acquired pipeline, as well as the Battle Ridge plant, a 75 MMcf per day cryogenic processing plant and nitrogen rejection unit located in Payne County, which began operations in late February 2015.
The STACK system, which will entail more than 150 miles of gas-gathering pipelines, also will include a new 100 MMcf per day cryogenic processing plant scheduled to come online in the third quarter of 2015.
The vast majority of these assets are new, and even the acquired portion of CNOW pipe is only a couple of years old, so Tall Oak does not face the challenge of dealing with older legacy systems. However, the company is completing the process of bringing new pipeline assets online, and as part of that process has paid close attention to putting in the proper amount of instrumentation, such as flow meters, pressure and temperature sensors, instruments to measure gas and natural gas liquids compositions in real time, and new SCADA systems, to ensure it has ample data for monitoring and continuous improvement.
"If you look at the total cost of a new plant, whatever that cost might be, a very small percentage of that is going to go into instrumentation," said Larison. "I think the key thing is to ensure you have enough instrumentation, enough measurement devices, so that when the plant and gathering system is up and running, you have the ability to monitor everything very closely, warehouse the data, and make the adjustments as necessary to ensure that you can continually improve from an operational and customer-focus perspective."
4 key recommendations on midstream analytics
- Predictive analytics for pipeline integrity are emerging, but engineers must advise analytics experts on underlying data sources and configuring KPIs.
- Measurement is crucial to the midstream, as are ways of converting measurement and flow data into production reports. When evaluating solutions, look for analytics and reporting that can quickly update trends and generate new reports as new data from metering or gas analysis become available.
- The foundation for analytics goes all the way back to plant and asset design. Install enough instrumentation to drive better predictions.
- A well-run analytics program may itself identify the need for newer instrumentation, such as replacing gas charts with EFMs, investing in acoustic leak detection technology, or adding online corrosion measurement transmitters.
– Roberto Michel is a freelance writer and editor with more than 20 years of experience as an editor with business-to-business publications.
More information about analytics:
TDWI Research white paper, "Predictive Analytics for Business Advantage"