Why the midstream cares about data management
The outlook for the midstream is promising, but the industry is only beginning to treat data management as a priority, a shift that could accelerate that growth.
Surveys show that only 50% of midstream companies consider data management a priority, according to the second of a series of reports on digitalization in oil & gas from Deloitte Insights.
The report authors readily admit that the outlook for the midstream already is promising given strong growth in U.S. light tight oil production, natural gas production in shale fields, and emerging export opportunities for oil and liquefied natural gas (LNG).
On the other hand, basin price differentials can complicate decision-making on midstream infrastructure planning, with misplaced investment falling prey to either the Scylla of stranded assets or the Charybdis of lost opportunities.
The midstream asset base includes about 2.7 million miles of U.S. oil & gas pipelines with an average asset age of 20 years. The industry’s mechanical-centric operating culture hasn’t yet translated digital concepts into grass-roots level change.
The Deloitte digitalization model proceeds from the mechanical to the virtual and then back again. It starts with developing a narrative focused either on assets or the value chain, aligning operational objectives with digital technologies. Storage operations seem, digitally speaking, ahead of other midstream operations, while terminal operations are ahead of tank management systems, the authors say.
Gathering line systems, the first receiver of hydrocarbons prior to processing and transport, are at the emergent stage of “sensorizing.” Typical today is the availability of pressure and volume data from lease automatic custody transfer systems, with tasks such as leak detection performed using methods such as negative pressure wave analysis, real-time transient models, and corrected volume balances.
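The negative pressure wave method can be illustrated with a short sketch. The Python snippet below is a simplified, hypothetical illustration, not taken from the report: the pipeline length, wave speed, and sensor times are assumed values. The idea is that a leak produces a pressure drop that propagates in both directions at roughly the speed of sound in the fluid, so the difference in arrival times at the two ends of a segment locates the leak.

```python
# Simplified negative-pressure-wave leak localization (illustrative only).
# A leak generates a pressure-drop wave traveling both ways at wave
# speed a (m/s). If the wave reaches the upstream sensor at t_up and
# the downstream sensor at t_down, the leak position x, measured from
# the upstream end of a segment of length L, is:
#   x = (L + a * (t_up - t_down)) / 2

def locate_leak(length_m: float, wave_speed_ms: float,
                t_up_s: float, t_down_s: float) -> float:
    """Estimate leak position in meters from the upstream sensor."""
    return (length_m + wave_speed_ms * (t_up_s - t_down_s)) / 2.0

# Hypothetical example: 10 km segment, ~1,000 m/s wave speed (a
# typical order of magnitude for liquid pipelines). A leak 3 km
# downstream reaches the upstream sensor after 3 s and the
# downstream sensor after 7 s.
x = locate_leak(10_000, 1_000, t_up_s=3.0, t_down_s=7.0)
print(round(x))  # 3000
```

In practice such estimates are combined with real-time transient models and volume balances, since timing noise and uncertain wave speed limit the accuracy of any single method.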
Through data generation and integration, relationships are defined that lead to meaningful insights. The authors point to the example of one U.S.-based service provider that assimilates diverse data sets on a virtual server, allowing users to define physics-based relations or calculations as well as run auto-tuning algorithms to refine results using non-physics-based concepts. This approach goes beyond leak detection and batch management to predicting optimal operating parameters, simulating product properties, and illustrating zone characteristics.
Not fade away
The authors also cite an example of four North American operators that are leveraging the GE Predix platform to integrate operational and economic aspects of production, transportation, storage, and contracts, and running scenarios in a virtual collaborative environment to evaluate the consequences of network changes on basic production and gathering lines.
The authors say deployment of sensors and communication networks is a prerequisite for trunk lines, an operation where computational monitoring via pressure, volume, and temperature analysis, controller monitoring via SCADA systems, and scheduled line balance calculations are “a regular affair.”
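A scheduled line balance calculation of the kind described above reduces to comparing metered receipts and deliveries, adjusted for the change in linepack, against a tolerance. The sketch below is a minimal, hypothetical version: the function name, tolerance, and figures are illustrative assumptions, not data from the report.

```python
# Minimal scheduled line-balance check for a pipeline segment
# (illustrative sketch).
# Imbalance = receipts - deliveries - change in linepack.
# An imbalance beyond the tolerance flags a possible leak or
# metering error for investigation.

def line_balance_alarm(receipts_bbl: float, deliveries_bbl: float,
                       linepack_change_bbl: float,
                       tolerance_frac: float = 0.01) -> bool:
    """Return True if the imbalance exceeds the tolerance,
    expressed as a fraction of receipts."""
    imbalance = receipts_bbl - deliveries_bbl - linepack_change_bbl
    return abs(imbalance) > tolerance_frac * receipts_bbl

# Hypothetical shift: 10,000 bbl received, 9,950 delivered,
# 40 bbl packed into the line -> 10 bbl imbalance, within 1%.
print(line_balance_alarm(10_000, 9_950, 40))   # False
# Same receipts but only 9,800 delivered -> 160 bbl imbalance.
print(line_balance_alarm(10_000, 9_800, 40))   # True
```

Real implementations also correct metered volumes to standard temperature and pressure before balancing, which is why the report calls them corrected volume balances.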
Although new pipelines often are pre-equipped with such technologies, there is a significant opportunity to upgrade legacy infrastructure throughout a pipeline network rather than only at key junctions.
Obsolete legacy systems are exposing midstream companies to cyberattacks, especially the electronic data interchange systems used to encrypt, decrypt, translate, and track key energy transactions.