Flare stack monitoring made easy with edge-enabled video analytics

Real-time remote monitoring of a pressing concern in oil & gas

By Ramya Ravichandar July 31, 2020

McKinsey estimates that effective use of digital technologies in the oil & gas industries could cut capital expenditure by up to 20%. At the same time, it forecasts that total cash flows will improve by $11 per barrel across the offshore oil and gas value chain, adding $300 billion a year by 2025.

Even with the rapidly maturing solutions of the Industrial Internet of Things (IIoT), there are still areas of immediate and pressing concern for the oil & gas industry. One of these is monitoring flare stacks. Gas flares pose a threat to both the environment and worker safety, but the current method of monitoring them is expensive and problematic, with little margin for error. Because manual monitoring relies on human observers, errors inevitably creep in, and those errors can trigger environmental disasters and put workers at greater risk.

In addition to the time and labor issues, and the opportunity for human error, manual monitoring often delays the identification of potentially life-threatening problems like equipment failure. Even though flares can now be monitored via streaming analytics, visual monitoring of the stack is still required to maintain safety, so the underlying burden remains unchanged.

What’s needed is a fully automated system capable of both processing stack analytics and keeping a watchful eye on the flare itself. The deeper the dive into the data, the better the visibility and foresight into potential issues.

Digitalization of flare monitoring

That deeper dive brings IIoT technology to the surface. It’s been estimated that IIoT will have a $930 billion impact within the next decade. Regardless of the exact numbers, the industry is investing heavily in IIoT.

However, it’s not as easy as simply installing sensors and cameras — oil & gas experts also need to process and glean insights from the data. IIoT devices generate data on a massive scale, which must then be transmitted (in most cases) to a central control center for processing. Transporting, storing and processing video data soon becomes prohibitively expensive. And if no one can act on flare anomalies immediately, is the high latency worth the opportunity cost? How can we ensure operators act on insights in time to prevent potentially catastrophic issues?

There is a solution for processing video from flare stack monitors: edge computing and, more specifically, edge intelligence.

Video analytics comes of age

What type of solution eliminates the need to store massive amounts of data and makes it possible to react in seconds rather than hours or days? The answer is bringing computing to the edge. Sensors, cameras and other IIoT-enabled devices sit at the edge of the network. Edge computing takes the data center out of the equation by performing compute functions in situ and communicating directly with other devices and systems.

Once compute functions move to the edge, network connectivity and speed are no longer inhibiting factors. Data can still be transmitted to the central data center in the cloud or on premises, but the transfer can happen in batches, or only the outlying data points need be sent. Most of the compute is done at the edge, sparing bandwidth costs and relieving network congestion.
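The "send only the outlying data points" idea can be sketched with a simple rolling-statistics filter. This is an illustrative stand-in, not a vendor implementation: the function name, window size and z-score threshold are all assumptions, and a production system would use a trained model rather than a fixed threshold.

```python
from statistics import mean, stdev

def edge_filter(readings, window=20, threshold=3.0):
    """Forward only readings that deviate sharply from the recent
    rolling window; everything else stays at the edge."""
    outliers = []
    for i, value in enumerate(readings):
        history = readings[max(0, i - window):i]
        if len(history) < 2:
            continue  # not enough context yet
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(value - mu) / sigma > threshold:
            outliers.append((i, value))  # candidate for upload
    return outliers

# Steady flare temperature with one spike: only the spike is flagged
# for transmission, so 29 of 30 readings never leave the site.
stream = [400.0 + (i % 3) for i in range(30)]
stream[25] = 900.0
print(edge_filter(stream))  # [(25, 900.0)]
```

Only the flagged tuples would need to cross the network; the steady-state readings are summarized or discarded locally.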

Edge computing solves several problems inherent in collecting, processing and reporting on sensors, cameras and other IIoT devices that are probably scattered across your infrastructure. But what happens when the need is to process streaming video for insights into flare stacks in particular?

There’s a common misconception that machine learning (ML) and artificial intelligence (AI) can only be performed by powerful, large-scale systems. But a high-powered server isn’t necessary to take advantage of these technological advances. One of the ways edge intelligence differs from simple edge computing is the ability to perform sensor fusion by incorporating ML and AI. Combine those capabilities, and something impressive emerges: edge intelligence.

The inclusion of ML- and AI-based applications allows operators to get the most benefit from IIoT-connected monitoring cameras. Edge-enabled ML and AI allow issues in streaming camera data to be assessed and acted upon in real time.
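To make the frame-by-frame assessment concrete, here is a minimal sketch of the pattern: score each incoming frame locally, then raise an alert only when the score leaves an acceptable band. The bright-pixel heuristic, function names and thresholds are illustrative assumptions; a real deployment would run a trained vision model in place of `flame_fraction`.

```python
def flame_fraction(frame, bright=200):
    """Fraction of pixels above a brightness cutoff -- a crude stand-in
    for what a trained vision model would infer about the flare plume."""
    pixels = [p for row in frame for p in row]
    return sum(p >= bright for p in pixels) / len(pixels)

def assess(frame, low=0.02, high=0.30):
    """Classify a frame at the edge; only anomalies need leave the site."""
    frac = flame_fraction(frame)
    if frac < low:
        return "ALERT: flare may be extinguished"
    if frac > high:
        return "ALERT: flare larger than expected"
    return "ok"

# Two synthetic 4x4 grayscale frames: a normal flame and a flame-out.
normal = [[250, 30, 30, 30],
          [250, 30, 30, 30],
          [30, 30, 30, 30],
          [30, 30, 30, 30]]
flameout = [[30] * 4 for _ in range(4)]
print(assess(normal), assess(flameout))
```

Because the decision is made beside the camera, the alert can fire in the time it takes to score one frame, with no round trip to a control center.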

Putting that kind of advanced intelligence at the edge of the network, where the stacks are, reduces model size and required memory by approximately 80%, enabling fast and efficient execution in real time.

Ensuring safety

Video analytics with edge intelligence in flare stack monitoring can help reduce flare stack emissions and immediately alert workers, or even shut down operations, when emissions fall outside an acceptable range. The Environmental Protection Agency (EPA) has produced and is enforcing a comprehensive set of regulations surrounding flares, particularly to reinforce the need for careful controls over hazardous air pollutants (HAPs). HAPs can cause health issues, such as cancer or birth defects, or serious environmental damage. Using advanced, intelligent video analytics to closely monitor flares helps ensure full compliance with EPA regulations while protecting workers and the environment.

Author Bio: Ramya Ravichandar is vice president of products, FogHorn.