Operations data includes clues for automatic process improvement
Current market conditions are forcing oil and gas companies to do some corporate soul-searching.
They are seeking ways of doing business that will not only help them stay afloat, but also allow them to keep boosting profits in the face of persistently low product prices.
There is no shortage of people offering advice on how to reach that particular state of equilibrium. Industry analysts, consultants and even a few media people have studied and weighed in on the topic. It appears all of these deep thinkers have reached the same conclusion: perfecting the use of digital technologies offers the best path for oil and gas companies to thrive in this new environment.
Those holding that opinion include oil and gas industry experts with the global business consulting firm McKinsey & Company, which has published a series of reports on this sector since prices began their precipitous slide in 2014.
"Our research finds that the effective use of digital technologies in the oil and gas sector could reduce capital expenditures by up to 20%," McKinsey stated in an August 2016 report titled, The Next Frontier for Digital Technologies in Oil and Gas.
The report also stated these technologies could cut upstream operating costs by 3% to 5% and downstream costs by at least half that much.
Clearly such cost reductions would improve the corporate bottom line, and McKinsey argues those results are possible because inserting new digital technologies into their operations allows oil and gas companies to squeeze more value out of their existing assets, rather than expending funds on new ones.
This argument obviously applies to the equipment used to find, produce and transport oil and gas. However, it also applies to a not so obvious—but equally important—asset: the data generated during the exploration and production processes.
Aids process automation
In fact, some industry observers believe this era of low prices has made data an oil and gas company’s most valuable asset because it constantly provides clues on the most efficient ways of managing all of the enterprise’s other assets—from exploration and production equipment to people.
Perhaps the biggest positive impact the use of data can have on an oil and gas company is as an aid to process automation. McKinsey stated as much in an August 2014 report titled Digitizing Oil and Gas Production.
In that report, McKinsey advised oil and gas companies to explore the use of big data and advanced analytics as a process automation tool, arguing that this approach promises both micro and macro benefits.
The micro benefits could include the automation of routine tasks that are expensive, dangerous or error-prone. The macro benefits could include tackling myriad challenges confronting companies across the industry, including the following:
- Managing operations in increasingly complex and hostile environments, such as oil and gas fields located in arctic, offshore or geographically remote areas
- Better monitoring of conditions that can lead to health, safety or environmental incidents that ultimately could drive an operator out of business
- Retaining knowledge that is being lost as experienced workers retire from the industry in the major demographic shift known as "the great crew change."
For a novice, embarking on a big data and advanced analytics project could prove an overwhelming task, but many oil and gas companies have been laying the groundwork for such endeavors, often unknowingly, for many years.
For instance, field engineers have been collecting and analyzing complex seismic data—starting with 2-D pictures and moving into 3-D images—to locate oil and gas deposits since the 1980s. For just as long, oil wells, pipelines and refineries have been outfitted with control systems consisting of sensors and other devices that feed operators information about the operational status of various types of equipment.
For many oil and gas companies, taking the next step of applying big data and advanced analytics to process automation is simply a matter of understanding which processes could benefit most from such automation. McKinsey’s analysis indicates that automating processes related to production operations is likely to yield the greatest return on investment.
While the exact processes that should be automated first may vary depending on the production environment, most industry experts agree that two things will hold true in almost every situation:
- Using data to not just automate—but also optimize—any production-related process will prove worthwhile; and
- Using automation to practice preventive maintenance will produce significant financial returns.
General process optimization entails creating strategies for constantly improving any process deemed critical to the company’s profitability. This happens by collecting data generated in the process, analyzing the data to uncover process flaws and implementing changes to correct those flaws.
Preventive maintenance, as the name implies, is analyzing data on the operational status of individual pieces of equipment and using that information to keep that equipment functioning at maximum capacity for longer periods of time.
"The potential impact of using advanced analytics for preventive maintenance is a decrease in maintenance costs of up to 13%," McKinsey stated in its August 2016 report. "At one company, where maintenance costs accounted for 25% of operating expenses, this enabled preemptive equipment maintenance—in effect, vital equipment could be repaired before it broke down. This effort reduced costs by up to 27% while increasing reliability and uptime."
Babar Iftikhar calls this phenomenon "condition-based maintenance." He also says he is seeing it practiced with greater frequency in his work as a product development manager for Intech, a global provider of process automation solutions and services.
Regardless of what it’s called, this type of maintenance involves constantly monitoring critical data sets on the equipment being maintained, and having the ability to transmit that data, ideally in real time, to people who can step in and fix the equipment before an impending failure occurs.
Once again, the oil and gas companies that have laid a technology foundation (in this case, those that have installed distributed control systems connected to data historians) are a step closer to putting this practice into action.
"Various types of control technology—PLCs, distributed control systems and SCADA systems—accumulate different types of data," Iftikhar explained. "To create a condition-based maintenance solution, we first have to collect all of the data from the instrumentation and devices into a central location. Most often that will be a data historian located in the plant. Then we can write logic for how to process that data on top of the historian."
Typically, this logic, or program code, is written to instruct the historian to issue an alarm if the equipment being monitored shows readings outside of certain parameters. If, for instance, a compressor displays a low-pressure reading x number of times in a day for more than 10 minutes, the system will trigger an alarm. In a true condition-based monitoring system, that alarm will go to a dashboard connected to the preventive maintenance module of an ERP system, which automatically issues a work order dispatching a technician to the machine.
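The kind of logic described above can be sketched in a few lines. This is an illustrative example only, not code from any actual historian product; the pressure threshold, duration and daily event count are hypothetical values standing in for limits that would come from the equipment vendor and site maintenance engineers.

```python
from datetime import datetime, timedelta

# Hypothetical thresholds for illustration only; real values would come
# from the compressor vendor and the site's maintenance engineers.
LOW_PRESSURE_PSI = 150               # reading below this counts as "low"
MIN_DURATION = timedelta(minutes=10) # low pressure must be sustained this long
MAX_EVENTS_PER_DAY = 3               # sustained events per day before alarming

def low_pressure_events(readings, threshold=LOW_PRESSURE_PSI,
                        min_duration=MIN_DURATION):
    """Scan a day's (timestamp, psi) readings pulled from the historian
    and return the sustained low-pressure events as (start, end) pairs
    lasting at least min_duration."""
    events, start = [], None
    for ts, psi in readings:
        if psi < threshold:
            start = start or ts          # mark the start of a low stretch
        else:
            if start and ts - start >= min_duration:
                events.append((start, ts))
            start = None                 # pressure recovered; reset
    # Handle a low stretch still open at the end of the data window.
    if start and readings[-1][0] - start >= min_duration:
        events.append((start, readings[-1][0]))
    return events

def should_alarm(readings):
    """True if the day's readings show enough sustained low-pressure
    events to warrant a condition-based maintenance work order."""
    return len(low_pressure_events(readings)) >= MAX_EVENTS_PER_DAY
```

In a deployed system, a `should_alarm` result of true would be what pushes the alarm to the dashboard and triggers the ERP work order described above.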
"When the maintenance team receives this type of work order—which is not a preventive or corrective work order—they visit the site and diagnose the problem based on the listed symptoms," Iftikhar said. "One gas production facility in Pakistan implemented this strategy on seven giant compressors, and immediately saw its incidents of production shutdowns disappear. That is the value of condition-based maintenance."
Iftikhar also has helped oil and gas companies with what could be considered general process optimization. In this case, it was helping them optimize alarm management. Such optimization often becomes necessary after a plant has been using a control system for a number of years, and the nature of the business has shown some alarms to be more critical than others.
"You can find operators being distracted by alarms that never actually lead them to a problem," Iftikhar said. "In one facility in Nigeria, operators were receiving as many as 5,000 alarms a day. The ISA standards say operators should never receive more than six alarms an hour, or roughly 140 a day. We can rationalize those alarms by pulling four to six weeks’ worth of data from the historian and analyzing it to determine which ones have not required any corrective action. We then present the results to the customer and go about the process of disabling or changing the set points for non-critical alarms. That typically reduces the alarm response workload by 40% to 60%, and it represents a quick win for the customer on the use of technology."
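The core of the rationalization analysis Iftikhar describes can be sketched as a simple frequency study. This is a hypothetical illustration, not Intech's actual tooling: it assumes the historian export can be reduced to one record per alarm occurrence and that corrective work orders can be matched back to alarm tags.

```python
from collections import Counter

def rationalize_alarms(alarm_log, corrective_tags):
    """Rank alarms from several weeks of historian data and flag the
    ones that never led to corrective action as candidates for
    disabling or set-point changes.

    alarm_log: iterable of alarm-tag strings, one entry per occurrence
    corrective_tags: set of alarm tags that resulted in a corrective
                     work order during the same window
    """
    counts = Counter(alarm_log)
    # Alarms that fired but never required action are the nuisance set.
    candidates = {tag: n for tag, n in counts.items()
                  if tag not in corrective_tags}
    # Most frequent nuisance alarms first: silencing those yields the
    # biggest reduction in operator workload.
    return sorted(candidates.items(), key=lambda kv: -kv[1])
```

The ranked output would then be reviewed with the customer before any alarm is actually disabled, since the final call on criticality belongs to the operator, not the analysis.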
Surveillance by exception
Using data for this type of optimization can go much deeper, according to Jose Jimenez, director of global oil & gas solutions at Emerson Automation. He helps companies employ a strategy called "surveillance by exception" that can help optimize almost any aspect of oil and gas production—from tracking and improving well performance to boosting the output and reliability of refinery equipment.
The surveillance by exception strategy involves building models for how you expect operations to perform and then developing means of collecting and analyzing data that can reveal if the actual performance is meeting those expectations. If actual performance is not within expected parameters, it is labeled an exception warranting immediate corrective action.
For this strategy to work, Jimenez says a company must be able to view and act on data "as it becomes available," as opposed to reviewing and reacting to reports generated at monthly, weekly, or even daily intervals. Jimenez said companies that have SCADA systems in place are collecting the necessary data to operate in this fashion; they just need a way of making that data readily available to operators who can recognize and act on the exceptions.
"When you’re doing this type of surveillance, you’re tapping into either a data historian or directly into a SCADA system," Jimenez said. "So, as the data becomes available, you compare it against your model and do a trend analysis or projection to see if anything needs to be examined more closely." The data sources could vary based on the actual process being observed. If the goal is to boost well performance, for instance, the system would be instructed to pull four to six different forms of data, such as maintenance records, production history and real-time output data from the field.
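The model-versus-actual comparison at the heart of surveillance by exception can be sketched as follows. This is a minimal illustration, not Emerson's or OVS's software: the well identifiers, rates and the 10% tolerance band are invented for the example, and a real system would draw both sides of the comparison from the historian or SCADA system rather than from in-memory dictionaries.

```python
def find_exceptions(observations, model, tolerance=0.10):
    """Compare each well's actual output against the model's expected
    output and return only the wells deviating by more than the
    tolerance fraction. Wells within tolerance need no operator
    attention, which is the point of surveillance by exception.

    observations: {well_id: actual_rate}
    model:        {well_id: expected_rate}
    """
    flagged = {}
    for well, actual in observations.items():
        expected = model.get(well)
        if expected is None:
            continue  # no model for this well; skip rather than guess
        deviation = (actual - expected) / expected
        if abs(deviation) > tolerance:
            flagged[well] = round(deviation, 3)
    return flagged
```

Only the flagged wells would surface on the operator's screen; everything performing to model stays silent, which is what keeps the operator's attention on the exceptions.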
A ‘no-regrets move’
Jimenez said growing demand for these capabilities recently prompted Emerson to enter a partnership with OVS Group, a Houston-based company that has created a library of prepackaged tools that automate the process of organizing data from various points of an oil and gas operation—from reservoirs to wells, pipelines and refineries—so that it can be easily analyzed.
"The foundation of the partnership is OVS’s ability to link into multiple data sources very quickly and without duplication," Jimenez said. "A lot of oil and gas companies have purchased and installed a lot of sensing and control products. That provides a good inflow of data from the field, but they don’t have the ability to evaluate that data against their performance models on a timely basis."
Sebastiano Barbarino, CEO of OVS Group, said companies seeking to use data for process improvement should be looking, first and foremost, for solutions that can help them identify which data they should be analyzing to achieve their specific business goals.
"We know big data is a trendy topic in the industry," Barbarino said, "but the amount of data you collect is not important. What matters is how you use the data. If you want to do gas leak optimization, you may need 10 points of data. Optimizing another process may require even fewer data points. We created our workflows to lead the users to those conclusions, showing them the exact data they need to get the result they’re seeking."
Solutions like these are why McKinsey called an investment in digital technology—including big data and analytics solutions—a "no-regrets move" that all oil and gas companies should quickly embrace.
Sidney Hill Jr. is a graduate of the Medill School of Journalism at Northwestern University. He has been writing about the convergence of business and technology for more than 20 years.