
The orchestration that precedes analysis

The Data-Driven Drilling and Production (DDDP) Conference emphasized cloud and edge technologies meeting the oil & gas industry.
By Kevin Parker July 30, 2019

The recent Data-Driven Drilling and Production (DDDP) Conference carried the tagline “where Silicon Valley meets oil & gas.” The event was as good as its word.

The annual Houston-held conference was about one-third larger than it was in 2018. Analytics, machine learning and artificial intelligence (AI) providers were much in evidence. Perhaps more surprising was the growing number of exhibitors focused on data normalization, i.e., preparing data to be fit-for-purpose or otherwise analyzable.

Palo Alto, Calif.-based Infoworks supports enterprise data operations and orchestration in the oil & gas and other industries. “Artificial intelligence can deliver benefits, but most companies are still struggling with the foundation and realizing that for analytics to play a greater role, getting the data under control is foundational,” said Buno Pati, Infoworks CEO.

Data scientists and data engineers have among the world’s most highly sought-after skill sets. “A manual approach to the data task doesn’t scale because there is just not enough talent or money available. A solution must automate the capabilities needed, integrate so that it scales up to a suite, and abstract away from the data,” Pati said.

Solid cross-section

Mark Thompson is a vice president with Campbell, Calif.-based Swim.AI, a company that says it can help the oil & gas industry “build massively real-time streaming applications” that are “stateful” and communicate and collaborate autonomously.

“Companies are drowning in data, swimming in sensors,” said Thompson. “We can bring in the data, normalize it, then bring in open-source and other tools to enable data analytics.”
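The pattern Thompson describes, ingesting raw sensor readings, normalizing them, then keeping per-sensor state for downstream analytics, can be sketched in a few lines. The field names, units and thresholds below are illustrative assumptions for this article, not Swim.AI’s actual API:

```python
from dataclasses import dataclass

# Illustrative sketch of a stateful, per-sensor stream processor:
# normalize heterogeneous readings onto one schema, then fold each
# reading into running state kept for that sensor.

@dataclass
class SensorState:
    count: int = 0
    total: float = 0.0

    def update(self, value: float) -> float:
        """Fold a new reading into the running state; return the mean."""
        self.count += 1
        self.total += value
        return self.total / self.count

def normalize(reading: dict) -> dict:
    """Map differing field names and units onto one schema (psi)."""
    if "psi" in reading:
        value = reading["psi"]
    else:
        value = reading["pressure_kpa"] / 6.894757  # kPa -> psi
    return {"sensor": reading["id"], "pressure_psi": round(value, 2)}

states: dict[str, SensorState] = {}

def ingest(reading: dict) -> dict:
    """Normalize one raw reading and update that sensor's state."""
    r = normalize(reading)
    state = states.setdefault(r["sensor"], SensorState())
    r["mean_psi"] = round(state.update(r["pressure_psi"]), 2)
    return r
```

The point of the per-sensor state object is the “stateful” quality Thompson cites: each sensor carries its own running context, so analytics can react to a stream without re-querying a central store.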

Swim.AI engages with the oil & gas industry because it is “concentrated and massive, and because of the scale and scope of the challenges being addressed. Essentially, we’re talking about machine learning and pattern matching. Cognitive AI is still coming.”

SparkCognition, founded in 2012, today employs 250 people. The company uses “AI to write the AI application, speeding the time in which a data science problem can be addressed,” said Philippe Herve, VP of solutions. The AI that writes the AI, in other words, provides the spark in the cognitive process.

“AI and analytics companies need to get out of batch mode,” said Sam Chance, a principal consultant with Boston-based Cambridge Semantics. “To do that, they don’t need to rip out and replace.”

The company presents the alternative of adding a layer “on top” that includes a Graph OLAP database for data discovery, machine learning and business-intelligence style analytics. “The idea is to bring disparate data together, and harmonize that data based on open systems. We have a query engine. The idea is to use policy, rather than people, to drive the compute.”
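The graph idea, bringing disparate records together by mapping each source’s schema onto shared predicates, can be sketched with plain (subject, predicate, object) triples. This is a toy illustration of the concept, not Cambridge Semantics’ product or query engine, and the field mappings are invented for the example:

```python
# Illustrative sketch: harmonize records from two systems with different
# schemas into one graph of (subject, predicate, object) triples, then
# query the graph instead of either source system.

triples: set = set()

FIELD_MAP = {
    "well_name": "name", "wellName": "name",   # two spellings, one predicate
    "op": "operator", "operator": "operator",
}

def harmonize(record: dict) -> None:
    """Map a source record's field names onto shared predicates."""
    subject = record.get("id") or record.get("uid")
    for field_name, value in record.items():
        pred = FIELD_MAP.get(field_name)
        if pred:
            triples.add((subject, pred, value))

def query(predicate: str, obj) -> set:
    """Find every subject linked to obj via predicate."""
    return {s for s, p, o in triples if p == predicate and o == obj}

# Two systems describe the same well with different schemas:
harmonize({"id": "W-12", "well_name": "Eagle Ford 12", "op": "Acme"})
harmonize({"uid": "W-12", "wellName": "Eagle Ford 12", "operator": "Acme"})
```

Because both records harmonize to the same predicates, `query("operator", "Acme")` finds the well regardless of which system supplied the data, which is the “policy, rather than people” idea: the mapping rules, not an analyst, decide how sources line up.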

Native to O&G

Eric Fidler, founder and CEO of Houston-based Lavoro Technologies, said emerging edge/IIoT suppliers to the oil & gas industry must demonstrate their ability to scale solutions. “Typically, an [IIoT, edge, Cloud] engagement involves more services. The challenge with working in the Cloud is that context needs to be provided.”

Lavoro bills itself as “the software as a service solution for digital oilfield automation.” Lavoro technology can replace or augment an RTU. It can provide SCADA technology or connect to SCADA. Applications in the upstream oil & gas sector include automation of choke valves, plungers, separators, tank flow, well tests and artificial lift performance.
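The kind of control logic an RTU or edge gateway runs for one of these applications, say a plunger-lift well, often amounts to a hysteresis loop: open the motor valve when casing pressure builds past a high threshold, close it once pressure bleeds below a low one. The sketch below is a generic illustration with made-up thresholds; Lavoro’s actual product logic is not described in this article:

```python
# Hypothetical edge-side hysteresis control for a plunger-lift well.
# Thresholds are illustrative only.

HIGH_PSI = 450.0   # open the motor valve above this casing pressure
LOW_PSI = 250.0    # close the valve below this

def next_valve_state(pressure_psi: float, valve_open: bool) -> bool:
    """Deadband between LOW_PSI and HIGH_PSI prevents rapid
    open/close cycling around a single setpoint."""
    if pressure_psi >= HIGH_PSI:
        return True
    if pressure_psi <= LOW_PSI:
        return False
    return valve_open  # inside the deadband: hold the current state
```

Running this loop at the edge rather than in the cloud is the point Fidler raises about context: the decision depends only on local pressure and valve state, so it keeps working through a backhaul outage.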


Kevin Parker
Author Bio: Senior contributing editor, CFE Media