Reading Time: 9 minutes

Pressure on the petroleum industry is increasing.  With the “lower for longer” market, overcapitalization, budget overruns, and production oversupply have strained industry economics.  Exploration and development investments made in remote and environmentally sensitive areas have further added to the cost and complexity of capital projects.  The growth of renewables is affecting demand.  More stringent regulations on emissions and low-carbon requirements are adding industry constraints.  And the emergence of “pure play” companies, which specialize in specific parts of the value chain, is driving changes in market dynamics.

In summary, petroleum markets have evolved from a state of organizational centricity, in which companies and service providers largely define what to produce and market to customers, to one of individual centricity, in which technology-savvy consumers demand customized engagement and experiences.  Pipelines are now facing a new world order that is heavily affecting the industry.

Alberta Pumpjacks

Pipelines move gas and oil from the pumping source to the refinery and downstream to the markets.  Pipeline breaks, and the environmental damage that follows, have drawn a great deal of negative press in the last few years, and many special interest groups strongly oppose pipelines.  Yet they remain the safest, most efficient means to move product downstream to the markets.

These disasters can be averted with smarter technologies deployed to monitor the pipelines and provide situational awareness back to the operators.  Through the Internet of Things, combined with new internetworking technologies and watched over by artificial intelligence empowered to act and react to emergencies, damage can be caught in advance with new preventive maintenance approaches.  Ever-vigilant, 24/7 cognitive computing platforms can remotely close valves, isolate leakage, and stem the flow of product.

The issue today with pipelines is their dependence on older SCADA solutions whose monitoring points are spread thinly over broad spans of the line.  We simply do not have the ability to see every metre of the pipeline with these older systems.  We need a far more granular, real-time solution that not only monitors the pipeline but also offers command-and-control capabilities to manage it.
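As a sketch of what more granular, real-time monitoring makes possible, the hypothetical snippet below flags pipeline segments whose pressure drop between adjacent sensors exceeds the expected frictional gradient. The sensor positions, pressures, and threshold are all illustrative assumptions, not values from any real pipeline or platform.

```python
# Hypothetical sketch: flag a possible leak when the pressure drop between
# two adjacent sensors exceeds the expected frictional gradient.
from dataclasses import dataclass

@dataclass
class Reading:
    position_km: float   # sensor location along the pipeline
    pressure_kpa: float  # gauge pressure at that point

def find_leak_candidates(readings, max_gradient_kpa_per_km):
    """Return (upstream_km, downstream_km) pairs whose pressure drop
    exceeds the expected frictional gradient for the segment."""
    candidates = []
    ordered = sorted(readings, key=lambda r: r.position_km)
    for up, down in zip(ordered, ordered[1:]):
        span = down.position_km - up.position_km
        drop = up.pressure_kpa - down.pressure_kpa
        if span > 0 and drop / span > max_gradient_kpa_per_km:
            candidates.append((up.position_km, down.position_km))
    return candidates

readings = [
    Reading(0.0, 6000.0),
    Reading(5.0, 5950.0),   # normal drop: 10 kPa/km
    Reading(10.0, 5600.0),  # abnormal drop: 70 kPa/km -> flagged
    Reading(15.0, 5550.0),
]
print(find_leak_candidates(readings, max_gradient_kpa_per_km=25.0))
# -> [(5.0, 10.0)]
```

The denser the sensors, the shorter the flagged segment, which is exactly the granularity legacy SCADA installations cannot provide.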

Watson IoT Platform, a cognitive computing tool for pipelines

Artificial intelligence will play a major role in this next-generation pipeline network management system.  It is not meant to replace workers but to augment them and collaborate with them to keep the environment safe and protect our natural resources.  With a comprehensive end-to-end sensor network tied to analytic engines and to AI, pipeline breaks can be minimized and contained in seconds instead of hours or days.  This high-quality, rapid-response approach will stem the flows and shut down leaks almost instantly.  In fact, these systems can predict the likely point of a future leak and, through proactive measures, stop leaks before they ever happen.

IoT sensors can now do things that we could never do before.  The variety of sensors and their new low-cost model make it affordable to deploy them on a more granular scale than SCADA could ever be configured for.  These smart sensors can deliver numerous data measurements and collaborate, sharing data all along the pipeline.  External data, such as weather information, can be gathered and shared upstream too.

A Stirling engine to generate electricity in remote locations

Next-generation IoT networks based upon LoRa star architectures can cover long distances, as much as 10 to 20 kilometres in each direction from the gateway, depending upon the height of the gateway, the terrain, and obstructions from trees and foliage.  Line of sight to the nodes is important.  New mesh architectures can offer higher data rates in a relay topology that permits bidirectional hops along the pipeline; because the hops bend and twist in concert with the pipeline, they can overcome many issues related to line of sight and terrain.  Both internal and external sensors can be used on the pipelines.
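To illustrate why gateway height and line of sight matter so much at these ranges, a simple free-space link budget can estimate the margin of a 10 km LoRa hop. The transmit power, antenna gains, and receiver sensitivity below are illustrative assumptions, and free-space loss is a best case that terrain and foliage quickly erode.

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def link_margin_db(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                   sensitivity_dbm, distance_km, freq_mhz):
    """Received power minus receiver sensitivity: the fade margin."""
    rx_power = (tx_power_dbm + tx_gain_dbi + rx_gain_dbi
                - fspl_db(distance_km, freq_mhz))
    return rx_power - sensitivity_dbm

# Illustrative numbers only: a 14 dBm node, modest antennas, a typical
# LoRa slow-data-rate sensitivity figure, 915 MHz, 10 km line-of-sight hop.
margin = link_margin_db(14, 2, 6, -137, 10, 915)
print(f"{margin:.1f} dB")  # roughly 47 dB under ideal free-space conditions
```

A healthy free-space margin is what gets consumed by foliage, Fresnel-zone blockage, and fading in the field, which is why raising the gateway and keeping clear line of sight extends usable range.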

A new broadband geostationary VSAT satellite

Back-hauling the signals from the pipelines over new satellite connections can also make the links robust and low cost.  Satellite links can reach gateways in rural or remote locations where no conventional back-haul technology exists.  New, very low cost satellite connections are now available that make use of BGAN (Broadband Global Area Network) satellites or arrays of LEO (low earth orbit) transit satellites.

eStream Generator

The new IoT technologies use minimal power, so solar panels can charge batteries that keep nodes operating for years without intervention.  Gateways can make use of modern energy-harvesting techniques to power larger points of presence along the pipeline: nearby streams can cleanly generate electricity, and solar panels and small-scale wind turbines can do the same in an environmentally friendly manner.

“Wireless sensors are emerging as one of the strongest options for pipeline monitoring applications,” Varun Babu, Frost & Sullivan TechVision research analyst, said in a statement. “With the adoption of wireless sensor network (WSN) technology, on-board computational sensing and wireless communication capabilities, the quality of monitoring will significantly improve”.

“The WSN sensor nodes and algorithms can provide rich information for detection, location and assessment of structural damage caused by severe loading events and progressive environmental deterioration. WSNs can also monitor more data points and be reconfigured more easily than wired sensors,” Babu said.

BGAN Satellite Terminal

Edge computing can add to the robustness of the solution, with edge nodes at stations acting independently if the back-haul network goes down.  The network can be segmented into sections, and those sections further subdivided, a ‘divide and conquer’ method that enhances the resilience of the connections.  All of these nodes can be programmed to operate autonomously and thereby ride out disconnections from the network.  Once the network returns to service, the remotely collected data is synchronized back to the cloud.
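The store-and-forward behaviour described above can be sketched in a few lines: an edge node buffers readings locally while the back-haul is down and flushes them once connectivity returns. This is a minimal illustration of the pattern, not the actual edge software.

```python
import collections

class EdgeNode:
    """Minimal store-and-forward sketch: buffer readings locally while
    the back-haul is down, then flush them to the cloud once it returns."""

    def __init__(self, uplink):
        # uplink sends one reading; raises ConnectionError if the link is down
        self.uplink = uplink
        self.buffer = collections.deque()  # readings awaiting delivery

    def record(self, reading):
        self.buffer.append(reading)
        self.flush()

    def flush(self):
        while self.buffer:
            try:
                self.uplink(self.buffer[0])
            except ConnectionError:
                return                     # back-haul down: keep data, retry later
            self.buffer.popleft()

sent = []
online = {"up": False}

def uplink(reading):
    if not online["up"]:
        raise ConnectionError("back-haul down")
    sent.append(reading)

node = EdgeNode(uplink)
node.record({"psi": 812})   # link down: reading is buffered locally
online["up"] = True
node.record({"psi": 815})   # link restored: both readings flush in order
print(sent)                 # -> [{'psi': 812}, {'psi': 815}]
```

Because delivery order is preserved and nothing is dropped on failure, the cloud view converges to the field view as soon as the back-haul recovers.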

IBM-designed VSAT terminal at the Durango, Colorado gas pumping station for Xcel Energy.  Over 360 satellite nodes are planned across 8 states

When combined with other technologies, such as drones, aircraft, ultrasound, harmonic analysis, LiDAR, intermediate sensing, PIG sensing, and even the existing SCADA systems, all of these data sources can be merged into one harmonized over-watch solution that dramatically enhances situational awareness of the pipeline.  The diverse data sources cross-confirm and augment each other to tell a richer story of the health and performance of the pipeline.
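One simple way to picture how disparate readings cross-confirm each other is a confidence-weighted fusion of per-source leak estimates. The fusion rule, source list, and trust weights below are hypothetical, chosen only to illustrate the idea of combining sources of differing reliability.

```python
def fuse_alerts(estimates):
    """Combine independent leak-probability estimates from different
    sources with a simple confidence-weighted average.  Weights reflect
    how much each source is trusted; this rule is illustrative only."""
    total_weight = sum(w for _, w in estimates)
    if total_weight == 0:
        return 0.0
    return sum(p * w for p, w in estimates) / total_weight

# (leak probability, trust weight) per hypothetical source
sources = [(0.9, 0.5),   # dense IoT pressure sensors
           (0.7, 0.3),   # ultrasonic PIG run
           (0.2, 0.2)]   # legacy SCADA segment alarm
print(round(fuse_alerts(sources), 2))  # -> 0.7
```

Even this toy rule shows the benefit: a single noisy source cannot trigger or suppress an alert on its own, while agreement across sources raises the fused confidence.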

Satellite Arc Look Angle Analysis for the Xcel Energy Tiffany Compressor Station

The IoT sensors alone are not enough.  The data needs to be analyzed and understood by artificial intelligence systems like IBM’s Watson IoT Platform.  One major industry challenge is making data available to decision-makers in a way that benefits everyone.  Derived data outcomes need to be shared not only with the people controlling the pipeline, but also with the managers and field workers operating it.  It is this return path of data that brings the pipeline to life and provides the real-time feedback, analysis, and dashboard parameters the team needs to act in concert with the autonomous systems.  This same data can be searched for patterns, trends, and history, as well as combined with legacy and external data.  The story of the pipeline becomes vivid and more telling when data is aggregated from so many sources.

Network drawing for Mist, Fog, and Cloud Computing
IBM End-to-End IoT Ecosystem

The speed at which data becomes available and depreciates, its sources, types and trustworthiness all play a critical role in reaching informed, accurate decisions. Although the amount of data available may be a looming “pot of gold,” more data is not necessarily better.  Data volume alone is not a sufficient prerequisite for excellence in decision management in the pipeline industry.  To contextualize, organize, and draw true meaning from data, operators are turning to artificial intelligence (AI) and cognitive computing to augment the capabilities of their business experts.

Gathering information and curating data are two keys to effective decision management.  However, the ability to operationalize data for decision making is the crucial capability operators need to survive in a world of heightened competition and changing risks.

The pipeline industry has always had to balance seeking additional data while not overspending on the operations. This dichotomy has created an environment where information has been challenging to obtain and made risks more difficult to assess. As a result, operator decisions are typically made as much from experience and intuition (art) as they are from empirical evidence (science).

A PIG being inserted into a pipeline for sensor data gathering

The “art” of pipeline decision making comes from using learned associations and experience to account for missing data:

  • For operators, using prior experience to account for unknown risk details has been central to decision making.
  • Analyst staff have been conducting reviews of sampled data, developing correlating hypotheses and making decisions by analyzing several years of service disruption loss outcomes.
  • When settling claims, operators predict likely outcomes based upon their experience, which is employed to reduce claim costs and avoid unexpectedly large settlement amounts.

With AI, better models can be built to forecast product leakage to the litre and therefore predict the environmental impact and the cost of clean-up.

AI can instantly and autonomously command the pipeline to shut down at much more granular points to stop any leakage, then notify the company of the action taken, how the event is being managed, and a recommended course toward a speedy resolution.  With in-depth, real-time analysis of the pipeline, AI systems can get ahead of leaks and have crews assigned to remediate and fortify the pipeline once issues are known, even before they happen.  It is this predictive capability that saves money and reduces or eliminates financial impacts to the business.
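A toy version of such granular isolation logic: given a detected leak location, choose the nearest upstream and downstream block valves and estimate the volume of product trapped between them, which bounds the potential spill. The valve spacing, leak position, and pipe cross-section are illustrative assumptions, and a real controller would also verify valve state and follow the operator's isolation procedures.

```python
import bisect

def isolate_leak(valve_positions_km, leak_km, pipe_area_m2):
    """Pick the nearest upstream/downstream block valves around a detected
    leak and estimate the product volume (m^3) trapped between them.
    Illustrative only; assumes the leak lies between two listed valves."""
    valves = sorted(valve_positions_km)
    i = bisect.bisect_left(valves, leak_km)
    upstream, downstream = valves[i - 1], valves[i]
    # volume = segment length (m) * cross-sectional area (m^2)
    trapped_m3 = (downstream - upstream) * 1000 * pipe_area_m2
    return upstream, downstream, trapped_m3

# 0.5 m^2 cross-section (~0.8 m bore), leak detected at km 12.4
up, down, vol = isolate_leak([0, 5, 10, 15, 20], 12.4, 0.5)
print(up, down, vol)  # -> 10 15 2500.0
```

The finer the valve spacing, the smaller the isolated segment and the trapped volume, which is the quantitative case for shutting down "at much more granular points".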

IBM’s Watson IoT Platform has the tools, people, and processes to help operators move from a reactive, lightly aware state of operations to a proactive, deeply aware one.

At the end of the day, it is a win-win scenario.  Companies can operate more efficiently, costs are controlled and predictable, damage and loss are eliminated or minimized, and everyone lives in a safer, cleaner environment, protecting our natural resources while enjoying the benefits of clean, safe, low-cost energy to heat our homes, fuel our cars, and make our world work.


IBM. (2018). AI / Cognitive. International Business Machines Corporation. Retrieved November 23, 2018.

IBM. (2018). Cognitive decision making in insurance: From art to science. International Business Machines Corporation. Retrieved November 23, 2018.

IBM. (2018). Digital transformation is reshaping the oil and gas industry. International Business Machines Corporation. Retrieved November 23, 2018.

IBM. (2018). Extracting digital rewards: Digital Reinvention in petroleum. International Business Machines Corporation. Retrieved November 23, 2018.

Smith, M. (2017). Wireless sensing technologies advancing pipeline monitoring sector: Frost & Sullivan. JWN, a division of Glacier Media Inc. Retrieved November 23, 2018.

About the Author:

Michael Martin has more than 35 years of experience in systems design for broadband networks, optical fibre, wireless and digital communications technologies.

He is a Senior Executive with IBM Canada’s GTS Network Services Group. Over the past 13 years with IBM, he has worked in the GBS Global Center of Competency for Energy and Utilities and the GTS Global Center of Excellence for Energy and Utilities. He was previously a founding partner and President of MICAN Communications and before that was President of Comlink Systems Limited and Ensat Broadcast Services, Inc., both divisions of Cygnal Technologies Corporation (CYN: TSX).

Martin currently serves on the Board of Directors for TeraGo Inc (TGO: TSX) and previously served on the Board of Directors for Avante Logixx Inc. (XX: TSX.V). 

He serves as a Member, SCC ISO-IEC JTC 1/SC-41 – Internet of Things and related technologies, ISO – International Organization for Standardization, and as a member of the NIST SP 500-325 Fog Computing Conceptual Model, National Institute of Standards and Technology.

He served on the Board of Governors of the University of Ontario Institute of Technology (UOIT) and on the Board of Advisers of five different Colleges in Ontario.  For 16 years he served on the Board of the Society of Motion Picture and Television Engineers (SMPTE), Toronto Section. 

He holds three master’s degrees, in business (MBA), communication (MA), and education (MEd). As well, he has diplomas and certifications in business, computer programming, internetworking, project management, media, photography, and communication technology.