
What is cloud computing? The so-called “cloud” is a new spin on an old topic, but with differences and advancements that drive greater benefit for the user. At its core, it is the virtualization of network back-office systems, together with the sharing and outsourcing of those systems.

With cloud computing, all of the users share some or all of the requisite technology – hardware and software – necessary to run the back-office systems. This technology can be located at a single site, but is normally spread across multiple data centres. These data centres can be distributed across a city, region, province or state, country, or even the globe. The hardware and software are most often virtualized for cost-effective sharing and scalability, though dedicated systems can be provided when necessary. Sophisticated, secure, and highly available data centres are fundamental to the cloud concept. Services are equally fundamental to the definition: they are added to the mix to boost the value proposition, and can be minimal or elaborate depending on the packaged offering.

In fact, the cloud is available in many versions and offerings – perhaps too many. It is understandably a confusing concept, as everyone seems to have a slightly different take on what exactly cloud computing means. Below is a basic set of definitions to help frame the topic.

Cloud computing is all about packages of services. These packages include varying amounts of hardware and software wrapped in internet-connected services. Cloud offerings are typically defined by exactly which package of services you acquire and from where you access them. Different customers make use of different packages. Examples include:

  • IaaS – The IT department – Extending current data centre infrastructure for temporary workloads (e.g. increased Christmas holiday site traffic)
  • PaaS – Developers – Increase developer productivity and utilization rates and decrease an application’s time-to-market. Test new deployments on temporary facilities before going live
  • SaaS – End Users – Replacing traditional on-device software. Application deployments – email, customer relationship management, sales management, online meetings, web sites, etc.

Package descriptions of services include:

  • IaaS – Infrastructure as a Service is the most basic offering on the cloud. It is typically thought of as access to basic compute services and the supporting infrastructure that underpins an application, such as security and storage. The Utility must install its own operating system and application software on these virtualized infrastructure offerings and administer them remotely
  • PaaS – Platform as a Service is similar to IaaS in that it includes the core infrastructure, but it also adds the operating systems, databases, web server applications, and the other hardware and software needed to complete the platform. The Utility provides only the core application itself, and maintains it
  • SaaS – Software as a Service combines all of the attributes of IaaS and PaaS with the applications too. The Utility simply pays a monthly or annual fee for the complete package, and it is ready to go. Everything is virtualized and operated by others on the Utility’s behalf. It is highly scalable and can be operated far more cost-effectively than the Utility owning and operating its own core infrastructure, data centre systems, servers, storage, security, networks, and applications. It is a pure outsource scenario in which all of these systems are provided with the requisite support, updates, and maintenance
  • UCaaS – Unified Communications as a Service is now evolving, whereby communication network services – multi-platform solutions such as IP telephony, mobility applications, voice, web and video conferencing, messaging, and more – are provided as an outsourced service. UCaaS enables the Utility to access social platforms and to communicate with end customers on any device desired, typically smartphones, tablets, computers, and wearable technologies
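The division of responsibility across these service models can be sketched in code. The following is an illustrative Python structure only – the layer names are generic assumptions, not tied to any particular vendor’s offering:

```python
# Illustrative sketch: who operates each layer under IaaS, PaaS, and SaaS.
# "provider" = cloud vendor, "utility" = the customer. Layer names are generic.
LAYERS = ["network", "storage", "servers", "virtualization",
          "operating_system", "middleware", "application"]

RESPONSIBILITY = {
    # IaaS: provider runs the virtualized infrastructure; the Utility
    # installs and administers everything from the OS up.
    "IaaS": {layer: "provider" for layer in LAYERS[:4]}
            | {layer: "utility" for layer in LAYERS[4:]},
    # PaaS: provider also supplies OS, databases, middleware; the Utility
    # provides only the application.
    "PaaS": {layer: "provider" for layer in LAYERS[:6]}
            | {layer: "utility" for layer in LAYERS[6:]},
    # SaaS: a pure outsource scenario; the provider runs everything.
    "SaaS": {layer: "provider" for layer in LAYERS},
}

def managed_by(model: str, layer: str) -> str:
    """Return who operates a given layer under a given service model."""
    return RESPONSIBILITY[model][layer]
```

For example, the operating system is the Utility’s responsibility under IaaS but the provider’s under PaaS and SaaS.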

There are three main approaches to cloud computing based upon who owns and operates the parts of it. These three models include: private, public, and hybrid cloud.

  • Utilities lean towards a private cloud, a solution that they operate internally for their own needs, often because they worry about the privacy and security of the alternatives. A private cloud is primarily a CapEx investment.
  • A public cloud is open over the internet for access and sharing amongst a wide variety of disparate users and applications. Utilities may elect to outsource their entire cloud requirements as an OpEx expense.
  • A hybrid cloud blends the private and public models and directs traffic to each cloud based upon specific needs and metrics. While it blends both CapEx and OpEx, it can deliver the desired security and privacy whilst accessing supercomputing analytic engines that would otherwise be unaffordable. IBM’s Watson is an example of a popular outsourced analytics tool.
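A hybrid placement rule of the kind described above can be sketched very simply. This is a minimal illustration under assumed policies – sensitive data stays private, oversized workloads burst to the public side – with an invented capacity threshold, not any vendor’s actual routing logic:

```python
# Minimal sketch of a hybrid-cloud placement rule: sensitive or regulated
# workloads stay on the private cloud (the CapEx side); workloads that
# exceed in-house capacity burst to the public cloud (the OpEx side).
# The capacity threshold is purely illustrative.

def place_workload(contains_personal_data: bool, compute_hours: float,
                   private_capacity_hours: float = 1000.0) -> str:
    """Decide where a workload runs under a simple hybrid policy."""
    if contains_personal_data:
        return "private"      # privacy and security concerns keep it in-house
    if compute_hours > private_capacity_hours:
        return "public"       # burst to rented public-cloud capacity
    return "private"          # default to owned infrastructure
```

A heavy, non-sensitive analytics job would be routed to the public cloud, while anything touching personal data stays private regardless of size.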


Figure 1 – IBM Cloud Framework

Cloud computing can be complex and, as stated, it is not as standardized as we might like it to be today. Many see cloud offerings as a commodity service, yet there are major differences between the leading cloud vendors and their offerings, so it is prudent to do your homework when shopping for cloud services. As I work for IBM, I naturally promote our offering as superior to others, and I am therefore clearly biased towards the IBM SoftLayer approach. The framework shown is a critical step to ensure uniformity and consistency when deploying cloud-based services. With this bias declared, we can now proceed to consider cloud computing as it relates to the Internet of Things (IoT).

“At IBM, we believe there are two primary ways that enterprises and entrepreneurs can take full advantage of the transformational opportunities the Internet of Things represents – the invention of new IoT products or services, and the optimization of business operations to deliver new services, upset long-standing business models, and find new and radical ways to connect people to the world around them.” – John Thompson, vice president of Strategy, Internet of Things

To achieve these transformations, Utilities need to understand that the Internet of Things requires an integrated fabric of devices, data, connections, processes, and people. It is a unique challenge of integrating digital Internet technology with physical Utility infrastructure. Utilities will need to capitalize on the Internet of Things, supported by Cloud, Analytics, Mobile, Security, and Social attributes, as well as to enable their back-end systems to support a full range of applications, allowing them to apply insight-driven actions to their business with confidence.


Figure 2 – uBlox Sensor Acquisition Model

In the uBlox model shown above, it is clear that sensors can connect to the cloud in a variety of ways. They can connect directly via a cellular connection in a classic M2M approach, or they can use a cellular connection via a media converter that translates from one format to another – in this case, from a serial stream to an IP stream. Repeaters are used to extend the coverage of various RF signal types – the 450 MHz, 900 MHz, and 2450 MHz bands, Bluetooth, and other near-field communications protocols. Satellite signals in the 1.5 GHz band for GPS can also be integrated. Fourth Generation (4G) signals like WiMAX in the Wi-GRID format are very popular with Utilities too. Gateways designed into the network fabric are common with Utilities: they aggregate data from a variety of sources, such as smart meters, so that it can be cleansed, packaged, and prepared for transport back to the network operations centre (NOC). A gateway can also connect mobile devices such as smartphones, laptops, and tablets. These same mobile devices may also connect via cellular or Wi-Fi back to the NOC for display and interaction with the IoT portal.
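The media-converter step – turning a raw serial stream into IP-ready traffic – can be sketched as a pure framing function. The newline-delimited framing and the JSON envelope below are illustrative assumptions, not any specific product’s protocol:

```python
# Sketch of the serial-to-IP media-converter step: a raw serial byte stream
# from a sensor is split into discrete readings and wrapped into payloads
# ready to be sent over TCP/UDP toward the cloud. Framing (newline-delimited
# readings) and the JSON envelope are assumptions for illustration.
import json

def serial_to_ip_frames(serial_bytes: bytes, sensor_id: str) -> list[bytes]:
    """Translate a newline-delimited serial stream into IP-ready payloads."""
    frames = []
    for line in serial_bytes.split(b"\n"):
        reading = line.strip()
        if not reading:
            continue  # skip empty fragments between delimiters
        payload = {"sensor": sensor_id, "reading": reading.decode("ascii")}
        frames.append(json.dumps(payload).encode("utf-8"))
    return frames
```

Each resulting frame is self-describing, so the gateway or NOC can route and store it without knowledge of the original serial format.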

The cloud hosts a variety of web services for data storage, aggregation, integration, billing, management, authentication and security.


Figure 3 – IBM / Semtech LoRa / LRSC Network

IBM joined with Semtech to develop an end-to-end solution for IoT networks using the cloud as a back-end interface technology. The Semtech LoRa (long range) radio solution was integrated with the IBM LRSC (long-range signalling and control) system to create a cohesive end-to-end solution. Standard off-the-shelf servers running Linux are used, permitting node authentication and billing extraction based upon network traffic flows and data consumption. Multiple users can interface to the network fabric via the cloud to connect to their nodes, each unaware of the other users sharing the same cloud access. This means that disparate, and perhaps even conflicting, users can access devices within the same network topology and share data from sensors and nodes located at the edge of the network. Traffic flows are routed as necessary to the end users. Revenue models can be developed from this sharing of content and information. Yet user isolation can also be maintained and users’ applications protected. In contrast, user integration means that the users are fully aware of and fully functioning with each other. Both aware and unaware interconnectivity can be had with cloud computing.
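The tenant-isolation idea – several users sharing one network fabric while each sees only the nodes mapped to them – can be sketched as follows. The node-to-tenant mapping here is a plain dictionary for illustration; a real system such as LRSC would back this with authenticated accounts and per-tenant keys:

```python
# Sketch of multi-tenant node isolation: disparate users share the same
# network topology, but each tenant can see only the nodes mapped to it.
# Node and tenant names are hypothetical.

NODE_MAP = {
    "node-001": {"tenant-A"},              # mapped exclusively to one user
    "node-002": {"tenant-A", "tenant-B"},  # shared, each unaware of the other
}

def visible_nodes(tenant: str) -> set[str]:
    """Return only the nodes this tenant is authorized to see."""
    return {node for node, tenants in NODE_MAP.items() if tenant in tenants}
```

Tenant B sees only the shared node, tenant A sees both, and an unknown tenant sees nothing – isolation is preserved even though all traffic crosses the same fabric.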

When using IPv6 as a protocol for connecting sensors and nodes, we gain the advantage of a quality security strategy with robust encryption and protection of datagrams. Normally, Utilities adhere to the NERC or NIST security standards. 6LoWPAN, which is designed for constrained networks, is the form of IPv6 used over IoT and cloud networks.
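One practical consequence of using IPv6 on sensor networks is that each node can derive its own address from its hardware MAC via the standard modified EUI-64 procedure used by IPv6 stateless autoconfiguration (and by 6LoWPAN). A minimal sketch, using the 2001:db8::/64 documentation prefix purely for illustration:

```python
# Sketch of modified EUI-64 address derivation: a sensor's 48-bit MAC
# address becomes the interface identifier of its IPv6 address by flipping
# the universal/local bit and inserting FFFE in the middle.
import ipaddress

def mac_to_ipv6(mac: str, prefix: str = "2001:db8::") -> ipaddress.IPv6Address:
    """Build an IPv6 address from a 48-bit MAC via modified EUI-64."""
    parts = [int(b, 16) for b in mac.split(":")]
    parts[0] ^= 0x02                                  # flip universal/local bit
    eui64 = parts[:3] + [0xFF, 0xFE] + parts[3:]      # insert FFFE
    iid = int.from_bytes(bytes(eui64), "big")
    return ipaddress.IPv6Address(int(ipaddress.IPv6Address(prefix)) | iid)
```

This lets millions of sensors self-assign globally unique addresses without any central allocation step.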

The greatest advantage of using the cloud is speed to market when implementing a solution. Cloud can be deployed very quickly and to a high standard: compared to other approaches that take many months to deploy, cloud implementations take just a few days. Cloud therefore offers a powerful advantage for rapid deployment, saving time and money while reducing complexity.

While IoT data rates are much lower than in other network models, segmenting the sensors and nodes into small, workable packages of devices – a cluster strategy – is how many millions of devices can be connected. These manageably sized clusters are controlled from just one or a few take-out points. The network can be centrally managed and still possess the density of sensors and nodes that Utilities desire, typically around 2,000 devices per cluster. This is the classic “divide and conquer” approach to internetworking.
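The clustering step itself is simple to sketch: a large device population is partitioned into fixed-size groups, each answering to one take-out point. The 2,000-device figure is the one quoted above; everything else here is illustrative:

```python
# Sketch of the "divide and conquer" cluster strategy: a flat list of
# device IDs is partitioned into manageable clusters of roughly 2,000
# devices, each controlled from a single take-out point.

CLUSTER_SIZE = 2000  # typical density figure cited for Utility deployments

def cluster_devices(device_ids: list[str]) -> list[list[str]]:
    """Partition a device population into fixed-size clusters."""
    return [device_ids[i:i + CLUSTER_SIZE]
            for i in range(0, len(device_ids), CLUSTER_SIZE)]
```

Five thousand devices, for instance, yield three clusters: two full ones and a remainder – each small enough to manage, yet all reachable from the centre.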

The end devices themselves can also be managed centrally via the cloud. Functions such as:

  • Device registration,
  • Device and application connectivity,
  • Securely receiving data and sending commands to devices,
  • Storage and access to historic data,

can all be easily managed over the cloud network.
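The management functions listed above can be sketched together as one small in-memory registry. This is an illustrative minimal sketch only – a real cloud service would add authentication, persistence, and transport security, and the class and method names are assumptions:

```python
# Minimal sketch of the cloud-side device-management functions listed above:
# registration, connectivity, receiving data / sending commands, and access
# to historic data. In-memory only; names are illustrative.

class DeviceRegistry:
    def __init__(self):
        self.devices = {}   # device_id -> connection state
        self.history = {}   # device_id -> list of received readings

    def register(self, device_id: str) -> None:
        """Device registration."""
        self.devices[device_id] = "registered"
        self.history[device_id] = []

    def connect(self, device_id: str) -> None:
        """Device and application connectivity."""
        self.devices[device_id] = "connected"

    def receive(self, device_id: str, reading: float) -> None:
        """Receiving data from a device (security omitted in this sketch)."""
        self.history[device_id].append(reading)

    def historic(self, device_id: str) -> list[float]:
        """Storage of and access to historic data."""
        return self.history[device_id]
```

Each listed function maps to one method, which is how a cloud portal would expose them behind an authenticated API.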

As we indicated, in the cloud model shown with the LoRa / LRSC graphic, multiple end users can share the same network topology completely unaware of the other users. Nodes can be mapped exclusively for a single end user or shared seamlessly between multiple end users, again, in complete isolation from each other.

Utilities will pull data from a myriad of sources: smart meters, power line meters, transformer meters, reclosers, segmentation switches, tie switches, and more. In the PowerSense graphic below, data is sourced directly from the power line itself. Values for current, voltage, and advanced line fault indication (including distance-to-fault), together with three-phase diagnostics – three-phase voltage (phase-phase), three-phase distributor current, phase angles, and three-phase distributor power (kVA, kVAR, kW, PF) – and more can all be shared in the cloud.
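The three-phase power quantities named above (kVA, kW, kVAR, PF) are related by standard balanced three-phase formulas, sketched below. The sample numbers in the usage note are illustrative, not field data:

```python
# Sketch of the standard balanced three-phase power relationships:
#   apparent power S (kVA) = sqrt(3) * V_phase-phase * I_line / 1000
#   real power     P (kW)  = S * PF
#   reactive power Q (kVAR)= sqrt(S^2 - P^2)
import math

def three_phase_power(v_pp: float, i_line: float, pf: float) -> dict:
    """Return apparent, real, and reactive power for a balanced load."""
    s_kva = math.sqrt(3) * v_pp * i_line / 1000.0       # apparent power
    p_kw = s_kva * pf                                   # real power
    q_kvar = math.sqrt(max(s_kva**2 - p_kw**2, 0.0))    # reactive power
    return {"kVA": s_kva, "kW": p_kw, "kVAR": q_kvar, "PF": pf}
```

For example, a 400 V phase-to-phase line carrying 100 A at a 0.9 power factor delivers roughly 69.3 kVA apparent and 62.4 kW real power.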


Figure 4 – PowerSense, a Landis + Gyr Company Sensor Connections

An important aspect for Utilities to remember with cloud implementations is that regulators sometimes demand that traffic reside only within the country of origin and not be stored or shipped outside of the country into other legal jurisdictions. This is especially important for privacy laws and the expectation of a right to privacy for personal information that might reside in the cloud. Other traffic, or traffic that cannot be identified to a specific individual, may be permitted to cross geographic borders. Some of the better cloud offerings allow you to limit where the data goes and in which data centres it rests. While the data may need to remain in-country, the cloud administration may be located offshore and still adhere to the regulations.
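A residency rule of this kind can be sketched as a simple filter over data-centre locations. The data-centre names and countries below are hypothetical examples, and the two-way personal/non-personal split is a simplification of real privacy law:

```python
# Sketch of a data-residency rule: personal data must stay in its country
# of origin, while non-identifiable data may cross borders. Data-centre
# names and locations are hypothetical.

DATA_CENTRES = {"dc-toronto": "CA", "dc-frankfurt": "DE", "dc-dallas": "US"}

def allowed_centres(origin_country: str, is_personal: bool) -> set[str]:
    """Return the data centres where this data may be stored."""
    if is_personal:
        return {dc for dc, country in DATA_CENTRES.items()
                if country == origin_country}      # must remain in-country
    return set(DATA_CENTRES)                       # free to cross borders
```

A Canadian utility’s customer records would thus be pinned to the in-country site, while anonymized grid telemetry could rest anywhere.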

Cloud is the future and most of the IoT will leverage cloud in some form. So, planning your needs with cloud in mind from the outset is smart.


Michael Martin has more than 35 years of experience in broadband networks, optical fibre, wireless and digital communications technologies. He is a Senior Executive Consultant with IBM’s Global Center of Excellence for Energy and Utilities. He was previously a founding partner and President of MICAN Communications and earlier was President of Comlink Systems Limited and Ensat Broadcast Services, Inc., both divisions of Cygnal Technologies Corporation. He holds three Masters level degrees, in business (MBA), communication (MA), and education (MEd). As well, he has diplomas and certifications in business, computer programming, internetworking, project management, media, photography, and communication technology.