Historically, we built marvelous data centres and positioned multimillion-dollar firewalls around them, like moats protecting a castle, to guard our data. Almost all of the data resided within the data centre, so it was safe and protected. We then processed this data to bill customers, understand metrics, analyze performance, feed the supply chain, track projects, manage staff, and perform other important activities.
None of these actions were performed in real-time. The time needed to make sense of the data was often long, and the process was arduous. Batch processing was the preferred method to crunch the data, and it took time to compute. Besides the sheer computational processing power needed to calculate the results, a vast amount of computer storage was needed to host the data. The IT architecture was centralized and notoriously slow to execute. Understanding the data meant asking questions of it, and if we did not ask the right question the first time, then modifying the question, or asking new questions, was exasperating as it consumed even more time. Often, the data results were stale before they were known. The old ways were not fast ways. So, industries moved towards faster approaches to gain better situational awareness in a far timelier manner.
As computer technology evolved, the processes did accelerate, and the improvements were greatly valued. But with the introduction of faster compute capabilities came more and more data. Today, we call it Big Data. This is a nice way to describe the tsunami of data that is now overwhelming our data centres and taxing our infrastructure to the point of endless expensive upgrades. Along came Cloud Computing, which helped greatly too, but it just shifted the problem to someone else, which changed the risk equations and moved the cost structure from CapEx to OpEx. Most who migrated to the cloud enjoyed lower costs and in many ways mitigated business risks too. Some saw no savings or risk reductions. It all depended upon the application.
Today, we are seeing a new problem emerge. There is so much data that must be understood immediately for 24/7 operations that it all becomes real-time. The existing older batch approaches simply no longer work. We need to change the architecture of the solution, the methodology, the topology, and the thinking that underpins the requisite data science. We must leave behind the older centralized architecture and move to a federated model, composed of both distributed and centralized models operating in harmony.
Next, we need to embrace Mist and Fog Computing alongside Cloud Computing. Many know of Cloud Computing, the practice of using a network of remote servers hosted on the Internet to store, manage, and process data, rather than a local server or a personal computer. Some may be aware of Fog Computing, which pushes intelligence to the edge of the network and outside of the centralized data centres. But few are aware of the next generation, Mist Computing, which places compute, storage, and analytics at the extreme edge of the network fabric and fully embraces the world of the Internet of Things (IoT). Mist Computing is the use of microcomputers, microcontrollers, smart sensors, passive sensors, and actuators to manage data at the extreme edge in real-time. Fog Computing does similar work, but with more computational horsepower, and is located at key junction points of the network topology. Mist Computing is installed outdoors or on the factory floor, on the Industrial Internet of Things (IIoT) network, tied directly to the end devices.
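As a rough illustration of this three-tier hierarchy, the sketch below models placing a workload on the most centralized tier that still meets its latency budget. The tier names come from the text, but the latency figures, locations, and the placement rule are illustrative assumptions, not measurements from any real deployment.

```python
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    location: str
    typical_latency_ms: float  # assumed, order-of-magnitude only

# Ordered from the extreme edge inward to the data centre.
TIERS = [
    Tier("mist",  "extreme edge: sensors, microcontrollers, actuators", 0.1),
    Tier("fog",   "network junction points: gateways, micro data centres", 10.0),
    Tier("cloud", "centralized remote data centres", 100.0),
]

def place_workload(max_latency_ms: float) -> Tier:
    """Pick the most centralized tier that still meets the latency budget."""
    for tier in reversed(TIERS):  # prefer cloud, fall back toward the edge
        if tier.typical_latency_ms <= max_latency_ms:
            return tier
    return TIERS[0]  # hard real-time work stays at the extreme edge

print(place_workload(50.0).name)  # fog
print(place_workload(0.5).name)   # mist
```

The point of the sketch is the ordering, not the numbers: as the latency budget tightens, work is pushed from the cloud, to the fog, to the mist at the extreme edge.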
By pushing the intelligence towards the edge and extreme edge of the network fabric, several interesting things happen.
- Data might never need to travel to the Cloud. It might be processed in real-time at the edge or the extreme edge. If data is stored, it might be stored on the network fabric and not in the data centre (cloud). New data derived from the computations on the network may be all that ever needs to travel to the cloud. This derived data is often just a summary of the actual data sets, so it helps to reduce the Big Data tsunami at the data centre. It also lowers network transportation costs.
- Another benefit is that real-time processing can occur at the edge and the extreme edge of the network fabric. Data is analyzed with some form of stream computing that processes the data while it is in-flight over the network fabric. The latency of stream computing is measured in microseconds, not in hours or days, as was the case with the older methods. Therefore, situational awareness is greatly enhanced. Decision-making is performed by computers, so it is also real-time. Outcomes are reported to humans when required. Things can collaborate and share datagrams at the edge and extreme edge of the network fabric in real-time.
- Business funding shifts from an intensive, inefficient CapEx model to a variable, on-demand OpEx model. This approach permits better use of capital and tolerates the variables and vagaries of business in the competitively charged world of today. It virtualizes the business.
- The infrastructure economy is slowly being replaced by the services economy. Business is being virtualized, and we now see global enterprises formed that operate in industries without owning any assets. Examples include: “Uber, the world’s largest taxi company, owns no vehicles. Facebook, the world’s most popular media owner, creates no content. Alibaba, the most valuable retailer, has no inventory. And Airbnb, the world’s largest accommodation provider, owns no real estate. Something interesting is happening.” These companies are already virtualized businesses that leverage services from others and therefore act as arbitrage agents between the parties, between the providers of the services and the users of the services. Literally any business can be digitalized and virtualized.
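The first two points above, keeping raw data on the network fabric and shipping only derived summaries to the cloud, can be sketched in a few lines. The window size and summary fields below are illustrative assumptions, not part of any particular product or protocol.

```python
import statistics

def summarize_window(readings: list) -> dict:
    """Reduce a window of raw readings to a small derived-data record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": statistics.mean(readings),
    }

def stream(readings, window_size=1000):
    """Process readings in-flight at the edge: emit one summary per window
    instead of forwarding every raw sample to the data centre."""
    window = []
    for r in readings:
        window.append(r)
        if len(window) == window_size:
            yield summarize_window(window)  # only derived data goes upstream
            window = []
    if window:  # flush the final partial window
        yield summarize_window(window)

raw = [float(i % 10) for i in range(2500)]  # 2,500 simulated raw samples
summaries = list(stream(raw))
print(len(summaries))  # 3 summary records replace 2,500 raw records
```

Even this toy example shows the economics: thousands of raw samples collapse into a handful of derived records, which is what shrinks both the Big Data tsunami at the data centre and the network transport bill.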
What we are describing here is Industry 4.0, the fourth wave of the industrial revolution. Work will change once it is digitized end-to-end. As we connect things, make them smarter, and teach them how to learn, work improves exponentially, with a level of agility and dynamism never known in the past. These next generation solutions never need rest breaks and work tirelessly around the clock, continuously lowering the cost base and improving the quality of products and services. The four industrial waves are:
- First Industrial Wave – the use of water and steam power to make machines work
- Second Industrial Wave – the use of electrical power to run machines
- Third Industrial Wave – the use of computers to help humans to run machines to perform the work
- Fourth Industrial Wave – employs next generation technologies like artificial intelligence, cognitive computing, cloud, analytics, and the Internet of Things to perform the work in a fully automated and ever-learning way. It is the first wave that does not need humans to perform the work.
All of this ambition needs an intelligent network and a federated architecture to share the load and shift the computational, storage, and analytics burden out of the centralized data centre and distribute it throughout the network. The network fabric continues to share and collaborate with the data centre, but as a partnering solution, with more than 50% of the data residing permanently on the network fabric. In this way, we gain speed, lower costs, raise quality, and forever change the way that work is performed.
It is Fog Computing, and to a lesser extent, Mist Computing, coupled with the federation of the networks, that facilitates this transformation towards Industry 4.0. We cannot have digitization and virtualization of business if we do not first have the networks that ground these changes and underpin this movement. The next big innovative disruption is the intelligent, federated network fabric. Then, and only then, will we see the digital transformation of businesses and the evolution of IIoT and Industry 4.0.
You will find many articles posted on my LinkedIn Pulse page; click the link below to view and read them. Thanks for taking the time to read this post. Comments, shares, and likes are always welcome, and I will respond to questions pertaining to this article.
About the Author:
Michael Martin has more than 35 years of experience in broadband networks, optical fibre, wireless and digital communications technologies. He is a Senior Executive Consultant with IBM Canada’s GTS Network Services Group. Over the past 11 years with IBM, he has worked in the GBS Global Center of Competency for Energy and Utilities and the GTS Global Center of Excellence for Energy and Utilities. He was previously a founding partner and President of MICAN Communications and before that was President of Comlink Systems Limited and Ensat Broadcast Services, Inc., both divisions of Cygnal Technologies Corporation (CYN:TSX). Martin currently serves on the Board of Directors for TeraGo Inc (TGO:TSX) and previously served on the Board of Directors for Avante Logixx Inc. (XX:TSX.V). He served on the Board of Governors of the University of Ontario Institute of Technology (UOIT) and on the Board of Advisers of four different Colleges in Ontario as well as for 16 years on the Board of the Society of Motion Picture and Television Engineers (SMPTE), Toronto Section. He holds three Masters level degrees, in business (MBA), communication (MA), and education (MEd). As well, he has diplomas and certifications in business, computer programming, internetworking, project management, media, photography, and communication technology.
Curry, D. (2016). What is Industry 4.0? ReadWrite. Retrieved on April 8, 2016 from http://readwrite.com/2016/04/05/industry-4-0/
Sharma, K. (2014). New Fog Computing To Out Smart Cloud Computing. Indian Web 2. Retrieved on April 8, 2016 from http://www.indianweb2.com/2014/12/09/new-fog-computing-smart-cloud-computing/
Social.Techcrunch.com. (2016). Retrieved on March 23, 2016 from https://www.facebook.com/Booyah/posts/806198119456286