
As computing power advances and machines grow more capable, many new terms and phrases have entered everyday language.  Unless you work in the IT world, you may not appreciate the exact meaning or definitions of these terms.  I will try to break them down for you in plain English so we can all better comprehend what these data science folks are talking about.


Artificial Intelligence – the goal of AI (Artificial Intelligence) is to make machines that act like or mimic humans.  The quest is to make computers into sentient beings.  Sounds like science fiction, right?  Such machines would exhibit the attributes of human beings: emotions, learning, problem-solving, communication, reasoning, knowledge, planning, and much more.  The Turing Archive for the History of Computing defines Artificial Intelligence as “the science of making computers do things that require intelligence when done by humans.”

Let us use an example from medicine.  In an Artificial Intelligence system, the system analyzes the available data and tells the doctor which course of action to take.


Cognitive Computing – While Cognitive Computing and AI overlap a great deal, the difference is the degree to which the human is involved in the decision-making.  Returning to our medical example, in Cognitive Computing the system provides information to help the doctor decide.  In many cases, a doctor will indeed make the same choice that a Cognitive Computing platform suggests, but the exceptions are where human experience and judgment matter most, compared to what a computer can offer.  In Cognitive Computing, the platform is more of a tool for the user; it does not necessarily replace the user, but augments them.  Cognitive Computing is more collaborative than AI.

Cognitive systems are designed to solve problems the way humans solve problems, by thinking, reasoning, and remembering. As Saffron Technology explains, this approach gives Cognitive Computing systems an advantage over AI allowing them to “learn and adapt as new data arrives” and to “explore and uncover things you would never know to ask about.”
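The division of labour described above can be sketched in a few lines of code.  This is a hypothetical illustration, not a real diagnostic system: the condition names and likelihood scores are invented, and the two functions simply contrast a system that decides on its own with one that presents ranked evidence for the human to weigh.

```python
# Hypothetical sketch of the AI vs. Cognitive Computing contrast above.
# The conditions and likelihood scores are invented for illustration.

def ai_decide(scores):
    """AI style: the system itself picks the course of action."""
    return max(scores, key=scores.get)

def cognitive_assist(scores):
    """Cognitive style: present ranked options; the doctor decides."""
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Likelihood scores a diagnostic model might assign (made up).
likelihood = {"condition A": 0.62, "condition B": 0.31, "condition C": 0.07}

print(ai_decide(likelihood))         # the machine's single answer
print(cognitive_assist(likelihood))  # ranked evidence for human judgment
```

The only difference between the two functions is who makes the final call: `ai_decide` collapses the evidence into one answer, while `cognitive_assist` keeps all of it visible so the doctor's experience can override the top-ranked suggestion.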


Machine Learning – This is the ability of a computer to learn from the data that it ingests.  It observes trends and patterns and then predicts future outcomes based upon historical data.  Instead of people writing code to process the data, the computer does this itself based upon what it sees in the data it consumes.  It studies the incoming data and refines its own algorithms to process that data more easily and quickly going forward.  From these algorithms, it creates dynamic programs to make data-driven predictions and decisions.  Machine Learning is built upon computational statistics and mathematical optimization, which it uses to refine its algorithms and improve its outcomes.
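A minimal sketch of the idea: the simplest "pattern learned from historical data" is a straight line fitted by least squares, which can then forecast a future point.  The data here are made up for illustration, and real Machine Learning systems use far richer models, but the shape of the process, fit on history and predict forward, is the same.

```python
# Minimal sketch of learning from historical data: fit a line by
# ordinary least squares, then predict a future value from it.

def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Historical data: month number vs. units sold (invented numbers).
months = [1, 2, 3, 4, 5]
sales = [10, 12, 14, 16, 18]

slope, intercept = fit_line(months, sales)
forecast = slope * 6 + intercept  # predicted sales for month 6
print(slope, intercept, forecast)  # 2.0 8.0 20.0
```

No one told the program that sales rise by two units a month; it extracted that pattern from the data, which is the essence of the definition above.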


Deep Learning – If Machine Learning is specific, even constrained, to the process it is optimizing, then Deep Learning is a broader, more free-form approach to constructing the algorithms that interpret the data being read.  Deep Learning can be supervised, partially supervised, or even unsupervised.  Whereas classical Machine Learning typically works directly on the data features it is given and often relies on simpler, largely linear models, Deep Learning learns its own representations and applies non-linear transformations.  It observes the data through many stacked layers, with each layer building higher-level abstractions from the interactions in the layer below.  It is hierarchically organized, ranking and weighting these abstractions to prioritize them.  The attributes it discovers are compartmentalized into separate components of the overall model, so a specific component can be modified or replaced without reworking the rest.
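The stacked, non-linear layering described above can be shown in miniature.  This is an illustrative sketch, not a trained model: the weights are hand-picked for clarity rather than learned.  It shows a two-layer network with a ReLU activation computing XOR, a function that no single linear layer can represent, which is exactly the kind of non-linear behaviour that distinguishes deep networks.

```python
# Illustrative two-layer network computing XOR. The non-linear ReLU
# activation between layers is what makes this possible; weights are
# hand-picked for clarity, not learned from data.

def relu(v):
    """Non-linear activation: pass positives through, clip negatives."""
    return max(0.0, v)

def xor_net(x1, x2):
    # Hidden layer: two units, each a weighted sum through ReLU.
    h1 = relu(x1 + x2)        # fires when either input is on
    h2 = relu(x1 + x2 - 1.0)  # fires only when both inputs are on
    # Output layer: linear combination of the hidden abstractions.
    return h1 - 2.0 * h2

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))
```

Remove the ReLU and the whole network collapses into one linear formula that cannot separate XOR's outputs; the hidden units `h1` and `h2` are tiny examples of the learned intermediate abstractions the paragraph above describes.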

Without a doubt, more terms and language will evolve to describe the technologies emerging in the intelligent computing world.  These key terms are essential to gaining a better understanding of what is underway today.


About the Author:

Michael Martin has more than 35 years of experience in broadband networks, optical fibre, wireless and digital communications technologies. He is a Senior Executive with IBM Canada’s GTS Network Services Group. Over the past 12 years with IBM, he has worked in the GBS Global Center of Competency for Energy and Utilities and the GTS Global Center of Excellence for Energy and Utilities. He was previously a founding partner and President of MICAN Communications and before that was President of Comlink Systems Limited and Ensat Broadcast Services, Inc., both divisions of Cygnal Technologies Corporation (CYN:TSX). Martin currently serves on the Board of Directors for TeraGo Inc (TGO:TSX) and previously served on the Board of Directors for Avante Logixx Inc. (XX:TSX.V). He served on the Board of Governors of the University of Ontario Institute of Technology (UOIT) and on the Board of Advisers of five different Colleges in Ontario as well as for 16 years on the Board of the Society of Motion Picture and Television Engineers (SMPTE), Toronto Section. He holds three Master’s degrees, in business (MBA), communication (MA), and education (MEd). As well, he has diplomas and certifications in business, computer programming, internetworking, project management, media, photography, and communication technology.


References:

Evans, D. (2017). Cognitive Computing vs Artificial Intelligence: What’s the Difference? IQ Publication. Retrieved August 22, 2017.

Hoffeberg, S. (2016). IBM’s Watson Answers the Question, “What’s the Difference Between Artificial Intelligence and Cognitive Computing?” VDC Research. Retrieved August 22, 2017.

IBM Research. (2017). AI and Cognitive Computing. IBM Corporation. Retrieved August 22, 2017.

Wikipedia. (2017). Artificial Intelligence. Retrieved August 22, 2017.

Wikipedia. (2017). Cognitive Computing. Retrieved August 22, 2017.

Wikipedia. (2017). Deep Learning. Retrieved August 22, 2017.

Wikipedia. (2017). Machine Learning. Retrieved August 22, 2017.