
With the mass proliferation of the Internet of Things (IoT), and the business value it is expected to bring to end users, a nagging question remains about the quality of the data.


IoT is now everywhere, and some are troubled about whether sensors can provide trustworthy data.  In many cases, the role of the sensor is escalating to one that demands precision and trust.  For example, the recent generation of smart wearable devices, such as Apple’s Watch, can provide heart rate and arrhythmia detection data.  Starting with the Apple Watch 4, and continuing with the newer Watch 5, Apple offers the first direct-to-consumer product that enables customers to take an electrocardiogram right from their wrist, capturing heart rhythm in the moment they experience symptoms like a rapid or skipped heart beat and helping to provide critical data to physicians.  The irregular rhythm notification feature on Apple Watch can also occasionally check heart rhythms in the background and send a notification if an irregular heart rhythm that appears to be atrial fibrillation (AFib) is identified.  Apple worked with the US Food and Drug Administration (FDA) for a number of years to receive De Novo classification for the ECG app and the irregular heart rhythm notification, making the features available over the counter.


Clearly Apple did its homework and earned credible government approval to provide these new heart-monitoring services.  It is impressive.  But with Apple’s extreme wealth, it can afford to push the boundaries and invest in burdensome regulatory processes that would be an unassailable obstacle for most other companies.

With that in mind, how trustworthy is the data flowing from the average manufacturer’s sensors?


Not all sensors are measuring your heart: some monitor traffic flows on our roadways, control our smart grids, count passengers boarding trains, and detect explosives at airport scanning stations.  It is all serious work that needs to be trusted and effective.

There are literally millions of applications, and all of the source data must be trusted.

The old rule is, “garbage in, garbage out.”  If the source data is bad, then the decisions, information, knowledge, and situational awareness derived downstream from this source data are also bad.


Therefore, we need to be able to trust the sensors to provide reliable and trusted data.  But how can we trust them and truly know that the sensors’ data values are valid?  We can:

  • set standards and seek adherence and certification against these standards.
  • build checks and balances into the data collection system to validate the inflows of data against thresholds, expectations, trends, and patterns.
  • leverage artificial intelligence to continuously analyze data integrity and cleanse dirty data, or discard it, to a statistically trusted level of acceptability.
  • compare data samples against expectations and against a multitude of other sensors performing similar tasks, building a de facto standard of data quality from an aggregation of history and statistically averaging the results.
  • build in self-diagnostics to check the integrity of the sensor and ensure that it is performing as expected.
  • build in automated adjustments that permit the sensor to adapt to the environment and alter its results in concert with the operating environment, so the sensor is not adversely affected by external influences like temperature, moisture, wind, harmonics, and more.
  • And, there are likely many other possible strategies to validate the data.
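The threshold and cross-sensor comparison strategies above can be sketched in a few lines of code.  This is a minimal illustration, not a production design: the plausibility limits and peer-deviation tolerance are assumed values for a hypothetical temperature sensor, and a real system would tune them per deployment.

```python
import statistics

# Hypothetical plausibility limits for a temperature sensor (assumed values).
MIN_C, MAX_C = -40.0, 85.0
MAX_PEER_DEVIATION_C = 5.0  # how far a reading may stray from the peer median

def validate_reading(reading: float, peer_readings: list) -> bool:
    """Accept a reading only if it is physically plausible and agrees
    with the median of nearby sensors performing the same task."""
    if not (MIN_C <= reading <= MAX_C):
        return False  # outside the sensor's rated range
    if peer_readings:
        deviation = abs(reading - statistics.median(peer_readings))
        if deviation > MAX_PEER_DEVIATION_C:
            return False  # outlier relative to the de facto peer standard
    return True

# Example: one reading agrees with its peers, another does not.
peers = [21.4, 21.6, 21.5, 21.7]
print(validate_reading(21.5, peers))  # True
print(validate_reading(48.0, peers))  # False — plausible range, but an outlier
```

In practice this kind of check would run in the data collection layer, flagging or quarantining suspect readings rather than silently discarding them.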

But the easiest and smartest way, which is also the lowest-cost path forward, is to design and build sensors to a level of trust.  Calibrated sensors that meet standards and are pretested to deliver factual results are highly desirable.  If you can trust a sensor right out of the box, then this is the lowest-cost means toward success.
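Factory calibration often comes down to something as simple as a two-point mapping from raw sensor counts onto a reference scale.  The sketch below illustrates the idea; the raw counts and reference temperatures are hypothetical example values, not any particular device’s specification.

```python
def two_point_calibration(raw: float, raw_low: float, raw_high: float,
                          ref_low: float, ref_high: float) -> float:
    """Linearly map a raw sensor reading onto a reference scale using
    two known calibration points measured at the factory."""
    scale = (ref_high - ref_low) / (raw_high - raw_low)
    return ref_low + (raw - raw_low) * scale

# Hypothetical example: raw counts 100 and 900 correspond to 0 °C and 100 °C.
print(two_point_calibration(500, 100, 900, 0.0, 100.0))  # 50.0
```

More sophisticated sensors store several such calibration points (or a polynomial fit) in nonvolatile memory, but the principle of anchoring raw output to trusted references is the same.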

Arguably the greatest hurdles that IoT sensor technology faces right now are system integration and performance over time.  Difficulties during this process can occur for a number of reasons.


For example, if we consider sensors embedded in a bridge, we can listen to the ‘song’ of the supporting guy wires that vibrate in the wind.  They are affected by wind direction, speed, and temperature, and influenced by the currents and eddies of the water rushing underneath the span.  We can learn these harmonics and log the patterns of sounds they create, both sonic and subsonic.  In doing so, if a crack occurs in the concrete of the structure, it will change the song, and we can detect and locate the crack instantly, even if it is invisible to the human eye.  We know it is there because the harmonics change due to its presence.
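The bridge’s changing ‘song’ can be detected by comparing the dominant frequencies of a vibration signal against a learned baseline.  The following is a simplified sketch using a Fourier transform; the frequencies, tolerance, and simulated signals are illustrative assumptions, not real structural-health figures.

```python
import numpy as np

def dominant_freqs(signal: np.ndarray, sample_rate: float, top_n: int = 3) -> np.ndarray:
    """Return the top-N dominant frequencies (Hz) in a vibration signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return np.sort(freqs[np.argsort(spectrum)[-top_n:]])

def song_has_changed(baseline: np.ndarray, current: np.ndarray,
                     tolerance_hz: float = 0.5) -> bool:
    """Flag a structural change if any dominant harmonic has shifted."""
    return bool(np.any(np.abs(baseline - current) > tolerance_hz))

# Simulated example: a healthy cable 'sings' at 12 Hz; a crack shifts it to 11 Hz.
rate = 1000.0
t = np.arange(0, 2.0, 1.0 / rate)
healthy = (np.sin(2 * np.pi * 12.0 * t)
           + 0.5 * np.sin(2 * np.pi * 30.0 * t)
           + 0.25 * np.sin(2 * np.pi * 55.0 * t))
cracked = (np.sin(2 * np.pi * 11.0 * t)
           + 0.5 * np.sin(2 * np.pi * 30.0 * t)
           + 0.25 * np.sin(2 * np.pi * 55.0 * t))

base = dominant_freqs(healthy, rate)
print(song_has_changed(base, dominant_freqs(healthy, rate)))  # False
print(song_has_changed(base, dominant_freqs(cracked, rate)))  # True
```

Real structural monitoring uses far richer models, but the core idea is exactly this: learn the normal harmonic signature, then alarm on deviation.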

Sensors must be resilient and strong in the face of harsh conditions.  Tech that monitors temperature, fluids and gases, vibration and sound, or even light can be subject to physical damage, leaving a blind spot in the supposedly holistic system that IoT entails.  The good news is that sensor technology has proven itself steadfast despite these reliability worries: with the advent of powerful sensing materials, particularly various nano-materials, our sensors are more resilient than ever.

The versatility of IoT sensors is another requirement for effective utility, although, as you may guess, we already use a great range of sensors for individual industries and fields.  From chemical sensors to proximity or image sensors, the use of this technology is already nearly ubiquitous.  The next step is integrating these sensors into a single system.


Perhaps the greatest advantage sensors currently have in regards to implementation is their rapidly shrinking cost.  From 2004 to 2014, the average cost of sensors for IoT systems dropped by more than half, from $1.30 to 60 cents, and is expected to go as low as 38 cents by 2020.  Cheaper sensors mean that more can be packed into an IoT system, increasing the quality and quantity of the big data coming out of the system for analysis.

The number of companies developing sensor technology should grow in lockstep with the rising deployment of IoT devices, with a higher volume of sensor developers leading to higher competition and greater innovation.  For instance, sensors’ decreasing size allows for easier embedding into existing systems, meaning that workplaces do not have to spend as much money and effort to accommodate these new additions.

How well sensors work can determine the overall usefulness and longevity of IoT.  So, for companies that claim IoT is the most important emerging technology, evaluating the quality of sensors is a fundamental aspect of rolling out a fully functional system.  Business spending on IoT hit $964 billion in 2017, and while sensors themselves represent a small fraction of that cost, any potential issues can mean having to once again integrate new sensors, or in some cases, replacing devices completely to work with compatible sensors.


But enterprises can overcome these challenges through proper planning and due diligence.  Best practices are essential to implementing this emerging technology.  Such practices include having an involved CIO, including external vendors on IoT teams, and using third-party platforms to host IoT operations.  Specifically, direct involvement of external IoT sensor vendors can make for a more seamless integration process, and perhaps even save companies time and money when issues with implementation, damage, or security arise.

The challenges of IoT implementation are tangled up in a large technology stack.  Sensor technology is not without its flaws, but as it stands, today’s sensors are well advanced and on a strong path into the future.  There is certainly still room for sensor technology to grow, and as cloud computing and other elements of IoT also gain traction, it is not so far-fetched to bet that our sensors will be the bedrock of a secure and powerful Internet of Things.


Apple Newsroom. (2018). ECG app and irregular heart rhythm notification available today on Apple Watch. Apple Inc. Retrieved on November 10, 2019.

Sappin, E. (2019). If IoT is Built On Sensing, How Good Are Our Sensors? Forbes Media LLC. Retrieved on November 10, 2019.

About the Author:

Michael Martin has more than 35 years of experience in systems design for broadband networks, optical fibre, wireless, and digital communications technologies.

He is a business and technology consultant. Over the past 15 years with IBM, he has worked in the GBS Global Center of Competency for Energy and Utilities and the GTS Global Center of Excellence for Energy and Utilities. He is a founding partner and President of MICAN Communications and before that was President of Comlink Systems Limited and Ensat Broadcast Services, Inc., both divisions of Cygnal Technologies Corporation (CYN: TSX).

Martin currently serves on the Board of Directors for TeraGo Inc (TGO: TSX) and previously served on the Board of Directors for Avante Logixx Inc. (XX: TSX.V). 

He has served as a Member, SCC ISO-IEC JTC 1/SC-41 – Internet of Things and related technologies, ISO – International Organization for Standardization, and as a member of the NIST SP 500-325 Fog Computing Conceptual Model, National Institute of Standards and Technology.

He served on the Board of Governors of the University of Ontario Institute of Technology (UOIT) [now OntarioTech University] and on the Board of Advisers of five different Colleges in Ontario.  For 16 years he served on the Board of the Society of Motion Picture and Television Engineers (SMPTE), Toronto Section. 

He holds three master’s degrees, in business (MBA), communication (MA), and education (MEd). As well, he has three undergraduate diplomas and five certifications in business, computer programming, internetworking, project management, media, photography, and communication technology. He has earned 15 badges in next generation MOOC continuous education in IoT, Cloud, AI and Cognitive systems, Blockchain, Agile, Big Data, Design Thinking, Security, and more.