June 26, 2020 | Denise Rael
Moving intelligence to the edge can lead to systems with better real-time performance, better power efficiency and enhanced security. But more intelligence requires more code…
One widely reported analysis claims that Bitcoin consumes more than 40 terawatt-hours (TWh) per year. That is more energy than we here in Ireland consume in a year. This makes sense: Bitcoin operates on proof of work. To complete a transaction, the miners in the network perform a series of computations, and those computations consume power. Most of the miners' work yields no useful result to anyone outside the network, yet the miners earn currency for doing it. So it makes economic sense for them, but they are burning a lot of power on number crunching of little wider utility.
Proof of work was one of the founding principles of the Bitcoin system, but the scale of its energy use was an unintended consequence. It is beginning to concern some people and has them looking for alternatives. After all, a cryptocurrency can still function even if a large fraction of the miners goes away.
As I read about the power consumption of cryptocurrency mining networks, I wondered where else the law of unintended consequences might come into play. One area that springs to mind is the tens of billions of IoT devices expected to be in the field in the next few years. Would even a small power draw per device add up to something worth worrying about? After all, we are dealing with a lot of devices. To find out, I thought I'd take a look at some of the numbers.
Estimating the power consumption of all these devices is difficult. Some will be battery powered, some may harvest energy from ambient sources, and still others will run from mains supplies.
Take a modern smartphone's battery as an example. It stores around 5 Wh (sometimes less) and may need to be charged every second day, which gives a yearly energy consumption of 912.5 Wh. But in my experience, energy efficiency is key to most IoT and IIoT devices; most will be aiming for power levels of no more than a few tens of milliwatts. So let's say they consume around 36 mW on average. That means each device will need roughly 315 Wh of energy per year (~1.1 MJ).
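The arithmetic behind these figures is easy to check. A quick sketch, taking the smartphone's charge-every-second-day pattern and an assumed average IoT device draw of roughly 36 mW (the value that yields the ~315 Wh/year figure):

```python
HOURS_PER_YEAR = 24 * 365  # 8760 hours

# Smartphone: ~5 Wh battery, recharged every second day
phone_yearly_wh = 5 * 365 / 2  # 912.5 Wh per year

# Assumed average IoT device draw: ~36 mW (0.036 W), continuous
avg_power_w = 0.036
device_yearly_wh = avg_power_w * HOURS_PER_YEAR      # ~315 Wh per year
device_yearly_mj = device_yearly_wh * 3600 / 1e6     # Wh -> joules -> MJ, ~1.1 MJ

print(f"Phone:  {phone_yearly_wh:.1f} Wh/yr")
print(f"Device: {device_yearly_wh:.0f} Wh/yr ({device_yearly_mj:.2f} MJ)")
```

Note that a device averaging 36 mW consumes in a year about a third of what a smartphone does, despite the phone's much higher peak power: the phone spends most of its life idle, while the continuous average dominates here.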
That’s not a lot over a year. But if we have 40 billion devices out there does it become significant?
With 40 billion devices, that is 12.6 TWh of energy per year. For comparison, the electricity consumed worldwide by final users in 2005 was 16,806 TWh, according to Wikipedia. So our figure for new connected devices is about 0.07% of the world electricity supply (at least as it was a few years ago). A small fraction, but still far from zero.
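Scaling the per-device figure up to the fleet, and comparing against the 2005 world consumption figure quoted above, can be sketched as:

```python
DEVICES = 40e9            # assumed fleet size: 40 billion devices
PER_DEVICE_WH = 315       # yearly energy per device, from the earlier estimate
WORLD_TWH_2005 = 16_806   # world electricity use by final users, 2005 (TWh)

fleet_twh = DEVICES * PER_DEVICE_WH / 1e12        # Wh -> TWh: 12.6 TWh/yr
share_pct = fleet_twh / WORLD_TWH_2005 * 100      # ~0.07% of world supply

print(f"Fleet: {fleet_twh:.1f} TWh/yr, {share_pct:.3f}% of 2005 world use")
```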
So it looks like we won't have trouble finding power for the IoT/IIoT revolution any time soon. But we also need to collect and maintain all the data from these devices, and this simple analysis does not factor in that power need, nor the power needed to process the data into useful, actionable responses. Power is also a fundamental limitation on taking measurements in some settings: there is no point taking a measurement whose power cost outweighs the benefit it delivers.
So making sure you process data at the most energy-efficient location is important. Often this means processing at the edge of your IoT network. Designing your device with the lowest possible energy footprint remains key, but so does understanding the end-to-end solution. To reduce power, a good end-to-end architecture matters as much as efficient IoT devices.