Minding the Performance Gap
Consider this: in 1969, a computer running at 0.043 MHz, with fewer electronics than a modern toaster, helped a team of astronauts land on the Moon. To meet the constraints of the time, the engineering team strove to make the software as efficient as possible, coding it in assembly and loading it with punch cards. Back in the days of expensive RAM and limited CPUs, programmers had to optimize their logic, be frugal with memory, and be fastidious about freeing it up.
Nowadays, even modest systems have processors with multiple cores running at over 1 GHz, roughly 100,000 times faster than the computer used in the Apollo 11 mission! Over the years, as computing resources became cheaper than developers, our priorities changed: it became easier and quicker to add RAM or scale up an Amazon server than to optimize code.
Then the smartphone era happened, and it was like going back in time. Suddenly we had to program for a fraction of the computation power we were used to, and choose between a few major platforms. Today, with the internet of things reaching into every corner of life, we have to develop software for even less powerful products. It feels like the pendulum has swung us back to some bygone era!
Finding the Gap
Although the speed of computers has increased at a staggering pace, the growth has not been as uniform as you might think. The logarithmic graph below shows Dhrystone benchmark results for various CPUs plotted against the year they were released.
These tests perform integer operations and report the number of iterations of the main loop per second. While CPU power has grown steadily over the years, the trend line for phones lags about 5-7 years behind desktops, which means code must be optimized to run smoothly on them. Notice also that the range of CPU power in phones is wider than for desktop computers: while the average phone in North America is quite powerful, devices in emerging markets are still rather limited. Yet we expect them to run the same software! IoT devices follow a similar trajectory, about 5-7 years behind smartphones, but with an even greater diversity of platforms and CPUs.
So, what does that mean for the modern IoT developer?
Programming for Slow Devices
As in the early days of space travel and smartphones, developers face a technological hurdle. Most devices used for IoT development are tiny, low-powered, and run on limited batteries. The tradeoff is lower computational power.
Programmers have to relearn many performance-enhancing techniques to deliver the form factor and connectivity needed to succeed in the IoT business.
They have to be mindful that the latest and greatest tools that work smoothly on a development machine will most likely not run well in these constrained environments. Think of Docker containers and how they might overburden an IoT device.
The IoT Hardware Dilemma
The Arduino is a microcontroller. It runs very simple code, usually written in C, that loops over and over. It has very limited computational power (a 16 MHz CPU and 32 KB of flash for your code). You cannot run external software on it, since there is no operating system. It can only read or write a handful of 3-5 V pins. It has a serial port to send and receive data from an external machine, and you can connect a Bluetooth or Wi-Fi chip to enable wireless communication. This makes it perfect for small self-contained devices, like a fancy clock that uses a bunch of LEDs for its display, but don't expect to do any heavy computation or machine learning!
On the other hand, the Raspberry Pi is a full miniaturized computer. It comes in many formats, ranging from a 700 MHz CPU for the model "A" to a quad-core 1.2 GHz 64-bit processor for the Pi 3. It runs a full operating system, and you can attach various USB peripherals such as a keyboard, mouse, or camera. Because these models all have an external display port, they are often used as media centers. It's full-featured but small, so you can easily integrate it into an IoT project.
These are completely different beasts, and you will not architect your hardware and software the same way for all of them. The impact of your platform choice is even greater than higher-level decisions like which OS to adopt. Choosing the right one for your project is a critical first step.
The Road from Prototype to Production
With all the platforms available and all the YouTube videos out there, it is fairly easy for just about anyone to create a quick prototype in a few hours. But a prototype is a proof of concept – not a proof of product. Getting your prototype to a state where it can be commercialized takes a lot of planning, effort, and expertise that many companies don’t have in-house. You will be faced with a host of decisions about hardware, which in turn will drive decisions about software.
Costs: Counting Pennies
An IoT initiative is a bit like the restaurant business, where cost control through ingredient portioning is critical for success. Chances are the hardware you used in your prototype will change in your final product for cost reasons. You might use a Raspberry Pi for a prototype because they are fast and easy to get started with, but they are relatively expensive and might be more powerful than you need for a smart toaster. Even differences in component costs that seem trivial when you’re building a prototype add up quickly when you get into full production.
Scalability: Getting ready for the big time
How do you build a server that can handle 25,000 devices talking to it? The software architecture is completely different when only one prototype is connected, and that code probably won't scale up to a production environment. How is messaging done between those devices? Will you be using a cloud-based solution? These and many other questions were probably not answered when you built your prototype, but they need to be answered early in your productization planning.
Power consumption: Every milliamp matters
Every electron is precious in small battery-powered devices, and power constraints should inform every decision you make about components. Your power budget may be driven by cost, size or weight restrictions. Given a certain power budget, how do you best allocate it? For example, if you will need to access the network a lot, this may mean trade-offs for the speed of the CPU or the type of display you choose. Conversely, if a power-hungry display is important, you may need to access the network in short, infrequent bursts to save battery power, or offload certain tasks to a server.
Security: You will be attacked
When building a prototype, you most likely didn't bother with security. But once you have a commercial product out there, people will try to hack it. The consequences of security flaws in an IoT product are at least as disastrous as those in typical software: think of the baby monitor scandal, the smart doorbell that would let intruders in, or the Jeeps that could be remotely controlled. Security threats are often underestimated and should not be taken lightly; a high-profile breach can bring your company to its knees overnight.
It’s easy to overlook important commercialization decisions when building an IoT solution. The product you build can deliver a lot of value to your customers, but only if you build it in a way that is flexible, scalable, and secure. Managing the transition from something that works on a lab bench to a viable IoT product can be tricky, but understanding the technical gaps between conventional and IoT capabilities can illuminate the way.
Six Oversights to Consider Before Building an IoT Product
In this white paper, we outline six oversights organizations commonly make when entering the realm of IoT.