We can stop designing screens now!
As we all know, more and more companies are turning to embedded sensors and connected devices as a way to improve their service offerings and bring more value to their customers. For example, we recently wrapped up a project with Liberty Pumps, which used embedded sensors to create a sump pump that alerts homeowners to potential issues, helping to prevent basement flooding. This connected ecosystem of technology, services and value gives designers an opportunity to look beyond the design of a screen interface (UI – User Interface Design) and embrace a more holistic approach to designing full experiences and solutions (UX – User Experience Design).
Back in 2013, Golden Krishna delivered a talk at South by Southwest entitled “The Best Interface is No Interface,” later named one of the “Greatest Hits of SXSW”; he went on to write a book by the same name. Krishna’s talk clearly resonated with many, and his examples illustrated the absurdity of the “…there’s an app for that….” situation. For instance, he walks through the 13-step process of unlocking your car with an app, instead of simply using your key.
The ease with which sensors can be embedded into objects and environments makes me want to re-word Krishna’s mantra “…there’s an app for that….” to “…there’s a sensor for that….”, since sensors are currently being embedded into everything from toothbrushes to socks. Can there be value in embedding sensors into objects and our environment? Absolutely. But all too often there is an automatic assumption that a connected sensor needs an app to go with it. That assumption is a missed opportunity for a more holistic approach to designing the experience, one that looks beyond designing screens for everything.
In our experience, we have found that there are three tools in our user experience toolbox that are particularly useful when designing for connected devices:
1. Contextual Research
2. Mapping and visualizations such as ecosystem maps and journey maps
3. Notification Design
When embedding sensors into objects, it’s critical to understand how, and where, that object will be situated in the physical environment of the person who will interact with it, and the best way to understand that environment is to investigate and experience it first-hand. Depending on the needs of the project, we have found it can be more effective to pair a User Experience Researcher with a User Experience Designer to conduct the contextual research. The pair approach the observation from different perspectives, and we find that they come out of the research with richer insights and findings, and with less difficulty bridging the investigative and generative phases.
User Experience Practitioners tend to be visual thinkers, so we map and visualize as a way to see and understand things, and to help us collaborate and identify pain points and opportunities. When designing for connected devices, the tools we use to visualize systems and journeys, e.g., ecosystem maps and journey maps, become an important part of our design process. In a past post, I wrote about how business models for connected products and services could more effectively be thought of as existing within a dynamic ecosystem. Connecting a device adds a layer of complexity and dynamism to many things, including its ecosystem of services and stakeholders; its technology; and the business model. Mapping and visualization become key to ensuring all stakeholders share the same understanding.
If the connected device needs to provide information to the person interacting with it, the contextual research is critical for understanding that person’s physical environment. With connected devices we have the opportunity to free ourselves of our reliance on phone, tablet or laptop screens and explore whether there are other, more contextual ways to provide that information. We can re-think how cues, such as visual, auditory and haptic, can come into play when we are no longer designing for a device screen and can, instead, integrate our design into the physical environment. But in order to do that, we need to understand what additional contextually relevant options there are to convey information, what else the person is already looking at or monitoring in that environment, and what other surfaces or objects can be used as a canvas to convey information.
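To make this kind of notification design concrete, here is a minimal sketch, in Python, of a policy that prefers ambient, in-environment cues and treats the phone screen as a fallback. All of the names, severity levels and context inputs here are hypothetical illustrations, not details from any project described above:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Severity(Enum):
    INFO = auto()      # e.g., a routine pump cycle completed
    WARNING = auto()   # e.g., the pump is cycling more often than usual
    CRITICAL = auto()  # e.g., pump failure, flooding is likely

class Channel(Enum):
    AMBIENT_LIGHT = auto()  # a glow on the device or a nearby fixture
    AUDIBLE_CHIME = auto()  # a local sound cue in the environment
    PHONE_PUSH = auto()     # fall back to a screen only when needed

@dataclass
class Context:
    occupant_home: bool  # from presence sensing (an assumed input)
    quiet_hours: bool    # e.g., overnight

def choose_channels(severity: Severity, ctx: Context) -> list[Channel]:
    """Pick the least intrusive channels that still fit the severity."""
    if severity is Severity.CRITICAL:
        # Critical issues use every available channel; the screen is
        # one of several cues, not the default.
        channels = [Channel.PHONE_PUSH]
        if ctx.occupant_home:
            channels += [Channel.AMBIENT_LIGHT, Channel.AUDIBLE_CHIME]
        return channels
    if severity is Severity.WARNING:
        if ctx.occupant_home and not ctx.quiet_hours:
            return [Channel.AMBIENT_LIGHT, Channel.AUDIBLE_CHIME]
        return [Channel.PHONE_PUSH]
    # Routine info stays ambient: no interruption, no app badge.
    return [Channel.AMBIENT_LIGHT] if ctx.occupant_home else []
```

The point of a table or sketch like this is that the escalation rules come out of the contextual research: what counts as “quiet hours,” and which surfaces can carry an ambient cue, are things you learn by being in the environment, not by designing a screen.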
So, while the three tools mentioned above – contextual research, mapping and notification design – are not “new” to experience design, we have found that, of all the tools in our toolbox, these three have been particularly useful in handling the added complexity that comes with designing for connected devices and services. They are also key to understanding the broader context of the person interacting with the connected device and services, which can help you think beyond just creating another app and, instead, design something better integrated into their environment.