Touch Screen Project

Or: How A Set of Icons Got Me Into UX Design


UX Design, Research, UI Design, User Testing

The team also included: a developer, a marketing director, a (very patient) team of engineers, and an in-house clinical support team.


Originally, I was brought on to the project in a visual design capacity to draw up a custom icon family. When I asked about screens to use to make mockups of my icons in context, I learned that no one had designed those.

A collection of custom icons that I drew

A snapshot of (some of) the icon family.

What followed was a slippery slope that quickly led to me taking on the full visual design of the product. Static PDFs and animated demos gave way to fully interactive prototypes, and eventually to a full-on design system for the product.


Once my role on the project evolved beyond just "icon person," I did a deep dive into competitor products and researched common touch screen UIs outside of the medical field. Most of the touch screen UIs I was seeing in the medical field felt clunky and outdated, a definite byproduct of how heavily regulated these products are.

I wanted to bring in some mobile patterns from outside the medical field since the vast majority of caregivers were also smartphone users.

An illustration of a smart phone next to a touch screen device

I thought common smartphone patterns could work on our custom interface.

This product was going to have a LOT of functionality baked into it, which meant a lot of homework and learning on my part. I really wanted to avoid the "TV remote" approach where every function had a button on the top level, and instead, nest certain therapies together to tidy up the interface and make it easier to navigate.

An infographic showing the different therapy functions on the device

A small snapshot of some features of this product.

I conducted many interviews with nurses to make sure the therapy functions were grouped correctly, and that all of the necessary aspects of each therapy were accounted for. These interviews also included some card sorting exercises and eventually, prototype demos.

I also wanted to make sure that any caregiver, regardless of experience with touch screen tech, could confidently and easily operate the system. To do this, I built out some user personas based on my clinician interviews that accounted for these different experience levels.

An infographic of some user personas

A top-level view of some user personas. Persona names have been changed to those of the Golden Girls for the sake of reader entertainment.

After conducting all of my interviews (and learning a ton about different types of therapy), I had to map out the various user journeys. The successful states were fairly straightforward, but since this was a highly regulated medical product, we had to account for the many ways things could go wrong, including time-outs, errors, codes, etc.

A flow chart of a user journey

One of the more straightforward user journeys.


Of the many valuable lessons this project taught me, none came up more than the error message pop-ups. We were aiming for IEC 60601 certification, meaning we had to meet strict criteria across many aspects of the product.

There were certain colors and icons that we absolutely needed to use (and therefore, had to be integrated into the design system).

An illustration showing some IEC 60601 safety styles

Some of the 60601 icons and colors in action.

We also had to keep a detailed error log stored on the product itself. To keep costs low, the device had limited onboard memory, and most of it was taken up by that error log.

An infographic showing the breakdown of system memory on the UI

A rough estimation of the device's memory partition.

By the time we factored in the error log, the basic functionality, and the system's graphics, there was essentially no memory left over for any bells and/or whistles. The result was a very streamlined UI. It was also an excellent exercise in designing with the bare minimum, almost like a very high fidelity wireframe.


Even though there wasn't much bandwidth for fancy interactions, we could still leverage basics like swiping and clicking. Dealing with these common interactions at their most bare-bones level really made me think through them critically and choose certain ones very intentionally.

A sketch showing some menu organization ideas
A sketch of some UI interactions
A sketch showing several homescreen layouts

A few sketches thinking through interactions and transitions.

To put these limitations in context, the fanciest bell and/or whistle I was allowed to have was a single 30-frame looping animation that I used for load screens. This may not seem like a lot (it isn't), but you'd be amazed how much impact a simple animated signifier can have in the context of "the system isn't broken, it's just loading."


While the product in question is still in development, this project inadvertently led me into UX Design. Working through such a dense product using only the most essential tools in the UX tool belt, all while navigating a very strict regulatory system, was a great learning experience.

I loved the holistic approach to designing this product. Being able to affect it at every level fostered a real sense of investment in the final result. With this project, I really felt like I was "building" something instead of just "being the designer on a project."*

*While I may have been building parts of this project, the real builders of this project were the dev team who not only actually built this interface, but handled my psychotic functionality demands with grace and patience. They're the real MVPs.