Design an app that prints from a phone to a pocket-sized mobile printer.
Business goal
Redesign the Sprocket application into a world-class app that removes ambiguity and increases satisfaction and app ratings.
Design goal
Develop experience flows and interaction and design patterns, build cross-functional relationships, and define reusable dev components.
Problem
The Sprocket app had a visual designer, not a UX designer. The app lacked consistent design patterns, and the code was “spaghetti.”
Contributions
- Weekly meetings with the dev teams and bi-monthly meetings with the dev leads.
- Design lead: defined design patterns, wireframes, UX flows, and testing scenarios.
- Created low-, mid-, and high-fidelity prototypes using Sketch/InVision/XD, Principle, Origami, and Framer.
- Prototypes defined interactions, microinteractions, flows, and layouts.
Contributors
- UX architect
- Hardware UX lead
- Two junior visual designers (not concurrent)
- Junior UX designer (design patterns, prototypes, and Jira management)
- iOS dev teams
- Android dev teams
Overview
In 2017, HP hired me as Lead UI (interaction and visual) designer and UI storyboard artist for the HP Sprocket printer app team. They wanted to see the 2.5-star app become “World-Class.” My first task was to audit the app and identify the areas that needed improvement. My initial examination revealed a heavy reliance on modals to communicate with the user, features with unpredictable outcomes, and a lack of consistent patterns. Based on this audit, the team (myself, a lead architect, and a visual designer) began planning ways to change the app and create consistency.
Summary:
Using heuristic evaluation, I focused on revealing hidden functions and establishing consistent principles and patterns for every new feature or update that followed. 4.8 stars later, the app now has principles and patterns and has become more user-centered and standards-based.
Building a team
After the design audit, I needed to build rapport with the visual designer and the iOS and Android dev teams. On top of the weekly design-to-dev review, I started holding weekly meetings to share the new design direction, gather their input on how the current design was working out, and discuss ways to develop with patterns and reusable/extendable code. The developers explained that a lot of the app had been hacked together with spaghetti code due to short timelines. At the same time, the visual designer (VX) and I began to transition our designs to the new patterns and started exploring styles, colors, and standardized patterns.
While she (VX) did visual exploration, I transitioned our team to an Agile workflow using Jira. (We were the first HP design team to transition to a fully Agile team.) This flow allowed her and me to work in the same system the devs were using. I created a public design backlog where we could all be aware of what was currently being worked on, work in planning meetings together, and understand what the next sprint would contain. Daily standups, dev-to-design discussions, and design discussions made us a remote but functioning team excited about the new direction and potential.
New Definitions:
In concert with the above, I built interaction flows, defined new patterns for each new or revised feature, and planned a loose road map toward future versions. Our first real challenge as a team came when the PO wanted to use instructional coachmarks for the Camera, an issue I had shared with them during the audit. Upon further discussion, we (the design team, led by me, along with the devs and the research team) determined that this wasn’t the best way to handle the problem—heuristically, there were too many issues. The biggest issue we discovered was that too many separate camera instances were in use, and none of them connected to the others. Because of this, we began a slow process toward rebuilding and simplifying Camera use throughout the app.
Camera
I began by discussing the camera reduction possibilities with the devs. As mentioned above, we discovered multiple camera instances—exactly five. These multiple instances were a problem because each one had to load individually, took too much time, and couldn’t communicate with the others. The goal, we determined, was to reduce to one instance—if possible. Thus began the hunt to decrease the number of cameras, through multiple alignment meetings and a lot of dev exploration to conclude which cameras could be combined, which could be tossed, and which had to stay permanently.
We had three different AR APIs powering our AR feature (Reveal), and only two could function using the main camera instance. We realized that these limitations would require the Sprocket app to maintain at least two instances for the Camera function alone.
Having fewer but still multiple camera instances meant that there would be a slight delay each time a user switched cameras. To mitigate this, we needed to decrease or hide the delay. I defined a transition to work this magic: we would capture a view of the last camera viewport and blur it to make the change seem less significant than it actually was. Otherwise, the user would see an odd black screen as one camera paused and the other started.
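To illustrate the idea, here is a minimal UIKit sketch of a blurred-snapshot handoff between camera instances. It is an assumed approach, not the shipped Sprocket code, and the CameraContainerViewController name and transition(to:) method are hypothetical.

```swift
import UIKit

/// Hypothetical container view controller that hosts the active camera preview.
final class CameraContainerViewController: UIViewController {

    /// Swaps camera instances behind a blurred snapshot so the user never
    /// sees the black frame while one camera tears down and the next starts.
    func transition(to startNextCamera: @escaping (_ completion: @escaping () -> Void) -> Void) {
        // 1. Capture a snapshot of the last camera viewport.
        guard let snapshot = view.snapshotView(afterScreenUpdates: false) else {
            startNextCamera { }
            return
        }

        // 2. Cover the viewport with the snapshot plus a blur overlay.
        let blur = UIVisualEffectView(effect: UIBlurEffect(style: .regular))
        blur.frame = view.bounds
        view.addSubview(snapshot)
        view.addSubview(blur)

        // 3. Start the next camera instance; fade the blur away once it is live.
        startNextCamera {
            UIView.animate(withDuration: 0.25, animations: {
                blur.alpha = 0
                snapshot.alpha = 0
            }, completion: { _ in
                blur.removeFromSuperview()
                snapshot.removeFromSuperview()
            })
        }
    }
}
```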
We removed two instances; that left the primary Camera, the AR Reveal camera, and a custom sticker camera. Once we defined the camera instances, the design team explored competitive camera apps and surveyed them to discover universal patterns. We discussed the pros and cons of discoverability vs. findability. Discoverability, without coachmarks, won—because of the Camera’s many hidden features. A primary guiding principle that I defined for the Camera was: when a user wants to take a photo, they want to capture that moment, not be required to remember how something works.
In conjunction with the research team, I determined that obvious and extensible discoverability trumped findability. I made this choice because, with a straightforward design, we could analyze user behavior and easily tweak the interface in minor ways to improve it. The more noticeable design would also allow us to keep consistent patterns. Only after we had done everything we could to make it easy would we consider falling back to animated instructional coachmarks.
The other reason that discoverability won was that static coachmarks are inherently broken. Their biggest failure is that they block users from immediately using the features and leave no hint as to how to use them once the coachmarks are gone. The result was a fully redesigned Camera and behaviors.
Alternate minimal camera flows
Camera Flows:
The changes we made brought the Camera’s features (we call them modes) to the front: camera, video, photobooth, Reveal (Sprocket’s AR capabilities), and, later, PhotoID. Surfacing them made them discoverable, and, because of the patterns, new features could extend the functionality if needed.
Each of these camera modes has specific options: timer, front/back camera, flash, and, in the case of PhotoID, capture size and hints. Because phones on the market continue to outgrow the size of many users’ hands, I designed and tested moving the modes and options nearer to the user’s thumbs, making them easier to interact with. The Sprocket Camera function has never recorded high usage; the redesigned Camera, however, did lead to a higher percentage of use than the previous version, by a percent or two.
Gallery, Print Queue, & Preview
Global navigation bars: home bar (bottom), mode bar (middle), and options bar (top)
A benefit to this change was that users could quickly select a photo or multiple photos, change photo sources, or switch to the Camera from the gallery. Research showed us that even though the choice screen allowed users to choose their path, most of our users chose the same one—their local phone gallery. Added to this usage data was the lack of expandability of the original choice screen—at some point, it would run out of real estate.
Gallery:
In the examples to the right (left to right), examples 1 and 2 (top) were designed before my arrival—both had a camera instance. Example 1 (top left) had a discernible camera icon, but the Camera disappeared as soon as the user scrolled their photo roll. Example 2 (top right) was simply an icon swap—the business wanted to focus more on its AR capabilities—plus hard-to-read paging dots. (Note: the live feed is missing from example 2 because we killed that camera instance during the camera redesign; it slowed down the app.) Both of those gallery examples failed from the same defect for the secondary galleries: users couldn’t find the social media galleries. Example 3 (bottom left) focused on implementing the reusable global home bar (bottom-most portion), mode bar (allowed users to swipe to other features), and options bar (options within each mode). Example 4 (bottom right) was a beautifying iteration—we refined the overall example 3 experience, lightened up the interface, increased the font size, and added new Gallery features.
Print queue iterations prior to my redesign
Print queue:
The print queue evolved from a modal in example 1 (top two images on the left) to fullscreen in example 2 (bottom left)—which could have just stayed a modal. Both of these designs were implemented before I took over.
The original idea was that example 2 would allow the company to create a Print Queue page and iterate on it; however, the only recourse for the user was to delete all unseen print jobs or do nothing. It was obvious that this implementation wasn’t a good user experience and needed to be fixed quickly. Example 3 (bottom right: my iteration) was rough, but it allowed the user to delete an item and move an item to the top of the queue—while it was a step in the right direction, its information was unorganized and clunky.
Example 3 was one of the first screens I worked on—just a few weeks into the job, before I even fully knew what the project would require. When we focused more on this function (after Camera and Gallery), I iterated the queue further, shortened the row height, lightened the interface, and introduced the home bar and options bar. But we did more than minor visual tweaks: we added the ability to switch printer queues to other connected Sprockets, defined the indication telling users when a print job transferred from the app to the hardware, and added the ability to manage all Sprocket printers right from the queue.
The challenge I faced on this screen was ensuring that users understood where their print job was in the print/transfer process. We also had to communicate a subtle distinction between two kinds of queues—the new Sprocket printers had a hardware queue and a software queue. The software queue held print jobs until all of a file’s data had transferred, over Bluetooth, to the hardware queue, which held up to 10 print jobs.
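A rough Swift sketch of that two-queue relationship may make it clearer; the types and the promoteTransferredJobs() helper are hypothetical illustrations, not Sprocket’s actual data model.

```swift
import Foundation

/// Hypothetical model of the two queues described above.
struct PrintJob: Identifiable {
    let id: UUID
    var bytesTransferred: Int
    let totalBytes: Int
    var isFullyTransferred: Bool { bytesTransferred >= totalBytes }
}

struct PrinterQueues {
    /// Jobs held by the app until their data finishes transferring over Bluetooth.
    var softwareQueue: [PrintJob] = []
    /// Jobs already handed off to the printer; the hardware holds at most 10.
    private(set) var hardwareQueue: [PrintJob] = []
    static let hardwareCapacity = 10

    /// Moves any fully transferred jobs into the hardware queue while space remains.
    mutating func promoteTransferredJobs() {
        while hardwareQueue.count < Self.hardwareCapacity,
              let index = softwareQueue.firstIndex(where: { $0.isFullyTransferred }) {
            hardwareQueue.append(softwareQueue.remove(at: index))
        }
    }
}
```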
Further into the redesign, I introduced an accordion pattern—one we would reuse in other places—that let users see all of their Sprocket printers and switch between them as needed. As I mentioned earlier, we introduced a way for the user to change to a secondary printer—if they had more than one—and see its queue (see the first image above). In the previous versions, there was a history row at the bottom of the print queue; it took up a large section and would appear or disappear depending on whether there were items.
In the redesign (see the clock icon in examples 1 and 4 above), we used the options bar pattern to give the user a consistent place for history items. This history icon never disappeared, but it needed further iteration because it had some findability issues.
Preview:
The original version of Preview never underwent significant iteration. Preview version 1 had an image preview, edit, share/save, a print button, and a drawer that users couldn’t find—its affordance was a line or dots, depending on the version, and too thin to be recognizable. There were minor adjustments but no significant differences.
When we updated Preview in version 2, we introduced the Navigation Bar (below), but it had a hidden “Add Printer” affordance. 95% of our users had one device and would never have more. We decided to design the Add Printer affordance as a press-and-hold until we could define a behavior we could use app-wide. We did this because, in v2, a modal “Add Printer” flow would appear the first time a user tapped the print button—when a printer wasn’t yet paired.
However, after the modal was dismissed and a printer was paired, the interface required a press-and-hold on the printer button. This design decision was clunky, but the risk was low because over 95% of the users wouldn’t have more than one device. In version 3, we moved the hidden “Add Printer” menu from the press-and-hold print button to a dropdown menu in the header bar. The new menu was easier to discover, and it acted as a switching mechanism for changing printers. Another addition in v3 was a lighter background and header, continuing with the intent to lighten up the interface.
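For illustration, here is a hedged SwiftUI sketch of the v3 header menu pattern: a single, discoverable place to switch printers or add one. The PrinterHeaderMenu name and its properties are hypothetical, and the production app (which predates SwiftUI) was built differently.

```swift
import SwiftUI

/// Hypothetical header menu: switch between paired Sprockets or add a new one.
struct PrinterHeaderMenu: View {
    let printers: [String]              // e.g. ["Sprocket 200", "Sprocket Studio"]
    @Binding var selectedPrinter: String?
    var addPrinter: () -> Void          // launches the pairing flow

    var body: some View {
        Menu {
            ForEach(printers, id: \.self) { printer in
                Button(printer) { selectedPrinter = printer }
            }
            Divider()
            Button("Add Printer", action: addPrinter)
        } label: {
            // The dropdown replaces the old hidden press-and-hold affordance.
            Label(selectedPrinter ?? "No Printer", systemImage: "chevron.down")
        }
    }
}
```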
Onboarding
The first thing I noticed when I audited the onboarding was that its design and styling didn’t fit our target audience of 14- to 24-year-old females. Secondly, the dark opening screen switched to stark white in the instructions. This inconsistency created uncertainty about the app. As I used the app more, I discovered that information-heavy pages were typically (but not always) stark white; it was so inconsistent, though, that no pattern ever emerged. While most information pages switched to white, other information-heavy pages like Settings were still dark grey. This swapping between colors created a sense of incongruity within the app.
Neither I, as a new user, nor our test subjects knew when a screen would be grey or white, and we were surprised when screens did or didn’t behave the way we expected. Thirdly, I noticed that the screens weren’t functioning as a group but as individual screens. Each setup screen moved as a whole when the user swiped; even though parts of the screens were shared, the entire screen had to be redrawn.
These behaviors created excellent opportunities to revise onboarding. We wanted the onboarding screens’ paging dots to stay stable, with only the picture and instructions moving; we wanted the colors to feel consistent; and we wanted the style to match our target audience. After creating stability in the onboarding, we focused on refining the setup screens and transitioning to a new style.
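A minimal SwiftUI sketch of that “stable dots, moving content” structure is below. It is an assumed modern reconstruction (the OnboardingPager type is hypothetical), showing the page indicator living outside the paging container so only the image and instructions move.

```swift
import SwiftUI

/// Hypothetical onboarding container: the paging dots stay fixed while
/// only the illustration and instruction text page with the swipe.
struct OnboardingPager: View {
    struct Page { let imageName: String; let instructions: String }
    let pages: [Page]
    @State private var currentPage = 0

    var body: some View {
        VStack {
            TabView(selection: $currentPage) {
                ForEach(pages.indices, id: \.self) { index in
                    VStack(spacing: 16) {
                        Image(pages[index].imageName)
                        Text(pages[index].instructions)
                            .multilineTextAlignment(.center)
                    }
                    .tag(index)
                }
            }
            .tabViewStyle(PageTabViewStyle(indexDisplayMode: .never)) // hide built-in dots

            // Fixed page indicator: it never redraws with the pages above.
            HStack(spacing: 8) {
                ForEach(pages.indices, id: \.self) { index in
                    Circle()
                        .fill(index == currentPage ? Color.primary : Color.secondary.opacity(0.4))
                        .frame(width: 8, height: 8)
                }
            }
            .padding(.bottom)
        }
    }
}
```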
Focusing on our target audience, we determined, through exploration and testing, to use a softer color palette and less techy, rendered images. The original pictures were a heavily rendered touchpoint for our users and set an unexpected tone. With the new style, we intended to craft a more appealing introduction to Sprocket. My focus was to tie the onboarding visuals into the updated iconography used throughout the app. This change in onboarding style would create continuity throughout the app—beginning to end. Along with these visual changes (see below), we determined which touchpoints needed to be updated to lighten the entire app experience.
I mapped these changes and created flows to identify all areas needing improvement. Two examples needed to be fixed immediately: the first time a user tapped the print button, a blocking modal popped up to tell them about HP’s AR experience. After a few uses, another blocking modal popped up on print, informing the user that they could use the Share Extension to print from other apps. In both cases, the user’s goal was to print. In both cases, the modal blocked them from completing the task—until they closed the modal.
These two examples are why I needed to reorder the information and flow. We needed to show that info, but we had to determine the best time to present learning material to our users.
(Note: I would love to take credit for the illustrations, but the talented Melissa Smith handled the visual design.) The images above show how we iterated the visual design layer after the interaction/UX layer was defined (see wireflow).
I have presented above just a few of the many changes we made. I could go on about the many more tweaks and features. (Note: If you check it out in its current form, neither I nor my team work on the Sprocket app design anymore—new modals no longer follow the patterns we defined.)