Design a photo app that allows a user to print photos to a pocket-sized printer.

Wireflow of Sprocket Mobile App

Business goal: Redesign the Sprocket application into a world-class app that removes ambiguity and improves the user experience.

Design goal: Develop experience flows, interaction and design patterns, and build relationships with the developers to create reusable and extendable patterns.   

Problem: The original Sprocket design, version one, had a visual designer but no dedicated user experience designer. The app had no consistent design patterns and rated poorly in the App Store. The code was “spaghetti,” according to the developers.

Contributions: I collaborated with the dev team and set up a weekly meeting with the dev leads. I led the design team and defined design patterns, wireframes, UX flows, and testing scenarios, creating low-, mid-, and high-fidelity prototypes using Sketch/InVision/XD, Principle, Origami, and Framer. My prototypes defined interactions, microinteractions, flows, and layouts.

Contributors: Two junior designers handled high-fidelity visual design, and another junior UX designer assisted with Jira management, design patterns, and prototypes.

Overview:

In 2017, HP hired me as Lead UI (interaction and visual) designer and UI storyboard artist for the HP Sprocket printer app team. They wanted the 2.5-star app to become “world-class.” My first task was to audit the app and identify areas that needed improvement. My initial examination revealed a heavy reliance on modals to communicate with the user, features with unpredictable outcomes, and a lack of consistent patterns. Based on this audit, the team (myself, a lead architect, and a visual designer) began planning ways to change the app and create consistency.

Using design heuristics, I focused on revealing hidden functions and creating consistent principles and patterns for any new features or updates that followed. 4.8 stars later, the app now has principles and patterns and has become more user-centered and standards-based.

Building a team:

After the design audit, I needed to build rapport with the visual designer and the iOS and Android dev teams. On top of the weekly design-to-dev review, I started weekly meetings to share the new design direction, get the developers’ input on how the current design was working out, and discuss ways to develop with patterns and reusable, extendable code. Much of the app, the developers explained, had been hacked together with spaghetti code due to short timelines. At the same time, the visual designer (VX) and I began transitioning our designs to the new patterns and exploring styles, colors, and standardized patterns.

While she (VX) did visual exploration, I transitioned our team to an Agile workflow using Jira. (We were the first design group in HP Print to move to a fully Agile process.) This workflow let her and me work in the same system the devs were using. It created a public design backlog where we could all see what was currently being worked on, participate in planning meetings together, and understand what the next sprint would contain. Daily standups, dev-to-design discussions, and design discussions made us a remote but functioning team excited about the new direction and potential.

New Definitions:

In concert with the above, I built interaction flows, defined new patterns for each new or revised feature, and planned a loose road map toward future versions. Our first real challenge as a team came when the PO wanted to use instructional coachmarks for the Camera, an idea I had shared with them during the audit. Upon further discussion, we (the design team, led by me, along with the devs and the research team) determined that this wasn’t the best way to handle the problem—heuristically, there were too many issues. The biggest issue we discovered was that too many separate camera instances were in use, and none connected to the others. Because of this, we began a slow process toward rebuilding and simplifying Camera use throughout the app.

Camera flows

Camera:

I began by discussing camera-reduction possibilities with the devs. As mentioned above, we discovered multiple camera instances—exactly five. These multiple instances were a problem because each had to load individually, took too much time, and couldn’t communicate with the others. The goal, we determined, was to reduce to one instance—if possible. Thus began the hunt to decrease the number of cameras. It took multiple alignment meetings and a lot of dev exploration to determine which cameras could be combined, which could be dropped, and which had to stay permanently.
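
To illustrate the direction this pushes toward, here is a minimal sketch of a single shared capture session that every feature reuses instead of warming up its own camera. It assumes an iOS/AVFoundation implementation; the SharedCameraService name and its shape are hypothetical, not the actual Sprocket code.

```swift
import AVFoundation

// Hypothetical sketch: one shared capture session that every feature reuses,
// instead of each feature spinning up (and slowly warming) its own camera.
final class SharedCameraService {
    static let shared = SharedCameraService()

    let session = AVCaptureSession()
    private var isConfigured = false

    private init() {}

    enum CameraError: Error { case noCamera }

    /// Configure the session once; later callers reuse the already-warm instance.
    func configureIfNeeded() throws {
        guard !isConfigured else { return }
        session.beginConfiguration()
        defer { session.commitConfiguration() }

        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else {
            throw CameraError.noCamera
        }
        session.addInput(input)
        isConfigured = true
    }
}
```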

We had three different AR APIs in play for our AR feature (Reveal), and only two could run on the main camera instance. We realized that these limitations would require the Sprocket app to maintain two instances for the Camera function alone.

Reveal

Having fewer but still multiple camera instances meant that there would be a slight delay each time a user switched cameras. To mitigate this, we needed to decrease or hide the delay. I defined a transition to work this magic: we would capture a view of the last camera viewport and blur it to make the change seem less significant than it actually was. Otherwise, the user would see an odd black screen as one camera paused and the other started.
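
A rough sketch of that transition idea, assuming UIKit: snapshot the outgoing camera’s last frame, blur it, and fade it out once the incoming camera is rendering. The maskCameraSwitch helper and its parameters are illustrative, not the shipped implementation.

```swift
import UIKit

// Hypothetical sketch of the "blur the last frame" hand-off between camera instances.
// `containerView` hosts the camera preview; names are illustrative.
func maskCameraSwitch(in containerView: UIView,
                      startNextCamera: (_ finished: @escaping () -> Void) -> Void) {
    // 1. Freeze the outgoing camera: snapshot its last visible frame.
    guard let snapshot = containerView.snapshotView(afterScreenUpdates: false) else { return }

    // 2. Blur the snapshot so the pause reads as a soft transition, not a black screen.
    let blur = UIVisualEffectView(effect: UIBlurEffect(style: .regular))
    blur.frame = containerView.bounds
    containerView.addSubview(snapshot)
    containerView.addSubview(blur)

    // 3. Start the incoming camera; fade the overlay out once it is rendering.
    startNextCamera {
        UIView.animate(withDuration: 0.3, animations: {
            blur.alpha = 0
            snapshot.alpha = 0
        }, completion: { _ in
            blur.removeFromSuperview()
            snapshot.removeFromSuperview()
        })
    }
}
```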

AR Transition

We removed two instances; that left the primary Camera, the AR Reveal camera, and a custom sticker camera. Once we defined the camera instances, the design team explored and compared other camera apps to discover universal patterns. We discussed the pros and cons of discoverability vs. findability. Discoverability, without coachmarks, won—because of the Camera’s many hidden features. A primary guiding principle that I defined for the Camera was: when a user wants to take a photo, they want to capture that moment, not be required to remember how something works.

Camera hidden features

Through research and with the research team, I determined that obvious, extensible discoverability trumped findability. I chose this because we could easily tweak the interface in minor ways while analyzing more clearly how users behaved. Also, the more noticeable design allowed for consistent patterns. Only after we had done everything we could to make the interface easy would we fall back on animated instructional coachmarks.

The other reason discoverability won was that static coachmarks are inherently broken. Their biggest failure is that they block users from immediately using the features and leave no hinting about how to use them once the coachmarks are gone. The result was a fully redesigned Camera with new behaviors.

Original camera flow

Alternate minimal camera flows

Camera Flows:

The changes we made brought the Camera’s features (we call them modes)—Camera, video, photobooth, Reveal (Sprocket’s AR capabilities), and, later, PhotoID—to the foreground. Bringing them to the front made them discoverable, and, because of the patterns, new features could extend the set if needed.

Camera options map

Each of these camera modes has specific options: timer, front/back camera, flash, and, in the case of PhotoID, capture size and hints. Because phones on the market continue to outgrow the size of many users’ hands, I designed and tested moving the modes and options nearer to the user’s thumbs, making them easier to interact with. The Sprocket Camera function has never recorded high usage; the redesigned Camera, however, did see usage a percent or two higher than the previous version.
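
To show how the mode/options pattern stays extendable, here is a small hypothetical Swift model in which each mode declares the options it exposes, so a new mode such as PhotoID slots in without new layout logic. The type names are illustrative only.

```swift
// Hypothetical sketch of the mode/options pattern: each mode declares the options
// it exposes, so a new mode extends the options bar without new layout code.
enum CameraOption { case timer, flipCamera, flash, captureSize, hints }

enum CameraMode: CaseIterable {
    case photo, video, photobooth, reveal, photoID

    /// Options shown in the options bar for this mode.
    var options: [CameraOption] {
        switch self {
        case .photo, .video, .photobooth, .reveal:
            return [.timer, .flipCamera, .flash]
        case .photoID:
            return [.timer, .flipCamera, .flash, .captureSize, .hints]
        }
    }
}
```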

Camera redesign

Gallery, Preview, etc:

Post Camera redesign, we began planning to update the remainder of the Sprocket app.

Landing screen:

Once a user had onboarded and set up their Sprocket, the original app opened to a selection screen. That screen revealed everything and let users choose any source they wanted. However, through analytics, we discovered that 97-99% of our users printed from their local phone gallery and only secondarily used the Camera, share extension, or social sources. With a research-backed decision in hand, we refocused on our primary users and defaulted the landing screen to the local gallery.

Sprocket landing screen

A benefit of this change was that users could quickly select one or more photos, change photo sources, or switch to the Camera directly from the gallery. Even though the choice screen allowed users to choose their path quickly, most of our users chose the same one—their local phone gallery. Added to this usage data was the choice screen’s lack of expandability—at some point, it would run out of real estate.

Global navigation bars: home bar (bottom), mode bar (middle), and options bar (top)

Gallery:

In the examples below (left to right), one and two were designed before my arrival—both had a camera instance. Example one had a discernible camera icon, but the Camera disappeared as soon as the user scrolled their photo roll. Example two was simply an icon swap—the business wanted to focus more on its AR capabilities—plus hard-to-read paging dots. (Note: the live feed is missing from example two because we removed that camera instance during the camera redesign; it slowed down the app.) Both examples failed from the same defect in the secondary galleries: users couldn’t find the social media galleries. Example three focused on implementing the reusable global home bar (bottom-most portion), mode bar (which allowed users to swipe to other features), and options bar (options within each mode). Example four was simply a beautifying iteration—we refined the overall example-three experience, lightened up the interface, increased the font size, and added new Gallery features.

Gallery iterations

Print queue:

The print queue evolved from a modal in example one (the two left-side images below) to a fullscreen in example three—it could have just been another modal (second from right). The idea was that version 2 would allow the company to create a Print Queue landing page. The goal was to iterate on it later; in the meantime, the only recourse for the user was to delete all unseen print jobs or do nothing. This implementation wasn’t a good user experience. Example four (far right) was rough, but it allowed the user to delete an item or move it to the top of the queue—while it was a step in the right direction, its IA was disorganized and clunky.

Print queue iterations prior to my redesign

Example four was one of the first screens I worked on—just a few weeks into the job, before I fully knew what the project would require. When we focused more on this function (after the Camera and Gallery), I iterated the queue further, shortened the row height, lightened the interface, and introduced the home bar and options bar. But we did more than minor visual tweaks: we added the ability to switch queues to other connected Sprocket printers, defined the moment a print job transfers to the hardware, and added the ability to manage all Sprocket printers right from the queue.

The challenge I faced was that users needed to understand where their print job was in the printing/transfer process. We also had to define a subtle distinction between two different queues, because the new Sprocket printers had both a hardware queue and a software queue. The software queue holds the print jobs until all of a file’s data has transferred, over Bluetooth, to the hardware queue—which holds up to 10 print jobs.
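
A minimal sketch of that two-queue model, written in Swift for illustration: the app-side (software) queue holds jobs until their data finishes transferring over Bluetooth, and the printer-side (hardware) queue caps at 10 jobs. The types and the promoteNextJobIfPossible helper are hypothetical, not the actual app code.

```swift
import Foundation

// Hypothetical sketch of the software vs. hardware print queues described above.
struct PrintJob {
    enum State { case queuedInApp, transferring(progress: Double), onPrinter, printed }
    let id: UUID
    var state: State
}

struct PrinterQueues {
    static let hardwareCapacity = 10

    var softwareQueue: [PrintJob] = []   // held on the phone until transfer completes
    var hardwareQueue: [PrintJob] = []   // held on the Sprocket printer itself

    /// Move the next job onto the printer's queue if it has room.
    mutating func promoteNextJobIfPossible() {
        guard hardwareQueue.count < Self.hardwareCapacity,
              var job = softwareQueue.first else { return }
        softwareQueue.removeFirst()
        job.state = .onPrinter
        hardwareQueue.append(job)
    }
}
```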

Further into the redesign, I introduced an accordion pattern—one we would reuse in other places—for users to see all of their Sprocket printers and switch between them as needed. As mentioned earlier, we introduced a way for the user to change to a secondary printer—if they had more than one—to see its queue (see the first image above). Previous versions had a history row at the bottom of the print queue—it took up a large section and would appear or disappear depending on whether there were items.

In the redesign (see the clock icon in examples one and four above), we used the options bar pattern to give the user a consistent place for history items. This history icon never disappeared—but it needed further iteration because it had some findability issues.

Preview:

The original Preview never underwent meaningful iteration. Preview version 1 had an image preview, edit, share/save, and print buttons, plus a drawer that users couldn’t find—it had a line or dots, depending on the version, and the affordance was too thin to be recognizable. There were minor adjustments but no significant differences.

Original preview

When we updated Preview in version 2, we introduced the Navigation Bar (below), but it had a hidden “Add Printer” affordance. Most of our users had only one device and would only ever have one. We decided to design the Add Printer affordance as a press-and-hold until we could define a behavior we could use app-wide. Also, in v2, a modal “Add Printer” flow would appear the first time a user tapped the print button—when no printer was paired.

However, after the modal was dismissed and a printer was paired, the interface required a press-and-hold on the printer button. This design decision was clunky, but the risk was low because over 90% of users didn’t have more than one device. In version 3, we moved the hidden “Add Printer” menu from the press-and-hold print button to a dropdown menu in the header bar. The new menu was easier to discover, and it also acted as a switching mechanism for changing printers. Another addition in v3 was a lighter background and header, continuing the intent to lighten up the interface.

Redesigned preview

Onboarding:

The first thing I noticed when I audited the onboarding was how its design and styling didn’t fit our target audience of 14- to 24-year-old females. Secondly, the dark opening screen switched to stark white in the instructions, an inconsistency that made the app feel unplanned. As I used the app more, I discovered that information-heavy pages were typically (but not always) stark white. But it was so inconsistent that a pattern never fully emerged: while most information pages switched to white, other pages like Settings (also information-heavy) were still dark grey. This swapping between colors created a sense of incongruity within the app.

As a new user, I didn’t know when screens would be grey or white and was surprised when they did or didn’t behave the way I expected. Thirdly, I noticed that the screens weren’t functioning as a group but as individual screens. Each setup screen moved as a whole when the user swiped; even though parts of the screens were shared, the entire screen had to be redrawn. This behavior created an excellent opportunity for revising onboarding. We wanted the onboarding screens to feel like the paging dots were stable, with only the picture and instructions moving. After creating stability in the onboarding, we focused on refining the setup screens and transitioning to a new style.
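
One way to picture the “stable dots, moving content” behavior, assuming UIKit: keep the page control outside the paged scroll view so swipes only move the illustration and instructions. This is a hypothetical sketch, not the Sprocket implementation.

```swift
import UIKit

// Hypothetical sketch: the page control (dots) is a sibling of the paged scroll view,
// so only the per-step content moves while the dots stay fixed on screen.
final class OnboardingViewController: UIViewController, UIScrollViewDelegate {
    private let pages: [UIView] = []           // illustration + instructions per step
    private let scrollView = UIScrollView()    // paged content that moves
    private let pageControl = UIPageControl()  // stable dots, never redrawn per page

    override func viewDidLoad() {
        super.viewDidLoad()
        scrollView.isPagingEnabled = true
        scrollView.delegate = self
        view.addSubview(scrollView)
        view.addSubview(pageControl)   // outside the scroll view, so it does not scroll
        pageControl.numberOfPages = pages.count
    }

    func scrollViewDidScroll(_ scrollView: UIScrollView) {
        // Only the current-page index updates; the dots themselves never move.
        let page = Int(round(scrollView.contentOffset.x / max(scrollView.bounds.width, 1)))
        pageControl.currentPage = page
    }
}
```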

Focusing on our target audience, we determined, through testing and exploration, to use a softer color palette and less techy, rendered-looking images. The original pictures were an iconic touchpoint for our users, but they set an unexpected tone. With the new style, we intended to craft a more appealing introduction to Sprocket. My focus was to tie the onboarding and its visuals into the updated iconography used throughout the app. This change in onboarding style would create continuity throughout the app—beginning to end. Along with these visual changes (see below), we determined which touchpoints needed updating to lighten the entire app experience.

I mapped these changes and created flows to identify all areas needing improvement. Two examples: the first time a user tapped the print button, a blocking modal popped up to tell them about HP’s AR experience. After a few uses, another blocking modal popped up on print, informing the user that they could use the Share Extension to print from other apps. In both cases, the user’s goal was to print, and in both cases, the modal blocked them from completing the task until they closed it. These two examples are why I needed to reorder the information and flows. We needed to show that info, but we had to determine the best time to present learning material to our users.

Wireflow of Sprocket Mobile App

Onboarding flow

(Note: I would love to take credit for the illustrations, but the talented Melissa Smith handled the visual design.) The images below show how we iterated the visual design layer after the interaction/UX layer was defined (see wireflow).

What I have presented above are just a few of the many changes we made; I could go on about many more tweaks and features. If you check out the app in its current form, keep in mind that neither I nor my team work on the Sprocket app design anymore.

Sprocket App Design