Many years ago, early in my career as a UX designer, I was at an agency finishing up wireframes for a client. The account executive came to my desk with questions and a suggestion. He stated he had doubts about how I had designed a particular feature.
“Why did you design it that way?” he asked. So I provided an answer citing current design trends and how this approach eliminated clicks. He was unconvinced, and argued for a different design approach. I referenced ease of use and insights we gained from user interviews pointing to my approach being better. The account executive called into question whether the research was sound — “why that way?” he asked again — and again pushed for his approach citing how he thought users would behave. Round and round we went, me making the case for my design based on my experience and expertise, him questioning it and pushing for his approach. He was not moved by anything I said. Finally, quite exasperated, I responded to his latest “but why that way?” question with: “Because I’m the UX designer and you’re not!”
I’m rather embarrassed by that response. I’ve thought about it many times over the years and I cringe every time. It was absolutely the wrong thing to say. I did go up to the account executive afterwards to apologize for my behavior, and we cleared the air. Still… ugh.
Many designers struggle to justify their work to others. The above example (if not my poor response) likely sounds familiar to many designers. It should be self-evident that design matters and is valuable, yet the struggle is real. Many organizations wrestle with this challenge: how do you quantify the impact of design?
Now the easy answer here is: focus on metrics. Yes, do that. Design teams need to frame their work through the lens of business metrics. Define success criteria before the work kicks off and measure the change in those metrics once the work is completed and launched.
But for this post, I wanted to share examples of how design can demonstrate impact and value beyond just improved metrics. I’ll share three real-world experiences where my design teams have delivered results and changed perceptions of our efforts for the better.
Stop calling… please
When I first arrived at Liberty Mutual, I was tasked with building out a new design team for our part of the organization, where one had not previously existed. Most of the PMs and engineers were not used to working with design, and as such we didn’t have their trust. There was a lot we wanted to push for, but we were not making progress. What we could change in features and page construction was very limited. Often we were asked to use only existing (old, poorly designed) components.
At that time there was a big push to reduce calls. Calls cost money, and Liberty wanted to move more people to a digital self-service model (which didn’t quite exist yet; more on this later).
Data analytics identified the page that generated the most calls into Liberty Mutual: the Contact Us page. I was asked to have my team redesign the page so it generated fewer calls.
The problem was that the page was already laid out in a confusing manner that made it hard to find a phone number. In fact, users encountered prompts to mail a letter or fax a request before they came to a phone number. There was little I could do to make the page steer users away from calling (short of transposing digits in the provided phone number).
My team started by reviewing the available analytics for the page. One data point that stood out was the high number of call transfers it generated. There are two main numbers for Liberty Mutual — one for customer service and one for sales. Liberty had calculated the cost for a representative to transfer a caller from one call center to another at $3.50 per switch. So we ran an A/B test: the test page displayed clear headers indicating which phone number was for customer service and which was for sales. The control version was the existing page without headers.
The results were dramatic. In the control version calls were evenly split between the two numbers (571 calls for sales vs. 452 for service). For the page with headers, almost all calls were for customer service — 852 for service vs. 29 calls for sales. Users were now getting to where they wanted to go. We virtually eliminated call switching, which saved Liberty hundreds of thousands of dollars annually.
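As a concrete aside, the call counts above are lopsided enough to sanity-check by hand. Here is a minimal sketch of a Pearson chi-square test of independence on the reported numbers (illustrative only; this is not the analytics tooling Liberty actually used):

```python
# Back-of-envelope check that the A/B result wasn't noise: a Pearson
# chi-square test of independence on the reported call counts.
# (Illustrative sketch, not Liberty's actual analysis tooling.)

def chi_square(table):
    """Pearson chi-square statistic for a contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: control page, headered page; columns: sales calls, service calls.
counts = [[571, 452],   # control: roughly even split
          [29, 852]]    # with headers: almost all customer service

stat = chi_square(counts)
# Critical value for p = 0.001 at 1 degree of freedom is 10.83.
print(f"chi-square = {stat:.1f}  (significant: {stat > 10.83})")
```

The statistic comes out in the hundreds, far beyond the p = 0.001 threshold, so the shift in call routing was not a fluke of sampling.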
This was a simple test that didn’t require much design work. But it showcased design’s ability to evaluate a problem, hypothesize a solution, and deliver results. This small success put our nascent design team in a positive light. We earned just enough trust to explore more involved changes. We iterated on content and design patterns that made the Contact Us page more aesthetically appealing while better helping users accomplish what they needed to. We leaned into self-service, getting users answers and increasing login rates. We even launched a successful chatbot.
All of this became possible because we could succinctly demonstrate the ability of design to solve real business problems and deliver meaningful results.
Let design lead the way
One summer the engineering team at Drizly was bogged down in extensive, and necessary, infrastructure work. It was all-consuming and demanded everyone’s attention. Except for design.
With a lighter workload, I had the design team take advantage of the lull to focus on users. Specifically, we undertook the effort of mapping out our consumers’ primary user flows. What were the main tasks they wanted to accomplish, and what was the experience actually like for them? We documented everything and layered in available user research findings. We zeroed in on every point of friction for users, calling each one out. We also identified opportunities to make different moments more joyful (one of our design principles). We had everything laid out for onboarding, browsing, gift-giving, checkout, and several other flows.
This wasn’t busywork or just a way to keep practicing our craft. I had an insight: Q4 is the busiest time of the year for Drizly, and historically we had been slow to get work done ahead of Q4 that we knew would improve the Drizly experience. We would end up rushing out scattershot quick fixes. I also knew the rest of leadership and the PMs hadn’t had time to think beyond the infrastructure work. We would need a plan. To borrow a hockey metaphor, I was skating to where the puck would be.
I pitched the C-suite on what I was doing and got buy-in. I presented our work at a product team all-hands. Sure enough, there had been little planning done on what to launch in Q4. The design team’s work solved that. I met with senior PMs to break down the findings and plan out the work. We decided our big focus should be on gifting enhancements. We had queued up 55 tweaks, fixes, and general improvements to the gift-giving flow, and they became the consumer team’s priority. We launched them in time. What we saw was a dramatic reduction in customer-service contacts (the need for Drizly CX to contact recipients) and a flattening of the order void rate. Historically, void rates for gift orders were much higher than for non-gift orders. That holiday season, the rate was the same as for all other orders.
Some unknowable answers are knowable
One fall while at Drizly, I was part of a lengthy discussion with senior leadership about developing in-store pick-up as a new feature for order fulfillment. Drizly is an alcohol delivery service, but the theory was that users may like to order ahead and pick up at the store if it’s nearby or on the way home.
This was a discussion we actually had multiple times. We would discuss it, try to pull some numbers on the opportunity, and feel the data was inconclusive. Then we would have the same discussion again a month or two later.
This felt like an unanswerable question. It seemed as though the choice was to build it and find out, or not build it, and either way the results would only be known after launch. There seemed to be no way to know ahead of time if users would want this feature.
I will state for the record that at the time, I thought it might be better to build the feature than to not because at the very least it would be something some people used so… bonus?
There was one thing that changed that fall: I had onboarded our first UX researcher. More interesting to me than whether we should build this feature (and being right or wrong about it), this felt like an excellent opportunity to prove the value of UX research. After conferring with our new researcher, we developed a study that could help answer the question, and I positioned UX to the C-suite as the team that would solve it.
We conducted what’s called a Kano analysis, where users are surveyed about a set of features to evaluate their interest in each one. It surfaces signal about whether users consider a feature exciting, table stakes, or simply not that compelling. We asked users about a range of features related to fulfillment — the last mile of the experience.
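To make the method concrete, a Kano study pairs a "functional" question ("How would you feel if the product had this?") with a "dysfunctional" one ("...did not have this?") and maps each answer pair to a category via a standard evaluation table. Here is a minimal sketch in Python (illustrative only; the survey tooling and response data we actually used aren't shown here, and the sample responses are hypothetical):

```python
# Minimal sketch of Kano response classification. Each respondent answers
# two questions per feature on a 5-point scale.
from collections import Counter

SCALE = ["like", "expect", "neutral", "live_with", "dislike"]

# Standard Kano evaluation table: rows = functional answer,
# columns = dysfunctional answer (in SCALE order).
# A=Attractive, O=One-dimensional, M=Must-be, I=Indifferent,
# R=Reverse, Q=Questionable.
TABLE = {
    "like":      ["Q", "A", "A", "A", "O"],
    "expect":    ["R", "I", "I", "I", "M"],
    "neutral":   ["R", "I", "I", "I", "M"],
    "live_with": ["R", "I", "I", "I", "M"],
    "dislike":   ["R", "R", "R", "R", "Q"],
}

def classify(functional, dysfunctional):
    """Category for one respondent's answer pair."""
    return TABLE[functional][SCALE.index(dysfunctional)]

def kano_category(responses):
    """Majority category across all respondents for one feature."""
    votes = Counter(classify(f, d) for f, d in responses)
    return votes.most_common(1)[0][0]

# Hypothetical responses for "in-store pick-up": mostly shrugs.
pickup = [("neutral", "neutral"), ("like", "neutral"),
          ("neutral", "live_with"), ("expect", "neutral")]
print(kano_category(pickup))  # prints "I": indifferent, a nice-to-have
```

A feature that lands mostly in "A" excites users; mostly "M" means it's table stakes; mostly "I", as in this hypothetical, means users can take it or leave it.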
The findings of the study showed that users weren’t excited about in-store pick-up. People might use it, maybe as a nice-to-have, but it wasn’t a driver. What the study also showed was that users’ biggest interest was transparency. Once an order was placed, there was no information available until a driver showed up with the delivery, and users found this frustrating. They wanted more information about their order after placing it. This led us to prioritize live order tracking.
Live order tracking for Drizly is a bit more complicated than the same feature on Uber Eats or DoorDash. With those services, the driver is part of the delivery platform. With Drizly, the driver is employed by the store, so we have less real-time data to provide, which makes the feature more complex to build. But the Kano study and some additional user interviews gave us the confidence to build live order tracking. We launched it the following year to great success.
Design demonstrates impact
Three examples of design proving its value, showing the impact a team can have on the organization and the products it builds:
Design able to properly diagnose the problem and deliver a solution that returns desired results.
Design able to anticipate organizational needs and drive quarterly roadmap planning while teams are focused on immediate concerns.
Design able to answer questions that seem unanswerable, identifying the features and innovation that users desire.
So yes, as a design team you should understand business metrics and organize your work around delivering those results. But beyond metrics, design practices a methodology that helps organizations move forward: identifying opportunities and quantifiably determining the best paths forward.