While the labels used to describe what Jen Matson does—interaction designer, information architect, user experience designer—may change, for over 15 years she’s been doing much the same: designing, developing and leading teams to create the best possible user experiences for web and mobile. She is currently Senior Manager of UX Design at Nordstrom.
She’s also a maker and a teacher, having created her own prototyping framework and spoken on the topics of mobile web design and prototyping at conferences and at the School of Visual Concepts in Seattle.
IA Summit 2015
Topic(s): analytics and data
Building Consensus Around the Meaning of User Behavior

Task maps. Customer journeys. Cognitive walk-throughs. All are artifacts of our process of seeking understanding about our users that we likely create on a regular basis. But how can we better connect that work to the process of website data collection and analysis?
In this talk I’ll show how we can adapt our existing process and artifacts to drive the definition of what user data we need to collect, as well as how to better analyze and validate what we do, including:
- Using existing site analytics to set a behavioral baseline.
- Defining what we want to measure based on task maps and other UX artifacts.
- Defining the data we need to collect to validate business goals and direction.
- Using task maps and customer journeys as the blueprint for both the in-page interaction patterns and site navigation paths we want to track and measure.
- Applying behavioral data from a newly launched product to those same task maps and customer journeys as a way of telling an updated story about our users.

The result? A shareable document that can be used to tell the evolving story about our users as we continue to learn more about them through data.
Jen Matson: I’m here to talk about data and how I use it, and how we use it at my company to help us design better products.
Everyone loves data. At least, it’s a very hot topic. A lot of the big companies use data for all sorts of decisions; I came from one of them, most recently Amazon.
I found it helpful as a designer to become more comfortable using it, and not just for validating designs and being able to say, “Hey, look. What I did actually made an impact.” It means looking beyond user research data, digging into Google Analytics, and working with business teams so that we can make better decisions together.
But there are pitfalls. Warning. Danger. Obviously, data itself does not equal insight. You can be capturing and collecting data, but you not only need to look at what you collect, you need people to help you out, like business analysts, to get a sense of what’s going on.
Another thing I’ve encountered is that a lot of companies like to talk the talk about using data, but in some organizations it is a little more difficult to be really rational in using it. I found that it takes a lot of organizational maturity to set aside biases and use data in an effective fashion.
I’m going to talk about a project at my company, RealSelf, and our process and how we do things. I’m one of two UX designers. It’s a really small company, a startup in Seattle.
What we do is a unique niche. We’re the website that’s the largest community for people considering elective medical procedures: plastic surgery, dermatology and dentistry. A lot of people come and visit our site, even though we’re not that well known.
Last year, we had over 51 million unique visitors viewing half a billion pages. People spent a lot of time researching this, as you would imagine: five million hours.
These are procedures that are expensive. They’re often very invasive, and people find that they can’t necessarily talk to their friends and family about it. So, they come to the community, our website, to do that.
Perhaps not surprisingly, our users are overwhelmingly mobile. 72 percent is the latest number, and that’s only going up, because when you think about it, this is personal content. A lot of the photos we have on our website are pretty graphic, showing before, during and after procedures. You’re not going to want to use your family PC or your work computer to be looking at that.
The core team that we have is UX design, a product manager and developers. We also have business analysts and community managers, who are other important members of our team. The analysts obviously help us interpret the data. The community managers are our connection to our users, because they moderate all of the site content, all of the postings. That’s the basic team.
In terms of approach: Agile process, two-week sprints. We all gather together, do our stand-ups together, and try to talk as much as possible. We are very much fans of building and launching small, fully realized things as opposed to slightly larger, half-baked things. Not that I’ve ever been a part of such a process.
That’s the overall way that we work, but of course, the data piece? What is it about the data? This is where what we’re doing is perhaps a little different. We share a lot of data in addition to collecting it.
We have weekly meetings. Every week we sit down with our business analysts for a full product team meeting. They go over the Google Analytics data, and there are certain things that we track: visitors, number of reviews created, page load time. There’s a whole bunch of things, and we get to look at the trend lines, year over year.
We have a robust discussion about it. It’s great, because it really is a discussion. The analysts will say, “Well, we saw this spike and we think this is why it happens.” Then someone will raise their hand and be like, “Oh, we actually got some news coverage from a couple of local TV affiliates on the popularity of botox, so we think that’s why that topic is trending.”
The community managers, we also meet with them on a weekly basis, so they can share their insights with us and we can tell them what’s coming up in the product pipeline and get their feedback.
This is all really helpful stuff. But how do we interpret that data? That’s the ongoing challenge.
I’m going to walk you through a project that I worked on recently for our question and answer page so you can get a sense of how we work, as well as our in-development, eight-step process for how we get products out the door. We’re a small company. We’re always changing it, but this is what’s working for us.
This is what we call a “question and answer page.” We get a lot of people that come to our website because they search on Google. They have a question. They type it in, and then they land on our page.
This page is created whenever a user on our website types a question and asks it. Then we have our board certified doctors, we only let board certified doctors participate. They answer the questions for everyone to see. This is valuable content, and we want people to check it out.
Since three quarters of our traffic is on mobile, this is really our web page, and it looks a little bit different there. Like many companies, we’re still transitioning to mobile. We think about it first and foremost, but this page definitely could use some help.
We identified some things that we needed to work on. Step one: what’s the problem, or is there an opportunity here?
It’s not just that something’s wrong. Maybe we can do something great. Why do we think this is worthwhile, and all important, what data do we have to support it? We want to make sure that we’re looking at that.
In this case, Google Analytics was able to tell us that we have a lot of people dropping off on this page compared to other pages on our site. This is the one with the highest bounce rate. We get a ton of traffic, but a ton of people abandoning.
We also have a ton of great content that is very relevant to the questions people have, because chances are if they want to know about cost, they also want to know about financing options. Maybe they have other pre-op questions. We can help them with that.
Here are the Google results that show RealSelf. We’re the second organic result. We’re also number three. We really cover that long tail. We get a lot of traffic that way.
Here’s a redacted snapshot from our internal Google Analytics dashboard that tells the story pretty darn well in and of itself.
The left hand column is all visits to all of our question pages. Then the right hand column shows traffic from that page to other pages on our website. That big red bar down the side of the first column, that’s all the people that only view that page, and then they’re out of there. They never look at any other pages on our website.
Clearly, we want to reduce the size of that red bar. Get more people going to other pages on our website.
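As a rough illustration of the kind of comparison that dashboard makes, here’s a minimal Python sketch that computes bounce rate per landing page. The page paths and numbers are invented for illustration, not RealSelf’s actual figures.

```python
# Sketch: comparing bounce rates across landing pages from a
# hypothetical analytics export. All names and numbers are made up.

def bounce_rate(entrances, bounces):
    """Bounce rate = single-page sessions / total entrances to the page."""
    return bounces / entrances if entrances else 0.0

pages = {
    # page path: (entrances, single-page "bounce" sessions)
    "/question/tummy-tuck-cost": (120_000, 93_600),
    "/tummy-tuck": (80_000, 36_000),
    "/photos/tummy-tuck": (45_000, 13_500),
}

rates = {page: bounce_rate(e, b) for page, (e, b) in pages.items()}
worst = max(rates, key=rates.get)
print(worst, f"{rates[worst]:.0%}")  # the Q&A page bounces far more than the rest
```

The point of laying the data out this way is the same as the red bar in the dashboard: one page type stands out, so that’s where the effort goes.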
Step two. Defining your users. That’s something all of us should know a lot about. What’s the profile of the person visiting this particular page, engaged in this task? Obviously, what’s motivating them and how can we help them with what we have to offer?
These are “Googlers” Googling for answers. They just want to get the answer first. Some of these people may want some additional information, but they’re unaware of what RealSelf is. They don’t particularly care what our brand is. It’s fine.
Here is a little sketch I created with a very simple task flow. Like I say, we do things kind of scrappy. I just used this as a way to show everyone else on my team what I’m thinking: here’s how users go through the flow. Am I right about the background and the strategy? I check with the community managers: this is kind of what I had in mind. I want to remove distractions, so I just break out a Sharpie, write it all down on a piece of paper, and use that to start discussions.
Step three: we want to define the business result that we want. The business folks definitely let me know: making money.
What does product success look like? In this case, reduce the bounce rate. The business does want to convert some of these casual users to active users, who contribute content to our website that makes it more valuable for everyone, so that is a good thing.
They also want to increase awareness of the brand. But as we’ll find out, when we’re trying to identify the intersection between business and user goals, where they overlap to make awesome products, it’s really getting the answer and providing the relevant content.
Brand? Well, if they engage, they’ll learn more about our brand.
What are we going to design and build? How are we going to do it? This is the big question. We have a pretty good idea of what we’re shooting for.
What changes are we going to make? How long is it going to take? How many people will it take to do it? It’s sort of the Agile mantra of delivering the highest value to the user quickly.
In this case, we want to remove a lot of junk on that page. There are a lot of things on there which are preventing us from showing the content up front, especially on mobile. In order to do it fast, we decided to tweak the existing design.
I sketched a whole bunch of things that were really awesome ideas. Some of it’s there. The little “show more” and expand and collapse. It’s like, “That’s great, but let’s just try and fix the biggest problems we have first.”
I sketched some wireframes: make my little “doctor answers” with expand and collapse. We weren’t able to do that, but we could reduce the size of the breadcrumb and move the search up into the header. Anything to move that content higher up, so people can see what we have to offer.
Here’s the big one that takes a lot of time. You’ve probably noticed up until now, it’s just sketching. It’s scribbling. It’s not putting a huge amount of investment in the design, because we haven’t yet had the conversation about how we’re going to measure this and how we’re going to launch this.
Everyone on the team at least needs to get an idea of what’s in my head and what I’m thinking. The developers need to get a sense of how big of a thing it is to build that.
We need to figure out what is going to determine success or failure. What data do we actually need? This is a big one. Do we have the ability to measure it now?
A lot of times, I’ve seen it happen where the metrics are determined by what we’re capable of measuring now versus what we actually need to measure.
Then, obviously, how many people are going to see this? What percentage of our audience are we going to expose it to? What devices? What regions? What’s our target?
In our instance, top line we want to have a reduced bounce rate percentage in Google Analytics. We didn’t pick a specific number. We weren’t necessarily sure, but we definitely wanted to see that go down.
We wanted to see more clicks on specific links to related content. There are a lot of links on the page, and we determined the high-value ones, the ones that would help the Googlers who are open to learning more: the ones that took you to the “tummy tuck” topic page, for example, or to photos. People on our site love photos. They could just click on photos all day long. If we can instrument that and see people click on those content types, that’s success.
Unsurprisingly, we’re targeting mobile for the initial test. We decided, you know, it’s fast. Let’s just look at mobile and not even do a design for desktop. We’re just going to release this to our mobile users, and we’ll be able to get the information that we need.
We released to a small percentage of users to start; this is going to vary based on your organization.
When I was at Amazon, one percent of our users was more than enough for statistical significance, but at a smaller site, one percent is far fewer people. That’s where we work with the business analyst folks, because they have all these formulas to determine statistical significance.
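Those “formulas” are typically sample-size calculations for comparing two proportions. Here’s a sketch of the standard normal-approximation formula; the baseline and target rates are made up, and real analysts may use different corrections or tools.

```python
import math
from statistics import NormalDist

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.8):
    """Approximate sample size per variant to detect a change from
    baseline rate p1 to rate p2 with a two-proportion z-test."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)  # critical value for a two-sided test
    z_b = z.inv_cdf(power)          # critical value for the desired power
    p_bar = (p1 + p2) / 2
    numerator = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# e.g. detecting a drop in bounce rate from a hypothetical 78% to 75%
print(sample_size_per_variant(0.78, 0.75))
```

The takeaway matches the talk: the smaller the effect you want to detect, the more users each variant needs, which is why a tiny percentage works at Amazon scale but not at a small startup.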
Then the fun part: design the final version, build it and instrument it.
Creating the final visual design: in my process, because it’s tweaking an existing page design and I do have a little bit of coding skill, I’ll either go into the Chrome Inspector and tweak the code to do the design, or, for mobile, I can connect my phone to my computer and use Safari’s developer tools to make the same changes, and they show up on my phone. Then I screen grab it for the devs.
Then the design needs to be built in Optimizely, which is the web-based tool that we use to run multivariate tests. Basically, the way it works is you go in and insert the code that you want for the new version of the web page.
For this particular project, I didn’t do that, because there were a lot of variables, but for smaller projects, I learned how to go in and make those code changes myself. Developers love that, because it’s one less task on their plate. Then, finally, I set up the click goals and the other goals in Optimizely. Those are keyed off of user behaviors. I’m going to show you some examples.
Here is the revised design, first of all. It doesn’t look all that different. It is minor changes to clean things up. We’ve got our “doctor answers” there, with the “have a question, ask a doctor.” “Ask a doctor” was much higher up on the previous page, but I was like, “These people just want answers. Asking a doctor is for when you’re more engaged.” That was a choice, it’s an assumption.
Here’s a screen grab from Optimizely, where it has the desktop version of the page behind. You need to use a little imagination when you’re using Optimizely to set it up if you’re doing it for mobile.
Here are experiment goals that we set up in Optimizely. This is, again, instrumenting all of those clicks, including, I should note, things we don’t necessarily care about, because we want a baseline. So we have clicks on the doctor card and the follow button.
We don’t actually expect the Googlers to click on those things, but we want to make sure that we’re tracking it, so we can at least see when it goes down, as we expect it will when we move things further down the page, and how that affects the things that go up. What is the relationship between things going up and down?

Finally: launch, observe and learn. What did we learn?
We learned that the “show more” link, which shows you the full answer (which is what people want), got a lot more clicks when we moved things up on the page. Good thing. That was a nice result. Success.
That follow button, which used to be next to the doctor card in the old design, we removed it. That was kind of a radical decision. “Let’s just remove it. Get it out of there.” Then we got the red. Conversion rate went down. Again, that is by design, totally OK.
This is why you want to have all those conversations with your stakeholders up front, because people who are not used to seeing data would freak out if they saw that red. “Why is it going down? I don’t understand.”
It’s totally fine, because yes, they’re Googling. They want the answer. They are not engaging with doctors at this point in their journey, and because we’ve talked about who our target is and who our users are, we know this.
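For readers curious what the significance check behind “the red” and “the green” might look like, here’s a minimal two-proportion z-test sketch. The click counts are hypothetical, and this is the textbook approach, not necessarily the exact statistics engine a tool like Optimizely uses.

```python
from math import sqrt, erf

def two_proportion_p_value(clicks_a, n_a, clicks_b, n_b):
    """Two-sided p-value for the difference between two click-through rates,
    using the pooled standard error and a normal approximation."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # standard normal CDF via erf, doubled for a two-sided test
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical: "show more" clicks in control vs. variant
p = two_proportion_p_value(400, 5_000, 480, 5_000)
print(f"p = {p:.4f}")
```

A small p-value here is what lets the dashboard call a change significant, whether the metric moved up by design or down by design.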
Another quick lesson. [laughs] Unrelated to this experiment, but another thing you should know when you’re setting up these types of tests, you always want to test before you launch to everyone. You should test every single thing, because the problem is once you start doing it, you get addicted to it. Then you’re like, “Why did that happen?” “I don’t know.” “Oh, crap. We didn’t test it.”
Our photo upload was so clunky that people created multiple review entries to upload multiple photos. Then we made it really easy. We fixed it.
Guess what? Reviews went down. That’s one of our really big metrics, and the CEO was like, “Our number of reviews has gone down. This is so terrible.” That’s because our initial data was misleading. I mean, it was accurate, we didn’t actually break anything. We made the user experience better.
There’s a way in Google Analytics for you to insert annotations whenever changes are made, so when people are looking at the data, they can see, “You launched this A/B test.” “You put in this new photo uploader.”
This is our process. We try and follow it. It’s working well for us and you might want to try it out if you’re interested in using data to help make better designs and products.