Uncovering User Expectations

Vitaly Mijiritsky · Published in UX Planet · 5 min read · Jan 6, 2024

An old joke tells of a student who approaches his university professor and asks, “Excuse me, may I ask you a question?” The professor replies, “Of course, how can I help?” The student asks, “Tell me, when you go to sleep at night, do you tuck your beard under or over the blanket?” The professor says, “You know, I’ve never really thought about it. I have no idea.” The student thanks him and goes on his way.

A week later the student sees the professor again: exhausted, weary, and bleary-eyed. The professor spots the student and yells at him, “You bastard! Because of you I didn’t get a wink of sleep all week! I’m not comfortable either way!”

Yesterday I tried to unsubscribe from the mailing list of some website. I clicked the link, landed on the page, and marked NO for each type of email I didn’t wish to receive. But just as I was about to save, I glanced at the column heading and realized it wasn’t asking which emails I wanted to keep receiving, but rather which ones I wanted to unsubscribe from. Meaning I was supposed to click YES. At first I was annoyed by the dark pattern: look, people come in with certain expectations, clearly no one reads this stuff, and the site is obviously taking advantage of that by using the opposite approach from what people expect.

But then I thought — maybe it’s just me. And even if it’s not just me, maybe it’s still not most people? Maybe less paranoid people who click an unsubscribe button do actually expect the site to use the “unsubscribe” approach, and since YES refers to that action, it’s actually fine?

I guess there are both kinds of people in the world. Now the question is, how do you make that call? How can you get that data? How do you uncover people’s hidden expectations?

So you could try asking people what they expect when landing on a page like this — whether they expect to select emails they want to receive, or select emails they don’t want to receive. Most likely you’ll get somewhat bewildered looks. And the bleary-eyed response you’ll get a week later is that they’re uncomfortable both over and under the blanket. The simple truth is we don’t think about these things. We only become aware of our hidden expectations when they clash with reality. If you ask someone to pay attention to what they expect when clicking an unsubscribe button, they’ll start over-analyzing themselves, and you’ve just ruined your own experiment. You need something cleaner and more objective. You need to measure, not ask.

As is often the case in situations like this, someone will chime in with the magic bullet for every design quandary and tell you, “What’s the problem? Just do A/B testing and see what works better.” Except A/B testing means launching two versions of an interface out into the world, then measuring something specific (like how many people click the “Purchase” button). Neither part of that definition really works here. First, there’s no way you launch two completely opposite versions of the interface to users: one where it’s “Subscribe to receive emails” and you’re supposed to click NO, and one where it’s “Unsubscribe from emails” and you’re supposed to click YES. That’s not the normal A/B testing of product cosmetics, playing with colors, images, and text changes; this is creating fundamentally opposite functionality (which does ultimately come down to just two letters, but that’s not the point).

And even if we ignore that, what exactly are we going to measure here? The number of people who click NO in the “subscribe” condition versus the number who click YES in the “unsubscribe” condition? But how do we know what their expectation was when they entered? Did they click what they clicked by mistake, like me, or consciously?

Confused? Take a number. It basically looks like this:

[Illustration from the original post]

But I hope you see the light at the end of the tunnel here. There is one thing we can know with pretty high certainty — their true intent. If they came in through the “unsubscribe” link, we know exactly what they wanted to do — stop the emails. That means we don’t need any A/B testing, their behavior on our site will reveal their true expectation. And it doesn’t even matter if we went with an “unsubscribe” interface like the real site, or a “subscribe” interface as I expected. If more people click YES, that means their expectation is to be unsubscribing. If more people click NO, their expectation is to be subscribing. Because we know exactly what they wanted to do, and we know what they actually did. Crossing those two reveals their expectations. And it’s important to look at their first click, not the form they ultimately submitted, because we want to catch them before they realize something might be wrong and start analyzing the UI more carefully.
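
If you wanted to run this for real, the instrumentation is tiny: record only the first click, together with the intent you already know from the entry link. Here’s a minimal TypeScript sketch; the endpoint, CSS class, and data attributes are all hypothetical, just to show the shape of the logging.

```typescript
// Hypothetical first-click logger for the email preferences page.
// The endpoint URL, CSS class, and data attributes are illustrative.
// The key idea: log only the FIRST click, paired with the intent we
// already know because the user arrived via the unsubscribe link.

let firstClickLogged = false;

document.querySelectorAll<HTMLElement>(".email-toggle").forEach((button) => {
  button.addEventListener("click", () => {
    if (firstClickLogged) return; // later clicks are the user self-correcting
    firstClickLogged = true;

    navigator.sendBeacon(
      "/analytics/first-click", // hypothetical endpoint
      JSON.stringify({
        intent: "unsubscribe",                  // known from the entry link
        wording: document.body.dataset.wording, // which phrasing this user saw
        clicked: button.dataset.answer,         // "yes" or "no"
      })
    );
  });
});
```

Crossing intent against clicked, per wording, tells you whether people’s first instinct matched the interface or fought it.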

The ironic part is that this whole mess doesn’t even mean you should design the site according to the expectation you uncover, unless it really approaches 100% of users, which it never does. You design the site in a way that works for as many people as possible, whatever their expectations. So you don’t need to play around with opt-in or opt-out headings. Just state explicitly on each line exactly what it does. In our case it’s really simple: switch the buttons from Yes/No to On/Off, for example. That would give us lines like “Marketing emails on/off”, and I think that leaves no room for interpretation.
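
As a rough sketch of that design (the markup and names are mine, not from the site in question), each row restates its own on/off meaning, so there’s no column heading left to misread:

```typescript
// Sketch: a self-describing preference row. The visible text always
// states what the setting currently is ("Marketing emails: on"),
// so there is no ambiguous Yes/No column. Names are illustrative.

type EmailPreference = { label: string; enabled: boolean };

function renderPreferenceRow(pref: EmailPreference): HTMLLabelElement {
  const row = document.createElement("label");
  const text = document.createElement("span");

  const toggle = document.createElement("input");
  toggle.type = "checkbox";
  toggle.checked = pref.enabled;
  toggle.addEventListener("change", () => {
    // Restate the full meaning on every change, never a bare Yes/No.
    text.textContent = `${pref.label}: ${toggle.checked ? "on" : "off"}`;
  });

  text.textContent = `${pref.label}: ${pref.enabled ? "on" : "off"}`;
  row.append(toggle, text);
  return row;
}

// e.g. document.body.append(renderPreferenceRow({ label: "Marketing emails", enabled: true }));
```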

“So why,” you might say, “couldn’t you just tell us that from the start and skip the whole mental gymnastics?” Well, that’s because research is one thing and design is another. Also, in a different scenario the solution might not be so straightforward, but the research method still holds.
