Web form specialist Caroline Jarrett on designing surveys that work

Creating a survey may be quick and easy, but so is creating a bad one. Is there a way to design a thoughtful, effective survey that gets results without boring customers to death?


For years, people have been abusing surveys. From dull subject lines to the wrong questions, from sending out questionnaires after the fact to sending them way too often, there is a lot that organizations have gotten wrong about how to design a survey.

Today’s guest is fighting to preserve the value of surveys in the face of all this misuse. Caroline Jarrett became interested in forms around 30 years ago while delivering optical character recognition systems to the UK Inland Revenue, which included processing tax forms. It didn’t work very well, and it turned out it was because people made honest mistakes when filling in forms that the technology couldn’t read.

Ever since, she’s been utterly fascinated by forms that are easy to fill in and the thought process behind them, and along the way, she picked up an interest in surveys as well. She is the coauthor of Forms that Work: Designing Web Forms for Usability, and she’s been working as a survey consultant for companies and government agencies that want to improve the way they design their surveys.

Creating winning forms and surveys has been top of mind for us lately. We recently launched Intercom Surveys, a new feature that empowers you to connect with customers wherever they are, capture valuable customer insights, and use collected data to drive more engaging, personalized customer experiences. But as Caroline points out, even the best tools need to be used thoughtfully in order to get meaningful results.

In today’s episode, we chat with Caroline about her seven-step process for designing, running, and actioning considerate surveys that get results and don’t bombard customers all the time.

If you’re short on time, here are a few quick takeaways:

  • Catching people in the moment and not days after the fact is the best way to get quick, accurate results. Ask people something simple and interesting that goes straight to the point.
  • Be selective about the sample and make it clear why that person’s exact feedback is important that week. That way, you’ll get much better response rates and data quality.
  • Use the smallest sample you can that can still deliver good quality results. It’s less work, it’s less intrusive, and it gives you the opportunity to do another one if the first one goes wrong.
  • Ask one or two questions in the language, the style of speaking, and phrasing that people use rather than your corporate jargon.
  • To preserve the value of surveys and not exhaust customers with constant questioning, do fewer, much smaller surveys.

If you enjoy our discussion, check out more episodes of our podcast. You can follow on iTunes, Spotify, YouTube or grab the RSS feed in your player of choice. What follows is a lightly edited transcript of the episode.


The makings of a form specialist

Liam Geraghty: Caroline, you’re very welcome. It’s great to have you on the show.

Caroline Jarrett: Oh, it’s great to be here. Thanks so much for inviting me.

Liam: Before we get into the nuts and bolts of surveys, can you tell us a little about how you got started in the world of forms and surveys?

Caroline: Well, completely by accident. I used to tell people that when other little girls were playing with Barbies, I was cutting out forms from paper and putting them together, and I just like to gauge how long it takes the audience to realize that’s obviously a joke. But what happened was that, back in 1990, I got the opportunity to be the project manager of a project that was using optical character recognition techniques. It was a Rank Xerox project for the European Patent Office to capture patents, which is a very, very specialist area, and the idea was that we would scan the patents and digitize them.

It turned out that optical character recognition was good enough, but there were some challenges around patents. We imagine that patents are interesting, but they’re really not. I mean, a good 30% of them are obscure chemical names. So, our OCR (optical character recognition) had to scan things like, oh, I don’t know, [A,C-hexifloro], and suffice it to say, a spell checker would not work. We also had to scan patents exactly as submitted. If there were spelling mistakes or nonsense or anything like that, it had to go to the patent examiner as written, for legal reasons.

“A thing I’m particularly fascinated by is how people answer questions and think through the process of answering questions”

In the European Patent Office, the patents could be submitted in English, French, or German, and they always had an abstract in each of the three languages. So, you were guaranteed to have English, French, and German in every submission. The majority of them were in English, but they could be submitted in French or German. German has longer words than English. Both languages have a lot more accents than English. So, I accidentally ended up learning quite a bit about the practicalities of implementing optical character recognition in a fairly complex environment.

Then, I got a job with a big computer systems integrator in the UK. At the time, they were called Bull Information Systems. They were delivering all of the in-office personal computers to what was called the Inland Revenue, which had struck a deal with Bull to try using optical character recognition to process some tax forms, and I got involved in project managing the delivery of these optical character recognition systems.

Long story short, they didn’t work at all, so I got permission from the revenue to go and observe the systems in use, and it turned out, for example, one of the systems was in use for processing forms that were, broadly speaking, filled in by retired people with very low incomes. Optical character recognition, to this day, cannot deal with a form where someone has written “see the letter attached.” There were many inventive ways in which people could mess up the form. For example, it had one line for your interest from your bank or building society account. If people had two accounts, they would write two lines of tiny little numbers one above the other in the one box, which is a perfectly reasonable thing to do, but my OCR machine had no chance of dealing with that, and that was around 1992.

“I found buckets of really interesting and useful literature in the world of surveys that helped me in my work on forms, and one thing led to another”

I started working on that at the beginning of 1992, and we’re just coming up to 30 years. I just got really, really interested in how we design forms so that people can fill them in correctly, and I’ve just not lost interest in that. Ever since, I’ve continued to be absolutely fascinated by that challenge, and here we are, still really, really interested in it.

I’m the form specialist, and I’m a pragmatist about surveys. I could not describe myself as a survey methodologist. To be honest, I couldn’t even describe myself as being all that enthusiastic about surveys. What happened was that I obsessively tried to find all the academic literature and books written on forms, and to this day, I have one small bookshelf in my office, which is pretty much everything I could find. There wasn’t much out there, so I started looking for other areas where people might have written about forms, and surveys came to mind. One of the really interesting challenges for me, a thing I’m particularly fascinated by, is how people answer questions and think through the process of answering questions. How do we write and pose questions in ways that get accurate answers? That’s also a problem that’s really, really important to the survey methodologists.

And so, I found buckets of really interesting and useful literature in the world of surveys that helped me in my work on forms, and one thing led to another. I started answering questions on mailing lists about surveys because I accidentally learned about them, and then my mentor, Ginny Redish, told me that I had to write a book on surveys, and here we are.

The thought process behind a survey

Liam: For people listening, could you take us through the process of creating a survey? I’ve been reading through your presentations and everything. I know it should take a couple of days, but bear in mind we’ve only got about 15 or 20 minutes.

“In terms of designing a questionnaire, you need to put that into the context of what you want to achieve with the survey overall”

Caroline: Well, most people come to me and say, “Can you have a look at my survey and give me a questionnaire?” Now, obviously, these two words are used very interchangeably. One of my favorite books on survey design actually has the word questionnaire in the title. That would be A.N. Oppenheim’s book. And one of my favorite books on questionnaire design, by Mick Couper, whose work I can’t recommend too highly to anyone who is interested in the literature, is called Web Survey Design, but it’s actually about questionnaires. Now, let me clarify. I’ve tried to use the word questionnaire for the question-and-answer thing we actually put in front of people, and the word survey for the whole process starting from, “oh, I think we might want to do a survey” right through to delivering some kind of number as the result. So, that’s how I use them. Of course, in real life, people tend to use them very much interchangeably.

In terms of designing a questionnaire, you need to put that into the context of what you want to achieve with the survey overall. And I have a seven-step process for designing a survey. It starts with working out your goals: figure out why you want to do it, what you hope to gain by doing it, and what sort of decisions you will make based on the results. Then you work through who you want to answer and how you can find a sample of them, because generally, with most surveys, we’re much better off using as small a sample as possible.

And then, think about the questions you want to ask, which obviously relate to the goals but also to the people you want to answer, because you want questions that make sense to them. Then, building that stuff into a questionnaire. Of course, these days, we have many different tools to help us do that, but the hard bit is the thought process that goes into it. Next comes getting it out: I borrowed the word fieldwork from the world of market research – they talk about putting a questionnaire into the field, and I couldn’t think of a better term than fieldwork for getting your questionnaires out to people and getting them back.

“Just having responses isn’t good enough. You’ve got to do something with them that gets those responses to the decision-makers”

At that point, you have some responses which hopefully people will have answered. So, you have to clean the responses, think about them, and analyze them in various ways. And then, I’ve called the final step reports because just having responses isn’t good enough. You’ve got to do something with them that gets those responses to the decision-makers.

For example, once I was wondering, “should I write this article or this article for my next column?” People on Twitter who follow me are predominantly my audience for what I write, so I thought, “I’ll make a little Twitter poll.” I offered them two options. It wasn’t in any way something statistical. I just needed to make a quick decision. Put it up, people clicked on the poll, and one topic was more popular than the other. That was good enough for me. I just wrote the thing. That probably took me 10 minutes to think it through and 24 hours to wait for the results, and it was better than hammering my head about which one to write.

That’s the tiniest end of the whole thing, and at the most elaborate end, we have things like 10-yearly censuses, and the preparation, planning, and testing for the next census starts before the existing census is finished. It takes 10 to 15 years to do a census. That’s the other end of it.

Asking one person the right question

Liam: So, once you come up with the questions that need answers, how do you go about targeting the correct people? And how important is it to catch them in the moment?

Caroline: I’m really glad you mentioned catching them in the moment because I characterize three different strategies for finding the people you want to answer, and catching them in the moment is, I think, one of the most effective. A lot of us see these popup surveys on websites. And so, you can catch people in the moment they arrive at the website, and you have an opportunity to ask them perhaps one, maybe two interesting questions. Just as an aside, the number of websites that waste that opportunity by asking the question, “Are you willing to answer a survey?” absolutely astounds me, because we already know that everyone’s going to click no, and all we’ve done is learn what we already knew.

I mean, if you are going to do that sort of catch-in-the-moment survey, my pro tip is to ask them something interesting. For example, “why did you come to this website today?” is something they can answer straight away without having to distract themselves too much from what they’re trying to do. So, catching them in the moment is a great strategy. Another example I really love is seeing people queuing at kiosks at the end of museum exhibits to give their feedback on the exhibit, unbelievably. It’s so unusual to see people actually queuing up to answer a survey. And not just once, I’ve seen it regularly if it’s a really interesting exhibit and a well-designed survey. That’s one of the strategies.

“If you’re going to use a list, try and be selective and make it clear why that person’s exact feedback this week is more important”

Liam: Yeah, there have been so many times I’ve gotten emails a couple of days after the fact where I’m not really sure how I felt about something, or I’m not really in the mood or in the zone to be answering questions, whereas I might have been if I’d been offered the survey straight away.

Caroline: Absolutely. And what you’ve got there, being asked after the fact, is the most typical way people try and find people, where they have some kind of list. In this case, the organization or business you interacted with has you on their system as a customer or someone who’s contacted them. They have a list of those people and they’re using it to generate a sample. And again, pro tip, don’t ask everybody. We’re all bombarded with this stuff all of the time. If you’re going to use a list, try and be selective and make it clear why that person’s exact feedback this week is more important. Could you remember the title of any recent email that you’ve had of that nature?

Liam: Oh god. Now you’re asking someone with a terrible memory for these things.

Caroline: It just shows how unmemorable they are, right? Another one comes in, “tell us about your recent transaction” or something. Yawn, I’ve got dozens of those things. Now, let’s try something that says, “Thank you so much for buying from us. Please can you answer our question of the week? We’re only asking 10 people.” Right? “Ooh, I’m one of 10. I will answer. It’s only the question of the week. Excellent.” You know? It would stand out. It would be a little bit different. It would be a short thing. It’s a question of the week. It’s not like your life history or something.

“Asking one person the right question is more useful than asking 10,000 people the wrong question”

These days, we’ve got so many techniques for sending tiny little surveys that could be much more engaging, short, and respectful of people’s time, and that could be sent to much smaller samples. We could get much better response rates and data quality that way, rather than this constant bombardment, where the response rates are terrible.

Liam: I suppose that leads us to the actual questions, and as you said there, “you’re one of 10 people we’re asking,” is it about asking one person or a few people the right question, rather than just blanket asking a hundred people?

Caroline: I think you may have read my slides. Have you seen the slide that says that asking one person the right question is more useful than asking 10,000 people the wrong question? I see this sort of thought process in colleagues and clients all the time. Somebody in a senior exec role says, “We need to find out about our customers.” Great. That’s a great thing to do. “We need to get 10,000 of them.” Why? And that’s it. There’s no further thought process involved in it. It’s lovely that they are doing research, but if they knew what decisions they wanted to make based on that, they could craft the thing a lot better.

“Why do you do the smallest sample? Because it’s less work, it’s less intrusive, and it gives you the opportunity to do another one if the first one goes wrong”

One of the things I’ve learned from the survey methodologists is that they try and do the smallest sample they can consistent with being able to get good quality results. Why do you do the smallest sample? Because it’s less work, it’s less intrusive, and it gives you the opportunity to do another one if the first one goes wrong, which can happen to the best of us.

I’m going to give you an example that is quite painful. Survey methodologists in Russia are finding that people just slam the door in their faces. “I’m not answering your questions. I might be thrown into jail because the Russian state is locked down because of the war they’ve inflicted on Ukraine.” So, boom, that would ruin your fieldwork, wouldn’t it? So, having a small sample gives you the opportunity to do another one another day, and it also gives you the opportunity to learn from the first one. So, with the exception of the censuses, which obviously cover everybody, they try to keep the sample as small and tight as possible.

The point you made is absolutely so accurate. Getting the right question in front of the right people; asking questions using familiar words in familiar ways; asking them in the language, the style of speaking, and phrasing that people use rather than your corporate jargon; all of those things really, really help get good quality results.

Preserving the value of surveys

Liam: The other thing that caught my eye was reading about what you wrote on fieldwork. You had an example of invitations and how invitations are so important, and one of the examples you had was a screenshot of an email that was very leading. It was basically encouraging you to leave a review, and it had five golden stars proudly displayed at the top. And I recognized these emails. They’re so transparent in what they’re trying to do that it kind of puts me off leaving a review.

“People think, ‘Oh, it’s so quick and easy to do a survey,’ which is sort of true, but it’s also quick and easy to do a really bad survey”

Caroline: Exactly. Using a survey to manipulate customers is clearly something that I cannot possibly recommend. And then the other one is using feedback to reward or punish your staff. So, you get these unedifying stories of customer service reps saying, “Please can you give me a 10 on the Net Promoter Score, otherwise I’ll be punished?” And that doesn’t do your organization any good whatsoever. It’s undermined the entire method. All you’re doing is measuring the ability of your staff to guilt-trip your customers. Is that something you really want to measure? I think not. Trying to get some of these routine bad behaviors eliminated would be a great thing. Don’t just routinely ask and ask and ask in ways that get gamed or manipulated. It’s an unhelpful thing to do.

Liam: Before we wrap up, do you have any particular plans or projects for the year?

Caroline: Well, secretly, I’d love to get back to forms. I sort of accidentally refocused my work into surveys, which is fun and I’m happy to do it, but really, I love forms the most. So, if anyone happens to have any forms things they’d like me to do, I’d be extremely excited. But also, having spent a lot of time writing the book, I’m very happy to have the opportunity to talk to people about their survey questions. For example, I do a lot of form studios, which are meetings where we look at a particular form. I’m also doing quite a lot of survey studios, where I get together with an organization that has a particular survey problem, and we spend an hour or so together just looking at the ins and outs of the sort of things I say, with a focus on what that organization’s looking at today. They’re a lot of fun. I certainly would love to do more of them as well.

Liam: Absolutely. And where can our listeners go to keep up with you and your work?

Caroline: Well, my website is called effortmark.co.uk, and I have quite a lot of stuff on there. I’ve been writing various articles and so on for about 20 years now. I also try, wherever I can, to release my slides for people to use under a Creative Commons license. So, if you want to pick them up and use them, providing you keep my strapline and the Creative Commons notice there, I’m very happy for you to have them. I’m also on LinkedIn and on Twitter, and I enjoy chatting with people about forms and survey questions. Those are the three places to find me, and of course, my contact details are on my website. I impress my nephews and nieces by saying, “If you google me, you get me.” They think that’s amazing.

Liam: I love that. We’ll link to all of those in the post. Caroline, thank you so much for joining me today.

Caroline: Oh, thanks so much. It’s been so fun. I really appreciate the opportunity.
