97 UX Things

The Right Screener Sets Up Your Recruit and Research for Success (feat. Katelyn Thompson)

April 18, 2023 · Katelyn Thompson & Dan Berlin · Season 3, Episode 5

Katelyn Thompson discusses her chapter "The Right Screener Sets Up Your Recruit and Research for Success."

Sponsored by Watch City Research
Watch City Research is your trusted UX research partner

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Dan Berlin:

Hi everyone and welcome to another episode of the 97 UX Things podcast. Dan Berlin here, your host and book editor. I'm joined this week by Katelyn Thompson, who wrote the chapter The Right Screener Sets Up Your Recruit and Research for Success. Welcome, Katelyn.

Katelyn Thompson:

Hey, Dan. Thanks for having me.

Dan Berlin:

Thanks for joining the podcast. Can you please tell us a little bit about yourself?

Katelyn Thompson:

Yeah, sure. I'm Katelyn Thompson. I use she and they pronouns. I work as a UX researcher at a nonprofit supporting LGBTQ youth, and I'm currently based out of Western Massachusetts.

Dan Berlin:

Nice. Can you tell us your UX journey, please? How did you discover UX? And how did you wind up where you are today?

Katelyn Thompson:

Yeah, I was lucky to have found UX early on. I was actually a junior in high school, and I was touring Bentley College. After the tour guide took us through the fancy trading room, they took us by the Design and Usability Testing Center, and they told us a story about how the Center had been featured on 20/20 for a story about how parents often struggled with toy instructions and setting up toys for their children to use. At the time, I had planned to study web design and thought that the UX grad program would be a great supplement to my studies in web design. But once I started in the grad program, I realized that UX research was actually really interesting. I loved talking with people and observing how folks interacted with products and systems, and then taking that feedback back into the design and development processes. Since going through the grad program, I have done a lot of different things. I had some internships during school, actually, and I ended up working at the Design and Usability Center, which was cool to kind of come full circle. I then also worked for a few months after school doing usability research on tax software, and then spent eight years trying to make more customers at an office supply retailer say "that was easy." And after helping folks explore destinations and plan travel, I've now landed at an LGBTQ youth crisis support line, working primarily on the systems and tools that our staff use while on shift.

Dan Berlin:

Great, yeah, thanks for that journey. You mentioned focusing more on research and some of the reasons behind that. Can you tell us a little bit more about that? What about research draws you in?

Katelyn Thompson:

Yeah, I think getting to connect with folks and learn about their experiences and how the products and systems that I'm working on fit into their lives is the thing for me. And also helping them achieve their goals more easily is really exciting. So I love that contextual inquiry work and the features that products can offer.

Dan Berlin:

Yep. Are there any research studies in particular that you enjoyed from your career?

Katelyn Thompson:

I think one that was really interesting to me when I was back at Bentley was one around a phone system that helped elderly folks with support as needed in emergencies. One of the key features that we were interested in was around the setup of this phone. At the time, it was something that needed to be plugged into a telephone line as well as a power plug. And so we actually had the opportunity to go out and visit folks in their homes and have them walk through the process of the installation. We also gave folks the opportunity, if they had a caregiver or a family member who they might typically call on to help them do such a thing, to have that person also participate in the session. So that was an exciting opportunity to see that environment and understand what that process would be like for folks.

Dan Berlin:

You bring up a great point about caregivers. When doing that sort of research, you should be deciding in advance how much help a caregiver can give or not, right? What did you do in that study? Do you remember?

Katelyn Thompson:

Yeah, I don't remember exactly the specifics. I think we allowed those folks to be there, and we also interviewed the participants about what level of support they typically have a caregiver give. Sometimes it's a family member who visits them every week, and so they could have someone do that. And some folks lived in assisted living, where there were staff available to help. But some people had to move a couch to access the telephone line, so clearly that wouldn't have been something that most of the participants would have been able to do. So I think we noted a lot of the hurdles that folks ran into when doing the physical setup. And then there was also some setup with the system where you had to call a number to confirm that your phone worked and activate the service. So we tried to have the elderly folks be the ones who did more of that setup and then have the caregiver step in as needed, as opposed to having the caregiver be the primary participant.

Dan Berlin:

Right. Well, thanks for all of that. Your chapter, the Right Screener Sets Up Your Recruit and Research For Success. Can you tell us about that, please?

Katelyn Thompson:

Sure. It's about the process of writing screeners and making sure that you get the right people, so that your research activity lets you learn what you want to learn from it. Before I jump into the act of writing a screener, I want to talk a bit about the recruiting process. When I first started in UX, recruiting was typically done either by using your own list of customers, whether because they purchased on your website and provided an email address or because you had some other way to contact them, or by using a recruiting agency, who typically had their own panel of folks. Sometimes they would cold call people and walk through the screener questions with folks on the phone.

But in the past few years, there have been a number of new tools that have made finding participants so much easier. There are remote, unmoderated tools that have their own panelists, where you create a survey screener and folks complete it in real time. There are other online panels with folks just waiting to participate in your research, who complete the screener, and then you select from a list of qualified folks to invite to participate. This has made the process of recruiting and scheduling participants so much easier, but it's also made the art of writing screener questions so much more important. With such a large group of people to choose from, and most of the screening typically happening automatically, you want to make sure that you're getting the right folks who will be qualified to provide the feedback that you're looking for.

So with that said, with regards to writing the screener, the first thing that you'll want to figure out is: what are the criteria of the folks that you're looking to talk to? Are there certain characteristics or behaviors that you need participants to have? Let's imagine that you're running research on a redesigned flight booking tool. Maybe one criterion is that you want folks who have booked a plane ticket online recently. You don't want to start a session about your flight booking website and have a participant say, "Oh, I usually just have a travel agent do all this stuff; the websites are too confusing for me." That is, unless of course you're looking to recruit people who don't typically use websites, to see if your website is easy enough for folks like that.

Dan Berlin:

To use the naive approach.

Katelyn Thompson:

Exactly. Once you've got your list of criteria, you'll want to prioritize it. What are the characteristics that are must-haves? What are the nice-to-haves? You'll also want to get alignment with your stakeholders on this, so that once your recruit has started, you'll know where the areas are where you can make concessions if needed. Maybe originally you wanted somebody who typically pays to check a bag when they're flying, but as the recruitment goes on, you're having trouble finding those folks. Maybe this isn't a key element of the website that you're researching. Maybe you haven't changed the design there. Maybe that's something you can forego, and then you could ask participants to imagine that they needed to check a bag and that price was no object with regards to booking.
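
To make that must-have versus nice-to-have split a bit more concrete, here is a minimal sketch of how the screening rule could be expressed in code. It is an illustration only, not something from the chapter or from any real recruiting tool, and the criterion names and answers are made up.

```python
# A minimal sketch (illustration only) of the must-have vs. nice-to-have split
# for a hypothetical flight-booking study. Criterion names are made up.

MUST_HAVE = {"booked_flight_online_past_year"}   # failing any of these disqualifies
NICE_TO_HAVE = {"pays_to_check_a_bag"}           # can be relaxed if the recruit stalls

def screen(answers: dict[str, bool]) -> tuple[bool, int]:
    """Return (qualified, nice_to_have_count) for one screener respondent."""
    if not all(answers.get(name, False) for name in MUST_HAVE):
        return False, 0                          # fails a must-have: screened out
    score = sum(answers.get(name, False) for name in NICE_TO_HAVE)
    return True, score                           # rank qualified folks by score

# Qualified, but a lower-priority invite than someone who also pays bag fees.
print(screen({"booked_flight_online_past_year": True, "pays_to_check_a_bag": False}))
```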

Dan Berlin:

Do you tell the recruiter in advance where that flexibility is? Or do you tend to hang on to that until later?

Katelyn Thompson:

It can depend. Typically, when working with recruiting agencies, I tend to use the same folks over and over, so I have that relationship with them. And so I will try to tell those folks up front that this is something we're willing to compromise on, but these are the questions that we're definitely not willing to compromise on. That allows them, I find, to do their job a little bit better. What you don't want is for them to come back and say, "We were unable to find anyone who is willing to pay bag fees," which is not true, because there are a lot of people who pay bag fees, myself included. So I find that the more information you can provide them, the better they'll be at giving you the right participants. Setting those firm ones is maybe more important than the ones that you're willing to wiggle on, but it's the same process.

Dan Berlin:

Gotcha.

Katelyn Thompson:

So another thing that you'll want to consider when you're writing your screener is the format or method in which your questions will be asked. Are they going to be shown on the screen one at a time, or will they be shown on the screen all at once? Sometimes with the online tools you don't actually know, so I try to sign up for these tools as a participant myself so I get a better feel for what it's like to go through the tests or the screeners. That can be helpful for understanding the format. Is a question going to be read aloud over the phone? If you are using an online survey tool, does it have the capability to handle complicated screening criteria, where multiple options could be selected to qualify somebody? Or does the tool only use simpler logic, so that you might have to break your question into two separate questions in order to allow or disallow somebody through?
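
Here is a small sketch of that "complicated versus simple logic" point: one version qualifies on a single multi-select question, while the other splits the same intent into two yes/no questions for a tool that only branches on one answer at a time. The brands, question structure, and function names are hypothetical, not taken from any specific tool.

```python
# Illustration only: the same qualification intent expressed two ways.
# Brands and question structure are hypothetical, not from any specific tool.

QUALIFYING_AIRLINES = {"American Airlines", "British Airways"}

def qualifies_multi_select(selected_brands: set[str]) -> bool:
    """Tool supports multi-select logic: qualify if any selected brand is on our list."""
    return bool(selected_brands & QUALIFYING_AIRLINES)

def qualifies_split(flew_past_year: bool, flew_qualifying_brand: bool) -> bool:
    """Tool only supports simple yes/no branching: question 1 gates question 2."""
    return flew_past_year and flew_qualifying_brand

print(qualifies_multi_select({"British Airways", "Delta"}))  # True
print(qualifies_split(True, False))                          # False
```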

Dan Berlin:

Why is it important to know about whether it's one question per screen or multiple questions?

Katelyn Thompson:

Yeah. So one thing that I like to think about is having your questions build up to your requirements. Thinking about this flight example, I might ask people what brands they've booked with, because maybe I either want them to have booked through my website, or to not have booked through my website or flown with my brand. And so I might not want to start off with that question of which brands they're familiar with. I might want to start off with something like, "Hey, have you flown in the last year?" If the questions are all on one page, you might have the question "What have you done in the last year?" with "flown on an airplane or purchased air tickets" as one option, and then the next question is "What flight brands are you familiar with?" If it's all on one page, that first question is not super helpful, because people can go back when they see that the rest of your questions are about flying and fudge that first choice: "Oh, yeah, well, I flew. It was like two years ago, but maybe I'll choose it here." So that's just something that I think about: if they're all on one page, the element of progressive disclosure is not as useful.

Dan Berlin:

Yep. Is there a way around that?

Katelyn Thompson:

Get the tool to give you the option to show the questions one at a time, or add page breaks. The tool that I'm thinking of, I believe they do allow for page breaks, but I think it is an added feature; you have to be at a certain level or pay for advanced recruiting or something. So I do think you can get around it, but you might still get those people who are more likely to fudge their way in... the people that we used to call the professional testers.

So once you've got your criteria defined and ranked in terms of priority, and figured out the methods that you're going to be using, it's time to actually write the screener questions. The same principles that apply to survey and task writing also apply to screener questions. You want to make sure that you use language that folks will understand, and avoid jargon and acronyms that they might not be familiar with. You also want to make sure that you're giving folks an out to the question. So for that question I mentioned before about the brands of airlines that you've flown, you always want to have a "none of the above." Even if you asked the question before, "Have you flown in the US?", and you list all the brands, you always want to give folks that "none of the above." You don't want to put folks in a position where they have to lie in order to advance to the next question.

And I did mention that progressive disclosure earlier, but sometimes, as is often the case in the UX space, it can depend. Sometimes you might want to start off with a more specific question. Maybe you're looking for folks traveling to Washington, DC during the weekend of July 4. You might want to start off with that as one of the first few questions, instead of having it be the 10th question, where you've wasted folks' time before they fail to qualify on such a specific requirement.

And also think about the method for asking the questions, circling back to that now that you're writing them. If you were asking that brand question over the phone, instead of reading the 10 brands and then having people remember which 10 were listed and which ones they've flown, you might be better off asking, "Thinking about the last year, have you flown on American Airlines?" and they can respond yes or no; "Have you flown on British Airways?" and run through the list that way. Whereas on a survey, you would just list them all and they could pick them; you wouldn't want to ask yes or no questions for each of them in an online survey.
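
As a rough illustration of the ordering point, here is a sketch of a screener that asks questions one at a time, puts the most specific question first, and treats "None of the above" as a disqualifier, so unqualified folks are screened out early instead of after ten questions. The questions, options, and qualifying answers are all invented for the example, not taken from the episode.

```python
# Sketch of the ordering idea: most exclusionary question first, every question
# has an "out," and the flow stops at the first disqualifying answer.
# Questions, options, and qualifying answers are hypothetical.

SCREENER = [
    # (question, options, answers that keep the respondent in)
    ("Are you traveling to Washington, DC the weekend of July 4?",
     ["Yes", "No"], {"Yes"}),
    ("Which of these airlines have you flown in the last year?",
     ["American Airlines", "British Airways", "None of the above"],
     {"American Airlines", "British Airways"}),
]

def run_screener(get_answer) -> bool:
    """Ask questions in order; stop at the first disqualifying answer."""
    for question, options, qualifying in SCREENER:
        answer = get_answer(question, options)
        if answer not in qualifying:
            return False          # disqualified early, minimal time wasted
    return True

# Example respondent who is screened out on the very first question.
print(run_screener(lambda q, opts: "No" if "July 4" in q else "American Airlines"))
```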

Dan Berlin:

That doesn't get too onerous for the respondents that way?

Katelyn Thompson:

I mean, it does take a little bit more time, but maybe you would reduce the list that you read to folks. So maybe if you had 10 brands in the online version, you'd reduce the list to five, including the one that you care about. But yeah, the phone screener does take a lot longer. And again, it can depend. I haven't worked on that side of things, so I don't know exactly how they do it, but sometimes they might ask questions differently if it's their panel versus cold calling someone. If there are people in the panel who have already opted into spending time seeing if they qualify, maybe they go through all the questions in order. If they're cold calling someone, maybe it's, "Hey, we're looking to do research around booking flights. Have you flown on this brand recently?" They might pick the question that's most important to lead with. But I can't speak too much to that, because I've never worked on that side of things. Thinking about the method, when you have your intro meeting with your recruiting agency, they might bring something like that up: this is how we typically do things; we find it works better than reading 10 options when you maybe only care about one. Or: let's see if everybody qualifies and then get the full list of what other brands they've flown, because sometimes it's helpful to know what other brands they also fly with.

So once you've got all your questions written, it's time to start the recruit. And again, depending on the tool, sometimes you just launch the recruit and the sessions happen; sometimes you have that ability to select folks from a qualified list, again referencing your original prioritization of requirements. And then you're ready to start the research. But like I always say, it's always good to evaluate and iterate on your own work. Maybe you've got your participants in and you realize that a lot of the folks sitting in for the sessions have only been traveling for business, and that experience of booking flights for business is very different; typically, they use an internal tool. So maybe they flew on our brand, but they didn't actually book on our website. So maybe there are some additional questions to ask around whether they use our website, or whether they're traveling for leisure, or something. It's important to always iterate. And then hopefully, with all that in mind, you've recruited the right folks, and they'll be able to provide the feedback that you're looking for.

Dan Berlin:

Right, and make sure they don't lie, because we want to make sure that we are getting the right people in the door.

Katelyn Thompson:

Yeah, there's definitely a lot of survey design in your screener.

Dan Berlin:

I want to go back to something that you mentioned earlier. It sounds like you've used a lot of different types of recruiters, whether it's someone using the phone and their panel, or cold calling, or an online service's panel. Are there quality differences that you've come across? It's something that comes up in the field a lot, and as you said, avoiding professional participants, expert participants. So I'm wondering if you ever find with the online panels that they are professional participants?

Katelyn Thompson:

Yeah, I do think it is always a question of quality versus price versus also speed. And one thing that I have learned, especially when it comes to unmoderated sessions, is that people who are well versed in doing unmoderated sessions are much better at doing unmoderated sessions. As opposed to if you send an unmoderated session link to your own users, who maybe aren't used to talking aloud; they're probably not going to be as great at giving feedback in an unmoderated session. With regards to the online panels for a scheduled interview, I do think that there are people, in both the online panels and the recruiting agencies' panels, who tend to do a lot of that work over and over. So I've included this question before and I've seen it a lot, where folks ask, "When was the last time that you participated in a research activity?" and screen out people who have participated in the last three months or something. I've also seen it where it's more like, "When was the last time you participated in something about travel?" So maybe they gave feedback on a Gillette product, but I'm asking about airline travel, so I'm less worried about them. I think it is something to consider. I have found most of the people that I've gotten tend to be more regular people, but every once in a while you will get that professional tester.

Dan Berlin:

Right. You gave a lot of great information here today. Was there anything else about your chapter that you were hoping to convey?

Katelyn Thompson:

Yeah, I think the only thing that I didn't really mention was ways that you can use screeners to gauge somebody's ability to provide feedback. We were just talking about participant quality. Sometimes, more so on the phone, you can ask people, "Tell me about a recent trip that you took," and if they can put words together in a way that makes sense and describe their trip, you can evaluate whether they would be a good participant for an interview. And you can do that on online screeners as well, using open-ended questions. I always try to say, "In two to three sentences, tell me about the last trip that you took, or about an upcoming trip." And the people who just say, "I'm going to New York City," right? It's like, meh, you couldn't follow the simple instructions. Then again, sometimes the topic isn't as exciting as travel, and so folks may have trouble coming up with the two to three sentences. But if you're planning to speak with them for an hour, then you'd hope that they could write a few sentences. So that's another soft-skill type of screener criterion that you could use. Maybe if you're using your own customer panel in particular, and not people who are used to doing these types of research activities, that's something that you could work into your screener.
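
One way this open-ended check could be operationalized is a light heuristic that flags very short answers for a human to review rather than auto-rejecting anyone. The sketch below is an assumption of mine, not something described in the episode, and the word-count threshold is arbitrary.

```python
# Illustration only: flag thin open-ended screener answers for manual review.
# The threshold and the flag-don't-reject approach are assumptions, not from the episode.

def needs_review(answer: str, min_words: int = 12) -> bool:
    """True if the open-ended answer looks too thin to judge on its own."""
    return len(answer.split()) < min_words

print(needs_review("I'm going to New York City."))                            # True
print(needs_review("We're flying to Denver in June to visit my sister, "
                   "then renting a car and driving through the mountains."))  # False
```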

Dan Berlin:

With that, though, and I love doing that too, I have it in all of my screeners, the question that always comes up is: how do we avoid biasing who we're recruiting because of that? Maybe English isn't the first language of the respondent. Is there something we could be doing or thinking about there?

Katelyn Thompson:

Yeah, I think that's an interesting point, and I think communication skill is an interesting thing. I would say that thinking about grammatically correct answers is maybe less important. But if the person is having trouble stringing together... I do think the online survey with a written response does put people at a slight disadvantage who may be better at communicating orally. And if that's the case, if you are looking to do research with a population like that, then maybe you're better off having people fill out the online survey but then doing an added phone screen, or an intro session, to gauge more of their communication level. I've done that before, where it's a 15-minute intro before the interview, just to confirm that folks will be able to provide useful feedback, especially if you have high-stakes observers. I know at some places I've worked, we had customer panels where 40 people would be observing, and we want to make sure that these people are going to be good participants. So having that pre-work activity or meet-up could be helpful.

Dan Berlin:

So we're just about out of time here. And you gave us a lot of great information. So thanks for all of that. In the final segment here, we love getting a career tip. Is there a career tip you'd like to pass on to folks who are either starting off in UX or are advanced practitioners?

Katelyn Thompson:

Yeah, I think something that I would encourage folks to learn more about is the domain that they're working in, and the systems that their users are using. So we are often very interested in our users and how they interact with our products. But I think understanding how your product works a little bit on the back end, allows you to provide better recommendations about how you can meet the needs of your users and better implement the feedback that you're recommending. One example of this was we had a section on our app that was called Nearby. And given the context of the rest of the page, a lot of users thought that this was things that were nearby to the destination that they had just been looking at. But in fact, it was things that were nearby to where they currently were. And after sharing that feedback, my recommendation was to update the content in that section to have it be things nearby to their destination. But we were about to launch. We didn't have time. And so the answer was basically no. But the recommendation that I came back with was, Oh, can we just change the title of the section to say, Nearby to You Now or your city. And that was easy to do. And we were able to implement that. So just an example of how understanding your system allows you to better provide recommendations and work with your stakeholders.

Dan Berlin:

Yeah, knowing the implications of your recommendations is huge.

Katelyn Thompson:

Exactly. Yeah. And oftentimes I'll provide one: this would be amazing if we could do it, but if we can't, here's maybe another thing that we could do. So stating the problem and then providing different recommendations.

Dan Berlin:

And by showing that progression, it shows your understanding of the entire system and helps with buy-in as well. Yep. So great. We are out of time here. You gave us a lot of great information today, so thanks for joining the podcast.

Katelyn Thompson:

Yeah, it was great chatting with you, Dan. Thanks so much.

Dan Berlin:

My guest today has been Katelyn Thompson, who wrote the chapter The Right Screener Sets Up Your Recruit and Research For Success. Hope you enjoyed listening. Thanks, everyone. You've been listening to the 97 UX Things podcast, a companion to the book 97 Things Every UX Practitioner Should Know, published by O'Reilly and available at your local bookshop. All book royalties go to UX nonprofits, as do any funds raised by this podcast. The theme music is Moisturize the Situation by Consider the Source. And I'm your host and book editor, Dan Berlin. Please remember to find the needs in your community and fill them with your best work. Thanks for listening.