On this episode of the Boagworld Show, we look at testing your user interface and how to integrate it into your design process.
Links Mentioned in Show
- Rocket Surgery Made Easy
- How to Get Started With Usability Testing
- Testing design: How do you test a design comp?
- Usability Hub
- Alice’s Website
Paul: On this episode of the Boagworld Show we look at testing your user interfaces, and how to integrate testing into your design process. This season the podcast is sponsored by Balsamiq and Fullstory.
Hello, and welcome to the Boagworld Show, the podcast for all aspects of digital design, development and strategy. My name is Paul Boag and joining me on this week's show is Marcus Lillington. Well, I've drunk half a bottle of cider and already it's gone straight to my head. Nobody cares what …
Marcus: Nobody cares about me.
Paul: We've also got Alice on the show. Alice Herbison, is that how you say your surname?
Alice: Yeah, that's right.
Paul: So, nice to have you on the show, Alice. Tell us a little bit about yourself. You sound distinctly Scottish. I'm gonna go ahead and make that guess. Is that right?
Alice: Yeah, that's right. I'm from Edinburgh.
Paul: Ahhh, the nice bit of Scotland.
Alice: It's very nice, yeah.
Paul: Well, you know, I'm just saying that some bits aren't as nice as Edinburgh. I like Edinburgh. It's got a nice castle and stuff, which you know, is good.
Alice: Yeah, for sure. It's really pretty. Very lucky to be here.
Paul: Have you been in Edinburgh your whole life, or, did you move there? Or what's your background?
Alice: Yeah, pretty much. I went to university in St. Andrews, so I didn't make it very far and rolled back down the coast again. I work as a user experience designer at a digital agency called Signal. I've been there for just over three years now.
Paul: Wow. So is this your first job after St. Andrews or did you do a few other things in between?
Alice: This is my first job after that, yeah, and I completed a master's program in 2013 and have been at Signal since then.
Paul: Now, I had the privilege of doing some work with St. Andrews a while back, and I don't remember them doing a Masters in User Experience Design.
Alice: No, no, no. When were you involved with them?
Paul: That was a few years ago now. It must have been, I would have thought, two or three years ago?
Alice: Oh, well, so my program was Human/Computer Interaction and I was on the first year that the course ran, and it was pretty new. There were only a few of us that completed the program that year. Obviously, I think it's grown since then.
Paul: See now, immediately I'm suffering from imposter syndrome at this moment, because you've got a masters in human/computer interaction, and I did an art degree.
Alice: Oh, God, no.
Marcus: At least you did a degree, Paul.
Paul: Well, yeah. I mean, compared to you Marcus. Alice is genius level, isn't she?
Marcus: Yeah. [inaudible 00:03:31], I'm just currently in that depressed mode of staying at a hotel next to an airport. It's sucking the life out of me as we speak.
Paul: Where are you then, Marcus?
Marcus: I'm at Gatwick.
Paul: Ohhhhh, exciting. Are you flying anywhere or is it just that you're, 'cuz it's even worse if you just happen to be staying at a hotel and you're not going anywhere nice.
Marcus: No, I am going somewhere nice, that's the good bit. I'm going to work, but I'm going to fly to Geneva tomorrow morning, so I'll be getting on the train to go down next to Lake Geneva, all the way to Vevey to see the nice people at Nestle.
Paul: Ahhh, that sounds nice 'cuz they've got lovely offices, too, haven't they?
Marcus: They have. Yes, next to the lake looking at the mountains. As you would expect Nestle to have.
Marcus: So this is depressing, but tomorrow will be nice.
Paul: Well, there you go.
Marcus: They've got a new brand and they're wanting us to apply it to their website. It's great.
Paul: Kind of just slap it on. Is that it?
Marcus: Pretty much, yeah. I shouldn't say that, talking about, oop, watch me say that.
Marcus: No, no, Paul. We will spend many days working …
Paul: Doing lots of …
Marcus: … to slap it on there.
Paul: You are the plasterer of the web design world.
Marcus: Anyway, so that's me. I'll just fizzle out into the gray carpet next to the gray wall.
Paul: Ahhh, I'm so sad. It makes me weep. So what kind of work do you get to do, Alice? You don't need to name clients, unless you want to, but what kind of work do you get to do on a day-to-day basis? Any particular sectors or is it just a real mix of stuff?
Alice: Yeah, it's a real mix. It can be anything from commercial clients to charities to the financial sector. The client list is quite broad, actually.
Paul: Oooh, that's nice. I like having a good mix.
Alice: Yeah, it's amazing.
Marcus: So do I.
Paul: It's funny, I don't know about you, but I like a mix on one hand, and I like the variety and that kind of stuff. But there's a little bit of me that likes the idea of working in an organization as well, and actually taking something through to completion. Can you ever imagine yourself going in-house?
Alice: Actually, yeah. It's quite interesting, 'cuz obviously an agency is so varied and so fast-paced. I would really love, yeah, at some point in my career to go to the other side and really … I guess it's a stronger sense of ownership as well, isn't it? 'Cuz you're really embedded. You need the right kind of organization, though. I think it's got to be something that really has a very strong in-house digital side of things.
Paul: Yeah, otherwise you live in a constant state of frustration, as many of our clients do. You know?
Alice: Yeah, I know what you mean. Yeah.
Paul: Which is never a good sign, is it? You never want to be in that kind of situation.
Paul: Marcus, I never asked you that. Does that ever appeal to you? The idea of working in-house?
Marcus: Weirdly, yes it does. It's kind of one of those things that it's not necessarily from a work point of view, or is it? 'Cuz obviously I'm involved in a lot of kind of, getting business in type of stuff. And I think, "Well, wouldn't it be nice not to have to do that?"
Paul: Yeah, I know what you mean. Yeah.
Marcus: Not to have to, let's say, run an agency. Not on my own, obviously, but there's a lot of responsibility there. So, to go somewhere where you're kind of fed stuff, it might not have the kind of variety that agency life has got. I certainly wouldn't have to go to Switzerland (I didn't know about this 'til Monday, which is exciting obviously). Having those kinds of responsibilities taken away so you can focus on just fewer things, that would be really nice. It's not going to happen, but it would be nice.
Paul: Yeah, it's that focus, isn't it? Although it depends where you end up working, 'cuz I've worked with a lot of in-house teams dragged in all kinds of different directions. But that idea of, your job is just to improve the user experience on this app, really appeals to me.
Marcus: I love the idea of writing, like it's your job, like you're kind of like a copywriter. Maybe you were given the kind of job to develop a content strategy and do all of that. We were talking last week about design systems, the tone of voice stuff. That would be great! I never get to do anything like that.
Paul: Aww…you're so [inaudible 00:08:29].
Marcus: But, I'm not. But I really am not [inaudible 00:08:33], so. The grass is greener, isn't it?
Marcus: I wish I was flying round the world doing stuff that was maybe more responsive.
Paul: I wish I could be spending my evenings in gray hotel rooms recording podcasts.
Marcus: Well, I'm going to be really [inaudible 00:08:52] now. I spoke at a conference today, and that's a rare thing for me. Purely because Chris couldn't do it and landed it on me last Friday.
Paul: Ah, nice.
Marcus: Thanks, Chris. Yeah, so I did his talk, and it went all right. But I was going to stay in London again and then go to Heathrow in the morning and get a British Airways flight at a sensible time to Geneva, but they were 1,500 pounds each.
Marcus: There's an easyJet flight from Gatwick, though, that leaves at 6 a.m.
Paul: Ouch! Not nice.
Marcus: So that's the only reason I'm here.
Paul: So, Alice. What would your perfect job be? Or are you already doing it?
Alice: Gosh. Um…
Paul: That's alright. I just thought I'd drop that one on you.
Marcus: Are the agency owners listening?
Alice: No, oh gosh, you know, that's a whole question. Um, I think when …
Paul: No, I'm not expecting you to necessarily answer it, don't worry. Go for it.
Alice: My imagined job? Like, really good colleagues. A collaborative working environment, which I have got at the moment, so that's really great. Something maybe in the public sector, I think that would be nice. And yeah, in-house is definitely sometimes appealing. Although I really love the fact that I get insight into different industries and different areas at the moment. The kind of detail you can get into, you know, it's amazing.
Paul: You find yourself learning about all kinds of things that you never thought you'd know about, you know?
Alice: Yeah. Absolutely.
Paul: But I never expected to know about American law firms, but somewhere along the line I had to do that. If you go back a very long way, I once designed a website for a chicken incinerator plant, so I learned about chickens being incinerated. Now there's a weird thing to learn about. So yeah, that's the weird world we live in, really.
Alice: I know, yeah. It's amazing the stuff we find out and we learn just meeting people from different job backgrounds and stuff as well.
Paul: Yeah. So tell people about where you work at the moment. What's the name of company? How big is it? You know, that kind of thing.
Alice: Yeah, so it's called Signal, which is a fairly recent rebranding. It's part of a larger, I guess, group of agencies, kind of umbrella group. And a couple of years ago, maybe a year ago, several of them under that umbrella kind of merged and became Signal. So previously the one I was at was called Blonde, which did digital delivery and design. And a couple of the sister agencies are CRM kind of things, and one of the ones that's, I guess, still part of the group alongside Signal is the Leith Agency, which is an advertising agency. In Scotland, they do quite a lot of the Scottish campaigns for stuff. And Signal I guess, in its merged state now, is about 200 people.
Alice: Yeah, so between three locations. Having said that, we're now one unified agency having come together from a few of them, but we're really still in the same teams. We're the digital build aspect of it. A few things have changed, but it's just a bit bigger now. A bigger culture.
Paul: So, how, your particular part of it. How many people are in the digital build part?
Alice: Roughly, I want to say like maybe 40-50.
Paul: OK. Yeah.
Alice: That's probably …
Paul: For a digital agency, that's a reasonable size, isn't it? It's a nice mix of people, but still small enough that you can know most people. Without kind of going, "Who are you? You've been working here for 4 years?"
Alice: Yeah. No, absolutely. It's definitely, it's a good size.
Paul: Cool. Well, I mean, what we're intending to talk about today, dearest listener, is testing, as you would have heard at the start of the show. So we're going to get into that in a little bit. User testing. I'm quite interested to hear how Alice's company does it, and her kind of opinions on it. But before we get into that, I just want to quickly talk about Balsamiq, who has been our sponsor through this season. And thank you very much to them for doing that. So Balsamiq is an easy-to-use prototyping tool that's been around forever. I personally used to use it a huge amount, but kind of over time, I stopped. I think there's quite a lot of people that have stopped, probably as more fancy prototyping tools came along. The likes of Sketch and InVision, and things like that. And actually, it's interesting that Sketch has just introduced prototyping features into its latest version, version 49 I think it is.
So we've kind of, as digital professionals, we've got these great tools for us to use these days. But what about everybody else? What about stakeholders, or clients, or, you know, your boss, to express their ideas? There's no way they're gonna sit down and learn something like Sketch or InVision. And that is where something like Balsamiq comes in, because most prototyping tools are, well, not really hard to use, they're fine once you get to know them, but they're aimed at professional designers. But for a lot of people like product managers, business owners, consultants, programmers, those kinds of people, they need a really simple prototyping tool that they can just pick up as easily as picking up pen and paper, and that is Balsamiq.
Balsamiq wireframes has really kind of democratized the design process, allowing non-designers to participate in it and then express their ideas. And some designers might have a problem with that because they like to have all of the ideas themselves, but when you grow up and realize that that's not how the real world works, you know, if you don't include people, if you don't collaborate with people, then they're gonna reject your designs and you're gonna get into conflicts. You really want to involve them, and Balsamiq is the tool that will enable you to do that. So you get a 30-day free trial of it, but if you then decide to sign up, when you enter your billing information, make sure you use the code BOAG, 'cuz that will give you an additional three months for free, so it's definitely worth doing. You can find out more about them at Balsamiq.cloud. So that's Balsamiq.
OK, so I wanted to talk a little bit about kind of testing, and testing of designs and user interfaces. We're very much looking at the fundamentals of user interface design, and obviously testing is a huge part of that. I wanted to start off by looking at, really, two types of testing, 'cuz all testing basically boils down to these two types. There's qualitative and quantitive testing. And so I'm just kind of really interested, from both Headscape's point of view, and I've already forgotten, Signal's point of view, sorry. Pint of cider, I've forgotten everything. How you guys go about approaching these things? So, um, Alice, why don't you kick us off. I mean, do you do much kind of quantitive testing in terms of kind of, using hard data or, do you prefer the more qualitative usability testing type approach?
Alice: We actually do a wee bit of quantitative. I was thinking about what that means in terms of the actual data side of it, and I think practically, the main tool that we use is something called Hotjar.
Paul: Oh yeah.
Alice: Yeah, so that's probably the best example, which will give you things like statistics around page clicks and scroll depth and stuff like that out of a sample of site users, and it does also do qualitative stuff as well. It has wonderful screen recordings, which are really great to watch back. So that's a tool we definitely use. And A/B testing at times; it's something my colleagues do. It's not something I've ever personally set up on a website, but I know that for some financial services clients that's been a tool that's been used and has shown some interesting things around what color a button should be to get more conversions. We do a little bit of tree testing by using Treejack.
Paul: Oh, yeah.
Alice: So that's actually, I really love that, because if you can configure this in the right way with your labels and your sections and all that kind of stuff, it gives really good statistics back, like the percentage of people who succeeded in finding what you wanted them to find. And that's great, because it can be remote as well, so you can just send it off to your participants, put it in front of a bunch of people and get feedback back, so it doesn't have to be in person. Hotjar is the same, obviously.
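The success statistics Alice describes are simple to compute from raw tree-test results. A minimal sketch in Python (the records, field names, and task IDs here are all invented for illustration; a tool like Treejack computes these for you):

```python
# Hypothetical tree-test results: one record per participant per task.
# "destination" is the tree node the participant ended on; "direct" means
# they reached it without backtracking up the tree.
results = [
    {"participant": "p1", "task": "find-opening-hours", "destination": "visit/hours", "direct": True},
    {"participant": "p2", "task": "find-opening-hours", "destination": "visit/hours", "direct": False},
    {"participant": "p3", "task": "find-opening-hours", "destination": "shop/tickets", "direct": False},
]

CORRECT = {"find-opening-hours": "visit/hours"}  # expected node per task

def success_rate(task):
    """Return (overall success rate, direct success rate) for one task."""
    attempts = [r for r in results if r["task"] == task]
    succeeded = [r for r in attempts if r["destination"] == CORRECT[task]]
    direct = [r for r in succeeded if r["direct"]]
    return len(succeeded) / len(attempts), len(direct) / len(attempts)

overall, direct = success_rate("find-opening-hours")
print(f"success: {overall:.0%}, direct: {direct:.0%}")  # success: 67%, direct: 33%
```

The "direct" rate matters because a participant who wandered before finding the right node still counts as a success, but signals a weaker information scent.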
Paul: I mean, that's a good thing about quantitative testing, isn't it? You know, you can get feedback in large numbers because you can do it in this kind of remote way and over distance. It's like card sorting. You can do some remote card sorting, which is the same thing really as the tree testing you were talking about. Marcus, I mean obviously Chris is the guy for quantitative testing, isn't he? I mean, he does all kinds of stuff with Google Analytics. It's like witchcraft, isn't it?
Marcus: I've used Treejack, but not for ages. The only thing I can add to that is we've done some design testing, kind of at the early stages, when you're testing mood and things like that, where we've done some online testing, sort of getting as many as hundreds of people to respond over a few days to a test. Where we will kind of ask questions associated with a design that they're looking at, so you can say, a classic example would be, "Does this design make you feel happy, sad, violent, whatever," and you can get them to click on those words. So therefore, you are getting numbers on those particular types of words. Obviously that was a silly example, but you might be sort of on a balance on whether a design should be just a little bit that way, or a little bit this way. And doing that kind of test can help with that. I remember we did some work with Butterfly Conservation [inaudible 00:20:14].
We're now redesigning that site again, but I remember we did some testing for them where we did that, and it just basically helped us to know that we were doing the right things. We were making the right decisions, which was really helpful at the time. Yeah, to add to it, that's the only thing that we've really done that's quantitative on the design testing front.
Paul: That's particularly good for testing aesthetics. When you're doing things like tree testing or, you know, time to complete tasks, that kind of thing, that's obviously usability focused, but if you're trying to test visual aesthetics, then what Marcus described has got a fancy name: semantic differential testing.
Marcus: Really? I didn't know that.
Paul: It's a stupid name, I hate these words that people make up, but basically what it's saying is, you kick off your project by deciding, well, what kind of words do we want to communicate, right? Do we want it to feel progressive? Or dynamic? Or simple, or whatever. So you take those, then you take their opposites, right? If you want it to be progressive, then the opposite might be conservative, for example. So you take the opposite words, then you also throw into the mix some random words, which when you look at the design you think, "Ah, I wonder whether this is going to come across as a bit busy," so you add the word "busy" in. It's not a word that you want, but it's one that you think it might come across as. So you mix all of these together, put 'em in a survey and shove it online.
And, of course, the big thing when you're testing something like design aesthetics is you want … with usability testing, most people suffer from the same struggles when it comes to usability, unless you're trying to design something for a very young audience or a very old audience, or a peculiar audience. But when it comes to aesthetics, you really need the right kind of people, 'cuz people will respond very differently. And you need numbers to get rid of all of the little wrinkles. You can't just send it to six people, and that's where a survey works really well.
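The semantic differential survey Paul outlines boils down to a simple tally: pick target words, their opposites, and a few wildcard words, then count how often respondents pick each one. A rough sketch (the word lists and responses are made up for illustration):

```python
from collections import Counter

# Words chosen at project kick-off: the qualities you want the design to
# communicate, their opposites, and "wildcard" words you suspect it might
# accidentally convey (e.g. "busy").
targets = {"progressive", "dynamic", "simple"}
opposites = {"conservative", "static", "complicated"}
wildcards = {"busy"}

# Each respondent clicks the words the design evokes for them.
responses = [
    {"progressive", "simple"},
    {"dynamic", "busy"},
    {"conservative", "busy"},
    {"progressive", "dynamic"},
]

# Tally how many respondents picked each word.
counts = Counter(word for r in responses for word in r)
for group, words in [("target", targets), ("opposite", opposites), ("wildcard", wildcards)]:
    for w in sorted(words):
        print(f"{group:8} {w:12} {counts[w]}")
```

If an opposite or wildcard word scores close to its target, the design isn't communicating what you intended, which is exactly the signal Paul says you need numbers to see.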
Marcus: Yeah. It's a kind of connotation tool, I often think of it as. 'Cuz normally, you kind of know …
Paul: It's particularly good in those situations …sorry, go on Marcus.
Marcus: I said, if you've got 500 people validating your decisions, then that does not help [inaudible 00:22:40].
Paul: Yeah, yeah. Because you get into that situation where a client personally doesn't like it very much. I mean, do you remember back in the day when we designed the University of Portsmouth website?
Marcus: Oh yeah! The purple site.
Paul: Yeah, this was back in the MySpace generation. You know?
Marcus: 2003 is what it was, Paul.
Paul: [inaudible 00:23:01]. So I designed that site, and I hated it with a passion. I was the one that designed it, but I hated it. The client hated it, but the audience loved it, you know. That's where design testing can be really useful. Have you ever done anything like that, Alice? That kind of aesthetic design testing, or is that not the kind of thing you do as much?
Alice: Not in any of the ways you've just described, sort of with those words and thoughts and feelings.
Paul: Oh, OK.
Alice: I guess it's more feedback during usability testing for the qualitative kind of side of things that would maybe reveal difficulties that aesthetics are causing.
Paul: It's a very tricky one to test, really. Testing aesthetics over usability. With usability, you can test things like time to complete task, or first click analysis, you know. If the first click they make is the right one, then they've got a good chance of completing, that kind of stuff. While aesthetics is all a bit woollier, really. So it can be a tricky one to get right, I think.
Alice: Yeah, definitely.
Marcus: Much more up my street.
Paul: You like wooly, do you?
Marcus: Well, I like aesthetics more than the good hard usability type stuff. [inaudible 00:24:30].
Paul: Alice, have you done much in the way of usability testing? And if so, how'd you go about doing that? 'Cuz there's no right approach. Everybody does it slightly differently, so I'm quite interested in how you guys approach it.
Alice: No, for sure, I've done quite a bit of that actually. It depends what stage of the project you're wanting to bring that in at, and the budget. And whether it's looking at benchmarking, maybe, if it's a site redesign, benchmarking the current one. We would also bring in other qualitative research at that stage – interviews, discussions, that kind of thing with maybe current site users or certain audiences whose needs we want the site to meet. And then we would use usability testing; only recently we used it at a few different stages actually. Currently there is something going on with an InVision prototype, so that's going out to testers.
I ran a research study using a live version of a website. That site had been put together in phases, so different sections were done at different times, and it was quite incremental, so obviously as we got further on, you're seeing things you didn't realize at the start. And like, "Oh, we could really improve this," so we ran quite an extensive set of usability tests actually. We had around five groups, five audiences that used the site in different ways and interacted with that organization in different ways. There were physical visitors who would be checking in online and then going in person. And there were people who were very much involved with the organization at a community level, so they had really in-depth knowledge, and obviously had very precise tasks that they needed to complete on the site. And each group had maybe a certain section that they would interact with. And we built tasks around them, so we got to know them and the kind of things they need to do on the site, and then let them go and try and do that.
So, that was good. That was around 30 people overall.
Paul: Did you do them individually, or did you do them as groups?
Alice: It was individual, so it was just me and the person each time. It took a wee while.
Paul: Yeah, I was gonna say, with 30 people. Yeah. Did you record the sessions or were you taking notes as you go along, 'cuz it's often quite hard when you're by yourself and you're trying to facilitate it and kind of keep a note of what's happening as well.
Alice: Yeah, for sure. It was set up so that there could have been someone sort of logging in. I'm not sure if you've heard of a tool called Lookback.
Paul: Yes. Oh, I love Lookback. I love it.
Alice: Yeah, so obviously you can have people joining, but I think just because of the time, it was just me running the tests. So what I ended up doing, for my benefit as well, was to not push myself during the sessions to take written notes, because I can only do one thing at once.
Paul: Yeah, absolutely.
Alice: And I struggle to sort of listen, and I like to be engaged with them and to really get in and understand maybe the problems that they're having and the reasoning behind that. So I set up a few different ways of recording. We'd be recording using Lookback, which will grab the screen as well, which is amazing. I think I used Lookback for desktop and laptop devices, but then we were also testing on tablets and mobile devices, so I kind of fashioned … I'm not sure where it came from, I think it was actually a tool from Kickstarter or something: a stand that was meant to have its own camera, which you plug into your computer, and it clips onto the stand, so it films the user's hands as well.
Paul: Oooh, that's good.
Alice: Which is really good in theory, but it was one of those moments where I couldn't quite get it to work, so what I ended up doing was taping my phone, or one of the testing phones, to the top of this little stand. And we are laughing, but it actually worked perfectly, because you could get the user's hands in the frame, you could hear them talking, and it was videoing the screen of what they were doing. And it was amazing. So once it's all recorded and stuff, just to make sure with the project timings and things that there's enough time for analysis, I would sit and listen and watch back every session and make notes from them. I know it sounds like a lot, but I mean, I'd lived through it already, so it's not like you're having to really go in-depth. You can skip bits and make a note of times I knew I wanted to return to in particular.
I find that a lot easier, 'cuz then once you've written down insights, you use Trello to gather all of them, and every single little insight becomes a card, and then you can tag it with your participant and with one of the labels. It ended up being this amazing board with all these different notes on it tagged with different labels, and then you can see how many people are having the same usability issue.
Paul: Yeah! That's good! I like that. I like the idea of using Trello. I never thought of that. That's very good.
Alice: It's not something we came up with, I must say. It was a template that is available online somewhere. I can't remember, I'm sorry, who did it, but I am absolutely [inaudible 00:30:45]. It's really good, really good for that.
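The Trello workflow Alice describes, one card per insight tagged with a participant and a label, ultimately answers one question: how many distinct people hit each issue? The same aggregation can be sketched in plain Python (the card data and label names are invented):

```python
# Each card is one observed insight, tagged with the participant and an
# issue label, mirroring the Trello board Alice describes.
cards = [
    {"participant": "p1", "label": "checkout-confusing"},
    {"participant": "p2", "label": "checkout-confusing"},
    {"participant": "p2", "label": "nav-too-deep"},
    {"participant": "p3", "label": "checkout-confusing"},
]

def participants_per_issue(cards):
    """Count distinct participants affected by each issue label."""
    issues = {}
    for card in cards:
        issues.setdefault(card["label"], set()).add(card["participant"])
    return {label: len(people) for label, people in issues.items()}

print(participants_per_issue(cards))
# {'checkout-confusing': 3, 'nav-too-deep': 1}
```

Counting distinct participants (rather than raw cards) matters because one person hitting a problem repeatedly is a different signal from three people each hitting it once.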
Paul: Cool. The other thing I really like, which again I haven't done, is the actual filming. Whenever I've done mobile testing, I've always done it with a screen recorder that records the screen. But the idea of filming and showing someone's hand, because you can see whether they hovered over things and that kind of stuff. That's good.
Alice: Yeah, it's really amazing. And it's really good 'cuz you can, you've got such strong evidence for something if something is causing problems. And you're suddenly seeing something through their eyes as well.
Paul: Absolutely! I mean, that's the great thing, isn't it, about usability testing? Quantitative is great because it gives you that hard data; it's great for justifying things and it's great for showing you where problems are in the process. But nothing beats that qualitative sitting down with someone and seeing their struggles and their frustrations and their annoyances, and that kind of dithering and being unsure. And being able to then ask them, "What are you thinking? Why did you pause there?" All of that kind of stuff. It's just so good, isn't it?
Alice: It's really, really good. I think, yeah, you can get some great insights that you can play back to people as well. I'd also say it's nice just to be involved with; it's a very human thing to do, usability testing. Like, it's such a nice experience. It's quite interesting, the kind of feedback you get. Like, there's a difference between participants who've come through a recruitment agency versus ones that have more hands-on, in-depth knowledge of the organization and site you're testing. And that can be quite interesting as well.
Paul: You use recruitment agencies, I gather from that?
Alice: Yeah, we do. We use, obviously, screeners and stuff to make sure that people fit the criteria. In general, when we reach out to agencies, it's for the kind of website audience where we know we can get, like, a very general visitor, if that makes sense. On the other side of that, some of the most valuable insights we've had come from the more niche audiences where we've reached out to people and spoken with them.
There was just this absolutely wonderful moment last year where we used the contact form submissions for the site, because it's quite a large organization, so there's people all over the world contacting them using the form. We got in touch with this man who was having some difficulty using the online shop. We read his message, and we're like, "This is so weird. From reading this we can't work out what's going wrong. We don't know if it's technical. We don't know if it's something in the interface that's causing problems," and we were able to contact him and get him on the phone.
Paul: Oh, Cool!
Alice: And it was really great. We sort of tried to get him on his computer at his end, talking through some stuff. We were looking at the site as well, trying to understand what he was seeing, so it was remote, but we were trying to see the same things at the same time.
Paul: So you didn't use Lookback for that, then? Because it can do remote viewing, can't it? Where you can see the screen at the same time.
Alice: It can. We didn't in that case. I think that would have made things a little bit more complicated 'cuz he was, in reality, a customer. I think he was on an older Windows machine, and he was a pensioner, so I think … he had good technical knowledge, but I wouldn't have wanted to complicate that by trying to send a Lookback link and things like that. But it was really great, because from that experience we shaped our framework for this kind of feedback and testing. I kind of helped develop it, and half of it was around spontaneous feedback loops, which is stuff like from contact forms and from staff and internal things.
And then on the other side is planned research, which can run concurrently, so that's things like usability testing, interviews and stuff like that. So it's like two sides of the same coin that can work together to get insights.
Marcus: That reminds me, Paul, of when you went and did some testing for the Wiltshire Farm Foods site.
Paul: Oh, when I went into peoples' homes?
Marcus: Yeah. Wasn't there something about … this may be folklore, but I'm sure I remember you saying something along the lines of, "somebody couldn't see the basket button because of the Post-it Note over the screen."
Marcus: I mean, that's a crackup.
Paul: That's the great thing. Alice, have you ever done in-home testing, where you've gone to people's homes?
Alice: Not homes. No. I've done it in the organization, like at their …
Paul: Right. OK, at their work desk.
Alice: … computers. Yeah.
Paul: Same kind of principle, isn't it really?
Paul: But I really like doing that because it just gives you incredible insights into the person, right? If you bring them into, like, a usability lab, then, other than what they say and what they do, you don't really know much. But if you go into someone's home, or even into their workplace where you can see their office desk, you get an idea of how tidy they are. You get an idea of what they like, what they don't like. The things that they're passionate about. You know, all of these different insights into them as a person. And that particular one was just hilarious. Because she was, again, an elderly audience, and a stereotypical cat woman, right? She was obviously widowed.
Marcus: Your favorite, Paul.
Paul: Oh, yeah, absolutely. She was obviously widowed, and she'd filled her life with cats. And so she had numerous cats in the house, and then every surface was covered with trinkets and knick-knacks of cat-related stuff. And so, you know, we had a chat for a bit, and I said, "Would you mind if we test, you know, have a look at the website and get some feedback on it?", and she said, "Oh, absolutely fine." So she takes me over to this desk that is absolutely covered with stuff, basically.
And she sits down and we wait for 10 minutes while her Windows 98 computer boots up. I might be exaggerating slightly for comic effect, but it was a bit like that. And then immediately, we started having problems because the amount of space on her desk was so small that she could barely move the mouse. She used to have to pick it up and kind of move it two centimeters, yeah, then pick it up and move it two centimeters, pick it up. It was just dreadful. And then, of course, as soon as she sat down, a cat jumped on her lap, so she was trying to do it one-handed.
And then she had this Post-it Note right over the top of the shopping basket. I mean, it was doomed to failure, really. But that's reality. That's how people use the internet, you know. You don't get that if you just bring people into a lab the whole time. It's good. Talking of which, Marcus, do you remember another Wiltshire Farm Foods one where the whole session had to be canceled? Rob told me this one, I wasn't actually there, 'cuz every person that they brought in had never used a mouse before.
Marcus: I do remember that.
Paul: They were all an elderly audience, right?
Marcus: All working off laptops, that's right. Yeah.
Paul: All using laptops, so they knew how to use a trackpad, but they didn't know how to use a mouse.
Alice: You know, you forget, don't you, there's joystick mice as well that can be common, so, yeah. Wow.
Paul: All kinds of things, isn't there? So, it's a really fun thing to do. I really enjoy it. I don't get to do it that much anymore, which is a bit of a shame. But 30, that's quite an impressive number to do 30 and then review 30. How long did you have to do that?
Marcus: Yeah, reviewing them. Ugh.
Alice: I mean, obviously, [inaudible 00:39:38] to the sessions, but I was jumping through them, especially as I knew sort of markers in the session, the bits in particular I wanted to review. I think the sessions were only 40 minutes each, because each audience had specific tasks that we wanted them to complete and observe. So yeah, listening to them back, maybe 45 minutes to an hour for each one.
Paul: Right. I always think 40 minutes is about as long as you can go for a session, 'cuz I think people get quite tired, don't they?
Paul: I also think 40 works well 'cuz then, if you do one every hour, you've got 20 minutes to take a break, go to the toilet, and make a few notes. You do need a bit of a break between sessions.
Alice: Yeah, absolutely. And you know, things can go wrong. Like people can be late, and you need to account for that as well. [inaudible 00:40:36] effect. Throughout the day, yeah.
Paul: Did you get any no-shows? People just not turning up?
Alice: We were actually very lucky. No one …
Paul: Oh! Crikey!
Alice: … didn't show up. There were a couple of people that were a wee bit late and a wee bit, kind of, rushed and emotional, but aside from that, I think everyone came on time. Yeah.
Marcus: That's great.
Paul: That's quite unusual. I usually have at least one or two drop out, and out of 30, you would have thought there'd be some. You were quite lucky there.
Alice: Yeah, no absolutely.
Marcus: I did want to say that I made it sound as if I wasn't approving of videoing these sessions, and that's certainly not the case. This is something I didn't do, but when Chris, who we mentioned earlier, did some testing, I think it was the University of Edinburgh actually, and Hull, two different universities, what came out of that was we were basically able to change the minds, or the mindset, of some very senior people within those universities, just 'cuz they really didn't believe what we were saying.
But if you show them actual students going, "This makes no sense," or "I have to do this, then I have to do that, then I have to do this, then I have to do that," then they suddenly, it was like the stars and fireworks bursting in the air. So yes, videoing sessions is fabulous. It's just, I try to take notes at the same time, 'cuz then I can say, "Note to that bit at 23 minutes, or whatever."
Paul: Yeah, that's quite a nice way of doing it, just to make note timestamps that you need to look at. I think you can do that in Lookback if I remember. I think you can click something or, I can't remember.
Alice: Yeah, you can. You're right. There's like wee markers you can put on the videos and stuff. Yeah.
Paul: The big question with that, with 30 hours worth of testing plus then reviewing, I mean, that's quite a whack for a client to swallow, cost-wise. Do you find that you struggle to get clients to approve the cost of testing, or do you just have a good client base that kind of understands the value?
Alice: I think that's a tricky question for me in particular to answer because I'm obviously quite early in my career. I'm obviously not quite at the stage where I'm discussing costs and pitches with clients yet. So, I think it's definitely a portion of the project that seems to fall to cost-cutting, and we just have to make sure; in that particular regard, that was a very precise research project, so it had its own luck in being able to be formed with a budget and planned out. In terms of normal projects, integrating testing into the actual design process is a wee bit different, and even if it's not something you can get signed off, we still try to incorporate any feedback as best as we can, because it's hallway testing. So guerrilla testing, like in a café or something as well. Just grabbing whoever's possible.
Marcus: Exactly. Yes. [inaudible 00:44:10] testing, that kind of thing.
Paul: Yeah, that works well.
Alice: It's surprising how much can come out of that, yeah.
Paul: Yeah. It's really valuable, just anything, anybody outside of the project, 'cuz they've got a fresh perspective and, you know, you can get too close to these things, really. I always say that you're better off doing rubbish testing than no testing. Some people disagree, "Oh, it could do more harm than good if you get false results," but as long as you know you're doing rubbish testing and take it with a pinch of salt, you know, I still think there's a lot to be learnt from that. You know? And like you say, hallway testing, where you just grab anybody you can to have a look at it. It's worth doing.
And also, there are some great services out there which will basically recruit a tester for you and get you results in like an hour for $30 to $40 per user. Have you ever used anything like that?
Alice: No, we've not actually, but I know what you mean. It feels a bit dodgy, doesn't it?
Paul: Yeah, I know what …
Marcus: We've done it.
Paul: Yeah, I'm sure it is a bit dodgy. Well, it is a bit dodgy in some senses, but on the other hand, it's better than nothing, you know what I mean?
Paul: So you've done it, Marcus?
Paul: I didn't know that.
Marcus: Yeah, I'm really struggling to remember the name of the service.
Paul: Well, there's User Testing.
Marcus: Yeah, usertesting.com. That's what we used.
Alice: Is it actually?
Paul: Yeah, so it's called User Testing. There's another one called WhatUsersDo, and another one called UserZoom. In fact, I'm speaking at the UserZoom conference; probably by the time this comes out I'll be doing that. Yeah, these are really good tools for that kind of quick testing, because the problem with hallway testing, as they call it, where you just grab a colleague or whatever, is that if you work in-house, that person is going to know the organization and so they're gonna kind of know how the organization thinks.
And if you work in an agency, it's going to be someone who's pretty digitally literate. I'm not saying you shouldn't do it, you know, I think it's better than nothing, but something like usertesting.com or UserZoom is one step up from that, 'cuz at least with those kinds of things, you know, it's someone a bit more removed. Even though you can't maybe question them quite as much. And of course, going back to Lookback as well, you can do unfacilitated testing with that, can't you, where you can just set a task and people can do it in their own time?
Alice: Yeah, I think you can. Yes, that's remotely. They can just, the computer will record it for you, yeah.
Paul: So that's quite good, 'cuz then what you can do is take the URL and send it out to the Twitter followers of that particular company or their Facebook group and say, "Hey, give this a go." Or even put a banner on the existing website. All kinds of things you can do.
Alice: Yeah, exactly. That's a really great way to get participants, I think. Like, I think really for us, people who have taken the time out of their day to fill out that contact form …
Paul: Um, that's a good one.
Alice: … tells us something. Yeah, like, "Hey, I can't do this on your website. I can't find x, y, z." It's the most valuable user feedback because someone has actually participated. It's like they've opted in to take part already, and so we make sure it's OK to maybe send them an email and say, "Hey, we'd love to get to the bottom of the problem you're having," and "Do you want to take part in anything?".
Paul: Yeah, that actually …
Marcus: That's perfect.
Paul: Again, that's not something I'd thought of doing in that way. But yeah, that's a really good approach to it. Do you actually pay participants to help you out, or do you give them a gift voucher or anything like that?
Alice: Um, absolutely. It's paid or it's a gift voucher. I think we've used Amazon vouchers in the past, but …
Paul: Yeah, they always go down well.
Alice: … money. Yeah, absolutely. And that will be, regardless of where we've sourced the participant from, whether it's someone we've reached out to from those contact channels, or, I think we did actually get someone from social media once. Or if it's through the recruitment agency, that's all paid. Yeah.
Paul: I think that's really important to do, because people's time is valuable, isn't it?
Paul: And also, I think to some degree that reduces the number of no-shows, and it also, I don't know, it makes it clear that they are valued and appreciated, and I think that's a good basis to start testing from, because the last thing you want is for people to feel like they're going into some kind of exam where they have to justify themselves, when actually it's the other way round. And that's so important, isn't it? That bit at the beginning of usability testing where you explain, we're not testing you, we're testing the website or mobile app or whatever else.
Alice: Yeah. It's true, and I think that it definitely helps for them 'cuz if they've had some previous interaction with the company, they may be a little bit invested in it, like it's nice to feel that the company's giving back to them as a user for their feedback and time.
Paul: OK, I think we'll wrap it up there. I want to talk quickly about our second sponsor and then I'll share with you some additional resources that can get you going on doing this kind of testing that we've been talking about today. And interestingly, the sponsor, earlier on Alice mentioned Hotjar to record sessions and get data. Alice, you've got to dump Hotjar, I'm sorry.
Alice: Oh, no! I know you use a different one; your sponsor's different, isn't it?
Paul: I use Fullstory, yeah. And I'm now going to indoctrinate you into Fullstory and why you should use that over Hotjar. Because I absolutely love Fullstory! So the big difference, right, is that what Fullstory is doing is actually recording the DOM. The whole of the Document Object Model, so the videos that you watch back as sessions aren't screen recordings; they are actually everything that's going on – all the code that's being run, everything that's happening. Now what that means is that you can add one small script to the website and then that's it. There's no need to kind of tag individual things or data that you want to get back. And it records every event. Every click, every swipe, every scroll; all the text is instantly indexed and searchable. So it means that you can go in, and you can even type CSS into it if you want. You could say, "Show me everybody who interacted with this CSS class," for example. So you can do all kinds of incredibly, incredibly powerful things.
And it's also great; so for example, you were talking about people that contacted you via a Contact Us form. What you can do with Fullstory when they do that is actually embed a link in the contact form, which is a recording of their session, so you …
Paul: … I know. Yeah, yeah, yeah. Exactly! So you can watch back their session, so that guy you were talking about you couldn't quite figure out what was going wrong, you would actually get a link to his session and be able to watch that back and see where it's going wrong before you even talk to him. You know?
Alice: Oh, wow! Yeah, that sounds; the only problem we'd have with that is privacy and [inaudible 00:52:02] stuff.
Alice: Yeah, aside from that, yeah.
Paul: For example, at the moment, I'm running it on the homepage of Boagworld because I'm trying to work out a couple of usability problems. I'm only running it on that page, not across the whole site, because it's only that page I'm interested in. And that means that I don't have to pay for it 'cuz it's under 1,000 sessions per month. Not that I'm tight, you understand, but yeah. I'm a solo entrepreneur, we're struggling to make ends meet every day, so you know, we can't waste money.
Alice: That does sound good.
Marcus: [inaudible 00:53:28]
Paul: It does, see? See? I told you. So to find out more, you can go to fullstory.com/boag. So there we go. Alice is sold, so you have to be, too.
Alice: I am sold, yeah.
Paul: I wanted to mention some further reading for … usability testing like this, it feels like a luxury. It's often the first thing to get cut. It maybe feels a bit difficult at times, too time-consuming, but it doesn't need to be any of those things, and it will really revolutionize your approach to design, so if you're not doing it, get started. If you need convincing, then what you want to do is get a book called, Don't Make Me Think by Steve Krug.
Alice: Oh, yeah. That's a good one.
Paul: Pretty good book, isn't it?
Paul: If you're convinced, but want advice on how to start, you've got several options. Option number one is to go to Boag.world/rocketsurgery, which is a link to Steve Krug's follow-up book, Rocket Surgery Made Easy. That takes you step by step through how to integrate usability testing into your everyday workflow.
Alternatively, if, like me, you're too tight to buy Rocket Surgery (although I did buy that one, actually), go to boagworld.com/usability/usability-testing, where I've written an introduction to get you started with everything you need to know about usability testing.
If you want to test designs or aesthetics like Marcus was talking about earlier, then you want to go to boagworld.com/design/testing-design, and all of that semantic differential testing and various other pretentious stuff is all written down there.
In terms of great tools, lookback.io was the one that Alice was talking about earlier. Absolutely brilliant tool. The other one that you want to check out, which is good for testing design aesthetics and also helps you recruit people and all of that stuff, is usabilityhub.com. And we've already talked about UserZoom, WhatUsersDo and usertesting.com. So, there you go. There's a load to get you started.
In terms of next steps, here's a little thing I did in a company, right? 'Cuz one of the big problems you encounter is other people not thinking it's important and worth doing, so just adopt a mantra in your company that whenever anybody disagrees; whenever anybody is unsure about the best approach, just every time say, "Let's test that." And 90% of the time, people will say, "Nah,' Let's make a decision. To begin with especially, that will happen a lot. But if you keep saying, "Let's test that," again and again and again, eventually you'll get a chance to do it. And the minute you get a chance to do it, and the minute your boss sits in those sessions and sees those sessions … like Marcus described it earlier, like Alice described it earlier, it's a light bulb moment, and everything will change.
So just keep banging on about it, and if not, try doing your own testing on your own dime. Quick and dirty; I mean, Alice was talking about this. The idea that even when they can't get it built into projects, they try to do it anyway, because once people can see it, they get it. And it changes everything. And then eventually you want to settle down into doing a kind of monthly testing cycle, which is what Steve Krug talks about in his book Rocket Surgery Made Easy.
There you go! Marcus, do you have a joke to wrap us up with? Alice, I'm sorry about this. We have to do this.
Marcus: Some are better than others. And I particularly like this one, that doesn't mean that you will. Um, this is from Ian, one of the developers at Headscape, and we've had lots from Ian over the years. And they're of a certain level, so here we go.
I went to the zoo the other day and saw a baguette in a cage. The keeper told me it was bread in captivity.
Paul: Oh, no. That was actually quite good. I love that one. And even Alice politely snickered. Well done.
Marcus: She did. Thank you, Alice.
Alice: It was quite funny, don't worry.
Paul: It has her approval. That's good. Alice, where can people find out more about you and about the company you work for and that kind of thing?
Alice: So the company website is Cellosignal.com. So, cello, like the instrument, signal.com. I think I've got a website, which I really need to update a wee bit, but it's alicegherbison.com and [inaudible 00:58:22], but that's not always about UX stuff, but yeah.
Paul: People don't want UX stuff, that's boring. They want all the random stuff that people put out on Twitter.
Marcus: Cats and things like that.
Alice: It's mostly cats, I'm not gonna lie.
Paul: You're going to be turning into that cat lady, I did usability testing with, I see.
Alice: I probably will. It's all cats.
Paul: Oh, Alice. I liked you so much, right up until at that point.
Alice: Oh, sorry.
Paul: On that bombshell …
Marcus: Don't apologize to him, he's a nasty man.
Paul: I am a nasty man. I can't claim otherwise. Alright, so next week we're gonna be looking at, "Are your interface designs compelling and persuasive?" And we may or may not have a guest on that. I'm still negotiating over that one.
Paul: I know. It's like, who knew it was going to be so difficult …
Marcus: It's going to be a big name next week, is it?
Paul: No, I'm trying to get …
Marcus: I'm not suggesting that you're not a big name, Alice. But I mean, I'm thinking Jeffrey Zeldman or someone like that.
Paul: Yeah, yeah. You have to book him years in advance. No, it's Joe Leech, Mr. Joe.
Alice: Oh, wow. Yeah, I've seen him speak. I think I've met him, actually, at a conference once.
Paul: He is such an awesome guy. I met with him recently in Bath. We went out for lunch together, and he wrote an excellent book on psychology, which is why I want to get him on, but when we were due to be doing this show, he's at a workshop, so I need to talk to Marcus about that in a minute. Anyway, we will manage something along those lines. In the meantime, between now and next week, check us out at boagworld.com/slacking and join us on the Slack channel. But other than that, thank you very much, Alice, and thank you [inaudible 01:00:08], no, until next time, goodbye.