When best practice is rejected

Paul Boag

Just because we follow web design best practice doesn’t mean our clients and bosses will accept it. What do we do when they adamantly demand that things be done their way?

We have all been there. We have read the research, listened to the experts and made sure we are up on best practice. But when it comes to getting the work done, our client or boss thinks they know better than everything we have so diligently learnt.

It’s a demoralising experience that leaves us feeling no better than a monkey who knows how to use Photoshop (although, in my opinion, that would be a damn impressive monkey!).

What then are we to do?

The first thing you need to do is move away from the ‘us versus them’ argument. When it comes to your boss or client, this is not an argument you will win. Instead, make things more objective with some user testing.

Guerrilla usability testing

Carrying out some quick user testing is a valuable way to resolve these disagreements. Create a mockup showing the two approaches (yours and your client’s or boss’s) and test them both.

Because this kind of testing is about resolving a single issue as quickly as possible, I recommend using a tool like usertesting.com. It allows you to submit the tasks you would like tested, and it sends you videos of users trying to complete those tasks while talking through the experience.


The reason I recommend usertesting.com is because…

  • It takes only a couple of minutes to set up.
  • They find users for you, saving recruitment time.
  • You get videos back that you can show your client or boss.
  • You get results within an hour.

Showing your client/boss video of users consistently failing to complete a task using their idea, while succeeding with your approach, really drives the point home.

Of course, some clients are stubborn and may reject the test as unrepresentative in some way. Typically they will claim that you didn’t test enough people (despite what Nielsen says on the subject) or that the participants were the wrong demographic. In such cases, it is probably time to turn to hard data.

Using data

It’s amazing what evidence you can dig up in Google Analytics to support or disprove a hypothesis. The chances are that with a bit of analysis you will be able to find some hard numbers to help break the deadlock.

The great advantage of Google Analytics is that its data comes from real users on your live site. This makes it hard for the client or boss to argue with the results.

If Google Analytics does not help, the next option is split testing. Mock up the two different approaches and serve each version to a different segment of your audience. This will resolve things one way or the other.
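As a rough sketch of how that audience segmentation might work (the function and variant names here are illustrative, not tied to any particular testing tool), each visitor can be assigned deterministically to one version, so a returning visitor always sees the same variant:

```javascript
// Deterministic split-test assignment (illustrative sketch).
// Hashing the visitor ID means the same person lands in the same
// bucket on every visit, without storing any server-side state.
function assignVariant(visitorId, variants = ["A", "B"]) {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple 32-bit rolling hash
  }
  return variants[hash % variants.length];
}

// Example: route a visitor to the mockup for their assigned variant.
const variant = assignVariant("visitor-123");
console.log(`Serving design ${variant}`);
```

In practice the visitor ID would come from a first-party cookie, and each variant’s task-completion rate would be recorded so the two designs can be compared.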

Of course, I know what you are thinking; this all sounds like a lot of work.

Too much work?

I recognise that producing multiple versions of an approach takes time. However, that is not always necessary. Admittedly, fully working versions are required for split testing or for digging around in analytics, but user testing can be carried out on a low-fidelity mockup.

If the argument is over aesthetics then testing some design comps using Verify may well be enough.

That said, I admit this isn’t without its costs. However, what is the alternative? Arguing with a client or boss is also time-consuming and can sour the relationship. On the other hand, if you simply give them what they want, it can come back to bite you when the site does not perform as expected.

This is why testing should be built into every project by default. Not only is testing with real users best practice, it will also inevitably be needed to resolve a disagreement at some point in the process. In my view, testing is simply a cost of building websites, much like writing CSS or sending emails. It is not something that can be avoided.

“Stop Sign” image courtesy of Bigstock.com