Is it ok for technical communicators to experiment on users?

(Image: experiment sign, Flickr CC, by jurvetson)

BBC News is reporting that the dating website OKCupid has revealed it experimented on its users. It decided to disclose the tests after it emerged that Facebook had been manipulating its users' news feeds.

Anyone, including technical communicators, can run experiments on web pages. You can display one version of a page to 50% of your audience and a different version to the other 50%. Organisations carry out these tests (often called A/B or split tests) to see whether a change to the site leads to any change in user behaviour.
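For anyone curious about the mechanics, here is a minimal sketch of that 50/50 split in Python. The assign_variant helper and the page names are hypothetical; the idea is simply that hashing a visitor ID means the same visitor always sees the same version:

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "help-page-test") -> str:
    """Deterministically bucket a visitor into variant 'A' or 'B' (50/50 split).

    Hashing the visitor ID together with the experiment name means the same
    visitor always gets the same page, without storing any extra state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Serve one of two (hypothetical) page variants depending on the bucket
page = "page_a.html" if assign_variant("visitor-12345") == "A" else "page_b.html"
print(page)
```

In practice the same assignment logic sits behind most commercial split-testing tools; the only real difference is where the bucketing happens and how the results are recorded.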

According to Christian Rudder of OKCupid:

“It’s not like people have been building these things for very long, or you can go look up a blueprint or something. Most ideas are bad. Even good ideas could be better. Experiments are how you sort all this out.”

So is it ok for Technical Authors to experiment on users? If it resulted in better, more effective web pages, would you, as a user, object? Clearly, most of us would object to a website manipulating us or causing potential harm – but could that ever apply to the type of experimentation a Technical Author might carry out?

2 Comments

Axel Regnet

Interesting idea – A/B testing and multivariate testing are indeed tried and tested methods for web developers. But I think what makes these methods so usable for websites or online shops is the fact that you have very clear and easy-to-measure desired outcomes. Usually you would look at the CR (conversion rate) of “green button” vs “red button” in the split test and use the variant with the better conversion. At the end of the day you want to improve sales or the number of generated leads.
But if I test different variants of, e.g., an article in my online help, what desired outcome would I measure? I think it is clear that both variants are correct and helpful. But what hypothesis could help me compare, e.g., an article with screenshots against the same article with a video? The click rate? I could improve the usability of a “this article was helpful” call-to-action button, but the improved rate of clicks on the button (and the assumed satisfaction) would not really say much about the quality of the article.
Very interesting concept – I am eager to hear some of the ideas that will come up here…
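One way to make the comparison Axel describes concrete is a simple two-proportion test on “this article was helpful” clicks. The sketch below uses hypothetical numbers for a screenshots variant versus a video variant; as Axel notes, even a statistically significant difference only tells you about the proxy metric, not the underlying quality of the article:

```python
import math

def two_proportion_z(clicks_a: int, views_a: int, clicks_b: int, views_b: int):
    """Compare 'this article was helpful' click rates for two article variants.

    Returns (z statistic, two-sided p-value). A small p-value suggests the
    difference in click rate is unlikely to be chance alone.
    """
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical figures: screenshots variant vs video variant
z, p = two_proportion_z(clicks_a=120, views_a=1000, clicks_b=150, views_b=1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```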
