Testing...testing
The best way to ensure your document meets readers’ needs is to test it with the audience. Even quick, inexpensive testing can give you useful information about how to improve your document.
What is a test?
A test can be as simple as asking a colleague or a citizen to answer one question about your text or as complicated as inviting 30 people to read your document and answer questions. Almost any test will help you make the writing better. And any test with a member of your audience will get you closer to writing that works for your readers.
When should you test?
Start testing as soon as you have enough material to test. Don’t wait until your document is complete unless it’s a short email.
Make corrections based on the feedback, and test again if you can. Best practice is to test a couple of times, and more often if the text is important or could be difficult for your readers.
How should you test?
Several methods will help you improve your document.
• Paraphrase testing: In individual interviews, your testers read a short document and explain the concepts in their own words.
• Controlled comparison testing: You create and send out more than one version of your document, either simultaneously or over several mailouts, then compare the responses. You’ll get the clearest direction from your results if the versions differ in only one way: a different opening sentence, a different paragraph structure, or a numbered list to highlight a step-by-step process.
• Usability testing: In individual interviews, your testers read a more complex text while you watch, ask questions, and note whether they answer correctly and readily or struggle.
Record your results in some way:
• write down what you recall after a conversation about a document
• note which version of a letter generated the most or least phone calls or errors
• have someone take notes on the test subject’s answers or audio- or video-record the conversation
Who should you recruit?
• people in your target audience (your best opportunity)
• people familiar with your audience
• people who have nothing to do with your document
• people outside your organization
• people outside your skill set
Testing documents internally with co-workers while you’re still writing is highly useful. You can then test a final draft with someone from outside your team. Your co-workers will likely read about as well as you do and have the same technical understanding, so you will want other perspectives on the complexity of the language and the topic.
Don’t test with people who know the information in the document or people who helped you get to this point in creating this document. They will know too much about your subject area to notice what’s missing or out of place.
Readability measures
Your word processor or a website like Hemingway Editor offers quick readability tests. But these long-established measures of how readable a text is don’t actually mean much. They were originally created to measure children’s reading skills, not documents for adults. The software counts syllables (to represent complexity in language) and words and sentences (to represent complexity in expression) to roughly estimate reading difficulty. The resulting number claims to reflect the number of years of education a person needs to read the text.
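For example, Microsoft Word’s readability statistics report the Flesch-Kincaid Grade Level. As a rough sketch (this is the standard published formula; exact counting rules vary by tool, and the worked numbers below are illustrative, not from any real document), the score is:

\[
\text{Grade level} = 0.39 \times \frac{\text{total words}}{\text{total sentences}} + 11.8 \times \frac{\text{total syllables}}{\text{total words}} - 15.59
\]

A passage averaging 20 words per sentence and 1.5 syllables per word would score 0.39 × 20 + 11.8 × 1.5 − 15.59 ≈ 9.9, or about Grade 10. The arithmetic shows how crude the measure is: nothing in it looks at meaning, order, or completeness.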
Readability measures fail in three main ways. It’s easy (and tempting) to game them, even unintentionally. They cannot distinguish between a meaningless text and a useful one. They cannot tell you if the structure and organization are useful or if information is missing.
If your text’s readability score from Microsoft Word or Hemingway Editor is Grade 10 or higher, you probably need to edit for clarity: many people don’t read at this level. If you test again after editing, you will be able to see whether your changes made your sentences easier to read, but you should rely on other methods to make sure your text is clear.
Try setting up a simple test of a piece of writing this week. You might learn something surprising or something pleasing. Either way, you will learn something.