
Seven simple principles for good thought leadership survey design

A few months ago, a client got in touch with us to say that they were having a hard time getting people to answer a survey of theirs, and asked if we could advise on how to improve the response rate.

A cursory review of their questionnaire quickly revealed why. At well over 50 questions in length – all of which were complex matrix-style questions – the survey would have deterred even the most committed respondent.

The fact is that while many thought leadership reports rely on an in-depth survey of a targeted demographic, the time and attention given to the design of the survey questionnaire itself is often the first area to be cut back in the rush to meet a report’s overall production deadlines.

Unfortunately, doing so can often weaken, or even fundamentally undermine, a research project’s proposed outcomes. This is especially true when key internal stakeholders are only consulted on the survey once the fieldwork is already completed, and thus when it’s simply too late to change anything.

Ensuring sufficient time – my usual rule of thumb is to try to allow about four weeks for survey design and internal approvals – is just a starting point, though. There is a considerable body of academic and professional literature on survey design, but a few basic principles should not be forgotten, including the following:

1. Set clear hypotheses to test.

    What are the core research questions you’re trying to answer? Do these link back to some basic hypotheses on the subject that you’re seeking to test? It may sound simple, but all too often surveys fall short of their target outcomes because no clear initial hypotheses were set out for the research to prove or disprove.

2. Think of the outcomes you wish to achieve.

    Beyond setting clear hypotheses, you should also think about what each question might deliver. Most should rightly be focused on giving more in-depth insights into specific issues, but it’s helpful to have a few questions designed to provide press-friendly outcomes – simple yes or no questions can deliver great results here.

3. Ensure relevance to the audience you’re planning to poll.

    Are you sure finance and treasury executives can answer in-depth questions about cloud-based technology systems? Or that IT professionals can comment on the company’s risk philosophy, or give views on niche banking regulatory issues? This problem crops up a lot, usually driven by a desire to get as much as possible out of a single survey. However, this can undermine the final results. Make sure you put yourself in the shoes of the target demographic.

4. Don’t turn off survey participants.

    Surveys don’t have to be a chore to answer: with a bit of creativity, you can make the experience easier and more engaging – and thus deliver better-quality outcomes. For example, make sure you limit the use of open-ended questions, which are more time-consuming to answer. Do you really need 30 or more in-depth questions, or can you get what you need from 20? Also consider whether you’ve got the balance right between being too broad (and thus overly generic) and too narrow (and thus far too niche).

5. Ensure consistency and clarity in your questions.

    Are your questions easy to read and follow? Are they overly leading? Do they follow a similar format? Do your answer scales run from easy to hard in one question, and then hard to easy in another? Have you tried to avoid making the questions overly complex, with too many matrix questions? In short, have you made the survey user-friendly, both in terms of completion time and the straightforwardness of the questions?

6. Map your questions to a relevant time frame.

    Do you think oil prices will go back up to over US$100 a barrel? Well, yes, of course they will at some point – but do you think it will happen in the next six months? Adding a specific (and consistent) time frame to questions is vital for ensuring good, usable outcomes.

7. Plan your quality control measures.

    Are the answer lists in your survey randomised? Did you get someone who’s not been involved to read the questionnaire over before launch? Is the formatting user friendly? Do you have options for respondents who simply “don’t know” or for when an answer is “not applicable”, or are you forcing them to choose something that’s perhaps not relevant to them? These checks, among others, will be invaluable in improving the overall outcome and assuring the credibility of your output.
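On the randomisation point: most survey platforms will handle this for you, but if you happen to be scripting a questionnaire yourself, the sketch below shows the basic idea – shuffle the substantive answer options while pinning the opt-outs (“don’t know” / “not applicable”) to the end so they always appear. The helper name, question and answer wording are purely illustrative assumptions, not taken from any real project.

```python
import random

def randomise_options(options, pinned=("Don't know", "Not applicable")):
    """Shuffle substantive answer options; keep opt-outs last, in a fixed order."""
    substantive = [o for o in options if o not in pinned]
    random.shuffle(substantive)        # a fresh order for each respondent
    return substantive + list(pinned)  # opt-outs always shown, always at the end

if __name__ == "__main__":
    # Illustrative question and options only (hypothetical, not from the article)
    question = "Which factor most constrains your cloud adoption plans?"
    options = [
        "Budget",
        "Regulatory uncertainty",
        "Legacy systems",
        "Skills shortage",
        "Don't know",
        "Not applicable",
    ]
    print(question)
    for option in randomise_options(options):
        print(" -", option)
```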

And yes, do consult with key stakeholders—before the survey is launched. It’s infinitely better to get their views before fieldwork commences, rather than working to placate them further down the track.

Happy surveying!
