(404) 210-9018 admin@stochasticgroup.com
Survey Design Services

When reviewing a survey instrument to ensure that the objectives of the survey are met, The Stochastic Group (TSG) focuses on aspects that directly relate to the design and wording of the survey itself, with special attention to reducing bias. Separate evaluations or analyses of the sample, the sampling plan, the programming of the survey instrument into a computerized survey, etc. can also be performed but fall under separate scopes of work. The list below includes, but is not limited to, key aspects of your survey instruments that TSG will review and check for problems. Any issues identified will be documented, and suggestions for remediating the problematic areas will always be provided.

1. Question Wording

  • Clarity and simplicity: Ensure that each question is clear, straightforward, and easy to understand. Avoid complex or ambiguous wording that could confuse respondents.
  • Neutral wording: Avoid leading or loaded questions that may push respondents toward a particular answer. Use neutral language that doesn’t imply a “correct” or “preferred” response.
  • Jargon-free: Avoid technical terms or jargon that respondents may not understand. Use language that is appropriate for the target audience.
  • Double-barreled questions: Avoid questions that ask about two things at once, which can confuse respondents and lead to inaccurate data. For example, instead of asking “How satisfied are you with the quality and price of the service?” split it into two separate questions.

2. Response Options

  • Balanced response scales: Ensure that all question response scales offer balanced options. For example, Likert scales should include both positive and negative options in equal proportion (e.g., Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree). In addition, verify that any question that might not be applicable to a respondent is either optional, so that the question can be skipped, or offers a “Not Applicable” (NA) response option (preferred).
  • Avoiding extreme or limited options: Ensure that response options allow for a full range of answers. If response options are too narrow (e.g., Yes/No), respondents may feel forced into giving an inaccurate answer.
  • Use of “Other” or open-ended options: Include “Other” or open-ended response options where applicable to capture respondents’ thoughts when pre-defined options don’t fit their experiences.
  • Judicious use of open-ended response questions: Research in the survey methodology literature has found that open-ended questions more than double the risk of a respondent failing to complete a survey. However, important insight can be gained from open-ended questions that may not be obtainable from closed-ended questions. Evaluating a survey therefore requires a careful balance between open-ended and closed-ended questions.
  • Avoiding overlapping categories: Make sure the response options do not overlap. For example, age ranges like “18-25” and “25-30” can create confusion because both ranges contain the age 25.
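To make such category boundaries concrete, here is a minimal sketch in Python of assigning each age to exactly one non-overlapping bracket. The bracket edges and labels are illustrative only, not a TSG standard:

```python
import bisect

def age_category(age):
    """Assign an age to exactly one non-overlapping bracket.

    Each bracket includes its lower edge and excludes the next bracket's
    lower edge, so a boundary age such as 25 falls in exactly one category.
    Edges and labels here are illustrative, not TSG's standard brackets.
    """
    edges = [25, 30, 45]  # lower edges of the 2nd through 4th brackets
    labels = ["18-24", "25-29", "30-44", "45+"]
    return labels[bisect.bisect_right(edges, age)]

print([age_category(a) for a in (18, 24, 25, 30, 44, 45)])
# → ['18-24', '18-24', '25-29', '30-44', '30-44', '45+']
```

Because the brackets are half-open, a respondent aged exactly 25 can never match two categories, which is precisely the overlap problem described above.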

3. Question Order

  • Logical flow: Ensure that the questions are arranged in a logical order, flowing naturally from one topic to the next. Group related questions together to avoid confusion or cognitive overload.
  • Avoiding order effects: Randomize the order of questions or response items, when appropriate, to minimize the impact of question order on responses. This can help reduce bias such as primacy or recency effects, where respondents favor options that appear first or last in a list.
  • Sensitive or difficult questions: Place sensitive or difficult questions toward the end of the survey, after respondents have had time to ease into the process and are more comfortable. Beginning with such questions can increase the likelihood of non-response or biased answers.
  • Placing critical questions first: Since respondents may end a survey at any time, and incomplete questionnaires can still supply the responses to the questions that were answered, it is imperative that the most critical questions appear early in the survey instrument. However, if such questions are cognitively burdensome, sensitive, or otherwise likely to encourage break-offs, their placement should be reviewed carefully to balance capturing the most important information against sacrificing responses to other questions or encouraging respondent drop-off.
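The randomization described above applies to unordered option lists (for example, a list of brands), not to ordered scales such as Likert items. A minimal sketch, with an illustrative function name and option labels:

```python
import random

def randomized_options(options, rng=random):
    """Return a fresh random ordering of unordered response options
    to mitigate primacy/recency effects. Do not apply this to ordered
    scales such as Likert items, whose order carries meaning.
    Function and option names here are illustrative."""
    return rng.sample(options, k=len(options))

options = ["Brand A", "Brand B", "Brand C", "Brand D"]
print(randomized_options(options))  # a different order for each respondent
```

In practice a survey platform would call something like this once per respondent, so that no single option systematically benefits from appearing first or last.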

4. Social Desirability Bias

  • Anonymity assurances: In the survey introduction or instructions, assure respondents of their anonymity, if true and applicable, which can help reduce social desirability bias.
  • Indirect questioning: Frame sensitive questions indirectly to avoid respondents giving socially desirable answers. For example, instead of asking, “Do you follow all health guidelines?” we would advise asking “How often do people in your community follow health guidelines?”

5. Question Length and Survey Length

  • Brevity: Keep questions as short as possible while still capturing necessary detail. Long or complex questions may tire respondents and lead to lower quality responses.
  • Survey length: Shorten the overall length of the survey if possible. Long surveys can cause respondent fatigue, leading to rushed or incomplete answers toward the end.
  • Avoid asking questions when the answer is already known: Survey researchers often ask respondents questions whose answers may be obtainable elsewhere, for the convenience of having all relevant data stored in a single survey dataset. However, each additional question posed to respondents increases the risk that fatigue will prevent them from completing the entire survey. It is often possible to obtain answers from external sources such as databases maintained in separate systems (e.g., CRM applications or accounting systems) or even third-party databases. With TSG’s data science expertise, we can merge data from multiple sources into your survey dataset so that a single, comprehensive dataset is available for analysis.
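As a sketch of the kind of merge described above, here is a minimal example using pandas; the column names and CRM extract are hypothetical, not a real client schema:

```python
import pandas as pd

# Hypothetical survey responses: only the fields respondents must be asked.
survey = pd.DataFrame({
    "respondent_id": [101, 102, 103],
    "satisfaction": [4, 5, 3],
})

# Account data pulled from an external system (e.g., a CRM export),
# so respondents need not be asked for these fields again.
crm = pd.DataFrame({
    "respondent_id": [101, 102, 103],
    "customer_since": [2012, 2018, 2015],
    "plan": ["basic", "premium", "basic"],
})

# A left join keeps every survey response and appends the external fields,
# yielding a single, comprehensive dataset for analysis.
combined = survey.merge(crm, on="respondent_id", how="left")
print(combined.columns.tolist())
# → ['respondent_id', 'satisfaction', 'customer_since', 'plan']
```

Every question moved out of the survey and into a merge like this is one less opportunity for respondent fatigue or break-off.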

6. Use of Visual Aids or Layout

  • Consistent format: Maintain a consistent layout and format throughout the survey to reduce cognitive load. Inconsistent formatting can confuse respondents and lead to mistakes.
  • Avoiding unnecessary graphics: Avoid adding graphics or visual elements that may distract respondents or imply certain answers.
  • Clear instructions: Ensure that instructions for how to answer each question or section are clear and easy to follow.

7. Pre-testing (Pilot Testing)

  • Cognitive interviews: We typically conduct cognitive interviews where test respondents explain their thought process out loud to survey methodologists while answering questions. This helps identify any confusion or misunderstandings in the way questions are phrased, allowing for remediation of the problems prior to deployment.
  • Pilot testing: We also offer pilot tests, in which the survey is fielded with an initially small sample of individuals. Once the pilot surveys are complete, we analyze the data carefully to identify unclear questions, bias, or fatigue effects before launching the finalized survey to all respondents in the targeted sample (or the entire population, if the survey is conducted as a census).

By focusing on these areas, The Stochastic Group’s expert survey methodologists can significantly improve the quality of your survey instruments, ensuring they are clear, unbiased, and aligned with the survey research objectives, so that you can be assured the data you collect reflects the actual knowledge, attitudes, and/or opinions of your sample or population of interest.
In addition to survey questionnaire design consulting, The Stochastic Group also offers survey programming, development, and administration for all modes of electronic data collection. This includes:

  • Computer-Assisted Web Interviewing (CAWI), where respondents self-administer the survey over a secured internet connection.
  • Computer-Assisted Personal Interviewing (CAPI), where an interviewer uses a tablet or laptop to administer the survey face-to-face with respondents. Data is either encrypted and stored locally on the device or encrypted and streamed to secure databases over the internet.
  • Computer-Assisted Telephone Interviewing (CATI), with or without random digit dialing.
  • SMS Survey Interviewing, where surveys are conducted via short text messaging and respondents reply using their phone’s SMS technology.
  • Mobile Survey Interviewing, where surveys are either self-administered or administered by an interviewer in the field. When no communication network is available, data is encrypted and stored on the device, then later synchronized with the master database server; when Wi-Fi or cellular networks provide stable internet connectivity, survey responses are encrypted and streamed over the internet to a secure master database.

The following table summarizes all survey data collection modes with which The Stochastic Group can assist.

Acronym    | Mode                                     | Description
PAPI       | Paper and Pencil Interviewing            | Paper-based surveys filled out by respondents or interviewers
CATI       | Computer-Assisted Telephone Interviewing | Telephone interviews using a computer system
CAPI       | Computer-Assisted Personal Interviewing  | Face-to-face interviews using a computer or tablet
CAWI       | Computer-Assisted Web Interviewing       | Web-based self-administered surveys
IVR        | Interactive Voice Response               | Telephone-based surveys with responses via keypad/voice
SMS        | SMS Surveys                              | Text message-based surveys
IVI        | In-Vehicle Interviewing                  | Interviews conducted while respondents are in their vehicles
Mixed-Mode | Mixed-Mode Surveys                       | Combination of two or more modes (e.g., web and telephone)
SAQ        | Self-Administered Questionnaire          | Self-completed surveys, either paper or electronic
App        | Mobile App Surveys                       | Surveys conducted via a mobile app
SMS-to-Web | SMS-to-Web Surveys                       | SMS invitation directs respondents to a web survey
Copyright © 2015 The Stochastic Group