Question creation is the most important part of the questionnaire process. When designing a survey, one needs to distinguish between open and close-ended questions.
Open and close-ended questions
There are generally two categories of questions: (1) open-ended questions, to which respondents answer in their own words, and (2) close-ended questions, for which answers are provided in advance and participants select the one that best represents their view. Close-ended questions are also sometimes referred to as forced-choice questions (Bryman 2012).
The following overview summarises the main advantages and disadvantages of the two types of questions (Bryman 2012).
- Open-ended questions do not limit the respondents, allowing them to express themselves freely. This can increase the validity of the data gathered.
- Open-ended questions allow for the discovery of new data/concepts otherwise ignored or overlooked by the researcher.
- For the researcher, open-ended questions can be easier to create, as there is no need to provide every possible response in advance.
- From the researcher’s perspective, close-ended questions are easier to categorise and analyse, and they leave little room for misinterpreting answers.
- Close-ended questions do not discriminate against “less talkative and less articulate respondents” (De Vaus 2002:100).
- A set of answer options can prompt respondents to answer and ensures that no possible response is forgotten.
- Answers can be easily compared.
- Open-ended questions require more attention and time from the researcher to categorise and analyse.
- Such questions can often result in high non-response rates, as they are perceived as requiring too much effort to answer, particularly when their context is not clearly explained.
- Wright (2006) shows comparatively higher drop-out rates in online surveys when open-ended questions are used.
- Responses gathered from open-ended questions are more difficult to compare, which reduces reliability.
- Close-ended questions with a limited set of answers may result in item non-response or instrument error. This means that the participants feel prompted to choose an answer which does not fully represent their views, which can “create false and unreliable answers” (De Vaus 2002: 106). It is thus important to include a “Don’t know” option for participants who are not able to answer.
- A well-formulated close-ended question requires a lot of effort in developing the answer options. “The range must be exhaustive”, which can be achieved through pilot studies and by allowing respondents to respond freely with an “Other (please specify)” option (De Vaus 2002:100).
- If questionnaires contain a lot of close-ended questions, respondents may feel unable to express themselves. This impersonal feeling may increase the likelihood that participants decline a follow-up survey or interview, should one be needed.
The use of open-ended questions is recommended when it is important for the study that participants use their own words. They are also more suitable during the initial stages of the research design, when exploring a new concept, or when the research objective focuses on respondents’ personal views (e.g. in participatory studies or feedback/satisfaction surveys). Thus, open-ended questions may be used in the initial stage of questionnaire design (in pilot studies) in order to identify additional variables/categories. After initial data collection and analysis, open-ended questions may also be used in follow-up surveys to explore ‘deviant responses’ (Reja, Manfreda, Hlebec, and Vehovar 2003).
Close-ended questions are more appropriate when there is a general agreement as to the anticipated responses of a given question. Furthermore, such questions are more suited for a study that focuses on providing a statistical report/analysis of data.
When constructing questions for a survey, it is important to design them in a way that avoids item non-response. The following checklist provides questions that guide the wording of survey questions (De Vaus 2002; Bryman 2012):
|Checklist for question wording
- Is the language simple?
- Can the question be shortened?
- Is the question double-barrelled (touches upon several issues)?
- Is the question ambiguous?
- Is there a prestige bias?
“When an opinion is attached to the name of a prestigious person and the respondent is then asked to express their own view on the same matter, the question can suffer from prestige bias” (De Vaus 2002:98). For example: “In your opinion, how well implemented is the new program introduced by the minister of education?”
- Is the respondent likely to have the necessary knowledge?
- Is the question too precise, or too general and vague?
- Is the frame of reference for the question sufficiently clear?
- Does the question overstretch respondents’ memories?
- Is personal or impersonal wording preferable?
- Is the question a ‘dead giveaway’?
Dead giveaways are words such as: all, always, each, everybody, nothing, never, nobody. These words generalise the question to the point where most participants will disagree with such a statement. For example:
- “Do you agree that all teachers should have a background in pedagogy?”
- “To what extent do you agree that every school needs to be located near parks?”
- Does the question have dangling alternatives? (The answer alternatives should not be presented before the question itself is stated.)
- Does the question contain gratuitous qualifiers?
Gratuitous qualifiers are phrases that might affect the response. For example: “Do you agree with the new measures, despite the fact that they can disrupt schools in their management strategies?”
Based on Bryman (2012), the following table provides examples of the most common types of badly worded questions, explanations of why they are badly phrased, and examples of how they can be improved.
|Common types of questions to avoid
|Wordy, long question with complex terms
Example of badly worded question: “How accessible, in your opinion, are programs in Luxembourg regarding adults who desire to further their educational career or to re-educate themselves in order to improve their professional career?” (Very accessible – Not at all accessible)
Example of improved question: “How accessible are educational opportunities for adults in Luxembourg?” (Very accessible – Not at all accessible)
- The badly worded question is too wordy and contains a lot of information that participants do not need.
- Avoid long questions and keep them concise.
- Avoid using long complicated words or technical terms.
- The improved question cuts down on unnecessary information, thus becoming clearer and more concise.
|Question containing a double negative
Example of badly worded question: “Do you think students do not do nothing to progress in their academic careers?” (Yes – No)
Example of improved question: “Do you think students try to progress in their academic careers?” (Yes – No)
- Avoid double negation, as it may confuse participants. Answers become ambiguous, i.e.
- “Yes” = “Students do something…”
- “No” = “Students do nothing…”
- The improved example gives a clear answer to a clear question: “Yes, the students try to progress…” or “No, the students do not try to progress…”
|Double-barrelled question
Example of badly worded question: “To what extent do you agree with the management style and the teaching style of this school?” (Totally agree – Do not agree at all)
Example of improved question: “To what extent do you agree with the management style of this school?” “To what extent do you agree with the teaching style of this school?” (Totally agree – Do not agree at all)
- The badly worded question is double-barrelled: it is in fact two questions in one (one on management style, one on teaching style) while only allowing for one set of answers.
- Questions should address one aspect of a given topic at a time.
|Leading, value-laden question
Example of badly worded question: “How badly would you rate the curriculum of this low-rated school?”
Example of improved question: “How would you rate the curriculum of this school?”
- The badly worded question is highly biased and suggestive, as participants may be inclined to agree with the researcher, thus invalidating the data collected.
- Avoid making value judgements in your questions and use neutral language.
|Loaded question
Example of badly worded question: “How do you deal with the difficulties your child has at school?” (open-ended question)
Example of improved question: “Does your child experience any difficulties at school?” (Yes – No) “If yes, what steps were taken to help alleviate these difficulties?” (open-ended question)
- The badly worded question is loaded: it assumes (a) that the participant plays an active part in helping their child and (b) that the participant’s child is having difficulties.
- The improved set of questions first checks with the participant whether their child has difficulties, avoiding assumptions; the second part avoids assuming that the participant plays an active role.
|Biased question with limited answer options
Example of badly worded question: “In your opinion, what is the most important subject taught at school?” (multiple-choice question: Maths, Languages, History, Sports, Art)
Example of improved question: “In your opinion, what is the most important subject taught at school?” (open question: participants answer in their own words, with no predetermined answer options)
- Biased and leading questions are similar, as both limit and influence the participant’s answers. In leading/loaded questions, the bias is usually found within the question itself.
- Biased questions, however, can be written in neutral language; by limiting the number of possible answers, participants may be inclined to answer what they believe is expected of them.
- It is good practice to allow respondents to provide their own answers.
Bryman, A. (2012). Social Research Methods. Oxford: Oxford University Press.
De Vaus, D. A. (2002). Surveys in Social Research. 5th ed. Crows Nest, NSW: Allen & Unwin.
Reja, U., Manfreda, K. L., Hlebec, V., and Vehovar, V. (2003). “Open-Ended vs. Close-Ended Questions in Web Questionnaires.” Developments in Applied Statistics 19:159–77.
Wright, K. B. (2006). “Researching Internet-Based Populations: Advantages and Disadvantages of Online Survey Research, Online Questionnaire Authoring Software Packages, and Web Survey Services.” Journal of Computer-Mediated Communication 10(3). doi: 10.1111/j.1083-6101.2005.tb00259.x.