Exploring the impact of response option order on survey rating scale responses
Abstract
Inaccurate answers and nonresponse pose significant challenges to survey data quality. One factor contributing to both is the order in which response options are presented, producing what are known as response order effects. This review examines the extensive body of research on how the sequence of answer options influences respondents' ratings in survey questions. Specifically, we analyze previous studies to understand how the arrangement of scale points on a rating scale shapes respondents' cognitive processes and response strategies. By synthesizing existing work, this investigation aims to clarify the critical role that presentation order plays in shaping survey outcomes, offering practical guidance for improving data quality in future survey designs.
Copyright (c) 2024 Hamed Taherdoost
This work is licensed under a Creative Commons Attribution 4.0 International License.