Exploring the impact of response option sequences/order on survey rating scale responses

  • Hamed Taherdoost, University Canada West, Vancouver, BC V6B 1V9, Canada
Article ID: 452
Keywords: response order effects; rating scale; data quality; response option sequences; cognitive processes

Abstract

In the realm of survey data quality, inaccuracies and nonresponses pose significant challenges. One important contributing factor is the order in which response options are presented, a phenomenon known as the response order effect. This study reviews the extensive literature on how the sequence of answer options influences respondents’ ratings in survey questions. Specifically, we analyze previous research to understand how the arrangement of scale points on a rating scale shapes respondents’ cognitive processes and response strategies. By synthesizing existing studies, this investigation provides insight into the critical role that presentation order plays in shaping survey outcomes, offering valuable perspectives for improving data quality in future survey designs.


Published
2024-04-12
How to Cite
Taherdoost, H. (2024). Exploring the impact of response option sequences/order on survey rating scale responses. Forum for Philosophical Studies, 1(1), 452. https://doi.org/10.59400/fps.v1i1.452
Section
Original Research Articles