Does Conjoint Analysis Mitigate Social Desirability Bias? | Political Analysis | Cambridge Core
Does Conjoint Analysis Mitigate Social Desirability Bias?
Published online by Cambridge University Press: 15 September 2021
Yusaku Horiuchi
Department of Government, Dartmouth College, Hanover, NH 03755, USA. Email: yusaku.horiuchi@dartmouth.edu
Zachary Markovich
Department of Political Science, Massachusetts Institute of Technology, Cambridge, MA 02139, USA. Email: zmarko@mit.edu
Teppei Yamamoto
Department of Political Science, Massachusetts Institute of Technology, Cambridge, MA 02139, USA. Email: teppei@mit.edu
Abstract
How can we elicit honest responses in surveys? Conjoint analysis has become a popular tool to address social desirability bias (SDB), or systematic survey misreporting on sensitive topics. However, there has been no direct evidence showing its suitability for this purpose. We propose a novel experimental design to identify conjoint analysis’s ability to mitigate SDB. Specifically, we compare a standard, fully randomized conjoint design against a partially randomized design where only the sensitive attribute is varied between the two profiles in each task. We also include a control condition to remove confounding due to the increased attention to the varying attribute under the partially randomized design. We implement this empirical strategy in two studies on attitudes about environmental conservation and preferences about congressional candidates. In both studies, our estimates indicate that the fully randomized conjoint design could reduce SDB for the average marginal component effect (AMCE) of the sensitive attribute by about two-thirds of the AMCE itself. Although encouraging, we caution that our results are exploratory and exhibit some sensitivity to alternative model specifications, suggesting the need for additional confirmatory evidence based on the proposed design.
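The empirical strategy sketched above can be illustrated with a small simulation. Everything below is hypothetical and not the authors' design or data: the binary sensitive attribute, the single filler attribute, the effect sizes, and the sample size are invented for illustration, and no misreporting is built in. The AMCE of the sensitive attribute is estimated as a difference in profile-level choice means, the standard nonparametric estimator when the attribute is binary. Under a fully randomized design every attribute varies independently across the two profiles in a task; under the partially randomized design only the sensitive attribute varies.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000  # respondent-tasks per design; illustrative only


def simulate(partial):
    """Simulate paired-profile choice tasks and return the estimated AMCE
    of a binary sensitive attribute (difference in profile-level choice
    means between its two levels)."""
    sens_a = rng.integers(0, 2, n)
    sens_b = rng.integers(0, 2, n)
    fill_a = rng.integers(0, 2, n)
    # Partially randomized design: the filler attribute is held identical
    # across the two profiles, so only the sensitive attribute varies.
    fill_b = fill_a if partial else rng.integers(0, 2, n)

    # Hypothetical utilities: the true sensitive-attribute effect is -0.3,
    # the filler attribute adds 0.2, plus idiosyncratic noise.
    u_a = -0.3 * sens_a + 0.2 * fill_a + rng.normal(0, 1, n)
    u_b = -0.3 * sens_b + 0.2 * fill_b + rng.normal(0, 1, n)
    chose_a = (u_a > u_b).astype(float)

    # Stack to profile-level data: one row per profile, outcome = chosen.
    y = np.concatenate([chose_a, 1 - chose_a])
    x = np.concatenate([sens_a, sens_b])
    return y[x == 1].mean() - y[x == 0].mean()


amce_full = simulate(partial=False)
amce_partial = simulate(partial=True)
print(f"AMCE, fully randomized design:     {amce_full:.3f}")
print(f"AMCE, partially randomized design: {amce_partial:.3f}")
```

Because this simulation contains no misreporting, the two designs recover similar (negative) estimates. The identification strategy described in the abstract interprets a gap between the two estimates, net of the attention-control condition, as the portion of the AMCE attributable to social desirability bias.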
© The Author(s) 2021. Published by Cambridge University Press on behalf of the Society for Political Methodology
Footnotes
Edited by Jeff Gill