Metascience and Methodology

In this project, we study aspects of psychology’s replication crisis, test and improve research methods, and investigate reporting transparency in human-computer interaction (HCI) publications. For example, we collaborated with researchers around the world to examine whether peer review prior to data collection increases the replicability of research findings in psychology (Ebersole et al., 2020). Ensuring and improving the quality of data collected through online surveys is another focus of this project and offers several opportunities for research from an HCI perspective (Brühlmann et al., 2020). Finally, methodological literature reviews, such as our examination of cultural biases in HCI research (Linxen et al., 2021), shed new light on potentially understudied populations and research questions.

  • Master’s project: possible
  • Contact: E-Mail


Conference Proceedings

  • Linxen, S., Sturm, C., Brühlmann, F., Cassau, V., Opwis, K., & Reinecke, K. (2021). How WEIRD is CHI? In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems.

Journal Articles

  • Brühlmann, F., Petralito, S., Aeschbach, L. F., & Opwis, K. (2020). The quality of data collected online: An investigation of careless responding in a crowdsourced sample. Methods in Psychology, 2, 100022.
  • Ebersole, C. R., Mathur, M. B., Baranski, E., Bart-Plange, D.-J., Buttrick, N. R., Chartier, C. R., Corker, K. S., Corley, M., Hartshorne, J. K., IJzerman, H., Lazarevic, L. B., Rabagliati, H., Ropovik, I., Aczel, B., Aeschbach, L. F., Andrighetto, L., Arnal, J. D., Arrow, H., Babincak, P., … Nosek, B. A. (2020). Many Labs 5: Testing pre-data-collection peer review as an intervention to increase replicability. Advances in Methods and Practices in Psychological Science.