Prof. Dr. Steffi Pohl

Division of Methods and Evaluation/Quality Assurance

Professor

Address
Habelschwerdter Allee 45
Room KL 23/226
14195 Berlin
Secretariat
Judith Schmidt

Office Hours

By appointment.

Professional Career

  • since October 2019 Full Professor of Methods and Evaluation/Quality Assurance, Department of Education and Psychology, Freie Universität Berlin
  • 2013-2019 Junior Professor of Methods and Evaluation/Quality Assurance, Department of Education and Psychology, Freie Universität Berlin
  • 2009-2013 Research associate at the Chair of Psychology with a Focus on Methods of Empirical Educational Research (Prof. Dr. Claus Carstensen), National Educational Panel Study project, Otto-Friedrich-Universität Bamberg
  • 2005-2009 Research associate at the Chair of Methodology and Evaluation Research (Prof. Dr. Rolf Steyer), Friedrich-Schiller-Universität Jena
  • 2007 Research associate in the project kompetenztest.de (Dr. Christof Nachtigall), Friedrich-Schiller-Universität Jena
  • 2004-2005 Research associate in the e-learning project SPSSinteraktiv (Prof. Dr. Renate Soellner), Freie Universität Berlin
  • 2004 Research associate (substitute position) in the Department of Methods of Psychology (Prof. Dr. Albrecht Iseler), Freie Universität Berlin

Research Stays

  • 2018 (July) Research stay with Lale Khorramdel, Educational Testing Service, Princeton, NJ, USA
  • 2018 (March) Research stay with Carolin Strobl, Universität Zürich, Switzerland
  • 2018 (February) Research stay with Inga Schwabe, Tilburg University, Netherlands
  • 2014 (September-October) Research stay with Matthias von Davier, Educational Testing Service, Princeton, NJ, USA
  • 2008 (September-October) Research stay with Thomas D. Cook, Northwestern University, Evanston, IL, USA

Education

  • 2005-2010 Doctorate on "Modeling Method Effects" at Friedrich-Schiller-Universität Jena (supervisor: Rolf Steyer)
  • 2002-2003 MSc Research Methods in Psychology, University of Hertfordshire, UK
  • 1998-2004 Diplom in Psychology, Freie Universität Berlin

Awards

  • 2020 Early Career Award of the Psychometric Society
  • 2011 Gustav A. Lienert Dissertation Award of the Methods and Evaluation Division of the German Psychological Society (DGPs)

Editorships

  • since 2018 Associate Editor of Psychometrika
  • 2016-2019 Associate Editor of the British Journal of Mathematical and Statistical Psychology
  • since 2016 Editorial Board of the Zeitschrift für Psychologie

Service Roles

  • since 2020 Liaison professor of the Studienstiftung (German Academic Scholarship Foundation)
  • since 2020 Member of the Editorial Council of the Psychometric Society
  • since 2018 Member of the Executive Committee of the European Association of Methodology
  • since 2018 Faculty Member of the Berlin School of Mind and Brain
  • 2017-2019 Spokesperson of the Methods and Evaluation Division of the DGPs
  • 2011-2017 Treasurer of the Methods and Evaluation Division of the DGPs

Memberships

Current Courses in the Summer Semester 2023

  • Lecture: Methods of Quantitative Research
  • Seminar: Quantitative Research Methods II
  • Lecture: Evaluation Research in Psychotherapy
  • Seminar: Foundations of Psychological Research

Research Interests

  • Psychometrics
  • Log data analysis
  • Missing values
  • Causality

Research projects can be found here.

Publications

Journal Articles (peer-reviewed)

  • Ulitzsch, E., Pohl, S., Khorramdel, L., Kroehne, U., & von Davier, M. (2024). Using response times for joint modeling of careless responding and attentive response styles. Journal of Educational and Behavioral Statistics, 49(2), 173-206. https://doi.org/10.3102/10769986231173607
  • Boettcher, J., Heinrich, M., Boettche, M., Burchert, S., Glaesmer, H., Gouzoulis-Mayfrank, E., Heeke, C., Hernek, M., Knaevelsrud, C., Konnopka, A., Muntendorf, L., Nilles, H., Nohr, L., Pohl, S., Paskuy, S., Reinhardt, I., Sierau, S., Stammel, N., Wirz, C., … Wagner, B. (2024). Internet-based transdiagnostic treatment for emotional disorders in Arabic- and Farsi-speaking refugees: Study protocol of a randomized controlled trial. Trials, 25(1), 13. https://doi.org/10.1186/s13063-023-07845-5
  • Ranger, J., Wolgast, A., Much, S., Mutak, A., Krause, R., & Pohl, S. (2023). Disentangling different aspects of change in tests with the D-Diffusion model. Multivariate Behavioral Research, 58(5). https://doi.org/10.1080/00273171.2023.2171356
  • Schaeuffele, C., Heinrich, M., Behr, S., Fenski, F., Hammelrath, L., Zagorscak, P., Jansen, A., Pohl, S., Boettcher, J., & Knaevelsrud, C. (2022). Increasing the effectiveness of psychotherapy in routine care through blended therapy with transdiagnostic online modules (PsyTOM): Study protocol for a randomized controlled trial. Trials, 23(1), 830. https://doi.org/10.1186/s13063-022-06757-0
  • Schulze, D., Reuter, B., & Pohl, S. (2022). Measurement invariance: Dealing with the uncertainty in anchor item choice by model averaging. Structural Equation Modeling, 29(4), 521–530. https://doi.org/10.1080/10705511.2021.2012785
  • Ulitzsch, E., Pohl, S., Khorramdel, L., Kroehne, U., & von Davier, M. (2022). A response-time-based latent response mixture model for identifying and modeling careless and insufficient effort responding in survey data. Psychometrika, 87(2), 593–619. https://doi.org/10.1007/s11336-021-09817-7
  • Pohl, S., Schulze, D., & Stets, E. (2021). Partial measurement invariance: Extending and evaluating the cluster approach for identifying anchor items. Applied Psychological Measurement. https://doi.org/10.1177/01466216211042809
  • Ulitzsch, E., He, Q., & Pohl, S. (2021). Using sequence mining techniques for understanding incorrect behavioral patterns on interactive tasks. Journal of Educational and Behavioral Statistics, 47(1), 3-35. doi: 10.3102/10769986211010467
  • Pohl, S., Ulitzsch, E., & von Davier, M. (2021). Reframing country rankings in educational assessments. Science, 372(6540), 338-340. doi:10.1126/science.abd3300
  • Ranger, J., Kuhn, T., & Pohl, S. (2021). Effects of motivation on the accuracy and speed of responding in tests: The speed-accuracy tradeoff revisited. Measurement: Interdisciplinary Research and Perspectives, 19(1), 15-38. doi: 10.1080/15366367.2020.1750934
  • Ulitzsch, E., He, Q., Ulitzsch, V., Molter, H., Nichterlein, A., Niedermeier, R., & Pohl, S. (2021). Combining Clickstream Analyses and Graph-Modeled Data Clustering for Identifying Common Response Processes. Psychometrika, 86, 190-214. doi: 10.1007/s11336-020-09743-0
  • Ulitzsch, E., Penk, C., von Davier, M., & Pohl, S. (2021). Model meets reality: Validating a new behavioral measure for test-taking effort. Educational Assessment, 26(2), 104-124. doi: 10.1080/10627197.2020.1858786
  • Maia, D. D. A., Pohl, S., Okuda, P. M. M., Liu, T., Puglisi, M. L., Ploubidis, G., Eid, M., & Cogo-Moreira, H. (2020). Psychometric properties and optimizing of the Bracken School Readiness Assessment. Educational Assessment, Evaluation and Accountability, 1-13. doi: 10.1007/s11092-020-09339-3
  • Schulze, D., & Pohl, S. (2020). Finding clusters of measurement invariant items for continuous covariates. Structural Equation Modeling: A Multidisciplinary Journal, 1-10. doi: 10.1080/10705511.2020.1771186
  • Pohl, S., & Schulze, D. (2020). Assessing group comparisons or change over time under measurement non-invariance: The cluster approach for nonuniform DIF. Psychological Test and Assessment Modeling, 62(2), 281-303.
  • Pohl, S., & Becker, B. (2020). Performance of missing data approaches under nonignorable missing data conditions. Methodology, 16(2), 147-165. doi: 10.5964/meth.2805
  • Ulitzsch, E., von Davier, M., & Pohl, S. (2019). A hierarchical latent response model for inferences about examinee engagement in terms of guessing and item-level nonresponse. British Journal of Mathematical and Statistical Psychology, 73, 83-112. doi: 10.1111/bmsp.12188
  • Ulitzsch, E., von Davier, M., & Pohl, S. (2019). A Multiprocess item response model for not-reached items due to time limits and quitting. Educational and Psychological Measurement. 80(3), 522-547. doi:10.1177/0013164419878241
  • Ulitzsch, E., von Davier, M., & Pohl, S. (2019). Using response times for joint modeling of response and omission behavior. Multivariate Behavioral Research, 55(3), 425-453. doi:10.1080/00273171.2019.1643699
  • Schwabe, I., Gu, Z., Tijmstra, J., Hatemi, P., & Pohl, S. (2019). Psychometric modelling of longitudinal genetically-informative twin data. Frontiers in Genetics, 10, 837. doi:10.3389/fgene.2019.00837
  • Pohl, S., Ulitzsch, E., & von Davier, M. (2019). Using response times to model not-reached items due to time limits. Psychometrika, 84(3), 892-920. doi:10.1007/s11336-019-09669-2
  • Sengewald, M.-A., & Pohl, S. (2019). Compensation and amplification of attenuation bias in causal effect estimates. Psychometrika, 84(2), 589-610. doi:10.1007/s11336-019-09665-6
  • Sachse, K., Mahler, N., & Pohl, S. (2019). When nonresponse mechanisms change: Effects on trends and group comparisons in international large-scale assessments. Educational and Psychological Measurement, 79(4), 699-726. doi:10.1177/0013164419829196
  • Pohl, S., & von Davier, M. (2018). Commentary: On the importance of the speed-ability trade-off when dealing with not reached items. Frontiers in Psychology, 9:1988. doi: 10.3389/fpsyg.2018.01988
  • Sengewald, M.-A., Steiner, P. M., & Pohl, S. (2018). When does measurement error in covariates impact causal effect estimates? - Analytical derivations of different scenarios and an empirical illustration. British Journal of Mathematical and Statistical Psychology, 72(2), 244-270. doi: 10.1111/bmsp.12146
  • Köhler, C., Pohl, S., & Carstensen, C. H. (2017). Dealing with item nonresponse in large-scale cognitive assessments: The impact of missing data methods on estimated explanatory relationships. Journal of Educational Measurement, 54(4), 397-419. doi:10.1111/jedm.12154
  • Haberkorn, K., Pohl, S., & Carstensen, C. H. (2016). Incorporating different response formats of competence tests in an IRT model. Psychological Test and Assessment Modeling, 58(2), 223-252.
  • Pohl, S., Südkamp, A., Hardt, K., Carstensen, C. H., & Weinert, S. (2016). Testing Students with Special Educational Needs in Large-Scale Assessments: Psychometric Properties of Test Scores and Associations with Test Taking Behavior. Frontiers in Psychology, 7, 154. doi: 10.3389/fpsyg.2016.00154
  • Aßmann, C., Gaasch, C., Pohl, S., & Carstensen, C. H. (2015). Bayesian estimation in IRT models with missing values in background variables. Psychological Test and Assessment Modeling, 57(4), 595-618.
  • Köhler, C., Pohl, S., & Carstensen, C. H. (2015). Investigating mechanisms for missing responses in competence tests. Psychological Test and Assessment Modeling, 57(4), 499-522.
  • Südkamp, A., Pohl, S., & Weinert, S. (2015). Competence assessment of students with special educational needs: Identification of appropriate testing accommodations. Frontline Learning Research, 3(2), 1-25. doi:10.14786/flr.v3i2.130
  • Köhler, C., Pohl, S., & Carstensen, C. H. (2014). Taking the missing propensity into account when estimating competence scores: Evaluation of IRT models for non-ignorable omissions. Educational and Psychological Measurement, 75(5), 850-875. doi:10.1177/0013164414561785
  • Haberkorn, K., Lockl, K., Pohl, S., Ebert, S., & Weinert, S. (2014). Metacognitive knowledge in children at early elementary school. Metacognition and Learning, 9(3), 239-263. doi:10.1007/s11409-014-9115-1
  • Pohl, S., Gräfe, L., & Rose, N. (2014). Dealing with omitted and not reached items in competence tests - Evaluating approaches accounting for missing responses in IRT models. Educational and Psychological Measurement, 74(3), 423-452. doi:10.1177/0013164413504926
  • Pohl, S. (2014). Longitudinal multi-stage testing. Journal of Educational Measurement, 50(4), 447-468. doi: 10.1111/jedm.12028
  • Pohl, S., & Carstensen, C. (2013). Scaling the competence tests in the National Educational Panel Study – Many questions, some answers, and further challenges. Journal for Educational Research Online, 5(2), 189-216.
  • Raykov, T., & Pohl, S. (2013). Essential unidimensionality examination for multi-component scales: An interrelationship decomposition approach. Educational and Psychological Measurement, 73(4), 581-600. doi:10.1177/0013164412470451
  • Raykov, T., & Pohl, S. (2013). On studying common factor variance in multiple component measuring instruments. Educational and Psychological Measurement, 73(2), 191-209. doi:10.1177/0013164412458673
  • Cook, T. D., Pohl, S., & Steiner, P. M. (2011). Die relative Bedeutung der Kovariatenwahl, Reliabilität und Art der Datenanalyse für die Schätzung kausaler Effekte aus Beobachtungsdaten. Zeitschrift für Evaluation, 10(2), 203-224.
  • Cook, T. D., Steiner, P. M., & Pohl, S. (2010). How bias reduction is affected by covariate choice, unreliability, and mode of data analysis: Results from two types of within-study Comparisons. Multivariate Behavioral Research, 44(6), 828-847. doi:10.1080/00273170903333673
  • Pohl, S., & Steyer, R. (2010). Modeling common traits and method effects in multitrait-multimethod analysis. Multivariate Behavioral Research, 45(1), 45-72. doi:10.1080/00273170903504729
  • Pohl, S., Steiner, P. M., Eisermann, J., Soellner, R., & Cook, T. D. (2009). Unbiased causal inference from an observational study: Results of a within-study comparison. Educational Evaluation and Policy Analysis, 31(4), 463-479. doi:10.3102/0162373709343964
  • Vautier, S., & Pohl, S. (2009). Do balanced scales assess bipolar constructs? The case of the STAI scales. Psychological Assessment, 21(2), 187-193. doi: 10.1037/a0015312
  • Pohl, S., Steyer, R., & Kraus, K. (2008). Modelling method effects as individual causal effects. Journal of the Royal Statistical Society, Series A, 171(1), 41-63. doi:10.1111/j.1467-985X.2007.00517.x

Guest Editorships

  • Twardawski, M., Gollwitzer, M., Pohl, S., & Bosnjak, M. (2022). Special issue on "What Drives Second- and Third-Party Punishment? Conceptual Replications of the 'Intuitive Retributivism' Hypothesis". Zeitschrift für Psychologie, 230(2), 77-83.
  • Pohl, S., & Aßmann, C. (2015). Missing values in large scale assessment studies [Special issue]. Psychological Test and Assessment Modeling, 57(4), 469-471.

Book Chapters

  • Pohl, S., Sengewald, M.-A., & Steyer, R. (2016). Adjustment when Covariates are Fallible. In W. Wiedermann & A. von Eye (Eds.), Statistics and Causality: Methods for Applied Empirical Research (pp. 363-384). Hoboken, NJ: Wiley.
  • Aßmann, C., Carstensen, C. H., Gaasch, C., & Pohl, S. (2016). Estimation of plausible values considering partially missing background information: A data augmented MCMC approach. In H.-P. Blossfeld, J. von Maurice, M. Bayer, & J. Skopek (Eds.), Methodological Issues of Longitudinal Surveys (pp. 503-521). Wiesbaden: Springer VS.
  • Südkamp, A., Pohl, S., Heydrich, J., & Weinert, S. (2016). Including students with special educational needs in the competence assessment – Results on the comparability of test scores in reading. In H.-P. Blossfeld, J. von Maurice, M. Bayer, & J. Skopek (Eds.), Methodological Issues of Longitudinal Surveys (pp. 485-501). Wiesbaden: Springer VS.
  • Pohl, S., Haberkorn, K., & Carstensen, C. H. (2015). Measuring competencies across the lifespan: Challenges of linking test scores. In M. Stemmler, A. von Eye, & W. Wiedermann (Eds.), Dependent Data in Social Sciences Research: Forms, Issues, and Methods of Analysis (pp. 281-308). Cham: Springer.
  • Südkamp, A., Pohl, S., Hardt, K., Jordan, A.-K., & Duchhardt, C. (2015). Kompetenzmessung in den Bereichen Lesen und Mathematik bei Schülerinnen und Schülern mit sonderpädagogischem Förderbedarf. In P. Kuhl, P. Stanat, B. Lütje-Klose, C. Gresch, H. A. Pant, & M. Prenzel (Eds.), Inklusion von Schülerinnen und Schülern mit sonderpädagogischem Förderbedarf in Schulleistungserhebungen (pp. 243-272). Wiesbaden: Springer VS.
  • Haberkorn, K., Pohl, S., Carstensen, C., & Wiegand, E. (2015). Scoring of complex multiple choice items in NEPS competence tests. In H.-P. Blossfeld, J. von Maurice, M. Bayer, & J. Skopek (Eds.), Methodological Issues of Longitudinal Surveys (pp. 523-540). Wiesbaden: Springer VS.
  • Pohl, S., & Steyer, R. (2012). Modeling traits and method effects as latent variables. In S. Salzborn, E. Davidov, & J. Reinecke (Eds.), Methods, Theories, and Empirical Applications in the Social Sciences – Festschrift for Peter Schmidt (pp. 57-65). VS Verlag für Sozialwissenschaften.
  • Rose, N., Pohl, S., Böhme, H. F., & Steyer, R. (2010). Strukturgleichungsmodelle. In H. Holling & B. Schmitz (Eds.), Handbuch Statistik, Methoden und Evaluation (pp. 600-611). Göttingen: Hogrefe.
  • Nachtigall, C., Pohl, S., & Hartenstein, S. (2010). Univariate deskriptive Statistik. In H. Holling & B. Schmitz (Eds.), Handbuch Statistik, Methoden und Evaluation (pp. 275-287). Göttingen: Hogrefe.

Technical Reports

  • Krannich, M., Jost, O., Rohm, T., Koller, I., Pohl, S., Haberkorn, K., Carstensen, C. H., Fischer, L., & Gnambs, T. (2017). NEPS Technical Report for Reading: Scaling results of Starting Cohort 3 for grade 7 (NEPS Survey Papers No. 14). Bamberg, Germany: Leibniz Institute for Educational Trajectories, National Educational Panel Study.
  • Pohl, S., Stets, E., & Carstensen, C. H. (2017). Cluster-based anchor item identification and selection (NEPS Working Paper No. 68). Bamberg: Leibniz Institute for Educational Trajectories, National Educational Panel Study.
  • Aßmann, C., Carstensen, C. H., Gaasch, C., & Pohl, S. (2014). Estimation of plausible values using background variables with missing values: A data augmented MCMC approach (NEPS Working Paper No. 38). Bamberg: Leibniz Institute for Educational Trajectories, National Educational Panel Study.
  • Pohl, S., Haberkorn, K., & Hardt, K. (2014). NEPS Technical Report for Reading – Scaling results of Starting Cohort 5 for first-year students in main study 2010/11 (NEPS Working Paper No. 34). Bamberg: Leibniz Institute for Educational Trajectories, National Educational Panel Study.
  • Hardt, K., Pohl, S., & Haberkorn, K. (2013). NEPS Technical Report for Reading – Scaling results of Starting Cohort 6 for adults in main study 2010/11 (NEPS Working Paper No. 25). Bamberg: Otto-Friedrich-Universität, Nationales Bildungspanel.
  • Haberkorn, K., Pohl, S., Hardt, K., & Wiegand, E. (2012). NEPS Technical Report for Reading – Scaling results of Starting Cohort 4 in ninth grade (NEPS Working Paper No. 16). Bamberg: Otto-Friedrich-Universität, Nationales Bildungspanel.
  • Pohl, S., Haberkorn, K., Hardt, K., & Wiegand, E. (2012). NEPS Technical Report for Reading – Scaling results of Starting Cohort 3 in fifth grade (NEPS Working Paper No. 15). Bamberg: Otto-Friedrich-Universität, Nationales Bildungspanel.
  • Pohl, S., & Carstensen, C. (2012). NEPS Technical Report – Scaling the data of the competence tests (NEPS Working Paper No. 14). Bamberg: Otto-Friedrich-Universität, Nationales Bildungspanel.