Individual participant data modeling approaches for clinical risk model validation and bias adjustment
Translated title:
Individual participant data modeling approaches for clinical risk model validation and bias adjustment
Author:
Chen, Yiyao
Year:
2022
Document type:
Dissertation
Faculty/School:
TUM School of Life Sciences
Advisor:
Ankerst, Donna (Prof., Ph.D.)
Referee:
Ankerst, Donna (Prof., Ph.D.); Choi, Byeongyeob (Prof., Ph.D.)
Language:
en
Subject group:
BIO Biowissenschaften
TUM classification:
BIO 110; MAT 620
Abstract:
Heterogeneity between the validation data and the training data used to build clinical risk prediction models biases estimates of the models' performance on validation samples. A framework is developed to accommodate selection bias arising from heterogeneous distributions of risk factors and verification bias arising from different verification mechanisms between training and validation cohorts. The adjustments result in weighted versions of the usual performance metrics, with different weights addressing the different types of bias.
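To illustrate the weighted performance metrics described in the abstract, the following is a minimal Python sketch, not the thesis implementation: it assumes the bias adjustments take the form of per-subject inverse-probability weights (for example, the product of a selection weight and a verification weight) and computes a weighted AUC as weighted concordance over case-control pairs. The function name weighted_auc and the toy data are hypothetical.

import numpy as np

def weighted_auc(y, p, w):
    """Weighted AUC (concordance) over case-control pairs.

    y : binary outcomes among verified subjects (0/1)
    p : predicted risks from the training-cohort model
    w : per-subject weights, e.g. the product of an inverse
        probability-of-selection weight and an inverse
        probability-of-verification weight (illustrative assumption)
    """
    y, p, w = map(np.asarray, (y, p, w))
    cases, controls = p[y == 1], p[y == 0]
    w_cases, w_controls = w[y == 1], w[y == 0]
    # pairwise comparison of case vs. control predictions; ties count 1/2
    diff = cases[:, None] - controls[None, :]
    concordant = (diff > 0) + 0.5 * (diff == 0)
    pair_weights = w_cases[:, None] * w_controls[None, :]
    return float((pair_weights * concordant).sum() / pair_weights.sum())

# hypothetical toy example
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
p = np.clip(0.3 * y + rng.uniform(0, 0.7, 200), 0, 1)
w = rng.uniform(0.5, 2.0, 200)  # placeholder inverse-probability weights
print(round(weighted_auc(y, p, w), 3))

Setting all weights to 1 recovers the usual unweighted AUC; different weight definitions address selection bias and verification bias, respectively.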
Translated abstract:
Heterogeneity between the validation data and the training data used to build the clinical risk prediction models biases the performance of the models on the validation samples. A framework is developed to account for selection bias due to heterogeneous distributions of risk factors and verification bias due to differing verification mechanisms between training and validation cohorts.