MoCo Pretraining Improves Representations and Transferability of Chest X-ray Models

Momentum Contrast (MoCo) can leverage unlabeled data to produce pretrained models for subsequent fine-tuning on labeled data. While MoCo has demonstrated promising results on natural image classification tasks, its application to medical imaging tasks such as chest X-ray interpretation has been limited. We conduct MoCo pretraining on CheXpert, a large labeled dataset of chest X-rays, followed by supervised fine-tuning experiments on the pleural effusion classification task. We find that a linear model trained on MoCo-pretrained representations outperforms one trained on representations without MoCo pretraining by an AUC of 0.096 (95% CI 0.061, 0.130) at the 0.1% label fraction. We observe similar gains on a small external chest X-ray dataset (the Shenzhen dataset for tuberculosis) when MoCo pretraining is performed on the source dataset (CheXpert), which suggests that pretraining on a large source dataset yields representations that transfer to a smaller target task. Our study demonstrates that MoCo pretraining provides high-quality representations and transferable initializations for chest X-ray interpretation.
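The headline result concerns linear evaluation: a single linear classifier trained on frozen MoCo-pretrained features. Below is a minimal PyTorch sketch of that setup under stated assumptions; the DenseNet-121 backbone, the checkpoint path, and the dummy batch are hypothetical placeholders, not the authors' exact code, and the MoCo pretraining itself is assumed to have already produced the checkpoint.

```python
# Minimal sketch of linear evaluation on frozen MoCo-pretrained features
# (binary pleural effusion classification). Backbone choice, checkpoint path,
# and data are placeholders/assumptions, not the paper's released code.
import torch
import torch.nn as nn
import torchvision

# Build the backbone and swap its classifier for an identity so it emits features.
backbone = torchvision.models.densenet121(weights=None)
feature_dim = backbone.classifier.in_features
backbone.classifier = nn.Identity()
# state = torch.load("moco_pretrained_densenet121.pt")   # hypothetical checkpoint
# backbone.load_state_dict(state, strict=False)

# Freeze the backbone: only the linear head is trained.
for p in backbone.parameters():
    p.requires_grad = False
backbone.eval()

# Linear head mapping frozen features to a single pleural-effusion logit.
linear_head = nn.Linear(feature_dim, 1)
optimizer = torch.optim.Adam(linear_head.parameters(), lr=1e-3)
criterion = nn.BCEWithLogitsLoss()

def train_step(images, labels):
    """One optimization step on the linear head; the backbone stays frozen."""
    with torch.no_grad():
        feats = backbone(images)              # (B, feature_dim) frozen features
    logits = linear_head(feats).squeeze(1)
    loss = criterion(logits, labels.float())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage with a dummy batch standing in for a small labeled fraction of CheXpert.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))
print(train_step(images, labels))
```

The same frozen-backbone head could be evaluated on an external dataset such as Shenzhen to probe transferability, which mirrors the comparison reported in the abstract.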

Links: PDF - Abstract

Code:

None

Keywords: MoCo - dataset - pretraining - chest X-ray
