Automatic Breast Cancer WSI Grading Based on TransMIL Framework under Weak Supervision

Authors

  • Arthur Miller, Institute for Biomedical Engineering, ETH Zurich, Zurich 8092, Switzerland
  • Jun-Ho Lee, Institute for Biomedical Engineering, ETH Zurich, Zurich 8092, Switzerland

DOI:

https://doi.org/10.71465/fht688

Keywords:

Breast cancer grading, Whole Slide Images (WSI), TransMIL framework, Weak supervision, Multiple Instance Learning (MIL), Transformer-based self-attention

Abstract

The grading of breast cancer from whole slide images (WSIs) is a pivotal component of histopathological diagnosis and prognosis, directly influencing therapeutic strategies. Traditional manual grading, based primarily on the Nottingham Grading System, is labor-intensive, subjective, and prone to inter-observer variability. While deep learning has shown promise in automating this process, the gigapixel resolution of WSIs and the scarcity of pixel-level annotations pose significant computational and logistical challenges. This paper investigates the application of TransMIL, a Transformer-based Multiple Instance Learning (MIL) framework, to the automatic grading of breast cancer under weak supervision. Unlike conventional MIL methods that assume instances are independent, TransMIL leverages self-attention to model morphological and spatial dependencies between patches across the entire slide. Because it requires only slide-level labels, the proposed method eliminates the need for expensive region-of-interest annotations. We present a comprehensive analysis of the framework's architecture, including the incorporation of pyramid position encoding and conditional convolution to capture multi-scale context. Experimental validation on public datasets demonstrates that the TransMIL-based approach achieves superior classification performance compared to standard MIL baselines, offering a robust and interpretable solution for computational pathology.
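The core idea described above, treating a slide as a bag of patch embeddings and aggregating them through self-attention with a class token that yields the slide-level prediction, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the embedding dimension, the random projection weights, and the three-way grade head are illustrative assumptions, and the real TransMIL additionally uses Nystrom-approximated attention and a pyramid position encoding generator.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
d = 16                                # embedding dimension (hypothetical)
patches = rng.normal(size=(50, d))    # 50 patch embeddings from one slide

# Prepend a class token so the slide-level representation can aggregate
# information from every patch via attention (weights random here, learned
# in practice).
cls = rng.normal(size=(1, d))
x = np.vstack([cls, patches])         # (51, d)

# Single-head self-attention: the (51, 51) attention map models pairwise
# dependencies between patches, unlike independent-instance MIL pooling.
Wq, Wk, Wv = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))
q, k, v = x @ Wq, x @ Wk, x @ Wv
attn = softmax(q @ k.T / np.sqrt(d))
out = attn @ v

slide_repr = out[0]                   # class-token row = slide embedding
logits = slide_repr @ rng.normal(size=(d, 3))  # head for 3 tumour grades
grade = int(np.argmax(logits))
print(grade)
```

Training would backpropagate a loss on `logits` against the slide-level label only, which is exactly what makes the approach weakly supervised: no patch ever needs its own annotation.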

Downloads

Download data is not yet available.


Published

2026-02-01