Table 1 Recommendations for validating WSI systems [25, 47]

From: An update on applications of digital pathology: primary diagnosis; telepathology, education and research

Recommendation for validating WSI systems (PMID: 2363490; https://www.fda.gov/media/90791/download)

Training in WSI should be offered to participants

In a 2022 study by Rizzo et al., in which 45 validation articles were reviewed for diagnostic issues, 9% of the articles reported misinterpretation of diagnoses and 6% reported a lack of diagnostic confidence, underscoring the importance of training [46]. Similarly, a 2024 study by Koefoed-Nielsen et al. on the implementation of digital pathology at two departments stressed the need for more system-specific training before implementation [25].

A sample set of at least 60 routine cases for one application, plus another 20 cases for each additional application, should be used

A washout period of at least 2 weeks should be observed between viewing the slide sets in each condition

The performance of WSI review is considered non-inferior to light microscopy if two conditions hold for the same observer: the upper bound of the two-sided 95% confidence interval of the difference between the overall major discrepancy rates of WSI review and light microscopy review is 4% or less, and the upper bound of the two-sided 95% confidence interval of the overall major discrepancy rate of WSI review (relative to the reference diagnosis) is 7% or less. If concordance falls below 95%, laboratories should investigate the cause and take corrective action
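As a worked illustration of these acceptance criteria, the sketch below computes the two confidence-interval upper bounds for a hypothetical 60-case validation set. The case counts are invented for illustration, and the interval methods (a Wilson score interval for the single discrepancy rate, a Wald interval for the rate difference) are assumptions; the guidance cited above does not prescribe a particular interval construction.

```python
import math

Z = 1.96  # two-sided 95% critical value of the standard normal distribution

def wilson_upper(k: int, n: int, z: float = Z) -> float:
    """Upper bound of the two-sided 95% Wilson score interval for k/n."""
    p = k / n
    centre = p + z**2 / (2 * n)
    margin = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (centre + margin) / (1 + z**2 / n)

def wald_diff_upper(k1: int, n1: int, k2: int, n2: int, z: float = Z) -> float:
    """Upper bound of the two-sided 95% Wald interval for p1 - p2."""
    p1, p2 = k1 / n1, k2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return (p1 - p2) + z * se

# Hypothetical data: 60 cases read on WSI and on light microscopy (LM),
# each compared against the reference diagnosis.
n = 60
wsi_major = 2  # major discrepancies on WSI review (assumed)
lm_major = 1   # major discrepancies on LM review (assumed)

diff_ub = wald_diff_upper(wsi_major, n, lm_major, n)
rate_ub = wilson_upper(wsi_major, n)
concordance = 1 - wsi_major / n

print(f"Upper 95% CI bound, WSI-LM discrepancy difference: {diff_ub:.1%} (criterion: <= 4%)")
print(f"Upper 95% CI bound, WSI major discrepancy rate:    {rate_ub:.1%} (criterion: <= 7%)")
print(f"Concordance: {concordance:.1%} (investigate if < 95%)")
print("Non-inferior" if diff_ub <= 0.04 and rate_ub <= 0.07 else "Non-inferiority not demonstrated")
```

Running this hypothetical example fails both criteria: with only 60 cases, even one or two major discrepancies push the Wilson upper bound above 7%, which illustrates how strict the thresholds are at the minimum recommended sample size and why larger validation sets yield tighter intervals.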