Reducing Barriers to Clinical Adoption of Open-Source Radiotherapy Linac QA Tools in Resource-Limited Settings: Evaluation and Clinical Benchmarking
Abstract
Purpose
Open-source radiotherapy quality assurance (OSQA) tools are increasingly available and offer potential value in resource-constrained settings. However, clinical adoption remains limited by gaps in validation, standardization, and implementation guidance, rather than by software availability alone. We address this gap through structured identification and implementation-focused characterization of OSQA software packages, coupled with benchmarking of selected candidates on AAPM TG-142-aligned tests.
Methods
A reproducible search of public GitHub repositories (n=804), supplemented by targeted web search, identified candidate OSQA tools. Automated screening grouped related repositories into software families for characterization and benchmarking. These families were then manually screened for implementation readiness (license, maintenance activity, operating-system requirements, and product resources) and mapped to TG-142 QA domains. A subset was selected for benchmarking based on implementation readiness and overlap with routine linac QA tasks. Benchmarking assessed executability and quantitative agreement with commercial reference QA software, using clinical datasets from an established radiotherapy program.
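The readiness screen described above can be sketched as a simple filter over repository metadata. This is an illustrative sketch only: the field names, the two-year maintenance window, and the pass/fail logic are assumptions for demonstration, not the study's actual screening rubric.

```python
from datetime import datetime, timedelta

# Hypothetical repository metadata; field names and values are invented.
repos = [
    {"name": "qa-tool-a", "license": "MIT", "last_commit": "2024-11-02",
     "has_docs": True, "has_example_data": True},
    {"name": "qa-tool-b", "license": None, "last_commit": "2019-03-15",
     "has_docs": False, "has_example_data": False},
]

def implementation_ready(repo, reference_date, max_age_days=730):
    """Screen one repository on license, maintenance recency, and resources.

    The 730-day (two-year) maintenance window is an assumed threshold.
    """
    if repo["license"] is None:          # no license -> not clinically adoptable
        return False
    age = reference_date - datetime.strptime(repo["last_commit"], "%Y-%m-%d")
    if age > timedelta(days=max_age_days):  # stale project -> fails screen
        return False
    return repo["has_docs"] and repo["has_example_data"]

today = datetime(2025, 1, 1)
ready = [r["name"] for r in repos if implementation_ready(r, today)]
print(ready)  # → ['qa-tool-a']
```

In practice each criterion would be scored rather than treated as a hard gate, but a gate makes the triage logic explicit.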
Results
The search yielded 79 open-source repositories meeting the criteria for linac QA, of which 10 software packages were selected for benchmarking. Five packages demonstrated strong agreement with commercial software for Winston–Lutz analysis (mean deviation 0.14 mm). Four packages met the feasibility criterion (<1% deviation) for field-size and MLC positional-accuracy analysis, and three completed at least part of the CBCT analysis. Planar imaging and machine log-file analysis were each achievable in only one package. Across the available software, implementation readiness varied widely: maintenance activity, documentation quality, OS constraints, and example-dataset availability were inconsistent and often independent of advertised QA functionality.
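The agreement checks behind these results reduce to comparing open-source outputs against a commercial reference. A minimal sketch, using invented measurement values but the <1% feasibility criterion stated above:

```python
# Hypothetical field-size results (mm); only the <1% threshold comes from the text.
open_source = [99.6, 100.2, 150.1]   # open-source package output
commercial  = [100.0, 100.0, 150.0]  # commercial reference output

def percent_deviation(test, ref):
    """Absolute percent deviation of a test value from its reference."""
    return abs(test - ref) / ref * 100.0

deviations = [percent_deviation(t, r) for t, r in zip(open_source, commercial)]
meets_criterion = all(d < 1.0 for d in deviations)

# Mean absolute deviation in mm, as reported for the Winston-Lutz comparison.
mean_dev_mm = sum(abs(t - r) for t, r in zip(open_source, commercial)) / len(commercial)
print(meets_criterion, round(mean_dev_mm, 2))  # → True 0.23
```

The same pattern applies to each benchmarked test: choose a tolerance appropriate to the QA domain, then report both the pass/fail outcome and the underlying deviation.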
Conclusion
Several OSQA tools can achieve performance comparable to commercial software on key TG-142–aligned QA tests. However, QA test coverage alone does not indicate clinical readiness. These findings support the development of minimum evaluation standards and an implementation toolkit for safe, scalable adoption in resource-constrained settings.