U.S. Department of Energy


Uncertainty Toolbox: A Software Toolbox for Quantifying Uncertainty and More


The Uncertainty Toolbox, a popular open-source library for uncertainty quantification and calibration, is a valuable tool for fusion and other research.

Fusion Energy Sciences

January 13, 2026
The Uncertainty Toolbox provides a suite of evaluation, visualization, and recalibration functionalities for uncertainty quantification.
Image courtesy of Chung, Y., Char, I., Guo, H., Schneider, J., & Neiswanger, W., Uncertainty toolbox: an open-source library for assessing, visualizing, and improving uncertainty quantification. arXiv preprint arXiv:2109.10254 (2021).

The Science

Predictive uncertainty arises whenever a learned model is used to make predictions. Learned models are artificial intelligence (AI) models that change to reflect new data. Quantifying predictive uncertainty means estimating how likely specific outcomes are when the model does not fully know a situation. Doing this accurately is especially important in safety-critical settings such as fusion energy devices. For example, fusion energy researchers have recently used learned models based on deep reinforcement learning (DRL), a type of AI algorithm, to control fusion plasmas. Properly modeling the uncertainty in these models is crucial for the performance of plasma controls. However, evaluating predictive uncertainty is difficult and requires investigating multiple sources of uncertainty. To help, researchers developed the Uncertainty Toolbox. This open-source software toolbox provides tools to assess the quality of predictive uncertainty, including evaluation metrics, data visualizations, and methods to recalibrate predictive uncertainty.
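As a rough illustration of the workflow the toolbox supports, the Python sketch below scores a set of Gaussian predictions (means and standard deviations) against observations. The data are made up, and the function names follow the project's documented usage but should be checked against the current release of the uncertainty_toolbox package.

# Minimal sketch (not an official example): evaluating Gaussian predictions
# with the open-source uncertainty_toolbox package. The data are synthetic
# and made up for illustration; verify function names against the current
# release before use.
import numpy as np
import matplotlib.pyplot as plt
import uncertainty_toolbox as uct

# Toy heteroscedastic regression data (hypothetical)
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = np.sin(x) + rng.normal(scale=0.05 + 0.15 * x / 10, size=x.shape)

# Suppose a model produced these Gaussian predictive distributions
pred_mean = np.sin(x)              # predicted means
pred_std = np.full_like(x, 0.15)   # predicted standard deviations

# Evaluation: accuracy, calibration, sharpness, and proper scoring rule metrics
metrics = uct.metrics.get_all_metrics(pred_mean, pred_std, y)

# Visualization: average calibration (reliability) diagram
uct.viz.plot_calibration(pred_mean, pred_std, y)
plt.show()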

The Impact

Uncertainty Toolbox is valuable for researchers not only in uncertainty quantification, but also in machine learning and the physical sciences. It is the most popular open-source code repository on GitHub for uncertainty quantification and calibration, and its capabilities have supported many applications. For example, the toolbox has aided research on novel algorithms for calibration, model-based reinforcement learning, and uncertainty quantification applications in the physical sciences, among others. The research community contributes to maintaining and updating the toolbox, and its impact is expected to grow in the future.

Summary

Predictive uncertainty is an inherent challenge when deploying AI and machine learning algorithms, particularly in safety-critical applications such as plasma control. Uncertainty Toolbox is a practical tool in this space. The functionalities of this toolbox include evaluation metrics (e.g., check score, likelihood score, interval score, average calibration, adversarial group calibration), visualization functions (e.g., prediction intervals, reliability diagrams), and implementations of recalibration algorithms. This enables researchers to conduct thorough investigations into the various facets of predictive uncertainty.
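To make two of the metrics above concrete, the sketch below computes an average calibration error and a check (pinball) score from their standard definitions, assuming Gaussian predictive distributions. It is written from scratch for illustration; it is not the toolbox's own implementation, and the helper names are made up for this sketch.

# Illustrative definitions of two of the metrics named above, assuming
# Gaussian predictive distributions. Written from the standard formulas;
# NOT the toolbox's implementation, and the helper names are hypothetical.
import numpy as np
from scipy.stats import norm

def average_calibration_error(pred_mean, pred_std, y, n_levels=99):
    """Mean |observed - expected| coverage over expected proportions p."""
    expected = np.linspace(0.01, 0.99, n_levels)
    # p-th predictive quantile of N(mean, std) for every data point
    quantiles = pred_mean[None, :] + pred_std[None, :] * norm.ppf(expected)[:, None]
    observed = (y[None, :] <= quantiles).mean(axis=1)
    return float(np.abs(observed - expected).mean())

def check_score(pred_mean, pred_std, y, n_levels=99):
    """Pinball (check) loss averaged over quantile levels and data points."""
    levels = np.linspace(0.01, 0.99, n_levels)
    q = pred_mean[None, :] + pred_std[None, :] * norm.ppf(levels)[:, None]
    diff = y[None, :] - q
    # q-weighted penalty above the predicted quantile, (1 - q)-weighted below
    return float(np.where(diff >= 0, levels[:, None] * diff,
                          (levels[:, None] - 1.0) * diff).mean())

With the arrays from the earlier sketch, average_calibration_error(pred_mean, pred_std, y) is near zero for well-calibrated predictions and grows as the predicted standard deviations become over- or under-confident; lower check scores likewise indicate better quantile predictions. The toolbox's own averaging conventions may differ slightly.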

The significance of this toolbox extends beyond uncertainty quantification, as it has made contributions to research in general ML as well as the physical sciences. As one of the leading open-source repositories on GitHub for uncertainty quantification and calibration, it is widely utilized by the research community and maintained by community contributors. As novel algorithms are continuously added into the toolbox, its impact in further accelerating research into uncertainty quantification and its applications is expected to continue and expand.

Contact

Jeff Schneider, Carnegie Mellon University, [email protected]

Funding

This material is based on work supported by the Department of Energy Office of Science, Office of Fusion Energy Sciences and the National Science Foundation.

Publications

Chung, Y., Neiswanger, W., Char, I., & Schneider, J. (2021). Beyond pinball loss: Quantile methods for calibrated uncertainty quantification. Advances in Neural Information Processing Systems, 34, 10971-10984.

Chung, Y., Char, I., Guo, H., Schneider, J., & Neiswanger, W. (2021). Uncertainty Toolbox: An open-source library for assessing, visualizing, and improving uncertainty quantification. arXiv preprint arXiv:2109.10254. [DOI: 10.48550/arXiv.2109.10254]

Char, I., Chung, Y., Shah, R., Neiswanger, W., & Schneider, J. (2023). Correlated trajectory uncertainty for adaptive sequential decision making. In NeurIPS 2023 Workshop on Adaptive Experimental Design and Active Learning in the Real World.

Related Links

Uncertainty Toolbox web site
