Statistical Physics for Medical Diagnostics: Learning, Inference, and Optimization Algorithms

Abolfazl Ramezanpour et al. Diagnostics (Basel). 2020 Nov 19;10(11):972. doi: 10.3390/diagnostics10110972.

Abstract

It is widely believed that cooperation between clinicians and machines may address many of the decisional fragilities intrinsic to current medical practice. However, the realization of this potential will require more precise definitions of disease states as well as their dynamics and interactions. A careful probabilistic examination of symptoms and signs, including the molecular profiles of the relevant biochemical networks, will often be required for building an unbiased and efficient diagnostic approach. Analogous problems have been studied for years by physicists extracting macroscopic states of various physical systems by examining microscopic elements and their interactions. These valuable experiences are now being extended to the medical field. From this perspective, we discuss how recent developments in statistical physics, machine learning and inference algorithms are coming together to improve current medical diagnostic approaches.

Keywords: diagnostic process; disease progression; statistical physics.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
Uncovering (learning) macroscopic features (diagnosis) from microscopic sign variables: (a) using a powerful probabilistic model for a selective observation of additional signs and a careful anticipation of a few other sign values by simulating the diagnostic process; (b) using a microscopic model of disease evolution to estimate the likelihood of a disease hypothesis from the history (dynamics) of the observed signs. Here, empty circles indicate the unobserved signs, the filled circles are the observed signs with possibly different levels of activity, and the dashed circles show the anticipated sign values.
Figure 2
Probabilistic models of sign and disease variables: (a) A Bayesian belief network in the form of an acyclic directed graph showing the connections between the disease (D_a, D_b, ...) and sign (S_i, S_j, ...) variables. The model is completed by the conditional probability distribution P(S|D). (b) An interaction graph of disease variables (leftmost circles) and sign variables (rightmost circles) related by M_a one-disease and M_ab two-disease interaction factors (middle squares), in addition to interactions induced by the leak probability (right square) and the prior probability of disease (left square). (Adapted from reference [73]).
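
The interaction-graph model of panel (b) is specified in reference [73] through the one-disease and two-disease factors M_a and M_ab together with a leak probability. As a rough illustration of how a conditional distribution P(S|D) with a leak term can be parameterized, the sketch below uses a standard noisy-OR Bayesian-network form; this is an assumption for illustration only, not the factors of the paper, and the names p_leak and p_activate as well as all numerical values are hypothetical.

import numpy as np

# Sketch of a noisy-OR parameterization of P(S_i = 1 | D); the factors M_a, M_ab of
# reference [73] are more general, so this is an illustrative assumption only.
# p_leak[i]       : probability that sign i appears with no disease active (leak)
# p_activate[i,a] : probability that disease a alone activates sign i

rng = np.random.default_rng(0)
N_S, N_D = 20, 5                       # numbers of signs and diseases (as in Figure 3)
p_leak = np.full(N_S, 0.05)            # hypothetical leak probabilities
p_activate = rng.uniform(0.1, 0.9, size=(N_S, N_D))

def sign_probabilities(d):
    """P(S_i = 1 | D = d) for a binary disease configuration d of length N_D."""
    # A sign stays off only if the leak and every active disease all fail to trigger it.
    fail = (1.0 - p_leak) * np.prod(np.where(d[None, :] == 1, 1.0 - p_activate, 1.0), axis=1)
    return 1.0 - fail

print(sign_probabilities(np.array([1, 0, 0, 1, 0])))   # example disease hypothesis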
Figure 3
The impact of observing the most polarizing signs on the disease probabilities. The numbers of signs and diseases in this example are N_S = 20 and N_D = 5, respectively. The probabilistic sign–disease model is constructed by using synthetic conditional probabilities P(S_i|D_a) and P(S_i|D_a, D_b) that are concentrated around the sign values randomly assigned to the diseases. The disease probabilities are computed exactly by an exhaustive algorithm (more details can be found in reference [84]). Given the N_O = 4 initially observed signs, the algorithm anticipates the values of T other signs that would make the disease probabilities more decisive (closer to zero or one).
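
For models as small as the one in Figure 3, the disease probabilities can indeed be obtained by exhaustive enumeration over all disease configurations. The sketch below (reusing sign_probabilities, N_S, and N_D from the previous sketch, with a hypothetical prior p_prior) shows one simple way to compute the exact marginals and to greedily pick a next sign whose observation is expected to push them toward zero or one; the actual model and selection criterion of reference [84] are more refined, so treat this only as a way to make the idea concrete.

from itertools import product
import numpy as np

# Exhaustive inference over all 2^N_D disease configurations (feasible for N_D = 5).
# Reuses sign_probabilities, N_S and N_D from the sketch above; p_prior is hypothetical.

p_prior = np.full(N_D, 0.1)                            # assumed prior probabilities P(D_a = 1)
configs = [np.array(d) for d in product([0, 1], repeat=N_D)]

def joint_weight(d, observed):
    """Unnormalized P(D = d, observed signs), with observed = {sign index: 0 or 1}."""
    prior = np.prod(np.where(d == 1, p_prior, 1.0 - p_prior))
    p_s = sign_probabilities(d)
    like = np.prod([p_s[i] if v else 1.0 - p_s[i] for i, v in observed.items()])
    return prior * like

def marginals(observed):
    """Exact marginals P(D_a = 1 | observed signs) by exhaustive enumeration."""
    w = np.array([joint_weight(d, observed) for d in configs])
    return (w[:, None] * np.vstack(configs)).sum(axis=0) / w.sum()

def most_polarizing_sign(observed):
    """Next unobserved sign whose expected observation pushes the marginals closest to 0 or 1."""
    w_obs = sum(joint_weight(d, observed) for d in configs)
    best, best_score = None, -np.inf
    for i in set(range(N_S)) - set(observed):
        score = 0.0
        for v in (0, 1):
            extended = {**observed, i: v}
            p_v = sum(joint_weight(d, extended) for d in configs) / w_obs   # P(S_i = v | observed)
            score += p_v * np.abs(marginals(extended) - 0.5).sum()          # expected polarization
        if score > best_score:
            best, best_score = i, score
    return best

observed = {0: 1, 3: 0, 7: 1, 12: 0}                   # N_O = 4 hypothetical initial observations
print(marginals(observed), most_polarizing_sign(observed))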
Figure 4
How the initial number of observed signs determines the range of useful predictions with a probabilistic model. (a) The difference δP(t) = P(T_R ≤ t) − P(T_W ≤ t) between the cumulative probabilities of the first correct and incorrect diagnosis times (T_R and T_W, respectively) is plotted against the number of observations t for different numbers of initial observations, N_O(0). (b) δP(50), i.e. δP(t) at a sufficiently large value of t, is plotted against N_O(0). The numbers of signs and diseases in this example are N_S = 500 and N_D = 50, respectively. (Adapted from reference [84]).
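
The curves in Figure 4 summarize many simulated diagnostic runs; given recorded samples of the first correct and first wrong diagnosis times, δP(t) is simply a difference of empirical cumulative distributions, as the short sketch below illustrates. The sample arrays here are randomly generated placeholders, not data from the paper.

import numpy as np

# delta_P(t) = P(T_R <= t) - P(T_W <= t), estimated from samples of the first correct (T_R)
# and first wrong (T_W) diagnosis times collected over repeated simulated diagnostic runs.

def delta_P(T_R, T_W, t):
    T_R, T_W = np.asarray(T_R), np.asarray(T_W)
    return np.mean(T_R <= t) - np.mean(T_W <= t)

rng = np.random.default_rng(1)
T_R_samples = rng.integers(1, 40, size=1000)   # placeholder samples; a real experiment would
T_W_samples = rng.integers(1, 60, size=1000)   # record these times from the simulations
print([round(delta_P(T_R_samples, T_W_samples, t), 3) for t in (10, 25, 50)])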
Figure 5
A diagnostic process that starts with the step-by-step approach and then switches to the batch approach after a phase transition to an ordered phase. In the disordered phase, the probability distribution of the signs is described by a single pure Gibbs state in which the observed signs on average give no information about the values of the unobserved signs. More observations can lead to a phase transition to an ordered phase in which there are multiple pure Gibbs states that provide useful information about the unobserved signs.


References

    1. Lynch C.J., Liston C. New machine-learning technologies for computer-aided diagnosis. Nat. Med. 2018;24:1304-1305. doi: 10.1038/s41591-018-0178-4.
    2. Wainberg M., Merico D., Delong A., Frey B.J. Deep learning in biomedicine. Nat. Biotechnol. 2018;36:829-838. doi: 10.1038/nbt.4233.
    3. Yu K.H., Beam A.L., Kohane I.S. Artificial intelligence in healthcare. Nat. Biomed. Eng. 2018;2:719-731. doi: 10.1038/s41551-018-0305-z.
    4. Topol E.J. High-performance medicine: the convergence of human and artificial intelligence. Nat. Med. 2019;25:44-56. doi: 10.1038/s41591-018-0300-7.
    5. Kelly C.J., Karthikesalingam A., Suleyman M., Corrado G., King D. Key challenges for delivering clinical impact with artificial intelligence. BMC Med. 2019;17:195. doi: 10.1186/s12916-019-1426-2.
