




Bridging the gap between classical and quantum many-body information dynamics. (submitted to Phys. Rev. X)


Malz, Daniel
Theory, Max Planck Institute of Quantum Optics, Max Planck Society;
MCQST - Munich Center for Quantum Science and Technology, External Organizations;


Pizzi, A., Malz, D., Nunnenkamp, A., & Knolle, J. (submitted). Bridging the gap between classical and quantum many-body information dynamics. (submitted to Phys. Rev. X).

Cite as: http://hdl.handle.net/21.11116/0000-000A-63D7-4
The fundamental question of how information spreads in closed quantum many-body systems is often addressed through the lens of the bipartite entanglement entropy, a quantity that describes correlations in a comprehensive (nonlocal) way. Among the most striking features of the entanglement entropy are its unbounded linear growth in the thermodynamic limit, its asymptotic extensivity in finite-size systems, and the possibility of measurement-induced phase transitions, all of which have no obvious classical counterpart. Here, we show how these key qualitative features also emerge naturally in classical information spreading, as long as one treats the classical many-body problem on par with the quantum one, that is, by explicitly accounting for the exponentially large classical probability distribution. Our analysis is supported by extensive numerics on prototypical cellular automata and Hamiltonian systems, for which we focus on the classical mutual information and also introduce a 'classical entanglement entropy'. Our study sheds light on the nature of information spreading in classical and quantum systems, and opens new avenues for quantum-inspired classical approaches across physics, information theory, and statistics.
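To make the abstract's central idea concrete, the following is a minimal illustrative sketch (not the authors' code) of what "treating the classical problem on par with the quantum one" can mean: one stores the full probability distribution over all 2^N configurations of N classical bits, evolves it under a deterministic cellular automaton (here the standard elementary rule 90 with periodic boundaries, chosen only as a familiar example), and tracks the bipartite Shannon mutual information I(A:B) = S_A + S_B - S_AB across a cut. All function names and parameters are ours, for illustration only.

```python
import numpy as np

def rule90_step(x, n):
    """Rule-90 update of n-bit configurations x (ints, site i = bit i),
    with periodic boundaries: new bit i = old bit (i-1) XOR old bit (i+1)."""
    mask = (1 << n) - 1
    rot_left = ((x << 1) | (x >> (n - 1))) & mask   # bit i <- old bit i-1
    rot_right = (x >> 1) | ((x & 1) << (n - 1))     # bit i <- old bit i+1
    return rot_left ^ rot_right

def shannon(p):
    """Shannon entropy (in bits) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p, n, cut):
    """I(A:B) across the cut: A = bits [0, cut), B = bits [cut, n)."""
    states = np.arange(len(p))
    pa = np.bincount(states & ((1 << cut) - 1), weights=p,
                     minlength=1 << cut)            # marginal on block A
    pb = np.bincount(states >> cut, weights=p,
                     minlength=1 << (n - cut))      # marginal on block B
    return shannon(pa) + shannon(pb) - shannon(p)

if __name__ == "__main__":
    n = 8
    # Biased product distribution: each bit is 1 with probability 0.3,
    # so the bipartite mutual information starts at zero.
    p = np.array([1.0])
    for _ in range(n):
        p = np.kron(p, np.array([0.7, 0.3]))
    images = rule90_step(np.arange(1 << n), n)
    for t in range(5):
        print(t, mutual_information(p, n, n // 2))
        # Push the whole distribution through one deterministic CA step:
        # p'(y) = sum over x with f(x) = y of p(x).
        p = np.bincount(images, weights=p, minlength=1 << n)
```

Because the full 2^N-dimensional distribution is stored explicitly, the memory cost of this classical treatment scales exponentially in N, mirroring the quantum state vector; that parallel is what allows the correlation measures above to display quantum-like features.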