Expressive power of tensor-network factorizations for probabilistic modeling

Ivan Glasser, Ryan Sweke, Nicola Pancotti, Jens Eisert, Ignacio Cirac

Advances in Neural Information Processing Systems 32 (NeurIPS 2019), pp. 1498-1510. Curran Associates, Inc. Editors: H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, R. Garnett. Presented as a poster.

Abstract: Tensor-network techniques have recently proven useful in machine learning, both as a tool for the formulation of new learning algorithms and for enhancing the mathematical understanding of existing methods. Inspired by these developments, and the natural correspondence between tensor networks and probabilistic graphical models, we provide a rigorous analysis of the expressive power of various tensor-network factorizations of discrete multivariate probability distributions. These factorizations include non-negative tensor trains/MPS, which are in correspondence with hidden Markov models, and Born machines, which are naturally related to the probabilistic interpretation of quantum circuits. When used to model probability distributions, they exhibit tractable likelihoods and admit efficient learning algorithms. Interestingly, we prove that there exist probability distributions for which there are unbounded separations between the resource requirements of some of these tensor-network factorizations. Of particular interest, using complex instead of real tensors can lead to an arbitrarily large reduction in the number of parameters of the network.
Additionally, we introduce locally purified states (LPS), a new factorization inspired by techniques for the simulation of quantum systems, with provably better expressive power than all other representations considered. The ramifications of this result are explored through numerical experiments.
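To make concrete the two factorizations the abstract contrasts, here is a minimal NumPy sketch (ours, not from the paper) of a discrete joint distribution parameterized first as a non-negative tensor train/MPS and then as a Born machine, where probabilities are squared moduli of a complex TT contraction. Core shapes follow standard TT conventions; all variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, r = 4, 2, 3  # n variables, each taking d values, TT-rank r

# --- Non-negative TT/MPS: p(x) proportional to a chain of matrix products ---
cores = [rng.random((1, d, r))] \
      + [rng.random((r, d, r)) for _ in range(n - 2)] \
      + [rng.random((r, d, 1))]

def tt_weight(x):
    """Unnormalized probability of a configuration x = (x1, ..., xn)."""
    m = np.ones((1, 1))
    for core, xi in zip(cores, x):
        m = m @ core[:, xi, :]  # contract the chain left to right
    return m[0, 0]

# Normalizer Z: summing each core over its physical index first gives the
# sum over all d**n configurations with only n small matrix products.
m = np.ones((1, 1))
for core in cores:
    m = m @ core.sum(axis=1)
Z = m[0, 0]

def prob(x):
    return tt_weight(x) / Z

# --- Born machine: complex cores, probability = |TT contraction|^2 ---
born_cores = [rng.standard_normal(c.shape) + 1j * rng.standard_normal(c.shape)
              for c in cores]

def born_weight(x):
    m = np.ones((1, 1), dtype=complex)
    for core, xi in zip(born_cores, x):
        m = m @ core[:, xi, :]
    return abs(m[0, 0]) ** 2  # Born rule: squared modulus of the amplitude

# Brute-force normalizer (fine for this toy size; a transfer-matrix
# contraction keeps it efficient in general).
Zb = sum(born_weight(x) for x in np.ndindex(*(d,) * n))

total_tt = sum(prob(x) for x in np.ndindex(*(d,) * n))
total_born = sum(born_weight(x) / Zb for x in np.ndindex(*(d,) * n))
```

Both models assign a tractable likelihood to any configuration; the Born machine's complex cores are what enable the parameter-count separations over real tensors that the abstract highlights.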