Tensor networks for active inference with discrete observation spaces

Samuel T. Wauthier, Bram Vanhecke, Tim Verbelen, Bart Dhoedt

Publications: Contribution to conference › Paper

Abstract

In recent years, quantum physics-inspired tensor networks have seen an explosion in use cases. While these networks were originally developed to model many-body quantum systems, their usage has expanded into the field of machine learning, where they are often used as an alternative to neural networks. In a similar way, the neuroscience-based theory of active inference, a general framework for behavior and learning in autonomous agents, has started branching out into machine learning. Since every aspect of an active inference model, such as the latent space structure, must be manually defined, efforts have been made to learn state space representations automatically from observations using deep neural networks. In this work, we show that tensor networks can be employed to learn an active inference model with a discrete observation space. We demonstrate our method on the T-maze problem and show that the agent acts Bayes-optimally, as expected under active inference.
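The paper itself details how tensor networks are used for this task; as a minimal, hypothetical illustration of the general idea referenced in the abstract — a tensor network modeling a distribution over discrete observations — the sketch below builds a small matrix product state (MPS) whose squared contraction (Born rule), a common construction in tensor-network machine learning, defines a normalized probability over binary outcomes. The core shapes, bond dimension, and variable count are all illustrative assumptions, not the authors' model.

```python
import numpy as np
from itertools import product

# Minimal sketch (assumed setup, not the paper's model): an MPS over
# three discrete variables with 2 outcomes each. Each core has shape
# (left bond, physical index, right bond).
rng = np.random.default_rng(0)
bond = 4  # bond dimension controls expressiveness

cores = [rng.normal(size=(1, 2, bond)),
         rng.normal(size=(bond, 2, bond)),
         rng.normal(size=(bond, 2, 1))]

def amplitude(obs):
    """Contract the MPS along a sequence of discrete observations."""
    v = np.ones((1,))
    for core, o in zip(cores, obs):
        v = v @ core[:, o, :]  # select physical index, absorb bond
    return v.item()

# Born rule: squared amplitudes, normalized over all 2^3 outcomes,
# yield a valid probability distribution.
Z = sum(amplitude(o) ** 2 for o in product(range(2), repeat=3))
p = {o: amplitude(o) ** 2 / Z for o in product(range(2), repeat=3)}
```

In a learning setting the cores would be trained (e.g. by gradient descent on a likelihood) rather than drawn at random; this sketch only shows how a tensor network compactly parameterizes a discrete distribution.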
Original language: English
Number of pages: 6
Publication status: Published - 29 Nov 2022
Event: Machine Learning and the Physical Sciences: Workshop at the 36th Conference on Neural Information Processing Systems (NeurIPS) - Ernest N. Morial Convention Center, New Orleans, United States
Duration: 3 Dec 2022 - 3 Dec 2022
https://ml4physicalsciences.github.io/2022/

Seminar/Workshop

Seminar/Workshop: Machine Learning and the Physical Sciences
Country/Territory: United States
City: New Orleans
Period: 3/12/22 - 3/12/22

Austrian Fields of Science 2012

  • 103036 Theoretical physics
  • 103025 Quantum mechanics
  • 102019 Machine learning

Keywords

  • Tensor networks
