Learning Generative Models for Active Inference using Tensor Networks

Samuel T Wauthier, Bram Vanhecke, Tim Verbelen, Bart Dhoedt

Publications: Contribution to conference › Paper

Abstract

Active inference provides a general framework for behavior
and learning in autonomous agents. It states that an agent will attempt
to minimize its variational free energy, defined in terms of beliefs over
observations, internal states and policies. Traditionally, every aspect of a
discrete active inference model must be specified by hand, i.e. by manually
defining the hidden state space structure, as well as the required distributions such as likelihood and transition probabilities. Recently, efforts
have been made to learn state space representations automatically from
observations using deep neural networks. In this paper, we present a novel
approach to learning state spaces using quantum-physics-inspired tensor
networks. The ability of tensor networks to represent the probabilistic
nature of quantum states as well as to reduce large state spaces makes
tensor networks a natural candidate for active inference. We show how
tensor networks can be used as a generative model for sequential data.
Furthermore, we show how one can obtain beliefs from such a generative
model and how an active inference agent can use these to compute the
expected free energy. Finally, we demonstrate our method on the classic
T-maze environment.
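To make the abstract's central idea concrete, here is a minimal, hypothetical sketch (not the paper's implementation) of a tensor network as a generative model for sequential data: a small matrix product state (MPS) "Born machine" assigns a probability to each binary observation sequence as the squared contraction of per-timestep tensors, normalized over all sequences. All names, shapes, and the random initialization are illustrative assumptions.

```python
import itertools
import numpy as np

# Hypothetical sketch: an MPS over observation sequences of length T.
# Each sequence's probability is |amplitude|^2 / Z (Born rule), where the
# amplitude is the contraction of one 3-index core per time step.
rng = np.random.default_rng(0)
T, d, D = 4, 2, 3  # sequence length, observation dim, bond dimension

# One core per time step: shape (left bond, observation, right bond).
cores = [rng.normal(size=(1 if t == 0 else D, d, D if t < T - 1 else 1))
         for t in range(T)]

def amplitude(seq):
    """Contract the MPS along a fixed observation sequence."""
    v = np.ones((1,))
    for t, o in enumerate(seq):
        v = v @ cores[t][:, o, :]
    return float(v.squeeze())

def normalizer():
    """Sum of squared amplitudes over all d**T sequences, computed
    efficiently by contracting the 'doubled' network layer by layer."""
    E = np.ones((1, 1))
    for A in cores:
        # Sum over the observation index of core (x) core.
        M = np.einsum('aob,cod->acbd', A, A).reshape(
            A.shape[0] ** 2, A.shape[2] ** 2)
        E = E.reshape(1, -1) @ M
    return float(E.squeeze())

Z = normalizer()
probs = {seq: amplitude(seq) ** 2 / Z
         for seq in itertools.product(range(d), repeat=T)}
```

Beliefs over future observations (and hence quantities like the expected free energy) could then, in principle, be read off such a model by conditioning: contract the cores for observed time steps and marginalize the rest.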
Original language: English
Number of pages: 14
Publication status: Published - 24 Oct 2022
Event: 3rd International Workshop on Active Inference (IWAI 2022) - Grenoble, France
Duration: 19 Sept 2022 - 19 Sept 2022
https://iwaiworkshop.github.io/


Austrian Fields of Science 2012

  • 102019 Machine learning

Keywords

  • tensor networks
  • machine learning
