Sahil Patel and Faris Sbahi and Antonio Martinez and Dmitri Saberi and Jae Yoo and Geoffrey Roeder and Guillaume Verdon

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2022-256

December 1, 2022

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2022/EECS-2022-256.pdf

Generatively modelling the properties of even a single quantum system can be computationally expensive, and one often wishes to model several different scenarios to see how the dynamical or equilibrium properties of a quantum system evolve as certain parameters, such as time, temperature, or the Hamiltonian, are continuously modified. For such parametric families of tasks, there is often an inherent information geometry both for the space of tasks and within each task; ideally, one would leverage awareness of this geometry to guide the optimization of generative models from task to task. Here we explore the use of quantum-probabilistic hybrid representations that combine probabilistic generative models with quantum neural networks, paired with optimization strategies that convert between the geometry of the task space and that of our models' parameter space, to achieve an optimization advantage. We specifically study Riemannian metrics defined on the space of density operators, in particular the Bogoliubov-Kubo-Mori (BKM) metric, which can be well-estimated in an unbiased fashion for our class of quantum-probabilistic models, namely quantum Hamiltonian-based models (QHBMs). We show that natural gradient descent with respect to this construction attains quantum Fisher efficiency in parameter estimation. We further present an alternative first-order formulation of mirror descent that is conducive to improvements in quantum sample complexity. We also derive conditional initialization strategies for simulating time-evolution processes and equilibrium states at various values of the problem-space parameters. We demonstrate both theoretically and numerically that such techniques may enable accelerated convergence to better solutions of quantum generative modelling tasks.
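As a quick orientation for the abstract above, the following is a minimal sketch (not reproduced from the report) of the objects it refers to, written in the notation commonly used for quantum Hamiltonian-based models; the symbols p_theta, E_theta, U(phi), g, L, and eta are introduced here purely for illustration:

  % Latent exponential-family (energy-based) model, diagonal in the computational basis:
  \hat{\rho}_\theta = \sum_x p_\theta(x)\,|x\rangle\langle x|, \qquad
  p_\theta(x) = e^{-E_\theta(x)} / \mathcal{Z}_\theta

  % Visible model state after a parameterized unitary (quantum neural network):
  \hat{\rho}_{\theta,\phi} = \hat{U}(\phi)\,\hat{\rho}_\theta\,\hat{U}^\dagger(\phi)

  % Natural-gradient update of the full parameter set \lambda = (\theta, \phi),
  % preconditioned by a Riemannian metric tensor g (e.g. the BKM metric) on a loss \mathcal{L}:
  \lambda_{t+1} = \lambda_t - \eta\, g^{-1}(\lambda_t)\, \nabla_\lambda \mathcal{L}(\lambda_t)

For latent models of this exponential-family form, the information-geometry literature gives the BKM metric in these coordinates as the Hessian of the log-partition function, g_{ij}(\theta) = \partial_{\theta_i} \partial_{\theta_j} \ln \mathcal{Z}_\theta, which is consistent with the abstract's remark that the metric admits unbiased estimation for this model class.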

Advisor: Umesh Vazirani


BibTeX citation:

@mastersthesis{Patel:EECS-2022-256,
    Author= {Patel, Sahil and Sbahi, Faris and Martinez, Antonio and Saberi, Dmitri and Yoo, Jae and Roeder, Geoffrey and Verdon, Guillaume},
    Title= {Generative Modelling of Quantum Processes via Quantum-Probabilistic Information Geometry},
    School= {EECS Department, University of California, Berkeley},
    Year= {2022},
    Month= {Dec},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2022/EECS-2022-256.html},
    Number= {UCB/EECS-2022-256},
    Abstract= {Generatively modelling the properties of even a single quantum system can be computationally expensive, and one often wishes to model several different scenarios to see how the dynamical or equilibrium properties of a quantum system evolve as certain parameters, such as time, temperature, or the Hamiltonian, are continuously modified. For such parametric families of tasks, there is often an inherent information geometry both for the space of tasks and within each task; ideally, one would leverage awareness of this geometry to guide the optimization of generative models from task to task. Here we explore the use of quantum-probabilistic hybrid representations that combine probabilistic generative models with quantum neural networks, paired with optimization strategies that convert between the geometry of the task space and that of our models' parameter space, to achieve an optimization advantage. We specifically study Riemannian metrics defined on the space of density operators, in particular the Bogoliubov-Kubo-Mori (BKM) metric, which can be well-estimated in an unbiased fashion for our class of quantum-probabilistic models, namely quantum Hamiltonian-based models (QHBMs). We show that natural gradient descent with respect to this construction attains quantum Fisher efficiency in parameter estimation. We further present an alternative first-order formulation of mirror descent that is conducive to improvements in quantum sample complexity. We also derive conditional initialization strategies for simulating time-evolution processes and equilibrium states at various values of the problem-space parameters. We demonstrate both theoretically and numerically that such techniques may enable accelerated convergence to better solutions of quantum generative modelling tasks.},
}

EndNote citation:

%0 Thesis
%A Patel, Sahil 
%A Sbahi, Faris 
%A Martinez, Antonio 
%A Saberi, Dmitri 
%A Yoo, Jae 
%A Roeder, Geoffrey 
%A Verdon, Guillaume 
%T Generative Modelling of Quantum Processes via Quantum-Probabilistic Information Geometry
%I EECS Department, University of California, Berkeley
%D 2022
%8 December 1
%@ UCB/EECS-2022-256
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2022/EECS-2022-256.html
%F Patel:EECS-2022-256