Article

Encoding-dependent generalization bounds for parametrized quantum circuits

Journal

Quantum
Volume 5, Article 582

Publisher

Verein zur Förderung des Open Access Publizierens in den Quantenwissenschaften
DOI: 10.22331/q-2021-11-17-582

Funding

  1. DFG [CRC 183]
  2. Einstein Foundation (Einstein Research Unit on quantum devices)
  3. EU's Horizon 2020 research and innovation programme [817482]
  4. TopMath Graduate Center of the TUM Graduate School at the Technical University of Munich, Germany
  5. TopMath Program at the Elite Network of Bavaria
  6. German Academic Scholarship Foundation (Studienstiftung des deutschen Volkes)

Abstract
A large body of recent work has begun to explore the potential of parametrized quantum circuits (PQCs) as machine learning models, within the framework of hybrid quantum-classical optimization. In particular, theoretical guarantees on the out-of-sample performance of such models, in terms of generalization bounds, have emerged. However, none of these generalization bounds depend explicitly on how the classical input data is encoded into the PQC. We derive generalization bounds for PQC-based models that depend explicitly on the strategy used for data-encoding. These imply bounds on the performance of trained PQC-based models on unseen data. Moreover, our results facilitate the selection of optimal data-encoding strategies via structural risk minimization, a mathematically rigorous framework for model selection. We obtain our generalization bounds by bounding the complexity of PQC-based models as measured by the Rademacher complexity and the metric entropy, two complexity measures from statistical learning theory. To achieve this, we rely on a representation of PQC-based models via trigonometric functions. Our generalization bounds emphasize the importance of well-considered data-encoding strategies for PQC-based models.
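The abstract notes that the generalization bounds rest on representing PQC-based models as trigonometric functions: a model whose encoding uses r Pauli-rotation gates on the input x can be written as a trigonometric polynomial with integer frequencies in {-r, ..., r}. A minimal sketch of such a representation, with illustrative (randomly drawn) coefficients standing in for the circuit- and observable-dependent ones:

```python
import numpy as np

# Sketch: a PQC-based model with r Pauli-rotation encoding gates can be
# written as f(x) = sum_{w=-r}^{r} c_w e^{i w x}, with c_{-w} = conj(c_w)
# so that f is real-valued, as an expectation value must be.
# The coefficients below are random placeholders, not derived from a circuit.

def pqc_model(x, coeffs):
    """Evaluate the trigonometric polynomial with the given coefficients,
    ordered by frequency -r, ..., 0, ..., r."""
    r = (len(coeffs) - 1) // 2
    freqs = np.arange(-r, r + 1)
    return np.real(np.sum(coeffs * np.exp(1j * freqs * x)))

rng = np.random.default_rng(0)
r = 2  # two encoding rotations -> frequency spectrum {-2, ..., 2}
c_pos = rng.normal(size=r + 1) + 1j * rng.normal(size=r + 1)
c_pos[0] = c_pos[0].real  # zero-frequency coefficient must be real
# Enforce Hermitian symmetry: c_{-w} = conj(c_w).
coeffs = np.concatenate([np.conj(c_pos[:0:-1]), c_pos])

xs = np.linspace(-np.pi, np.pi, 5)
print([round(pqc_model(x, coeffs), 4) for x in xs])
```

The size of this frequency spectrum is one way the encoding strategy enters the model class, and hence the complexity measures (Rademacher complexity, metric entropy) used in the paper's bounds.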

