Stochastic Primal Dual Hybrid Gradient Algorithm with Adaptive Step-Sizes - Université Paris Dauphine
Preprint, Working Paper. Year: 2023


Abstract

In this work we propose a new primal-dual algorithm with adaptive step-sizes. The stochastic primal-dual hybrid gradient (SPDHG) algorithm with constant step-sizes has become widely applied in large-scale convex optimization across many scientific fields due to its scalability. While the product of the primal and dual step-sizes is subject to an upper bound in order to ensure convergence, the selection of the ratio of the step-sizes is critical in applications. Up to now there is no systematic and successful way of selecting the primal and dual step-sizes for SPDHG. In this work, we propose a general class of adaptive SPDHG (A-SPDHG) algorithms, and prove their convergence under weak assumptions. We also propose concrete parameter-updating strategies which satisfy the assumptions of our theory and thereby lead to convergent algorithms. Numerical examples on computed tomography demonstrate the effectiveness of the proposed schemes.
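To make the setting concrete, the following is a minimal sketch of the *constant step-size* SPDHG iteration that the abstract takes as its starting point, applied to an assumed toy problem (a block-split ridge-regularised least-squares objective). The problem, its dimensions, and the 0.9 safety factor in the step-size product bound are illustrative assumptions, and this is not the paper's adaptive A-SPDHG scheme:

```python
import numpy as np

# Toy problem (assumed for illustration):
#   min_x 0.5*||A x - b||^2 + 0.5*lam*||x||^2,
# with the data term split row-wise into n blocks f_i(A_i x) = 0.5*||A_i x - b_i||^2.
rng = np.random.default_rng(0)
m, d, n, lam = 60, 20, 4, 0.1
A = rng.standard_normal((m, d))
b = rng.standard_normal(m)
blocks = np.array_split(np.arange(m), n)

p = 1.0 / n                                           # uniform block-sampling probability
L = max(np.linalg.norm(A[idx], 2) for idx in blocks)  # largest block operator norm
tau = sigma = 0.9 * np.sqrt(p) / L                    # product bound: tau*sigma*L^2 < p

x = np.zeros(d)
y = np.zeros(m)        # dual variable, one component per row of A
z = A.T @ y            # running value of A^T y
zbar = z.copy()        # extrapolated copy used in the primal step

for _ in range(5000):
    # primal step: prox of g(x) = 0.5*lam*||x||^2
    x = (x - tau * zbar) / (1.0 + tau * lam)
    # dual step on one randomly sampled block:
    # prox of sigma*f_i^*, with f_i^*(y) = 0.5*||y||^2 + <b_i, y>
    idx = blocks[rng.integers(n)]
    v = y[idx] + sigma * (A[idx] @ x)
    y_new = (v - sigma * b[idx]) / (1.0 + sigma)
    delta = A[idx].T @ (y_new - y[idx])
    y[idx] = y_new
    z = z + delta
    zbar = z + (1.0 / p) * delta  # extrapolation with theta = 1

# sanity check against the closed-form minimiser of the toy problem
x_star = np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ b)
rel_err = np.linalg.norm(x - x_star) / np.linalg.norm(x_star)
```

Note how a single scalar choice `tau = sigma` fixes the primal/dual ratio at 1; the bound only constrains the *product* of the two step-sizes, which is precisely the freedom the paper's adaptive strategies exploit.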
Main file: Adaptive_SPDHG_Arxiv.pdf (1.15 MB)
Origin: files produced by the author(s)

Dates and versions

hal-03927644 , version 1 (06-01-2023)
hal-03927644 , version 2 (10-07-2023)
hal-03927644 , version 3 (04-12-2023)

Identifiers

Cite

Antonin Chambolle, Claire Delplancke, Matthias J Ehrhardt, Carola-Bibiane Schönlieb, Junqi Tang. Stochastic Primal Dual Hybrid Gradient Algorithm with Adaptive Step-Sizes. 2023. ⟨hal-03927644v1⟩
92 views
116 downloads

