Generalization Bounds for Dependent Data using Online-to-Batch Conversion

Sagnik Chatterjee, Manuj Mukherjee, Alhad Sethi

Research output: Contribution to journal › Conference article › peer-review

Abstract

In this work, we give generalization bounds for statistical learning algorithms trained on samples drawn from a dependent data source, both in expectation and with high probability, using the Online-to-Batch conversion paradigm. We show that the generalization error of statistical learners in the dependent-data setting matches that of statistical learners in the i.i.d. setting up to a term that depends on the decay rate of the underlying mixing stochastic process. Our proof techniques involve defining a new notion of stability of online learning algorithms based on Wasserstein distances, and employing "near-martingale" concentration bounds for dependent random variables to arrive at appropriate upper bounds on the generalization error of statistical learners trained on dependent data. Finally, we prove that the Exponential Weighted Averages (EWA) algorithm satisfies our new notion of stability and instantiate our bounds using the EWA algorithm.
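As a rough illustration of the pipeline the abstract refers to, the sketch below shows the textbook EWA forecaster over a finite hypothesis class followed by a standard online-to-batch conversion (averaging the per-round weight distributions). This is a generic instance of the paradigm, not the paper's exact construction; the function and parameter names (`ewa_online_to_batch`, `hypotheses`, `losses`, `eta`) are illustrative placeholders.

```python
import numpy as np

def ewa_online_to_batch(hypotheses, losses, samples, eta=0.5):
    """Textbook sketch: run the Exponential Weighted Averages (EWA) forecaster
    over a finite hypothesis class, then return the online-to-batch averaged
    predictor. Not the paper's exact construction.

    hypotheses: list of callables h(x) -> prediction
    losses:     callable loss(prediction, y) -> float, assumed bounded in [0, 1]
    samples:    iterable of (x, y) pairs (possibly from a dependent source)
    eta:        EWA learning rate
    """
    K = len(hypotheses)
    log_w = np.zeros(K)        # log-weights, initialized uniform
    weight_history = []        # per-round EWA distributions over hypotheses

    for x, y in samples:
        # current EWA distribution (normalized in a numerically stable way)
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        weight_history.append(w)
        # observe the round's losses and update the weights multiplicatively
        round_losses = np.array([losses(h(x), y) for h in hypotheses])
        log_w -= eta * round_losses

    # online-to-batch conversion: average the per-round distributions to get
    # a single batch predictor (the weighted mixture of hypotheses)
    avg_w = np.mean(weight_history, axis=0)

    def batch_predictor(x):
        return float(np.dot(avg_w, [h(x) for h in hypotheses]))

    return batch_predictor
```

The paper's contribution concerns what can be said about the generalization error of such an averaged predictor when the `samples` stream is drawn from a mixing (dependent) process rather than i.i.d.; the sketch only fixes the mechanics of EWA and the averaging step.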

Original language: English
Pages (from-to): 2152-2160
Number of pages: 9
Journal: Proceedings of Machine Learning Research
Volume: 258
State: Published - 2025
Externally published: Yes
Event: 28th International Conference on Artificial Intelligence and Statistics, AISTATS 2025 - Mai Khao, Thailand
Duration: 3 May 2025 - 5 May 2025

Bibliographical note

Publisher Copyright:
Copyright 2025 by the author(s).

