In the well-studied framework of prediction with expert advice,

the aim is to combine the forecasts of a set of experts so as to predict

almost as well as the best available expert, no matter what the experts'

predictions or the signal's values are. A simple algorithm of Bayesian

inspiration, the exponentially weighted average forecaster, provides

strong regret guarantees for this problem. Several refinements have been

studied in the literature with an eye to non-stationarity, such as

comparing favorably to sequences of experts. However, in previous works

the set of experts itself is typically assumed to be fixed, which can be

limiting in some situations, e.g. when one wants to incorporate new models

or algorithms on the fly.
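To make the setting concrete, here is a minimal sketch of the exponentially weighted average forecaster under square loss, with a uniform prior over a fixed set of experts; the learning rate `eta` and the loss are illustrative choices, not parameters fixed by the text.

```python
import numpy as np

def ewa_forecaster(expert_preds, outcomes, eta=1.0):
    """Exponentially weighted average forecaster (square-loss sketch).

    expert_preds: (T, N) array, prediction of each of N experts per round.
    outcomes:     (T,) array of observed signal values.
    Returns the (T,) array of aggregated forecasts.
    """
    T, N = expert_preds.shape
    log_w = np.zeros(N)              # uniform prior over experts, log-domain
    forecasts = np.empty(T)
    for t in range(T):
        w = np.exp(log_w - log_w.max())   # stable exponentiation
        w /= w.sum()
        forecasts[t] = w @ expert_preds[t]     # weighted average forecast
        losses = (expert_preds[t] - outcomes[t]) ** 2
        log_w -= eta * losses                  # exponential weight update
    return forecasts
```

The multiplicative update concentrates weight on the experts with the smallest cumulative loss, which is what yields the classical regret guarantee against the best expert in hindsight.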

We address the problem of efficiently combining the predictions of a

growing number of experts, which arrive in a streaming fashion. We

introduce three different classes of sequences of experts (timely,

arbitrary, or sparse), for which we provide efficient algorithms with no

parameter to tune and with tight regret guarantees that are adaptive to

the parameters of the class. We put a strong emphasis on anytime

strategies, which assume neither a known time horizon nor any a priori knowledge of the

number of experts to come. Some general ideas emerge from our analysis, such

as the use of unnormalized weights for added flexibility, and the central role

of the choice of prior in the design of the algorithms.
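The role of unnormalized weights with a growing expert set can be illustrated by the generic sketch below; the arrival-time prior `1/t^2` and the class name `GrowingEWA` are hypothetical choices for illustration, not the specific algorithms of the paper.

```python
import numpy as np

class GrowingEWA:
    """Sketch of an EWA-style forecaster over a growing set of experts.

    Each expert keeps an unnormalized weight; an expert entering at round
    t receives a prior weight 1/t^2 (a hypothetical choice that keeps the
    total prior mass bounded over time). Normalization happens only at
    prediction time, so adding experts never requires reweighting the rest.
    """
    def __init__(self, eta=1.0):
        self.eta = eta
        self.weights = {}   # expert id -> unnormalized weight
        self.t = 0          # number of rounds played so far

    def add_expert(self, expert_id):
        # prior weight shrinks with arrival time
        self.weights[expert_id] = 1.0 / max(self.t, 1) ** 2

    def predict(self, expert_preds):
        # expert_preds: dict mapping expert id -> prediction this round
        self.t += 1
        ids = list(expert_preds)
        w = np.array([self.weights[k] for k in ids])
        p = np.array([expert_preds[k] for k in ids])
        return float(w @ p / w.sum())   # normalize only when predicting

    def update(self, expert_preds, outcome):
        # multiplicative update under square loss (illustrative choice)
        for k, pred in expert_preds.items():
            self.weights[k] *= np.exp(-self.eta * (pred - outcome) ** 2)
```

A late-arriving expert starts with a small prior weight but can still overtake incumbents if its losses are consistently smaller, which is the behavior the streaming setting requires.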