Determinantal point processes (DPPs) are distributions over configurations of points that encode repulsion through a kernel function. Important statistical quantities associated with DPPs have geometric and algebraic interpretations, which makes them fun objects to study. Moreover, since their formalization by Macchi in 1975 as models for fermions in particle physics, specific instances of DPPs have half-mysteriously appeared in fields such as probability, number theory, and statistical physics.
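To make the repulsion concrete, here is a minimal numerical sketch for a finite DPP with marginal kernel K (a symmetric matrix with eigenvalues in [0, 1]; the particular K below is an arbitrary random construction for illustration). It uses the standard identity P(A ⊆ Y) = det(K_A), the principal minor of K indexed by A, to check that two points are jointly included with probability at most the product of their marginals.

```python
import numpy as np

# Illustrative finite DPP on {0, ..., n-1}: build a valid marginal kernel
# K = Q diag(lambda) Q^T with Q orthogonal and eigenvalues lambda in [0, 1].
rng = np.random.default_rng(0)
n = 5
Q, _ = np.linalg.qr(rng.normal(size=(n, n)))
eigvals = rng.uniform(0, 1, size=n)
K = (Q * eigvals) @ Q.T

def inclusion_prob(K, A):
    """P(A is contained in the sample) = det of the principal submatrix K_A."""
    A = list(A)
    return np.linalg.det(K[np.ix_(A, A)])

# Repulsion: det([[Kii, Kij], [Kji, Kjj]]) = Kii*Kjj - Kij**2,
# so joint inclusion is never more likely than under independence.
i, j = 0, 1
p_ij = inclusion_prob(K, [i, j])
assert p_ij <= inclusion_prob(K, [i]) * inclusion_prob(K, [j]) + 1e-9
```

The inequality holds for any symmetric kernel, since the off-diagonal term K_ij enters the 2x2 minor with a negative sign: similarity between items (large |K_ij|) lowers their joint inclusion probability.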
More recently, the mathematical tractability of DPPs has led to powerful models and algorithmic tools in statistics and machine learning. Yet statistical applications of DPPs require the ability to sample realizations from a DPP and to infer a DPP from data. As with other kernel machines, these two tasks are prohibitively costly for most DPPs. After introducing DPPs and highlighting a few applications, I will discuss how these computational challenges have been tackled so far.
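To see why sampling is the bottleneck, consider the most naive exact sampler for a finite L-ensemble DPP, where P(Y = A) = det(L_A) / det(L + I) for a positive semidefinite matrix L. The sketch below (with an arbitrary random L, purely for illustration) enumerates all 2^n subsets, which is exact but only feasible for tiny ground sets; practical algorithms avoid this exponential blow-up.

```python
import itertools
import numpy as np

# Brute-force exact sampler for a tiny L-ensemble DPP (illustrative only:
# enumerating all 2^n subsets is exponential in the ground-set size n).
rng = np.random.default_rng(1)
n = 4
B = rng.normal(size=(n, n))
L = B @ B.T  # any PSD matrix defines a valid L-ensemble

# Normalization: sum over all A of det(L_A) equals det(L + I).
Z = np.linalg.det(L + np.eye(n))
subsets = [A for r in range(n + 1) for A in itertools.combinations(range(n), r)]
probs = np.array([np.linalg.det(L[np.ix_(A, A)]) / Z for A in subsets])
probs = np.clip(probs, 0, None)  # guard against tiny negative round-off
probs /= probs.sum()

sample = subsets[rng.choice(len(subsets), p=probs)]
```

The normalization identity makes the distribution sum to one exactly, but the cost of touching every subset is what motivates the spectral and approximate samplers discussed in the talk.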