Many problems across scientific domains can be described by statistical models that relate observed data to a set of hidden parameters of interest. In the Bayesian framework, the probabilistic estimation of the unknowns is represented by the posterior distribution of these parameters. However, in most realistic models the posterior is intractable and must be approximated. Monte Carlo methods are computational tools that approximate intractable posteriors by drawing random samples. Importance Sampling (IS) is a Monte Carlo methodology that has shown satisfactory performance in many Bayesian inference problems. Compared to other Monte Carlo methods, IS has sound theoretical properties, but its performance can be poor if the proposal distribution used to draw the samples is not adequately selected. In this talk, we review recent advances in IS based on the use of multiple proposals, the adaptation of these proposals, and the automatic adjustment of the computational complexity.
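As a concrete illustration of the basic mechanism behind IS (not part of the abstract itself), the following is a minimal self-normalized importance sampling sketch in Python. The target and proposal densities, the proposal width, and the function names are all illustrative choices: here a standard normal target is approximated with a wider normal proposal, and posterior expectations are estimated by reweighting the proposal draws.

```python
import math
import random

random.seed(0)

# Illustrative target: standard normal density N(0, 1).
# Self-normalized IS also works with an unnormalized target.
def target_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

# Illustrative proposal: wider normal N(0, SIGMA^2), so its tails
# cover the target's support (a key requirement for IS to behave well).
SIGMA = 2.0
def proposal_pdf(x):
    return math.exp(-0.5 * (x / SIGMA) ** 2) / (SIGMA * math.sqrt(2 * math.pi))

def snis_estimate(f, n=100_000):
    """Self-normalized IS estimate of E_target[f(X)].

    Draw from the proposal, weight each sample by target/proposal,
    and normalize by the sum of the weights.
    """
    xs = [random.gauss(0.0, SIGMA) for _ in range(n)]
    ws = [target_pdf(x) / proposal_pdf(x) for x in xs]
    total = sum(ws)
    return sum(w * f(x) for w, x in zip(ws, xs)) / total

mean = snis_estimate(lambda x: x)        # should be close to 0
second = snis_estimate(lambda x: x * x)  # should be close to 1
```

If the proposal were much narrower than the target instead, a few samples would receive enormous weights and the estimator's variance would blow up, which is precisely the poor-proposal failure mode that multiple and adaptive proposals aim to mitigate.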