Book: Bandit Algorithms
Bandits for Recommender Systems by Eugene Yan
Rolling out Multi-Armed Bandits for Fast Adaptive Experimentation