Bandit Algorithms in Information Retrieval

Foundations and Trends in Information Retrieval

Paperback (30 May 2019)

Publisher's Synopsis

This monograph provides an overview of bandit algorithms inspired by various aspects of Information Retrieval (IR), such as click models, online ranker evaluation, personalization, and the cold-start problem. Written in a survey style, each chapter focuses on a specific IR problem and explains how it has been addressed with various bandit approaches; within each section, the algorithms are presented in chronological order. The monograph also shows how specific IR concepts relate to bandit algorithms. This comprehensive, chronological approach enables the author to explain both the impact of IR on the development of new bandit algorithms and the impact of bandit algorithms on the development of new methods in IR. The survey is intended primarily for two groups of readers: researchers in Information Retrieval or Machine Learning, and practicing data scientists. It is accessible to anyone who has completed introductory-to-intermediate-level courses in machine learning and/or statistics.
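
To give a flavour of the kind of algorithm the monograph surveys, here is a minimal sketch of the classic UCB1 bandit applied to a toy online ranker-evaluation setting. The `ucb1` helper, the simulated click probabilities, and the `rankers` list are illustrative assumptions for this sketch and are not drawn from the book itself.

```python
import math
import random

def ucb1(arms, horizon):
    """Run a UCB1 bandit for `horizon` rounds.

    `arms` is a list of callables, each returning a stochastic reward in [0, 1]
    (for example, a click / no-click signal for one ranker or recommendation).
    """
    counts = [0] * len(arms)    # times each arm has been pulled
    values = [0.0] * len(arms)  # running mean reward per arm

    for t in range(1, horizon + 1):
        if t <= len(arms):
            arm = t - 1  # play every arm once to initialise its estimate
        else:
            # choose the arm maximising mean reward plus an exploration bonus
            arm = max(
                range(len(arms)),
                key=lambda a: values[a] + math.sqrt(2 * math.log(t) / counts[a]),
            )
        reward = arms[arm]()
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean

    return counts, values

if __name__ == "__main__":
    # Toy usage: three hypothetical "rankers" with different click probabilities.
    random.seed(0)
    rankers = [lambda p=p: float(random.random() < p) for p in (0.2, 0.5, 0.7)]
    counts, values = ucb1(rankers, horizon=10_000)
    print(counts, [round(v, 3) for v in values])
```

In a run like this, the pull counts concentrate on the ranker with the highest click probability while the other arms are still sampled occasionally, which is the exploration/exploitation trade-off at the heart of the bandit methods the monograph covers.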

Book information

ISBN: 9781680835748
Publisher: Now Publishers
Imprint: Now Publishers
Pub date: 30 May 2019
DEWEY: 025.524
DEWEY edition: 23
Language: English
Number of pages: 131
Weight: 230g
Height: 234mm
Width: 155mm
Spine width: 13mm