Elicitation for Aggregation

  • Rafael Frongillo,
  • Yiling Chen,
  • Ian Kash

AAAI

We study the problem of eliciting and aggregating probabilistic information from multiple agents. To aggregate the agents’ predictions successfully, the principal must elicit some notion of confidence from the agents, capturing how much experience or knowledge underlies each prediction. To formalize this, we consider a principal who wishes to learn the distribution of a random variable. Each agent in a group of Bayesian agents has privately observed some independent samples of the random variable, and the principal wishes to elicit enough information from each agent that her posterior is the same as if she had received all of the samples directly. Leveraging techniques from Bayesian statistics, we represent confidence as the number of samples an agent has observed, quantified by a hyperparameter from a conjugate family of prior distributions. This allows us to show that if the principal herself has access to a few samples, she can achieve her aggregation goal by eliciting predictions from the agents using proper scoring rules. In particular, with access to one sample, she can successfully aggregate the agents’ predictions if and only if every posterior predictive distribution corresponds to a unique value of the hyperparameter, a property that holds for many common distributions of interest. When this uniqueness property does not hold, we construct a novel and intuitive mechanism in which a principal with two samples can elicit and optimally aggregate the agents’ predictions.
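To make the aggregation goal concrete, here is a minimal sketch (not the paper’s elicitation mechanism) assuming a Beta-Bernoulli conjugate model: each agent’s evidence is captured by how far its posterior hyperparameters have moved from the shared prior, and summing those increments recovers the posterior the principal would hold had she observed every sample herself. The names `Beta`, `update`, and `aggregate` are illustrative, not from the paper.

```python
from collections import namedtuple

# Beta(alpha, beta) prior over the success probability of a Bernoulli variable.
Beta = namedtuple("Beta", ["alpha", "beta"])

def update(prior: Beta, samples) -> Beta:
    """Conjugate update: each observed sample shifts the hyperparameters,
    so (alpha + beta) minus the prior's total counts the samples seen."""
    k = sum(samples)
    return Beta(prior.alpha + k, prior.beta + len(samples) - k)

def aggregate(prior: Beta, agent_posteriors) -> Beta:
    """Pool the agents' independent evidence by summing their hyperparameter
    increments over the shared prior -- the posterior the principal would
    reach had she observed every sample herself."""
    alpha = prior.alpha + sum(p.alpha - prior.alpha for p in agent_posteriors)
    beta = prior.beta + sum(p.beta - prior.beta for p in agent_posteriors)
    return Beta(alpha, beta)

if __name__ == "__main__":
    prior = Beta(1.0, 1.0)  # uniform prior
    # Two agents privately observe independent Bernoulli samples.
    agent1 = update(prior, [1, 1, 0])
    agent2 = update(prior, [0, 1])
    pooled = aggregate(prior, [agent1, agent2])
    direct = update(prior, [1, 1, 0, 0, 1])
    assert pooled == direct  # same posterior as seeing all samples directly
    print(pooled)            # Beta(alpha=4.0, beta=3.0)
```

The sketch assumes the principal could read the hyperparameters directly; the paper’s contribution is eliciting this information truthfully via proper scoring rules when agents only report predictive distributions. Note also that in this particular model the single-sample posterior predictive is determined by the ratio alpha/(alpha + beta) alone, which is exactly the kind of non-uniqueness the abstract’s final sentence addresses.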