More Dominantly Truthful Multi-task Peer Prediction with a Finite Number of Tasks (2103.02214v3)

Published 3 Mar 2021 in cs.GT

Abstract: In the setting where we ask participants multiple similar, possibly subjective, multiple-choice questions (e.g., Do you like Bulbasaur? Y/N; Do you like Squirtle? Y/N), peer prediction aims to design mechanisms that encourage honest feedback without verification. A series of works has successfully designed multi-task peer prediction mechanisms in which reporting truthfully is better than any other strategy (dominantly truthful), but these mechanisms require an infinite number of tasks. A recent work proposes the first multi-task peer prediction mechanism, the Determinant Mutual Information (DMI)-Mechanism, that is not only dominantly truthful but also works for a finite number of tasks (practical). However, whether other practical dominantly truthful multi-task peer prediction mechanisms exist has remained an open question. This work answers that question by providing 1. a new family of information-monotone information measures, volume mutual information (VMI), of which DMI is a special case; and 2. a new family of practical dominantly truthful multi-task peer prediction mechanisms, VMI-Mechanisms. To illustrate the importance of VMI-Mechanisms, we also provide a tractable effort-incentive optimization goal. We show that the DMI-Mechanism may not be optimal, but we can construct a sequence of VMI-Mechanisms that are approximately optimal. The main technical highlight of this paper is a novel geometric information measure, Volume Mutual Information, based on a simple idea: we can measure an object A's amount of information by the number of objects that are less informative than A. Different densities over the objects lead to different information measures. This also gives Determinant Mutual Information a simple geometric interpretation.
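
For orientation, a hedged sketch (paraphrasing the abstract and the definition used in the prior DMI work, not this paper's exact statement): DMI measures the dependence between two agents' answers X and Y by the determinant of their joint distribution matrix,

    DMI(X; Y) := |\det U_{X,Y}|,   where   (U_{X,Y})_{x,y} = \Pr[X = x, Y = y],

and the DMI-Mechanism pays each pair of agents according to estimates of (a power of) this quantity computed from disjoint batches of tasks. VMI generalizes the determinant geometrically: roughly, the information content of a joint distribution U is taken to be the volume, under a chosen density, of the set of joint distributions that are less informative than U; different densities give different members of the VMI family, and DMI corresponds to one particular choice.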

Authors (1)
  1. Yuqing Kong (35 papers)
Citations (3)
