Dynamic Pricing with Limited Supply
- Moshe Babaioff
- Shaddin Dughmi
- Robert Kleinberg
- Aleksandrs Slivkins
ACM Conference on Electronic Commerce (EC'12) | Published by ACM
We consider the problem of designing revenue-maximizing online posted-price mechanisms when the seller has limited supply. A seller has k identical items for sale and faces n potential buyers ("agents") who arrive sequentially. Each agent is interested in buying one item. Each agent's value for an item is an independent sample from some fixed (but unknown) distribution supported on [0,1]. The seller offers a take-it-or-leave-it price to each arriving agent (possibly a different price for different agents), and aims to maximize his expected revenue. We focus on mechanisms that do not use any information about the distribution; such mechanisms are called "detail-free". They are desirable because knowing the distribution is unrealistic in many practical scenarios. We study how the revenue of such mechanisms compares to the revenue of the optimal offline mechanism that knows the distribution (the "offline benchmark").
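The sketch below is not from the paper; it is only meant to make the model concrete under illustrative assumptions (uniform values on [0,1], a Monte-Carlo estimate of the relaxed fixed-price benchmark, and hypothetical helper names such as run_posted_prices and best_fixed_price_revenue). It simulates agents arriving one at a time, each buying iff their value meets the posted price, until the k items run out.

```python
import numpy as np

def run_posted_prices(prices, values, k):
    """Simulate the sequential model: agent t buys iff values[t] >= prices[t],
    until the k items run out.  Returns the seller's total revenue."""
    revenue, sold = 0.0, 0
    for p, v in zip(prices, values):
        if sold >= k:
            break
        if v >= p:              # take-it-or-leave-it offer accepted
            revenue += p
            sold += 1
    return revenue

def best_fixed_price_revenue(sampler, k, n, grid, trials=2000):
    """Estimate the revenue of the best single fixed price (the relaxed
    fixed-price benchmark) by Monte Carlo over the value distribution."""
    best = 0.0
    for p in grid:
        rev = np.mean([run_posted_prices([p] * n, sampler(n), k)
                       for _ in range(trials)])
        best = max(best, rev)
    return best

# Illustrative instance: uniform[0,1] values, n = 200 agents, k = 20 items.
rng = np.random.default_rng(0)
sampler = lambda n: rng.uniform(0.0, 1.0, size=n)
print(best_fixed_price_revenue(sampler, k=20, n=200,
                               grid=np.linspace(0.05, 0.95, 19)))
```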
We present a detail-free online posted-price mechanism whose revenue is at most O((k log n)^{2/3}) less than the offline benchmark, for every distribution that is regular. In fact, this guarantee holds without any assumptions if the benchmark is relaxed to fixed-price mechanisms. Further, we prove a matching lower bound. The performance guarantee for the same mechanism can be improved to O(√k log n), with a distribution-dependent constant, if the ratio k/n is sufficiently small. We show that, in the worst case over all demand distributions, this is essentially the best rate that can be obtained with a distribution-specific constant.

While dynamic pricing with unlimited supply can easily be seen as a multi-armed bandit (MAB) problem, the intuition behind MAB approaches breaks down when applied to the limited-supply setting. Our high-level conceptual contribution is that even the limited-supply setting can be fruitfully treated as a bandit problem.
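To illustrate the bandit view (and only that), here is a generic UCB1-over-prices sketch: each price on a fixed grid is treated as an arm whose reward is the price if the offer is accepted and 0 otherwise, and selling stops once the k items are gone. This is an assumption-laden illustration, not the mechanism analyzed in the paper, which must in particular handle the supply constraint within its exploration; the function name ucb_posted_prices and the grid size are hypothetical choices.

```python
import numpy as np

def ucb_posted_prices(values, k, num_prices=10):
    """Detail-free pricing via a standard UCB1 rule over a price grid.
    Reward of offering price p to an agent: p if accepted (value >= p), else 0.
    Stops once k items have been sold."""
    grid = np.linspace(1.0 / num_prices, 1.0, num_prices)
    pulls = np.zeros(num_prices)
    rewards = np.zeros(num_prices)
    revenue, sold = 0.0, 0
    for t, v in enumerate(values, start=1):
        if sold >= k:
            break
        # UCB1 index: empirical mean revenue per offer + confidence bonus
        means = rewards / np.maximum(pulls, 1)
        bonus = np.sqrt(2.0 * np.log(t) / np.maximum(pulls, 1))
        bonus[pulls == 0] = np.inf      # offer every grid price at least once
        arm = int(np.argmax(means + bonus))
        price = grid[arm]
        reward = price if v >= price else 0.0
        pulls[arm] += 1
        rewards[arm] += reward
        if reward > 0:
            revenue += price
            sold += 1
    return revenue

# Illustrative run: 500 agents with uniform[0,1] values, k = 50 items.
rng = np.random.default_rng(1)
print(ucb_posted_prices(rng.uniform(size=500), k=50))
```

The limited supply is what makes the naive reduction break: an accepted low price consumes an item that a later, higher-valued agent might have bought, so per-offer reward maximization alone does not capture the true objective.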