demand for the Giffen good (bread). The existence of Giffen goods in practice has been contentiously debated for over a century [Edgeworth 1909, Stigler 1947], with evidence ranging from econometric analyses of historical market data to experiments on lab rats [Battalio et al. 1991].

      Despite counterintuitive phenomena like Giffen goods, Arrow and Debreu proved that, under quite general conditions, a market equilibrium always exists [Arrow and Debreu 1954]. In particular, in the Arrow-Debreu model, agents sell the goods they own and use the revenue to buy other goods they want; this is in contrast to Fisher markets, where agents with a monetary budget purchase from a centralized seller. Market equilibria in both models exist, but can we find them? As in the case of Nash equilibrium, this question is of particular importance because if a centralized, omniscient algorithm cannot compute an equilibrium, we can hardly expect a distributed market of selfish agents to converge to one. In the words of Kamal Jain [Jain 2004, Nisan 2009b]:

      If your laptop cannot find it, neither can the market.

      In the same paper, Jain also gave an algorithm for computing the Arrow-Debreu equilibrium when consumers’ utilities are linear. Since then, there has been a long sequence of algorithms for computing or approximating market equilibria (e.g., [Birnbaum et al. 2011, Codenotti et al. 2005, Cole and Fleischer 2008, Devanur et al. 2008, Garg et al. 2015, Garg and Kapoor 2006, Jain 2004, Jain and Vazirani 2010]) for certain models and utility functions, as well as intractability results in other settings [Chen et al. 2009a, Deng and Du 2008, Garg et al. 2017, Vazirani and Yannakakis 2011]. In particular, Chen et al. [2013] consider markets that exhibit non-monotonicity: when the price of one product increases, its demand also increases. Giffen goods, described above, are one example of non-monotone markets; Chen et al. [2013] construct examples in Arrow-Debreu markets where an increase in the price of a product increases the revenue of its seller, who then has more money available and demands more of the same good. Chen et al. show that for essentially any class of utility functions that exhibits some non-monotonicity, computing the Arrow-Debreu market equilibrium is PPAD-hard.
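The revenue effect driving this non-monotonicity can be seen in a toy calculation (our own illustration, separate from Chen et al.’s construction): an agent endowed with one unit of good 1 whose utility is Leontief, u(x₁, x₂) = min(x₁, x₂), has budget p₁, so her demand for good 1 is p₁/(p₁ + p₂), which increases with the good’s own price.

```python
# Toy illustration (not Chen et al.'s construction): non-monotone demand
# via the revenue effect in an Arrow-Debreu market. The agent owns 1 unit
# of good 1 and has Leontief utility u(x1, x2) = min(x1, x2).

def leontief_demand(p1, p2, endowment=1.0):
    """Demand (x1, x2) of an agent endowed with `endowment` units of
    good 1. Budget = p1 * endowment; the optimum has x1 = x2."""
    budget = p1 * endowment
    x = budget / (p1 + p2)  # spend the whole budget with x1 = x2
    return x, x

for p1 in [1.0, 2.0, 4.0]:
    x1, _ = leontief_demand(p1, p2=1.0)
    print(f"p1 = {p1}: demand for good 1 = {x1:.3f}")
# Demand for good 1 rises with p1: a higher price raises the seller's
# revenue, and she demands more of her own good.
```

Here the income effect of the price increase outweighs the substitution effect, which is exactly the mechanism the hardness constructions exploit.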

      We extend the PPAD-hardness proof of Chen et al. [2013] and show that even approximate market equilibrium is PPAD-hard (Theorem 9.1). We note that although our inapproximability factor is stronger than that shown by Chen et al., the results are incomparable, as ours only holds for the stronger notion of “tight” approximate equilibrium, by which we mean the more standard definition that bounds the two-sided error of the market equilibrium. Chen et al., in contrast, prove that even if we allow arbitrary excess supply, finding a (1/n)-approximate equilibrium is PPAD-hard. Furthermore, for the interesting case of constant elasticity of substitution (CES) utilities with parameter ρ < 0, they show that there exist markets where every (1/2)-tight equilibrium requires prices that are doubly exponentially large (and thus require an exponential-size representation). Indeed, for a general non-monotone family of utility functions, the problem of computing a (tight or not) approximate equilibrium may not belong to PPAD. Nevertheless, the important family of additively separable, concave piecewise-linear utilities is known to satisfy the non-monotone condition [Chen et al. 2013], and yet the computation of an (exact) market equilibrium is in PPAD [Vazirani and Yannakakis 2011]. Therefore, we obtain as a corollary that computing an ε-tight approximate equilibrium for Arrow-Debreu markets with additively separable, concave piecewise-linear utilities is PPAD-complete.

       1.1.3 A-CEEI (CourseMatch)

      University courses have limited capacity, and some are more popular than others. This creates an interesting allocation problem. Imagine that each student has ordered all the possible schedules—bundles of courses—from most desirable to least desirable, and the capacities of the classes are known. What is the best way to allocate seats in courses to students? There are several desiderata for a course allocation mechanism:

      Fairness. In what sense is the mechanism “fair”?

      Efficiency. Are seats in courses allocated to the students who want them the most?

      Feasibility. Are any courses oversubscribed?

      Truthfulness. Are students motivated to honestly report their preferences to the mechanism?

      Computational efficiency. Can the allocation be computed from the data in polynomial time?

      Competitive Equilibrium from Equal Incomes (CEEI) [Foley 1967, Thomson and Varian 1985, Varian 1974] is a venerable mechanism with many attractive properties: In CEEI all agents are allocated the same amount of “funny money”; next they declare their preferences, and then a price equilibrium is found that clears the market. The market clearing guarantees Pareto efficiency and feasibility. The mechanism has a strong, albeit technical, ex post fairness guarantee that emerges from the notion that agents who miss out on a valuable, competitive item will have extra funny money to spend on other items at equilibrium. Truthfulness is problematic—as usual with market mechanisms—but potential incentives for any individual agent to deviate are mitigated by the large number of agents. However, CEEI only works when the resources to be allocated are divisible and the utilities are relatively benign. This restriction has both benefits and drawbacks. It ensures computational feasibility, because CEEI can be computed in polynomial time with a linear or convex program, depending on the utilities involved [Devanur et al. 2008, Ghodsi et al. 2011, Varian 1974]; on the other hand, it is easy to construct examples in which a CEEI does not exist when preferences are complex or the resources being allocated are not divisible. Indeed, both issues arise in practice in a variety of allocation problems, including shifts to workers, landing slots to airplanes, and the setting that we focus on here, courses to students [Budish 2011, Varian 1974].
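The claim that CEEI is computable by a convex program can be made concrete with a minimal sketch of the Eisenberg-Gale program (assumed setup for illustration: two agents, two divisible goods, linear utilities, equal budgets). Maximizing the sum of log-utilities subject to fully allocating each good yields a competitive equilibrium allocation:

```python
# Minimal sketch of the Eisenberg-Gale convex program behind CEEI
# (assumed setup: divisible goods, linear utilities, equal budgets).
# Maximize sum_i log(u_i) s.t. each good is fully allocated, x >= 0,
# where u_i = sum_j v[i][j] * x[i][j].
import numpy as np
from scipy.optimize import minimize

v = np.array([[2.0, 1.0],    # agent 0's values for goods 0 and 1
              [1.0, 3.0]])   # agent 1's values
n, m = v.shape

def neg_log_welfare(x_flat):
    x = x_flat.reshape(n, m)
    utils = (v * x).sum(axis=1)
    return -np.log(np.maximum(utils, 1e-12)).sum()  # clip for safety

# each good j must be fully allocated across the agents
cons = [{"type": "eq",
         "fun": lambda x_flat, j=j: x_flat.reshape(n, m)[:, j].sum() - 1.0}
        for j in range(m)]

res = minimize(neg_log_welfare, np.full(n * m, 1.0 / n),
               bounds=[(0.0, 1.0)] * (n * m), constraints=cons)
alloc = res.x.reshape(n, m)
print(np.round(alloc, 3))  # each agent ends up with the good she values more
```

With these (hypothetical) valuations the optimum gives each agent the good she values more; the log objective is what makes equal budgets translate into the fairness guarantee. It is exactly this convex structure that disappears once goods are indivisible.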

      It was shown in Budish [2011] that an approximation to a CEEI solution, called A-CEEI, exists even when the resources are indivisible and agent preferences are arbitrarily complex, as required by the course allocation problems one sees in practice. The approximate solution guaranteed to exist is approximately fair (in that the students are given almost the same budget), and approximately Pareto efficient and feasible (in that all courses are filled close to capacity, with the possible exception of courses with more capacity than popularity). This result seems to be wonderful news for the course allocation problem. However, there is a catch: Budish’s proof is non-constructive, as it relies on Kakutani’s fixed point theorem.

      A heuristic search algorithm for solving A-CEEI was introduced in Othman et al. [2010]. The algorithm resembles a traditional tâtonnement process, in which the prices of courses that are oversubscribed are increased and the prices of courses that are undersubscribed are decreased. A modified version of this algorithm that guarantees courses are not oversubscribed is currently used by the Wharton School (University of Pennsylvania) to assign its MBA students to courses [Budish et al. 2014]. While it has been documented that the heuristic algorithm often produces much tighter approximations than the theoretical bound, on some instances it fails to find even the guaranteed approximation [Budish 2011, Section 9].
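The flavor of such a price-adjustment loop can be sketched as follows (a toy schematic, not Othman et al.’s actual algorithm; the demand model, capacities, and step size are all invented for illustration):

```python
# Schematic tatonnement loop in the spirit of the course-allocation
# heuristic (a toy sketch, not Othman et al.'s actual algorithm):
# raise the price of each oversubscribed course, lower the price of
# each undersubscribed one, and recompute student demand.

def demand(prices, values):
    """Toy demand: each student takes the single course with the highest
    surplus (value minus price), or nothing if every surplus is negative."""
    counts = [0] * len(prices)
    for v in values:
        surplus = [vi - p for vi, p in zip(v, prices)]
        best = max(range(len(prices)), key=lambda c: surplus[c])
        if surplus[best] >= 0:
            counts[best] += 1
    return counts

def tatonnement(values, capacity, steps=200, eta=0.1):
    prices = [0.0] * len(capacity)
    for _ in range(steps):
        counts = demand(prices, values)
        if all(d <= cap for d, cap in zip(counts, capacity)):
            break  # no course oversubscribed: stop adjusting
        for c in range(len(prices)):
            excess = (counts[c] - capacity[c]) / capacity[c]
            prices[c] = max(0.0, prices[c] + eta * excess)
    return prices, demand(prices, values)

# three students, two courses; course 0 has one seat, course 1 has two
values = [[1.0, 0.6], [1.0, 0.75], [0.2, 1.0]]
prices, counts = tatonnement(values, capacity=[1, 2])
print(prices, counts)  # course 0's price rises until one student switches
```

In this toy run the price of the contested course climbs until one student switches to her second choice, at which point no course is oversubscribed. The PPAD-completeness result explains why no variant of such a loop can come with a polynomial-time guarantee.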

      Thus A-CEEI is a problem where practical interest motivates theoretical inquiry. We have a theorem that guarantees the existence of an approximate equilibrium—the issue is finding it. Can the heuristic algorithms currently used to assign Wharton MBAs to their courses be replaced by a fast and rigorous algorithm for finding an approximate CEEI? Theorem 10.3 answers this question in the negative, showing that computing an A-CEEI is PPAD-complete.

      The following bilinear optimization meta-problem captures a wide range of applications from areas like statistics (Sparse Principal Component Analysis, or Sparse PCA), graph theory (Clique), and game theory (Nash equilibrium):

          max { x⊤Ay : x ∈ X, y ∈ Y }.

      For all the applications we consider, once we fix some y*, finding the best feasible x that maximizes x⊤Ay* is a tractable problem. (Similarly, if we were given a good x*, finding a matching y is easy.) But optimizing over x and y simultaneously is NP-hard. What about approximations?
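The asymmetry between the one-sided and the joint problem can be made concrete with a small sketch over probability simplices, the feasible sets of the Nash-equilibrium instantiation (Clique and Sparse PCA use different sets X and Y); alternating the two easy best responses is a natural heuristic, but it carries no approximation guarantee:

```python
# Alternating best responses for the bilinear problem max x^T A y,
# illustrated over probability simplices (the Nash-equilibrium case).
# Fixing y, the best x is a single vertex: all mass on argmax_i (A y)_i.
import numpy as np

def alternating_best_response(A, iters=50):
    n, m = A.shape
    y = np.ones(m) / m                   # start from the uniform point
    x = np.eye(n)[0]
    for _ in range(iters):
        x = np.eye(n)[np.argmax(A @ y)]  # one-sided problem: easy
        y = np.eye(m)[np.argmax(x @ A)]  # other side: also easy
    return x, y, float(x @ A @ y)

A = np.array([[3.0, 0.0],
              [0.0, 2.0]])
x, y, val = alternating_best_response(A)
print(val)  # 3.0 from the uniform start; other starts can stall at 2.0
```

Each inner step is a single argmax, which is why fixing one side makes the problem tractable; the procedure can nevertheless lock onto the suboptimal vertex pair (value 2.0 here), reflecting the hardness of the joint optimization.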
