FOCS 2022 Test of Time Awards

FOCS 2012

  • A Tight Linear Time (1/2)-Approximation for Unconstrained Submodular Maximization
    Niv Buchbinder, Moran Feldman, Joseph Naor, Roy Schwartz
    This paper gives a ½-approximation algorithm for the fundamental problem of maximizing a submodular set function. The approximation factor of ½ matches the lower bound established by Feige, Mirrokni, and Vondrák for any algorithm that makes a subexponential number of calls to a value oracle. The algorithm runs in linear time, and both the algorithm and its analysis are simple, elegant, and innovative. The generality and importance of the problem, together with the optimality and ingenious simplicity of the algorithm, make this a classic result in the theory of algorithms.
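    The randomized "double greedy" algorithm at the heart of the paper is simple enough to sketch in a few lines. The Python snippet below is an illustrative sketch (the function names and the test function are ours, not the paper's): one linear pass decides, for each element, whether to add it to a growing set X or remove it from a shrinking set Y, with probabilities proportional to the nonnegative parts of the two marginal gains.

```python
import random

def double_greedy(f, ground_set, seed=0):
    """Illustrative randomized 'double greedy' sketch for unconstrained
    (non-monotone) submodular maximization, in the spirit of
    Buchbinder-Feldman-Naor-Schwartz. f maps a frozenset to a real value;
    the returned set has expected value at least half the optimum."""
    rng = random.Random(seed)
    X, Y = set(), set(ground_set)
    for e in ground_set:
        a = f(frozenset(X | {e})) - f(frozenset(X))   # gain of adding e to X
        b = f(frozenset(Y - {e})) - f(frozenset(Y))   # gain of removing e from Y
        ap, bp = max(a, 0.0), max(b, 0.0)
        # add e with probability proportional to the positive part of a
        if ap + bp == 0 or rng.random() < ap / (ap + bp):
            X.add(e)
        else:
            Y.remove(e)
    return X  # at this point X == Y
```

    For a modular (additive) function the coin flips degenerate to deterministic choices and the algorithm simply keeps the positively weighted elements; for a genuinely submodular function, such as a graph cut function, the returned set achieves at least half the optimum in expectation.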

  • Optimal Multi-dimensional Mechanism Design: Reducing Revenue to Welfare Maximization
    Yang Cai, Constantinos Daskalakis, S. Matthew Weinberg
    This paper presents a general reduction that converts a large class of revenue maximization problems in multidimensional mechanism design into a class of welfare maximization problems in the general framework of the Vickrey-Clarke-Groves (VCG) mechanism in the Bayesian setting. Since the VCG mechanism is incentive compatible, the transformation takes care of all incentive and pricing issues, freeing the designer to focus on the algorithmic problem of solving the resulting optimization problem. The methods introduced in this paper provide a crucial bridge between two areas of economic mechanism design, and play an important role in the modern treatment of algorithmic methods in mechanism design.
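    As a reminder of the welfare-maximization framework the reduction targets, here is a toy Python sketch of the VCG mechanism for assigning n distinct items to n bidders (the function names and brute-force search are illustrative, not from the paper): the mechanism picks the welfare-maximizing assignment and charges each bidder the externality she imposes on the others, which is what makes truthful reporting optimal.

```python
from itertools import permutations

def vcg_assignment(values):
    """Toy VCG sketch: values[i][j] is bidder i's value for item j.
    Returns the welfare-maximizing assignment (item per bidder) and the
    Clarke pivot payments (the externality each bidder imposes)."""
    n = len(values)

    def best(excluded=None):
        # brute-force welfare-maximizing assignment, ignoring one bidder
        best_w, best_a = float("-inf"), None
        for perm in permutations(range(n)):
            w = sum(values[i][perm[i]] for i in range(n) if i != excluded)
            if w > best_w:
                best_w, best_a = w, perm
        return best_w, best_a

    welfare, assignment = best()
    payments = []
    for i in range(n):
        others_without_i, _ = best(excluded=i)          # welfare if i were absent
        others_with_i = welfare - values[i][assignment[i]]
        payments.append(others_without_i - others_with_i)  # Clarke pivot payment
    return assignment, payments
```

    With values [[10, 0], [8, 0]] (one valuable item, one worthless one) the mechanism gives the valuable item to bidder 0 and charges her 8, recovering the familiar second-price auction as a special case.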

FOCS 2002

  • Generalized Compact Knapsacks, Cyclic Lattices, and Efficient One-Way Functions from Worst-Case Complexity Assumptions
    Daniele Micciancio

    This paper was an inflection point for lattice-based cryptography, helping transform it from a relatively small and fragmented research area into a thriving subfield of cryptography with immense theoretical and practical impact. It showed that cryptography based on lattices can be both efficient and secure under worst-case complexity assumptions, a feat never reached by number-theory-based cryptography.

    With remarkable foresight, the paper first boldly put forward a conjecture on the worst-case hardness of "algebraically structured" lattices, then rigorously proved that such hardness gives rise to similarly structured average-case hardness, and finally convincingly argued that this structure admits fast implementation on modern microprocessors.

    The paper's worst-case, and hence average-case, hardness conjectures have withstood the test of time. The techniques introduced in this paper have evolved and grown to an immense body of work, shaping many future results in the area.

  • LT Codes
    Michael Luby

    The paper "LT Codes" describes the first implementation of so-called "universal erasure codes", which are almost optimal codes for any channel over which data may be lost under any loss model. More specifically, the paper shows how to realize the powerful "digital fountain" concept introduced in earlier work of Byers, Luby, Mitzenmacher and Rege, whose implementation had proved elusive. The actual implementation is extremely elegant and is a perfect example of the power of analytical techniques from CS theory to shape the design of an important real-world artifact.

    The ideas in the paper have been hugely influential in both the academic and the technological worlds. Among its thousands of academic citations are works in CS theory, information theory, network coding, optical and wireless communications, sensor networks, content distribution, and data storage. On the practical side, LT codes and their derivatives have become integral parts of several standards for wired and wireless networks.
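    To make the construction concrete, here is a toy Python sketch of LT encoding together with the "peeling" decoder (the helper names are ours, and for brevity it uses the ideal soliton degree distribution, whereas practical LT codes use Luby's robust soliton): each encoded packet is the XOR of a randomly chosen subset of source symbols, and the decoder repeatedly resolves packets of degree one.

```python
import random

def lt_encode(source, num_packets, seed=0):
    """Toy LT-code encoder: each packet XORs a random subset of the k
    source symbols, with the subset size ('degree') drawn from the ideal
    soliton distribution (real LT codes use the robust soliton)."""
    rng = random.Random(seed)
    k = len(source)
    degrees = list(range(1, k + 1))
    # ideal soliton: P(1) = 1/k, P(d) = 1/(d(d-1)) for d = 2..k
    weights = [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
    packets = []
    for _ in range(num_packets):
        d = rng.choices(degrees, weights)[0]
        neighbors = set(rng.sample(range(k), d))  # symbols combined in this packet
        value = 0
        for i in neighbors:
            value ^= source[i]
        packets.append((neighbors, value))
    return packets

def lt_decode(packets, k):
    """Peeling decoder: a degree-1 packet reveals a symbol, which is then
    XORed out of every packet containing it, possibly creating new
    degree-1 packets; repeat until no further progress is possible."""
    packets = [(set(n), v) for n, v in packets]
    recovered = {}
    progress = True
    while progress and len(recovered) < k:
        progress = False
        remaining = []
        for n, v in packets:
            for i in list(n):            # peel off already-recovered symbols
                if i in recovered:
                    n.discard(i)
                    v ^= recovered[i]
            if len(n) == 1:
                i = n.pop()
                if i not in recovered:
                    recovered[i] = v
                    progress = True
            elif n:
                remaining.append((n, v))
        packets = remaining
    return recovered  # maps source index -> decoded symbol
```

    The "digital fountain" property is visible here: decoding succeeds with high probability once slightly more than k packets arrive, regardless of which packets were lost along the way.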

FOCS 1992

  • Proof Verification and Hardness of Approximation Problems
    Sanjeev Arora, Carsten Lund, Rajeev Motwani, Madhu Sudan, Mario Szegedy

    This seminal paper proves one of the fundamental theorems of modern computational complexity, the remarkable PCP Theorem: every proof has a robust version, only polynomially larger, that can be probabilistically verified by querying just a constant number of bits. If the input is in the language, then the verifier always accepts; if it is not in the language, then the verifier rejects with high probability. This was the culmination of a line of work that has been highlighted in recent Test of Time Awards, including the related work of Arora-Safra that is a co-winner of this year's award.

    Besides being central to our modern understanding of the class NP, this work has had ramifications for many other areas of TCS, notably hardness of approximation and foundations of cryptography. As observed in the paper, the PCP theorem implies that MAXSNP-hard problems are NP-hard to approximate to within a constant factor, and by work of Feige et al. it also implies that MAX CLIQUE is NP-hard to approximate to within a power of n. The PCP Theorem has had a huge influence on subsequent work on inapproximability, and increasingly tight bounds on the best achievable approximation ratios for problems have been obtained by designing PCPs tailored to particular approximation problems. PCPs are not just a theoretical concept: they are currently being implemented in order to obtain efficient cryptographic proofs of knowledge as an ingredient in blockchain technology. While this paper was immediately recognized as a breakthrough, its implications are still being explored today, and they continually surprise and delight.

  • Probabilistic Checking of Proofs: A New Characterization of NP
    Sanjeev Arora, Shmuel Safra

    This exceptional paper was the first to characterize NP in a form very close to the PCP theorem. Specifically, it characterizes NP as the class of languages whose proofs of membership in the language can be made robust in that they can be probabilistically verified by reading just o(log n) bits. If the input is in the language, then the probabilistic verifier always accepts; if it is not in the language, then it rejects with high probability. This was a major step in a line of work that has been highlighted in recent Test of Time Awards, including the related work of Arora-Lund-Motwani-Sudan-Szegedy that improved the number of queried bits to constant and is a co-winner of this year's award.

    Besides being central to our modern understanding of the class NP, the PCP theorem has had ramifications for many other areas of TCS, notably hardness of approximation and foundations of cryptography. By work of Arora et al., the PCP theorem implies that MAXSNP-hard problems are NP-hard to approximate to within a constant factor, and by work of Feige et al. it also implies that MAX CLIQUE is NP-hard to approximate to within a power of n. The PCP Theorem has had a huge influence on subsequent work on inapproximability, and increasingly tight bounds on the best achievable approximation ratios for problems have been obtained by designing PCPs tailored to particular approximation problems. PCPs are not just a theoretical concept: they are currently being implemented in order to obtain efficient cryptographic proofs of knowledge as an ingredient in blockchain technology. While this paper was immediately recognized as a breakthrough, its implications are still being explored today, and they continually surprise and delight.

  • Communication on Noisy Channels: A Coding Theorem for Computation
    Leonard J. Schulman
    This seminal paper formally introduced the problem of error correction in the setting of interactive communication, extending the problem of error correction for one-way transmission of information that is addressed by the classical theory of error-correcting codes. In addition to defining an important model and problem, the paper provides a general protocol for making interactive communication robust to noise that incurs only a constant factor overhead. The lasting impact of this paper is demonstrated by the many papers in the past decade that were inspired by, and built upon, the important insights of this paper, and by the ongoing research challenges that grew out of it.

Call for Nominations

The 2022 FOCS Test of Time Awards, awarded annually, recognize papers published in the Proceedings of the Annual IEEE Symposium on Foundations of Computer Science. This is the fourth annual award. The 2022 awards target papers presented at the FOCS conferences in 1992, 2002, and 2012, each of which is considered for a separate award. While papers from the target years will always be considered by the award committee, in each award category it is also possible to nominate FOCS papers published up to four conferences earlier than the targeted conference. Thus, papers published at FOCS 1989-92 can compete for the award targeting the 1992 conference, and similar options are available for the other two awards.

Nomination Procedure

Nominations should be sent by July 31, 2022 to focs2022tot@gmail.com with a subject line of "TOT nomination". Nominations should contain an explanation of the impact of the nominated paper(s), including references to follow-on work. Self-nominations are discouraged.

Selection

The winners will be selected by a committee appointed by the FOCS Steering Committee, which for 2022 consists of Russell Impagliazzo (UCSD), Yael Kalai (Microsoft Research and MIT), Alistair Sinclair (UC Berkeley), Éva Tardos (Cornell University), David Zuckerman (UT Austin), and committee chair Michael Saks (Rutgers University).

In selecting the Test of Time Award winners, the Committee will pay particular attention to long-term impact. This impact can come in many forms, including:

  1. Opening up a new area of research
  2. Introducing new techniques
  3. Solving a problem of lasting importance

The committee expects to select exactly one paper for the award associated with each of the targeted conferences (1992, 2002, and 2012). However, the committee may select up to three papers in each category, including papers from up to four years prior to the targeted year. The committee may give awards to papers that are not nominated.

Sponsors