I was at FCRC (the CS conference conglomerate that happens once every 4 years), June 13-19. Here are some of the talks I found particularly memorable.

My personal notes on the FCRC talks are at https://workflowy.com/s/wkI79JfN0N, and rougher notes on the STOC/CCC/EC talks are at https://dl.dropboxusercontent.com/u/27883775/wiki/math/pdfs/stoc.pdf. Note that neither has been edited.

**FCRC award/plenary talks**

**Turing award lecture (The Land sharks are on the squawk box), Michael Stonebraker** (https://www.youtube.com/watch?v=BbGeKi6T6QI): Stonebraker draws parallels between his work building Postgres, a relational database system, and his cross-country bike trip. He described the many challenges that arose in both endeavors and how they were overcome, concluding that both were about "making it happen".

**Interdisciplinary research from the view of theory, Andrew Yao**: Yao comes from the viewpoint of theoretical computer science but has since worked on connections to physics (quantum computation), economics (auction theory), and cryptography (certifiable randomness). Theoretical computer science started with computability and complexity theory, but has grown to encompass algorithms and data structures, information theory, combinatorics, cryptography, quantum computing, bioinformatics, game theory and mechanism design… Yao argues that interdisciplinary research not only advances other disciplines but also transforms the nature of CS research itself. He distinguishes surface-level interdisciplinarity (e.g., outsourcing a computational problem to CS researchers) from core-to-core interaction, where researchers cooperate not just at the computational level but at the conceptual level. Increasingly, computer scientists are becoming part-time specialists in other fields; he is optimistic about a return of the "Renaissance man."

**Hardware neural nets: from inflated expectations to a plateau of productivity, Olivier Temam**: Temam invokes the "hype cycle" of reactions to an innovation, consisting of:

- peak of inflated expectations
- trough of disillusionment
- slope of enlightenment (when people understand what it's actually useful for)
- plateau of productivity

This happened with neural nets, which were initially hyped as "emulating the brain in hardware"; the focus is now on the more modest goal of doing well on concrete tasks, such as classifying images, where we have made a lot of progress. (Reactions to the idea of "emulating the brain in hardware" are "bimodal": bullshit vs. awesome. People should be able to have a more level-headed conversation about it without fearing for their careers.) Three communities need to work together on neural nets: neuroscience, to understand what the brain is doing; computer architecture, to implement the hardware; and machine learning/algorithms, the "middleman" in this game, to translate brain functionality into algorithms to pass on to the architects.

**The F# path to relaxation, Donald Syme**: Syme gives his perspective on designing and implementing a programming language (F#). F# is notable as a functional language that incorporates features from object-oriented programming (OOP). People are often polarized between programming language paradigms (e.g., functional vs. OOP). But rather than form opposing camps, it's better to look at the advantages of each and see whether there is a way to include all of those advantages (which Syme calls "relaxation," or "synthesis from contradiction") in as simple a way as possible.

**STOC (Symposium on Theory of Computing)**

**Forrelation, Scott Aaronson, Andris Ambainis**: They prove that a certain problem, forrelation ("Fourier correlation"), gives essentially the largest possible separation between quantum and classical query complexities (how many queries it takes to solve the problem if the queries are classical vs. quantum). http://dl.acm.org/authorize.cfm?key=N98526

**Super-resolution, extremal functions and the condition number of Vandermonde matrices, Ankur Moitra**: Investigates the very general question: when can we recover fine-grained structure from coarse-grained measurements ("super-resolution")? He finds a sharp threshold for the cutoff frequency (we're allowed to make measurements below this frequency): above the threshold, recovery is possible even under noise, while below it, recovery is impossible even under exponentially small noise. The analysis involves bounding the condition number of a Vandermonde matrix (and, surprisingly, involves the Beurling-Selberg majorant, which has traditionally been used in analytic number theory, e.g., to count primes in intervals). More generally, many problems about well-conditioning turn out to be much nicer when studied over the complex numbers (for example, the Discrete Fourier Transform is a particularly well-conditioned Vandermonde matrix), and tools from complex analysis then come naturally into play. There may be more opportunities to apply complex analysis to problems in machine learning, signal processing, and statistics. http://dl.acm.org/authorize.cfm?key=N98679

**Knuth Prize lecture, Laszlo Babai**: Among other things, Babai talks about the many surprising applications of group theory to theoretical computer science, including the following:

- "Width-5 branching programs can count" because the symmetric group S5 is not solvable (https://people.cs.umass.edu/~barring/publications/bwbp.pdf).
- The idea of a semidirect product inspires the zig-zag product of graphs used to construct explicit expanders for derandomization (SL=L). (http://arxiv.org/pdf/math/0406038.pdf)
- The automorphism group of a graph has consequences for its structure: vertex transitivity implies facts about perfect matchings, vertex connectivity, expansion, cycle length… possibly Hamiltonicity.
- A nontrivial monotone boolean function invariant under a transitive permutation group conjecturally has the maximum decision tree complexity (is “evasive”).
- What is the complexity of questions about groups (specific classes of group (non)membership questions, etc.)? They have appeared as candidate separations for complexity classes.
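To make the forrelation quantity from the Aaronson-Ambainis talk concrete, here is a brute-force Python sketch for tiny n. The normalization 2^(-3n/2) follows the paper's definition of forrelation; the inner-product "bent" function example is my own illustration, not from the talk.

```python
import itertools

def forrelation(f, g, n):
    """Brute-force forrelation: Phi = 2^(-3n/2) * sum_{x,y} f(x) * (-1)^(x.y) * g(y),
    for f, g mapping {0,1}^n -> {-1, +1}.  Exponential time; for intuition only."""
    total = 0.0
    for x in itertools.product((0, 1), repeat=n):
        for y in itertools.product((0, 1), repeat=n):
            dot = sum(a * b for a, b in zip(x, y)) % 2
            total += f(x) * (-1) ** dot * g(y)
    return total / 2 ** (1.5 * n)

# The inner-product "bent" function has a perfectly flat Fourier spectrum and is
# its own signed Fourier transform, so pairing it with itself maximizes forrelation.
bent = lambda x: (-1) ** (x[0] * x[1])
print(forrelation(bent, bent, 2))                # 1.0 (the maximum)
print(forrelation(lambda x: 1, lambda x: 1, 2))  # 0.5, i.e. 2^(-n/2)
```

Deciding whether such a pair of functions has forrelation close to 1 or close to 0 is the problem that a quantum computer answers with one query but a classical algorithm needs roughly sqrt(2^n)/n queries for.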

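A quick numerical illustration (my own, not from Moitra's talk) of why node separation controls the condition number of a Vandermonde matrix with nodes on the unit circle: evenly spaced nodes give the DFT matrix, which is perfectly conditioned, while clustering the nodes closer than roughly one over the number of measurements blows the condition number up.

```python
import numpy as np

def vandermonde_on_circle(freqs, m):
    """m x k Vandermonde matrix with nodes exp(2*pi*i*f_j) on the unit circle:
    V[t, j] = exp(2*pi*i*t*f_j), rows indexed by measurements t = 0..m-1."""
    t = np.arange(m).reshape(-1, 1)
    return np.exp(2j * np.pi * t * np.asarray(freqs))

m = 8

# Evenly spaced frequencies k/m give the DFT matrix: all singular values
# equal sqrt(m), so the condition number is exactly 1.
print(np.linalg.cond(vandermonde_on_circle(np.arange(m) / m, m)))

# Frequencies separated by much less than 1/m give nearly parallel columns,
# so the condition number is orders of magnitude larger.
print(np.linalg.cond(vandermonde_on_circle([0.0, 0.005, 0.01, 0.015], m)))
```

This is the phenomenon behind the sharp threshold: the stability of super-resolution hinges on how well-conditioned this matrix is, which in turn hinges on the separation of the unknown frequencies relative to the cutoff.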
**EC (Economics and computation)**

I went to just the award lectures for EC.

**Algorithms for strategic agents (Matthew Weinberg)**: One big difference between computer science and economics:

- In CS (algorithm design), we assume that the inputs to our algorithms are reported correctly.
- In economics (mechanism design), we have to deal with the possibility that people will lie to take advantage of the system. More precisely, the inputs (e.g., people's bids in an auction, or preferences in a lottery) are supplied by agents acting in their self-interest, i.e., to maximize their own utility.

Not only do you have to design an algorithm that gives the correct answer, you also have to design it in such a way that people can’t take advantage of it by lying about their values or rankings. A natural question from a TCS perspective, which Weinberg explores, is whether there is a black-box reduction from certain classes of algorithms to corresponding mechanisms.
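The textbook example of designing so that lying never helps is the second-price (Vickrey) auction, where the highest bidder wins but pays the second-highest bid. This toy sketch (my own illustration, not Weinberg's construction) checks that no deviation from truthful bidding increases a bidder's utility:

```python
def second_price_auction(bids):
    """Vickrey auction: the highest bidder wins but pays the second-highest bid.
    Returns (winner_index, price)."""
    order = sorted(range(len(bids)), key=lambda i: bids[i], reverse=True)
    return order[0], bids[order[1]]

def utility(value, bids, i):
    """Bidder i's utility: true value minus price if i wins, else 0."""
    winner, price = second_price_auction(bids)
    return value - price if winner == i else 0.0

# Truthful bidding is a dominant strategy: no misreport does better.
value = 10.0
others = [7.0, 4.0]
truthful = utility(value, [value] + others, 0)
for lie in [0.0, 5.0, 8.0, 12.0, 100.0]:
    assert utility(value, [lie] + others, 0) <= truthful
```

The price a winner pays does not depend on her own bid, which is exactly what removes the incentive to misreport; the black-box reductions Weinberg studies ask when arbitrary algorithms can be converted into mechanisms with this kind of guarantee.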

**Test of Time talk** (influential paper from 10-25 years ago): **The social cost of cheap pseudonyms, Eric Friedman, Paul Resnick**: The Internet allows cheap pseudonymity. Writing at the dawn of the Internet age, Friedman and Resnick establish that there is a definite economic price for pseudonymity. They first model pseudonymity through a repeated prisoner's dilemma game where players are allowed to leave and come back in with new identities. This model, however, gives rise to strange equilibria, and little can be said. The authors then add the realistic assumption of "trembling," i.e., accidentally defecting when you attempt to cooperate (e.g., a package you send through eBay gets lost in the mail). One can then prove quite a few things about this simple model. For example, it becomes natural for old-timers with good reputations to be occasionally hostile toward newcomers (a phenomenon they call "Pay Your Dues").
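To see why trembling changes the analysis, here is a toy repeated prisoner's dilemma with a tremble probability (a simplified sketch of the setting, not the Friedman-Resnick model itself). With no trembles, unforgiving "grim trigger" players cooperate forever; with even a small tremble rate, a single accidental defection locks them into permanent mutual defection.

```python
import random

# Standard prisoner's dilemma payoffs for (my_move, their_move); C = cooperate, D = defect.
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

def play(strategy_a, strategy_b, rounds, tremble, rng):
    """Repeated PD where an intended 'C' accidentally comes out as 'D'
    with probability `tremble` (the lost eBay package)."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strategy_a(history_a, history_b)
        b = strategy_b(history_b, history_a)
        if a == 'C' and rng.random() < tremble:
            a = 'D'
        if b == 'C' and rng.random() < tremble:
            b = 'D'
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        history_a.append(a)
        history_b.append(b)
    return score_a, score_b

def grim_trigger(mine, theirs):
    # Cooperate until the opponent ever defects; unforgiving of accidents.
    return 'D' if 'D' in theirs else 'C'

def tit_for_tat(mine, theirs):
    # Copy the opponent's last move; punishes but also echoes accidents.
    return 'C' if not theirs else theirs[-1]

print(play(grim_trigger, grim_trigger, 1000, 0.0, random.Random(0)))   # (3000, 3000)
print(play(grim_trigger, grim_trigger, 1000, 0.05, random.Random(0)))  # far lower: one tremble is fatal
```

Because accidents are indistinguishable from malice, reputations must be earned gradually, and equilibria where established identities treat fresh pseudonyms with suspicion (the "Pay Your Dues" effect) arise naturally.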
