### What papers should everyone read?

• This question is (inspired by)/(shamefully stolen from) a similar question at MathOverflow, but I expect the answers here will be quite different.

We all have favorite papers in our own respective areas of theory. Every once in a while, one finds a paper so astounding (e.g., important, compelling, deceptively simple, etc.) that one wants to share it with everyone. So list these papers here! They don't have to be from theoretical computer science -- anything that you think might appeal to the community is a fine answer.

You can give as many answers as you want; please put one paper per answer! Also, notice this is community wiki, so vote on everything you like!

(Note there has been a previous question about papers in recursion-theoretic complexity but that is quite specialized.)

In the answers, I'd like to see more emphasis on whether it really is a good idea to read the **original** paper nowadays (or if it makes much more sense to read a modern textbook exposition of it). I have too often seen TCS papers that are truly seminal, but I'd rather save my colleagues from the pain of trying to decipher the original write-up – which is far too often a hastily-written 10-page conference abstract, with references to a "full version" that never appeared...

Yes, I hope it is clear that papers of this type are not good for the list (if you want to share a paper with everyone, then it shouldn't be a pain to read).

Too many people are just posting one-liners. Anyone can post hundreds of papers without putting any thought into it. Please explain *why* you think *everyone* should read those papers. That means justifying why they should *read that paper instead of someone else's write-up of that result*, and what is so awesome about the paper that *everyone* should read it.

Good question. My opinion is that if you want to understand the minds of the inventors, and possibly understand how to invent things, you have to read their own words. The more you labor, the closer you get to their actual thought process.

I would like to answer the question.

• "A mathematical theory of communication" by Claude Shannon, classics of information theory. Very readable.


The safest general characterization of the Internet is that it consists of a series of footnotes to this paper.

• The 1936 paper that arguably started computer science itself:

• Alan Turing, "On Computable Numbers, with an Application to the Entscheidungsproblem", Proceedings of the London Mathematical Society s2-42, 230–265, 1937. doi: 10.1112/plms/s2-42.1.230

In just 36 pages, Turing formulates (but does not name) the Turing Machine, recasts Gödel's famous First Incompleteness Theorem in terms of computation, describes the concept of universality, and in the appendix shows that computability by Turing machines is equivalent to computability by $\lambda$-definable functions (as studied by Church and Kleene).

It is also very accessible and readable...

Read it together with The Annotated Turing by Charles Petzold [highly recommended].
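For anyone who has never seen the object Turing defines, here is a minimal sketch of a machine simulator in present-day Python. This is my own toy illustration, not Turing's notation or construction, and the example transition table is made up purely for illustration:

```python
# A minimal Turing-machine simulator (illustrative sketch only).
# A machine is a transition table: (state, symbol) -> (write, move, new state).
def run(transitions, tape, state="q0", head=0, steps=100):
    tape = dict(enumerate(tape))              # sparse tape, blank symbol '_'
    for _ in range(steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")
        write, move, state = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# Made-up example machine: flip every 0 to 1 until the first blank, then halt.
flip = {
    ("q0", "0"): ("1", "R", "q0"),
    ("q0", "1"): ("1", "R", "q0"),
    ("q0", "_"): ("_", "R", "halt"),
}
print(run(flip, "0010"))   # -> 1111_
```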

• Ken Thompson's "Reflections on Trusting Trust". Short, sweet, and mind-blowing.

Also, very approachable. I read it quite some time ago, when I had basically no CS background, no programming experience and didn't even know what a compiler was.

just read it -- awesome indeed!

"Last week, Googler Ken Thompson was awarded the Japan Prize in Information and Communications for his early work on the UNIX operating system." (src: Buzz post from Life at Google)

I would think this paper would be pretty difficult to digest without at least knowing what a compiler is.

In the paper, I think figures 2.1 and 2.2 are swapped.

Disagree - nothing awesome or mind-blowing in this paper. TL;DR: six pages from the mid-'80s about the "need to change the criminal code to start punishing hackers [just like thieves or burglars]". Oh yeah, it mentions a _quine_, without calling it by name.
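Since the comment above brings up quines: a quine is a program whose output is exactly its own source code, and that self-reference trick is at the heart of Thompson's self-reproducing compiler trojan. A minimal Python example (my own illustration; it has to be exactly these two lines, since adding even a comment would break the self-reproduction):

```python
s = 's = %r\nprint(s %% s)'
print(s % s)
```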

• What Every Computer Scientist Should Know About Floating-Point Arithmetic

This paper explains and reinforces the notion that floating point isn't magic. It explains overflow, underflow, what denormalized numbers are, what NaNs are, what inf is, and everything these imply. After reading this paper, you'll know why a == a + 1.0 can be true, why a == a can be false, why running your code on two different machines can give you two different answers, why summing numbers in a different order can give you an answer that is off by orders of magnitude, and all the wacky stuff that happens when an uncountably infinite set of numbers is mapped onto a finite set.
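A few of those claims are easy to check for yourself. Here is a small Python sketch (my own illustration of standard IEEE-754 double behavior, not taken from the paper):

```python
# Small experiments behind the claims above (IEEE-754 double precision).

# At this magnitude the gap between adjacent doubles exceeds 1.0,
# so adding 1.0 changes nothing:
a = 1e16
print(a == a + 1.0)            # True

# NaN compares unequal to everything, including itself:
nan = float("nan")
print(nan == nan)              # False

# Summation order matters: tiny terms added one by one to a huge value
# are all lost, while summing the tiny terms first preserves them.
big, tiny, n = 1e16, 1.0, 10_000
left_to_right = big
for _ in range(n):
    left_to_right += tiny
small_first = sum([tiny] * n) + big
print(left_to_right == small_first)   # False: they differ by about n
```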

An edited version is also available on the web.

Since Oracle acquired Sun, most of the links on Sun's web pages have broken, but you can still reach the original paper from here.

I always think that CS research papers are written in some foreign language.

Very good! It's worth putting in a banner on the site to make sure no student misses it.

The second link is currently broken

@ChristopherManning thanks for pointing that out. I've updated the link.

This is my favourite from the list. Also note that it is a living document, unlike most papers, which do not receive updates after being published.

Though I've stumbled across this more than a year later, it's been invaluable to me. I would recommend this paper to everyone!

I like this one the best; the three-pass approach really helped me comprehend my first few papers easily.

• Paths, Trees and Flowers by J. Edmonds. This paper on a classic combinatorial optimization problem is not only well written, but it also argues that "polynomial-time algorithm" should be regarded as essentially synonymous with "efficient algorithm".
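For context, the problem in question is maximum matching in general (non-bipartite) graphs, which Edmonds' blossom algorithm puts in polynomial time. A tiny sketch using networkx, whose matching routine is, as far as I know (treat this attribution as an assumption), a blossom-style descendant of Edmonds' algorithm:

```python
import networkx as nx

# Maximum matching on an odd cycle, the structure Edmonds' blossom idea handles.
G = nx.cycle_graph(5)                                  # vertices 0..4 in a 5-cycle
matching = nx.max_weight_matching(G, maxcardinality=True)
print(matching)   # two disjoint edges, the best possible on 5 vertices
```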

• Reducibility Among Combinatorial Problems by Richard Karp. The paper contains what's often referred to as Karp's "original 21 NP-complete problems." In many ways, this paper truly motivated the study of NP-completeness by demonstrating its applicability to a wider domain. Very readable.

I like this paper, but some of the reductions are really sketchy and hard to follow. See any complexity text for more details.

@Andras Salamon I agree 100%.
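To give a flavor of what a Karp reduction looks like, here is a tiny Python sketch of the classic CLIQUE-to-VERTEX-COVER reduction (my own illustration, not the paper's notation): G has a clique of size k exactly when the complement of G has a vertex cover of size n - k.

```python
from itertools import combinations

def clique_to_vertex_cover(n, edges, k):
    """Karp-style reduction: (G, k) for CLIQUE -> (complement of G, n - k) for VERTEX COVER.
    G has a k-clique iff its complement has a vertex cover of size n - k."""
    edge_set = {frozenset(e) for e in edges}
    complement = [
        (u, v) for u, v in combinations(range(n), 2)
        if frozenset((u, v)) not in edge_set
    ]
    return complement, n - k

# Example: a triangle on {0, 1, 2} plus an extra vertex 3 attached to 2.
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
print(clique_to_vertex_cover(4, edges, 3))
# -> ([(0, 3), (1, 3)], 1): the cover {3} works, so G indeed has a 3-clique.
```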

• Hartmanis and Stearns, "On the computational complexity of algorithms", Transactions of the American Mathematical Society 117: 285–306 (1965)

This was the first paper to take the study of time complexity seriously, and it was surely the primary impetus for Hartmanis and Stearns' joint Turing Award. While their initial definitions are not quite what we use today, the paper remains extremely readable. You really get a feeling for how things were in the old "Wild West" frontier of the '60s.

• Quantum Mechanical Computers (PDF) by Richard Feynman.

He introduces the idea of quantum computation, describes quantum circuits, explains how classical circuits can be simulated by quantum circuits, and shows how quantum circuits can compute functions without lots of garbage qubits (using uncomputation).

He then shows how any classical circuit can be encoded into a time-independent Hamiltonian! The proof goes through for quantum circuits too, thereby showing that simulating Hamiltonian time evolution is BQP-hard! His Hamiltonian construction is also used in Kitaev's proof of the quantum analogue of the Cook-Levin theorem, which shows that k-local Hamiltonian is QMA-complete.
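To make the "simulate classical circuits reversibly, then uncompute the garbage" idea concrete, here is a small numpy sketch (my own illustration, not Feynman's Hamiltonian construction): on computational basis states, the Toffoli gate computes AND into an ancilla, and applying it a second time restores the ancilla to 0.

```python
import numpy as np

# Toffoli (CCNOT) on 3 bits: flips the target iff both controls are 1.
# On basis states this is exactly reversible classical AND.
TOFFOLI = np.eye(8)
TOFFOLI[[6, 7], [6, 7]] = 0
TOFFOLI[6, 7] = TOFFOLI[7, 6] = 1

def basis(a, b, c):
    """|a b c> as an 8-dimensional state vector."""
    v = np.zeros(8)
    v[4 * a + 2 * b + c] = 1.0
    return v

state = basis(1, 1, 0)            # inputs a=1, b=1, ancilla=0
state = TOFFOLI @ state           # ancilla now holds a AND b
print(np.argmax(state))           # 7, i.e. |1 1 1>

state = TOFFOLI @ state           # applying the gate again "uncomputes"
print(np.argmax(state))           # 6, i.e. |1 1 0>: ancilla cleanly back to 0
```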

The link isn't valid. Do you have another source? Edit: searched on Google: http://www.wjzeng.net/Ref/Feynman_QuantumMechanicalComputers.pdf -- is it this one?

That's the one. I added a new link and a link to its page on the publisher's website.

Did the notions of BQP and QMA exist when Feynman wrote this paper? Or is there a recent paper which draws this connection? Any reference/exposition of this fact that k-local Hamiltonian is QMA complete?

• Expander graphs and their applications, S. Hoory, N. Linial, and A. Wigderson is an extremely nice survey on expander graphs. No surprise that it won the 2008 AMS Conant Prize.
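If you want to play with the objects the survey studies, here is a small sketch (my own, not from the survey) that samples a random d-regular graph with networkx and looks at its second-largest adjacency eigenvalue, the quantity whose smallness is what makes a graph a good expander:

```python
import networkx as nx
import numpy as np

# Sample a random d-regular graph and inspect its spectral gap d - lambda_2.
# Random regular graphs are near-optimal expanders with high probability.
d, n = 3, 500
G = nx.random_regular_graph(d, n, seed=0)
A = nx.to_numpy_array(G)
eigenvalues = np.sort(np.linalg.eigvalsh(A))[::-1]
lambda2 = max(abs(eigenvalues[1]), abs(eigenvalues[-1]))
print(f"degree {d}, second eigenvalue {lambda2:.3f}, "
      f"Ramanujan bound 2*sqrt(d-1) = {2 * np.sqrt(d - 1):.3f}")
```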

I'd also like to point out that expander graphs are the key ingredient in recent breakthroughs in TCS, e.g.

and not so recent:

Look up combinatorial or support preconditioners: expander graphs are even used in numerical analysis today.

@shuhalo Can you give an example?