Found in 11 comments on Hacker News
lisper · 2022-11-25 · Original thread
> How many good people were discouraged from going into this area of research because of this widespread belief that this pursuit is proven to be unachievable in the general case?

You need to read this book:

https://www.amazon.com/Quantum-Computing-since-Democritus-Aa...

(The title notwithstanding, this book is about a lot more than just quantum computing. In particular, it is about computational complexity theory, which is exactly where your line of inquiry leads. It is a very productive area of research.)

IlliOnato · 2021-02-08 · Original thread
Perhaps not exactly what you ask for, i.e. not so much on hardware/engineering as on the physics and mathematics that quantum computing is built on: https://www.amazon.com/Quantum-Computing-since-Democritus-Aa...
formalsystem · 2019-08-18 · Original thread
There's so much more to CS theory than big-O notation, and you're in for a treat if you check out any of the books below; happy to share more if people are interested.

1. https://www.amazon.com/Computational-Complexity-Approach-San...

2. https://www.amazon.com/Quantum-Computing-since-Democritus-Aa...

3. https://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden...

4. https://www.amazon.com/Introduction-Theory-Computation-Micha...

formalsystem · 2019-04-17 · Original thread
I organized a quantum computing seminar back in school, and two resources I thought were excellent:

* https://people.eecs.berkeley.edu/~vazirani/f16quantum.html

* https://www.amazon.com/Quantum-Computing-since-Democritus-Aa...

This reference also looks solid and I'm looking forward to reading it in more depth.

losvedir · 2016-10-12 · Original thread
Ha, this immediately calls to mind Scott Aaronson's approach to teaching Quantum Mechanics in one of his lectures:

> There are two ways to teach quantum mechanics. The first way -- which for most physicists today is still the only way -- follows the historical order in which the ideas were discovered. So, you start with classical mechanics and electrodynamics, solving lots of grueling differential equations at every step. Then you learn about the "blackbody paradox" and various strange experimental results, and the great crisis these things posed for physics. Next you learn a complicated patchwork of ideas that physicists invented between 1900 and 1926 to try to make the crisis go away. Then, if you're lucky, after years of study you finally get around to the central conceptual point: that nature is described not by probabilities (which are always nonnegative), but by numbers called amplitudes that can be positive, negative, or even complex.

> Today, in the quantum information age, the fact that all the physicists had to learn quantum this way seems increasingly humorous. For example, I've had experts in quantum field theory -- people who've spent years calculating path integrals of mind-boggling complexity -- ask me to explain the Bell inequality to them. That's like Andrew Wiles asking me to explain the Pythagorean Theorem.

> As a direct result of this "QWERTY" approach to explaining quantum mechanics -- which you can see reflected in almost every popular book and article, down to the present -- the subject acquired an undeserved reputation for being hard. Educated people memorized the slogans -- "light is both a wave and a particle," "the cat is neither dead nor alive until you look," "you can ask about the position or the momentum, but not both," "one particle instantly learns the spin of the other through spooky action-at-a-distance," etc. -- and also learned that they shouldn't even try to understand such things without years of painstaking work.

> The second way to teach quantum mechanics leaves a blow-by-blow account of its discovery to the historians, and instead starts directly from the conceptual core -- namely, a certain generalization of probability theory to allow minus signs. Once you know what the theory is actually about, you can then sprinkle in physics to taste, and calculate the spectrum of whatever atom you want. This second approach is the one I'll be following here.

Here's the full lecture.[0] The approach was interesting enough that I bought his full book[1], but unfortunately it was a little over my head.

[0] http://www.scottaaronson.com/democritus/lec9.html

[1] https://www.amazon.com/Quantum-Computing-since-Democritus-Aa...
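
To make the quoted "probability theory with minus signs" point concrete, here's a minimal sketch in Python/NumPy (an illustration, not something from the lecture or the book): applying the same Hadamard "coin flip" twice shows amplitudes cancelling in a way a classical coin flip never does.

    # Minimal sketch of "probability with minus signs" (assumes numpy is installed).
    import numpy as np

    ket0 = np.array([1, 0], dtype=complex)                 # a qubit: a 2-vector of amplitudes
    H = np.array([[1, 1],
                  [1, -1]], dtype=complex) / np.sqrt(2)    # Hadamard gate; note the minus sign

    once = H @ ket0       # amplitudes (0.707, 0.707): measuring gives 0 or 1 at 50/50
    twice = H @ once      # the |1> paths cancel, returning exactly to |0>

    print(np.abs(once) ** 2)    # [0.5 0.5]
    print(np.abs(twice) ** 2)   # [1. 0.] -- a classical fair coin flipped twice would still be 50/50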

Strilanc · 2015-08-03 · Original thread
Michael Nielsen, co-author of the de facto standard textbook for quantum computing [1], has an accessible "Quantum Computing for the Determined" video series on YouTube [2].

(There used to be a pdf of the textbook online at [3], but it seems to have been removed...)

Scott Aaronson's Quantum Computing Since Democritus [4] is also good, but at a more abstract level. The well-written lecture notes it's based on are on his site [5].

General quantum physics knowledge can also help, but physics-focused material tends to lean on calculus, whereas quantum computing mostly needs only linear algebra (a small sketch of what that looks like follows the links below). I liked The Theoretical Minimum [6].

1: http://www.amazon.com/Quantum-Computation-Information-Annive...

2: https://www.youtube.com/playlist?list=PL1826E60FD05B44E4

3: http://www.johnboccio.com/research/quantum/notes/QC10th.pdf

4: http://www.amazon.com/Quantum-Computing-since-Democritus-Aar...

5: http://www.scottaaronson.com/democritus/lec1.html

6: http://www.amazon.com/Quantum-Mechanics-The-Theoretical-Mini...
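
As a rough illustration of that linear-algebra point (a sketch assuming Python and NumPy, not something from the books above): preparing an entangled Bell state takes nothing beyond a matrix product and a Kronecker (tensor) product.

    # Building a Bell state with plain linear algebra (assumes numpy is installed).
    import numpy as np

    ket00 = np.array([1, 0, 0, 0], dtype=complex)            # two qubits, both |0>
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    I = np.eye(2, dtype=complex)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)

    # Gates compose by matrix multiplication; acting on one qubit of a pair
    # is a Kronecker product with the identity. No differential equations needed.
    bell = CNOT @ np.kron(H, I) @ ket00
    print(np.abs(bell) ** 2)    # [0.5 0.  0.  0.5] -- the (|00> + |11>)/sqrt(2) state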

jerf · 2015-07-09 · Original thread
"The weight of opinion seems against him on quantum though he remains empirically correct for the foreseeable future."

Then you're misjudging the weights "for" and "against" him by putting far too much weight on people who don't know what they're talking about and far too little on those who do. Your "for" group is likely still operating under the idea that "quantum" works by "trying all possible answers and then returning the correct one". That idea is provably wrong, both empirically and mathematically, and anyone operating under it deserves a weight of zero.

Despite the fact that we still can't build a very big quantum computer, we actually know quite a bit about what such machines can and cannot do. And as Scott Aaronson points out frequently, if they turn out to be able to do something our current theories say they cannot, or unable to do something our current theories say they can, that will be very interesting either way, precisely because it would imply that something is wrong with quantum mechanics, which for all its "woo woo" reputation is one of the most solid math-to-reality theories we have ever had in the history of mankind.

Scott Aaronson isn't "holding" the line... he and his fellow-travelers are drawing the line.

Edit: I'm also unsure why you think Aaronson believes quantum "won't work"... he's on the optimist side that quantum computers can be made practical. If you mean that he doesn't think quantum computers can efficiently solve NP-complete problems, well, of course he doesn't... he understands the evidence against it (including the proof that brute-force quantum search gains only a quadratic speedup), so that's hardly surprising.

Edit edit: A positive followup to this negative message: Consider reading Quantum Computing Since Democritus [1], or if you don't want to spend the dough, read through the class notes that turned into that book [2].

[1]: http://www.amazon.com/Quantum-Computing-since-Democritus-Aar...

[2]: http://www.scottaaronson.com/democritus/ , see "Lecture Notes" section.

javajosh · 2014-09-27 · Original thread
FYI Scott Aaronson wrote Quantum Computing since Democritus[1] and it is an amazing book. Highly recommended for anyone interested in math, physics, theoretical computer science, quantum computing or just good science writing.

[1] http://www.amazon.com/Quantum-Computing-since-Democritus-Aar...

shadowmint · 2013-06-29 · Original thread
Recommended reading: http://www.amazon.com/Quantum-Computing-since-Democritus-Aar...

Lots of interesting material about quantum computing and quantum algorithms in there. Some tidbits:

    var x; var y = x
    print(x)        <--- impossible: quantum information can't be duplicated, so x no longer exists

And:

    var y
    var x = f(y)    <--- the value of y is now changed; you must undo f with f' to return y to its original state

I'm sure some QPLs (quantum programming languages) will build this sort of stuff in so you don't have to worry about it, but it's definitely interesting reading.

(this is probably more directly relevant to programming though, and free: http://sneezy.cs.nott.ac.uk/qml/compiler/jjg-thesis.pdf)
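
Here is a rough sketch of the two constraints the pseudocode above gestures at, written in plain NumPy rather than any actual QPL (an illustration, not something from the book or the linked thesis): a gate is undone by its conjugate transpose, and a naive "copy" circuit fails on superpositions.

    # Reversibility and no-cloning with plain linear algebra (assumes numpy is installed).
    import numpy as np

    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    psi = H @ np.array([1, 0], dtype=complex)        # a superposition, (|0> + |1>)/sqrt(2)

    # 1. Every gate f is unitary, so "f'" is just its conjugate transpose:
    #    applying f and then f' restores the state exactly.
    f = np.array([[1, 0], [0, 1j]], dtype=complex)   # an example gate (the S / phase gate)
    print(np.allclose(f.conj().T @ (f @ psi), psi))  # True

    # 2. Trying to "copy" psi into a fresh |0> qubit with a CNOT works for the
    #    basis states |0> and |1>, but on a superposition it produces
    #    entanglement rather than an independent copy.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)
    attempt = CNOT @ np.kron(psi, np.array([1, 0], dtype=complex))
    print(np.allclose(attempt, np.kron(psi, psi)))   # False -- no cloning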

crm416 · 2013-05-16 · Original thread
Aaronson is fabulous. In my opinion, possibly the best technical author around these days. If you haven't already, his new book[1] is absolutely worth checking out, especially if you're unfamiliar with quantum.

---

[1] : http://www.amazon.com/Quantum-Computing-since-Democritus-Aar...
