This reference also looks solid and I'm looking forward to reading it in more depth.
> There are two ways to teach quantum mechanics. The first way -- which for most physicists today is still the only way -- follows the historical order in which the ideas were discovered. So, you start with classical mechanics and electrodynamics, solving lots of grueling differential equations at every step. Then you learn about the "blackbody paradox" and various strange experimental results, and the great crisis these things posed for physics. Next you learn a complicated patchwork of ideas that physicists invented between 1900 and 1926 to try to make the crisis go away. Then, if you're lucky, after years of study you finally get around to the central conceptual point: that nature is described not by probabilities (which are always nonnegative), but by numbers called amplitudes that can be positive, negative, or even complex.
> Today, in the quantum information age, the fact that all the physicists had to learn quantum this way seems increasingly humorous. For example, I've had experts in quantum field theory -- people who've spent years calculating path integrals of mind-boggling complexity -- ask me to explain the Bell inequality to them. That's like Andrew Wiles asking me to explain the Pythagorean Theorem.
> As a direct result of this "QWERTY" approach to explaining quantum mechanics -- which you can see reflected in almost every popular book and article, down to the present -- the subject acquired an undeserved reputation for being hard. Educated people memorized the slogans -- "light is both a wave and a particle," "the cat is neither dead nor alive until you look," "you can ask about the position or the momentum, but not both," "one particle instantly learns the spin of the other through spooky action-at-a-distance," etc. -- and also learned that they shouldn't even try to understand such things without years of painstaking work.
> The second way to teach quantum mechanics leaves a blow-by-blow account of its discovery to the historians, and instead starts directly from the conceptual core -- namely, a certain generalization of probability theory to allow minus signs. Once you know what the theory is actually about, you can then sprinkle in physics to taste, and calculate the spectrum of whatever atom you want. This second approach is the one I'll be following here.
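That "generalization of probability theory to allow minus signs" can be made concrete in a few lines. Here is a minimal sketch in plain Python (no quantum library): a qubit is a pair of amplitudes, measurement probabilities are squared magnitudes, and applying the Hadamard "quantum coin flip" twice brings you back to the start because the minus sign makes the "tails" amplitudes cancel. The function names are my own, not from any of the books mentioned.

```python
import math

def probabilities(state):
    # Amplitudes can be negative (or complex); probabilities are their
    # squared magnitudes, so they always come out nonnegative.
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

def hadamard(state):
    # The Hadamard transform: a fair "quantum coin flip".
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

start = (1.0, 0.0)          # definitely "heads"
once = hadamard(start)      # ~(0.5, 0.5): looks like a random coin
twice = hadamard(once)      # the minus sign cancels "tails" entirely

print(probabilities(once))
print(probabilities(twice)) # ~(1.0, 0.0): certainty again
```

A classical random coin flipped twice stays random; the interference in the second flip is exactly what probabilities-with-minus-signs buys you.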
Here's the full lecture. The approach was interesting enough that I bought his full book, but unfortunately it was a little over my head.
also free online
(There used to be a pdf of the textbook online at , but it seems to have been removed...)
Scott Aaronson's Quantum Computing Since Democritus is also good, but at a more abstract level. The well-written lecture notes it's based on are on his site.
General quantum physics knowledge can also help, but physics-focused material leans heavily on calculus, whereas quantum computing mostly needs only linear algebra. I liked The Theoretical Minimum.
Then you're misjudging the weights "for" and "against" him by putting far too much weight on people who don't know what they're talking about and far too little on those who do. Your "for" group is likely still operating under the idea that a quantum computer works by "trying all possible answers and then returning the correct one." That picture is empirically and mathematically wrong. Anyone operating under it deserves to be weighted at zero.
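Why the "try all answers at once" picture fails is itself simple to show. A sketch of the standard fact (the Born rule), in plain Python: put N candidates in an equal superposition and measure, and each candidate comes up with probability 1/N, i.e. a uniform random guess, no better than classical. The speedups come from interference arranging wrong answers to cancel, not from parallel trying.

```python
# Equal superposition over N candidate answers: each amplitude is 1/sqrt(N).
N = 1000
amplitudes = [1 / N ** 0.5] * N

# Born rule: measuring yields candidate i with probability |amplitude_i|^2.
probs = [a ** 2 for a in amplitudes]

print(probs[0])      # 1/N -- same as guessing uniformly at random
print(sum(probs))    # ~1.0 -- probabilities still sum to one
```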
Despite the fact we still can't build a very big quantum computer, we actually do know quite a bit about what they can do and not do. And as Scott Aaronson points out very frequently, if in fact they prove either able to do something our current theories say they can not or unable to do something that our current theories say they can, either way that will be very interesting, precisely because it will imply that there is something wrong with quantum mechanics, which for all its "woo woo" reputation is one of the most solid math-to-reality theories we have ever had in the history of mankind.
Scott Aaronson isn't "holding" the line... he and his fellow-travelers are drawing the line.
Edit: I'm also unsure why you think Aaronson believes quantum "won't work"... he's on the optimist side that quantum computers can be made practical. If you mean that he doesn't think "quantum" can efficiently solve NP-complete problems, well, of course he doesn't... he understands the complexity-theoretic evidence (e.g., the proven optimality of Grover's merely quadratic speedup for black-box search) that it almost certainly can't, so that's hardly surprising.
Edit edit: A positive followup to this negative message: consider reading Quantum Computing Since Democritus, or if you don't want to spend the dough, read through the class notes that turned into that book.
: http://www.scottaaronson.com/democritus/ , see "Lecture Notes" section.
Lots of interesting stuff about quantum computing and quantum algorithms in there. Some interesting tidbits:
- var x; var y = x
- print(x) <--- Impossible: quantum information can't be duplicated, so assigning x to y consumes it, and x no longer exists.
- var y
- var x = f(y) <--- The value of y is now changed. You must undo f with its inverse f' to return y to its original state.
(this is probably more directly relevant to programming though, and free: http://sneezy.cs.nott.ac.uk/qml/compiler/jjg-thesis.pdf)
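The "undo f with f'" point can be sketched without any quantum machinery, since quantum operations are reversible (unitary) by definition. Below is an illustrative Python model, not QML: f is a rotation of the state's two amplitudes, and f' is simply the reverse rotation, which restores y exactly. The names `rotate` and `theta` are my own.

```python
import math

def rotate(state, theta):
    # A rotation is a simple real-valued unitary: it preserves the
    # squared-magnitude sum, and rotating by -theta undoes it exactly.
    a, b = state
    c, s = math.cos(theta), math.sin(theta)
    return (c * a - s * b, s * a + c * b)

y = (0.6, 0.8)              # a normalized state: 0.36 + 0.64 = 1
x = rotate(y, 0.7)          # "var x = f(y)": the register now holds x
restored = rotate(x, -0.7)  # applying f' recovers y's original value

print(restored)             # ~(0.6, 0.8)
```

Irreversible steps like discarding a value simply aren't available as quantum operations, which is why the thesis above has to thread inverses through the program.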
 : http://www.amazon.com/Quantum-Computing-since-Democritus-Aar...