I worked at one of the quantum computing co's on their compiler stack (so pretty much pure classical compute stuff), but in order to have even a baseline understanding of the computations and programming using qubits, I had to first get a better intuition for the underlying quantum mechanics at play. This was a great introduction to the physics underpinning the computations:
https://www.youtube.com/watch?v=lZ3bPUKo5zc&list=PLUl4u3cNGP...
It's long, and the subject matter is intimidating at times, but watch, re-watch, then go deep by finding papers on subjects like superposition and entanglement, which are the key quantum phenomena that unlock quantum computing.
It also helps to understand a bit about how the various qubit modalities are physically operated and controlled (e.g. how a program turns into qubit rotations, readouts, and other instruction executions). Some are superconducting chips driven by microwave pulses, some trap an ion or atom and manipulate its state with lasers, and some are photonic chips routing light through gates - among a handful of other modalities across industry and academia.
IBM's Qiskit platform may still have tooling, simulators, and visualizers that help you write a program and step through the operations on the qubit(s) managed by the program:
https://www.ibm.com/quantum/qiskit
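To make that concrete, here's a minimal toy example of my own of the kind of circuit you'd write and simulate (Qiskit's APIs shift between versions, so treat the exact imports as an assumption):

    # Minimal Bell-state example (assumes qiskit and qiskit-aer are installed)
    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(2)
    qc.h(0)           # put qubit 0 into superposition
    qc.cx(0, 1)       # entangle qubit 1 with qubit 0
    qc.measure_all()  # read both qubits out

    sim = AerSimulator()
    counts = sim.run(qc, shots=1000).result().get_counts()
    print(counts)     # roughly half '00' and half '11'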
I've recently learned about Q# (https://en.wikipedia.org/wiki/Q_Sharp), though I haven't tried it yet.
Very Nice!
> how does a program turn into qubit rotations, readouts, and other instruction executions
What is actually involved in the "instruction set" for a quantum computer? How do you "compile" to it? If I treat everything below a "logical qubit" (https://en.wikipedia.org/wiki/Physical_and_logical_qubits) as a black box - since, from a programming point of view, it does not(?) matter - can I think of it using classical computation models?
This is analogous to how one does not need to know semiconductor physics (which is quantum physics) or electronic component physics to understand the Boolean logic framework built on top of them, which is then synthesized into an instruction set to program against.
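To make the question concrete, here is roughly what "compiling" looks like in practice with Qiskit's transpiler - a sketch of my own, assuming a typical superconducting basis gate set (rz, sx, x, cx) rather than any specific machine:

    from qiskit import QuantumCircuit, transpile

    qc = QuantumCircuit(2)
    qc.h(0)
    qc.cx(0, 1)

    # Rewrite the circuit into the assumed native gates; the H gate gets
    # decomposed into rz/sx rotations that the control system can pulse.
    compiled = transpile(qc, basis_gates=["rz", "sx", "x", "cx"], optimization_level=1)
    print(compiled)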
It does! They also still have all their summer schools up that you can go through step by step. Although I must promote Strawberry Fields, as I believe photonic integrated systems really are the better option.
A few comments have been kind enough to recommend the textbook I wrote with Ike Chuang. I'm glad they find it useful!
It's worth noting: the book assumes a fair bit of mathematical background, especially in linear algebra. If you don't have the equivalent of an undergrad CS/math/physics degree (with some linear algebra), it may be better to start with gentler sources.
One such gentler source is the free online text I wrote with Andy Matuschak -- https://quantum.country. I'm sure there are others which are very good, but perhaps that's helpful!
Both books focus on foundations of the field, and don't cover recent innovations -- the book with Ike Chuang is 26 years old! Still, many of the foundations have remained quite stable.
Michael Nielsen himself! Thanks for the pointer to your gentler introduction. Though I did my B.Sc. Honours in Chemistry decades ago (before switching to CS), I might have to bone up on the requisite mathematics, which is fine.
Given your experience in this domain, I would appreciate your take on quantum computing hype vs. reality. There is a lot of contradictory information out there, for example "The Case Against Quantum Computing" - https://spectrum.ieee.org/the-case-against-quantum-computing
Do you think quantum computing will ever become mainstream? Will the "common folk" be able to program and use it with the same ease with which we do classical computers by using layers of abstractions?
Potentially my only opportunity to say thank you for your efforts in creating your textbook, so thank you. It helped me get to the position I am today as an academic researcher in QC although I focus on Photonics.
A helpful way to learn this is to separate models, machines, and practice.
For computation models, the circuit model and measurement-based computation cover most real work. Aaronson’s Quantum Computing Since Democritus and Nielsen & Chuang explain why quantum differs from classical (interference, amplitudes, complexity limits).
For computers/architecture, think of qubits as noisy analog components and error correction as how digital reliability is built on top. Preskill’s NISQ notes are very clear here.
For programming, most work is circuit construction and simulation on classical hardware (Qiskit, Cirq). That’s normal and expected.
Beyond Shor, look at Grover, phase estimation, and variational algorithms—they show how quantum advantage might appear, even if it’s limited today.
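To make that last point concrete, here is a minimal Grover sketch of my own (two qubits, with a CZ oracle marking |11>); at this size a single iteration already lands on the marked state:

    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    qc = QuantumCircuit(2)
    qc.h([0, 1])   # uniform superposition over 00, 01, 10, 11
    qc.cz(0, 1)    # oracle: phase-flip the marked state |11>
    qc.h([0, 1])   # diffusion operator (inversion about the mean)
    qc.x([0, 1])
    qc.cz(0, 1)
    qc.x([0, 1])
    qc.h([0, 1])

    print(Statevector.from_instruction(qc).probabilities_dict())
    # {'11': ~1.0} - one iteration suffices for n = 2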
> A helpful way to learn this is to separate models, machines, and practice.
Yep, that is how I framed my question; glad to see it validated.
Thanks for the pointer to Preskill's NISQ notes.
I went through the comments, and one nice resource that's not on the list:
Quantum Mechanics and Quantum Computation by Umesh Vazirani (UC Berkeley course) - https://youtube.com/playlist?list=PL74Rel4IAsETUwZS_Se_P-fSE...
It's old, but really good.
Another nice one is:
Introduction to Classical and Quantum Computation by Wong - https://www.thomaswong.net/introduction-to-classical-and-qua... [PDF]
These are really nice.
My favorite QM book is the one by Eisberg and Resnick. I recommend it to other people.
There are some nice recommendations in this thread:
- Nielsen, Chuang
- quantum.country by Nielsen
- The IBM Qiskit ecosystem, community, platform, etc. are active and welcoming
Manning Publications has some books on the theme. It's worth searching through them.
The Umesh Vazirani lectures look great, so thanks for pointing them out.
As a sampler, I just watched the qubit ones and they are excellent.
1) Generally, the two models of QC are the digital/circuit model (analogous to digital logic gates, with some caveats such as reversibility of operations and the no-cloning theorem), and analog computation (tuning the parameters of a continuous-time quantum system in your lab such that the system produces useful output).
2) The physics/architecture/organization depends heavily on the type of computer being discussed. In classical computing, one "type" of computer has won the arms race. This is not yet the case for quantum computers: there are several different physical processes through which people are trying to generate computation - trapped ions, superconducting qubits, photonics, quantum dots, neutral atoms, etc.
3) There are several ways that you can simulate quantum computation on classical hardware; perhaps the most common is something like IBM's Qiskit, where you keep track of the degrees of freedom of the quantum computer throughout the computation and apply quantum logic gates in circuits (see the small sketch at the end of this comment). Another, more complicated method would be something like tensor network simulations, which are efficient classical simulators of a restricted subset of quantum states.
4) In terms of research, one particularly interesting (although I'm biased by working in the field) application is quantum algorithms for nuclear/high energy physics. Classical methods (Lattice QCD) suffer from extreme computational drawbacks (factorial scaling in the number of quarks, NP-Hard Monte Carlo sign problems), and one potential way around this is using quantum computers to simulate nuclear systems instead of classical computers ("The best model of a cat is another cat, the best model of a quantum system is another quantum system")
If you're interested in learning more about QC, I would highly recommend looking at Nielsen and Chuang's "Quantum Computation and Quantum Information", it's essentially the standard primer on the world of quantum computation.
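Re: point 3, here is a tiny sketch of what a naive statevector simulator is doing under the hood - plain NumPy, my own toy example, not any library's actual implementation:

    import numpy as np

    # Track the full 2^n-dimensional state vector and apply gates as matrices.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.array([1, 0, 0, 0], dtype=complex)  # |00>
    state = np.kron(H, I) @ state                  # Hadamard on the first qubit
    state = CNOT @ state                           # CNOT, first qubit as control
    print(np.abs(state) ** 2)                      # [0.5 0 0 0.5]: a Bell state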
Very Nice. Your comment meshes nicely with ktallett's comment here - https://news.ycombinator.com/item?id=46610185
The Nielsen/Chuang book is what I see recommended everywhere, so I am definitely going to get it. What others would you recommend?
I had recently asked a similar question about books on "Modern Physics" (essentially quantum physics + relativity) here - https://news.ycombinator.com/item?id=46473352 - so, given your profile, what would be your recommendations?
PS: You might want to add your website url to your HN profile since your Physics Notes might be helpful to a lot of other folks too. :-)
Most of what I use day to day in research is either specialized to my subfield or can be found in Nielsen and Chuang, so I've actually never looked at any other textbooks specifically for quantum computation. If you're interested in more of the information theory aspect, I have heard that "The Theory of Quantum Information" by John Watrous is a good text, but I have not personally read any of it.
As for Modern Physics, if you have the math prerequisites and you want a broad overview, the series of textbooks by Landau and Lifshitz would be my go-to. However, the problems are quite challenging and the text is relatively terse. I think the only other textbook that I've used personally would be Halliday, Resnick, and Krane. I didn't read a great deal of the textbook, but I do recall finding it relatively well-written.
The classic text is Nielsen and Chuang's "Quantum Computation and Quantum Information" [0]. Whatever else you choose to supplement this book with, it is worth having in your library.
[0] https://a.co/d/aPsexRB
Nielsen and Chuang has the clearest exposition of quantum mechanics I've seen anywhere. Last year I was trying to learn quantum mechanics, not necessarily quantum computation, just out of a general interest in theoretical physics. I started with physics textbooks (Griffiths and Shankar) but it only really "clicked" for me when I read the first few chapters of Nielsen and Chuang.
I came across "A free introduction to quantum computing and quantum mechanics" by Nielsen/Matuschak, which seems accessible - https://quantum.country/
For learning the theory behind quantum computing, I usually recommend Watrous's lecture notes [1] - they start out by immediately giving a helpful analogy to ordinary probabilistic computation.
The online tutorial [2] is a good followup, especially if you want to understand Clifford gates / stabilizer states, which are important for quantum error correction.
If you have a more theoretical bent, you may enjoy learning about the ZX-calculus [3] - I found this useful for understanding how measurement-based quantum computing is supposed to work.
[1] https://cs.uwaterloo.ca/~watrous/QC-notes/QC-notes.pdf [2] https://qubit.guide/ [3] https://zxcalculus.com/
Thanks for the pointers.
hershkumar pointed to Watrous's book, so the notes you point to might be a good introduction to the book itself.
I didn't know of the ZX-calculus, so that goes from my unknown-unknowns to known-unknowns, and there's a bunch of reading to be done there too.
Check out 3Blue1Brown's excellent YouTube video lesson on quantum computing and Grover's algorithm: https://youtu.be/RQWpF2Gb-gU
It goes through qubits, state vectors, and Grover's algorithm in a highly visual and intuitive fashion. It doesn't discuss the underlying quantum mechanics in depth, but does mention and link out to resources for the interested viewer to delve deeper.
Didn't know 3blue1brown had some videos on Quantum Computing so thanks for the pointer.
https://en.wikipedia.org/wiki/Quantum_Computing_Since_Democr...
"Quantum Computation and Quantum Information", by Nielsen and Chuang
I'd go with "Quantum Computing for Computer Scientists" by Yanofsky - it assumes essentially zero prerequisites. That is a nice base.
Good recommendation; definitely something which might meet my needs.
Thanks for the pointer.
I have "Essential Mathematics for Quantum Computing" by Woody and "Non-Standard Computation" by Gramß, et al. Both were worth reading, but assumed a bit of background with "foundations of computation."
How approachable and good is the first book? Does it marry the Mathematics to the Physics or is it simply a book on Linear Algebra etc.?
Standard textbook: Michael Nielsen and Isaac Chuang, "Quantum Computation and Quantum Information"
More mathy: A. Yu. Kitaev, A. H. Shen, M. N. Vyalyi, "Classical and Quantum Computation"
A killer app: Peter Shor, "Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer"
Some course notes: https://math.mit.edu/~shor/435-LN/
These are the two books I had zeroed in on before asking here.
However, how approachable is the "Classical and Quantum Computation" book? Mathematics is fine as long as it is accessible. Also, how good is its explanation of the analogy/comparison between concepts from "Classical Computation" vs. "Quantum Computation"? I believe this is the best way to learn the subject and hence am quite interested to know how this book does it.
QC Researcher here!
1/ Digital and analog - where digital equals qubits and analog equals photonics, diamonds, or a range of other bit replacements.
2/ Qubits and gates are the building blocks and operations in digital. Photons, diamonds, electrons, and so on are the bits in analog; you can encode information in any of these in various ways.
3/ Strawberry Fields for analog QC, and IBM's Qiskit for digital.
I work on photonic integrated circuits, adapting them to remove physical limitations on capacity such as heat and information loss.
Very nice and succinct points.
What are some good resources that you would recommend to study and understand the above?
Also do you think QC will ever become mainstream like classical computing?