Spectral crowding, not a metaphor

There was a conversation on LinkedIn about quantum computing and its possibility or impossibility. I was feeling feisty, so I replied. The main argument in the thread was that each qubit in a quantum computer has an energy that must be individually addressed, and that each energy has an uncertainty set by quantum mechanics. This means that if we naively tried to stack qubits with equal coupling into energy space, there would be a limit to how many would fit. I disagree with this argument not because it is false, but because current experimental systems are getting smarter and have developed ways to avoid the problem. Here is my reply.
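To make the crowding argument concrete, here is a back-of-envelope sketch of the naive limit: if every qubit must sit at its own frequency, the number that fit is roughly the usable bandwidth divided by the spacing each qubit needs. All the numbers below (bandwidth, coherence time, guard factor) are hypothetical, chosen only to show the scaling, not taken from any particular device.

```python
import math

# Naive "spectral crowding" estimate: qubits packed into a fixed control
# bandwidth, each occupying a linewidth set by its coherence time, with a
# guard factor of extra spacing to suppress crosstalk. All values assumed.

bandwidth_hz = 4e9      # assumed usable control bandwidth (e.g. a 4 GHz window)
t2_seconds = 10e-6      # assumed qubit coherence time T2
guard_factor = 100      # assumed spacing margin between neighboring qubits

# Lorentzian linewidth of a qubit transition is roughly 1 / (pi * T2)
linewidth_hz = 1 / (math.pi * t2_seconds)

max_qubits = int(bandwidth_hz / (guard_factor * linewidth_hz))
print(f"linewidth ≈ {linewidth_hz / 1e3:.1f} kHz")
print(f"naive frequency-crowding limit ≈ {max_qubits} qubits")
```

With these assumed numbers the naive limit lands in the low thousands of qubits, which is exactly why the architectural tricks discussed below matter: they remove the requirement that every qubit occupy its own permanent slot in frequency space.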

This was quite a complex thread and I did my best to follow it in a ten-minute read, so I may have missed some aspects of the arguments. I would say the energy crowding discussed with the 2^64 states is a problem of hardware architecture. The Martinis group at UCSB has a very nice solution to this in what they call the RezQu architecture, which is essentially a von Neumann computing system with a memory and a processor. There is also great work being done on solid-state tunable couplers with ON/OFF ratios sufficient to controllably decouple qubits beyond the threshold for fault tolerance. Both of these technological advances would avoid frequency crowding issues, including measurement crosstalk and single-qubit operations that disturb neighboring qubits; a RezQu architecture combined with tunable couplers would in principle be overkill for suppressing these unwanted errors.

From within the field I can say that everyone working seriously on quantum computing is well aware of the foreseeable issues and is working brilliantly to solve them on the fly. I would say most objections to quantum computing come not from people working to disprove the possibility of its realization, but rather from researchers discovering new and exciting research questions as the field advances. The most daunting hindrance at this point is engineering and materials science rather than any fundamental physics limitation.
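The role of the ON/OFF ratio can be sketched numerically: when a tunable coupler is "off", a residual coupling remains, and the question is whether the error it induces per gate sits below a fault-tolerance threshold. The coupling strength, ratio, gate time, and threshold below are all assumed for illustration, and the error estimate is a deliberately crude phase-accumulation model, not the analysis any particular group uses.

```python
import math

# Crude sketch: residual coupling left by a tunable coupler with a finite
# ON/OFF ratio, compared against an assumed fault-tolerance threshold.
# All parameter values are hypothetical.

on_coupling_hz = 20e6   # assumed coupling strength with the coupler ON
on_off_ratio = 1e4      # assumed ON/OFF ratio of the coupler
gate_time_s = 50e-9     # assumed two-qubit gate duration
threshold = 1e-2        # assumed error threshold for fault tolerance

residual_hz = on_coupling_hz / on_off_ratio
# rough error proxy: square of the unwanted phase accumulated in one gate
error_per_gate = (2 * math.pi * residual_hz * gate_time_s) ** 2

print(f"residual coupling ≈ {residual_hz / 1e3:.1f} kHz")
print(f"crosstalk error per gate ≈ {error_per_gate:.2e} "
      f"(below threshold: {error_per_gate < threshold})")
```

The point of the sketch is only the scaling: the residual error falls quadratically with the ON/OFF ratio, so a sufficiently good coupler pushes crowding-induced crosstalk well under the threshold.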

I would point out, though, that the real issue for further advances in quantum computing will be reliable demonstrations of progress. Full quantum process tomography will be a great challenge in the coming years. But this is research, and many techniques from image processing and large-scale data analysis will need to infiltrate the field, because we must collect and process incredible amounts of data to verify that the quantum mechanics of the system is what we think it is. At some point, I think, the proof will be in the pudding: algorithms will work reliably even though we cannot simulate them, but that is the point of quantum computing! That is, if we as a community choose to continue developing the technology needed. I cannot say whether quantum computing is possible or impossible, but I have seen nothing yet that convinces me it is impossible. My disclaimer: I have worked on weak measurement reversal and, with my PhD advisor, proposed an application that was realized in experiment and extended to reverse entanglement sudden death, which was thought to be impossible, so I am a bit of an optimist.
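The data problem behind process tomography is easy to see in numbers: the process matrix of an n-qubit channel has on the order of d^4 real parameters, with d = 2^n, so the data needed grows exponentially with qubit count. The shot count per setting below is an assumed illustrative figure, not a recommendation.

```python
# Sketch of why full quantum process tomography becomes a data problem:
# an n-qubit channel lives in a d^2 x d^2 process matrix (d = 2**n),
# so the number of parameters to estimate scales as d**4.

shots_per_setting = 1000  # assumed repetitions per measurement configuration

for n_qubits in (1, 2, 3, 5, 10):
    d = 2 ** n_qubits
    parameters = d ** 4  # entries of the process (chi) matrix
    measurements = parameters * shots_per_setting
    print(f"{n_qubits:2d} qubits: ~{parameters:.2e} parameters, "
          f"~{measurements:.2e} measurements")
```

Already at ten qubits the parameter count is around 10^12, which is why the thread's verification question pushes toward statistics and large-scale data analysis rather than brute-force tomography.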