{"id":196715,"date":"2023-12-07T21:39:24","date_gmt":"2023-12-07T21:39:24","guid":{"rendered":"https:\/\/tokenstalk.info\/?p=196715"},"modified":"2023-12-07T21:39:24","modified_gmt":"2023-12-07T21:39:24","slug":"harvard-scientists-claim-breakthrough-advent-of-early-error-corrected-quantum-computation","status":"publish","type":"post","link":"https:\/\/tokenstalk.info\/crypto\/harvard-scientists-claim-breakthrough-advent-of-early-error-corrected-quantum-computation\/","title":{"rendered":"Harvard scientists claim breakthrough, \u2018advent of early error-corrected quantum computation\u2019"},"content":{"rendered":"
When industry insiders talk about a future where quantum computers are capable of solving problems that classical, binary computers can\u2019t, they\u2019re referring to something called \u201cquantum advantage.\u201d<\/p>\n
To achieve this advantage, quantum computers need to be stable enough to scale in size and capability. By and large, quantum computing experts believe the largest impediment to scalability in quantum computing systems is noise.<\/p>\n
Related: <\/em><\/strong>Moody\u2019s launches quantum-as-a-service platform for finance<\/em><\/strong><\/p>\n The Harvard team\u2019s research paper, titled \u201cLogical quantum processor based on reconfigurable atom arrays,\u201d describes a method for running quantum computations that resist errors and overcome noise.<\/p>\n Per the paper:<\/p>\n \u201cThese results herald the advent of early error-corrected quantum computation and chart a path toward large-scale logical processors.\u201d<\/p><\/blockquote>\nNoisy qubits<\/h2>\n Insiders refer to the current state of quantum computing as the Noisy Intermediate-Scale Quantum (NISQ) era. This era is defined by quantum computers with fewer than 1,000 qubits (the quantum version of a computer bit) that are, by and large, \u201cnoisy.\u201d<\/p>\n Noisy qubits are a problem because, in this case, it means they\u2019re prone to faults and errors.<\/p>\n The Harvard team is claiming to have reached \u201cearly error-corrected quantum computations\u201d that overcome noise at world-first scales. Judging by the paper, however, the team has not yet reached full error correction, at least not as most experts would likely define it.<\/p>\nErrors and measurements<\/h2>\n Quantum computing is difficult because, unlike a classical computer bit, a qubit loses its information when it is measured, and the only way to know whether a given physical qubit has experienced an error during a calculation is to measure it.<\/p>\n Full error-correction would entail the development of a quantum system capable of identifying and correcting errors as they pop up during the computational process. So far, these techniques have proven very hard to scale.<\/p>\n What the Harvard team\u2019s processor does, rather than correct errors during calculations, is add a post-processing error-detection phase wherein erroneous results are identified and rejected.<\/p>\n This, according to the research, provides a new and perhaps accelerated pathway for scaling quantum computers beyond the NISQ era and into the realm of quantum advantage.<\/p>\n While the work is promising, a DARPA press release indicated that systems at least an order of magnitude larger than the 48 logical qubits used in the team\u2019s experiments will be needed to \u201csolve any big problems envisioned for quantum computers.\u201d<\/p>\n The researchers claim the techniques they\u2019ve developed should be scalable to quantum systems with over 10,000 qubits.<\/p>\n
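The detect-and-discard idea described above can be illustrated with a toy classical simulation. This is not the Harvard team\u2019s actual method or code (their experiments use neutral-atom qubits and transversal gates); it is a minimal sketch, assuming a simple three-copy repetition encoding, of how rejecting runs with detected disagreement suppresses the error rate of the results you keep:<\/p>\n

```python
import random

def run_once(bit, flip_p, rng):
    """Encode one logical bit as three physical copies, then apply noise
    by flipping each copy independently with probability flip_p."""
    return [bit if rng.random() > flip_p else 1 - bit for _ in range(3)]

def post_select(shots, flip_p=0.05, seed=1):
    """Detect-and-discard: keep only runs whose three copies still agree.

    Runs with any disagreement are rejected (error detected); a run is a
    logical error only if all three copies flipped, which goes undetected.
    """
    rng = random.Random(seed)
    kept, errors = 0, 0
    for _ in range(shots):
        copies = run_once(0, flip_p, rng)
        if len(set(copies)) > 1:   # disagreement detected -> reject this run
            continue
        kept += 1
        if copies[0] != 0:         # undetected logical error (all copies flipped)
            errors += 1
    return kept, errors

kept, errors = post_select(100_000)
print(f"kept {kept} of 100000 runs; undetected logical errors: {errors}")
```

The trade-off the sketch shows: discarding flagged runs costs shot count (roughly 14% of runs are rejected at a 5% flip rate), but the error rate among the kept runs falls from the physical rate p to roughly p cubed, without any mid-circuit correction.<\/p>\n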