Talk:Quantum computing

From Wikipedia, the free encyclopedia
Quantum computing is a former featured article. Please see the links under Article milestones below for its original nomination page (for older articles, check the nomination archive) and why it was removed.
Article milestones
Date              Process                     Result
January 19, 2004  Refreshing brilliant prose  Kept
May 9, 2006       Featured article review     Kept
May 13, 2007      Featured article review     Demoted
Current status: Former featured article


A bit is not "physical"


One sentence reads as follows:

"A classical bit, by definition, exists in either of two physical states, which can be denoted 0 and 1."

This is misuse of the word "physical".

A bit is a concept, not a physical entity. 2601:204:F181:9410:2191:ADEC:5EE:A9CD (talk) 21:30, 23 February 2025 (UTC)[reply]

The light switch in my room disagrees. Johnjbarton (talk) 23:50, 16 April 2025 (UTC)[reply]
I would agree that "physical two states" is incorrect. In a CPU a bit isn't physical; the physical level is transistor-based circuits of gates, and the difference between a digital bit ZERO / ONE is a matter of a potential difference of around 1.5 volts DC. The physical level is the (tiny) transistors. EditorÆ (talk) 11:23, 23 July 2025 (UTC)[reply]

Speculation in "Potential applications"


I deleted two paragraphs of "Potential applications" as not encyclopedic. These are just reports of investments or industrial puffery. Johnjbarton (talk) 23:50, 16 April 2025 (UTC)[reply]

Instead of "classical" ?


Instead of "classical" computers, couldn't we use "transistor-based computers" (or semiconducting)? CPUs are transistor-based technology. EditorÆ (talk) 03:20, 23 July 2025 (UTC)[reply]

Being based on transistors is not relevant. Classical computer is an abstraction, rather than literal hardware, based on the idea that a bit can be in one of two deterministic states. The classical computer/quantum computer distinction is a standard one in the literature. Tito Omburo (talk) 10:58, 23 July 2025 (UTC)[reply]
I agree in general, but this is a comparison, not an article on "other computers". And "classical" could equally mean "electron tubes"; it's all a matter of perspective. Don't you think? Quantum-mechanics-based computers are here now, operational ones soon. And just as the transistor once replaced electron tubes, transistors will be replaced with qubits. EditorÆ (talk) 11:11, 23 July 2025 (UTC)[reply]
Classical computing is based on bits, quantum computing is based on qubits. Nothing about hardware is relevant to this distinction. Tito Omburo (talk) 11:16, 23 July 2025 (UTC)[reply]
No, the very first computer, Alan Turing's, was electromechanical and could only run one single program; to run a different one, cables had to be manually switched. That's classical, isn't it? Then more electron tubes made it possible to become programmable. Classical too. It's the word "classical" I find unsuitable. It's a subjective word. EditorÆ (talk) 11:54, 23 July 2025 (UTC)[reply]
A Turing machine is a classical computer. Tito Omburo (talk) 11:55, 23 July 2025 (UTC)[reply]
Yes. We have classical, more classical and the most classical. Suggestion: replace "classical computers" with "soon old computers", or something along those lines. If you do not agree, I give you the ball. Thanks EditorÆ (talk) 12:06, 23 July 2025 (UTC)[reply]
Look, the distinction between classical and quantum computing is well-attested in high-quality sources. That's what matters here, not whether quantum hardware will soon replace transistor technology (it will not). Tito Omburo (talk) 12:13, 23 July 2025 (UTC)[reply]
I agree with Tito here: the sources use "classical computers" to mean "computers based on physical principles predating quantum mechanics" rather than "previous generation computers". Johnjbarton (talk) 14:43, 23 July 2025 (UTC)[reply]
I agree with both of you, classical is the appropriate and established word. In physics the contrast quantum vs. classical is well established (and has been for 100 years) and that distinction has been inherited (in a probably even more clearly defined manner) by information theory and theory of computation. It's all over textbooks such as Nielsen/Chuang (The bit is the fundamental concept of classical computation and classical information. Quantum computation and quantum information is built upon an analogous concept, the quantum bit.) or Mermin, Quantum Computer Science (who builds all on the distinction of cbits and qbits) of Wilde: Quantum Information Theory (The history of classical information theory began with Claude Shannon...). Qcomp (talk) 16:49, 23 July 2025 (UTC)[reply]

We've made a few changes to the lede paragraph. Initially, I was unhappy with the idea that a "classical computer" uses only classical physics. Classical computation is not tied to the type of physics involved in the hardware principles. E.g., MOSFETs do not function classically. "Quantum computing" is a different paradigm in theoretical computing, not tied to whether quantum mechanics operates in the real world. Indeed, "classical" computing as we know it, would not be possible without nm scale quantum phenomena. Tito Omburo (talk) 20:29, 23 July 2025 (UTC)[reply]

Sorry but I think the most recent changes are leading off the topic and do not summarize the article. The intro should focus on "what quantum computing is" rather than get tangled up in details that might be covered in the article. Specifically too much of the first paragraph concerns QM in classical computing. Johnjbarton (talk) 21:49, 23 July 2025 (UTC)[reply]
I think the focus in the article quantum computing should be on how quantum computation differs from classical computation, rather than the precise physics involved. Quantum computing exists without any physics at all, in fact. And in the physical world "classical" computers are, in fact, quantum devices. Tito Omburo (talk) 22:37, 23 July 2025 (UTC)[reply]
Again I disagree. An article on quantum computing should cover all notable aspects of the topic, including the physics of quantum computing. In other words, the article is not "Comparison of quantum and classical computing". I don't think you will find any source to support the idea that "Quantum computing exists without any physics at all", whatever that means.
But the only point in your comment below which I want to focus on is the last one:
  • And in the physical world "classical" computers are, in fact, quantum devices.
My point above is simple: this claim is not a critical fact about the article topic and does not belong in the introduction unless it forms a significant section in the article per WP:INTRO. Johnjbarton (talk) 22:45, 23 July 2025 (UTC)[reply]
But it's wrong to say that ordinary computers are purely classical devices. This is, in fact, mentioned in the article. And it is simply wrong that any computer using quantum mechanics is a quantum computer. Also, the prior version of the lede incorrectly suggested that quantum computers are physical, hardware, devices, when the overwhelming consensus of the literature is that they are theoretical devices. Fwiw. Tito Omburo (talk)
fwiw here is a diff. Much more accurate than what preceded it. Tito Omburo (talk) 22:58, 23 July 2025 (UTC)[reply]
Sorry wrong link, I meant WP:LEAD.
Rather than generalities let's focus on this specific sentence:
  • While ordinary ("classical") computers may use quantum mechanics on a nanometer scale, because of the physics of transistors, the operation of an ordinary computer can, up to a constant factor of time, be replicated by a (classical) mechanical device, like the original Turing machine, in which all logic is deterministic.
Whether or not this is correct is not relevant: this sentence neither discusses the article topic nor summarizes any content in the article. Here we say almost the same thing but focus on the topic:
  • Unlike ordinary ("classical") computers, quantum computers cannot be replicated with any mechanical device; quantum computers are not deterministic Turing machines.
Introductions need to be compact and focused. Things like the constant factor of time for an ordinary computer belong in the body, if at all. Johnjbarton (talk) 04:28, 24 July 2025 (UTC)[reply]
I agree that the "While ordinary computers..." sentence is not needed here. I think the "constant factor" is wrong: in general, different classical computers are only thought to be equivalent up to a polynomial overhead. It is also not (as the lede makes it sound) mainly the non-deterministic nature of the quantum computer, which makes it more powerful: it is also thought to be exponentially faster than probabilistic classical Turing machines (at some tasks), while these latter ones are not known to be more powerful than deterministic Turing machines. Moreover, the lede suggests that it is known that quantum computers are faster than classical ones. But this is not the case. They are widely believed to be faster, because quantum mechanics is exponentially hard to simulate on classical devices. And because for certain tasks we know quantum algorithms that exponentially outperform all known classical ones (but there is no proof that better classical ones do not exist). My proposal:
A quantum computer is a (real or theoretical) computer that uses quantum mechanical phenomena in an essential way: a quantum computer exploits superposed and entangled states and the non-determinism of the outcomes of quantum measurement as features of its computation. A scalable quantum computer is thought to be able to perform some calculations exponentially faster than any classical computer. Theoretically, a large-scale quantum computer could break some widely used encryption schemes and aid physicists in performing physical simulations. However, current hardware implementations of quantum computation are largely experimental and impractical, with several obstacles to useful applications.
Actually, the last sentence sounds a bit outdated to me: after all, there are now various "commercial" products. Maybe it could be changed to:
However, current hardware implementations of quantum computers are still too noisy and small for useful applications.
--Qcomp (talk) 21:10, 25 July 2025 (UTC)[reply]
Basically agree, although you are wrong about "polynomial overhead" (the current lede doesn't stress this point). It is, in fact, constant (a transistor can, for example, be replaced by a pressure-sensitive valve in an isomorphic manner.) Here I have revised it. Tito Omburo (talk) 22:29, 25 July 2025 (UTC)[reply]
(this leads away from the discussion of quantum computing) No doubt that some classical computers can simulate each other with constant overhead, but in general more is needed. E.g., the number of steps a one-tape Turing machine needs to simulate a k-tape TM scales quadratically with the steps needed by the latter (p534). And what a word-RAM machine does in T steps takes a one-tape deterministic TM a number of steps polynomial in T (p4). --Qcomp (talk) 17:19, 26 July 2025 (UTC)[reply]
Not to dwell on this too much, but you're reading these sources wrong. The tape in a Turing machine may use any finite alphabet (including words). Any finite collection of Turing machines can be replaced by a single Turing machine on a larger alphabet in constant time. But the main point is that, although modern computers do not operate according to classical physics, their transistors can isomorphically be replaced by mechanical devices (e.g., fluid-controlled valves, mechanical-demon controlled switches) with only constant overhead. Tito Omburo (talk) 18:50, 27 July 2025 (UTC)[reply]
Sorry, but by classical physics - are You referring to Newton (and others before Einstein and Bohr) ? EditorÆ (talk) 10:29, 29 July 2025 (UTC)[reply]
See Classical physics. Johnjbarton (talk) 15:50, 29 July 2025 (UTC)[reply]
thanks for the edit; I modified it a bit to give equal weight to the three mentioned "features" of QM --Qcomp (talk) 17:25, 26 July 2025 (UTC)[reply]
Looks good. Tito Omburo (talk) 20:31, 26 July 2025 (UTC)[reply]
Agree with "exists without any physics at all" - I guess the origin of such comments comes down to the difficult level of physics related to quantum computers. They come down to quantum mechanics. EditorÆ (talk) 00:23, 24 July 2025 (UTC)[reply]

Possible AI-generated content


Hi -- I've added the AI generated tag to the section added in this diff, as it displays several common tells of AI writing (as does their user talk page). Just a suspicion and not proof, but flagging anyway. I am nowhere near technologically knowledgeable enough to factcheck this topic, sorry. Gnomingstuff (talk) 04:25, 20 August 2025 (UTC)[reply]

I replaced the section with content from a reliable source. Johnjbarton (talk) 15:38, 20 August 2025 (UTC)[reply]

Quantum Computation on a C-based Classical Computer


All you need to do is create a Class which has an up spin flag and a down spin flag and manipulate that on a classical computer framework. Since we have computers with more than 12 gigabytes of RAM now, it should be possible to simply classically model quantum computation on a (for example) 12 core 12 gigs computer. I don't see how a 100 million dollar 1000 qubit computer is ever going to pay for itself in real-world useful computation. Wade Smith0078 (talk) 16:16, 21 August 2025 (UTC)[reply]
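The per-qubit flag idea above breaks down because of entanglement: an n-qubit state is in general a vector of 2^n joint amplitudes, not n independent flags. A minimal Python sketch (hypothetical names, two qubits only) of why a Bell state cannot be factored into independent per-qubit states:

```python
import math

# Full 2-qubit state vector: amplitudes for |00>, |01>, |10>, |11>.
# The Bell state (|00> + |11>) / sqrt(2) puts weight on JOINT basis states.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]

def is_product(state, tol=1e-9):
    """A 2-qubit pure state [c00, c01, c10, c11] factors into two
    independent one-qubit states iff the 2x2 amplitude matrix has
    rank 1, i.e. c00*c11 - c01*c10 == 0."""
    c00, c01, c10, c11 = state
    return abs(c00 * c11 - c01 * c10) < tol

print(is_product([1.0, 0.0, 0.0, 0.0]))  # True: |00> is a product state
print(is_product(bell))                  # False: the Bell state is entangled
```

Since entangled states cannot be split into per-qubit data, a faithful classical simulation has to store the full 2^n-amplitude vector.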

12 GB is nowhere near enough. This is a case of exponential growth. At 50 qubits you're already into terabytes of storage needed. 1000 qubits is well beyond what a classical computer is capable of. MrOllie (talk) 16:30, 21 August 2025 (UTC)[reply]
50 qubits requires petabytes, and the latency issues on such an enormous system would be a significant limiting factor. Tito Omburo (talk) 20:54, 21 August 2025 (UTC)[reply]
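The scaling in the two replies above can be checked with a short sketch, assuming one complex double (16 bytes) per amplitude; real simulators add further overhead:

```python
# Rough memory needed to hold a full n-qubit state vector:
# 2**n amplitudes, 16 bytes per complex double.
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 50):
    print(n, "qubits:", statevector_bytes(n), "bytes")
# 30 qubits already need 16 GiB -- more than the 12 GB proposed above;
# 50 qubits need 16 PiB (petabytes); at 1000 qubits the byte count
# has over 300 decimal digits.
```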
Citing or discussing sources is essential in Talk pages since they are not forums. Johnjbarton (talk) 16:43, 21 August 2025 (UTC)[reply]
The dimension of the Hilbert space you need to simulate is 2^n, where n is the number of qubits. When your computer can store 2^100 floating point numbers on the CPU, it might be interesting. This is about 9444732965739290427392 gigabytes. A 1000-qubit quantum computer, if simulated on a classical computer, would instantly form a supermassive black hole which would eventually swallow the entire universe (and instantly destroy the solar system). The amount of entropy required for such a classical system far exceeds the total gravitational entropy of the observable universe. Tito Omburo (talk) 20:36, 21 August 2025 (UTC)[reply]
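The gigabyte figure above checks out exactly, assuming 8 bytes per double-precision amplitude and binary gigabytes:

```python
# 100 qubits: 2**100 amplitudes, 8 bytes each (double precision).
amplitudes = 2 ** 100
total_bytes = amplitudes * 8          # 2**103 bytes
gigabytes = total_bytes // 2 ** 30    # binary gigabytes (GiB) = 2**73
print(gigabytes)  # 9444732965739290427392
```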