The human brain is not a quantum computer: Explaining the speed of factorization using analog circuit theory


1. The Hypothesis of the “Quantum Brain”

For many years, the hypothesis that “the human brain performs quantum computation” has been discussed.
One piece of supporting evidence has been that “only humans can intuitively and quickly perform factorization.”

However, I have recently come to believe that this hypothesis can largely be rejected.
Why? Because the phenomenon can be fully explained by the analog circuit theory of the brain, without invoking quantum computation at all.


2. Why Did the “Quantum Brain” Hypothesis Arise?

To be honest, until recently I too believed that “the human brain must be performing quantum computation.”
Factorization, in particular, was often cited as evidence.

Factoring large integers is notoriously difficult for classical computers. In fact, this computational difficulty has been the very foundation of many modern encryption systems.
Nearly half a century after the invention of RSA cryptography, RSA-2048 remains unbroken. The scale is easy to appreciate: a 2048-bit modulus has about 617 decimal digits, so naive trial division would need on the order of 2^1024 divisions.

The potential disrupter of this foundation is the quantum computer. A sufficiently large one running Shor's algorithm could factor such numbers in polynomial time, which is why quantum computers are said to threaten today's encryption schemes.

And yet many people can look at a high school–level polynomial factorization problem, say x^2 - 5x + 6, and write down (x - 2)(x - 3) at a glance.
Here lies a curious similarity between the human brain and quantum computers.

But most likely, humans are not performing quantum computation.
Instead, the brain appears to rely on analog circuitry that enables intuitive pattern matching.


3. Quantum Computers Are Not That Impressive (Yet)

Who would win a factorization contest: today’s quantum computers or a fast high-school mental calculator?
The answer is clear: the high school student.

The Limits of Current Quantum Computers

  • Implementations of Shor’s algorithm so far (IBM, Google, etc.) have only managed to factor numbers like 15 or 21 (the algorithm’s classical core is sketched after this list).
  • The bottleneck: high error rates and the lack of stable “logical qubits,” which remain at only a few dozen.
  • They cannot yet handle 4–5 digit integers, let alone something like RSA-2048.
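
For perspective, the number-theoretic core of Shor’s algorithm fits in a few lines of ordinary code once the quantum part is removed. The following is a minimal Python sketch of my own (not any lab’s implementation): the period search, which is the one step a real quantum computer accelerates, is done here by brute force, and at the scale of 15 or 21 it finishes instantly on any laptop.

    from math import gcd

    def shor_classical(n, a):
        # Classical stand-in for Shor's algorithm on tiny n. The period
        # search below is brute force; the entire quantum speedup in the
        # real algorithm lies in finding r quickly for huge n.
        g = gcd(a, n)
        if g != 1:
            return g, n // g               # lucky: a already shares a factor
        r, x = 1, a % n                    # smallest r with a^r = 1 (mod n)
        while x != 1:
            x = (x * a) % n
            r += 1
        if r % 2 == 1:
            return None                    # odd period: retry with another a
        y = pow(a, r // 2, n)
        if y == n - 1:
            return None                    # trivial square root: retry
        return gcd(y - 1, n), gcd(y + 1, n)

    print(shor_classical(15, 7))           # (3, 5)
    print(shor_classical(21, 2))           # (7, 3)

The demonstrations to date, in other words, exercise the easy post-processing arithmetic; finding the period of a^x mod n for a 2048-bit n is precisely the part no machine can yet do.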

Human Mental Calculation

  • With tricks and experience, small numbers can be factored extremely quickly.
  • Example: 12321 = 111 × 111 can be spotted in seconds.
  • Three-digit numbers can often be factored in tens of seconds, and even 4–5 digit numbers may fall in under a minute if they have small prime factors (the basic strategy is sketched below).
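
By contrast, the mental calculator’s approach to such numbers amounts to little more than dividing out memorized small primes and then trying the remaining candidates in order. A rough Python rendering follows; the cutoff at 37 and the fallback loop are my own simplification, not a model of anyone’s actual mental process.

    def quick_factor(n):
        # Divide out memorized small primes first, as a mental calculator
        # would, then fall back to ordinary trial division.
        small_primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]
        factors = []
        for p in small_primes:
            while n % p == 0:
                factors.append(p)
                n //= p
        d = 41                             # anything left has no factor below 41
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 2
        if n > 1:
            factors.append(n)              # whatever remains is prime
        return factors

    print(quick_factor(12321))             # [3, 3, 37, 37], i.e. (3 * 37)^2 = 111^2

Nothing here is clever; the speed on small inputs comes entirely from the fact that most composite numbers have a small prime factor.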

Conclusion:

  • Small numbers (high school level, a few digits): humans dominate
  • Large numbers (hundreds of digits, RSA-scale): humans are hopeless, but quantum computers are also not there yet
  • Future (millions of stable qubits): quantum computers might surpass humans

In short, quantum computers may be theoretically optimal for large-number factorization—but in practice, their current level is closer to basic multiplication drills.


4. Explaining It with Analog Circuit Theory

On the other hand, humans seem to solve factorization problems not with quantum computation but through the interference of analog signals. Possible mechanisms include the following (a toy sketch follows the list):

  • Pattern Resonance
    When presented with a number or expression, multiple “factor candidates” are activated in parallel, and the one with the strongest resonance emerges.
    → This is not a digital search but analog interference of signal strengths.
  • Continuous Analog Representation
    Instead of testing divisibility step by step, the brain embeds numbers or expressions as vectors and “slides” toward factorable directions.
    → Very similar to how LLMs embed tokens into continuous vector space.
  • Noise Tolerance
    Because the process is analog, even noisy or complex inputs (e.g., large coefficients) still yield approximate candidate factors.
    → Unlike digital computers, humans can produce “near factorization” intuitively.
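
To make the resonance picture concrete, here is a deliberately toy Python sketch. Every constant in it (the decay rate, the reinforcement step, the random initial activations) is invented purely for illustration: divisor candidates are activated in parallel, fitting candidates are reinforced, and non-fitting ones attenuate until a winner can be read out.

    import math
    import random

    def resonance_factor(n, steps=20, decay=0.7):
        # Toy "pattern resonance": every divisor candidate holds an analog
        # activation; evidence reinforces fitting candidates, the rest decay.
        candidates = {d: random.random() for d in range(2, math.isqrt(n) + 1)}
        for _ in range(steps):
            for d in candidates:
                if n % d == 0:
                    candidates[d] += 1.0    # resonance: the pattern fits
                else:
                    candidates[d] *= decay  # non-fitting signals attenuate
        winner = max(candidates, key=candidates.get)
        return (winner, n // winner) if n % winner == 0 else None

    random.seed(0)
    print(resonance_factor(91))             # (7, 13)

This is, of course, a digital simulation of an analog idea; the point is only to show the shape of the dynamics: parallel activation, reinforcement, and decay, with the answer surfacing rather than being searched for one candidate at a time.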

5. Why Can Humans Do It but Computers Cannot?

Even in the age of AI, there remains a vast gap between humans and machines. The key difference lies in the presence of sensory inputs and the brain’s ability to build unique learning circuits from them.

  • Linguistic Pattern Learning
    Through math education, humans internalize a dictionary of patterns (e.g., perfect squares, products of a sum and a difference).
  • Analog Parallel Processing
    Multiple candidates activate simultaneously, and the most fitting one emerges via interference.
  • Efficient Elimination
    Non-fitting candidates quickly decay, much like the natural attenuation of unnecessary signals in analog circuits.

In short:
Humans factor numbers quickly not through sequential digital search, but through parallel resonance in analog circuits, where the optimal solution naturally rises to the surface.
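
The “dictionary of patterns” side of this story can also be made concrete. The Python sketch below is again only illustrative; the two templates it checks (a perfect square, and the product-of-sum-and-difference form via Fermat’s trick) stand in for the much larger pattern library a trained student carries. Note that 12321 falls to the first template immediately.

    import math

    def pattern_factor(n):
        # Try a few memorized templates before any systematic search.
        r = math.isqrt(n)
        if r * r == n:                     # template 1: perfect square
            return (r, r)
        for a in range(r + 1, r + 50):     # template 2: n = a^2 - b^2
            b = math.isqrt(a * a - n)      #   = (a - b)(a + b), Fermat's trick
            if b * b == a * a - n:
                return (a - b, a + b)
        return None

    print(pattern_factor(12321))           # (111, 111)
    print(pattern_factor(9991))            # (97, 103), since 9991 = 100^2 - 3^2

The design mirrors the list above: a cheap dictionary lookup fires first, and systematic search is only a last resort.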

This alone explains much of what has been attributed to a “quantum brain.” In fact, at present, humans outperform quantum computers in many practical cases.


6. Conclusion

That said, the “quantum brain” hypothesis has not been entirely disproven.
There remain possibilities such as:

  • Quantum effects within microtubules
  • Quantum coherence influencing probabilistic synaptic transmission
  • The non-deterministic nature of unconscious intuition

These ideas are not yet proven—or disproven.

But from a practical perspective, these have little impact on our daily work.
And above all, human mistakes, ambiguity, and flashes of genius can all be more elegantly explained by analog circuit theory.

Thus, it seems more accurate to conclude:

The human brain may not rely on quantum computation at all—yet it inherently holds potential beyond even today’s quantum computers.
