May 3, 2022
Abstract
The authors Johnson and Winlow, in an academic research article (see the citation below), suggest something startling based on their investigation, as physiological scientists, of retinal function and coordination in the brain. From their research on the retina and its function, they speculate that the human brain is a type of Quantum Phase Computer using phase ternary computation. Johnson and Winlow's deconstruction of the brain's neural networks makes a case for the Human Brain as a Quantum Phase Computer rather than a Turing Machine, the model on which today's computers are based. The authors go on to assert that Turing-based mechanics fail to explain the coding of the retina, or for that matter the computation of intelligence. Their work suggests that coding in the brain's neural networks is quantum-based, wherein the quanta possess a temporal (time-based) variable and a phase-based variable that together form the basis for phase ternary computation, as they found in the retina in their earlier work. In other work the authors have pointed out that phase ternary computation is significantly more appropriate for modeling nervous activity: the threshold triggers the CAP (computational action potential), which consists of three phases – resting potential, threshold, and the time-dependent refractory period, an analog variable – and natural neural networks, and for that matter the human brain, lack the central clock that is a critical part of current conventional computers. In the end the authors point out the possibility of creating a practical Quantum Phase Computer, using phase ternary computation, based on the design and functioning of the Human Brain. This brief paper provides an accessible explanation of what a Quantum Computer is and continues on to investigate the potential to create a new kind of Quantum Computer that uses the functioning of the Human Brain as a template – in essence, practical Quantum Computers that operate in normal working environments and can be mass produced in quantity at an affordable price point depending upon their application, much like today's computers that were launched by the IBM PC.
There is an enormous amount of interest in quantum computers and much in the news about the potential and implications of quantum computing. What is often lacking is a clear and succinct explanation of exactly what a quantum computer is, which is a segue into what "entanglement" is and so forth. That does not answer the "what it is" question – it just gives details of some of the parts.
The best way to gain an understanding of quantum computing is to contrast the "classic" model our current computers are based on with the "quantum" model a quantum computer is based on. The classic model encompasses the familiar "Turing Machine" model, where binary math is used: 8 bits are grouped into bytes, and bytes in turn are grouped to represent data. "Gates" are easily implemented in the classic model to create the decision-making scheme of Boolean Algebra.
The classic model our current computers are based on uses the very reliable "bit mode," where a bit is either on or off, with its state persisting until the charge is removed or the state is changed. Though very reliable and thoroughly mastered, the classic model is limited in that it has to process a program sequentially and requires running portions of the program many times until the conditions of a successful completion are met. Some readers will ask, "What about parallel processing?" "Parallel processing" in the classic model uses multiple sequential parallel paths that move forward in the same time period to process parts of a program (Preskill, 2021). Parallel processing is a powerful and effective method to speed up running programs.
Even so, at some point the classic model hits a wall where no additional processing speed can be gained (Preskill, 2021). Today's most powerful exascale supercomputers based on the classic model are unable to run simulations of large models in the physical sciences in time periods that make sense. That is a good example of the limitations of the classic computer model.
Dr. Richard Feynman, the famous physicist best known for his work on quantum electrodynamics, was also the originator of the quantum computer. In his 1981 lecture on the topic, "Simulating physics with computers," he floated the idea of simulating "quantum systems" with a "quantum computer." He went on to observe "that the number of computer elements required to simulate a large physical system is only to be proportional to the space-time volume of the physical system" (Preskill, 2021). Bottom line – our current digital (classical) computers are inadequate for the task; there is no concise way to classically describe a quantum system consisting of many particles. Describing a quantum system in depth, Dr. Feynman stated, requires a different kind of computer – a quantum computer (Preskill, 2021).
The requirement for this new kind of computer stemmed from the nature of a quantum system.
Quantum systems exhibit "quantum mechanical effects," described for a large system of "R particles" by a function called the "amplitude to find the particles at x1, x2…. xR," and therefore, because it has too many variables, cannot be simulated with a normal computer. For a computer system to simulate a large physical system (e.g., a quantum system), it has to possess a number of computer elements proportional to the "space-time volume" of that system. Quantum computers are systems that take this "amplitude to find the particles at x1, x2…. xR" principle and use it to create a computing paradigm of a very different nature than classical computers
(Preskill, 2021). It is actually quite easy to understand – our knowledge of physics informs us that certain types of particles can carry charges and exhibit characteristics that are measurable, so that their state can be determined at a given point in time. The "beautiful twist" that makes a quantum computer work is that the particles can be carriers of two states (or three, or four, or more) simultaneously, which in essence represent the "0" and "1" of bits in classical computing. A quantum computer's basic unit is called a "qubit." A qubit is a "two-level quantum mechanical" construct where the two states in the particle serve as the basic unit of data for a quantum computer, just as a bit is the basic unit of data for a classical computer (Goldfarb & Melko, 2021).
"Two-level" behavior, or "superposition," is the key with a qubit. The amplitude of the wave emanating from the particle that holds the qubit's values in the two levels can be measured, and values of 0 and 1 derived through a well-defined system of probabilities. "Superposition" is also linear in its nature: when quantum values are summed, the results are always valid quantum states (Goldfarb & Melko, 2021). Unlike classic computers, which are limited by the hardware (processing/memory/storage) available to hold the 0s and 1s of bits, the particles holding the two-level state of the qubit can be vast in number, creating a state space limited only by the speed of light and related physical characteristics. In essence, a quantum computer is simply a computer that is based on a quantum system, which for our purposes is unlimited in its capacity, whereas we are all very familiar with the limitations of the capacity of the classic computers we use daily.
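As an illustrative sketch (my own, not drawn from the cited sources), a single qubit's superposition can be modeled as a normalized two-component complex vector whose squared amplitude magnitudes give the measurement probabilities for 0 and 1:

```python
import numpy as np

# A qubit state |psi> = a|0> + b|1> is a normalized 2-component complex vector.
# An equal superposition has amplitude 1/sqrt(2) for both |0> and |1>.
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Born rule: measurement probabilities are the squared amplitude magnitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] -- a fair coin flip between 0 and 1

# Superposition is linear: summing two valid states (then renormalizing)
# always yields another valid quantum state.
phi = np.array([1, 0], dtype=complex)   # the |0> state
mix = (psi + phi) / np.linalg.norm(psi + phi)
assert np.isclose(np.sum(np.abs(mix) ** 2), 1.0)  # still a valid state
```

The "well-defined system of probabilities" mentioned above is exactly this squared-amplitude rule.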
Because qubits are linear as superposed, they can be "entangled" to create the logical gates and circuits to carry out the usual Boolean mathematics, branching, and decision making required to construct and employ useful algorithms through programming using high-level language abstraction (Goldfarb & Melko, 2021). Though currently there are limitations with the stability of qubits, and in turn of quantum computers, progress is being made. Quantum mechanics informs us that the number of qubits that can be superposed and entangled can scale to unimaginable levels, and that the hyper-exponential nature of a computer founded on a quantum system allows simulations and processing in real time that are impossible with our classic computers.
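A minimal sketch of entanglement (again my own illustration, not from the cited sources): applying a Hadamard gate and then a CNOT to two qubits produces a Bell state, in which measuring one qubit completely determines the other.

```python
import numpy as np

# Gates as matrices on the 4-dim two-qubit state space (basis order 00,01,10,11).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, put the first qubit in superposition, then entangle.
state = np.zeros(4, dtype=complex)
state[0] = 1.0                       # |00>
state = np.kron(H, I) @ state        # (|00> + |10>) / sqrt(2)
state = CNOT @ state                 # Bell state (|00> + |11>) / sqrt(2)

probs = np.abs(state) ** 2
print(probs)  # [0.5 0. 0. 0.5] -- only outcomes 00 and 11: perfectly correlated
```

The zero probabilities on 01 and 10 are the signature of entanglement: the two qubits can no longer be described independently.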
A Different and Compelling Model of Computation in the Human Brain and Neurons
Luigi Galvani, the Italian scientist who investigated many types of electrochemical phenomena, ascribed nerve impulses to one of them. This was the accepted thinking for over 250 years and inhibited consideration of computation in the human brain. This assumption led to the belief that communication between neurons is ionically based electrical activity and the basis of computation in the brain (Johnson & Winlow, 2021a). What followed were models of computation and intelligence using this paradigm, leading to the design and employment of neural networks in the computer sciences and, most recently, the AI (artificial intelligence) in use currently.
The current models of nerve conductivity rest on the original research of two UK scientists, Hodgkin and Huxley, and their pioneering work in 1952 on the "action potential" in neurons. Their work led to the peak of the action potential being treated as the temporal marker of computing and to the propagation of the action potential being explained by cable theory. Cable theory is the traditional model of how impulses propagate along nerves and in neurons and are detected and quantified by measuring electrical activity (Johnson & Winlow, 2021a). Though not yet widely accepted, research by physiologists Johnson and Winlow asserts that action potentials in nerves throughout the body, and importantly in the brain, are not adequately explained by cable theory – there is simply not enough electrical activity to account for computation in the brain by cable theory. Johnson and Winlow's research points to an alternative explanation of what constitutes action potentials: their association with underlying pressure pulses known as solitons. Their research has found that the action potential is constantly in the company of a synchronized, coupled soliton pressure pulse in the cell membrane of the neuron; combined, these constitute the action potential pulse, which starts channel opening and triggers very rapid computation in the neural network. This is a critical concept given the multiple forms and plasticity of action potentials.
Cognitive scientists are suggesting that memories are molecularly stored in neurons; after storage, weighted plastic synaptic changes occur and are in turn stored in persistent memory. In neurons, computation as it relates to action potential initiation occurs at threshold, and memory is reinforced after learning (Johnson & Winlow, 2021b). The computers we work with daily operate in nano- or microseconds, whereas neural networks in the brain and elsewhere in the body have been demonstrated to work in milliseconds, and computer memory is either volatile or, more permanently, a storage medium (e.g., an SSD [solid state drive], conventional hard drive, etc.)
(Johnson & Winlow, 2021a). In our computers, processing is carried out by grouping bits into bytes and implemented by circuits organized into gates to establish the logic to execute programs, adhering to the Turing Machine paradigm (Preskill, 2021). Computation in neurons, and consequently in the brain, is carried out using a very different scheme, based on Johnson and Winlow's research, that is gaining more acceptance among physiologists as time goes on. The observation that neural networks work in milliseconds tracks with Johnson and Winlow's research in that computation in neural networks follows a different path than cable theory (Johnson & Winlow, 2021a). The sheer quantity and variety of processing in the brain, and the observation that movement in its neural networks is slower than our classic computers, point to a very different type of computation in the brain at a far greater level of efficiency. There is
mounting evidence for the soliton pressure pulse in neuronal membranes, termed the "Action Potential Pulse (APPulse)." The ion channels are activated by the soliton, with the channels adding entropy to the pulse – the speed of the pulse being defined by static membrane components. The temporal plasticity of membrane transmission happens at a far slower rate than the ionic exchanges represented in the work of Hodgkin and Huxley (Johnson & Winlow, 2021a). As a consequence, temporal error is minimized in the APPulse. Deconstructing the structural components of the APPulse into its computational component parts yields the CAP, a mathematical depiction of a quantum ternary pulse: when two pulses collide and a threshold crosses a refractory period, the threshold is annulled (Johnson & Winlow, 2021b). The CAP is equally valid for ternary quantum computation under the traditional Hodgkin and Huxley model or the APPulse model. The difference between the two resides in the temporal precision of successive impulses, which is up to ten thousand times greater with the pressure pulse than with Hodgkin and Huxley's cable theory (Johnson & Winlow, 2021b). Importantly, the CAP assumes that the temporal pulse starts on activation and not from the spike peak. Johnson and Winlow's research is strong evidence that quantum computation is occurring in the brain, activated by a temporal pressure pulse (pulses = quanta), rather than by the traditional model of an electrical spike as the activator of computation in the brain.
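The collision rule described above can be sketched as a toy model (an illustration of the idea, not the authors' code): each CAP is represented by its activation time plus threshold and refractory durations, and when two pulses collide, the leading pulse's refractory period annuls the trailing one.

```python
from dataclasses import dataclass

@dataclass
class CAP:
    """Toy computational action potential with its three phases."""
    start: float       # activation time (timing starts here, not at spike peak)
    threshold: float   # duration of the threshold phase
    refractory: float  # duration of the time-dependent refractory period

def collide(a: CAP, b: CAP) -> list:
    """If the trailing pulse's threshold crossing falls inside the leading
    pulse's refractory period, the trailing pulse is annulled."""
    lead, trail = (a, b) if a.start <= b.start else (b, a)
    refractory_end = lead.start + lead.threshold + lead.refractory
    if trail.start + trail.threshold < refractory_end:
        return [lead]          # trailing CAP annulled
    return [lead, trail]       # both pulses survive

# Two asynchronous CAPs arriving close together: the second is cancelled.
print(len(collide(CAP(0.0, 1.0, 5.0), CAP(2.0, 1.0, 5.0))))   # 1

# Far apart in time: both pulses survive.
print(len(collide(CAP(0.0, 1.0, 5.0), CAP(10.0, 1.0, 5.0))))  # 2
```

The annulment step is what makes the pulse ternary in effect: a downstream node sees pulse, annulled pulse, or no pulse.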
Quantum computation networks are very different from the classic model of the computers of various types and purposes we use daily (Goldfarb & Melko, 2021). One big difference is that quantum computation does not employ an external clock. Quantum computation requires input to achieve consistent derived output values, compared to Turing (classic) computation, where clock speed is used to synchronize input with output (Johnson & Winlow, 2021b). In the classic model, "gating" represents a single command to be executed within its set time, allocated among the input and output orchestrated by an external clock. In a complex network (a classic computer bus is an excellent example), flow is defined node to node by exact timing between the nodes that is precisely the same, and the outputs are always synchronized to the speed of the system clock. Binary computation in a classic machine (a Turing machine as we know it) is divided between input nodes that hold the binary values of 0 and 1 respectively, outputting a value of 0 (Johnson & Winlow, 2021b). The system grants time for reacting to input nodes holding 0 and 1, and time for the "gated" node to react to input based on a set program "that defines the output from these specific inputs as being (0)." Termination time in the classic system is allocated for "gated" output to the exit node. In the current neural models on our classic computers, as used in AI (artificial intelligence) for example, timing latencies are fixed and organized so that events in the flow are uniform and equivalent to system clock speed, effectively synchronizing each command (Johnson & Winlow, 2021b). The definition of the
Turing-type machine our classic computers are based on follows this system clock-based schema, which is used in all practical applications of the computers we work with daily. Furthermore, because all synchronizations are performed by an external (system) clock, this schema is inapplicable to a brain neural network (Johnson & Winlow, 2021b). In a non-Turing quantum machine such as a neural network, the input flow (quanta [pulses]) moves to a point of convergence, and consequently computation in the network and the timing between nodes are dictated by the speed of the quanta flowing between nodes. At the convergence point, computation depends upon the quantal information (in a base – binary, ternary, base-10, etc.), the mechanism at the convergence point, and "the timing of interference between quanta" (Johnson & Winlow, 2021b). In a binary-based Turing-type machine, the quanta are fixed at the convergence (computation) point by the system clock speed. The timing in a neural network in a brain is determined not by a clock but by the speed of the neurons. The flows of inputs in the brain's neural network, for the action potential or APPulse, are determined by the threshold-to-refractory periods of each quantum respectively – computation at the convergence point is "defined by the timing of each quantum and how they interact on collision." The collision between threshold and refractory periods, with both the action potential and the APPulse, results in cancellation of the succeeding quanta (Johnson & Winlow, 2021b). Though the rules may differ in different quantum computation systems, this is a fundamental process and applies to all temporal and base computation in a neural network. It must be kept in mind that the process is not simple in practice.
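As an illustrative contrast (my own sketch, not from the cited sources), a clockless convergence node can be modeled as computing whenever pulses arrive, with the result determined by the pulses' relative arrival timing rather than by a global clock tick:

```python
# Toy clockless convergence node: output depends only on the relative
# arrival times of incoming pulses, not on any global clock.

def converge(arrivals, window=1.0):
    """Pulses arriving within `window` of the earliest pulse interfere and
    take part in the computation; later pulses are ignored (annulled).
    Returns the number of pulses that contributed."""
    if not arrivals:
        return 0
    first = min(arrivals)
    return sum(1 for t in arrivals if t - first <= window)

# Three pulses; only the two that arrive close enough together interfere:
print(converge([0.0, 0.4, 3.0]))  # 2

# A clocked gate, by contrast, would sample all of its inputs on the same
# clock tick regardless of when each one actually arrived.
```

The `window` parameter here is a hypothetical stand-in for the threshold-to-refractory timing that, in the authors' model, governs interference at the convergence point.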
As Johnson and Winlow write, "When two asynchronous CAPs (Computational Action Potential – mathematical depiction of a quantum ternary pulse) collide the temporal component of the refractory period of the leading CAP will annul the second CAP." The same holds for two action potentials. They go on to write that it follows that CAPs are "therefore ternary [three part]." Temporal computation is key in the brain's neural network, where timing is not governed by a clock as with our current classical computers. It allows temporal computation resulting in changes in frequency across the network. The dynamic structure of actual neural networks in the brain produces self-reinforcing wave packets that are stable in form and over time, providing the basis in the brain for computation. The phase shifts in the collisions between pressure pulses/solitons in the brain, aligned temporally, result in massively parallel computation in microseconds. The quantum levels are massive in number, stable, persistent, and fully capable of refreshment and reinforcement.
As Johnson and Winlow write, "the fundamental basis of nervous communication is derived from a pressure pulse/soliton capable of computation with sufficient temporal precision to overcome any processing errors" – precision that addresses what is currently one of the biggest challenges in quantum computing. The computation process in the brain is based on temporal computation that effects changes in frequency across the brain's neural network and is iterative. This type of temporal computation overcomes one of the biggest challenges of quantum computing – error correction. Additionally, the computation occurs at temperatures within the range for humans and their environments – a really big deal. The question that follows is whether it is feasible to create a general-purpose quantum computer on this design.
The design of a quantum computer based on quantum ternary computation – which is gradually becoming accepted as how computation occurs in the brain – is feasible because of advances in materials science and engineering. Due to the slowing of advances in classic computing, new types of computing are being very actively pursued. Most prominent are quantum computers; another computing paradigm is the neuromorphic computer (Mehonic & Kenyon, 2022). "Neuromorphic" was coined by Carver Mead at the California Institute of Technology in the late 1980s and describes systems and devices that imitate some functions of natural neural systems, in essence moving toward the creation of analog computers (Mehonic & Kenyon, 2022). "Neuromorphic" computers follow the traditional Hodgkin and Huxley model of electrical spikes in neurons. A quantum ternary computer has a neuromorphic basis but then goes in a different direction. As was written in Part 1 of this paper, current quantum computers use "qubits" to represent two-state (two-level) quantum bits, as classical computers use "bits." The difference in a quantum ternary computer is that its basic unit, the "qutrit," is a three-level quantum unit, which keeps it in the digital realm, as with quantum computers that use qubits (Roberts, 2021). Qutrits lend themselves nicely to the creation of "universal quantum circuits" that can run arbitrary algorithms. The logic gates produced enable pairs of qutrits to interact with each other, with the three different levels produced by the phase shift of the solitons at the point of convergence in the CAP resulting in computation (Roberts, 2021). The outputs of CAPs create
ongoing entanglement, allowing computation to scale to a level that is temporal and massively parallel (Johnson & Winlow, 2021a). The measurement of one of a pair of qutrits, as a result of their entanglement, ensures that their state is passed on to another qutrit, and so forth, until a particular program ends (Roberts, 2021). In sum, qutrits and qubits afford computer scientists and engineers the ability to create quantum algorithms and programming environments that leverage the processing power of quantum computers and make it accessible to software engineers and programmers who work in many fields. In the end, using qutrits may offer some advantages over qubits.
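An illustrative sketch (my own, not from Roberts, 2021): a qutrit is a three-level state, so its state vector has three amplitudes, and n qutrits span 3^n basis states versus 2^n for qubits – one reason ternary units are attractive.

```python
import numpy as np

# A qutrit state |psi> = a|0> + b|1> + c|2> is a normalized 3-component vector.
psi = np.array([1, 1, 1], dtype=complex) / np.sqrt(3)  # equal superposition
probs = np.abs(psi) ** 2
print(probs)  # [1/3 1/3 1/3] across the three levels

# State-space growth: n qubits span 2**n basis states, n qutrits span 3**n.
for n in (1, 5, 10):
    print(n, 2 ** n, 3 ** n)
# At n = 10: 1,024 qubit basis states versus 59,049 qutrit basis states.
```

The same Born rule applies as for qubits; only the number of levels per unit changes.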
One potential way to create a quantum ternary computer builds on the efforts of a team of engineers at Penn State University. Their focus is on using graphene, a carbon-based cutting-edge material that is gaining more and more uses, including in creating artificial neural networks (Barkan, 2020). Graphene is a single plane of sp2-bonded carbon atoms arranged in a hexagonal lattice; the term also refers to material built up of as many as ten such layers to create dense, electrically conductive structures with excellent heat dissipation (Barkan, 2020). Graphene neural networks seek to achieve the energy and processing efficiencies of the human brain. Graphene neural networks, as with synapses in the brain, are reconfigurable by applying a brief electrical field to the one-atom-thick layer of carbon atoms of the graphene sheets potentially comprising the graphene computer. Moving data from memory to logic and back again eats up a lot of energy and slows computing speed. Like the brain, graphene neural networks do not have to be differentiated between processing and memory, resulting in very efficient resource utilization plus low power consumption (Barkan, 2020). The Penn State team has created simple graphene field-effect transistors that precisely control a large number of memory states (Barkan, 2020). The field-effect transistors can serve as phase-shift oscillators, creating the phase differences that are the basis for CAPs and convergence for quantum ternary computation. A computer using quantum ternary computation would be very light, compact, and scalable for applications ranging from tiny embedded devices to the applications now covered by tablets, laptop computers, industrial controls, servers, and so on. Graphene can potentially be layered to build extremely dense structures.
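As a hedged illustration (my own sketch, not the Penn State team's design), three phase offsets of an oscillator – 0, 2π/3, and 4π/3 radians – could encode the three levels of a ternary unit, and the level can be recovered by correlating the sampled waveform against each candidate phase:

```python
import numpy as np

def encode(level, t, freq=1.0):
    """Oscillator samples encoding ternary level 0, 1, or 2 as a phase shift."""
    phase = 2 * np.pi * level / 3
    return np.cos(2 * np.pi * freq * t + phase)

def decode(samples, t, freq=1.0):
    """Recover the level by correlating against each of the three phases."""
    scores = [np.sum(samples * np.cos(2 * np.pi * freq * t + 2 * np.pi * k / 3))
              for k in range(3)]
    return int(np.argmax(scores))

# One full oscillator period, 1000 samples.
t = np.linspace(0, 1, 1000, endpoint=False)
for level in (0, 1, 2):
    assert decode(encode(level, t), t) == level
print("all three phase-encoded levels recovered")
```

Over a full period, correlating two sinusoids scores highest when their phases match, which is why the decoder picks out the encoded level; this is only a classical analogy for phase-based ternary signaling, not a quantum simulation.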
Conclusion
Our current classic computers are without doubt one of the greatest technical achievements in the history of humanity. Our networks, software, devices, and embedded computers have become increasingly more powerful, faster, useful, and ubiquitous. Leaving home without our smartphone is unthinkable for most of us. Our technology generally relies on computers and computing, but behind the scenes a major challenge is developing – our classical computing paradigm is beginning to struggle with the speed and scalability needed to meet our needs.
We are accustomed to the progression of Moore's Law, with ever-increasing processing power and lower cost in our classic computers – but that is gradually coming to an end. The answer to the challenge is quantum computers, which are being aggressively worked on globally and offer the promise of allowing computation to scale to an unimaginable level, overcoming current limitations in areas from AI to simulations of large physical systems. Technical problems are holding quantum computing back (Mehonic & Kenyon, 2022). Two challenges are particularly daunting: maintaining a readable state for the subatomic particles/atoms that hold the two-level state of qubits, which requires keeping the supporting hardware at very low temperatures, and error correction – assuring that the results of quantum computing processing are correct.
A ternary quantum computer based on the design of the human brain offers an alternative that can operate at normal temperatures without requiring a special environment; uses materials like graphene to create its neural networks, enabling the "bits" in qutrits to be maintained in a stable state; and can be packaged and distributed in forms from thumb-sized computers to compact server packages, in levels of power tailored to the quantum computer's application. A material like graphene can be fashioned into a computer using existing manufacturing technologies. Error correction is addressed through temporal precision sufficient to overcome any processing error, based on the design of the neurons in the brain. This design can overcome many of the challenges with quantum computing and may take on the role the classic computer plays today.
References
Barkan, T. (2020). Graphene-based memory resistors show promise for brain-based computing. The Graphene Council. https://www.thegraphenecouncil.org/blogpost/1501180/359621/Graphene-based-memory-resistors-show-promise-for-brain-based-computing

Bova, F., Goldfarb, A., & Melko, R. (2021). Quantum computing is coming. What can it do? Harvard Business Review. ebscohostcom.ezproxy.umgc.edu/login.aspx?direct=true&db=heh&AN=151711226&site=edslive&scope=site

IEEE Computer Society. (2021). Quantum technology – scaling for applicability. https://www.computer.org/publications/tech-news/research/quantum-technology-scaling-for-applicability?source=homepage

Johnson, A. S., & Winlow, W. (2021a). Does the brain function as a quantum phase computer using phase ternary computation? Frontiers in Physiology. https://doi.org/10.3389/fphys.2021.572041

Johnson, A. S., & Winlow, W. (2021b). Nerve impulses have three interdependent functions: Communication, modulation, and computation. https://www.researchgate.net/publication/352309033_Nerve_Impulses_Have_Three_Interdependent_Functions_Communication_Modulation_And_Computation

Mehonic, A., & Kenyon, A. J. (2022). Brain-inspired computing needs a master plan. Nature, 604, 255–260.

Preskill, J. (2021). Quantum computing 40 years later. arXiv. https://doi.org/10.48550/arXiv.2106.10522

Roberts, J. (2021). Going beyond qubits: New study demonstrates key components for a qutrit-based quantum computer. Berkeley Lab News Center. https://newscenter.lbl.gov/2021/04/26/going-beyond-quibits/
Critique
In this article the author examines the feasibility of a quantum computer based on the design of the human brain – in simple terms, quantum deep learning neural networks. The author investigates the potential to create a new kind of quantum computer that uses the functioning of the human brain as a template, i.e., a deep-learning-based quantum computer. The author explains this different view but does not describe the state of the art in the given article.

A neural network is a data-processing model based on the design of mammalian brains in nature, which are made up of neurons joined by many connections. Natural learning in the brain causes these neurons to develop unique linkages, transformations, and mappings. These artificially produced neurons, known in computer science as units, nodes, or perceptrons, are the state of the art of AI-based quantum computing.
A discussion of where the sources that you read believe the technology is heading in the near
future
Litt, A., Eliasmith, C., Kroon, F. W., Weinstein, S., & Thagard, P. (2006). Is the brain a quantum computer? Cognitive Science, 30(3), 593–603.

Zhang, Y., & Ni, Q. (2021). Design of quantum neuron model for quantum neural networks. Quantum Engineering, 3(3), e75.

Kamruzzaman, A., Alhwaiti, Y., Leider, A., & Tappert, C. C. (2019, March). Quantum deep learning neural networks. In Future of Information and Communication Conference (pp. 299–311). Springer, Cham.

Hu, F., Wang, B. N., Wang, N., & Wang, C. (2019). Quantum machine learning with D-wave quantum computer. Quantum Engineering, 1(2), e12.

Neumann, N., Phillipson, F., & Versluis, R. (2019). Machine learning in the quantum era. Digitale Welt, 3(2), 24–29.
A discussion of how this technology will affect the choices you would make if you were making
purchase recommendations for a client.
One example from these sources is QuClassi, a quantum neural network architecture built on a quantum differentiation function and a hybrid quantum-classical design. Its authors ran extensive tests on quantum simulators and IBM Quantum's platform, as well as assessing performance on IonQ. According to the evaluation results, QuClassi outperforms the state-of-the-art quantum-based solutions TensorFlow Quantum and QuantumFlow by up to 53.75 percent and 203.00 percent for binary and multi-class classifications, respectively, and delivers comparable performance with 97.37 percent fewer parameters than typical deep neural networks.