What Is Mathematical Knowledge?
Interpreting the world remains a large objective, and one important unanswered question concerns the
nature of mathematical knowledge. Why is it true that 3 + 4 = 7? In chapters 2 and 4, I briefly
mentioned how mathematical knowledge has spurred numerous philosophers and mathematicians to
reject naturalism. For Plato and many successors, there must be an a priori basis for truths of
arithmetic, geometry (such as the Pythagorean theorem), and many other branches of mathematics.
They think that it is necessarily true (in all possible worlds) that 3 + 4 = 7, in a way that natural
science cannot explain.
Puzzles about how people manage to grasp mathematical truths have long been a source of the view
that ideas are supernatural. A full-blown plausible naturalistic alternative requires learning much
more about the nature of mathematical concepts as they develop in human brains. Already there is
some understanding of concepts of number in animals and infants, but the neural underpinnings of
mathematical knowledge are just beginning to be investigated.
As a first pass, we can say that mathematical concepts from three to right triangle to infinite
number are all patterns of brain activation of the sort discussed in chapter 4. This does not assume
that such concepts are derived directly from experience, because we have seen that conceptual
combination can form new concepts that go far beyond perception. Moreover, some basic concepts
like object may be innate. Activation of concepts like number and addition may begin with specific
examples when children observe collections of objects and are taught to count and add, but
conceptual combination can quickly generate abstract combinations such as number divisible only by
itself and 1. The kinds of neural mechanisms I mentioned in discussing creativity should serve
equally well for producing representations of mathematical abstractions.
But there is a crucial difference between theoretical entities such as sound wave and mathematical
entities such as infinite number. Even though we cannot observe sound waves, we are justified in
believing that they exist by inference to the best explanation. We cannot hear or see sound waves, but
we can observe their causal effects whenever we hear sounds. In contrast, purely mathematical
entities like numbers do not have any direct causal effects, so how can we be justified in thinking they
exist?
I was once tempted to say that numbers exist because numbers are concepts, and concepts are
patterns of neural activation that exist in real brains. The problem with this view is that there would
seem to be far more numbers than patterns of brain activation. Assuming that neurons can fire or not
fire about 100 times per second, and that there are 100 billion neurons, then we can calculate that
there are at least (2^100)^100,000,000,000 possible patterns of activation in the human brain. This is an
extraordinarily large number, far greater than the number of kinds of things there are in the universe,
which is usually estimated to contain only about 10^80 elementary particles. But the number of integers
(1, 2,…) is infinite, because we can always produce a greater integer just by adding 1. (A similar
proof shows that there are an infinite number of reality TV shows, because an even worse one is
always coming along.)
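The magnitude comparison above can be checked directly. Here is a minimal sketch, using the rough estimates quoted in the text (100 firing opportunities per second, 100 billion neurons, 10^80 particles); it works with base-10 logarithms because the numbers themselves are far too large to print:

```python
# Rough magnitude check for the brain-state estimate quoted above.
# Each neuron fires or not in each of ~100 time slots, giving 2**100
# patterns per neuron; with ~100 billion neurons, the number of joint
# patterns is (2**100)**(100 billion) = 2**(10**13).
import math

firings = 100
neurons = 100_000_000_000  # ~10**11, a rough estimate

# log10 of the number of possible activation patterns: about 3 x 10**12
log10_patterns = neurons * firings * math.log10(2)

# Estimated elementary particles in the observable universe: ~10**80
log10_particles = 80

# Vastly larger than the particle count, yet still a finite number --
# which is the point: the integers outrun any finite stock of patterns.
print(log10_patterns > log10_particles)  # → True
```

Enormous as it is, the pattern count is finite, so the infinitely many integers cannot each be a distinct pattern.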
Intensifying the problem, the nineteenth-century mathematician Georg Cantor showed that there are
more real numbers (e.g., pi, 3.14159…) than there are integers, and indeed that there are infinitely
many infinite sets of different sizes, an infinity of infinities. Clearly the brain cannot
hold an infinite number of patterns. So numbers cannot all be brain concepts, any more than they can
all be theoretical entities justified by inference to the best explanation.
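Cantor's diagonal argument can be illustrated concretely. The sketch below uses finite digit lists standing in for infinite decimal expansions, and the sample enumeration is a made-up list for illustration: given any purported list of the reals, we build a number that differs from the nth entry in its nth digit, so it appears nowhere on the list.

```python
# Illustration of Cantor's diagonal argument: from any enumeration of
# reals (digit lists here stand in for infinite decimal expansions),
# construct a number that differs from entry n at digit n.

def diagonal_number(digit_rows):
    """Return digits that differ from row n at position n.

    The formula (d + 1) % 8 + 1 always yields a digit in 1..8 that is
    never equal to d, avoiding the 0.999... = 1.0 ambiguity.
    """
    return [(row[i] + 1) % 8 + 1 for i, row in enumerate(digit_rows)]

# A made-up sample enumeration of reals in [0, 1), as digit lists.
sample = [
    [3, 1, 4, 1, 5],   # 0.31415...
    [1, 4, 1, 4, 2],   # 0.14142...
    [2, 7, 1, 8, 2],   # 0.27182...
    [5, 0, 0, 0, 0],   # 0.50000...
    [9, 9, 9, 9, 9],   # 0.99999...
]

d = diagonal_number(sample)
# d differs from every row at the diagonal position, so 0.d1d2d3...
# is missing from the list; no enumeration can capture all the reals.
assert all(d[i] != row[i] for i, row in enumerate(sample))
print(d)  # → [5, 6, 3, 2, 3]
```

Since the same construction works on any list, however produced, the reals cannot be put into one-to-one correspondence with the integers.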
I think the most plausible way out of this impasse is to conclude that numbers and other
mathematical objects are just fictions: they don't exist in the real world, any more than Harry Potter,
Hamlet, and angels do. Then purely mathematical claims are fictional too, although they can be
plausible or implausible within the context of the fictional worlds they describe. Fictionally, Harry
Potter is a boy wizard rather than a dog, and angels have wings rather than jet engines. Similarly,
within the context of the axioms of number theory, numbers can be infinitely large or small; and within
the context of set theory, there is an infinity of infinite sets. But numbers, sets, and wizards do not
exist in the real world.
The major problem with understanding mathematical objects as fictions resides in comprehending
how mathematics can be so useful in describing and explaining the world. It seems that there are
straightforward arithmetical truths such as 2 + 2 = 4, and many branches of mathematics, such as
algebra and calculus, that are invaluable in scientific fields ranging from physics to theoretical
neuroscience. How can mathematical models of brain functioning tell us anything about thinking if
math is fictional?
The most plausible answer is that many mathematical claims can be understood as being about the
real world rather than about some abstract domain of objects. I think that the following claim is true:
Putting 2 objects together with 2 other objects makes a total of 4 objects. This is a claim about
objects, not about numbers, so it can be true of the real world. Similarly, algebra and calculus are
neither true nor false, but they are used to express evaluable claims about physical systems, claims
that can be judged to be true or false on the basis of experimental evidence and inference to the best
explanation. Mathematical statements are not true a priori, nor are they generalizations about the
world; but we can combine mathematical concepts with concepts about things and processes to make
claims about the world. Abstract mathematical statements such as those in set theory and number
theory are fictional assertions rather than necessary truths.
Yet these fictions do sometimes turn out to be very useful for describing the real world. Imaginary
numbers and group theory, for example, were ideas developed in pure mathematics that turned out to
be important for theories in physics. I think that pure mathematics sometimes turns out to be
scientifically useful for the same reason that good fiction can tell us much about human psychology
and social relations. Harry Potter and wizards do not exist, but J. K. Rowling's characters are based
on her familiarity with and understanding of human social relations. My favorite authors (such as
Shakespeare, Tolstoy, and Carol Shields) produce intensely interesting fictional characters and
events because they know so much about human nature derived from their own experience. Similarly,
the abstractions that mathematicians produce are often not pure creations; rather, mathematicians
develop them by imaginatively combining concepts that originated in reflections on aspects of the real
world. The writer Julian Barnes said that the novel tells beautiful, shapely lies which enclose hard,
exact truths. Mathematics tells beautiful, exact lies that sometimes approximate to messy truths.
To make this view of mathematics plausible, we need to know much more about the nature of
mathematical concepts. A wealth of experimental evidence is accumulating concerning the nature of
numerical thinking in human adults and infants, as well as in other animals. In accord with the view of
concepts defended in chapter 4, mathematical concepts are patterns of neural activation that encode
many different kinds of representation—visual and spatial as well as verbal and formal. But the
development of mathematics will not be well understood until we have a better account, to be
provided by theoretical neuroscience, of the mechanisms by which neural populations in multiple
brain areas can generate new, more abstract mathematical concepts.